2 October 2016
Tantalus was a figure in Greek mythology who was condemned to a task that was both torturous and never-ending. Do your efforts to improve data quality sometimes feel the same? As if the errors flowing in match -- or worse, eclipse -- the pace at which you are fixing them? It can be a frustrating experience. For much of my career I have been advocating for the business value of better data quality. In cases where we went the extra mile to build a substantive, quantitative case for data quality, the numbers invariably astonished stakeholders.
When I read this excerpt from Thomas Redman’s new book in Harvard Business Review, even I was incredulous. Could bad data really cost US companies $3 trillion per year?! I tend to believe Tom Redman; he has been in this business as long as anyone around. But still: $3 trillion?! Hard to believe, yet once you factor in all the hidden costs -- people reworking data that have already been produced and disseminated -- it does make sense.
It doesn’t have to be that way. Once you recognize the organizational dynamics at play, you are well equipped to think about structural improvements: how can we organize ourselves, and how can we change accountabilities, to arrive at a more beneficial, less costly dynamic? Showing results, making things better, and fixing the costliest and most egregious errors first are all necessary, but unless you also change the structure of the organization, you might well find yourself in Tartarus, endlessly repeating the same tasks. As a responsible data steward, you owe the organization better.