An important article in Saturday’s NYT is on the payoff from data-driven decision-making. This is an area that has not been much studied in the past, and it must be distinguished from the different question of efficiencies gained by automating or providing tech support for underlying processes. In other words, this is not about doing document assembly or webpages; rather, it is about using data (including data generated by such innovations) to direct and manage the delivery of services.
The Times article reports on a research paper, titled Strength in Numbers: How Does Data-Driven Decisionmaking Affect Firm Performance? The abstract of the paper (which deals with the private sector) is as follows:
We examine whether performance is higher in firms that emphasize decisionmaking based on data and business analytics (which we term a data-driven decisionmaking approach or DDD). Using detailed survey data on the business practices and information technology investments of 179 large publicly traded firms, we find that firms that adopt DDD have output and productivity that is 5-6% higher than what would be expected given their other investments and information technology usage. Using instrumental variables methods, we find evidence that these effects do not appear to be due to reverse causality. Furthermore, the relationship between DDD and performance also appears in other performance measures such as asset utilization, return on equity and market value. **Our results provide some of the first large scale data on the direct connection between data-driven decisionmaking and firm performance** [bold added].
In other words, firms that use data to make decisions are 5 to 6% more efficient — and it is not that they use data because they are more efficient. (The conclusion of the paper states that DDD is also associated with higher Return on Equity and better asset allocation, but not necessarily with improvements in Return on Assets or profit margin.)
But for us the core message is that collecting and using data to make decisions means higher efficiency. While 5% to 6% is not massive, let’s remember that we are only at the beginning of this process and that improvements over time build upon each other (it took surprisingly long for the efficiencies of PCs themselves to have a real impact, with training lags and other adoption issues causing delays). And, surely, courts, legal aid, and pro bono organizations are way, way behind in this.
How many of our organizations can answer yes to these questions?
- Do you use data about incoming cases to change outreach services?
- Do you use data about differences in case-handling times to look at where and why time is being spent or saved?
- Do you look at outcomes to evaluate your service protocols?
- Do you look at workflow patterns to identify areas of inefficiency?
- Do you compare data on different needs for services to make sure that the most efficient techniques and delivery mechanisms are used?
Ways to do many of these things are suggested in Wayne Moore’s new book, discussed in a NewsMaker Interview (part one with part two to follow soon) on this blog.
This article, and this data, should be used to encourage data-driven decision-making throughout access to justice organizations. In these times, we cannot afford not to do so. Indeed, there is an argument that, because so much of our capacity to attract resources is not market-driven but rather depends on our ability to tell the story of the broad impact of our work, we stand to gain even more than the private sector from gathering and using this data.