A few years ago, when I started DFA, I wrote an article aggregating many of the studies on failure rates for big data, analytics, and now AI projects. It serves as a reminder that you can keep throwing money at data projects, but if you don't focus on the people involved, you can easily fail in the last mile. (Incidentally, this is also the most viewed article on the DFA website!) Today, I added yet another bullet to that list.
Recently, two leaders at the VentureBeat Transform 2019 conference discussed a finding that 87% of data science projects never make it into production. For the tech companies and leaders on this list, this is probably less relevant, since your whole product depends on shipping software, but for many of you working on internal data initiatives, I think this article summing up the panel with Deborah Leff (CTO for Data Science @ IBM) and Chris Chapo (SVP of Data and Analytics for The Gap) is telling.
"One of the biggest [reasons] is sometimes people think, all I need to do is throw money at a problem or put a technology in, and success comes out the other end, and that just doesn't happen," Chapo said. ... But now that [data science is] a team sport, and the importance of that work is now being embedded into the fabric of the company, it's essential that every person on the team is able to collaborate with everyone else: the data engineers, the data stewards, people that understand the data science, or analytics, or BI specialists, all the way up to DevOps and engineering. "This is a big place that holds companies back because they're not used to collaborating in this way," Leff says.
I'm highlighting this because creating a new tool/service/app/product like this is not just a technology initiative; it's a people initiative. The design process forces us to empathize with people, and it necessarily includes *all* the people: internal stakeholders and players, as well as the end users/customers of the service (suspiciously absent from the list above). Design creates an environment where innovation can emerge more quickly, in part by clearly articulating the problems and needs to the product/service development team so that blockers and concerns about the final result can be addressed earlier.
For example, just today, as part of my current research on a new training offering, I learned that one of my colleague's biggest concerns is a stagnant IT org that does not want to innovate and focuses primarily on risk avoidance, security, and compliance. On top of that, they have created predictive pricing models that aren't being used by the sales team because the team doesn't trust or understand the models yet. (The sales team is, however, spending time vetting the model predictions manually and often arrives at similar findings, but ultimately, this undermines the business value the analytics team intended to deliver.)
Is this an analytics, data, IT, design, or business problem?
*YES.*
If you take a "product" mindset, even if you're working on a non-commercial internal analytics/data science initiative, then ask yourself, who is responsible to ensure the initiative delivers a great result in the last mile where humans start to interact with your AI? If you're a leader, get somebody in place to ensure that these expensive investments in data actually deliver value. If you're not at the top of your org, then consider asking your nearest executive, "who's responsibility is it to make sure that our efforts actually get used? What will we do to prevent creating a useless/ignored AI/analytics service and if we do, how will we address the issue of non-engagement?"
At some point, boards and leaders are going to wise up. A 13% success rate for data projects is no longer going to cut it, and either budgets will be cut, or personnel will be changed.
You might get lucky once in a while, but most innovation is not happening in a box or the smartest mathematician's cube.