Half of AI Projects Fail
A good rule of thumb is that about half of all AI and analytic projects fail to bring business value. Here are some recent articles that remind us of this:
- Gil Press, writing in Forbes [Press2019], summarized some of the statistics around the failure rate of AI projects. One of the most relevant: “25% of organizations worldwide that are already using AI solutions report up to 50% failure rate; lack of skilled staff and unrealistic expectations were identified as the top reasons for failure [Press2019].”
- John McCormick, writing a column for the Wall Street Journal, discusses a Gartner report: “A just-released Gartner Inc. report found that in the last two years, companies with artificial-intelligence experience moved just 53% of their AI proof of concepts into production. A similar survey in 2018 by the research and advisory firm showed that only 47% of AI proof of concepts were fully deployed [McCormick2020].”
Companies that consistently and repeatedly build models, deploy models, and extract business value from models generally use processes that share some common characteristics. The role of an analytic maturity model is to identify these processes as a first step to improving them.
Two Dimensions of Analytic Maturity
I have been involved in assessing the analytic maturity of organizations for about twenty years. Based upon this experience, in an (open access) article [Grossman2017], I introduced a framework (call it AMM-17) for evaluating the analytic maturity of an organization that was based on the capability maturity model for software [Paulk1993].
I have found this framework quite useful, and it captures many of the important characteristics of developing analytic and AI models, but it doesn’t address an important difference between developing software (the subject of the capability maturity model) and developing analytic and AI models. In most organizations, software is developed by a single department or division, whereas analytics should be used throughout an organization, wherever it can add value.
In this post, I’ll remind readers of the AMM-17 model (for more information, see the post “Improving the Analytic Maturity Level of Your Company”). The first four levels (Analytic Maturity Levels D1 – D4) are best thought of as applying to a single department, center, or business unit developing analytic models. In the section that follows, I’ll introduce a second dimension for evaluating analytic maturity as analytics is replicated throughout an organization (Analytic Maturity Levels E1 – E4).
The Maturity Level of a Project Team, Department or Center to Build and Deploy a Model
The first dimension is the ability of a project, department, or analytic center of excellence to build and deploy an analytic model. There are four essential levels of analytic maturity for a department or similar unit, which we call D1 – D4.
D1: Build Reports. An Analytic Maturity Level (AML) D1 organization can analyze data, build reports summarizing the data, and make use of the reports to further the goals of the organization.
D2: Build and Deploy Models. An AML D2 organization can analyze data, build and validate analytic models from the data, and deploy a model into an organization’s products, services, or internal operations.
D3: Use a Repeatable Process to Build and Deploy Models. An AML D3 organization follows a repeatable process for building, deploying, and updating analytic models. For most organizations, a repeatable process for building and deploying analytic models requires a functioning analytic governance process.
D4: Strategy Driven Repeatable Analytics. A team, no matter how talented, can usually only build a handful of genuinely new analytic models each year. An AML D4 organization has an analytic strategy, aligns the analytic strategy with the organizational strategy, and develops and deploys the models as prioritized by the analytic strategy.
Of course, with the appropriate automation, any given model or category of model can be built many times, but when something new is required, it takes a team, and teams are usually what is in short supply. At analytic maturity level D4, the effort of the team is spent on the right models, that is, the models that bring the most value to the organization. In other words, the choice of which models to build is driven by a strategy that is congruent with the organizational strategy.
These four levels are the same as the first four levels of the AMM-17 model described in my earlier post.
Analytic Maturity Levels for Enterprise-Scale Analytics
As companies grow in size, it is critical that analytics be replicated in any department, center, or division where it can add value.
E1: Enterprise support. In an AML E1 organization, enterprise IT supports analytic efforts, so that analytic projects get the data they need, get the computing infrastructure they need, and get the support they need to deploy the models that they build.
E2: Replicate. An AML E2 organization replicates analytics throughout the organization, across the departments and divisions that can benefit from it. Each department and division has the support it needs from enterprise IT services.
E3: Effective analytics governance to coordinate. An AML E3 organization has enterprise analytic services supporting different analytic projects, efforts and groups that provide enterprise data and services when required, while providing the support, flexibility and autonomy each effort requires to move at an optimal velocity. An AML E3 organization has services for enterprise data and metadata that provide a uniform level of integrity for the data, models, and scores across the enterprise.
E4: Holistic management and integration of analytics. An AML E4 organization aligns and integrates the analytic models used by one unit with those used by another unit. For example, in a company that provides credit, analytic models for customer acquisition are aligned with analytic models for risk determination, so that new customers are the right customers and not customers who are likely to default on their payments 18 – 24 months later. As another example, for social media companies, analytic models that rank news items and posts for displaying to users to maintain customer engagement are aligned with analytic models for identifying news items and posts with inappropriate content, so that items prioritized for users are consistent with the organization’s policies and values.
Five Steps to Improve Your Company’s Analytic Maturity
Here are five basic steps to take to understand and quantify the analytic maturity level of your enterprise.
Step 1. Identify where in the organization analytics is done and evaluate the analytic maturity of each of these efforts (using D1 – D4).
Step 2. Identify the enterprise level of analytic maturity that supports these efforts and harmonizes them (using E1 – E4).
Step 3. Review:
- the level of IT support for each analytic effort and how it can be improved
- the effectiveness of analytic governance in helping each of these analytic efforts (vs slowing them down) and how it can be improved.
- the analytic opportunities and analytic risks that fall between the cracks of existing efforts and should be the responsibility of an enterprise analytics office.
Step 4. Identify one to three new analytic efforts that can provide the most value to the organization as a whole.
Step 5. Identify one to three risks associated with analytics that can provide the greatest harm to the organization as a whole.
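To make the assessment concrete, the results of Steps 1 and 2 can be recorded in a simple data structure and summarized. This is a minimal sketch; the department names, maturity levels, and the `MaturityAssessment` class are all hypothetical illustrations, not part of the AMM-17 framework itself.

```python
# Sketch of recording a maturity assessment (Steps 1-2).
# All department names and levels below are hypothetical examples.

from dataclasses import dataclass, field

@dataclass
class AnalyticEffort:
    name: str
    dept_level: int  # D1-D4 maturity, recorded as 1-4

@dataclass
class MaturityAssessment:
    efforts: list = field(default_factory=list)
    enterprise_level: int = 1  # E1-E4 maturity, recorded as 1-4

    def summary(self):
        """Summarize the spread of departmental maturity levels."""
        levels = [e.dept_level for e in self.efforts]
        return {
            "num_efforts": len(levels),
            "min_dept_level": min(levels),
            "max_dept_level": max(levels),
            "enterprise_level": self.enterprise_level,
        }

assessment = MaturityAssessment(
    efforts=[
        AnalyticEffort("marketing", 3),   # D3: repeatable process
        AnalyticEffort("risk", 4),        # D4: strategy-driven
        AnalyticEffort("operations", 1),  # D1: reports only
    ],
    enterprise_level=2,  # E2: analytics replicated across units
)
print(assessment.summary())
```

A large gap between the minimum and maximum departmental levels in the summary is one signal that the reviews in Step 3 should focus on bringing the weakest efforts up, rather than on the enterprise dimension alone.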
Related posts
- Improving the Analytic Maturity of Your Company
- Analytic Governance and Why it Matters
References
[Grossman2017] Robert L. Grossman, “A framework for evaluating the analytic maturity of an organization,” International Journal of Information Management, 1 February 2018, Volume 38, Number 1, pages 45-51, available online 22 September 2017 at https://doi.org/10.1016/j.ijinfomgt.2017.08.005 (open access).
[McCormick2020] John McCormick, “AI Project Failure Rates Near 50%, But It Doesn’t Have to Be That Way, Say Experts,” Wall Street Journal, August 7, 2020, available here.
[Paulk1993] Paulk MC, Curtis B, Chrissis MB, Weber CV. Capability maturity model, version 1.1. IEEE software. 1993 Jul;10(4):18-27.
[Press2019] Gil Press, “This Week In AI Stats: Up To 50% Failure Rate In 25% Of Enterprises Deploying AI,” forbes.com, July 19, 2019, available here.