Analytic Strategy Partners

Improve your analytic operations and refine your analytic strategy



Three Reasons All Corporate Boards Need Someone Who Understands Both Analytic Innovation and Analytic Strategy

January 14, 2021 by Robert Grossman

According to a 2019 report from CB Insights [1], between 2010 and 2019 there were 635 AI acquisitions. The acquisitions break into three groups, as can be seen in the visualization below (Figure 1) from CB Insights. Facebook, Apple, Google, Microsoft, Amazon (FAGMA) and Intel accounted for 67 acquisitions, each making 7 or more acquisitions during the period from 2010 to August 2019 (Group 1). Fifty-two companies made between 2 and 6 acquisitions during this period (Group 2), and 431 companies made a single acquisition (Group 3).

Figure 1. This histogram from CB Insights shows the number of AI acquisitions that each company made during the period 2010 – August 2019. Facebook, Apple, Google, Microsoft, Amazon and Intel accounted for 67 acquisitions, each making 7 or more acquisitions during this period. Fifty-two companies made between 2 and 6 acquisitions, and 431 companies made a single acquisition. Source: CB Insights, retrieved from: https://www.cbinsights.com/research/top-acquirers-ai-startups-ma-timeline/ [1]
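The three-group breakdown above is straightforward to compute from per-company acquisition counts. Below is a minimal Python sketch; the acquirer names and counts are placeholders, not the CB Insights data:

```python
from collections import Counter

# Placeholder counts per acquirer (illustrative only; the real
# figures are in the CB Insights report [1]).
acquisitions = {"Acquirer A": 9, "Acquirer B": 7, "Acquirer C": 4,
                "Acquirer D": 1, "Acquirer E": 1}

def group(count):
    """Bucket an acquirer by its number of AI acquisitions."""
    if count >= 7:          # Group 1: 7 or more acquisitions
        return "Group 1"
    if count >= 2:          # Group 2: between 2 and 6 acquisitions
        return "Group 2"
    return "Group 3"        # Group 3: a single acquisition

print(Counter(group(c) for c in acquisitions.values()))
```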

Reason 1. Analytic and AI strategy is too important for a company not to have someone with board level experience in this area.

Whether it is to ask critical questions about the single AI acquisition that 431 companies made between 2010 and 2019, or about a company’s own analytic and AI efforts, a board member who has experience overseeing deployed analytic and AI applications is important.

It is important to note here the difference between someone who has experience with the entire life cycle of analytics, from project start to deployment, and someone who only has experience developing analytic models. The distinction matters because most analytic projects are never deployed, and many of those that are deployed do not bring the expected value.

I taught a course at the University of Chicago’s Booth School of Business for three years called the Strategy and Practice of Analytics. One of my favorite case studies was HP’s acquisition of the AI company Autonomy in 2011 for $11.1 billion. As the New York Times reported a year later:

“Last week, H.P. stunned investors still reeling from more than a year of management upheavals, corporate blunders and disappointing earnings when it said it was writing down $8.8 billion of its acquisition of Autonomy, in effect admitting that the company was worth an astonishing 79 percent less than H.P. had paid for it [2].”

Reason 2. Board members with experience developing, deploying and operating complex analytic projects have critical experience using technology to innovate, not just operate.

All boards understand the importance of having board members who know how to manage operations and risk to align with corporate strategy and drive financial performance. These days, however, using technological innovation to align with corporate strategy and drive financial performance is just as important. Successful senior leaders in analytics and AI generally have a deep understanding of technology innovation and how to use it to drive financial performance. See Figure 2.

In contrast, many CIOs spend their time managing IT operations, reducing IT costs, and using IT to quantify and control risks, rather than using IT to drive technology innovation and financial performance.

Good analytic leaders find ways to use data, analytics and AI to change a company, not just run it. Having this perspective on a board is very valuable, as is experience with analytic projects that leverage continuous improvement and analytic innovation.

Figure 2. Traditional CIOs who serve on boards can help boards understand how to use IT to improve the efficiency of operations and reduce risk. Senior technical leaders who understand data and analytics can help boards understand how technical innovation can align with corporate strategy and improve financial performance.

Reason 3. An analytic perspective for a board member helps with evaluating cybersecurity and digital transformation, both critical topics for many boards.

A 2017 Deloitte study found that:

“high-performing S&P 500 companies were more likely (31 percent) to have a tech-savvy board director than other companies (17 percent). The study also found that less than 10 percent of S&P 500 companies had a technology subcommittee and less than 5 percent had appointed a technologist to newly opened board seats. … Historically, board interactions with technology topics often focused on operational performance or cyber risk. The Deloitte study found that 48 percent of board technology conversations centered on cyber risk and privacy topics, while less than a third (32 percent) were concerned with technology-enabled digital transformation.” Source: Khalid Kark et al, Technology and the boardroom: A CIO’s guide to engaging the board (emphasis added) [3].

A senior analytics executive with experience supporting cybersecurity is a double win for a board. Even without this experience, behavioral analytics plays an important role in cybersecurity for a large enterprise, and senior analytics executives almost always have experience in behavioral analytics.

The remote work caused by the COVID-19 pandemic has accelerated the importance of board-level understanding of digital transformation. As a Wall Street Journal article from October 2020 puts it:

“If you didn’t have a digital strategy, you do now or you don’t survive,” said Guillermo Diaz Jr., chief executive officer at software firm Kloudspot, and a former chief information officer at Cisco Systems Inc. “You have to have a digital strategy and digital culture, and a board that thinks that way,” he said. Source: Angus Loten, Many Corporate Boards Still Face Shortage of Tech Expertise, Wall Street Journal [4].

It is hard to imagine a digital strategy without an analytic strategy. Chapter 8 of my book Developing an Analytic Strategy: A Primer [5] describes seven common strategy tools that can be easily adapted to develop an analytic or AI strategy, including SWOT, the Ansoff Matrix, the experience curve and blue ocean strategies.

References

[1] CBInsights, The Race For AI: Here Are The Tech Giants Rushing To Snap Up Artificial Intelligence Startups, CB Insights, September 17, 2019. Retrieved from: https://www.cbinsights.com/research/top-acquirers-ai-startups-ma-timeline/.

[2] James B. Stewart, From H.P., a Blunder That Seems to Beat All, New York Times, Nov. 30, 2012. Retrieved from: https://www.nytimes.com/2012/12/01/business/hps-autonomy-blunder-might-be-one-for-the-record-books.html

[3] Khalid Kark, Minu Puranik, Tonie Leatherberry, and Debbie McCormack, CIO Insider: Technology and the boardroom: A CIO’s guide to engaging the board, Deloitte Insights, February 2019. Retrieved from: https://www2.deloitte.com/us/en/insights/focus/cio-insider-business-insights/boards-technology-fluency-cio-guide.html

[4] Angus Loten, Many Corporate Boards Still Face Shortage of Tech Expertise: But more CIOs are expected to earn a seat as the pandemic forces companies to lean on digital, Wall Street Journal, Oct. 12, 2020. Retrieved from: https://www.wsj.com/articles/many-corporate-boards-still-face-shortage-of-tech-expertise-11602537966

[5] Robert L. Grossman, Developing an Analytic Strategy: A Primer, 2020.


Five Steps to Improve the Analytic Maturity of Your Company – 2021 Edition

December 14, 2020 by Robert Grossman

Half of AI Projects Fail

A good rule of thumb is that about half of all AI and analytic projects fail to bring business value. Here are some recent articles that remind us of this:

  • Gil Press, writing in Forbes [Press2019], summarized some of the statistics around the failure rate of AI projects. One of the most relevant: "25% of organizations worldwide that are already using AI solutions report up to 50% failure rate; lack of skilled staff and unrealistic expectations were identified as the top reasons for failure [Press2019]."
  • John McCormick, writing a column for the Wall Street Journal, discusses a Gartner report: “A just-released Gartner Inc. report found that in the last two years, companies with artificial-intelligence experience moved just 53% of their AI proof of concepts into production. A similar survey in 2018 by the research and advisory firm showed that only 47% of AI proof of concepts were fully deployed [McCormick2020].”

Companies that consistently and repeatedly build models, deploy models and extract business value from models generally use processes that share some common characteristics. The role of an analytic maturity model is to identify these processes as a first step to improving them.

Two Dimensions of Analytic Maturity

I have been involved in assessing the analytic maturity of organizations for about twenty years. Based upon this experience, I introduced a framework (call it AMM-17) in an open access article [Grossman2017] for evaluating the analytic maturity of an organization, based on the software capability maturity model [Paulk1993].

I have found this framework quite useful, and it captures many of the important characteristics of developing analytic and AI models, but it doesn’t address an important difference between developing software (the subject of the capability maturity model) and developing analytic and AI models. In most organizations, software is developed by a single department or division, whereas analytics should be used throughout an organization, wherever it can add value.

In this post, I’ll remind readers of the AMM-17 model (for more information, see the post “Improving the Analytic Maturity Level of Your Company”). The first four levels (Analytic Maturity Levels D1 – D4) can be best thought of as applying to a single department, center or business unit developing analytic models. In the section that follows, I’ll introduce a second dimension of evaluating the analytic maturity as analytics is replicated throughout an organization (Analytic Maturity Levels E1 – E4).

The Maturity Level of a Project Team, Department or Center to Build and Deploy a Model

The first dimension is the ability of a project team, department or analytic center of excellence to build and deploy an analytic model. There are four essential levels of analytic maturity for a department or similar unit, which we call D1 – D4.

D1: Build Reports. An Analytic Maturity Level (AML) D1 organization can analyze data, build reports summarizing the data, and make use of the reports to further the goals of the organization.

D2: Build and Deploy Models. An AML D2 organization can analyze data, build and validate analytic models from the data, and deploy a model into an organization’s products, services, or internal operations.

D3: Use a Repeatable Process to Build and Deploy Models. An AML D3 organization follows a repeatable process for building, deploying and updating analytic models. For most organizations, a repeatable process for building and deploying analytic models usually requires a functioning analytic governance process.

D4: Strategy Driven Repeatable Analytics. A team, no matter how talented, can usually only build a handful of genuinely new analytic models each year. An AML D4 organization has an analytic strategy, aligns the analytic strategy with the organizational strategy, and develops and deploys the models as prioritized by the analytic strategy.

Of course, with the appropriate automation, any given model or category of model can be built many times, but when something new is required, it takes a team, and teams are usually what is in short supply. At analytic maturity level D4, the effort of the team is spent on the right models, that is, the models that bring the most value to the organization. In other words, the choice of which models to build is driven by a strategy that is congruent with the organizational strategy.

These four levels are the same first four levels in the AMM-17 model described in my earlier post.
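The D1 – D4 levels are cumulative, so a department’s level can be read off as the highest consecutive level whose capability is in place. A small Python sketch of this reading (my own encoding, not part of the AMM-17 paper):

```python
# The D levels in order; each level assumes the ones below it.
D_LEVELS = [
    ("D1", "builds reports summarizing data"),
    ("D2", "builds, validates and deploys analytic models"),
    ("D3", "follows a repeatable process to build, deploy and update models"),
    ("D4", "prioritizes models with an analytic strategy aligned to organizational strategy"),
]

def department_maturity(capabilities):
    """Return the highest consecutive D level whose capability is present."""
    level = "below D1"
    for name, capability in D_LEVELS:
        if capability not in capabilities:
            break
        level = name
    return level

# A unit that deploys models via a repeatable process, but without an
# analytic strategy, sits at D3.
caps = {c for _, c in D_LEVELS[:3]}
print(department_maturity(caps))  # D3
```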

Analytic Maturity Levels for Enterprise-Scale Analytics

As companies grow in size, it is critical that analytics be replicated in any department, center or division where it can add value.

E1: Enterprise support. An AML E1 organization provides enterprise IT support for analytic efforts, so that analytic projects get the data they need, the computing infrastructure they need, and the support they need to deploy the models that they build.

E2: Replicate. An AML E2 organization replicates analytics throughout the organization, across the departments and divisions that can benefit from it. Each department or division has the support it needs from enterprise IT services.

E3: Effective analytics governance to coordinate. An AML E3 organization has enterprise analytic services supporting different analytic projects, efforts and groups that provide enterprise data and services when required, while providing the support, flexibility and autonomy each effort requires to move at an optimal velocity. An AML E3 organization has services for enterprise data and metadata that provide a uniform level of integrity for the data, models, and scores across the enterprise.

E4: Holistic management and integration of analytics. An AML E4 organization aligns and integrates the analytic models used by one unit with those used by another unit. For example, in a company that provides credit, analytic models for customer acquisition are aligned with analytic models for risk determination so that new customers are the right customers and not customers who are likely to default on their payments 18 – 24 months later. As another example, for social media companies, analytic models that rank news items and posts for displaying to users to maintain customer engagement are aligned with analytic models for identifying news items and posts with inappropriate content, so that items prioritized for users are consistent with the organization’s policies and values.

Five Steps to Improve Your Company’s Analytic Maturity

Here are five basic steps to take to understand and quantify the analytic maturity level of your enterprise.

Step 1. Identify where in the organization analytics is done and evaluate the analytic maturity of each of these efforts (using D1 – D4).

Step 2. Identify the enterprise level of analytic maturity that supports these efforts and harmonizes them (using E1 – E4).

Step 3. Review:

  • the level of IT support for each analytic effort and how it can be improved;
  • the effectiveness of analytic governance in helping each of these analytic efforts (rather than slowing them down) and how it can be improved;
  • the analytic opportunities and risks that fall between the cracks of existing efforts and should be the responsibility of an enterprise analytics office.

Step 4. Identify one to three new analytic efforts that can provide the most value to the organization as a whole.

Step 5. Identify the one to three risks associated with analytics that could cause the greatest harm to the organization as a whole.

Related posts

  • Improving the Analytic Maturity of Your Company
  • Analytic Governance and Why it Matters

References

[Grossman2017] Robert L. Grossman, “A framework for evaluating the analytic maturity of an organization,” International Journal of Information Management, 1 February 2018, Volume 38, Number 1, pages 45-51, available online 22 September 2017 at https://doi.org/10.1016/j.ijinfomgt.2017.08.005 (open access).

[McCormick2020] John McCormick, “AI Project Failure Rates Near 50%, But It Doesn’t Have to Be That Way, Say Experts”, Wall Street Journal, Aug. 7, 2020, available here.

[Paulk1993] Paulk MC, Curtis B, Chrissis MB, Weber CV. Capability maturity model, version 1.1. IEEE software. 1993 Jul;10(4):18-27.

[Press2019] Gil Press, “This Week In AI Stats: Up To 50% Failure Rate In 25% Of Enterprises Deploying AI,” forbes.com, July 19, 2019, available here.


Machine Learning vs AI Business Models – What’s New with the Economics of AI?

November 12, 2020 by Robert Grossman

The Economics of AI

Ajay Agrawal, Joshua Gans, Avi Goldfarb and Catherine Tucker have organized a series of important and influential conferences on the economics of AI. The proceedings of the 2019 conference are open access and full of interesting perspectives.

Three of the conference organizers (Ajay Agrawal, Joshua Gans, and Avi Goldfarb), all from the University of Toronto’s Rotman School of Management, published a 2018 book for general readers called Prediction Machines [1]. In this book, they view AI systems as prediction machines that dramatically lower the cost of predictions. In principle, as the cost of prediction falls, organizations can make more and better predictions, and hopefully better decisions. They are fundamentally focused on the lower cost of predictions and how that is changing business.

There is absolutely no question that the price of predictions has been falling dramatically. I tend to look at this through a longer and broader perspective of commoditization: 1) the commoditization of compute has been driven by Moore’s Law over the past several decades; 2) the commoditization of software has been driven by open source software and, more recently, by cloud computing and Software as a Service (SaaS); and 3) the commoditization of data has been driven by the exponential growth of new sources of data from the internet, from smart phones, and from IoT/OT devices. For over forty years, analytics has been at the intersection of these three trends, and this confluence has been changing business over the same period. We just keep calling it something different: data intensive statistics in the 1980s, data mining in the 1990s, predictive analytics in the 2000s, and AI in this decade. See [2].

From an economics perspective, the cost of predictions has dropped and continues to drop.

The Economic Challenges of AI

On the other hand, if you are launching an AI start-up or starting an AI initiative, it is important to look at some of the barriers to building a successful AI business. Martin Casado, a partner at Andreessen Horowitz, and his colleagues have written a series of articles that are well worth reading on this topic and the broader economics of AI, including:

  • Martin Casado and Matt Bornstein, The New Business of AI (and How It’s Different From Traditional Software), February 16, 2020.
  • Martin Casado and Matt Bornstein, Taming the Tail: Adventures in Improving AI Economics, August 12, 2020.
  • Martin Casado and Peter Lauten, The Empty Promise of Data Moats, May 9, 2019

In these articles, there are insightful comparisons of AI start-ups to software as a service (SaaS) start-ups. Perhaps the most useful take-home message for those not working in the industry is the following formula from [3]:

Equation 1. An important equation from the article by Martin Casado and Matt Bornstein about the new business of AI [3].

The importance of this formula from Martin Casado and Matt Bornstein’s article “The New Business of AI” cannot be overemphasized. Although many AI start-ups, data science start-ups, and analytic start-ups may initially view themselves as software start-ups, they generally also end up curating data and building models over the data; that is, they find themselves in a services business as well.

The Four Elements of a Successful AI Business

Whether in the era of statistical modeling (1980s), data mining (1990s), predictive modeling (2000s), or AI (2010s), there have always been four critical elements.

Element 1. You need the data and the IT infrastructure to manage it.

Although data is being commoditized, getting the data you need to solve a problem that can be monetized is not always easy. Specialized IT infrastructure may or may not be needed, depending upon the volume and velocity of the data.

Element 2. You need the expertise to clean the data and build the models.

This is often labor intensive and often involves exploratory data analysis, careful cleaning of the data, and experimentation to improve the model.

In addition, some models may take substantial amounts of data and substantial amounts of computing power, raising the cost of the model and its maintenance.

Element 3. You need software to build models.

Depending upon your solution, you may or may not need to develop your own software.

We can summarize these three critical elements with a slight addition to Equation 1 to get Equation 2.

Equation 2. Although data is commoditized, getting the data you need to solve the business problem of interest is often still a problem.

In practice, both Equations 1 and 2 miss a critical element that is at the core of most successful analytic companies.

Element 4. You need a business model that generates enough business value to justify the costs required to collect the data and build the model.

The point to keep in mind here is that the services required to curate data and build models are often labor intensive, and therefore the analytic model must generate enough business value to justify the costs to collect the data, curate the data, understand the data, build the model, improve the model, and manage the edge cases. This is not easy.

This brings us to Equation 3:

Equation 3. What’s needed for an AI business.

Although the economics of AI is dramatically lowering the cost of predictions, finding a business model to provide the foundation for a competitive and sustainable AI business still requires some effort. In addition, as pointed out in the articles by Martin Casado, the resulting AI business generally does not have the margins and scalability of a software company. These two basic facts have been the case for the past forty years.
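The margin point can be made concrete with a toy calculation (the numbers below are hypothetical, not from the a16z articles): because curating data and maintaining models are services whose costs scale with revenue, an AI business’s gross margin sits below that of a pure software business.

```python
def gross_margin(revenue, software_cost, services_cost=0.0):
    """Gross margin after software and (for AI businesses) services costs."""
    return (revenue - software_cost - services_cost) / revenue

# Same revenue and software cost; the AI business also pays for the
# ongoing services of data curation, model building, and edge cases.
saas_margin = gross_margin(revenue=10_000_000, software_cost=2_000_000)
ai_margin = gross_margin(revenue=10_000_000, software_cost=2_000_000,
                         services_cost=3_000_000)
print(f"SaaS-like margin: {saas_margin:.0%}, AI margin: {ai_margin:.0%}")
# SaaS-like margin: 80%, AI margin: 50%
```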

In my book Developing an Analytic Strategy: A Primer [4], I take a slightly simpler perspective. These days, with cloud computing and Software as a Service, the software is usually not the critical path. If you have the data, the expertise, and the business model, you can generally succeed. I call this the DEB Framework.

  • Data. Is the data (“D”) required for your analytic strategy available? If not, do you have a realistic plan for getting it?
  • Expertise. Is the expertise (“E”) required for processing and transforming your data available? Does this expertise include people who have developed, deployed, operated, and maintained similar models?
  • Business model. Have you identified a business model (“B”) for monetizing the data or extracting the required value that is sustainable, provides compelling competitive advantages, and can be protected from current competitors and future new entrants into the market?
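The three DEB questions reduce to a simple gate: an analytic effort is ready only when all three are answered yes. A minimal sketch (my own encoding, not from the book):

```python
def deb_check(has_data, has_expertise, has_business_model):
    """Return (ready, missing) for the three DEB framework questions."""
    missing = [name for ok, name in (
        (has_data, "data"),
        (has_expertise, "expertise"),
        (has_business_model, "business model"),
    ) if not ok]
    return (not missing, missing)

ready, missing = deb_check(has_data=True, has_expertise=True,
                           has_business_model=False)
print(ready, missing)  # False ['business model']
```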

References

[1] Agrawal, Ajay, Joshua Gans, and Avi Goldfarb. Prediction machines: the simple economics of artificial intelligence. Harvard Business Press, 2018.

[2] Robert L. Grossman, The Structure of Digital Computing, Open Data Press, 2012.

[3] Martin Casado and Matt Bornstein, The New Business of AI (and How It’s Different From Traditional Software), February 16, 2020.

[4] Robert L. Grossman, Developing an Analytic Strategy: A Primer, 2020.

Notes About Links

There are no affiliate links in this post, and I get no revenue from the Amazon links. I do get a royalty from the sale of my books.


Why Great Machine Learning Models are Never Enough: Three Lessons About Data Science from Dr. Foege’s Letter

October 12, 2020 by Robert Grossman

Figure 1. William H. Foege, MD, MPH standing next to the bust of Hygeia, the Greek goddess of health on the grounds of a CDC facility in Atlanta. Dr. Foege was the Director of the CDC from 1977 until 1983.
Source: https://phil.cdc.gov/Details.aspx?pid=8149.

Foege’s Letter

In September 2020, William H. Foege, MD, MPH sent a private letter to Robert Redfield, the Director of the CDC, reminding him that the “best decisions come from the best science” and the “best results come from the best management.” The letter became public on October 6, 2020 in a USA Today article written by Brett Murphy and Letitia Stein, and it is well worth reading.

In this post, we look at how these insights apply to building analytic and AI models and applying them to challenging real world problems.

Dr. Foege trained in the Epidemic Intelligence Service (EIS) of the Centers for Disease Control and Prevention (CDC) between 1962 and 1964. The EIS is a fellowship program run by the CDC that trains epidemiologists and is famous for the quality of the epidemiologists it trains and for the effectiveness of its investigative and emergency response efforts. In the 1970s, Dr. Foege made critical contributions to the global strategy that led to the eradication of smallpox, culminating in the May 8, 1980 declaration at the 33rd World Health Assembly (WHA) that the world was free of this disease. Smallpox is one of only two diseases that the WHA has designated as eradicated. He served as the Director of the CDC from 1977 to 1983.

Great Science Supports Great Management

To say the least, Dr. Foege is well qualified to understand the role of data and science, management and coalitions, and the leadership necessary to tackle challenging problems, such as the COVID-19 pandemic and how to organize and lead the response to it. In the letter he states:

The first thing … [is] to face the truth. We have learned that the best decisions are based on the best science and the best results are based on the best management. William Foege, MD, MPH, in a letter dated Sept 23, 2020.

From an analytics perspective, I would add two more layers:

  • the best results are based on the best management
  • the best decisions are based on the best science
  • the best models are based on the best data
  • the best data are based on the best data sharing (or data collection efforts)

The first two are the domain of management; the second two are the domain of data science. The role of analytic governance is to knit these together through an analytic strategy and to develop a strategic implementation plan to produce the best results. See Figure 2. For background information about analytic strategy and analytic governance, my Primer may be helpful.

Lessons for Tackling Challenging Data Science and Analytic Problems

From the perspective of this blog, I would highlight three lessons that Dr. Foege’s letter suggests:

  1. When you have a challenging problem, face the truth and speak the truth.
  2. Clearly separate the data science / analytics from the management, and make sure you have the best of both. It is critical that there is sufficient analytic governance and strong enough leadership to guarantee that the best science supports the best management.
  3. Good models require good data, and one of the best ways to get good data is through data sharing collaborations. This is especially important in times of national emergency.
Figure 2. In analytics and AI, the best data sharing and data collection leads to the best data; the best data leads to the best models; the best models lead to the best decisions; the best decisions lead to the best results. The best results require both the best science and the best management. Analytic governance is the governance structure that knits this all together.


