Article • Data Foundation Failures

Why Most AI Pilots Never Leave the Runway

The three structural failures keeping organisations stuck in AI proof-of-concept purgatory, and what it takes to finally scale.

Decision Inc. Data & AI Advisory 2026
Campbell Brown, Managing Director, Decision Inc. Australia
88%
Organisations using AI in at least one function
McKinsey State of AI, 2025
7%
Have fully scaled AI across the enterprise
McKinsey State of AI, 2025
6%
Qualify as genuine high performers
McKinsey State of AI, 2025

Scaling AI in the enterprise is hard to get right. McKinsey's 2025 State of AI report found that 88% of organisations now use AI in at least one business function, but only 7% have fully scaled it across the enterprise and just 6% qualify as genuine high performers generating meaningful financial impact.

Every CIO has heard some version of this conversation. The CEO wants to know what the organisation is doing with AI. The board is asking about competitive positioning. Meanwhile, the technology team is wrestling with a data platform that cost a small fortune but hasn't replaced a single legacy system, a Copilot rollout where half the licensed users have reverted to old habits, and a handful of promising AI pilots that have no clear route to production. And with the continued focus on AI, the gap between ambition and execution is widening, not narrowing.

Based on observations across our more than 200 managed data platforms, and hundreds of client engagements, three structural failures account for the stalling of AI initiatives before they scale.

01: The First Failure

The Data Foundation Is Not Ready

Organisations often reach for AI before their data layer can support it. And even when the organisation is on the front foot, there is often a significant lift required to drive business value from the data foundation deployment. In our recent survey of 200 IT leaders, only around 15% of organisations are operating at analytic scale, where data outcomes directly shape strategy.

Often these projects go well on the surface: executive buy-in is secured, an internal team is assembled and a platform gets stood up that ticks many of the initial boxes. Despite this, projects frequently fail to deliver the expected return on investment.

Cost Savings Aren't Realised

Legacy systems don't get decommissioned, even though decommissioning them is often the key driver for a unified data platform business case alongside AI.

Platform Cost Is Underestimated

Costs balloon beyond forecast, either through poor implementation or because consumption-based pricing behaves differently from legacy licensing models.

Lack of Groundbreaking Insight

The business, which was promised better, faster insight, sees the reports as simply the 'same old' with a slightly different background.

Without remediation, the platform quietly becomes expensive shelfware.

The lesson for CIOs is not to pause on driving unified data as the precursor to delivering on AI. It is to sequence investment more deliberately. A clean, scalable data layer is not a precondition for every AI use case: document comparison and conversational intelligence, for example, can deliver immediate value with minimal data infrastructure. But anything that requires learning from organisational history needs targeted intervention in the foundation before the AI layer is added on top.
The Defining Challenge

"The technology is outpacing the operating model."

02: The Second Failure

The Process Has Not Been Redesigned

AI high performers are 55% more likely to have fundamentally redesigned workflows rather than simply layering tools onto existing ones, and those following four or more deployment best practices achieve roughly 12% cost-efficiency gains versus 5% for others.

"This is arguably the defining failure of the current AI moment: the technology is outpacing the operating model. Organisations are deploying AI into processes that were never designed to use its output, and then wondering why adoption stalls and ROI fails to materialise."

Recently, an organisation built an AI capability to flag poor-quality sales calls in real time. The technology worked perfectly, yet the project stalled because nobody had asked the follow-up question: what do we actually do when we identify these calls? The process had not been reimagined to absorb the output.

A more disciplined approach is to treat process redesign as prerequisite groundwork, not a follow-on. Before deploying an agent or model, map the process end-to-end. Identify the inputs, the outputs, the waste, and the decision points. Then ask where AI genuinely accelerates or improves the process rather than simply augmenting it cosmetically.

Often 70 to 80 percent of the available efficiency gain can be captured through process automation alone before a sophisticated AI is applied. Building on that foundation produces substantially better outcomes than layering AI onto broken workflows.

03: The Third Failure

The Organisation Has Not Come With You

The third failure is the most human, and the most consistently underestimated. Copilot is a useful case study. Organisations license it, connect it to SharePoint, and communicate its availability. Adoption peaks for two to three weeks, then trends downwards. Licensed users who do persist use it for email drafting and document search, leaving the more powerful capabilities entirely untouched.

This is not a failure of the technology. It is a failure of training and change management. The organisations that have done this well run structured AI literacy programmes before rollout, select early adopters deliberately rather than randomly, and invest in hands-on upskilling that connects AI capability to the specific jobs people actually do. That investment in capability is what determines whether an AI investment generates a return at all.

04: The Path Forward

What Good Looks Like

The organisations generating real returns from AI share a common characteristic: they are thinking about it as an operating model question, not a technology question. They have defined how AI will function within the organisation (centralised versus federated governance, decision rights, a delivery lifecycle with clear gates) before they start deploying at scale. They treat each initiative as an asset to be catalogued and reused rather than a point solution that delivers once and disappears.

Perhaps most importantly, the CIOs who are moving fastest are the ones who have resisted the pressure to move in all directions simultaneously. They have identified the pockets of the organisation where the data is genuinely ready, the processes are well-understood, and the appetite for change is high, and they have gone deep there first.

"Concentrated success is more valuable than distributed experimentation, and it creates the proof points that unlock broader organisational commitment."

In Summary

The window for competitive differentiation through AI is real, but it is not unlimited. The organisations that will emerge with durable advantages are not necessarily the ones moving fastest.

They are the ones moving with the most structural discipline: getting the data right where it matters, redesigning processes before automating them, and investing as seriously in their people's capability as they do in the technology itself.

Let's Talk

Ready to Move Beyond Pilot Purgatory?

Take our Data Maturity Survey to benchmark your organisation against the '6% High Performers' across data, process, and people.