Same tools. Same models. Same budget. But the results? A 7.2x gap.
PwC looked at 1,217 executives across 25 industries and found that 74% of AI's economic value is being captured by just 20% of companies. The dividing line wasn't tools — it came down to one strategic choice.
What did PwC actually find?
This is PwC's global AI Performance Study, published April 2026. They surveyed 1,217 director-and-above executives across 25 sectors in October–November 2025, then scored 60 AI management and investment practices into what they call an "AI fitness index" — and linked each to actual revenue and efficiency outcomes.
The result is striking. 74% of AI's economic value is being captured by just 20% of organizations. And they're not just doing slightly better.
Leaders are seeing 7.2x more AI-driven revenue and efficiency gains than the average competitor, 4 percentage points higher profit margins, and 25–40% productivity gains in AI-augmented workflows. The median company? Just 3–7%.
PwC global chair Mohamed Kande put it this way: "2026 is shaping up as a decisive year for AI. A small group of companies are already turning AI into measurable financial returns, while many others are still struggling to move beyond pilots."
Same tools — so why the different outcomes?
Here's the interesting part. The gap doesn't come from how much you spend on AI or how sophisticated your tools are. The real fork between leaders and laggards is where they point AI.
Laggards mostly use AI for cost reduction and efficiency — doing the same things faster, with fewer people. Leaders use AI for new revenue and business reinvention instead. Specifically, using AI to capture industry-convergence opportunities is the single strongest factor influencing AI-driven financial performance.
| | Laggards (majority) | AI leaders (top 20%) |
|---|---|---|
| Primary use of AI | Productivity & cost cutting | New revenue & business reinvention |
| AI for industry-convergence opportunities | Rare | 2–3× more likely |
| Responsible AI framework in place | Rare | 1.7× more likely |
| Cross-functional AI governance board | Rare | 1.5× more likely |
| Employees trusting AI outputs | Lower | 2× higher |
| Growth in decisions made without human intervention | Lower | 2.8× higher |
What's counterintuitive: stronger governance actually correlates with more autonomous decisions, not fewer. That's the key insight from PwC's analysis. It's a compounding loop — trust → more automation → faster learning.
Why "growth" beats "productivity" so decisively
Productivity gains are a shared game: if you become more efficient, competitors eventually catch up and your margins get normalized. New revenue is a creation game. Cross an industry boundary and open a new market, and the advantage compounds in a way competitors can't easily close.
Kande also said this at Davos, citing a separate PwC survey: "Somehow AI moves so fast that people forgot — adoption of technology has to go back to the basics. Clean data. Solid business processes. Governance." Not the shiny tools. The foundations.
How to start — practical steps
- Re-examine why you're using AI in the first place. Pull up your current AI project list. Classify each as "cost reduction" or "new revenue." If under 30% are in the latter bucket, you're closer to the laggard pattern.
- Hunt for revenue opportunities beyond your industry's borders. Leaders don't just chase efficiency inside their own lane. They look at adjacent industries' customer pain points and use AI to address them as a new revenue line. Fintech moving into insurance pricing, or healthcare using financial behavioral data: that's the starting shape.
- Build a lightweight Responsible AI framework and governance board. Don't make it grand. A one-page document on where you use AI and where you don't (e.g. not in HR decisions), who owns it, and what pre-use checks are required. Plus a small review group with one person each from legal, tech, business, and HR. Leaders are 1.5–1.7× more likely to have this.
- Make AI outputs trustworthy enough for your team. You need citations, source displays, and interfaces where users can verify and edit results. The 2× gap in trust feeds directly into the 2.8× gap in autonomous decisions.
- Self-audit your "AI fitness index" every quarter. You don't need to score all 60 items. Just look at the two dimensions, AI use and AI foundations, figure out where you're weaker, and make that next quarter's priority. That's the fastest way to close the learning gap.
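The first step above (classify projects, check the 30% threshold) is simple enough to script. A minimal sketch, with entirely hypothetical project names and categories, just to show the shape of the audit:

```python
# Hypothetical AI project list: map each project to its primary aim.
# "new_revenue" vs "cost_reduction" is the split PwC's study highlights.
projects = {
    "invoice OCR automation": "cost_reduction",
    "support chatbot deflection": "cost_reduction",
    "AI-priced insurance add-on": "new_revenue",
    "code review assistant": "cost_reduction",
}

# Share of projects aimed at new revenue rather than efficiency.
revenue_share = sum(
    1 for category in projects.values() if category == "new_revenue"
) / len(projects)

print(f"new-revenue share: {revenue_share:.0%}")
if revenue_share < 0.30:
    print("Closer to the laggard pattern: rebalance toward new revenue.")
```

With the sample list above, the share comes out at 25%, below the 30% line, so the warning fires. Swap in your real project list and re-run each quarter alongside the fitness-index self-audit.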
Want to go deeper?
- The full PwC 2026 AI Performance Study press release: all the key numbers (74%, 7.2×, 25–40%) sourced from the original (pwc.com)
- "Want ROI from AI? Go for growth" (PwC PDF): the full report unpacking the "fitness index" and "industry convergence" concepts (pwc.com)
- Mohamed Kande interview (Fortune): pairs the 56%-of-companies-getting-nothing finding with the PwC chair's broader diagnosis (fortune.com)
- Bernews, "PwC finds widening AI performance gap": a one-page summary of the same research, good for a quick scan (bernews.com)
- EME Outlook analysis, "Why AI value concentrates in 20%": a structural breakdown of leaders vs. laggards by industry (emeoutlookmag.com)