Bridging the AI divide: a national imperative for productivity

Australia’s productivity slowdown has become impossible to ignore — and artificial intelligence could be the catalyst that changes everything, if we have the courage and capability to deploy it wisely.

Australia’s productivity challenge is no longer a slow burn; it’s a flashing red light. As we gather at the CEDA AI Leadership Summit in Brisbane, the question isn’t whether artificial intelligence can help; it’s whether we’re bold enough to use it properly.

AI has exploded into the mainstream. From ChatGPT to Copilot, it’s reshaping how individuals work. But here’s the uncomfortable truth: despite billions in global investment, 95 per cent of organisations are seeing zero return. The problem isn’t the tech; it’s us.

There’s growing recognition of what some are calling the “AI divide”: the chasm between experimentation and transformation. Most deployments fail not because models are weak, but because they’re static, siloed and disconnected from real workflows. In the public sector, this is especially dangerous. Governments risk mistaking flashy demos for genuine reform.

The recent reversal by Commonwealth Bank, which cut 45 call centre jobs due to AI only to reinstate them after call volumes surged, should be a wake-up call. AI isn’t a shortcut to cost-cutting; it’s a system that must be designed to serve people, not replace them. The backlash from unions and the Fair Work Commission shows what happens when implementation outpaces consultation.

The MIT NANDA report, The GenAI Divide: State of AI in Business 2025, warns of a “shadow AI economy” where employees trust general-purpose tools more than internal systems. That’s a trust gap, and in government, trust is everything. If we don’t embed AI into the fabric of public service, we’ll lose the opportunity to rebuild it.

Treasurer Jim Chalmers’ recent roundtable made it clear: AI is now a national priority. The government is accelerating work on both an APS AI plan and a national AI strategy, with a focus on modernising procurement, improving data use and conducting a legislative gap analysis. This is a moment of alignment between policy ambition and technological capability.

But Australia faces a unique challenge. The University of Melbourne and KPMG’s joint report, Trust, Attitudes and Use of Artificial Intelligence: A Global Study 2025, reveals we are more cautious, less trusting and less confident in AI than many other nations. Only 36 per cent of Australians say they trust AI, and just 17 per cent report high acceptance, among the lowest globally. Concern about AI risks is high (78 per cent), and only 10 per cent are aware of existing AI regulations. In contrast, countries like Nigeria, India and China report trust levels above 60 per cent, and far greater optimism about AI’s benefits.

This caution isn’t a weakness; it’s a signal. Australians want AI to be safe, ethical and well-governed. They expect oversight by government and regulators, co-regulation with industry, and international standards. The public mandate is clear: trust must be earned, not assumed.

KPMG sees this moment as a turning point. The public sector can leapfrog ahead by learning from private sector missteps and focusing on systems that serve citizens, not just automate processes. Here’s what we believe must happen next:

  1. Stop building, start embedding: Internal AI builds fail twice as often as vendor-partnered deployments. Governments should partner with vendors who understand public workflows and co-develop systems that deliver value from day one.
  2. Trust the frontline: The best use cases come from “prosumers”, employees already using AI informally. Let them lead. They know where the friction is.
  3. Design for learning: Static tools stall. Agentic systems, which retain memory, learn from feedback and integrate across platforms, are the future of public service delivery.
  4. Protect the basics: As unions rightly argue, AI must be implemented with safeguards. Workforce consultation isn’t a checkbox; it’s a prerequisite for trust and success.
  5. Regulate with purpose: The government’s decision to review existing legislation for gaps rather than rush into a standalone AI Act is wise. Regulation must balance innovation with accountability.

Australia’s public sector has a unique opportunity to lead. But we must act fast and act smart. The AI boom is real. So is the risk of missing it. Let’s not be the country that watched the future happen. Let’s be the one that built it.

About the author

Lisa Jenkinson

Lisa Jenkinson is a strategic leader driving data and AI transformation across government and the healthcare sector. As the Queensland lead for KPMG’s Powered Data & AI practice, Lisa delivers end-to-end solutions, from strategy and governance to scaled AI implementation, that unlock real productivity gains.