Australia’s productivity slowdown has become impossible to ignore — and artificial intelligence could be the catalyst that changes everything, if we have the courage and capability to deploy it wisely.
Australia’s productivity challenge is no longer a slow burn; it’s a flashing red light. As we gather at the CEDA AI Leadership Summit in Brisbane, the question isn’t whether artificial intelligence can help; it’s whether we’re bold enough to use it properly.
AI has exploded into the mainstream. From ChatGPT to Copilot, it’s reshaping how individuals work. But here’s the uncomfortable truth: despite billions in global investment, 95 per cent of organisations are seeing zero return. The problem isn’t the tech; it’s us.
There’s a growing recognition of what some are calling the “AI divide”: the chasm between experimentation and transformation. Most deployments fail not because models are weak, but because they’re static, siloed and disconnected from real workflows. In the public sector, this is especially dangerous. Governments risk mistaking flashy demos for genuine reform.
The recent reversal by Commonwealth Bank, which cut 45 call centre jobs citing AI, only to reinstate them after call volumes surged, should be a wake-up call. AI isn’t a shortcut to cost-cutting; it’s a system that must be designed to serve people, not replace them. The backlash from unions and the Fair Work Commission shows what happens when implementation outpaces consultation.
The MIT NANDA report, The GenAI Divide: State of AI in Business 2025, warns of a “shadow AI economy” where employees trust general-purpose tools more than internal systems. That’s a trust gap, and in government, trust is everything. If we don’t embed AI into the fabric of public service, we’ll lose the opportunity to rebuild it.
Treasurer Jim Chalmers’ recent roundtable made it clear: AI is now a national priority. The government is accelerating work on both an APS AI plan and a national AI strategy, with a focus on modernising procurement, improving data use and conducting a legislative gap analysis. This is a moment of alignment between policy ambition and technological capability.
But Australia faces a unique challenge. The University of Melbourne and KPMG’s joint report, Trust, Attitudes and Use of Artificial Intelligence: A Global Study 2025, reveals we are more cautious, less trusting and less confident in AI than many other nations. Only 36 per cent of Australians say they trust AI, and just 17 per cent report high acceptance, among the lowest globally. Concern about AI risks is high (78 per cent), and only 10 per cent are aware of existing AI regulations. In contrast, countries like Nigeria, India and China report trust levels above 60 per cent, and far greater optimism about AI’s benefits.
This caution isn’t a weakness; it’s a signal. Australians want AI to be safe, ethical and well-governed. They expect oversight by government and regulators, co-regulation with industry, and international standards. The public mandate is clear: trust must be earned, not assumed.
KPMG sees this moment as a turning point. The public sector can leapfrog by learning from private sector missteps and focusing on systems that serve citizens, not just automate processes. Here’s what we believe must happen next:
Australia’s public sector has a unique opportunity to lead. But we must act fast and act smart. The AI boom is real. So is the risk of missing it. Let’s not be the country that watched the future happen. Let’s be the one that built it.
Lisa Jenkinson is a strategic leader driving data and AI transformation across government and the healthcare sector. As the Queensland lead for KPMG’s Powered Data & AI practice, Lisa delivers end-to-end solutions, from strategy and governance to scaled AI implementation, that unlock real productivity gains.