Speaking at a recent PublicTechnology webinar, hosted in partnership with QA, two leading voices from government and industry shared advice on how public sector staff can unlock AI’s full potential through curiosity and confidence, not coding
From answering public queries to planning your next holiday, artificial intelligence (AI) is increasingly embedded into our daily routines, both at home and at work. But as civil servants, are we truly ready to harness its potential?
That’s the question tackled in a recent PublicTechnology webinar, hosted in partnership with QA, where two senior voices – Michael Padfield, Head of Strategy at the Incubator for AI (i.AI), and Toby Barnard, Managing Director for Public Sector at QA – shared their views on what it will take for the civil service to get the most out of AI.
As Padfield put it, “We’ve got a duty to the citizen to make the best use of the technology that’s out there, to improve their lives and to deliver better public services.” The conversation quickly moved away from technical jargon and focused instead on a more fundamental question: Are we thinking about AI in the right way?
Barnard was quick to challenge the idea that civil servants need to become technical specialists. “Just be curious,” he said, explaining that public sector workers don’t necessarily need to become programmers, but they do need to explore how AI might help in their day-to-day roles. This sense of curiosity, he argued, is more important than coding skills.
Padfield agreed. “Everyone’s technical in a sense,” he said, highlighting that the real value comes from understanding how AI can be used to deliver better outcomes for citizens. He spoke of the benefits of multidisciplinary teams – those that bring together engineers, policy experts, and users – where each person brings a different skillset to the table, and where curiosity and collaboration can lead to meaningful results.
The mindset that matters most
That spirit of experimentation was echoed in a discussion about prompting: the skill of crafting effective queries to get useful responses from generative AI tools. Padfield explained that his team at i.AI is currently looking at “how we can collate what the best prompts are to be using within the public sector.” Prompting, he said, is something that everyone can – and should – learn, not just the tech experts.
Barnard recalled how his own experience with AI was initially hit and miss. “I wasn’t always getting the right answer,” he said, until he learnt how to use prompts more effectively. He described it as “almost like talking to a child” and likened it to using “Socratic questioning” to guide the AI to a better result.
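To make that concrete, here is a minimal sketch of what that Socratic, step-by-step style of prompting can look like when scripted, using Python and the openai client library. It is purely illustrative – not a tool discussed in the webinar – and the model name and the questions are placeholders:

```python
# A minimal sketch of iterative, "Socratic" prompting with the openai
# Python library. Illustrative only: the model name and the questions
# are placeholders, not examples from the webinar.
from openai import OpenAI

client = OpenAI()  # expects an OPENAI_API_KEY environment variable
history = []       # the running conversation the model sees each turn


def ask(prompt: str) -> str:
    """Add a user turn, get the model's reply, and keep both in the history."""
    history.append({"role": "user", "content": prompt})
    reply = client.chat.completions.create(model="gpt-4o", messages=history)
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer


# Don't settle for the first answer: guide the model step by step.
print(ask("Summarise this policy briefing in five bullet points."))
print(ask("Which of those points carries the greatest financial risk, and why?"))
print(ask("Now rewrite the summary in plain English for a non-specialist reader."))
```

The same back-and-forth works just as well typed directly into a chat interface; the point is to refine the answer with targeted follow-up questions rather than settling for the first response.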
This idea is now being built into QA’s data analyst apprenticeships for HMRC, where learners take part in “prompt thumbs”: team challenges that gradually build prompting skills by solving tasks of increasing complexity. Padfield, too, underlined that even i.AI’s top engineers focus on prompts, saying: “Even our most technical engineers are focused on how we write the best prompts to deliver the output.”
Despite the buzz around AI, both speakers emphasised that core human skills remain crucial, and arguably become even more important as AI assumes more administrative and analytical tasks.
Barnard pointed out that civil servants must continue to develop “critical thinking and problem solving,” warning that we risk losing these skills if we rely too heavily on AI tools. He’s been pushing for these capabilities to be prioritised not just in civil service training, but from primary education onwards.
Padfield agreed, giving the example of passing legislation. While AI might help draft briefings or analyse documents, it can’t replace the work of building political will. “Passing legislation is about influencing Number 10 to want to support your proposal,” he said, “engaging parliamentarians to ensure they vote in your favour,” and managing the relationships and negotiations that make those outcomes possible. “It’s about long-standing relationships, where you understand what motivates different people.” He offered a similar example from social care, where AI transcription tools can enable social workers to focus on their clients rather than paperwork, allowing humans to do what only humans can.
Beyond training: Creating safe spaces to experiment
But making the shift from curiosity to adoption isn’t straightforward. Barnard noted that just rolling out training won’t cut it. People need safe spaces to try new tools without fear of making mistakes or facing reputational risks. “Training isn’t the answer here,” he said bluntly. It’s about changing behaviour, not just delivering content.
Padfield pointed to hackathons as an effective approach. He described one that brought together DSIT, MHCLG and 35 local authorities to tackle challenges in the planning system, with civil servants, policy specialists, digital experts and AI engineers all working side by side. “In that environment of actually building something around a challenge focused for 48 hours, they all learn off each other.” Participants with little technical background discovered what was possible with AI, while technical colleagues gained new insight into policy realities.
Still, concerns about job losses and doubts about the technology persist. Barnard referenced the World Economic Forum’s projection that 83 million jobs may be lost globally by 2027, outpacing the 69 million expected to be created. But he noted that other reports, such as PwC’s, suggest a more balanced picture in the UK. Drawing on the film Hidden Figures, he argued that the key is to focus on how jobs evolve, rather than disappear, and to ensure that workers are supported in adapting and retraining. “It’s about augmenting rather than replacing,” he said, with AI doing “60 to 80% of the heavy lifting” so people can focus on the meaningful parts of their work.
Padfield acknowledged the risks too: “AI can make mistakes,” he said. But he reminded the audience that the civil service isn’t automating perfect systems either. “If we were automating perfect processes, then we wouldn’t be automating them.” Mistakes already happen, and good design – along with “a human in the loop” – can help ensure that AI tools support, not undermine, service delivery.
When asked what helps non-technical users feel confident exploring AI, both speakers emphasised the importance of relevance and accessibility. QA’s “Teach Nation AI” initiative with Microsoft, for example, aims to introduce AI through everyday use cases – like analysing shopping lists or planning holidays – before applying it to work contexts.
Barnard explained that this personal engagement often leads to better understanding and a willingness to experiment. “Just find something that relates to them,” he said. “Cooking, travel, hobbies. Start there.”
Padfield noted that the civil service’s enthusiasm is already high. “We get tens, if not hundreds, of emails a day” from civil servants asking for access to tools, he said. What’s needed now is better access and better design, so those tools meet real needs and are intuitive to use.
He mentioned that the government is piloting an AI Knowledge Hub – a recommendation of the AI Opportunities Action Plan – to give civil servants a single place to find use cases, guidance and prompts. “We’re working hard at this and at pace,” he said.
Reflecting on the discussion, Padfield summed it up neatly: success doesn’t rest on complex tech skills, but on creativity, curiosity, and a focus on citizen impact. Barnard agreed. “Just be curious,” he repeated, a fitting closing note for a conversation rooted in experimentation, learning, and the human side of innovation.