Workplace Insights by Adrie van der Luijt

AI in public services

The machines won’t save us, unless we tell them how

A veteran content designer reflects on why AI in public services is not a silver bullet, despite the hype, and offers a grounded path forward.

I’ve been around long enough to remember carbon paper and fax machines. I also remember being told, with total certainty, that the internet would kill paper, that cables were obsolete, and that government digital services would soon be “one front door” where all your needs were met.

Now we’re being told AI will revolutionise public services.

Tony Blair’s latest call to arms, as reported in the New Statesman, is the kind of breathless optimism that’s followed every new technology since the microchip. According to him, AI will deliver “radically better services at less cost.” No doubt that sounds good in a keynote. But I’ve spent decades inside public services, digital projects and tech implementations. And let me tell you: nothing improves unless time, money and genuine expertise are invested.

Let’s be clear. I’m not anti-AI. I use it every day. I even coach content designers and service teams in how to use it safely, ethically and productively. But we must stop acting as though AI will simply do the work for us. Especially in public services, where the stakes are high, emotions run deep and the margin for error is razor-thin.

Here’s what I’ve seen

In the 1990s, we faxed entire policy documents across departments, sometimes re-faxing them because the toner had run out halfway or two sheets went through at the same time. I’ve watched content go from typeset leaflets to responsive services with a chatbot attached. And I remember helping frontline staff deal with the human consequences of poor digital rollouts.

Digital transformation didn’t fail because the tech didn’t work. It failed because no one invested enough in training, culture, or design that treats people like… people. AI will be no different.

In fact, it could be worse. Because the illusion that the machine knows best makes it harder to challenge bad decisions. If you’ve ever tried to argue with a chatbot about your benefits, you’ll know exactly what I mean.

“Efficiency” isn’t neutral

Tony Blair says AI will make government more efficient. But efficiency, in the absence of compassion, leads to error and harm. If you’ve ever watched someone break down trying to fill in a Universal Credit journal, you’ll know that better outcomes don’t come from faster processes. They come from better-designed systems that understand how humans think and feel, especially when they’re under pressure.

A new washing machine didn’t stop my grandmother from working. She simply had more time to do other forms of work. (The ironing didn’t vanish, the children still needed feeding.) AI may well free up some resources, but the question is always: what happens to the time we save? Will it be reinvested into better human support? Or quietly shaved off the payroll?

What we need instead

What public services need isn’t just smarter tech. They need emotionally intelligent, trauma-aware, cross-disciplinary teams who understand the real causes of failure and how to address them. That means:

  • Content designers who write with clarity and compassion.
  • Service designers who work with users, not just data.
  • AI experts who understand ethics, bias, and responsibility.
  • Leaders who accept that trust isn’t a deliverable. It’s earned.

We also need to stop designing for the imaginary “average user” and face the reality of post-lockdown cognition: more people are overwhelmed, more easily triggered and more at risk of being shut out.

Let’s do it properly or not at all

AI is a tool. Like every tool, its impact depends on the craft of the people using it. If the UK government is serious about AI in public services, it needs to stop commissioning white papers and start hiring people who know what they’re doing.

And for the record, I’m free if they need me.


Adrie van der Luijt is CEO of Trauma-Informed Content Consulting. Kristina Halvorson, CEO of Brain Traffic and Button Events, has praised his “outstanding work” on trauma-informed content and AI.

Adrie advises organisations on ethical content frameworks that acknowledge human vulnerability whilst upholding dignity. His work includes:

  • developing the UK’s national drink and needle spiking advice service used by 81% of police forces in England and Wales – praised by victim support organisations
  • creating user journeys for 5.6 million people claiming Universal Credit and pioneering government digital standards for transactional content on GOV.UK
  • restructuring thousands of pages of advice for Cancer Research UK’s website, which serves four million visitors a month.