
Workplace Insights by Adrie van der Luijt
The digital content world has a new false dichotomy: AI versus human empathy. As someone who’s built trauma-informed content services like Police.uk’s spiking information portal and rewritten 5,000 pages of cancer content for Cancer Research UK, I can confidently say this framing misses the point entirely.
The question isn’t whether AI can replace human empathy, but how it transforms our ability to deploy empathy at scale without destroying ourselves in the process.
My journey to creating trauma-informed content didn’t start in a UX lab. It began in local newspaper newsrooms and Dutch radio, where I had precisely three minutes to get an interviewee to open up about something meaningful. Those constraints taught me something crucial: empathy isn’t just an innate quality. It’s a skill built through deliberate practice and methodology.
Later, as an executive assistant anticipating leaders’ needs before they arose, and launching major online portals where understanding user behaviour was paramount, I refined these empathy skills into systematic approaches. At Director of Finance Online and SME Web, identifying that precise window of opportunity to gain a reader’s attention – be it an early morning newsletter or knowledge repository – required methodical understanding of user contexts.
This systematic approach to empathy is exactly why AI can help us create better trauma-informed content.
The idea that humans have a monopoly on creating empathetic content is fundamentally flawed. The measure of trauma-informed content isn't who created it; it's whether it actually works for the people who need it.
My Police.uk spiking information service didn’t succeed because I personally experienced spiking. It succeeded because I applied rigorous design principles and deeply understood psychological safety needs, principles that can be systematised, taught and yes, encoded into AI prompts.
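What "encoded into AI prompts" can look like in practice: a minimal sketch in Python that front-loads trauma-informed constraints into a reusable system prompt. The principle wording and the `build_system_prompt` helper are illustrative assumptions, not the author's actual framework.

```python
# Illustrative trauma-informed rules; a real set would come from the
# organisation's agreed content principles, not this hard-coded list.
TRAUMA_INFORMED_PRINCIPLES = [
    "Lead with the action the reader can take; never bury it below background.",
    "Use plain, non-judgemental language; avoid blaming phrases ('failed to', 'admitted').",
    "Signal safety early: say what will and will not happen with their information.",
    "Offer choice and control: present options rather than a single demanded path.",
    "Avoid graphic detail that is not needed for the reader to act.",
]

def build_system_prompt(audience: str, task: str) -> str:
    """Assemble a system prompt that states the constraints before the task."""
    rules = "\n".join(f"- {p}" for p in TRAUMA_INFORMED_PRINCIPLES)
    return (
        f"You are drafting content for {audience}.\n"
        f"Task: {task}\n"
        "Follow every rule below; they override style preferences:\n"
        f"{rules}"
    )

prompt = build_system_prompt(
    audience="people reporting a suspected drink spiking",
    task="explain what happens after they contact the police",
)
```

The point is not the specific wording but the discipline: the principles live in one reviewable place and are applied to every draft, rather than being re-remembered by each writer.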
The uncomfortable truth: AI can absolutely create trauma-informed content when directed properly. Here’s how:
I experienced burnout researching domestic abuse. The emotional toll of immersing yourself in traumatic content for weeks is real and devastating. AI transforms this process.
AI doesn’t get traumatised. It doesn’t need therapy after reading 500 accounts of pets being violently attacked to intimidate domestic abuse victims. This preservation of mental health alone is revolutionary for those of us in sensitive domains.
When I rebuilt the Ofsted childcare registration service that had young women in tears, finding the right tone required extensive iteration. AI makes that iteration dramatically faster.
With AI, I can create 20 variations of a sensitive flow and evaluate them, rather than just having bandwidth for 2-3.
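A sketch of that "20 variations" workflow, assuming a hypothetical `generate_variation` stand-in for a real LLM call and a deliberately crude screening score. The scores only shortlist drafts; a human still judges the shortlist.

```python
import random

def generate_variation(base_copy: str, tone: str) -> str:
    """Stand-in for a real model call via an API client; returns one draft."""
    return f"[{tone}] {base_copy}"

def sentence_length_score(text: str) -> float:
    """Crude screening proxy: average words per sentence, lower is better.
    A real workflow would add reading-age measures and human review."""
    sentences = [s for s in text.replace("!", ".").split(".") if s.strip()]
    return len(text.split()) / max(len(sentences), 1)

tones = ["reassuring", "direct", "step-by-step", "neutral"]
base = "You can register even if your circumstances are complicated."
drafts = [generate_variation(base, random.choice(tones)) for _ in range(20)]

# Shortlist the three drafts with the shortest sentences for human evaluation.
shortlist = sorted(drafts, key=sentence_length_score)[:3]
```

The design choice worth noting: automation widens the funnel (20 candidates instead of 2-3), while the final sensitive-tone decision stays with a person.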
Maintaining consistent trauma-aware language across the 5,000 Cancer Research UK pages I rewrote would have been nearly impossible without systematic approaches. AI makes that consistency enforceable.
Yes, humans can do it. But it takes far longer. I know it does, because I did it by hand.
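The consistency check itself doesn't even need a model. A minimal sketch: lint every page against an agreed lexicon of discouraged terms. The lexicon entries here are illustrative assumptions; a real one would come from the organisation's style guide.

```python
import re

# Illustrative lexicon mapping discouraged phrasing to suggested alternatives.
DISCOURAGED_TERMS = {
    r"\bvictims?\b": "person affected / survivor (context-dependent)",
    r"\bsuffer(s|ing)? from\b": "has / is living with",
    r"\bcommitted suicide\b": "died by suicide",
}

def check_page(text: str) -> list[tuple[str, str]]:
    """Return (matched term, suggested alternative) pairs found in one page."""
    findings = []
    for pattern, suggestion in DISCOURAGED_TERMS.items():
        for match in re.finditer(pattern, text, flags=re.IGNORECASE):
            findings.append((match.group(0), suggestion))
    return findings

page = "People who suffer from cancer are not victims of their diagnosis."
for term, suggestion in check_page(page):
    print(f"'{term}' -> consider: {suggestion}")
```

Run across thousands of pages, a script like this flags every deviation in minutes; the human job becomes deciding which flags are genuine problems in context.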
Where humans remain essential isn't in writing every word. It's in the judgement calls.
When I created the Universal Credit website, building trust despite massive institutional distrust required constant calibration between clarity, respect and authority. These judgement calls remain human territory, while execution can be AI-augmented.
Implementing this effectively isn't automatic: it takes clear direction, human review at every sensitive decision and honest evaluation of the output.
Perhaps the most powerful and underdiscussed benefit: AI helps content strategists avoid secondary trauma. When I wrote for Cancer Research UK, I absorbed countless cancer stories. When working on domestic abuse and drink spiking content, I internalised horrific narratives.
AI can process this material without developing PTSD. It can summarise research without needing therapy. This isn’t a small benefit; it’s the difference between sustainable careers in sensitive domains versus burnout.
Just as I discovered at SME Web that identifying the small windows of opportunity was crucial, the same applies to AI in trauma-informed content. The window isn't replacing human judgement. It's augmenting our ability to apply that judgement consistently across massive services, testing more variations and protecting our mental health while doing deeply emotional work.
The future belongs to the content strategists and content designers who learn to combine human judgement with AI's scale.
AI isn’t replacing empathy in content. It’s finally giving us the tools to apply empathetic principles at scale without destroying ourselves in the process.
And honestly, those claiming AI can’t do this work are really saying they don’t know how to use AI effectively yet. The divide isn’t between humans and machines. It’s between practitioners who understand this new paradigm and those clinging to outdated methods.
The question isn’t whether AI can be empathetic. It’s whether we’ll use AI to make empathy scalable, sustainable and more impactful than ever before.
Adrie van der Luijt is CEO of Trauma-Informed Content Consulting. Kristina Halvorson, CEO of Brain Traffic and Button Events, has praised his “outstanding work” on trauma-informed content and AI.
Adrie advises organisations on ethical content frameworks that acknowledge human vulnerability whilst upholding dignity. His work includes projects for the Cabinet Office, Cancer Research UK, the Metropolitan Police Service and Universal Credit.