
Workplace Insights by Adrie van der Luijt
I wasn’t always an AI enthusiast. When I first encountered generative AI as a senior content strategist working on government digital services, I was properly sceptical. Would these tools actually help us create more effective content or just generate more administrative busywork?
After implementing AI workflows across multiple organisations, from the Metropolitan Police Service to the Cabinet Office, I’ve come to a conclusion that might surprise you: the value of AI for content designers isn’t in replacing our skills but in amplifying them in unexpected ways.
Most content designers use AI for drafting, editing or generating variations. These are valuable applications, certainly. But having spent years teaching AI to assist content professionals, I’ve discovered several unconventional approaches that dramatically extend what’s possible.
Here are 11 AI prompt techniques that I’ve never seen in standard guides but that have proven invaluable in my work on high-stakes content projects:
1. Simulate vulnerable reader reviews
“Review this content from the perspective of someone experiencing severe anxiety, grief, financial distress or cognitive overload. Identify specific elements that might create confusion, increase stress, or inadvertently cause harm. Suggest specific text changes that would maintain meaning while reducing potential negative impacts.”
When I worked on Cancer Research UK’s website, we used a similar approach to identify content that might unintentionally distress newly diagnosed patients. What looked perfectly clear to our team often appeared overwhelming or confusing to someone processing information in a heightened emotional state.
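A prompt like this becomes more useful when it is parameterised, so the same review can be run consistently across many pages and many reader states. Here is a minimal sketch in Python; the function name, the sample content and the list of states are my own illustrative choices, not part of any particular tool.

```python
# Reusable template for the "distressed reader" review prompt above.
# The wording mirrors the prompt in the article; everything else
# (function name, sample content, states) is an invented example.

REVIEW_TEMPLATE = (
    "Review this content from the perspective of someone experiencing "
    "{state}. Identify specific elements that might create confusion, "
    "increase stress, or inadvertently cause harm. Suggest specific text "
    "changes that would maintain meaning while reducing potential "
    "negative impacts.\n\nContent:\n{content}"
)

def build_review_prompts(content: str, states: list[str]) -> list[str]:
    """Return one review prompt per emotional or cognitive state."""
    return [REVIEW_TEMPLATE.format(state=s, content=content) for s in states]

prompts = build_review_prompts(
    "Your claim has been rejected. Reapply within 14 days.",
    ["severe anxiety", "grief", "financial distress"],
)
print(len(prompts))  # prints 3, one prompt per state
```

Running each generated prompt separately, rather than asking for all perspectives at once, tends to produce more specific findings per state.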
2. Pre-empt stakeholder objections
“Based on this draft content, simulate the perspectives and potential objections of the following stakeholders: [list specific roles like legal, marketing, compliance, executive leadership]. For each stakeholder, write specific concerns they might raise, suggest possible accommodations and explain the potential trade-offs of each accommodation.”
I first developed this approach when creating trauma-informed content for the national drink spiking advice service for Police.UK. By simulating the perspectives of police officers, victim support organisations and legal advisors before our actual reviews, we addressed potential issues early and significantly reduced approval friction.
3. Generate meaningful personalisation variables
Content personalisation often falls flat because the variables are too generic. This prompt creates more nuanced options:
“I’m creating a personalised content system for [specific service or product]. Generate 15 personalisation variables beyond the obvious ones (name, location), focusing specifically on variables that would meaningfully improve user understanding, reduce cognitive load or increase relevance for different user contexts.”
At the Cabinet Office, we used this technique to develop personalisation variables for grant application guidance that adapted based on applicant experience level, organisational type and specific challenges, making complex information more accessible to diverse audiences.
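In practice, variables like these end up as a mapping from variable values to concrete content adjustments. The sketch below shows one possible shape for that mapping; the variable names and guidance snippets are invented for illustration, not taken from the Cabinet Office service described above.

```python
# Illustrative personalisation variables beyond name and location.
# Variable names and adjustment text are invented examples.

PERSONALISATION = {
    "experience_level": {
        "first_time": "Plain-language glossary shown inline.",
        "experienced": "Glossary collapsed; jump links to criteria.",
    },
    "organisation_type": {
        "charity": "Examples use charity accounting terms.",
        "business": "Examples use company accounting terms.",
    },
}

def adapt_guidance(profile: dict) -> list[str]:
    """Collect the content adjustments that apply to a user profile."""
    adjustments = []
    for variable, value in profile.items():
        options = PERSONALISATION.get(variable, {})
        if value in options:
            adjustments.append(options[value])
    return adjustments

print(adapt_guidance({"experience_level": "first_time",
                      "organisation_type": "charity"}))
```

Keeping the adjustments as data rather than scattered conditionals makes it easy for content designers to review and edit them without touching application logic.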
4. Reverse-engineer effective content
When you notice effective content from other organisations but struggle to articulate why it works, try:
“Analyse these three examples of effective content in the [specific area] space. Identify common patterns in structure, vocabulary, tone, information hierarchy and user considerations. Summarise the underlying content strategy principles that appear to be at work, and suggest how these principles could be applied to our content about [your topic].”
This technique helped my team at telecoms industry body The One Touch Company (TOTSCo) extract the underlying principles from high-performing technical documentation and apply them to our own materials, significantly improving comprehension metrics.
5. Simulate cognitive accessibility reviews
Accessibility testing with diverse users is essential but not always immediately feasible. This prompt helps bridge the gap:
“Evaluate this content from the perspective of someone with [specific cognitive consideration, for example, ADHD, dyslexia, non-native English speaker, someone under extreme stress]. Identify potential barriers to comprehension or task completion. Suggest specific adjustments that would address these barriers while maintaining the core message and purpose.”
I used this approach when developing Universal Credit guidance, simulating how content might be experienced by users with varying cognitive needs before conducting actual user testing. This helped us prioritise our testing scenarios and create more inclusive preliminary drafts.
6. Turn raw metrics into narrative insights
Data often fails to influence decisions because it lacks narrative context. This prompt transforms raw metrics:
“Transform these user analytics/feedback metrics into 3-5 compelling narrative insights about user behaviour and needs. For each insight, provide:
1) The key finding in plain language
2) Why this matters to our users and business goals
3) A specific content design implication or recommendation.”
When presenting metrics to stakeholders at the Metropolitan Police, this approach helped transform raw data about user drop-offs into compelling narratives that actually influenced decision-making about service design.
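The three-part format the prompt asks for (finding, relevance, recommendation) is easy to capture as a small data structure, which keeps AI-generated insights consistent when you paste them into decks or reports. The field names and the example data below are my own shorthand for the prompt’s three parts, not real Metropolitan Police figures.

```python
# A tiny structure for the three-part insight format in the prompt
# above. Field names and example content are invented illustrations.

from dataclasses import dataclass

@dataclass
class NarrativeInsight:
    finding: str          # 1) the key finding in plain language
    why_it_matters: str   # 2) relevance to users and business goals
    recommendation: str   # 3) a content design implication

    def as_slide_text(self) -> str:
        """Render the insight in a consistent, presentable form."""
        return (f"{self.finding}\n"
                f"Why it matters: {self.why_it_matters}\n"
                f"Recommendation: {self.recommendation}")

insight = NarrativeInsight(
    finding="Two thirds of users abandon the form at the evidence step.",
    why_it_matters="Abandonment here drives repeat contact by phone.",
    recommendation="Split the evidence step and explain accepted formats.",
)
print(insight.as_slide_text())
```

Structuring insights this way also makes it obvious when an AI-generated insight is missing one of the three parts and needs another pass.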
7. Generate journey-specific content matrices
Creating consistent content across complex user journeys is challenging. This prompt helps systematise the approach:
“Create a content matrix for our [specific user journey] that maps different user needs/states against appropriate content characteristics. For each intersection, suggest specific content patterns, vocabulary considerations and tone adjustments that would be most effective. Focus particularly on transitions between different journey stages.”
This technique proved invaluable when designing content for the Complex Crime Unit’s reporting workflows, ensuring consistent language and tone across interconnected touchpoints spanning multiple channels and services.
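A content matrix of this kind translates naturally into a lookup keyed by journey stage and user state. The sketch below shows the shape; the stages, states and guidance text are invented to illustrate the structure, not drawn from the Complex Crime Unit work.

```python
# A content matrix as a dictionary: (journey stage, user state)
# -> content characteristics. All entries are invented examples.

CONTENT_MATRIX = {
    ("start", "uncertain"): {
        "pattern": "step-by-step overview",
        "tone": "reassuring, no jargon",
    },
    ("start", "confident"): {
        "pattern": "direct task entry",
        "tone": "brisk, minimal preamble",
    },
    ("review", "uncertain"): {
        "pattern": "summary with edit links",
        "tone": "confirmatory",
    },
}

def guidance_for(stage: str, state: str) -> dict:
    """Look up content guidance, with a neutral fallback for gaps."""
    return CONTENT_MATRIX.get((stage, state),
                              {"pattern": "default", "tone": "neutral"})

print(guidance_for("start", "uncertain")["tone"])  # prints: reassuring, no jargon
```

The fallback entry is deliberate: gaps in the matrix are exactly the journey transitions the prompt asks the AI to focus on, and a visible default makes them easy to spot in review.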
8. Develop contextual microcopy variations
Generic error messages and helper text fail users exactly when they need support most. This prompt creates more helpful alternatives:
“Generate 5-7 variations of contextual microcopy (error messages, helper text, confirmation messages) for [specific interaction point], tailored to different user contexts and potential pain points. For each variation, explain the specific scenario it addresses and why the language choices are appropriate for that context.”
At Cancer Research UK, we used this approach to develop context-sensitive error messages for their donation system that acknowledged the emotional context of the interaction, significantly reducing abandonment rates.
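Once the variations exist, implementing them is a matter of mapping interaction contexts to copy, with a safe fallback for contexts nobody anticipated. The contexts and wording below are invented examples in the spirit of the donation-form messages described above, not the actual Cancer Research UK copy.

```python
# Context-sensitive error messages for one interaction point.
# Context keys and message wording are invented illustrations.

ERROR_COPY = {
    "card_declined_first_try": (
        "Your card wasn't accepted. You can try again or use a "
        "different card; your details haven't been lost."
    ),
    "card_declined_repeat": (
        "Your card wasn't accepted again. You may want to contact "
        "your bank, or save this donation to finish later."
    ),
    "timeout_mid_form": (
        "We saved what you'd entered. Pick up where you left off "
        "when you're ready."
    ),
}

def error_message(context: str) -> str:
    """Return microcopy for a known context, or a safe fallback."""
    return ERROR_COPY.get(
        context,
        "Something went wrong. Your details are safe; please try again.",
    )
```

Note how each variation addresses a different pain point: reassurance on a first failure, an escape route on a repeat failure, continuity after a timeout.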
9. Create content governance simulation scenarios
Content governance breaks down when teams can’t anticipate real-world scenarios. This prompt helps build practical governance examples:
“Generate 5 realistic scenarios that would test the boundaries of our content governance framework for [specific content area]. For each scenario, include:
1) The situation requiring a content decision
2) The competing considerations at play
3) How our current governance approach would handle this situation
4) Potential gaps or ambiguities that might cause confusion.”
This technique helped the Cabinet Office develop a more robust content governance framework for their COVID-19 grant management tools by identifying edge cases we hadn’t considered in our initial planning.
10. Conduct a “voice and tone stress test”
Brand voice guidelines often break under pressure. This prompt helps identify potential inconsistencies:
“Apply our brand voice principles to these 5 challenging content scenarios:
1) Explaining a service outage
2) Responding to user frustration
3) Communicating a policy change
4) Declining a user request
5) Addressing a sensitive topic.
Highlight any points where our voice guidelines become difficult to maintain while meeting user needs, and suggest potential adjustments.”
When developing the content strategy for TOTSCo, this approach revealed that our initially defined voice principles became unsustainable in high-stress user scenarios, leading to important refinements before implementation.
11. Generate implementation-ready content models
Translating content design decisions into developer-friendly specifications is often a bottleneck. This prompt helps bridge the gap:
“Transform these content design requirements for [specific component] into a structured content model that developers could implement. Include:
1) Content elements and their relationships
2) Required and optional fields
3) Character limits and formatting constraints
4) Conditional logic for different contexts
5) Example content for each element.”
I developed this technique when working with the Metropolitan Police on their incident reporting systems, helping translate complex content requirements into structured models that technical teams could easily implement without constant content designer involvement.
The common thread across these techniques is that they don’t try to replace content design expertise, but they extend it. By using AI to simulate different perspectives, systematise patterns and bridge gaps between disciplines, we can focus our human attention on the most valuable aspects of content strategy work.
What makes these approaches particularly valuable is their specific application to content design challenges. Unlike generic AI prompts, these techniques address the unique interdisciplinary nature of content work, where success depends on balancing user needs, technical constraints, business goals and ethical considerations.
Start small. Choose one technique that addresses a current challenge in your content workflow and experiment with it on a low-stakes project. Pay attention to where the AI output helps advance your thinking and where it falls short.
Remember that these techniques work best when they complement established content design practices, not replace them. The goal is to use AI to handle the heavy lifting of information processing, pattern recognition and scenario generation, freeing you to apply your uniquely human judgement to the results.
Most importantly, maintain critical awareness. AI tools reflect the biases in their training data and can confidently present flawed information. Always evaluate outputs against your professional expertise and organisational standards.
After implementing AI workflows across organisations from government agencies to healthcare providers, I’m convinced that the future of content design isn’t about choosing between human expertise and AI efficiency. It’s about thoughtfully combining them.
The most valuable content designers won’t be those who use AI most extensively, but those who use it most intelligently, leveraging these tools to amplify their distinctly human capabilities for empathy, strategic thinking and ethical judgement.
These 11 techniques represent just the beginning of what’s possible when we move beyond treating AI as either a threat to be resisted or a magic solution to be embraced uncritically.
By developing specific, nuanced applications of AI for content design challenges, we can create more effective content while elevating the strategic importance of our discipline.
Adrie van der Luijt is CEO of Trauma-Informed Content Consulting. Kristina Halvorson, CEO of Brain Traffic and Button Events, has praised his “outstanding work” on trauma-informed content and AI.
Adrie advises organisations on ethical content frameworks that acknowledge human vulnerability whilst upholding dignity. His work includes projects for the Cabinet Office, Cancer Research UK, the Metropolitan Police Service and Universal Credit.