
Workplace Insights by Adrie van der Luijt
Last Tuesday, my 89-year-old mother in the Netherlands received an official letter informing her that her insurance fee payment had been refused.
It wasn’t because she hadn’t paid. She’s meticulous about making payments on time. It was because the system couldn’t recognise her bank account now that my late father’s name was no longer on it.
The “simple solution” offered by the insurance firm’s helpline was to log into the website using a 17-digit security code (sent separately by post when the account was first opened), create an account requiring email verification and set up a direct debit.
My mother, nearly blind and profoundly deaf, has never used the internet. This letter joined the growing pile of administrative puzzles awaiting my next visit from London, where I work on major UK digital services.
My journey with government digital services spans multiple countries and decades. I built my first Dutch government digital project in 1987 before relocating to the UK in 1995.
Since then, I’ve led digital transformation work across the public, private and charity sectors. In the charity sector, for example, I led content strategy for Cancer Research UK, restructuring over 5,000 pages for a site serving more than four million monthly visitors.
Today, I divide my time between London and the south of France, giving me firsthand experience with three different national approaches to digital services.
This unique perspective has allowed me to witness a concerning trend: as our technical capabilities advance, our systems’ inclusivity often regresses.
What began as genuine efforts to modernise public services has evolved into complex digital fortresses that exclude the very citizens they’re meant to serve.
Government digital services worldwide share a common flaw: prioritising security mechanisms over usability for vulnerable populations. These barriers manifest in notably different ways across countries, yet create similar exclusion patterns.
In the Netherlands, the DigiD government gateway system requires my mother to verify her identity through a mobile phone she cannot see, using multi-factor authentication she cannot comprehend and ID documents more recent than any she possesses.
During my work with the Rural Payments Agency in the UK, I championed the needs of internet-illiterate farmers who couldn’t access crucial agricultural subsidies.
The system required them to map their land use through an online tool, within narrow error margins.
Many owned substantial businesses but had never needed digital skills to run them successfully for decades. The initial service design completely overlooked these users, assuming digital literacy that simply didn’t exist among a significant portion of the user base.
I fought to maintain paper-based alternatives while we gradually developed more accessible digital pathways. This battle required presenting uncomfortable user research directly to senior stakeholders who initially refused to believe the problem’s scale.
In France, where I now spend part of my time, I’ve observed both government rigidity and remarkable ingenuity.
During COVID-19, volunteer-developed systems demonstrated that rapid, accessible solutions are possible when user needs truly drive development.
Yet the standard French administrative systems remain as exclusionary as their Dutch and British counterparts.
Financial institutions across all three countries compound these challenges.
Banking security systems demand responses to SMS codes that cannot be seen, memorable information set up years ago and telephone systems that cannot be heard by those with auditory impairments.
And for those who do manage to book a face-to-face meeting with their bank, the nearest branch is often no longer in the town where they live.
This isn’t security, but exclusion disguised as protection.
Why do these barriers persist despite decades of accessibility guidelines and inclusive design principles? The answer lies in a form of institutional blindness that I’ve observed firsthand across my work in multiple countries:
While assessing Ofsted’s digital services, I discovered that their childcare registration forms and guidance manual required postgraduate reading levels – yet no one had identified this fundamental barrier.
The cognitive dissonance was striking: we were communicating about children’s services using language that many adults couldn’t understand.
When I presented readability analysis showing that most childcare providers would struggle to comprehend critical registration information, the response was disbelief followed by uncomfortable recognition.
The same professionals who advocated for educational inclusion had inadvertently created documents that excluded the very families they aimed to serve.
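A readability analysis of the kind described above can be sketched in a few lines. This is a minimal illustration using the published Flesch-Kincaid grade-level formula with a rough syllable heuristic; it is not the tooling used in the Ofsted assessment, and the sample sentences are invented for demonstration.

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count groups of consecutive vowels,
    # discounting a silent trailing 'e'.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def fk_grade(text: str) -> float:
    # Flesch-Kincaid grade level:
    # 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) \
        + 11.8 * (syllables / len(words)) - 15.59

# Invented examples: bureaucratic prose versus a plain-language rewrite.
sample = ("Applicants must furnish documentation substantiating "
          "their eligibility prior to registration.")
print(round(fk_grade(sample), 1))  # scores well above grade 12
```

A score above roughly 16 indicates postgraduate-level prose; plain-language guidance for public services typically targets a grade of 9 or below.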
This blindness stems from several factors I’ve witnessed repeatedly:
During Universal Credit content development, we witnessed dramatic differences in comprehension when we expanded testing beyond our usual demographic.
When I led content strategy for the transformation of 5,000 pages of advice and information at Cancer Research UK, we discovered that many informational pages that tested well with general audiences were incomprehensible to newly-diagnosed patients experiencing cognitive effects of treatment.
I’ve sat in meetings with Cabinet Office security teams where they insisted on multiple authentication factors for services with virtually no security risk, despite my objections that this would exclude vulnerable users.
Both Dutch and British governments have explicit policies pushing users toward digital channels while gradually defunding alternatives.
When working on the ‘digital-by-default’ Universal Credit service, I advocated unsuccessfully against removing face-to-face and telephone application options for millions of applicants.
When I brought my mother’s DigiD struggles to the attention of a Dutch civil servant, they genuinely couldn’t understand why sending SMS codes to someone nearly blind presented a problem. “Get someone to read it to them,” they said – unaware that she sees a carer only once a week.
This blindness produces a particular hallmark I’ve seen repeated across countries: cheerful “no problem!” messages preceding impossible solutions.
DigiD helpfully offered three ways to update my mother’s mobile number; each required access to the very number I needed to change. This revealed designers who never imagined someone in my mother’s situation.
In the UK, the equivalent is “There’s a simple way to solve this…” preceding instructions that are anything but simple for vulnerable users. In France, it’s “C’est très facile!” followed by technological labyrinths.
After decades in this field, I’ve come to believe this blindness isn’t malicious but systemic. It’s a case of services designed by people who simply cannot see beyond their own experience of the world.
While the elderly bear the brunt of these failures, digital exclusion affects a much broader population than typically acknowledged.
Through my work across multiple countries, I’ve collected countless examples that illustrate the pervasiveness of this problem.
People struggle to pay for parking because it requires downloading an app without wifi, registering an account (which requires verification messages) and paying on a mobile phone – all while watching their train pull out of the station.
In the Netherlands, I observed a teenager without a smartphone who could not access his school’s homework portal, gym membership or lunch payment system, creating forced dependency on his parents.
In the UK, while working on a healthcare initiative, I met a cancer patient unable to track important medical appointments because they’re only sent via an app he struggled to navigate during treatment.
As a friend told me recently, “When I was in the IT business, my biggest gripe was seeing the delivery of a truly sophisticated computer system with a confusing and complicated user interface. I compared it to delivering a Rolls-Royce without an ignition key. Sadly, as an internet user, I still continue to be confronted with similar user interface issues today, some 30 years on!”
But it’s not just about UX/UI design. During hundreds of hours of user research sessions with the Department for Work and Pensions, I witnessed people in genuine distress unable to complete basic benefit applications.
This wasn’t due to a lack of effort or intelligence, but because the systems assumed capabilities, devices and resources they simply did not have. It wasn’t occasional; it was systemic.
Even self-described “techies” increasingly report feeling overwhelmed by constantly changing interfaces and authentication requirements.
This widespread confusion prompted a Waitrose supermarket in Dorchester to reinstate manned checkouts after customers couldn’t navigate self-service systems. This pattern is repeated across retailers throughout Europe.
When working on Universal Credit, I advocated strongly against purely digital application processes after seeing how they excluded not just elderly users but many others.
In one particularly striking example from my consulting work, a homeless shelter in London reported that 83% of their clients were unable to complete mandatory online benefits applications without substantial assistance.
Again, this wasn’t because they lacked intelligence. The systems assumed resources, capabilities and life circumstances they simply didn’t have.
Digital exclusion transcends demographics, creating unnecessary barriers at precisely the moments when citizens most need frictionless access to essential services.
As governments rush to implement AI-driven services, a fundamental contradiction emerges: how can we responsibly deploy advanced AI when we’ve failed to solve basic digital accessibility?
This question weighs heavily on me as I move between countries and see identical patterns. The UK Prime Minister speaks of AI revolutionising public services while millions cannot access existing systems.
Senior leaders I worked with on digital transformation projects in the UK enthusiastically embraced AI potential while simultaneously cutting resources for telephone helplines and in-person support.
The Netherlands follows similar patterns, with substantial investment in AI research alongside declining accessibility for non-digital citizens.
The French system, despite impressive COVID-response innovations, encounters similar contradictions in its approach to AI deployment.
This contradiction isn’t merely putting the cart before the horse; it’s attaching a rocket engine to a cart with square wheels.
I’ve used this metaphor in workshops with digital teams at Cancer Research UK and the Metropolitan Police Service, and it often provokes uncomfortable recognition.
AI inherits and often amplifies the biases built into our digital foundations. Without addressing fundamental inclusion failures, AI risks creating even more sophisticated forms of exclusion.
What’s needed isn’t more technology, but more empathy embedded within our design processes. This is something I’ve advocated for consistently across my work in multiple countries.
Based on my decades of experience with government digital services across the Netherlands, UK and France, I propose a fundamental shift in approach:
Before any system launches, it must pass three critical tests:
– Does it work for users with sensory limitations?
– Does it work for users without digital experience or assistance?
– Does it work for users experiencing temporary crisis or cognitive load?
We must abandon the fiction that digital alternatives eventually make traditional channels obsolete. Phone, postal and in-person options must remain properly resourced, not treated as temporary concessions destined for elimination.
Systems need sensible failure recovery paths. If a user can’t pass one verification method, alternative verification should be readily available, not dead ends that lock people out of essential services.
In France during COVID-19, I observed volunteer-developed systems that brilliantly balanced digital efficiency with analog alternatives, proving it’s entirely possible when the will exists.
When designing Universal Credit’s content, we discovered that approaches working for users with cognitive limitations improved everyone’s experience.
This principle should guide all digital services: what works for vulnerable users typically works better for everyone.
The most successful project I worked on in the UK began by designing for users with learning disabilities and ended up with interfaces so intuitive that all users completed tasks significantly faster.
During hundreds of hours of user research sessions across multiple government departments, I’ve observed that cognitive overload is often the breaking point for vulnerable users.
When working with the UK dispute resolution service ACAS, we pioneered a cognitive load assessment approach that measures the mental effort required to complete digital tasks.
By establishing maximum cognitive load thresholds and testing against them, we identified and eliminated breaking points that disproportionately affected users during times of stress or crisis.
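The threshold approach described above can be illustrated with a small sketch. The actual ACAS assessment method isn’t published, so the factors, weights and threshold below are hypothetical, chosen only to show how a journey can be scored step by step and flagged when it exceeds a maximum load.

```python
from dataclasses import dataclass

@dataclass
class TaskStep:
    inputs_required: int   # form fields the user must complete
    recall_items: int      # facts the user must remember or look up
    channel_switches: int  # e.g. leaving the page to fetch an SMS code

# Hypothetical weights and threshold, for illustration only.
WEIGHTS = {"inputs_required": 1, "recall_items": 3, "channel_switches": 5}
MAX_LOAD = 20

def load_score(steps: list[TaskStep]) -> int:
    # Sum the weighted effort of every step in the user journey.
    return sum(
        s.inputs_required * WEIGHTS["inputs_required"]
        + s.recall_items * WEIGHTS["recall_items"]
        + s.channel_switches * WEIGHTS["channel_switches"]
        for s in steps
    )

# An invented two-step journey: a form page, then a verification step.
journey = [TaskStep(4, 1, 0), TaskStep(2, 2, 1)]
score = load_score(journey)
print(score, "exceeds threshold" if score > MAX_LOAD else "within threshold")
```

The point of such a model isn’t precision but comparability: scoring every journey the same way makes it possible to spot the verification steps and channel switches that push vulnerable users past their breaking point.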
We must also acknowledge contextual vulnerability. Anyone can become temporarily vulnerable through bereavement, illness, stress or circumstance. My mother isn’t inherently vulnerable; she became vulnerable when my father died and she suddenly had to navigate systems they had previously managed together.
Systems should anticipate these life transitions rather than punish those experiencing them. I’ve advocated for “life event” approaches to digital service design, where services anticipate and accommodate the cognitive and emotional impact of major life changes like bereavement.
Accessibility cannot be an afterthought. Diverse users with actual limitations must be integral to the design process from conception through deployment.
Had DigiD’s “no problem!” solutions been tested with someone like my mother, their inadequacy would have been immediately apparent. Regular testing with vulnerable users provides insights no amount of theoretical accessibility review can match.
When we insisted on testing Universal Credit content with genuinely vulnerable users rather than proxy users, we uncovered critical failures that would have gone undiscovered. This approach costs more initially but prevents massively expensive remediation later.
These recommendations align with the Government Digital Service’s emerging vision for the future of digital government.
In their recent “Government Beyond Digital” framework, GDS has acknowledged that true transformation requires moving beyond digital-first thinking toward genuinely human-centered systems.
GDS’s recognition that government services must work for everyone – regardless of digital literacy or access – represents a crucial evolution in thinking.
Their developing work on “Digital Plus” approaches aligns with what I’ve advocated for throughout my career: digital services complemented by robust alternative channels that aren’t treated as temporary concessions.
Most encouragingly, the cross-government “No One Left Behind” initiative directly addresses issues I’ve highlighted from my cross-country perspective.
This initiative acknowledges that digital inclusion isn’t simply about access but about designing systems that recognise and accommodate the full spectrum of human capabilities and circumstances.
As someone who’s worked alongside GDS since its inception, I’ve witnessed the pendulum swing from digital evangelism toward a more nuanced understanding of inclusion.
By building on this foundation while incorporating insights from international approaches, we have an opportunity to create truly inclusive services that leverage technology’s advantages without sacrificing accessibility.
Digital inclusion isn’t merely a technical challenge, but an ethical obligation. As digital professionals, we have a responsibility to recognise when our solutions become the problem.
When security measures designed to protect instead become barriers to participation. When efficiency for the majority creates exclusion for the vulnerable minority.
I feel this responsibility keenly every time I visit my mother in the Netherlands. I watch her – a woman who survived World War II and the 1953 North Sea flood, raised two children largely single-handedly and contributed to her community for decades – now finding herself unable to perform basic tasks like paying her council tax or checking her bin collection schedule. Not because she lacks capability, but because our systems were never designed with her in mind.
In my UK government work, I’ve advocated for trauma-informed content design approaches that recognise how life circumstances affect people’s ability to navigate complex systems.
In France, I’ve seen how accessibility can be prioritised when there’s genuine will to do so. Across all three countries, I’ve witnessed the profound dignity that comes from systems that include rather than exclude.
This isn’t just poor design. It’s digital disenfranchisement. And it’s time we acknowledged our complicity in building systems that, despite our best intentions, are actively excluding the people they claim to serve.
As a content strategist who’s spent decades making complex information accessible, I’ve come to believe that the measure of our digital society isn’t how well we serve the capable and connected, but how effectively we include those at the margins. By this standard, we are failing spectacularly.
As we stand at the threshold of an AI-driven future, let’s ensure we’re not building it on foundations that already leave millions behind.
The technology will change – as it has throughout my four-decade career – but the ethical imperative remains constant: design for inclusion, or accept responsibility for the exclusion that follows.
Adrie van der Luijt is CEO of Trauma-Informed Content Consulting. Kristina Halvorson, CEO of Brain Traffic and Button Events, has praised his “outstanding work” on trauma-informed content and AI.
Adrie advises organisations on ethical content frameworks that acknowledge human vulnerability whilst upholding dignity. His work includes projects for the Cabinet Office, Cancer Research UK, the Metropolitan Police Service and Universal Credit.