What 1,032 disabled adults just told us about building better AI
Over my career I've supported people with a range of high support needs and accessibility requirements, in roles ranging from youth work to accessibility technology training. You can even find some free accessibility software I've built on my GitHub. Technology can be an amazing equaliser when it is designed to bridge the gap.
My biggest gripe with the training and facilitation industry is how often it trades on the good graces of learners. Too often, when a learner asks a facilitator for an adjustment to widen participation, the request is met with an apology and then moved past. In that moment we exclude people from learning they then have to sit through anyway. It captures a pattern that keeps repeating in technology: the people who would benefit most from a small design choice are usually the last to be asked about it, and often only after the thing has already been built.
In the disability communities I work with, this is often expressed with the phrase "nothing about us, without us."
That stayed at the forefront of my mind when I read a new poll from Business Disability Forum, run with Opinium, asking 1,032 disabled UK adults what would actually make AI products more accessible to them. The most common answer was not faster models, more features, or even cheaper tools. It was simpler and more uncomfortable. Forty per cent said the single most useful thing developers could do is design, develop and test AI products with disabled people in the room. Not consulted at the end. Not surveyed after launch. In the room while the thing is being made.
If you work in or near AI right now, that finding should sting a little. Because the honest truth is that most product roadmaps I see still treat accessibility as a late-stage compliance task, somewhere between legal review and the launch party. Yet here are over a thousand disabled adults pointing at the obvious thing the industry keeps missing.
The poll goes further, and the rest of it is just as practical. Thirty-eight per cent want more user-friendly interfaces. Thirty-seven per cent want better information about how AI can actually support them. Thirty-six per cent want help getting started. Read those numbers in order and a clear pattern appears. People are not asking for futuristic features. They are asking for the basics of good product design and good onboarding. The kind of thing every team claims to do and very few teams genuinely do well.
What makes this poll interesting, rather than just another piece of inclusion advocacy, is that the same people are also broadly optimistic about what AI can do. Over a third said AI tools could improve their communications. A third pointed to better access to healthcare information, education, and digital content. Roughly a quarter named employment and customer experience. This is not a community waiting to be rescued by technology, and it is not a community in flat refusal of it. It is a group of adults with a clear sense of where AI could help and where it currently does not, asking to be involved in closing the gap.
It is also worth sitting with the dissenting voices in the same poll. One in five said they did not think AI products could help disabled people at all. Another eighteen per cent were not sure. That is nearly four in ten people who are either sceptical or undecided. If you are a product leader, those are the users who will quietly decide whether your tool gets adopted in this community or routed around. Telling them the future is bright will not move them. Building something that respects how they actually live, work and communicate might.
This is where I think the wider community of people building, buying and regulating AI has to be honest with itself. We talk a lot about responsible AI, about ethics frameworks, about human-centred design. The Business Disability Forum poll is a low-cost, high-signal test of whether any of that is real inside your organisation. If a disabled customer or employee was in the design review tomorrow, would your team welcome them or politely tell them the schedule is too tight? If your answer is the second one, you do not have a human-centred process. You have a marketing line.
I find the Business Disability Forum's framing useful here. Lucy Ruck, who leads their Tech Taskforce, put it cleanly: "AI has the capacity to transform lives, but only if we get inclusion right from the start". From the start. Not retrofitted after a product manager notices the screen reader is broken. Not tacked on once a regulator gets in touch. From the start means including disabled people in user research, in prompt design, in evaluation, in the rooms where features are killed and kept. It means budgeting for it, paying participants properly, and accepting that some of your assumptions will not survive the conversation.
There is a quieter point in the data too, and it pulls me back to the workshop story. One in four people experience disability at some point in their lives. The gap between AI products built with disabled users and ones built around them is not a niche question. It is, over time, a question about whether the tools we are all coming to depend on for work, healthcare, education and public services treat a quarter of the population as core users or as edge cases. That choice will compound. Tools built with a narrow user in mind get harder to retrofit the more they are deployed, the more their training data ossifies, and the more habits build around them. The cost of fixing this later is much higher than the cost of doing it now.
If you are a leader reading this and wondering where to start, I would resist the urge to commission a big strategy. Two practical moves carry more weight. First, find the disabled employees, customers or community members already adjacent to your AI work and ask them, plainly, what is broken and what would help. Pay them for their time. Second, change one decision-making meeting this quarter so the user research presented in it includes disabled participants by default, not as an optional extra. Two small structural changes will tell you more about your culture than any framework will.
The wider question this poll forces me to ask, and that I keep returning to in my own practice, is who counts as a user when we say AI is for everyone.
The people Business Disability Forum spoke to have answered. They want in. The interesting question now is whether the industry is willing to make space at the table while there is still a table to design.