Blind people excluded from benefits of AI, says charity


Blind and partially sighted people are being excluded from the benefits of artificial intelligence tools and facing “a new level of discrimination”, the new president of the Royal Society for Blind Children has claimed as he called for better design of everything from video games to AI agents.

Tom Pey said existing difficulties for blind children were “now compounded because they’re excluded [and] distanced from their non-disabled peers, because those people can experience games, alternative realities and the AI-driven visual types of technology”.

Pey lost his sight as a child and created the Waymap app, which offers step-by-step audio navigation instructions. His comments come as tech firms launch more visually based AI-powered systems, such as Meta's range of spectacles and the Google Lens function, which relies on users pointing their phone camera at objects or places.

Pey called on the technology secretary, Peter Kyle, to “formulate laws that will support the needs of disabled people, but also help direct the big companies and startups, so they include disabled people”.

“If we look at the hardware around AI, a lot of it is visual, and it ignores the needs of blind people, and it ignores people who have difficulty, not just with not being able to see, but being able to interpret visual imagery,” he said. “Those people, like me and others, we’re just excluded.”

People with sight loss are less likely to use the internet every day, more likely to be digitally excluded and less likely to own a smartphone compared with the rest of the population, research by the Royal National Institute of Blind People recently found. But it also reported that digital exclusion for blind and partially sighted people was reducing and that AI technology was becoming more accessible.

In response, tech companies including Google, Meta and OpenAI all pointed to initiatives to use their tech to help blind and partially sighted people.

In September Meta launched a system that allows people wearing its tech-enabled Ray-Ban glasses to connect instantly to a sighted volunteer, who can see through the glasses' camera and provide a real-time description of what is in front of them. OpenAI has also devised a virtual volunteer that provides an audio description of whatever the phone is pointed at – for example the contents of a fridge – and can hold a conversation about it. The system, tested by Be My Eyes, an accessibility app founded in Denmark, uses GPT-4. Google has an AI-powered app for people with low vision called Lookout, which audio-describes photos, reads out text and answers questions.

But Pey said young people with blindness or restricted sight were finding that the existing gap between their experience of the world and that of their non-disabled peers had widened "because those people can experience games, they can experience alternative realities, they can experience the AI-driven visual types of technology, whereas people like them can't".

He called it “a new level of discrimination, which could be avoided by upfront thinking”.

He added: “The designers need to just wake up to the fact that they should design for disabled people.”
