Louise Plunkett from Norwich says, “AI has revolutionized my everyday life.”
Ms Plunkett has Stargardt disease, a rare genetic eye condition that causes blurred vision and which, she says, “affects everything I do”.
“I can’t recognize people, not even my husband or my children. When my children were young, I had to teach them how to approach me when I met them on the school playground.”
Ms Plunkett is comfortable with digital tools; her business advises companies on how to make their online content accessible to visually impaired people.
She has used services such as Alexa, Google Home and Siri for years to help with things like setting alarms and checking the weather.
Now she is turning to an AI-powered assistant called Be My AI.
The app uses ChatGPT to create and then read detailed descriptions of images.
“I’m quite a stubborn person,” Ms Plunkett says. “I don’t like asking for help or admitting that I need help, so using an AI tool for things when other humans aren’t around is useful.”
She says she can use it to work out which restroom is the women’s, read the ingredients on food packaging, or read a letter.
However, she feels that AI can sometimes be hit or miss. “The downside with AI is that sometimes it gives you too much detail. You sometimes just want the basic information about what’s in front of you, but it will go above and beyond and start describing mood and emotions.
“For example, it might say ‘a swirling rug evokes memories of a bygone era’. That feels like a step too far.”
Be My AI was developed by the Danish firm Be My Eyes. Its original service puts human volunteers in touch with blind and visually impaired users; through a mobile phone, the volunteers describe what is in front of the person.
However, some of its 600,000 users are turning to its AI tool for help, says chief technology officer Jesper Hvirring Henriksen.
“We have one woman who was one of our first users 10 years ago, and within the first six months [of releasing Be My AI] she made more than 600 image descriptions.”
He’s also discovering that people are using the app in ways he never imagined. “We’re finding people using it to check photos that have been sent to them on WhatsApp groups,” he says.
“Maybe they won’t call another human every time to ask about a photo that’s been sent to a WhatsApp group, but they will use the AI.”
Looking to the future, he says live-streaming video, in which the technology describes buildings and surroundings as the user moves around, may be one area the firm moves into. “It’s going to be a game changer. It’s like having a little person in your shirt pocket telling you what’s going on all day.”
Be My Eyes, which is free to users, makes money by charging companies to sign up to its directory service, where they can provide information and contact numbers to the blind and visually impaired community.
Mr Henriksen says AI will not replace the need for human interaction.
“At Be My Eyes, people are still choosing to call a volunteer. The blind population in the Western world is typically not young when they start experiencing vision loss… it’s more towards an older population, and this [AI] technology can add extra complexity. Humans are faster and potentially more accurate.”
Other firms also have products to assist the blind.
WeWalk is an AI-powered smart cane with a voice assistant that detects obstacles and offers accessible navigation and live public transport updates.
Linked to a smartphone app with built-in mapping, it can tell users where points of interest are, such as the nearest cafe, in more than 3,000 cities.
“The cane is very important to us. It helps with navigation, and it is a very important symbol because it shows our freedom and autonomy,” says Gamze Sofuoğlu, WeWalk’s product manager.
“Our latest version lets users control navigation through voice commands. For example, when you say ‘take me home’ or ‘the nearest cafe’, it can start navigating, and you can get public transport information. You don’t have to touch your phone; it gives freedom to blind and visually impaired people.”
Ms Sofuoğlu, who is blind, says she has been using it in cities she has visited recently, including Lisbon and Rome.
Robin Spinks, Head of Inclusive Design at the RNIB (Royal National Institute of Blind People), who has low vision, is a huge advocate of AI and uses it most days.
For example, he turns to ChatGPT to help with his workflow, to give him a summary of progress in certain areas of work, or even to help plan a paddleboard trip. He also uses Google’s Gemini AI tool to help him find objects.
Last year was about conversational AI and ChatGPT, he says. Now he argues that 2024 is the year of what he calls “multimodal AI”.
He adds: “It’s about showing it video and images, and it being able to extract meaningful information and help you in an interesting way.”
He points to Google Gemini. “For example, it lets you record meetings and it helps you with audio labels and accounts of meetings. It’s really helpful, and it’s about making people’s lives easier.”
Mr Spinks says AI is transformative for people who are blind or have low vision.
“I sympathize with people who are genuinely afraid of AI, but when you have a disability, if something can genuinely add value and be helpful, that’s a great thing. The benefits are too good to ignore.”