In the Blind Abilities Studio, we welcome Jagadish K. Mahendran, Artificial Intelligence Developer and Engineer, and Hema Chamraj, director, Technology Advocacy and AI4Good at Intel.
Raqi joins Jeff in the studio to find out more about this great initiative that may one day enhance the experience of navigating while Blind.
From the Intel Press Release:
Intel just announced a research project involving an AI-powered backpack that can help the visually impaired navigate and perceive the world around them with voice commands.
Artificial intelligence (AI) developer Jagadish K. Mahendran and his team designed an AI-powered, voice-activated backpack that can help the visually impaired navigate and perceive the world around them. The backpack helps detect common challenges such as traffic signs, hanging obstacles, crosswalks, moving objects and changing elevations, all while running on a low-power, interactive device.
“Last year when I met up with a visually impaired friend, I was struck by the irony that while I have been teaching robots to see, there are many people who cannot see and need help. This motivated me to build the visual assistance system with OpenCV’s Artificial Intelligence Kit with Depth (OAK-D), powered by Intel.”
– Jagadish K. Mahendran, Artificial Intelligence Engineer
The World Health Organization estimates that globally, 285 million people are visually impaired. Meanwhile, visual assistance systems for navigation are fairly limited and range from Global Positioning System-based, voice-assisted smartphone apps to camera-enabled smart walking stick solutions. These systems lack the depth perception necessary to facilitate independent navigation.
“It’s incredible to see a developer take Intel’s AI technology for the edge and quickly build a solution to make their friend’s life easier,” said Hema Chamraj, director, Technology Advocacy and AI4Good at Intel. “The technology exists; we are only limited by the imagination of the developer community.”
The system is housed inside a small backpack containing a host computing unit, such as a laptop. A vest jacket conceals a camera, and a fanny pack is used to hold a pocket-size battery pack capable of providing approximately eight hours of use. A Luxonis OAK-D spatial AI camera can be affixed to either the vest or fanny pack, then connected to the computing unit in the backpack. Three tiny holes in the vest provide viewports for the OAK-D, which is attached to the inside of the vest.
The OAK-D unit is a versatile and powerful AI device that runs on the Intel Movidius VPU and the Intel® Distribution of OpenVINO™ toolkit for on-chip edge AI inferencing. It is capable of running advanced neural networks while providing accelerated computer vision functions and a real-time depth map from its stereo pair, as well as color information from a single 4K camera.
A Bluetooth-enabled earphone lets the user interact with the system via voice queries and commands, and the system responds with verbal information. As the user moves through their environment, the system audibly conveys information about common obstacles including signs, tree branches and pedestrians. It also warns of upcoming crosswalks, curbs, staircases and entryways.
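As a rough illustration of the detection-to-speech step described above, the logic might look something like the sketch below. This is not the project's published code; the `announce` helper, the detection tuple format, and the warning threshold are all hypothetical assumptions for illustration.

```python
# Hypothetical sketch: turning object detections that carry depth
# information (as an OAK-D style stereo camera can provide) into the
# short verbal messages a user would hear through the earphone.

def announce(detections, warn_distance_m=3.0):
    """Convert (label, distance_m, bearing) detections into spoken messages.

    `bearing` is a coarse direction such as 'left', 'center', or 'right'
    relative to the camera. Only obstacles closer than `warn_distance_m`
    are announced, to avoid overwhelming the user with distant objects.
    """
    messages = []
    for label, distance_m, bearing in detections:
        if distance_m <= warn_distance_m:
            messages.append(f"{label}, {distance_m:.1f} meters, {bearing}")
    return messages

# Example: a crosswalk ahead and a branch to the left are announced;
# a pedestrian 8 meters away is beyond the threshold and stays silent.
print(announce([
    ("crosswalk", 2.5, "center"),
    ("tree branch", 1.2, "left"),
    ("pedestrian", 8.0, "right"),
]))
```

In the real system, the labels come from neural networks running on the OAK-D, the distances from its stereo depth map, and the resulting text is spoken aloud rather than printed.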
More Context: A Vision System for the Visually Impaired (Case Study) | Intel OpenVINO Toolkit | Artificial Intelligence at Intel | MIRA
Contact Your State Services
If you reside in Minnesota, and you would like to know more about Transition Services from State Services contact Transition Coordinator Sheila Koenig by email or contact her via phone at 651-539-2361.
You can follow us on Twitter @BlindAbilities
On the web at www.BlindAbilities.com
Send us an email
Get the Free Blind Abilities App on the App Store and Google Play Store.
Check out the Blind Abilities Community on Facebook, the Blind Abilities Page, and the Career Resources for the Blind and Visually Impaired group