Inside GNSS

Robotic Cane Offers Indoor Navigation for Visually Impaired Users

A robotic cane integrating a color 3D camera, an inertial measurement unit (IMU) and an on-board computer offers indoor navigation assistance to blind and visually impaired users. Using a building’s architectural drawing, the device can accurately guide a user to a desired location with sensory and auditory cues, while avoiding obstacles like boxes, furniture, and overhangs.

“Many people in the visually impaired community consider the white cane to be their best and most functional navigational tool, despite it being century-old technology,” said Cang Ye, Ph.D., lead author of the study and professor of computer science at the College of Engineering at Virginia Commonwealth University, Richmond. “For sighted people, technologies like GPS-based applications have revolutionized navigation. We’re interested in creating a device that closes many of the gaps in functionality for white cane users.”

[Photo: Study author Lingqiu Jin tests the robotic cane. Image courtesy of Cang Ye, VCU.]

Cell phone-based applications can provide outdoor navigation assistance to blind users, for example by giving street directions or helping them stay within crosswalks. Indoor navigation, however, remains a major challenge, especially in unfamiliar buildings. Earlier versions of the cane incorporated building floorplans, inertial sensing, auditory cues and a robotic rolling tip to guide the user to a destination. Over long distances, however, small errors in the estimated position could accumulate, eventually leaving the user at an incorrect location.

To correct this issue, Ye and colleagues added a color depth camera to the system. Using infrared light, the camera measures the distance from the cane to surrounding surfaces: structural features such as doorways and walls, as well as furniture and other obstacles. By integrating these measurements with data from the inertial sensor, the cane’s onboard computer can register the user’s precise location against the building’s architectural drawing or floorplan, while alerting the user to obstacles detected in the path.
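The idea of using depth measurements to rein in inertial drift can be illustrated with a toy example. The sketch below is not the authors' algorithm; the function names, the fixed per-step drift, and the blending gain are all assumptions made for illustration. A dead-reckoned position drifts a little each step, and a range measurement to a wall at a position known from the floorplan pulls the estimate back toward the truth.

```python
# Illustrative sketch (not the published method): limiting inertial
# drift with depth-camera ranges to a wall known from the floorplan.

WALL_X = 10.0  # x-coordinate of a corridor wall in the floorplan (meters)

def imu_step(x_est, step_length, bias=0.02):
    """Dead-reckoned advance along the corridor; the small per-step
    bias models the drift that accumulates over long distances."""
    return x_est + step_length + bias

def camera_correction(x_est, measured_range, gain=0.7):
    """Blend the inertial estimate with the position implied by the
    camera's range to the wall (position = WALL_X - range)."""
    x_from_camera = WALL_X - measured_range
    return x_est + gain * (x_from_camera - x_est)

# Walk eight one-meter steps down the corridor starting at x = 0.
true_x, est_imu_only, est_fused = 0.0, 0.0, 0.0
for _ in range(8):
    true_x += 1.0
    est_imu_only = imu_step(est_imu_only, 1.0)
    est_fused = imu_step(est_fused, 1.0)
    # Idealized (noise-free) depth reading to the wall ahead.
    est_fused = camera_correction(est_fused, WALL_X - true_x)

drift_imu = abs(est_imu_only - true_x)   # grows with every step
drift_fused = abs(est_fused - true_x)    # stays bounded by the corrections
```

With the assumed numbers, the uncorrected drift reaches 0.16 m after eight steps, while the fused estimate stays within about a centimeter of the true position, which is the qualitative behavior the added camera provides.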

“While some cell phone apps can give people auditory navigation instructions, when going around a corner for example, how do you know you’ve turned just the right amount?” said Ye. “The rolling tip on our robotic cane can guide you to turn at just the right point and exactly the right number of degrees, whether it’s 15 degrees or 90. This version can also alert you to overhanging obstacles, which a standard white cane cannot.”
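Guiding a turn "exactly the right number of degrees" comes down to tracking the signed difference between the commanded heading and the user's current heading. The helper below is a hypothetical sketch of that bookkeeping, not code from the cane; the function names and the 2-degree tolerance are assumptions.

```python
# Hypothetical turn-guidance helpers: how far is the user from the
# commanded heading, and is the turn finished?

def heading_error_deg(target_deg, current_deg):
    """Signed smallest angular difference, in (-180, 180]."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0

def turn_complete(target_deg, current_deg, tol_deg=2.0):
    """A rolling tip could stop steering once the remaining
    heading error falls inside the tolerance."""
    return abs(heading_error_deg(target_deg, current_deg)) <= tol_deg
```

Wrapping the error into (-180, 180] keeps the guidance sensible across the 0/360 boundary: turning from a 350-degree heading to a 90-degree one is reported as a 100-degree turn, not a 260-degree one.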

Some work remains before the system is market-ready. Development of the device was co-funded by the National Institutes of Health’s National Eye Institute (NEI) and the National Institute of Biomedical Imaging and Bioengineering (NIBIB). Details of the updated design were published in the IEEE/CAA Journal of Automatica Sinica.
