Disclosures: The authors report no relevant financial disclosures.
October 06, 2021

Researchers create cane with color 3D camera, sensors for visually impaired navigation


Researchers at Virginia Commonwealth University developed a cane with a visual positioning system and color-depth camera to aid in daily navigation for people who are blind or visually impaired.

“Many people in the visually impaired community consider the white cane to be their best and most functional navigational tool, despite it being century-old technology,” lead study author Cang Ye, PhD, professor of computer science at Virginia Commonwealth University, said in a press release from the National Eye Institute, which funded development of the device with the National Institute of Biomedical Imaging and Bioengineering. “For sighted people, technologies like GPS-based applications have revolutionized navigation. We’re interested in creating a device that closes many of the gaps in functionality for white cane users.”

Ye, He Zhang, PhD, and Lingqiu Jin, MS, designed the cane to create a 3D map of the user’s surroundings in real time. The map is superimposed on a 2D floor plan of permanent structures, such as doors and walls, to limit errors when determining the user’s location and orientation.
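The article does not describe how the 3D map is matched against the 2D floor plan. As a rough, hypothetical illustration of the general idea, 3D obstacle points can be flattened onto a floor-plan grid by dropping overhead points and quantizing the remaining ones into occupied cells; the function name and parameters below are assumptions, not the authors’ implementation:

```python
import numpy as np

def project_obstacles_to_floor_plan(points_3d, cell_size=0.05, max_height=1.8):
    """Flatten 3D obstacle points (x, y, z in meters) onto a 2D grid.

    Points above max_height (e.g., ceilings) are ignored; the remaining
    points mark occupied cells on the floor-plan grid, which could then be
    matched against permanent structures such as walls and door frames.
    """
    pts = points_3d[points_3d[:, 2] <= max_height]        # drop overhead points
    cells = np.floor(pts[:, :2] / cell_size).astype(int)  # quantize x, y to grid cells
    return set(map(tuple, cells))                         # occupied cells
```

Matching only against permanent structures, as the researchers describe, helps because furniture and people move while walls and door frames do not.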

Cang Ye

The researchers created a robotic navigation aid (RNA) prototype with an Intel RealSense D435 RGB-D camera and a VN-100 inertial measurement unit (IMU) (VectorNav), and developed a new six-degrees-of-freedom depth-enhanced visual-inertial odometry (DVIO) method, which uses a feature tracker, floor detector and state estimator to estimate the device’s pose and heading more accurately.

DVIO estimation is used to build a 3D map of obstacles on the floor plan; it also feeds the particle filter localization method, which determines the position and heading of the RNA on the 2D floor plan so the user can reach the next point of interest while avoiding obstacles.
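The study’s exact localization algorithm is not given in the article. As a generic sketch of one particle-filter cycle of the kind described, each particle is a pose hypothesis that is moved by the odometry estimate, reweighted by how well a range measurement agrees with the floor plan, and resampled; every name and parameter here is hypothetical:

```python
import math
import random

def particle_filter_step(particles, motion, heading_change,
                         measure_dist, expected_dist_fn, noise=0.1):
    """One predict-update-resample cycle of a 2D particle filter.

    particles: list of (x, y, heading, weight) pose hypotheses.
    motion: distance moved since the last step (e.g., from DVIO odometry).
    measure_dist: observed range to a known landmark (e.g., a wall).
    expected_dist_fn(x, y): range the floor plan predicts from that pose.
    """
    # Predict: move each particle by the odometry estimate plus noise.
    moved = []
    for x, y, h, w in particles:
        h2 = h + heading_change + random.gauss(0, noise)
        x2 = x + (motion + random.gauss(0, noise)) * math.cos(h2)
        y2 = y + (motion + random.gauss(0, noise)) * math.sin(h2)
        moved.append((x2, y2, h2, w))

    # Update: reweight particles by agreement between map and measurement.
    weighted = []
    for x, y, h, _ in moved:
        err = measure_dist - expected_dist_fn(x, y)
        w = math.exp(-err * err / (2 * noise))
        weighted.append((x, y, h, w))
    total = sum(w for *_, w in weighted) or 1.0

    # Resample: draw particles in proportion to their weights.
    probs = [w / total for *_, w in weighted]
    return random.choices(weighted, weights=probs, k=len(weighted))
```

Over repeated cycles the surviving particles cluster around poses consistent with both the odometry and the floor plan, which is how such a filter can pin down position and heading despite noisy sensing.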

The D435 camera and VN-100 IMU work together to provide data for the active rolling tip (ART), which steers the user in the desired direction. The user simultaneously receives audio navigation guidance through a Bluetooth headset.

The ART can be disengaged if the user wants to handle the device as a regular white cane.

“While we have solved the problem of locating the robotic cane in an indoor space by processing the data from camera and motion sensor, moving objects pose a challenge to the system to work properly,” Ye told Healio. “If the environment is highly dynamic (majority of the scene is moving), the vision-based positioning system could fail.”

Zhang and colleagues evaluated the device with two blindfolded sighted participants, and it successfully guided both users to their destination.

“In terms of future work, we will recruit visually impaired human subjects to conduct experiments in various indoor environments to validate the assistive navigation function of the RNA prototype,” the authors wrote in the study.