Researchers at the NYU Tandon School of Engineering have published a new study in JMIR Rehabilitation and Assistive Technology describing an inventive navigation system, evaluated in virtual reality, that offers hope to people who are blind or have low vision (pBLV).
The device, which combines vibration and sound feedback, is intended to help people navigate complex real-world environments more safely and effectively.
The study builds on previous work by John-Ross Rizzo, Maurizio Porfiri, and others to create a first-of-its-kind wearable device to assist pBLV in navigating their surroundings independently.
Traditional mobility aids have key limitations that we want to overcome. White canes only detect objects through contact and miss obstacles outside their range, while guide dogs require extensive training and are costly. As a result, only 2 to 8 percent of visually impaired Americans use either aid.
Fabiana Sofia Ricci, Study Lead Author and PhD Candidate, Tandon Department of Biomedical Engineering, New York University
In this study, the research team condensed the haptic feedback hardware of its earlier backpack-based device into a discreet belt equipped with ten precision vibration motors. The belt's electronic components, including a circuit board and microprocessor, fit inside a simple waist bag, marking a significant step toward making the technology practical for real-world use.
The technology delivers two types of sensory feedback: vibrations through the belt indicate obstacle position and proximity, and auditory beeps through a headset increase in frequency as users approach impediments in their path.
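To make the feedback scheme concrete, the sketch below shows one plausible way such a mapping could work: a detected obstacle's bearing selects one of the belt's ten motors, its distance sets the vibration intensity, and the beep interval shortens as the obstacle gets closer. This is only an illustrative sketch, not the team's actual firmware; the sensing range, timing values, and motor layout are assumptions.

```python
import math

NUM_MOTORS = 10          # the belt described in the article carries ten vibration motors
MAX_RANGE_M = 3.0        # illustrative sensing range (assumption, not from the study)

def select_motor(bearing_deg: float) -> int:
    """Map an obstacle bearing (0 deg = straight ahead) to one of the belt's
    motors, assumed here to be spaced evenly around the waist."""
    normalized = (bearing_deg % 360.0) / 360.0
    return int(normalized * NUM_MOTORS) % NUM_MOTORS

def vibration_intensity(distance_m: float) -> float:
    """Closer obstacles produce stronger vibration; 0.0 when out of range."""
    if distance_m >= MAX_RANGE_M:
        return 0.0
    return 1.0 - (distance_m / MAX_RANGE_M)

def beep_interval_s(distance_m: float) -> float:
    """Auditory beeps repeat faster as the user approaches an obstacle
    (specific timing values are illustrative)."""
    return max(0.1, 0.2 + 0.4 * distance_m)

# Example: an obstacle 1.2 m away, slightly to the user's right
motor = select_motor(bearing_deg=30.0)
print(motor, round(vibration_intensity(1.2), 2), round(beep_interval_s(1.2), 2))
```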
We want to reach a point where the technology we’re building is light, largely unseen and has all the necessary performance required for efficient and safe navigation.
John-Ross Rizzo, Associate Professor, New York University
He added, “The goal is something you can wear with any type of clothing, so people are not bothered in any way by the technology.”
The researchers evaluated the device by enrolling 72 participants with normal vision to wear Meta Quest 2 VR headsets and the haptic feedback belt while navigating NYU's Media Commons at 370 Jay Street in Downtown Brooklyn, an open room bounded only by side curtains.
The participants used their headsets to view a simulated subway station as someone with advanced glaucoma would see it: with limited peripheral vision, obscured details, and altered color perception. The environment, built with the Unity game engine to match the room’s exact dimensions, allowed the team to assess how effectively participants could navigate using the belt’s vibrations and audio feedback while their vision was impaired.
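The article does not detail the Unity implementation, but the visual degradation it describes (restricted peripheral field, obscured detail, shifted color perception) can be roughly approximated in code. The NumPy sketch below is offered purely as an illustration with assumed parameter values; it applies a tunnel-vision mask and partial desaturation to an image frame, whereas the actual study used a Unity-based simulation designed with mobility specialists and ophthalmologists.

```python
import numpy as np

def simulate_glaucoma_view(frame: np.ndarray,
                           field_radius_frac: float = 0.35,
                           desaturation: float = 0.6) -> np.ndarray:
    """Crude approximation of advanced-glaucoma vision: darken the periphery
    (tunnel vision) and wash out color. Parameter values are illustrative guesses."""
    h, w, _ = frame.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = h / 2.0, w / 2.0
    dist = np.sqrt((yy - cy) ** 2 + (xx - cx) ** 2)
    radius = field_radius_frac * min(h, w)
    # Smooth falloff from full visibility in the central field to darkness outside
    mask = np.clip(1.0 - (dist - radius) / radius, 0.0, 1.0)[..., None]

    # Shift colors toward gray to mimic altered color perception
    gray = frame.mean(axis=2, keepdims=True)
    washed = (1.0 - desaturation) * frame + desaturation * gray

    return (washed * mask).astype(frame.dtype)

# Example on a synthetic 480x640 RGB frame
frame = np.random.randint(0, 256, size=(480, 640, 3)).astype(np.float64)
degraded = simulate_glaucoma_view(frame)
print(degraded.shape)
```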
We worked with mobility specialists and NYU Langone ophthalmologists to design the VR simulation to accurately recreate advanced glaucoma symptoms. Within this environment, we included common transit challenges that visually impaired people face daily: broken elevators, construction zones, pedestrian traffic, and unexpected obstacles.
Maurizio Porfiri, Professor, New York University
The results showed that haptic feedback significantly reduced collisions with obstacles, while audio cues helped users move more smoothly through the space. Future studies will include participants with vision loss.
The technology complements Commute Booster, a smartphone app developed by a Rizzo-led team to provide navigation guidance for pBLV within subway stations. Commute Booster “reads” station signage and directs passengers to their destination, while the haptic belt could help users avoid obstacles along the way.
Rizzo, Porfiri, and a group of NYU colleagues received a $5 million grant from the National Science Foundation (NSF) in December 2023 through its Convergence Accelerator. This program supports the development of assistive and rehabilitative technology.
That grant and others from the NSF sponsored this research and will also help build Commute Booster. The study’s authors, in addition to Ricci, Rizzo, and Porfiri, are Lorenzo Liguori and Eduardo Palermo from Sapienza University's Department of Mechanical and Aerospace Engineering in Rome, Italy.
Journal Reference:
Ricci, F. S., et al. (2024). Navigation Training for Persons with Visual Disability Through Multisensory Assistive Technology: Mixed Methods Experimental Study. JMIR Rehabilitation and Assistive Technology. https://doi.org/10.2196/55776