The French startup SonarVision is developing an application that assists visually impaired individuals in navigating city streets using high-precision location tracking and “3D audio.”
In a large, bustling city like Paris, a visually impaired person can easily get lost the moment they step out of a metro station. Recognizing this challenge, two French engineering students have developed an augmented reality (AR) navigation application that identifies the easiest routes for visually impaired users and uses “spatial audio” – also known as 3D audio – to guide them along the correct path.
“We are trying to create something really simple so that you can receive 3D audio from the correct direction. You turn towards the sound source and then you can continue your journey,” co-founder and CEO of SonarVision, Nathan Daix, explains.
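The core idea of guiding someone by sound direction can be sketched in a few lines: compute the compass bearing from the user to the next waypoint, compare it with the direction the user is facing, and pan the audio cue left or right accordingly. The function names and the pan convention below are illustrative assumptions, not SonarVision's actual implementation.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def stereo_pan(user_heading_deg, target_bearing_deg):
    """Map the angle between the user's heading and the target bearing to a
    stereo pan value: -1.0 = hard left, 0.0 = straight ahead, +1.0 = hard right."""
    # Signed angle in (-180, 180]: positive means the target is to the right.
    diff = (target_bearing_deg - user_heading_deg + 180) % 360 - 180
    return max(-1.0, min(1.0, diff / 90.0))
```

A waypoint due east of a user facing north, for example, would produce a pan of +1.0, so the audio cue plays entirely in the right ear until the user turns toward it.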
Augmented reality (AR) and “3D audio” can assist visually impaired individuals in navigating the city. (Image: Canva).
This application is still under development and testing, but the young startup aims to launch it in the market this year. The testing program is taking place in Paris but can easily be applied in other major European capitals.
Daix’s project has been selected for funding by the incubator program at CentraleSupélec. Several applications already exist to alert visually impaired users while they are out and about, such as Blindsquare and Soundscape, but SonarVision aims to go further: it can guide a person from point A to point B like a high-precision GPS device while remaining highly intuitive to use.
According to Daix, mainstream navigation applications like Google Maps and Apple Maps do not adequately meet the needs of visually impaired users and are difficult to use. Moreover, one of the frustrating issues with these products is their accuracy.
GPS in cities typically achieves an accuracy of about 4 to 5 meters. In roughly 30% of cases, however, the error can grow by a further 10 meters. This is dangerous: GPS may report that you have arrived at a bus stop when you are actually on the other side of the street, leaving you to find your way to the stop without any clear indication of where you really are.
To address this issue, SonarVision utilizes the phone’s camera to scan the positions of surrounding buildings using AR technology and compares them to Apple’s database for a specific city.
As a result, Nathan Daix states that they can track users’ geographical locations with an accuracy of 0.2 to 1 meter.
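The general principle behind combining a coarse GPS fix with a much more precise visual one can be illustrated with inverse-variance weighting, a standard sensor-fusion technique. This is a textbook sketch of why a 0.2–1 meter visual estimate dominates a 5 meter GPS estimate, not a description of SonarVision's actual algorithm.

```python
def fuse(est_gps, var_gps, est_visual, var_visual):
    """Inverse-variance weighted fusion of two 1-D position estimates.

    Each estimate is weighted by the reciprocal of its variance, so the
    more precise sensor dominates the fused result.
    """
    w_gps, w_vis = 1.0 / var_gps, 1.0 / var_visual
    fused = (w_gps * est_gps + w_vis * est_visual) / (w_gps + w_vis)
    fused_var = 1.0 / (w_gps + w_vis)
    return fused, fused_var

# GPS says the user is 10 m away (sigma = 5 m, variance = 25);
# visual localization says 0 m (sigma = 0.5 m, variance = 0.25).
position, variance = fuse(10.0, 25.0, 0.0, 0.25)
```

Here the fused position lands within about 10 centimeters of the visual estimate, which is why a sub-meter visual fix effectively overrides a noisy GPS reading.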
Visually impaired individuals can accurately stand at crosswalks and sidewalks while avoiding stairs and construction areas.
For this technology to function, all users need is headphones and an iPhone with a camera facing the street. In the future, the camera component may be integrated into AR glasses for more convenience. This way, users will no longer need to hold their phones to scan the surrounding landscape.
However, this application cannot detect obstacles in real-time. It is merely a supplementary support product for visually impaired users, alongside white canes, guide dogs, or other devices.
Daix’s team is also leveraging LiDAR (light detection and ranging), the depth sensor built into recent iPhones, to help visually impaired users detect obstacles in their path by measuring light and depth in the surrounding environment.
“It alerts us to obstacles, not just at ground level, but also at head and body height. It is truly a powerful tool,” the CEO of SonarVision shares.
With the potential to replace the traditional white cane, this feature is currently being tested on the iPhone 12 Pro. Unfortunately, LiDAR is only available on more expensive models, the iPhone 12 Pro and later, and SonarVision is striving to make the technology as accessible as possible.