Duncan Bell
Jun 11, 2024

The Final Link in Empowering Those with Sight Loss?

Depth Sensors in Consumer Mobile Devices.

Introduction

For the more than 43 million people with sight loss, daily life presents challenges that sighted people rarely have to consider. Whilst they work, play and live in the same physical world as everyone else, they perceive it in a different sensory format. They have long adopted technological and non-technological aids to assist them where necessary. Despite advancements in such technology, many existing aids still leave much to be desired, particularly in terms of spatial awareness and obstacle detection. A promising new development in consumer mobile devices - depth sensors - holds the potential to change how individuals with sight loss perceive their surroundings. By providing real-time depth data, these sensors could become a genuine game-changer in empowering those with sight loss.

Understanding Depth Sensors

Depth sensors are sophisticated devices that measure the distance between the sensor and objects in the environment. Active sensors work by emitting a signal, such as light or sound, and measuring the time it takes for the signal to return after bouncing off an object; passive approaches, such as stereo vision, instead infer depth by comparing images from two offset cameras. This data is then used to create a three-dimensional map of the surroundings. The main types of depth sensor are 1) Stereo Vision, 2) Structured Light and 3) Time of Flight - both Indirect Time of Flight (iToF) and Direct Time of Flight (dToF), the latter including LiDAR. These technologies are already being incorporated into popular consumer devices such as the latest iPhone Pro models and certain high-end Android devices.
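
To make the time-of-flight idea concrete, here is a minimal sketch in Python of the dToF calculation described above: half the round-trip time of a light pulse multiplied by the speed of light. The timing values are illustrative, not output from a real sensor.

```python
# Minimal sketch of the direct time-of-flight (dToF) principle:
# distance = (speed of light x round-trip time) / 2.
# Timing values below are illustrative, not real sensor output.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def dtof_distance_m(round_trip_time_s: float) -> float:
    """Convert a round-trip pulse time into a one-way distance in metres."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

# A pulse returning after ~13.3 nanoseconds corresponds to roughly 2 metres.
for t_ns in (3.3, 13.3, 66.7):
    print(f"{t_ns:5.1f} ns round trip -> {dtof_distance_m(t_ns * 1e-9):.2f} m")
```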

The Challenges of Navigating with Sight Loss

For individuals with sight loss, navigating through daily life poses significant challenges. Common obstacles include avoiding tripping hazards, detecting overhead barriers, and accurately gauging distances to objects. Traditional aids like canes and guide dogs, while immensely helpful, do have their limitations. Canes can only detect obstacles through touch, sometimes too late to avoid collisions, while guide dogs require extensive training and care. GPS-based applications have helped greatly with general navigation but lack the precision needed for real-time obstacle detection and spatial awareness.

Building on the Past

The release of Microsoft's Kinect depth sensor in 2010 sparked a flurry of research which clearly identified the potential of depth sensing as a complementary technology to image-based solutions. In 2012 Stephen L. Hicks et al. at Oxford University proposed a head-mounted Kinect for depth sensing, with encouraging results. In 2013 Osama Halabi et al. went a step further, combining AR with 3D spatial sound to better communicate the surrounding room.

Somewhat strangely, Microsoft went on to ignore depth in the 2017 release of its Seeing AI app, choosing instead to combine the 2D camera image with its artificial intelligence models to achieve tasks such as scene description and money and food identification. Then, in 2021, Google developed technology specifically aimed at runners with sight loss: Project Guideline uses machine learning to identify a line on the running track and synthesized sound to guide the runner in real time. Truly empowering!

How Depth Sensors Can Help

Depth sensors can dramatically enhance spatial awareness for those with sight loss. The challenges not easily solved by image-based solutions are best illustrated by the question “is it a small obstacle really close to me, or a large obstacle a good distance away?” For a person with sight loss this is a major concern. By providing detailed, real-time depth information about the environment, these sensors can help users detect and identify obstacles well in advance. Imagine a mobile device, carried in a top pocket or hung around the neck, that not only guides you to your destination but also alerts you to a low-hanging branch or an uneven sidewalk. Depth sensors can also improve object detection - an area we at ROONIQ are developing - by providing accurate distance measurements, making it easier for users to navigate around furniture and other obstacles in their homes.
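
As a rough illustration of how depth data resolves that near-versus-far ambiguity, the Python sketch below takes a depth frame (here a placeholder NumPy array in metres; a real app would read it from the phone's depth API) and reports the distance to the closest point in the corridor directly ahead of the user.

```python
# Sketch: using a depth map to answer "how far away is the nearest obstacle?",
# something a single 2D image cannot answer on its own. The depth frame here
# is a placeholder array in metres rather than real sensor output.
import numpy as np

ALERT_DISTANCE_M = 1.5  # hypothetical threshold for warning the user

def nearest_obstacle_m(depth_m: np.ndarray) -> float:
    """Return the distance to the closest valid point in the central corridor."""
    h, w = depth_m.shape
    corridor = depth_m[:, w // 3 : 2 * w // 3]   # middle third, i.e. straight ahead
    valid = corridor[corridor > 0]                # zero often marks missing data
    return float(valid.min()) if valid.size else float("inf")

# Fake 4x6 depth frame: a nearby object at 1.2 m directly ahead, background at 4 m.
frame = np.full((4, 6), 4.0)
frame[2, 3] = 1.2

d = nearest_obstacle_m(frame)
if d < ALERT_DISTANCE_M:
    print(f"Obstacle ahead at {d:.1f} m")         # would trigger a haptic or audio cue
```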

One of the most exciting prospects is the integration of depth data with Augmented Reality (AR) applications. AR can overlay useful information onto the real world, creating an immersive and informative experience. Only 10-15% of people with sight loss are categorized as having no useful vision; the remainder have some vision. An AR- and depth-enabled view of the world can be rendered in high contrast with customizable colour schemes, filtering out unimportant detail and emphasizing what matters most: highlighting clear paths, identifying hazards, and providing a richer, more detailed understanding of the objects in and around the user's immediate environment.
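
The sketch below illustrates one way such a high-contrast, depth-filtered view could be produced: each pixel of a depth frame is placed into a near, mid or far band and given a contrasting colour. The band boundaries and colours are illustrative assumptions, not values from any shipping product.

```python
# Sketch: turning raw depth into a simplified, high-contrast AR overlay.
import numpy as np

# Hypothetical bands: anything nearer than 1 m is a hazard, 1-3 m is of
# interest, everything beyond is background to be dimmed out.
BANDS_M = [1.0, 3.0]
COLOURS = {0: (255, 255, 0),    # hazard band  -> bright yellow
           1: (255, 255, 255),  # mid band     -> white
           2: (40, 40, 40)}     # background   -> dimmed

def contrast_overlay(depth_m: np.ndarray) -> np.ndarray:
    """Map each depth pixel to one of three high-contrast colours."""
    band = np.digitize(depth_m, BANDS_M)          # 0, 1 or 2 per pixel
    rgb = np.zeros((*depth_m.shape, 3), dtype=np.uint8)
    for b, colour in COLOURS.items():
        rgb[band == b] = colour
    return rgb

demo = np.array([[0.6, 2.0, 5.0]])                # one row: near, mid, far
print(contrast_overlay(demo)[0])                  # hazard, mid, background colours
```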

Future Applications and Innovations

Modern mobile phones have immense processing power, extremely high resolution cameras, advanced haptic interfaces and now depth sensors. They could be seamlessly integrated with existing assistive technologies, such as smart canes or wearable devices, to provide continuous spatial awareness. Imagine a pair of smart glasses that uses depth sensors to build a detailed map of the surroundings, alerting the wearer to hazards and guiding them safely through crowded areas. The possibilities are many.
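
One simple way such an alert could work is to map the distance of the nearest obstacle to the tempo of a haptic pulse, so that closer hazards produce faster vibration. The sketch below shows the idea; the distance range and pulse timings are assumptions for illustration only.

```python
# Sketch: mapping obstacle distance to a haptic cue, the kind of feedback a
# depth-aware phone or pair of smart glasses could provide.
def haptic_interval_s(distance_m: float,
                      min_d: float = 0.5,
                      max_d: float = 4.0) -> float | None:
    """Closer obstacles -> faster pulses. Returns None when nothing is in range."""
    if distance_m >= max_d:
        return None                               # nothing to report
    clamped = max(distance_m, min_d)
    # Linearly scale: 0.5 m -> pulse every 0.1 s, 4.0 m -> every 1.0 s.
    fraction = (clamped - min_d) / (max_d - min_d)
    return 0.1 + fraction * 0.9

for d in (0.4, 1.0, 2.5, 5.0):
    print(d, "m ->", haptic_interval_s(d))
```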

Smart Environment Integration

Companies such as Apple are developing detailed three-dimensional maps of major public indoor environments. These maps, converted into depth data, can help users navigate complex spaces like airports and shopping malls with greater confidence and independence. Where a sighted individual can compare visual cues against a map, those with sight loss can achieve the same with the depth information they receive from the environment in real time.

Challenges and Considerations

Despite their potential, there are challenges to the widespread adoption of depth sensors. Privacy concerns are paramount, as these sensors collect detailed data about users' surroundings; ensuring that this data is used responsibly and securely is crucial. Additionally, making these technologies accessible and affordable for the visually impaired community is essential. Current high-end devices with advanced depth sensors can be prohibitively expensive, limiting their availability to those who could benefit the most. Technical limitations also need addressing, such as battery life, processing power, and the accuracy and reliability of depth data across varying lighting conditions and environments.

Conclusion

Depth sensors hold immense potential to transform the lives of those with sight loss by providing enhanced spatial awareness and obstacle detection. As this technology continues to advance, we can look forward to a future where individuals with sight loss navigate their surroundings with greater ease and confidence. By supporting technological innovations and advocating for accessibility, we can help ensure that these life-changing advancements are available to all who need them. The journey towards a more inclusive world is ongoing, and with depth sensors, we can take a significant step in the right direction.

Additional Resources

For those interested in learning more, here are some additional resources:

By staying informed and engaged, we can all contribute to a future where technology empowers everyone, regardless of their abilities.