As part of today's announcement of the new iPhone 12 range, the iPhone 12 Pro models are set to include LiDAR.
This is a sensor that measures light reflected off objects in the environment, and it is designed to improve augmented reality by tracking movement and to sharpen camera focus.
My question is: could this be a bit of technology that massively improves life for us visually impaired (VI) and blind iPhone users?
As far as I'm aware, this is the first time we could hold in our hands a device that has the ability to scan a room and build a map of the environment with depth and texture.
We use audio signals in day-to-day life, often without even noticing, using sound reflected off objects and surfaces to help with navigation.
Now there is the potential to carry something with us that, if developers use it effectively, could produce accessible scans of our environment.
Imagine a submarine's sonar system: my impression is that LiDAR works in much the same way, just with reflected light instead of reflected sound.
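The analogy can be made concrete: both sonar and LiDAR are time-of-flight rangefinders, recovering distance from how long a pulse takes to bounce back; only the propagation speed differs. A minimal sketch (illustrative numbers, not anything from Apple's actual implementation):

```python
# Hedged sketch: distance from a round-trip echo, for sonar and LiDAR alike.
# The speeds below are approximate, illustrative constants.

SPEED_OF_SOUND_IN_WATER = 1_500.0   # metres per second (approx.)
SPEED_OF_LIGHT = 299_792_458.0      # metres per second (in vacuum)

def distance_from_echo(round_trip_seconds: float, speed: float) -> float:
    """One-way distance to the object: the pulse travels out and back."""
    return speed * round_trip_seconds / 2

# A sonar echo returning after 0.1 s implies an object about 75 m away:
print(distance_from_echo(0.1, SPEED_OF_SOUND_IN_WATER))  # 75.0
# A LiDAR pulse reflecting off something 3 m away returns in about 20 nanoseconds:
print(2 * 3.0 / SPEED_OF_LIGHT)
```

The tiny round-trip times for light are why LiDAR needs very fast timing hardware, but the underlying idea is the same echo-ranging principle a sonar operator would recognise.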
It can't be too much of a leap to translate this technology into something accessible for us, can it?