Could LIDAR on the new iPhone 12 Pro camera be a game changer for blind users?
As part of today's announcement of the new iPhone 12 range, the iPhone 12 Pro models are set to include LIDAR.
This works by detecting light reflected off objects in the environment, and is designed to improve augmented reality by tracking movement and to improve camera focus.
My question is: could this be a piece of technology that massively improves life for us VI and blind iPhone users?
As far as I'm aware, this is the first time we could hold in our hands a device that has the ability to scan a room and build a map of the environment with depth and texture.
We use audio signals in day-to-day life, often without even noticing, using sound reflected off objects and surfaces to help with navigation.
Now there is the potential to carry something with us that, if developers use it effectively, could make accessible scans of our environment.
Imagine a submarine's sonar system; my impression is that LIDAR works much the same way, just with light instead of sound reflection.
It can't be too much of a leap to translate this technology into something accessible for us, can it?
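The sonar analogy holds up well: both systems time a reflected pulse and halve the round trip to get a distance, they just use waves of very different speeds. A minimal sketch of the idea (the numbers here are illustrative, not iPhone specs):

```python
# Illustrative sketch: both sonar and LIDAR estimate distance from the
# round-trip time of a reflected pulse -- sound for sonar, light for LIDAR.

SPEED_OF_LIGHT = 299_792_458.0   # metres per second, in a vacuum
SPEED_OF_SOUND_WATER = 1_480.0   # metres per second, rough value for seawater

def distance_from_round_trip(round_trip_seconds: float, wave_speed: float) -> float:
    """Distance to the reflecting object: the pulse travels there and back,
    so halve the total path."""
    return wave_speed * round_trip_seconds / 2.0

# A LIDAR return after 20 nanoseconds implies an object about 3 metres away:
print(distance_from_round_trip(20e-9, SPEED_OF_LIGHT))      # ~3.0 m
# A sonar ping returning after 2 seconds implies a target ~1480 metres away:
print(distance_from_round_trip(2.0, SPEED_OF_SOUND_WATER))  # 1480.0 m
```

Because light is so much faster than sound, the timing electronics have to be far more precise, which is part of why phone-sized LIDAR is only arriving now.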
Game changer... Yes.
Translate the technology into something accessible... Yes! (Already done.)
That technology has been the basis of my CS Master's project, SEAR-RL, which I have been working on for over a decade. It is an iOS augmented reality app that translates one's surroundings into sound so one can navigate through a real-world place. I got it working on the 2020 iPad Pro when it came out. Including LIDAR in the iPhone saves having to use external hardware, which was a major barrier until now.
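The general depth-to-sound idea can be sketched roughly like this (a generic illustration only, not SEAR-RL's actual mapping; the function name and parameters are made up for the example): take a horizontal slice of depth readings, map each column to a stereo pan position, and make nearer obstacles louder.

```python
# Generic sonification sketch -- NOT SEAR-RL's real algorithm, just one
# simple way to turn a row of depth readings into audio cues.

def sonify_depth_row(depths, max_depth=5.0):
    """Map a left-to-right row of depth readings (metres) to audio cues.

    Returns (pan, gain) pairs: pan in [-1, 1] from left to right,
    gain in [0, 1] where nearer obstacles are louder."""
    n = len(depths)
    cues = []
    for i, d in enumerate(depths):
        pan = -1.0 + 2.0 * i / (n - 1) if n > 1 else 0.0
        gain = max(0.0, 1.0 - min(d, max_depth) / max_depth)
        cues.append((pan, gain))
    return cues

# A wall 1 m away on the left, open space ahead, a door frame 2 m to the right:
for pan, gain in sonify_depth_row([1.0, 5.0, 2.0]):
    print(f"pan={pan:+.1f} gain={gain:.2f}")
```

A real app would feed cues like these into a spatial audio engine rather than printing them, and would use pitch, timbre, or repetition rate to encode height and texture as well, but the core mapping of geometry to sound parameters is the same shape of problem.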
Now, if I just had a way to get a little more funding and do more user testing to make it a refined product. (I was looking into all that at the beginning of the year, but Covid put the kibosh on it.)
Oh well, a man's gotta eat.
We'll see. It will depend on the people who create the apps.
I'm going to buy the iPhone 12 mini or the iPhone 12 soon and think it would be cool to test out. I don't know if it has all the features you need for it to work, though, so I guess you'd have to look into it.
Also: does it cost money to put things on TestFlight?
It's only the Pro range that has LIDAR. I am considering the Pro for this very reason, but then I'd also be purchasing a very capable camera for no reason.
The rumour mill suggests that the LIDAR tech and associated apps will make their way to Apple's glasses, which are expected over the next couple of years. It feels like this is a test bed: getting developers interested and letting them explore the possibilities. The idea with having it on the glasses is better privacy; 3D modelling the environment allows for very interesting interactions compared with image manipulation.
I'd be very interested to understand how your LIDAR application works; mainly, how do you translate 3D objects into sound? I 3D print and do some limited 3D designing, so new ways of communicating the 3D world are very interesting to me. Is it like The vOICe, with a left-to-right scan, or is it more instantly spatial than that?
Thanks for your answer.
I think so too... But it would not just be the LIDAR that brings new opportunities; all the other additional camera features will as well:
- Image recognition in particular is announced to make full use of the A14's processing power;
- Wide-angle detection with three back cameras, together with "noise" cancellation for vibrations, hand movements, and bad lighting conditions; all this should hugely facilitate the tasks we blind users use the camera for: text, QR, and barcode recognition, colour/light detection, etc.
I think all these points make it worth investing in a Pro model, even if the apps won't be available on the first day or in the first month.
Hey, I just listened to the video on SEAR-RL you linked above and it sounds amazing! The sounds appear quite artificial and intense to me, and it should not be louder than VO when recognising text, but I can definitely imagine using such an app. It has the same potential BlindSquare and Seeing AI do.
That's an awful lot of money to drop on a feature that hasn't proven itself yet. Then again, it's not that much more to jump from the standard 12 to the 12 Pro if you pick the same storage option. I feel like this would be a lot more useful if they stuck LIDAR into the rumoured glasses. Imagine sweeping a cane in one hand and holding your phone up in the other. Doesn't sound very fun. I think it's a toss-up at the moment.
Agreed that this feature is expensive and of dubious utility to the blind now.
However, it is very encouraging that this feature is starting to be introduced in devices as small and inexpensive as phones. The technology will undoubtedly become less expensive as time goes by and mass production of more devices brings prices down even further. Also, as the technology becomes less expensive and more ubiquitous developers will have more opportunities to work with it and will figure out how the technology can benefit visually impaired folks.
So, although we won't be taking advantage of this technology right now, this is a very encouraging sign and points to what will be possible in the near future. If this technology is in the high-end phones now, it will surely become a standard tool in all or most phones eventually.
I believe that the Bose Frames started the ball rolling with regard to enhancing visually impaired users' abilities, both in navigation and in sound enhancement of their environment. Although I don't own the Bose Frames, from what I have read, I would think that if one paired an iPhone 13 with the Bose Frames, we might just be able to have the best of both worlds until Apple Glass gets released.
As you may have noticed, Microsoft released an update to its Seeing AI app a day ago.
Microsoft says in the update description that Seeing AI has now added a new channel called "World."
This feature uses Apple's LIDAR scanner to give the user a good impression of the room. Has anyone tested this channel already?