Could LIDAR on the new iPhone 12 Pro camera be a game changer for blind users?

By Stoo, 13 October, 2020

Forum
Apple Hardware and Compatible Accessories

As part of today's announcement of the new iPhone 12 range, the iPhone 12 Pro models are set to include LIDAR.

This is a sensor that emits pulses of light and measures the reflections returning from objects in the environment. It is designed to improve augmented reality by tracking movement, and to speed up camera focus.
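For anyone curious about the underlying principle: LIDAR is a time-of-flight sensor, so the distance to a surface is simply the speed of light multiplied by the pulse's round-trip time, halved. A toy sketch of that arithmetic (my own illustration, not Apple's implementation):

```python
# Time-of-flight principle behind LIDAR:
# distance = (speed of light x round-trip time) / 2
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_tof(round_trip_seconds):
    """Distance to the reflecting surface, given the pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0
```

A 20-nanosecond round trip, for example, corresponds to a surface roughly three metres away, which gives a sense of how precisely the sensor must time each pulse.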

My question is: could this be a piece of technology that massively improves life for us visually impaired and blind iPhone users?

As far as I'm aware, this is the first time we could hold in our hands a device that has the ability to scan a room and build a map of the environment with depth and texture.

We use audio signals in day to day life, often without even noticing, taking sound reflected off objects and surfaces to help with navigation.

Now there is the potential of carrying something with us that, if developers use it effectively, could produce accessible scans of our environment.

Imagine a submarine's sonar system; my impression is that LIDAR works much the same way, just with reflected light instead of reflected sound.

It can't be too much of a leap to translate this technology into something accessible for us, can it?
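To make the sonar analogy concrete, here is a toy sketch of how one horizontal scan from a depth map could be turned into audio cues, with nearer objects sounding higher-pitched and left-to-right position mapped to stereo pan. All names, ranges, and the pitch mapping here are my own assumptions for illustration, not the design of any shipping app:

```python
# Hypothetical sketch: sonifying one row of a LIDAR depth map,
# in the spirit of sonar. Ranges and mappings are assumptions.

def depth_to_pitch(depth_m, near=0.5, far=5.0, low_hz=220.0, high_hz=1760.0):
    """Map a distance in metres to a pitch: nearer objects sound higher."""
    d = min(max(depth_m, near), far)   # clamp to the useful sensing range
    t = (far - d) / (far - near)       # 1.0 = nearest, 0.0 = farthest
    return low_hz + t * (high_hz - low_hz)

def sonify_row(depths):
    """Return (pan, pitch) pairs for one horizontal scan of depth samples."""
    n = len(depths)
    cues = []
    for i, d in enumerate(depths):
        # Spread samples across the stereo field: -1.0 = left, +1.0 = right.
        pan = -1.0 + 2.0 * i / (n - 1) if n > 1 else 0.0
        cues.append((pan, depth_to_pitch(d)))
    return cues
```

A real app would of course do far more (smoothing, object segmentation, head tracking), but the core idea of translating depth into sound is this simple.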


Comments

By Staque on Tuesday, November 3, 2020 - 09:15

Game changer... Yes.

Translate the technology into something accessible... Yes! (Already done.)

That technology has been the basis of my CS Master's project, SEAR-RL, which I have been working on for over a decade. It is an iOS augmented reality app that translates one's surroundings into sound so one can navigate through a real-world place. I got it working on the 2020 iPad Pro when it came out. Including LIDAR in the iPhone saves having to use external hardware, which was a major barrier until now.

Now, if I just had a way to get a little more funding and perform more user testing, I could make it a refined product. (I was looking into all that at the beginning of the year, but Covid put a nice kibosh on it.)

Oh well, a man's gotta eat.

Staque

By Holger Fiallo on Tuesday, November 3, 2020 - 09:15

We'll see. It will depend on the people who create apps.

By Brad on Tuesday, November 3, 2020 - 09:15

In reply to by Staque

I'm going to buy the iPhone 12 mini or the iPhone 12 soon and think it would be cool to test out. I don't know if those models have all the features you need for it to work, though, so I guess you'd have to look into it.

Also: does it cost money to put things on TestFlight?

By Patrick Hurst on Tuesday, November 3, 2020 - 09:15

I think so too... But it would not just be the LIDAR that brings new opportunities; all the other additional camera features will as well:
- Image recognition in particular is announced to make full use of the A14's processing power;
- Wide-angle detection with three back cameras, together with "noise" cancellation for vibrations, hand movements, and bad lighting conditions; all this should hugely facilitate the tasks we blind users use the camera for: text, QR, and barcode recognition, color/light detection, etc.
I think all these criteria are worth investing in a Pro model, even if the apps won't be available the first day or month.

By Patrick Hurst on Tuesday, November 3, 2020 - 09:15

Hey, I just listened to the video on SEAR-RL you provided with the link above, and it sounds amazing! The sounds seem quite artificial and intense to me, and it should not be louder than VoiceOver when recognizing text, but I can definitely imagine using such an app. It has the same potential that BlindSquare and Seeing AI do.

By Unregistered User (not verified) on Tuesday, November 3, 2020 - 09:15

That's an awful lot of money to drop on a feature that hasn't proven itself yet. Then again, it's not that much more to jump from the standard 12 to the 12 Pro with the same storage options. I feel like this would be a lot more useful if they put LIDAR into the rumored glasses. Imagine sweeping a cane in one hand and holding your phone up in the other; that doesn't sound very fun. I think it's a toss-up at the moment.

By peter on Tuesday, November 3, 2020 - 09:15

Agreed that this feature is expensive and of dubious utility to the blind now.

However, it is very encouraging that this feature is starting to appear in devices as small and inexpensive as phones. The technology will undoubtedly become less expensive as time goes by, and mass production of more devices will bring prices down even further. Also, as the technology becomes less expensive and more ubiquitous, developers will have more opportunities to work with it and will figure out how it can benefit visually impaired folks.

So, although we won't be taking advantage of this technology right now, this is a very encouraging sign that points to what will be possible in the near future. If this technology is in the high-end phones now, it will surely become a standard tool on all or most phones eventually.

--Pete

By Roxann Pollard on Tuesday, November 3, 2020 - 09:15

I believe that the Bose Frames started the ball rolling with regard to enhancing visually impaired users' abilities, both in navigation and in sound enhancement of their environment. Although I don't own the Bose Frames, from what I have read, I would think that if one paired an iPhone with the Bose Frames, we might just be able to have the best of both worlds until Apple Glass gets released.

By bonerobot on Sunday, January 3, 2021 - 09:15

Hi guys,

As you've noticed, Microsoft released an update to its Seeing AI app a day ago.
Microsoft says in the update description that Seeing AI has added a new channel called "World."
This feature uses Apple's LIDAR scanner to give the user a good impression of the room. Has anyone tested this channel yet?