Labelling photos revisited.

iOS and iPadOS


I wonder if any list member would be kind enough to shed some light on a question that has interested me.

Some packages like Envision AI and TapTapSee allow you to save an image with its description.

ExifEditor also lets us add labels and change properties of photos.

So when VoiceOver users browse the photo library in the Photos app, we hear the description of the photo, which makes it easier for us to organise things.

As I understand it, a sighted person does not see these labels, but information such as the location where the picture was taken is shown.

So there is a discrepancy between what we hear and what sighted people see.

So my question is: are the labels attached to photos only of use to VoiceOver users, or do they serve some useful purpose for the wider public?



Submitted by venova on Wednesday, May 16, 2018

Okay, rephrasing that.
I was on the go and wanted to ask about TapTapSee.
I don’t know how to do that at all. I haven’t found this.
Could you tell me how?

Submitted by Malcolm13 on Wednesday, May 16, 2018

In reply to by venova


Hope I understand your question correctly.

When you take a photograph with TapTapSee and the image description is returned, double-tap the Share button. From here, choose the Save Image button.

You should now have a copy of the image in your photo library with the description attached, which VoiceOver reads.

Please reply if you want further information or if I have misunderstood your request.


Submitted by LaBoheme on Wednesday, May 16, 2018

"As I understand it, a sighted person does not see these labels, but information such as the location where the picture was taken is shown."

Photos can be grouped by location, and you can show a map of where the photo was taken. It's not as if the location and the photo are displayed side by side, unless you use software specifically for that. The ExifEditor you use can easily display the location and the map; try it out. Apple's native Photos app only gives minimal location info to VoiceOver users, while it displays maps for sighted users. I have talked to them about this since iOS 8, and nothing has been done. In fact, I don't talk to Apple about anything any more. Considering how many family members, friends and acquaintances I have pitched into buying Apple products, I can't say it's my loss.

The image description is just a metadata field, like copyright, light source, camera model, etc.; anyone needing this info can look it up using the proper software. VoiceOver deliberately targets the image description field; that's what you hear.
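To illustrate the point that the description is an ordinary metadata field any EXIF-aware tool can read or write, here is a small hypothetical sketch in Python using the Pillow library (an assumption; any EXIF editor works the same way). EXIF tag 270 is the standard "ImageDescription" field that software like TapTapSee writes and VoiceOver reads back.

```python
# Hypothetical sketch: reading/writing the EXIF ImageDescription field
# (tag 270) with Pillow. This is the same kind of metadata field as
# copyright or camera model; nothing about it is VoiceOver-specific.
from PIL import Image

TAG_IMAGE_DESCRIPTION = 270  # standard EXIF tag number for ImageDescription

def set_description(src_path: str, dst_path: str, text: str) -> None:
    """Copy an image, adding or replacing its EXIF ImageDescription."""
    img = Image.open(src_path)
    exif = img.getexif()
    exif[TAG_IMAGE_DESCRIPTION] = text
    img.save(dst_path, exif=exif.tobytes())

def get_description(path: str):
    """Return the EXIF ImageDescription, or None if none is set."""
    return Image.open(path).getexif().get(TAG_IMAGE_DESCRIPTION)

if __name__ == "__main__":
    # Create a tiny throwaway JPEG, label it, and read the label back.
    Image.new("RGB", (8, 8), "white").save("photo.jpg")
    set_description("photo.jpg", "photo_labelled.jpg", "pasta carbonara")
    print(get_description("photo_labelled.jpg"))
```

Any photo-management tool that indexes this field can then search on it, which is exactly the "tag all your pasta photos" use described below.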

Also, the image description is not just for blind users. You can have a bunch of photos of pasta, for example, all taken in different locations and in different settings. Adding the word "pasta" to the description helps you locate these photos in the future. News organisations use this method all the time.

Submitted by Faith2011 on Tuesday, April 2, 2019

One of the greatest apps, but if you save a photo to your camera roll, you get two copies: one is labelled and the other isn't.