This post has been updated to include tips for using the image description feature, based on feedback from the AppleVis Editorial Team and members of the AppleVis community.
When I first heard at WWDC in June that VoiceOver was going to include a new image description feature, I was excited to say the least. I thought to myself: finally, I will be able to know what all of those funny memes on Facebook say! Yes, I know, a super productive application of the feature, right? Well, when I finally got my hands on iOS 11, I was very disappointed. Below is a summary of the new feature, as well as my experience putting it through its paces over the past week on my iPhone 6.
Using the Image Description Feature
Apple has made it very simple to access image descriptions without having to import the image into another application. When you encounter an image that you would like described, you simply tap once with three fingers, and you receive a general description of what is in the picture along with any text that is detected. The only improvement I would like to see to this gesture is the ability to perform it one-handed, as the three-finger single tap can be a bit awkward.
Image Descriptions in the Photos App
The first place I test drove this feature was in the Photos app. Prior to iOS 11, VoiceOver attempted to guess what was in each picture as you browsed through the photo library. Oftentimes you would hear whether the image was sharp or blurry, whether there were faces in the image, and descriptions of possible objects in the image such as cars or animals. In iOS 11, it appears that this behavior has been completely replaced by the image description feature/gesture. As I scrolled through my library, the only descriptions I heard were whether the image was sharp or blurry, with no mention of people or objects in the picture.
When focused on a picture and using the three-finger single tap, I received only slightly more feedback: again the sharpness and brightness of the picture, plus the page on which the picture could be found. Many of the pictures did contain text; however, more often than not VoiceOver did not detect that there was any text in the picture at all.
Image Descriptions in the Facebook App
The next place I tested this feature was on Facebook. So many pictures, many of which contain text, are shared on this platform daily, and I was really intrigued by the idea that I could finally have access to all of the funny and sometimes thought-provoking images that my friends and family share. I promptly scrolled through my News Feed, of course not finding any images immediately (because this is always how it goes when you want to find photos). I finally found a photo that a friend shared. I was so excited; the Facebook alternative text even said that the image may contain text. "Yes!" I thought, I can finally be part of the conversation. I double tapped on the status update that contained the photo, scrolled to the image, invoked the three-finger single tap, and... all I got was that the image was sharp. I was so disappointed, especially after all of the hype that this feature received. Thinking that this surely must just be a fluke, I continued to scroll through my News Feed to find another picture. Time and time again I encountered pictures on Facebook, opened the status, and found the picture, and almost every time VoiceOver gave me even less description of the photo than the alternative text that Facebook already provides.
And What About GIFs and Images in the Messages App?
The final place I tested the image description feature was within the Messages app, in the #images iMessage app. Last year, when Apple launched iMessage apps, including the ability to search for and send GIFs and images, I was very disappointed that no alternative text or descriptions were built into the native app. When Apple announced the new image description feature, I thought that surely it would work amazingly in Apple's own native app. Once again, I was disappointed. When scrolling through the list of potential images and invoking the three-finger single tap to access descriptions, I was again given minimal useful information about each image. I was not even told whether there were people or animals in the picture, and there definitely was not any text extracted and spoken from any of the images.
Tips for Optimizing Image Descriptions
Although the image description feature is not perfect, there are some settings you can adjust to increase the likelihood that VoiceOver will provide accurate descriptions.
Firstly, this feature will not work well with the screen curtain turned on. If the screen curtain is on, you will likely hear a description indicating that the image is dark or blurry. Secondly, your screen brightness must be turned all the way up to 100%, especially for images that contain text. I performed more testing after the official release of iOS 11 and found that the descriptions improved significantly after making these adjustments.
As you can see, I had very high hopes for this new image description feature. As many of you know, I absolutely love Apple; however, they definitely missed the mark on this accessibility feature. The descriptions provided are of no real assistance to a blind or visually impaired user, and the claim that text will be read aloud is simply unreliable. I understand that there are many different types of typography and layouts that may affect how well VoiceOver can read and describe an image; however, in my opinion the feature is so unreliable that it is essentially useless at this point in time.
I will definitely keep an eye out for future updates in the hope of seeing improvements to this feature. For now, if you were hoping to upgrade to iOS 11 for this feature alone, I would recommend waiting until some additional accessibility bugs and improvements can be addressed, as there are not many other significant updates and improvements in iOS 11 this year.
Have you tested out this feature yet? What was your experience? Share in the comments below.