Proactively learning sighted gestures
Hi all. I'm sticking this in the iOS category because this topic somewhat came up on another post, and I thought I'd start a new one. As blind people, are we putting ourselves at a disadvantage by not learning at least a few sighted gestures when it comes to any Apple product? Not having a watch, I can't comment there, but let's take iOS.

You just got Mom the iPad mini she wants, and now you offer to help. You need VoiceOver; she doesn't. Someone made the point that they had a hard time explaining what they wanted their parent to do, because the screen has no icon with the word Share, or Bookmark, or something like that written on it. My honest thought: we don't want to learn it; we don't want to become visual. Why would it not make some sense to say to the parent, "I know you heard it say bookmark; what does it look like when my finger is on that icon?"

I'm not expecting us to know that we want the creamiest eggshell paint in the kitchen of our new place but a robin's egg blue in the bedroom; just becoming a little more familiar with the visual working surface everyone else uses seems sensible. Otherwise, we're really letting others see just how much we rely on things instead of thinking outside the box and having the thought: OK, I know it's an icon, so what do we do to help the person? Why else would a totally blind woman have written Get the Picture, a book that, gasp, shows even totally blind people how to take beautiful photos? I'm not trying to start a war, though I'm sure I will; I'm good at that. Please consider that I'm trying to show people we care: we want the colors of our clothes to match, and we picked out the furniture because it goes with the walls, not because we asked a trusted sighted friend, family member, or spouse, with the old adage, "he/she can see."
I have run into the situation of not being able to communicate with a sighted person about iOS icons when trying to explain how to use apps; how to close a tab in Safari, for example. A few times, I've asked a sighted person to turn on VoiceOver and directed them when trying to explain something over the phone.
It might be helpful to have some sort of guide of basic icon descriptions for VoiceOver users.
Yes, I've had a similar problem.
I have a sighted friend who I'd love to help with her mac but we have problems because I don't know what the icons look like to tell her which one to click on.
How can we learn this?
This is a good discussion. I think what's happening here is common in the blind community. You're used to doing things as a blind person that you forget to think of things from the visual perspective.
I, too, would like to suggest creating a kind of guide, either as a text post or podcast, that describes the icons.
In the meantime, one thing all of us can do is ask a friend to describe what the icons look like. Use your iPhone or some other note making tool to record the descriptions.
On an impulse, I did this the other day. I figured it would also help my friend as much as it would help me. I think I'll do that again this weekend. This way, I will have a guide nearby.
Given this, if anyone wants to contribute to this description guide idea, please do what you can. Thanks.
Hi all. I admit I thought I'd just caused a war, but I guess not. Lol. I'm glad people seem to understand: we're blind, but we should be able to think sighted as well. It just doesn't make sense that we shouldn't be open to helping Mom with the iPad or something like that. Plus, you can get the shocked response of "How the hell did you do that?", which I got for helping my cousin fix her email; all she had wrong was the SMTP server information. When I made her read it more than once, fixed it, and her incoming mail appeared, I got the slap on the knee of "oh wow, you did it!" :) Tina, good luck helping your friend out again. I'm glad people actually agree with me.
How about all of us work together to build such a descriptive icon guide, starting with iOS?
Each comment could contain just two pieces of information for one control, icon or other visual element:
1. A visual description of the icon. Include descriptive elements such as its color and shape.
2. A concise description of the action(s) performed when the icon is selected. Include descriptions of actions that take place when the icon is tapped, tapped and held, dragged, or acted upon using any other gesture.
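To make the two-part format concrete, here is a minimal sketch of what one entry in such a guide could look like as data, written in Swift since we're talking about Apple platforms. The type and field names are my own invention, not from any existing project; the sample visual description reflects the standard iOS share icon, which is a square with an arrow pointing up out of it.

```swift
// A hypothetical entry in the proposed icon guide.
// Type and field names are illustrative only.
struct IconEntry {
    let spokenName: String         // what VoiceOver announces
    let visualDescription: String  // shape and color, for relaying to sighted users
    let actions: String            // what selecting the icon does
}

let share = IconEntry(
    spokenName: "Share",
    visualDescription: "A square outline with an arrow pointing up out of its top edge.",
    actions: "Tapping opens the share sheet, with options such as Message, Mail, and Add Bookmark."
)

print("\(share.spokenName): \(share.visualDescription)")
```

A guide could then simply be a list of these entries, one per post, exactly as suggested above.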
Hope this helps.
When iOS 9.2 came out, one of the items mentioned as being changed was in the Music app, which "finally" showed which songs were downloaded to the device and which ones were only in the cloud. I thought that was strange, because VoiceOver always told me that a track was "available offline in this iPad." Apparently that was something only VoiceOver was telling me, that there was no visual indication of this status on the screen. Incidentally, they did change the VoiceOver label so it now says "downloaded" instead.
Over the last couple of years, I have transitioned from using devices visually to using them with speech, as my vision has become progressively worse. This has, among other things, shown me how different these devices are when used without sight. It's not just knowing what an icon looks like or where it is located on the screen. The way certain tasks, including common ones, are performed changes as well. I have had to transition from using drag and drop to using cut and paste, for example, since the latter is much easier when using keyboard navigation.
I know the share icon in Safari looks like an arrow.
Does the one that says "Pages" look like a book?
The way we do things is indeed totally different.
We can still learn, though, and help both others and ourselves by doing it.
The same thing happened when I tried teaching my Dad how to use WhatsApp. I showed him with VoiceOver on while I performed the steps, in this case to write to a contact within WhatsApp's favorites. Thanks to this approach, I was told that when VoiceOver says "Write Message" (remember, I use my iPhone in Spanish, so in English it may say something else, but hopefully you get the idea), a pencil icon is on screen. So that's something new learned thanks to this experience.
If it's an app on my phone and someone else has it on their phone, I open the app and tell them which thing on the screen I'm talking about. A generic example: let's say someone sighted called me up and asked me where Show Bookmarks in Safari was. I would tell them to open the Safari browser and then tap the second button from the right, which should be the Show Bookmarks button. I hope this explanation helps a bit.
I'm really surprised that there isn't a way to get VO to announce the shape of an icon you touch, as well as the shape of the cursor when you touch said icon. I would think this a common enough request that it would have been implemented by now, but obviously not. Should this be something we ask Apple to incorporate? For now, I guess we'll need the sighted to help us help the sighted lol.
This subject actually came up on the thread I posted about the Apple At Home Advisor position. Someone made a very valid point that it would be hard for a blind person to troubleshoot an Apple product for a sighted person, since we use the products very differently. There is currently a Speak Hints option in VoiceOver, so that you hear an announcement about what a control does if you pause after selecting it. How about we request another option where we can also hear a visual description of the control, as it appears to sighted users on the screen?
The option of adding shape descriptions is an interesting idea, but it leads me to ask some questions. How long should these descriptions be? Also, how much programming would it take? If it can't be done on the device itself, where else would you get these shape descriptions?
I did have someone try to describe the graphics visually, and I was made aware of how complex a job it is. I hope to ask another friend to describe them tomorrow and get a feel for what it sounds like.
Also, even if you do get a friend to describe these graphics, use your own imaginations. It may not help anyone else, but you can come up with some idea of what the graphics look like based on the descriptions, and combine that with your imagination.
I'm pleased this discussion is coming up. I'll do what I can for my own understanding, and if it's worthy of sharing, I'll share it with you. Thanks.
One option that I do not think people have brought up yet here is to read through the user guides for the Apple products.
From what I have noticed, the descriptions have some of this information right in them.
Hope this helps. :)
Many assistive technology trainers that I know of use a system for teaching people who are blind to use mainstream computer applications with assistive technology. This is the approach they take:
1. Divide each screen of an application into "rows" including a title bar, menu bar, content areas and a status bar.
2. Describe each element on each "row" of the application including a visual description, the description that the assistive technology uses for this element, and the element's function.
3. Explain how each element can be activated for both the assistive technology user and people who do not use assistive technology.
Some of this detail cannot be applied to iOS, though it can certainly be applied to OS X.
For example, using this description we can describe the start page of Safari for the iPhone:
1. At the top of the screen is a long bar where the address is inserted. There is also a button for reloading the page to the right of the address bar. When the address field is activated, a "Clear Text" button appears to the right with the on-screen keyboard below it.
2. Below the Address Bar are four buttons pointing to frequently used websites.
3. Users can add more websites that they like below this.
4. There is a large area in the centre of the screen that contains nothing.
5. At the bottom of the screen are five buttons that VoiceOver identifies as "Back", "Forward", "Share", "Show bookmarks" and "Pages".
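Written out as data, the row-by-row description above could be sketched like this in Swift. The structure and names are just for illustration, not any real API:

```swift
// Hypothetical sketch of the row-by-row screen description method,
// applied to the Safari start page described above.
struct ScreenRow {
    let position: String     // where the row sits on screen
    let elements: [String]   // left-to-right order, as a sighted user sees them
}

let safariStartPage = [
    ScreenRow(position: "top", elements: ["Address bar", "Reload button"]),
    ScreenRow(position: "below the address bar", elements: ["Four favorite-site buttons"]),
    ScreenRow(position: "bottom toolbar",
              elements: ["Back", "Forward", "Share", "Show bookmarks", "Pages"])
]

// "The second button from the left" in the bottom toolbar:
let bottomRow = safariStartPage.last!
print(bottomRow.elements[1])  // prints "Forward"
```

Once each row is written down this way, directions like "second button from the left" fall out naturally for both VoiceOver users and sighted users.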
For example, if I were telling someone to go forward one page in Safari, I would tell them to tap the second button from the left at the bottom of the Safari window.
Sorry for this very long explanation. I am not trying to come across as being superior; it is just a method that I use and seems to work quite well. This method is also supported by Apple's Human Interface Guidelines for app developers.