Some questions regarding design patterns for accessibility
I've noticed a few apps changing their navigation design pattern from a hamburger panel menu to icons at the bottom of the screen. I noticed that Netflix has done the same this week.
My concern is how users will know that the menu and search are now at the bottom. Is it deemed acceptable to let customers using VoiceOver find them by exploring with a finger gesture? Does anyone know of any workarounds? I'm assuming this will also cause problems for users of Switch Control?
I was considering using hint text to say that a three-finger tap will take you to the bottom of the screen, where you will find the menu and search etc. (not exact wording).
Any suggestions on this would be fab!
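For anyone wanting to experiment with the hint-text idea above, UIKit exposes this through the standard `accessibilityHint` property. This is only a sketch; the view controller, control name, and wording are illustrative, not anyone's actual app code:

```swift
import UIKit

class HomeViewController: UIViewController {
    // Hypothetical control; any focusable element works the same way.
    let browseButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        browseButton.setTitle("Browse", for: .normal)
        // VoiceOver announces the label first; the hint is read
        // after a short pause, so keep it brief and optional.
        browseButton.accessibilityLabel = "Browse"
        browseButton.accessibilityHint = "Menu and search are in the tab bar at the bottom of the screen."
        view.addSubview(browseButton)
    }
}
```

Note that hints can be switched off in VoiceOver's verbosity settings, so they shouldn't be the only way users can discover the layout.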
I don't think there's a need to add any additional hints to let you know there are tabs at the bottom, because it's just following standard iOS navigation conventions. Navigation tabs have been around for a long time, long before Google and Android popularised hamburger menus. Apple itself uses them in apps like Phone, Music and the App Store, so most people, even beginners, should be familiar with the concept.
Whenever I get a new app, I always use the flick-right gesture to go through the interface one control at a time so I can memorise the layout and important landmarks. This means I'll eventually discover that the app has navigation tabs at the bottom, even if they're the last thing VoiceOver reaches on the screen. If anything, I prefer tabs over hamburger menus because they're always there; eventually I just memorise where each tab is and can literally put my finger on it to get to a section quickly.
Hi, thank you, that's great!
I think I was just a bit surprised when I saw it on Netflix, because swiping right meant swiping through all the content before you got to the bottom. But as you've pointed out, you would probably work out what was going on after a while. Yes, I agree this is a better way than the hamburger menu; I was just wondering if we should be more explicit with our users about what they need to do, before they find out through learned behaviour.
We try to keep our VoiceOver navigation swipe order separate from the actual button positions.
This means we can move the buttons around and the VoiceOver swipe order remains the same.
It's straightforward on the iPhone and iPad, but not yet supported by the Watch.
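As a sketch of that separation in UIKit (the controller and button names are illustrative, not this poster's actual code), the `accessibilityElements` array lets you declare the VoiceOver order independently of where the controls sit visually:

```swift
import UIKit

class PlayerViewController: UIViewController {
    let playButton = UIButton(type: .system)
    let backButton = UIButton(type: .system)
    let searchButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        // The buttons can be laid out anywhere on screen...
        [backButton, searchButton, playButton].forEach { view.addSubview($0) }

        // ...but VoiceOver's swipe order is fixed here, so repositioning
        // the buttons visually doesn't change what users flick through.
        view.accessibilityElements = [playButton, searchButton, backButton]
    }
}
```

Touch exploration still lands on whatever is under the finger, so this only decouples the sequential flick order, not direct-touch discovery.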
Hope that helps.
You wouldn't need to flick through the whole screen. It's possible to just tap the bottom of the screen to find the tabs and, if necessary, flick among them. Alternatively, a four-finger single tap near the bottom of the screen places the VoiceOver cursor on the last element, from which the user can flick left to find the desired element.