Apple Accessibility has asked me for suggestions of use cases for the Apple Pencil on iPad with VoiceOver, and I want to source the hive mind!
So I constantly seem to be asking the question: can we use the Apple Pencil with VoiceOver? The short answer is, shmaybe... It can be used with the VoiceOver handwriting feature, but it's a very expensive stand-in for a finger or an inexpensive stylus. When I asked Apple Accessibility directly, they said it can be used as normal, but it isn't an experience they would recommend.
They have said there is interest in developing the functionality of the Apple Pencil, presumably across accessibility as a whole, but specifically they asked me about VoiceOver: in what ways could the pencil be used to improve overall use of the iPad, and more specifically, how could it work in certain apps?
As a quick reminder, the pencil, after some quick Googling, has an accelerometer built in and a pressure sensor on the tip. My first thought is using the accelerometer in the pencil to complete VoiceOver gestures, such as a figure of eight to start reading a document; basically, I want a magic wand. I'm not sure if the accelerometer is capable of this granularity, but it can certainly understand taps on the barrel, so again, double tap to start reading and another double tap to stop. It might even be possible to sense a multi-finger tap, but of course you still need to be holding the pencil, which limits things.
Anyway, it's an exciting opportunity for us to give some feedback on what we might want to use the Apple Pencil for. If you can throw your ideas in here, I'll put together a digest for Apple Accessibility to take a look at. After all, we're really the only ones who know how we use our devices.
This would possibly be tricky, and also somewhat dependent on how accessible a PDF document you can get hold of... The ability to record/capture my signature and put it on a PDF file would be helpful. Even better if VoiceOver or the app would then put the signature in the right place. Maybe VoiceOver could offer a guide of sorts: say, let's start at the bottom left corner of the screen, and it captures the signature and puts it in the correct place.
Still an expensive tool for one task, but it'd be helpful.
Extra points if it'd scan a paper document, OCR it, and figure out where to put the signature.
This could potentially be more efficient and performant than the current handwriting feature of VoiceOver, which requires the printed drawing of individual letters rather than fluid cursive handwritten input that gets transformed into editable text.
Using the button or pressure sensor could trigger or indicate the use of enhanced, and perhaps even customised, gestures or drawn shapes/patterns, such as a triangle, X, Z or a figure eight, for example, to activate selected functions.
I did not lose my sight till I was 27. I still remember handwriting notes, and I miss it a lot. I would love to be able to use the pencil to take handwritten notes again. Signing documents is also a good suggestion. Being able to annotate documents would also be a good thing to be able to do. It would also be nice to be able to move objects around on the screen with the pencil. If we could use the pencil, we could use the iPad like a digital notebook. It could be used to turn the iPad into a bullet journal. There are just all kinds of things we could do with the pencil.
Hi there, honestly, I agree with what you're suggesting. This would help us sign different documents for specific uses.
I don't know. Signing an online contract at a bank, for example.
I've done my high school and university math with pen and paper and some sight left, and it has been really hard to find another means of doing math in which I could spend my energy solving the problem instead of thinking about how to properly write what I want.
Using the pencil to write Math stuff, having it converted to text and reading it with VoiceOver would be a great solution!
It could also be helpful for drawing diagrams; it would improve communication in some technical discussions (e.g. software architecture).
I believe sighted people already have this latter functionality, help with drawing shapes, but I don't know whether math symbol recognition is already implemented.
So Apple has asked that we send suggestions as individuals rather than me just sending a digest. It just means they can keep better track of issues and ideas for specific users.
Still, if you do have ideas on how the Apple Pencil could be better utilised for VoiceOver users, do email:
As they do seem to be actively listening.
Hi all, so I was thinking that they could improve upon the pencil and add a haptic engine to it. That way, as in the example of signing a PDF, when you put the Apple Pencil on a signature field, you could feel through the pencil's vibration that you are on the field. What if it could be used to explore basic diagrams through haptics? You look at a graph by running the pencil over it, and when you are on the lines, it will vibrate or something. This might be stupid, but that is what I can think up for the future of this technology for us.
I did just make a similar suggestion, though mine also included a camera in the tip for real-world documents. I like the haptic idea too, especially as the iPad has no haptic engine of its own. It would be good for notifications when working, and as an indicator of a changed state or a confirmation of an air gesture.
As I said, they replied to me asking that everyone with ideas email them individually, so they can get a better feel for the range of use cases.