Programming The iPhone For Accessibility By The Visually Impaired
Per’s is a noble quest, so we’ll do our part here:
VoiceOver Overview
VoiceOver, says Apple, “describes an application’s user interface and helps users navigate through the application’s views and controls, using speech and sound.”
Apple offers a concise document that describes how accessibility is delivered with iPhone 3.0. I’ll further distill what’s involved:
VoiceOver literally describes what’s on-screen in spoken words. As an app developer, you provide the descriptions by labeling your UI elements, providing a string hint about each element, and adding traits that describe the element’s state, behavior, or usage.
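To make that concrete, here’s a minimal sketch in modern Swift (the same UIAccessibility properties existed in the Objective-C APIs of the day); the button and its image name are hypothetical stand-ins:

```swift
import UIKit

// A custom-artwork play button, described to VoiceOver in code.
// The button and "play-glyph" image are hypothetical stand-ins.
let playButton = UIButton(type: .custom)
playButton.setImage(UIImage(named: "play-glyph"), for: .normal)

playButton.isAccessibilityElement = true
playButton.accessibilityLabel = "Play"                      // what it is
playButton.accessibilityHint = "Plays the selected track."  // what it does
playButton.accessibilityTraits = .button                    // how it behaves
```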
You can also label your UI from an inspector panel in Interface Builder, without writing code. To test the app’s accessibility, you can either turn on VoiceOver or use the Accessibility Inspector.
Supporting this level of accessibility isn’t hard. Twenty minutes ago I’d never used these capabilities.
Standard UIKit controls/views support accessibility by default. The UIAccessibility informal protocol enables adding these capabilities to custom controls.
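To give a feel for it, here’s a hedged sketch of a hypothetical custom-drawn volume control opting in; the property names and override points are the real UIAccessibility ones, but the control itself is my invention:

```swift
import UIKit

// A hypothetical custom-drawn volume control. Standard controls get this
// behavior for free; a custom view has to opt in and describe itself.
class VolumeKnob: UIView {
    private var volume: Float = 0.5

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true        // expose this view to VoiceOver
        accessibilityLabel = "Volume"        // spoken name
        accessibilityTraits = .adjustable    // VoiceOver offers swipe up/down
        updateAccessibilityValue()
    }

    required init?(coder: NSCoder) { fatalError("not supported in this sketch") }

    private func updateAccessibilityValue() {
        accessibilityValue = "\(Int(volume * 100)) percent"  // spoken state
    }

    // VoiceOver calls these when the user swipes up/down on an
    // adjustable element.
    override func accessibilityIncrement() {
        volume = min(volume + 0.1, 1.0)
        updateAccessibilityValue()
    }

    override func accessibilityDecrement() {
        volume = max(volume - 0.1, 0.0)
        updateAccessibilityValue()
    }
}
```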
Limitations, And A Wish-List For Apple
Per shared an email conversation with Daniel Ashworth, CEO & Chief Architect of Quokka Studios, about the limitations of VoiceOver. With his permission, I’ll share Daniel’s comments here:
As I mentioned in a tweet, there are issues in implementing accessibility for the iPhone in third party apps. Let me explain with reference to one of our apps.
FluxTunes [link added by me] is an app that allows you to control your music using gestures anywhere on the touchscreen, e.g. tap to play/pause, swipe right to advance to the next track, swipe left to go to the previous track, two-finger swipes to change playlists, etc.
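[To make the setup concrete, here’s a rough Swift sketch of that kind of whole-screen gesture control; the recognizers and MediaPlayer calls are real API, but this is my illustration, not FluxTunes’ source:]

```swift
import MediaPlayer
import UIKit

// Illustrative sketch of whole-screen gesture control (not FluxTunes' code).
// With VoiceOver or Zoom active, these recognizers never fire, because the
// OS intercepts the touches before the app sees them.
class GestureControlViewController: UIViewController {
    let player = MPMusicPlayerController.systemMusicPlayer

    override func viewDidLoad() {
        super.viewDidLoad()

        // Tap anywhere: play/pause.
        view.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(togglePlayPause)))

        // Swipe right: next track.
        let next = UISwipeGestureRecognizer(target: self, action: #selector(nextTrack))
        next.direction = .right
        view.addGestureRecognizer(next)

        // Swipe left: previous track.
        let previous = UISwipeGestureRecognizer(target: self, action: #selector(previousTrack))
        previous.direction = .left
        view.addGestureRecognizer(previous)
    }

    @objc private func togglePlayPause() {
        if player.playbackState == .playing { player.pause() } else { player.play() }
    }

    @objc private func nextTrack() { player.skipToNextItem() }
    @objc private func previousTrack() { player.skipToPreviousItem() }
}
```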
Sounds like it may be ideal for a blind user who is tired of hunting through the controls in the standard player using VoiceOver and wants a customised experience that is actually a pleasure to use? The problem is that Apple has provided such limited hooks into the accessibility features of the iPhone that it’s impractical to make this application properly accessible at present.
1. If VoiceOver or Zoom is on, the app is unable to receive many touch events: the OS simply fails to pass most events on to the application. So an app that uses gestures is immediately unable to function at all with VoiceOver or Zoom active.
2. A fully resolved accessibility solution should allow an application to customise the behaviour of VoiceOver to suit the application, and I suspect the full accessibility machinery in place on the iPhone would permit this. The problem is that Apple has opened up such a small part of accessibility to third-party developers at present that they can’t customise its behaviour the way the built-in apps can.
3. Maybe we could get users to temporarily disable VoiceOver while using the application and have the application handle voice synthesis to provide feedback? Unfortunately that’s not practical: it’s obviously not ideal for the user to have to switch VoiceOver on and off while switching apps, but more significantly, Apple haven’t to this point opened up the text-to-speech APIs to third-party developers. These are clearly in place in the OS, since they’re used by VoiceOver, but without them being open, text to speech would at best require embedding large additional libraries to replicate functionality that’s already present on the phone. We’d also like the ability to announce tracks for sighted users without VoiceOver active, especially in circumstances when they shouldn’t be looking at the touch screen, e.g. when driving. In those circumstances we should essentially be treating all users as unsighted, so there’s a safety consideration for sighted users as well.
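[Editor’s aside: Apple did eventually ship exactly this; iOS 7 added AVSpeechSynthesizer, long after this article was written. For illustration, a minimal track announcement in Swift, with a hypothetical title string:]

```swift
import AVFoundation

// Announcing the current track via the system speech engine.
// (AVSpeechSynthesizer arrived in iOS 7, well after this was written;
//  the track title here is a hypothetical placeholder.)
let synthesizer = AVSpeechSynthesizer()

func announce(trackTitle: String) {
    let utterance = AVSpeechUtterance(string: "Now playing: \(trackTitle)")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

announce(trackTitle: "Example Track")
```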
So here’s my wish list as a third-party developer:
1. The means to get all touch events, even if VoiceOver/Zoom is active
2. The ability for an app to control when VoiceOver/Zoom is active while it’s running; it may be appropriate on some screens but not others.
3. Open access to the text-to-speech APIs
4. Better hooks for customising the behaviour of VoiceOver/Zoom.
I believe that with better support along these lines, significantly more third-party applications could be made accessible or could improve their accessibility. There are also, of course, many apps that could implement VoiceOver support today, i.e. those built around standard controls, but aren’t necessarily doing so, or doing it well.
Apple have done a good job of making their own apps accessible, but they haven’t yet provided a full toolset for third-party developers. I welcome any moves Apple makes to improve the toolset; they’ve opened up more features with successive OS releases, so I hope that will be the case here as well.
The Business Case For Accessibility
Reasons for making applications accessible need not be exclusively altruistic:
My mother is disabled; to the extent that we can use technology to get her near parity, we’re willing to pay for it. That’s certainly worth a premium, though these bits of tech are normally obscenely priced. Part of that has to do with the comparatively small population of buyers (compared to, e.g., the audience for DVD players), and a larger part with having a captive audience that can be taken advantage of.
Just as the $5 iPhone guitar tuner app obsoletes the much more expensive stand-alone guitar tuner, apps focused on accessibility can command a premium over the ringtone price level.