Apple makes some of the most accessible products out there. VoiceOver has won awards for improving accessibility for the visually impaired, and Apple even provides tools so developers can easily integrate VoiceOver support into their apps.
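For context, that integration is usually just a matter of describing your UI in plain language. Here's a minimal sketch of what it looks like in UIKit; the view controller and button names are purely illustrative, not from any real app.

```swift
import UIKit

final class PlaybackViewController: UIViewController {
    // A hypothetical custom control that VoiceOver can't describe on its own.
    private let playButton = UIButton(type: .custom)

    override func viewDidLoad() {
        super.viewDidLoad()

        // Expose the control to VoiceOver and describe it in plain language.
        playButton.isAccessibilityElement = true
        playButton.accessibilityLabel = "Play"
        playButton.accessibilityHint = "Starts playback of the selected track"
        playButton.accessibilityTraits = .button

        view.addSubview(playButton)
    }
}
```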

But VoiceOver, like other accessibility tools such as screen readers and OCR, is not the best possible experience for the visually impaired.

By definition, accessibility features put a visually impaired user second: they retrofit an existing experience to make it more accessible.

But what would happen if we put a visually impaired user at the center and designed an experience around them? What types of products and apps would we see if they were designed from the beginning to be used by people who can't see them?

I think we can safely assume that these apps would in general be:

1. Much more usable for the visually impaired. 

2. Based almost entirely on voice commands and audio responses (a rough sketch of what that could look like follows this list).
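As a rough illustration of "voice commands and audio responses," here is a minimal sketch built on Apple's AVFoundation and Speech frameworks. The class and method names are hypothetical, and the command handling is only stubbed out; it's meant to show the shape of the interaction, not a finished design.

```swift
import AVFoundation
import Speech

final class VoiceInterface {
    private let synthesizer = AVSpeechSynthesizer()

    // Speak a response aloud instead of drawing it on screen.
    func respond(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
        synthesizer.speak(utterance)
    }

    // Ask permission to transcribe the user's speech into commands.
    func requestListeningPermission() {
        SFSpeechRecognizer.requestAuthorization { status in
            guard status == .authorized else { return }
            // A real app would start a speech recognition request here and
            // map transcriptions to commands, e.g. "read my messages".
        }
    }
}
```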

The challenges of designing these experiences are certainly surmountable, and more importantly, the solutions are generalizable.

Creating apps for the visually impaired is an opportunity to develop and test a voice-based UI with invaluable feedback from a user base that can't give up and switch back to touch if the UI sucks.

The UI insights gleaned from these apps will give whoever develops them a head start as we transition to wearable electronics and rely more and more heavily on voice-based interfaces.

Putting Accessibility First