Nielsen Norman Group’s Katie Sherwin just posted an article
discussing the experience of a blind person trying to interact with a
touchscreen device and its rich vocabulary of gestures. The complexity of the
interface, plus the many available gestures, can create quite a “cognitive load”
as the blind user tries to keep track of what is happening on the screen
in order to know which gestures to use.
While some products have experimented with tactile/haptic
interfaces that give physical feedback, the potential of such designs is limited.
The better answer, Sherwin argues, is to simplify content and workflows so people
can accomplish their goals more efficiently.
Screen Readers on Touchscreen Devices