Since its inception, the iPhone has been praised for its clean interface and responsive touch screen, though usability gurus have also scoffed at that same screen. Say what you will about the benefits of a touch screen: nothing matches the physical feedback of a real button, and a flat pane of glass offers no tactile cues or assistance to users with disabilities. That could not be more true for the visually impaired, for whom an all-touch interface is about as useful as a handle on the spout side of a teapot. Until now.
Product designer Bruno Fosi has offered a solution to this dilemma: a physical skin that covers the iPhone's touch screen, making it accessible to the visually impaired. Described on the Core77 site, the product is said to pair with text-to-speech and could open a world of options to a whole new user group.
I am curious how this system actually works, though. The iPhone is notorious for moving buttons around: from the center of the screen to the top, from the left to the right, not to mention that the keyboard has three different modes. So how is one supposed to navigate a touch-screen device with a dynamic interface if the mapping of the buttons keeps changing? Sure, the simple solution is for the device to notify the user when the screen changes so they can flip to a new cover, one with the alphabet or symbols on it. Still, I want to see a touch-screen device that does this for the user.
Braille printers already provide hard copies of text for the visually impaired, and research is under way on refreshable Braille displays that update continuously as the user interacts with the system. So what is the proposal here? Merge the two. Pair an OLED display with a refreshable Braille surface so that as the user navigates the screen, the keyboard changes quite literally beneath their fingertips. This would truly be haptics at its finest. Nor would it be limited to assisting the visually impaired: think of the applications in games and education. Rubbing a finger along an image of gravel could produce rough feedback, or when Mario falls into a pit of spikes, a sharp poke could immerse the player further in the world.
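As a rough sketch of the "keyboard beneath the fingertips" idea, the snippet below renders text as Unicode Braille Patterns cells using the standard six-dot letter encoding. This is purely illustrative: a real refreshable display would raise and lower physical pins through a hardware driver rather than print characters, and the mapping here covers only the basic Latin letters.

```python
# Sketch of how a refreshable Braille layer might re-render screen text.
# Dot numbering (1-6) follows standard six-dot Braille; Unicode's
# Braille Patterns block (U+2800) encodes dot n as bit (n - 1).

DOTS = {
    "a": [1], "b": [1, 2], "c": [1, 4], "d": [1, 4, 5], "e": [1, 5],
    "f": [1, 2, 4], "g": [1, 2, 4, 5], "h": [1, 2, 5], "i": [2, 4],
    "j": [2, 4, 5],
}
# k-t add dot 3 to a-j; u, v, z add dots 3 and 6 to a, b, e;
# w, x, y are irregular.
for base, letter in zip("abcdefghij", "klmnopqrst"):
    DOTS[letter] = DOTS[base] + [3]
for base, letter in zip("abe", "uvz"):
    DOTS[letter] = DOTS[base] + [3, 6]
DOTS["w"] = [2, 4, 5, 6]
DOTS["x"] = [1, 3, 4, 6]
DOTS["y"] = [1, 3, 4, 5, 6]

def to_braille(text: str) -> str:
    """Render text as Unicode Braille cells (blank cell for unknowns)."""
    cells = []
    for ch in text.lower():
        dots = DOTS.get(ch, [])
        offset = sum(1 << (d - 1) for d in dots)
        cells.append(chr(0x2800 + offset))
    return "".join(cells)

# A dynamic interface would simply call to_braille() again whenever the
# on-screen text changes, refreshing the cells under the user's fingers.
print(to_braille("hello"))  # → ⠓⠑⠇⠇⠕
```

In other words, the hard part is not the encoding, which is a solved problem, but the actuator hardware that can refresh an entire screenful of dots at interactive speed.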
As mentioned in the Core77 article, this is just a prototype, and it will be some time before it reaches the market on any level. Still, it is encouraging to see designers broadening the accessibility of touch screens, and the next five years of haptic technology should open new doors.