By now we have seen a number of different ambient futures for the web here on Johnny, almost all of which include some form of augmenting the physical world. Recently Pattie Maes of the MIT Media Lab presented SixthSense, a wearable interface that augments the environment and uses gesture-based input. A big difference from the things we’ve seen here so far is that this one actually works…
SixthSense is the work of Pranav Mistry. I could try to explain to you how it actually works, but I’m sure Pattie Maes does a far better job than I would, so I’ll just leave you with the video.
SixthSense is one of the first things I’ve come across (and one of the most impressive as well) that puts data from the internet to good use in such an intuitive way. Of course the device itself, in its current form, is not something you’ll find on the shelves any time soon. But it is one step closer to a future in which we are able to use any data, anywhere, in any way, without feeling like we’re interacting with a computer.
I can’t wait until technology like this becomes available to a broader audience and gets accepted by society. When more people get access to a technology, more interesting things tend to happen with it.