How do our devices see us?

Let’s turn the world around… what if devices looked at us?

The Eye of HAL

We always try to apply user-centred design, thinking about what the world should be like for people. But what if we turned it around? What does a person look like to a computer? That’s the question Dan O’Sullivan and Tom Igoe asked themselves.

On the left you see the answer to Dan O’Sullivan and Tom Igoe’s question (from the book “Physical Computing”): “What does a person look like to a computer?” In other words, to most computers with a mouse, keyboard, and graphical user interface, “We might look like a hand with one finger, one eye, and two ears” (2002, pp. xi–xix).

After the 40th anniversary of the “Mother of All Demos” (where Doug Engelbart premiered that very interface), it’s worth taking a look at a few newer devices and how they might ‘see’ us. Let’s take a look…

Remote controls

To a remote (and its related TV/VCR/dimmer lights), all that you are is a finger. But, like a person with a voodoo doll, that finger wields a great deal of power through the remote – able to put something to sleep with a single, well-placed prod. (Try thinking of that the next time you’re cursing at your DVD player.)

Mobile phones

Early mobile phones (portable phones) saw us as a mouth, an ear, and a finger – usually from the car where they were installed. Once their batteries were small enough to be truly mobile, phones suddenly became aware of us having arms (and pockets), and with the advent of the screen, of us having eyes. However, the most “eye-opening” moment for mobile phones has to have been the advent of flip phones (as popularised in The Matrix). At that point, mobile phones could see that we had hands capable of manipulating them… something the iPhone has lost.

And on to that darling of multi-touch, the iPhone. Ironically, in some ways, the iPhone sees us less than many mobile phones – we might have quite a few tentacles (after all, it’s all about stroking, not clicking), but no opposable thumbs. That said, we’re definitely something big and powerful, since we can affect the accelerometers by rotating the device. And add-ons like the Heart Monitor Application might make an iPhone realise we are alive…

Finally, the Wii is interesting in that it begins to not only see us as entire bodies (fingers, arms, eyes), but even as more than one person.

So why might we want to look at devices in this way? According to O’Sullivan and Igoe, “…to make the computer a medium for expression, you need to describe the conversation you want to have with (or better yet, through) the computer” (p. ix). More pragmatically, it’s a quick way to challenge your own perceptions about how a device might work. After all, it’s easy to forget that not everything sees the world the same way we do.

Photos: litmuse (HAL), ThunderChild tm (remote), debagel (Nokia NHK-6AX), William Hook (iPhone), chispita_666 (Wii). All under a Creative Commons Licence

Vicky Teinaki

An England-based Kiwi, Vicky is doing a PhD at Northumbria University into how designers can better talk about touch and products. When not researching or keeping Johnny Holland running, she does the odd bit of web development, pretends her TV licence money goes only to Steven Moffat shows, and tweets prolifically about all of the above as @vickytnz.
