If interaction design really is the business of behaviour change, I believe this must cut both ways. While it’s true that design can influence users and engender cultural change, this is always a product of our more tangible work: changing the behaviour of technology. As a user-centred designer of technology, my goal is simple: to make its behaviour humane. But how should I approach this?

Humanity implies emotion and, beneath that, personality. These areas lie beyond the frontiers of classical HCI and usability. Fortunately, as often happens, we look to the distant summit and see that others have already planted the flag. Toymakers, for instance, have explored the art of bestowing personality on products for years. The results are fairly crude, but I defy anyone to watch the torture of a Pleo and successfully suppress a twinge of guilt. Even in its moments of crisis, Pleo has a distinct personality; that is to say, it conveys emotional information.

Channels for personality

Perhaps the most obvious conduit for emotional content is appearance.

From BMW's grille to Pixar's Wall-E, they all have a personality


The designs above show acts of visual anthropomorphism, where gesture and expression alone convey personality. They create empathy through closure, a projection of the self as explored in Scott McCloud’s classic Understanding Comics. Pareidolia, the brain’s propensity to recognise faces everywhere, is a powerful trick. Even an oval, two dots and a line create an unmistakable expression; with detail we can add further emotional nuance.

Closure: excerpt from Understanding Comics by Scott McCloud


We can also convey personality through message. In the words of Russell Davies, the rise of devices with personality will lead to a surge in “bubbly writing and objects talking to you in the first person”. Here, an Innocent smoothie prudishly asks us to avert our gaze from its most vulnerable area.

Innocent drinks carton with text "Stop looking at my bottom."


But anthropomorphism needn’t be visual. Consider how R2D2 conveys personality through sound alone – his shrieks and bleeps mapping to human expressions of emotion (See Chris Noessel and Nathan Shedroff at dConstruct 2009 [mp3, 43 minutes]). Similarly, IM programs happily announce incoming messages with a rising fanfare and send replies with a descending farewell.

These can be effective ways to communicate personality, but I’ve recently been reflecting on the fuzzier area of expressing personality through behaviour.

According to the psychologist Kurt Lewin, behaviour is a product of the person in question and their environment (see Lewin’s equation). Our behaviour changes with context. This suggests that we can only form an opinion about someone’s personality through exposure to various scenarios; a single interaction isn’t enough. However, once we’ve formed this mental model, we believe it so thoroughly that we become blinded by it, believing that someone’s personality causes their every action – the fundamental attribution error.
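For reference, Lewin’s equation states this compactly – behaviour (B) is a function of the person (P) and their environment (E):

```latex
B = f(P, E)
```

The point for designers is that B varies even when P is held constant: change the context and you change the behaviour.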

Behavioural variance – acting differently according to our environment – is a celebrated part of being human. Anyone who lacks it is boring. Myself, I act quite differently as a Cardiff City fan than as a grandson, since the contexts are very different. At a party you’re expected to drink beer and flirt with girls, not quietly read a library book, if you want to be invited back.

Dreary technology

This is why I look at modern technology with mixed feelings. As a tool, it’s unsurpassed. But when we engage with it on any human level, it doesn’t respond in kind. Technology has no behavioural variance and very little personality.

Yes, predictability is a key tenet of usability. High-risk systems must respond to input in foreseeable ways: an air traffic control system, for instance, needs to be entirely unwavering. But as we’re learning to appreciate the power of play and emotion in our design activities, is there scope for non-critical technology to display behavioural personality?

Mobile devices, for instance, are increasingly a medium of sensory input as well as informational output. We’ll soon carry devices capable of reading our fingerprints, calculating our position and learning our closest social ties by analysing our SMS and email habits. Adding further richness, recent declarative technology encourages users to publish information that designers can use to build emotional responses:

Google map showing current location as Alton Towers theme park

Facebook status showing a user's engagement

So let’s imagine a Twitter client that asks if you really want to send that drunken tweet (maybe you should have read that library book after all). A mobile that loves going on rollercoasters. An MP3 player that longs to play (and listen to?) a new album for once.
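As a thought experiment, here’s a minimal sketch of that first idea – a client that interjects before a risky late-night message goes out. Every name and threshold below is hypothetical, standing in for the far richer contextual signals (location, social ties, message history) described above.

```python
from datetime import datetime, time

# Hypothetical heuristic: tweets composed in the small hours are the
# ones most likely to be regretted in the morning.
LATE_NIGHT_START = time(23, 0)
LATE_NIGHT_END = time(5, 0)

def is_late_night(now: datetime) -> bool:
    """True between 23:00 and 05:00."""
    t = now.time()
    return t >= LATE_NIGHT_START or t <= LATE_NIGHT_END

def should_confirm(now: datetime, message: str) -> bool:
    """Decide whether the client should ask 'are you sure?' first.

    A sketch only: a real client might also weigh location,
    recent activity, even typing accuracy.
    """
    shouting = message.isupper() and len(message) > 8
    return is_late_night(now) or shouting
```

So `should_confirm(datetime(2010, 1, 30, 2, 30), "just saying hi")` would prompt for confirmation purely on the timestamp, while the same message at noon would pass unchallenged.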

Getting personality wrong

Looking, sounding or acting like a human is desirable only if the human is one we like. Some of our early forays have been spectacular failures. For an archetypal example of botched anthropomorphism, look no further than our most hated paperclip.

Designed to save labour and improve UI learnability, Clippy instead came across as smug and invasive. Not only did his brash tone rub many up the wrong way, but he was irritatingly clingy, appearing on simple tasks where users didn’t need or appreciate help.

The despotic HAL illustrates the other extreme of dislikable machine personality. Clarke and Kubrick created a terrifying villain for 2001 simply by highlighting the unflinching rationality of computation. HAL’s cold-bloodedness is the opposite of humanity. Our heroes are irrational, given to senseless acts in the name of compassion. We can all empathise: who hasn’t done something stupid when in the grip of emotion?

Appealing machine personality lies somewhere between the shores of impassivity and fake friendliness. Social psychology research tells us that we like people whose personalities are similar to our own, and people who like us (reciprocal liking). Servile flattery isn’t the answer, of course, but through deep user understanding and reliance on our trusty companions – trial, error and feedback – perhaps designers will uncover a sweet spot.

We might speculate on a few guidelines for conveying personality through behaviour (any additions would be welcomed):

  • Personality should be easily overridden. If you need to make an emergency call, your handset must revert to functionality above all else.
  • Personality should be secondary to function. Clippy was disproportionate: his personality overruled his potential usefulness. Not only does this reduce usability, but we risk giving users false expectations of a system’s capabilities.
  • Personality should be appropriate to the medium. It may be that desktop computers aren’t an ideal platform for behavioural personality; we still regard them largely as tools of business or home organisation. Mobile phones operate in our intimate space and it’s well known that people form emotional connections with their handsets. Could the mobile arena provide sensible starting points for exploration?

This is largely a thought experiment for now, and it’s clear that behavioural anthropomorphism would raise practical questions. How should users tell devices to stop their shenanigans and get on with the task at hand? Do I want my computer, and whatever systems it’s connected to, to know that I spent the night at my girlfriend’s flat? Would a machine object if I did something it didn’t approve of?

Any attempt to give technology personality will be divisive. Succeed and we make the technological world a slightly more humane place. Fail, and we create an army of Clippies.


Special thanks: Rebecca Cottrell
Photo credits: Duncan, estoril

Cennydd Bowles

Cennydd Bowles works as a user experience designer for Clearleft in Brighton, England. Bored of an M.Sc. in Information Technology, he leapt into the mysterious world of user experience seven years ago and hasn't shut up about it since. He is an active mentor, an erstwhile manager and regularly writes and rants about user experience design. Previous clients include Gumtree, JustGiving, UpMyStreet, Business Link and the WWF.

15 comments on this article

  1. Very thought-provoking piece. But one thing I don’t completely understand is your introduction. Why would you as an interaction designer make technology humane?

    To better fit humans and human behaviour? Or to influence human behaviour better?

    In order to be successful, technology should IMHO serve human needs as well as possible. In some cases that can mean that you will have to give a face to technology, but it could also mean that you have to make technology as pervasive as possible, giving it no face at all.

    But as said, very interesting stuff, must read for every interaction designer.

  2. Pingback: Putting people first » Does technology need personality?

  3. I think a good starting point for making technology humane is friendlier copywriting. Anthropomorphism done right will be a tricky and timely exercise, but copywriting that is personable rather than just stating the cold hard facts is something we can do right away (Flickr’s welcome message comes to mind).

    Also, GERTY from the film Moon is a brilliant example of robot empathy (using a vocabulary of only a few basic emoticons).

  4. Nice article. Anthropomorphic behaviors are among the earliest childhood learning tools… a large area of the brain is devoted simply to recognizing faces… if good design and artwork are available (a must!) then meaning can be conveyed efficiently. (Recall the Netscape Mozilla? http://home.snafu.de/tilman/mozilla/).

  5. Pingback: TwittLink - Your headlines on Twitter

  6. Josh C on

    Great piece! It’s rare that I feel like I need to thank someone for writing something but this is one of those occasions. Thank you!

    Technology with personality, as you state, is polarizing. There are a few key things at work discouraging the majority of people from accepting this type of interaction. There’s the ‘this is creepy’ factor (see: my mom’s response to Facebook’s “intelligence”), there’s the ‘meh.. lame’ response (see: anyone who isn’t at least a wee bit impressed with the iPhone), and then there’s the varying levels of humanity that people already express toward animate objects (see: animal abuse). All these things come together and play an interesting part in this debate.

    Personally, I would love to see a little more personality in the things I use. I also fully agree with your list of essential guidelines. I might add one more: the personality should be intrinsically positive and helpful. No need for a misbehaving, sassy little gadget; I already own several of those.

    Thanks again.

  7. This reminds me of a response in an online photo gallery of new robots (http://www.popsci.com/technology/gallery/2009-11/gallery-robots-can-do-everything-you).

    The response was:

    “ok if I order one for work can I get the face deleted , it really doesnt need a face does it , its an industrial machine , a pallet jack with a computer, come on , I dont work in a Toy Story cartoon or something.”

    I disagree. The face is THE most natural and intuitive interface you can imagine. By harnessing your automatic and immediate reactions to the human-looking face, we can design tech that is best suited for us. Make it more analogue and intuitive. For example, I’m glad that I’m not punching command lines into DOS right now!

    By adding personality you can increase your anthropomorphic response (Kim & Hinds, 2006) and thereby possibly increase acceptance of the autonomous technology, by triggering our psychological models of humans.

    We can of course react negatively to this increased “humanlikeness”, but these responses can be mediated by design and increased experience with robots.

    My psychology thesis was an investigation of the above-mentioned negative reactions. A short presentation and abstract can be found at: http://www.robotspodcast.com/forum/viewtopic.php?f=12&t=747

    There is also an interesting discussion about our reactions to increased humanlikeness over at http://gurneyjourney.blogspot.com/2009/12/face-detection.html which I can recommend.

    Kind regards

    Johan Eklund

  8. It’s not as though we’ll be able to have our own Rosie from ‘The Jetsons’ anytime soon. However, robots are slowly making their way into our homes to help with simple tasks. As their abilities grow and prices drop, scientists and legal scholars have begun discussing some of the potential problems that a close, daily relationship with robots may bring about.

  9. Pingback: Personality, perspective, and funny pencils « Design and Innovation Daily

  10. Fred Beecher on

    Fantastic post, Cennydd! It’s really helped me think about using emotion in design in a different way.

    But I would be remiss if I didn’t mention Data from Star Trek as an excellent example of technology displaying personality through behavior. He displays traits like curiosity, loyalty, striving… all without being capable of emotion. Granted, he’s as anthropomorphic as you can get, but his personality comes through via his behaviors more so than any other way. I think it would be an achievable task for designers to think of appropriate traits for a system and design functionality to express them.

    To me, when you combine this thought with something as personalized as a handset, the idea of personality plug-ins becomes at least imaginable. Which is kind of terrifying if you’ve read The Hitchhiker’s Guide to the Galaxy. No one wants a future in which Sirius Cybernetics exists. 🙂

  11. So the conclusion must be, yes – technology fares better when it has a personality. But the caveat is that it must be a personality we like. We like R2D2. We don’t like Clippy and HAL. Personality is good, but not when it gets in the way of function.

    We’ve spent decades trying to create functional, usable interactive devices. Can we make them emotionally appealing, too, without sacrificing what we’ve already achieved? Alas, personality is easily confused with attitude. The attributes I’m looking for in any interactive device – from phones to DVD players – are helpfulness and knowledge. This was R2D2’s strength. C-3PO was less appealing because he was a worry-wart who needed constant protection. Yet C-3PO had a much stronger personality than R2D2.

    I think the next phase of our quest will be focused on bringing certain behavioural attributes forward so that our devices provide us with pleasant surprises along our voyage of discovery. I’m not convinced “personality” will play as large a part as many think.

    But as usual, Cennydd, you’ve sparked an important conversation. Many thanks!


  13. IMHO interaction design should absolutely consider personality, but it is not as simple as making the technology or interface nice and friendly. In the same way people have different personalities, reflecting the fact that they are different people with different qualities, so too should an interface – and the personality it reflects should be that of the brand it represents.

  14. Pingback: Interact Seattle » Blog Archive » User eXperience Digest #14

  15. yoko on

    This is a great point to bring up. I offer the thoughts above as general inspiration but clearly there are questions like the one you bring up where the most important thing will be working in honest good faith.