Visualizing Usability Test Results

How can users quickly create a timed transcript of any video on the web? That’s the question behind Mozilla’s latest design challenge, which, in collaboration with the Participatory Culture Foundation, challenges teams to design an intuitive interface for creating and improving subtitles for any video on the web. In this article I’ll share some ideas on how to interpret usability testing results like those presented by the Mozilla Labs team.

What is a Mozilla Labs Design Challenge?

The Mozilla Labs Design Challenges are events that stimulate innovation in user interface design among interaction designers around the world. Over three months (March 9th to June 18th), this 7th design challenge will guide participants through a full design cycle: ideate, select, prototype, evaluate.

The Design Loop

Mozilla Labs kicked off with an evaluation of the Participatory Culture Foundation’s current subtitling tool. Design teams can test-drive the subtitling tool, and an interesting part of the briefing is a usability study of four first-time users.

Meet John, Jane, Cynthia, and George

Mozilla Labs conducted a usability test with four typical users who had never seen the subtitling tool before. The tests were run by an external testing service, and each participant performed the basic tasks in the subtitling tool: transcribe, sync, and review subtitles. On the website of the design challenge you’ll find screen recordings and some general observations for each of the four individual sessions.
John (20), Jane (64), Cynthia (52), and George (28) were asked to use PCF’s current subtitling tool and think aloud to share their experiences. After the test they gave short written responses. Based on these responses and the observed user behavior, four pages of summarized observations were generated. But how can we translate these observations into ideas?

Categorize usability challenges

Creating a subtitle with the current tool is done in three steps: transcribe, sync, and review. Participants worked their way through these steps and encountered challenges along the way. We labeled each challenge with one (or more) of the three steps, or as a general problem. While categorizing the usability issues by task, we assigned a severity rating to each issue. Our goal was to keep the ratings simple, for example: critical [3], serious [2], minor [1], and no issue [0].
Once everything was categorized, numbered, and rated, we created an overview of all the issues. Duplicates were combined into a single issue to clean things up (see example C1).

A1 [2] When John rewinds the video during the “Transcribe” step he appears to expect the lines to appear in sync.
A2 [2] After reading the instructions, Jane clicks “Next Step” as if the first step is merely an introduction.
A3 [1] John breaks up lines after writing them. (“No, that’s too long.”)
B1 [3] The syncing instructions are unclear to Jane until after several minutes and attempts.
B2 [2] George wishes that playback could be slowed down during the syncing phase.
C1 [2] John & Jane repeatedly click the control instructions, expecting buttons.


Visualize usability test results

Sketching over wireframes

We’re not writing a book – we’re designing a graphical user interface. The four participants were asked to verbalize their experiences, which we captured in spoken and written form. The Mozilla Labs team did the same and summarized their observations in about 10 bullets for each session. Somewhere in the design cycle we need to switch modality: from text only to a combination of image and text. The earlier, the better.
Visualizing interaction challenges in your interface helps to kickstart your ideation process.

User feedback in Usabilla

Tips for visualizing test results

  • Create your own toolbox to collect information about your users and the problems they encounter. Using different tools to measure user behavior and collect feedback helps you to get a better picture of your users and can be extremely useful in the ideation process.
  • Visual deliverables bring your test results to life. Use visual feedback in your deliverables to pinpoint the most important problems and to share your observations with your team.
  • Categorize usability issues (transcribe, sync, review, general) when you visualize your test data. You can use the categorized overview of issues as a legend and label your sketches with the number of each issue.
  • Fast is good. Try to keep your evaluation cycle as agile as possible. Lean and mean tests don’t necessarily slow down the design process. Take small steps and verify your choices with quick tests.

Entries are open for the Mozilla Collaborative Subtitling Challenge from now until the 26th of April. For more information, see the Mozilla Challenge site.

Paul Veugen

Paul Veugen is the founder of Usabilla, an online service to collect feedback on webpages, concepts, mockups, or any other image. He has been working as a user experience designer for 10 years and likes to share his ideas about remote testing. You can follow Paul on Twitter: @pveugen.
