How can users quickly create a timed transcript of any video on the web? That’s the question behind Mozilla’s latest design challenge, which, in collaboration with the Participatory Culture Foundation, challenges teams to design an intuitive interface for creating and improving subtitles for any video on the web. In this article I’ll share some ideas on how to interpret usability testing results like those presented by the Mozilla Labs team.
Mozilla Labs Design Challenge?
The Mozilla Labs Design Challenges are events that bring together interaction designers around the world to stimulate innovation in user interface design. Over three months (March 9th to June 18th), this seventh design challenge will guide participants through a full design cycle: ideate, select, prototype, evaluate.
Mozilla Labs kicked off the challenge with an evaluation of the Participatory Culture Foundation’s current subtitling tool. Design teams can test-drive the tool themselves, and an interesting part of the briefing is a usability study of four first-time users.
Meet John, Jane, Cynthia, and George
Mozilla Labs conducted a usability test with four typical users who had never seen the subtitling tool before. These users performed basic tasks in the subtitling tool: transcribe, sync, and review subtitles. The tests were conducted with an external testing service. On the website of the design challenge you’ll find screen recordings and some general observations for each of the four individual sessions.
John (20), Jane (64), Cynthia (52), and George (28) were asked to use PCF’s current subtitling tool and think aloud to share their experiences. After the test they gave short written responses. Based on these responses and the observed user behavior, four pages of summarized observations were generated. But how can we translate these observations into ideas?
Categorize usability challenges
Creating a subtitle with the current tool is done in three steps: transcribe, sync, and review. Participants worked their way through these steps and encountered challenges on their way. We labeled these challenges with one (or more) of the three steps or as a general problem. While categorizing the usability issues into different tasks, we assigned a severity rating for each issue. Our goal was to keep the ratings simple, for example: critical, serious, minor, and no issue.
Once everything was categorized, numbered, and rated, we created an overview of all the issues. Duplicates were combined into a single issue in order to clean things up (example C1).
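To make the categorizing and deduplicating concrete, here is a minimal sketch in Python of how such an issue overview could be tabulated. The issue ids, descriptions, and categories below are hypothetical illustrations, not the actual test data, and the severity ordering is one possible convention.

```python
from collections import Counter

# Severities in decreasing order of impact (lower index = more severe).
SEVERITIES = ["critical", "serious", "minor", "no issue"]

# Hypothetical issue log: (id, categories, severity, description).
# Categories mirror the tool's steps plus "general".
issues = [
    ("C1", ["sync"], "critical", "Playback controls hard to find"),
    ("C2", ["transcribe"], "serious", "No keyboard shortcut to pause the video"),
    ("C3", ["sync"], "critical", "Playback controls hard to find"),  # duplicate of C1
    ("C4", ["general"], "minor", "Unclear progress indicator"),
]

# Combine duplicates: issues with the same description collapse into one,
# keeping the first id seen and the most severe rating.
merged = {}
for issue_id, cats, severity, desc in issues:
    if desc in merged:
        kept_id, kept_cats, kept_sev, _ = merged[desc]
        if SEVERITIES.index(severity) < SEVERITIES.index(kept_sev):
            merged[desc] = (kept_id, kept_cats, severity, desc)
    else:
        merged[desc] = (issue_id, cats, severity, desc)

overview = list(merged.values())
for issue_id, cats, severity, desc in overview:
    print(f"{issue_id} [{'/'.join(cats)}] {severity}: {desc}")

# Quick tally of how many distinct issues fall in each category.
category_counts = Counter(cat for _, cats, _, _ in overview for cat in cats)
```

The deduplicated overview then doubles as a numbered legend for annotating sketches later on.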
We’re not writing a book – we’re designing a graphical user interface. The four participants were asked to verbalize their experiences, and we captured their observations in spoken and written form. The Mozilla Labs team did the same and summarized their observations in about 10 bullets for each session. Somewhere in the design cycle we need to switch modality: from text only to a combination of image and text. The earlier, the better.
Visualizing interaction challenges in your interface helps to kickstart your ideation process.
Tips for visualizing test results
- Create your own toolbox to collect information about your users and the problems they encounter. Using different tools to measure user behavior and collect feedback helps you to get a better picture of your users and can be extremely useful in the ideation process.
- Visual deliverables bring your test results to life. Use visual feedback in your deliverables to pinpoint the most important problems and to share your observations with your team.
- Categorize usability issues (transcribe, sync, review, general) when you visualize your test data. You can use the categorized overview of issues as a legend and label your sketches with each issue’s number.
- Fast is good. Try to keep your evaluation cycle as agile as possible. Lean and mean tests don’t necessarily slow down the design process. Take small steps and verify your choices with quick tests.
Entries are open for the Mozilla Collaborative Subtitling Challenge from now until the 26th of April. For more information, see the Mozilla Challenge site.