Deconstruction: Analysis Techniques part 2

Deconstruction is one of the most frequently used and fundamental analysis techniques in our toolkit. It serves both as a preparatory technique that gets research data ready for use in other ways, and as a powerful technique in its own right: a method of isolating, exposing, and testing assumptions deeply embedded in our mental models.

One example of deconstruction is turning an interview transcript into a series of separate comments or answers to questions. Deconstruction is often used simply to prepare data for other analytic processes such as manipulation or summarization, or even abstraction.

Note: this article builds on the first part of the series, Deconstructing Analysis Techniques, published in February.

Examples of Deconstruction

  • Chemical analysis – mass spectrometry: a technique for determining the elemental makeup of a substance or molecule.
  • Philosophy/literary criticism: a technique of isolating and testing ideas contained within a work of philosophy or literature
  • Systems analysis: identifying root causes through the identification of individual system ‘actors’ and their interactions
  • Quality control: unit testing the functional components of an application, which requires first identifying those components (typically by recourse to the specification)
  • User interviews: identifying individual concepts or ideas
  • Card-sorting: working with card-pairs
  • Task analysis: breaking down complex activities into individual tasks and their components.

There is a wide range of examples of the ways in which deconstruction occurs, but our aim is always to reach a definite ‘atomic’ state (where the atom is defined by our research objectives). It should be noted that there is typically more going on than merely breaking down the data. In the case of chemical analysis, the levels of elements or compounds are measured; in the case of a stakeholder or user interview, the individual words, phrases or ideas may be tallied, grouped, manipulated or otherwise worked with to form some new insight.

Deconstruction can be – and often is – built into the design of the research. We see this in online card-sorting, for example, where data is stored from the outset as card-pairs. Survey results are another example of data where pre-deconstruction is built into the research.
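
As a concrete illustration, the sketch below shows how card-sort results might be stored as card-pairs from the outset and then tallied into co-occurrence counts. The data, field names, and card labels are invented for illustration and are not taken from any particular card-sorting tool.

```python
from collections import Counter
from itertools import combinations

# Hypothetical card-sort results: for each participant, the piles
# (groups of card labels) they created.
participant_piles = [
    [["Invoices", "Receipts"], ["Profile", "Settings"]],
    [["Invoices", "Receipts", "Settings"], ["Profile"]],
]

# Deconstruct each pile into card-pairs: the 'atomic' unit many
# online card-sorting tools record from the outset.
pair_counts = Counter()
for piles in participant_piles:
    for pile in piles:
        for a, b in combinations(sorted(pile), 2):
            pair_counts[(a, b)] += 1

# How often each pair of cards was grouped together.
for (a, b), n in pair_counts.most_common():
    print(f"{a} + {b}: grouped together by {n} participant(s)")
```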

Why, though, this urge to break data down into smaller and smaller pieces?

Smaller, more granular data provides for greater flexibility in the other analysis techniques we need to undertake. By separating ideas or objects out into their own data elements we can have greater control over how elements are treated and positioned with respect to other elements.

For example, splitting a Name element into separate First Name and Surname elements allows us to treat these two components independently, and ask a broader range of questions – such as: “What are the most common first names?”
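
As a minimal sketch, with invented records and field names, splitting a single Name element into its components might look like this:

```python
from collections import Counter

# Hypothetical participant records holding a single Name element.
records = [
    {"name": "Alice Nguyen"},
    {"name": "Alice Park"},
    {"name": "Ben Okafor"},
]

# Deconstruct Name into First Name and Surname so each component
# can be treated independently.
for record in records:
    first, _, surname = record["name"].partition(" ")
    record["first_name"] = first
    record["surname"] = surname

# The more granular form supports new questions, e.g.
# "What are the most common first names?"
print(Counter(r["first_name"] for r in records).most_common())
```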

It requires extra effort to break data down and then to store it in more granular form. It also takes effort to request and record extra data during the research process itself. So, whatever level of data granularity we use should be for specific reasons, and to address specific research questions.

Deconstruction represents a powerful analytic technique in its own right. By isolating concepts and ideas, and exposing them to scrutiny on their own, deconstruction highlights the existence of untested assumptions and ‘sacred cows’.

In this sense, deconstruction is often used to analyze problems or situations to which we need to formulate a response. This use of deconstruction allows us to test the reality of perceived constraints: by isolating each constraint on the design, and examining the conditions under which it holds true, we can open up possibilities that might otherwise not have seemed possible or feasible.

Dangers in Deconstruction

There are dangers in deconstruction that are worth mentioning here. First, our work should ultimately lead to something substantively new. This becomes difficult if we lose sight of the macro-level problem in pursuit of an understanding of the data in ever finer detail.

Secondly, in studying the fine detail of our data we can miss the patterns that help drive insights and accelerate the transition to design concepts. At the same time, some patterns only become apparent when we reach a level of granularity appropriate to the data.

Deconstruction can also generate noise in our data which obscures our ability to make sense of it. This noise may be the result of data overload – simply having too much information to process – or it may be that small-scale, natural random variations are masking higher-level trends or patterns. In these cases, summation and aggregation techniques can be an appropriate counterbalance to deconstruction.
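
To make that counterbalance concrete, here is a small sketch using randomly generated, invented task times: day-to-day noise hides a trend that only shows through once the granular data is aggregated back up to weekly means.

```python
import random
import statistics

random.seed(1)  # so the invented example is repeatable

# Invented per-session task times (seconds) over four weeks: a steady
# improvement is buried under day-to-day random variation.
daily_times = [60 - day + random.gauss(0, 8) for day in range(28)]

# Aggregating the granular data into weekly means lets the
# higher-level trend show through the noise.
for week in range(4):
    week_mean = statistics.mean(daily_times[week * 7:(week + 1) * 7])
    print(f"Week {week + 1}: mean task time {week_mean:.1f} seconds")
```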

Deconstruction in Practice

Deconstruction can often be used in very close association with other analytic techniques. For example, we may break data down into more granular form to facilitate manipulation of that data as a means of inspection or ‘eye-balling’.

In the same vein, we may manipulate or transform our data to allow us to zero in on a particular characteristic – deconstruction in the critique sense of the term.

It may help at this point to look at some examples to help illustrate the different uses of deconstruction as an analytic technique:

User Interviews
A typical interview scenario involves asking participants a series of questions (usually open-ended; sometimes based around topics rather than a strict question set) and recording the responses. Recording may take the form of written notes, audio, video, or interviewer/observer notes, or a combination of these.

To begin drawing connections and identifying themes between interviews we need to break down – or deconstruct – the interviews to the level of individual ideas or concepts, feelings, thoughts etc. The medium we use to record each of these ‘objects’ is not important: a spreadsheet might be used just as effectively as Post-It notes or index cards.

Once the data is in this more granular form we can carry out further analysis on the interviews. We may, for example, want to look at the prevalence of positive versus negative feedback.
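
A minimal sketch of what that further analysis might look like, assuming each interview has already been deconstructed into individually coded comments; the participants, themes, sentiment codes and quotes below are invented for illustration.

```python
from collections import Counter

# Interview transcripts deconstructed into individual comments, each
# hand-coded with a theme and a sentiment (invented illustrative data).
comments = [
    {"participant": "P1", "theme": "navigation", "sentiment": "negative",
     "text": "I could never find the export option."},
    {"participant": "P1", "theme": "onboarding", "sentiment": "positive",
     "text": "The setup steps were really clear."},
    {"participant": "P2", "theme": "navigation", "sentiment": "negative",
     "text": "The menu labels didn't match what I expected."},
]

# Prevalence of positive versus negative feedback across interviews.
print(Counter(c["sentiment"] for c in comments))

# The same granular records also support grouping by theme.
print(Counter(c["theme"] for c in comments))
```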

Note, however, that the need for deconstruction is entirely dependent on the questions we are trying to answer through our research. For example, if our intent was to formulate an impression of the overall level of satisfaction for each interview subject, the deconstruction would be an entirely unnecessary task.

Diagnosing Causes
When faced with a failure in a complex system – such as the inability of users to complete a multi-step process, or the appearance of a previously unplanned-for edge case – it is typically quite difficult to diagnose the cause of the failure (in the absence of error handling designed specifically with this in mind). To identify the root cause of the failure we undertake a deconstruction exercise to help isolate the components of the system.
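
A minimal sketch of that kind of decomposition is shown below, using an invented multi-step process and invented completion counts: breaking the end-to-end failure into per-step drop-off helps isolate the step most likely to be at fault.

```python
# Invented step names and counts for an illustrative multi-step process.
steps = ["landing", "account details", "payment", "confirmation"]
reached = {
    "landing": 1000,
    "account details": 720,
    "payment": 410,
    "confirmation": 395,
}

# Deconstruct the overall failure ("users can't complete the process")
# into drop-off between each pair of consecutive steps.
for current, nxt in zip(steps, steps[1:]):
    lost = reached[current] - reached[nxt]
    rate = lost / reached[current]
    print(f"{current} -> {nxt}: lost {lost} users ({rate:.0%})")
```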

Designing a Car: Highlighting untested assumptions
If we were to design a car we might begin with a brainstorming session and list all of the components or features that are needed. That list might include items such as “wheels”, “engine”, “fuel”, “doors”, “seats” and a whole range of others. We can now look at each of these features and ask why it’s there, and what it says about our notion of the solution.

For example, ‘fuel’ presupposes a form of combustion engine which, increasingly, may not be relevant. More importantly, ‘fuel’ highlights a range of assumptions – mostly tacit – derived from our mental model of the object ‘car’.

Once these assumptions are exposed we can begin to question their validity in the context of the problem – instead of pre-defining a solution in the statement of the problem. Such questioning, enabled through deconstruction, opens up a broader perspective on the design of a solution.
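
One way to make the exercise tangible is to record each brainstormed feature alongside the tacit assumption it carries and the question that puts that assumption to the test. The entries below are purely illustrative:

```python
# Each feature from the brainstorm, the assumption it smuggles in,
# and the question that exposes it (illustrative entries only).
feature_assumptions = {
    "fuel": ("a combustion engine",
             "Does the design actually require combustion?"),
    "doors": ("an enclosed passenger cabin",
              "Must the cabin be enclosed?"),
    "wheels": ("travel over existing roads",
               "Is road travel the only context of use?"),
}

for feature, (assumption, question) in feature_assumptions.items():
    print(f"'{feature}' presupposes {assumption}. {question}")
```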

Conclusion

Deconstruction serves a dual role in our analysis work: as a preparatory technique that gets research data ready for use in other ways, and as a method of isolating, exposing, and testing assumptions deeply embedded in our mental models.

The technique is not without its drawbacks: more granular data requires extra effort to gather, record, store, and analyze. It can also generate ‘noise’ in the data, which can obscure rather than illuminate.

Understanding the role of deconstruction in analysis can help us better target its application to specific research questions.

Photos by s1mone (card sorting), andercismo (magnifying glass), smiling da vinci (interview)

Steve Baty

Steve Baty, principal at Meld Studios, has over 14 years’ experience as a design and strategy practitioner. Steve is well-known in the area of experience strategy and design, contributing to public discourse on these topics through articles and conferences. Steve serves as Vice President of the Interaction Design Association (IxDA); is a regular contributor to UXMatters.com; serves as an editor and contributor to Johnny Holland (johnnyholland.org); and is the founder of UX Book Club – a world-wide initiative bringing together user experience practitioners in over 80 locations to read, connect and discuss books on user experience design. Steve is co-Chair of UX Australia – Australia’s leading conference for User Experience practitioners; and Chair of Interaction 12 – the annual conference of the IxDA for 2012.

Comments on this article

  1. I think what you miss on my cursory reading is that deconstruction is a step towards re-construction. Well, you need to have done construction before the deconstruction in the first place, but then this becomes a bad infinite loop.

    My point is that deconstructing an existing element to its base pieces is PART of a process that needs to co-exist with construction. HOW you build those grains back up into something is as important, if not more so, than how, why, or what you broke down in the first place.

    I think in your previous article you would have alluded to that as synthesis. What I have seen in the design studio is that synthesis is actually the first task before analysis. Not always, but often the case, especially when “research” is not a primary part of the design studio’s processes (something that happens way too often): they build things, usually en masse, as a means of doing the deconstruction itself. In your examples it is most akin to the card-sorting methodology. The method of building/synthesizing is brought together with critique to create a deconstruction that is iterated upon through multiple instantiations of construction. The two become bound together, and their symbiosis is what creates the power to generate ideas and carve them into solutions.

  2. Overall I think this is good, but with two caveats: the dangers of deconstruction, and assumptions about the level of granularity. I also strongly believe deconstruction is never a continual process for refinement & improvement toward perfection. As well, learning to deconstruct and keep the needed tensions in focus is a skill learned over time.

    The dangers of deconstruction are missing one crucial element: time. Lack of time, or attempting to deconstruct and reassemble in a meaningful way in constricted time, leads to the problems you outline. In deconstructing, it is important to understand all of the components down to the most granular level possible (given time constraints, & hopefully over time get to that level). Understanding where things seem to need the most improvement, and how elements appear to interrelate, is important for narrowing deconstruction when time is short. The state of understanding and knowledge of people interacting with an ever-changing technology foundation (and the many-year adoption cycles) is constantly changing, which is going to modify how we understand all that we do.

    The dangers of deconstruction you lay out seem to come down to one common problem, which is the designer keeping many contexts and levels of understanding in tension and view at one time. This is less a problem of deconstruction than a problem of skill level for the person deconstructing. Learning how to deconstruct deeply and broadly is partly tools, but also heavily knowing how to keep many things in tension and within view at once. Having the capability to reconstruct (conceptually or literally) quickly, to see how changes at the molecular or sub-molecular level impact the whole, is very important. But this capability comes from understanding the components we are working with to a very fine-grained level. Once things are broken down to their ultimate granularity, and the various models for how they reconstruct can be held in view with their various correlated and conflicting tensions, the deconstruction becomes much easier. We are working with various ecosystems that all interrelate and interact (technical, human, information, design, semantic (various definitions), use/reuse, tasks, etc.), which all require deconstruction and understanding.

    Tasks are an insanely complex set of sub-ecosystems with elements and sub-elements. Contextual understanding of place, device, motivations, use, constraints, the emotional and intellectual constraints of the people performing tasks, as well as the constraints the task was derived under (which may no longer be viable constraints), is just scratching the surface.

    This granularity takes time and resources, which can be collapsed with proper resources (access to data, information, & knowledge). Key to deconstruction under time constraints is having the depth of understanding to know how one or a few discoveries will change the whole. In complex systems that may be incredibly tedious in and of itself, but have large value if done well.

    Deconstruction helps identify and clarify problems or opportunities for improvement. But understanding the problem is a level of magnitude below understanding the complexities needed to resolve or improve the situation. Understanding the level of deconstruction, and the levels that lie beneath it (untouched), is a very important first step in grappling with deconstruction.

  3. @Dave: the article talks to two principal uses of deconstruction – as a reductionist tool which, in a design context, is fairly useless on its own; and as a tool for Criticism, which has value as an end in itself. Your point about re-construction – synthesis – is well worth remembering (and will come up again in the future).

    @Tom: that is an excellent point. My own thinking was heading in the direction of sheer logistics and the ability of the design researcher to maintain an appropriate high-level view that will enable those synthesis activities to take place. The addition of a time constraint complicates life for the design researcher, but is also a much better representation of reality for all of us.
