UX Magazine

Defining and Informing the Complex Field of User Experience (UX)
Article No. 1093 September 23, 2013

When Does Quantity Become Quality? How to navigate big data

"What I discovered yesterday was that we are now seeing, for the first time, what happens when quantity becomes quality."—Gary Kasparov, after winning against IBM’s Deep Blue in game two to draw level in the six game series

Whatever your view of the definition or competencies of a UX designer, there is no question that we work at the interface between humans and computers. It should therefore come as no surprise that a solid understanding of both the relevant technology and user behavior is required to deliver the best possible user experience.

Among the current crop of UX designers, strategists, and architects entering the field from visual design or software development backgrounds, there has been a steady increase in demand for experience in research-related disciplines. Research skills fill a gap in the UX team by going beyond visual and interaction design to understand the motivations behind user behavior, using a combination of qualitative and quantitative research techniques.

Techniques established in statistical and bioinformatics analysis already account for a significant proportion of current practice in user behavior research performed as part of UX activities such as A/B testing and heat maps. With the growing number of usability tests being conducted, the depth and breadth of the user behavior knowledge base are expanding at a rapid rate.
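
To make one of these borrowed statistical techniques concrete, here is a minimal Python sketch of a two-proportion z-test on A/B test results; the conversion counts and visitor numbers are invented for illustration rather than taken from any real study.

import math

# Hypothetical A/B test results: conversions out of total visitors per variant.
conversions_a, visitors_a = 130, 2000   # variant A (control)
conversions_b, visitors_b = 170, 2000   # variant B (new design)

rate_a = conversions_a / visitors_a
rate_b = conversions_b / visitors_b

# Pooled rate under the null hypothesis that both variants convert equally well.
pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
std_err = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))

z = (rate_b - rate_a) / std_err
# Two-sided p-value from the standard normal distribution.
p_value = math.erfc(abs(z) / math.sqrt(2))

print(f"A: {rate_a:.1%}, B: {rate_b:.1%}, z = {z:.2f}, p = {p_value:.4f}")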

While “Big Data” is causing excitement in the IT and business sectors, the practice and concept of data analysis are not unfamiliar to those coming from a research background. The intersection of business intelligence and user-centric design therefore creates both demand and opportunity for UX professionals with specialist knowledge in user research to create more impact with their work.

Qualitative and quantitative methods let you treat a problem by linking symptoms to the root cause

Aside from the hype generated by marketing, the real challenge is dealing with data that is being collected in greater amounts at a faster rate from more sources: the so-called “Three Vs (Volume, Velocity, Variety) of Big Data.” To help you navigate the oceans of information, here are some general pointers about metrics and data analysis to set you on the right course.

Qualitative and Quantitative

Generally speaking, most usability metrics deal with either qualitative or quantitative data. Qualitative researchers ask broad questions of their subjects with the intention of uncovering patterns and trends. This type of research is typified by the classic Likert scale, commonly seen in online customer surveys, where the questions and answers don’t easily lend themselves to comparison or sophisticated analysis.

More complex examples of qualitative data include: video footage of user interactions with the system, tester observations of user navigation pathways, notes taken about problems experienced, comments/recommendations, and answers to open-ended questions.

By contrast, the quantitative researcher asks specific questions with a narrow focus and collects numerical data from participants to answer them.

Quantitative data is usually associated with information collected through anonymous user statistics, eye tracking, click-mapping, and other methods that yield specific measurements which can then be subjected to rigorous statistical, mathematical, or computational analysis techniques.

Some examples of quantitative data include: success rates, task time, error rates, and satisfaction questionnaire ratings. However, the distinction isn’t always so clear: once you start collating qualitative research results, it is possible to quantify qualitative data, which can then be interrogated further.
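
As a minimal sketch of how this quantification might look in practice, the Python snippet below tallies issue codes assigned to open-ended usability-test comments during content analysis; the comments and codes are hypothetical examples, not data from a real study.

from collections import Counter

# Hypothetical open-ended comments, each tagged with one or more issue codes
# during content analysis of usability test notes.
coded_comments = [
    {"comment": "Couldn't find the checkout button", "codes": ["navigation", "visibility"]},
    {"comment": "The error message made no sense", "codes": ["error-handling"]},
    {"comment": "Kept scrolling past the search box", "codes": ["navigation"]},
    {"comment": "Wasn't sure if my order went through", "codes": ["feedback", "error-handling"]},
]

# Counting code occurrences turns qualitative observations into quantitative data
# that can be ranked, compared across releases, or analyzed statistically.
tally = Counter(code for item in coded_comments for code in item["codes"])

for code, count in tally.most_common():
    print(f"{code}: {count} mention(s)")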

With bigger, better, and faster tools on the horizon, we won’t need to treat the two as separate areas; we can simply choose the appropriate tools for the questions we want to ask. However, when the resources allocated for user or usability research are scarce or inadequate, it’s not uncommon to see only qualitative or only quantitative research conducted, because the time, cost, and personnel may not always be available. Even if your usability research is contracted out to specialist agencies, care must be taken to remain critical and scrutinize the results they generate.

Qualitative or Quantitative?

There are some very common pitfalls when you deal exclusively with results derived from only qualitative or only quantitative studies, and hopefully you will also come to the conclusion that combining both in your usability research and testing is the best way to go.

It’s a logical assumption that for direct feedback about usability you should always go straight to the user and record attitudes, feelings, and behaviors. However, there’s enough literature and research out there to suggest that what users say doesn’t always reflect what they actually do or think. Qualitative research is much more likely to reveal what users think and feel, but it doesn’t always show what they actually do, or how and why they do it.

Qualitative analysis allows us to test and validate assumptions about user behavior directly with users, but it is limited by the complexity of the questions that can be asked and by the ability to dissect or drill down into more specific details. Nevertheless, it remains a very valuable tool for framing the big-picture questions required at the beginning of a usability study and should always be the starting point for UX practitioners. It also provides a focus for quantitative research and the context for the results that quantitative work produces.

The use of quantitative research is a double-edged sword for many organizations, and there are inherent risks in relying on the figures alone. The volume and detail of the data collected can provide very powerful insight for decision makers, but there are also drawbacks in terms of the effort required to obtain and process the information, not to mention the difficulty of distilling useful insights from the background noise.

It’s also easy to jump to the wrong conclusions based on a weight of evidence that is open to interpretation without fully understanding the whole context of the user interaction. Just consider how difficult it is to arrive at the correct interpretation of a series of actions when any number of motivations and reasons can be hidden behind a single mouse click.

Since there are many reasons why user behavior is not fully reflected in qualitative research alone, it is important to use quantitative research results to verify qualitative findings. For example, if users feel that the system is slower but benchmark tests gathered from usage statistics show that it is in fact faster, then it’s important to look at other usability issues that might be causing that perception.
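
One possible way to set up that kind of cross-check is sketched below in Python, using invented perceived-speed ratings and measured task times for the same task before and after a release; the data and the simple mean comparison are purely illustrative.

from statistics import mean

# Hypothetical data for the same task before and after a release.
# Perceived speed is a 1-5 rating (5 = "feels fast"); measured time is in seconds.
perceived_before, perceived_after = [4, 4, 5, 3, 4], [2, 3, 2, 3, 2]
measured_before, measured_after = [12.1, 10.8, 11.5, 13.0, 12.4], [8.9, 9.3, 8.5, 9.8, 9.1]

faster_in_benchmarks = mean(measured_after) < mean(measured_before)
feels_slower_to_users = mean(perceived_after) < mean(perceived_before)

if faster_in_benchmarks and feels_slower_to_users:
    # The numbers alone say "success"; the perception data says something else is wrong,
    # such as missing progress feedback or a jarring change to the workflow.
    print("Benchmarks improved but perceived speed dropped; look for other usability issues.")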

Similarly, if users feel that they are completing tasks more successfully but more mistakes are being logged, it’s crucial to look at interface or workflow issues. When dealing with the complex behavior of users, it is impossible to take into account all the factors behind every action and decision. This is a good reason to combine multiple streams of quantitative and qualitative data in your studies so you can stay on the right track.
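
A minimal sketch of combining the two streams, assuming hypothetical per-participant task metrics and observer notes keyed by participant ID, might look like this in Python:

# Hypothetical records from the same usability session: quantitative task metrics
# and qualitative observer tags, both keyed by participant ID.
task_metrics = {
    "P01": {"completed": True, "time_s": 95, "errors": 0},
    "P02": {"completed": True, "time_s": 210, "errors": 3},
    "P03": {"completed": False, "time_s": 300, "errors": 5},
}
observer_tags = {
    "P01": ["confident"],
    "P02": ["hesitated at form", "re-read instructions"],
    "P03": ["gave up after error", "visibly frustrated"],
}

# Joining the two streams lets each number carry its context: slow times and
# error counts sit next to the observed reasons behind them.
for pid, metrics in task_metrics.items():
    tags = ", ".join(observer_tags.get(pid, [])) or "no observations"
    outcome = "completed" if metrics["completed"] else "abandoned"
    print(f"{pid}: {metrics['time_s']}s, {metrics['errors']} errors, {outcome} ({tags})")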

Conclusion

With all the data we’re collecting, it’s important to step back and think about what all of these values represent. Research is as much about asking the right questions as it is about being able to come to the correct conclusions. Some warn against defining and collecting metrics that may not be of any value, and suggest that we should instead put more effort into things that we can understand and act on.

Data should not be used to find evidence that supports our own opinions and assumptions. Data gives us an opportunity to reach out to more users and understand them better. This comes with greater responsibility for UX professionals, who need to test and validate the “insights” gleaned more rigorously before rushing to implement them, as Lou Rosenfeld has pointed out.

One of the key take-home messages for UX practitioners from Comparative Usability Evaluation 8 (CUE-8), in which Rolf Molich’s DialogDesign looked at usability parameters for the Budget car rental website, is to “Combine qualitative and quantitative findings in your report. Present what happened and support it with why it happened.” By applying qualitative and quantitative research methods, you are able to treat a problem by linking the symptoms (what happened) to the root cause (why it happened).

With this in mind, I hope you’ll consider including both qualitative and quantitative data as part of your next UX research project.

Image of chess pieces courtesy of Shutterstock

ABOUT THE AUTHOR(S)

Michael Lai is a freelance UX architect and consultant specializing in infographic and data visualization design. He has worked and consulted across a number of industries (hospitality, research, IT, science, and engineering) and has covered many UX-related roles, including user research, copywriting, training, graphic design, business analysis, and information architecture. In his spare time he works on iPhone app and hardware integration, startups, and non-profit organizations.

Comments

Good user research typically combines qualitative and quantitative data. Choosing one at the expense of the other leads to an incomplete picture, and to an effort that adds little real value to product design.

Victoria Chattopadhyay
www.atotalx.com

I am somewhat confused by your consideration of Likert scales as qualitative, when they actually collect closed-ended, numerical data.

It is difficult to classify qualitative and quantitative questions without an actual example, and in many ways there are overlaps. You can turn qualitative data into quantitative data (and vice versa) by transforming the question that you ask. If you ask a user to rate how easy the software interface is to use, the question is qualitative in nature (as opposed to working out the completion rate), but you can collect SUS responses from a large number of users for different applications and compare them to each other. I think the terminology is less important; what matters when collecting data is to use questions that uncover both the what and the why of your investigation, so you get the whole picture.

Thank you for your response. For me, precision is very important (I am a college professor and researcher). What we call things is the only way we have to communicate about them, especially when we attempt to teach others. There is no way a response on a Likert scale can be qualitative, even if it is one question asked of one user. I wonder what you define as "qualitative" research. Categorical variables that you cannot run advanced statistics on are still quantitative. Descriptive statistics are still quantitative. You can transform qualitative data into quantitative through content analysis, but not the other way round. I think that when we try to teach people we have a responsibility to do our best to not misinform.

In order to understand the significance of the data collected, both the precision and the accuracy of the information must be assessed. I agree with the comment that a common language must be used to facilitate communication, but we don't want to run the risk of excluding a broader audience from the discussion by losing them in technical jargon or acronyms. Indeed, one of the purposes of writing the article was to provoke thought and discussion on this topic. For the purposes of this article, I would suggest that qualitative research in the context of UX design is aimed at uncovering the aspects of a company/brand/product that influence the overall user experience (e.g. visual and interaction design, user behaviour, customer satisfaction), while the complementary research that yields quantitative data seeks to understand the factors that contribute to these broader areas of the overall user experience (e.g. the use of colours, fonts, design patterns, etc.). Regardless of the semantics, I am simply calling for a more prudent use of the information available when drawing conclusions about user behaviour, and reinforcing the point that we need to do more user research rather than relying on the expert opinion of UX designers alone.

I cannot agree more that we need to conduct more user research. And also, that some user research is better than no user research. But ideally, we would conduct *good* research that is informed by proper understanding of methods. Neither you nor I need to define qualitative or quantitative research, and "avoiding technical jargon" is a poor excuse for imprecise knowledge. There are plenty of research methods books and courses (I happen to teach some at the graduate level), and, of course, online materials. Let's use them.

Perhaps this is a topic that deserves more attention than just two people swapping comments. I think the people carrying out the research know what they want to achieve (whether they use the correct term for it is another matter), but in many cases they fail to use the correct method to achieve the results. Within the social sciences, the terms qualitative and quantitative research have stricter definitions because they are placed within the context of the methodology employed to collect and analyze the data. Scientific research follows an iterative and complementary process of formulating hypotheses based on observation, and validating (or refining) those hypotheses by understanding the underlying mechanisms or model of the observed phenomenon (i.e. working out what happens, and why it happens). The trouble with UX research is not whether people apply the research method correctly, but whether the correct research method is being applied. Maybe the term 'exploratory' research should be used for studies that find out 'what' happens, while the term 'investigative' research should be used for studies that try to understand 'why' it happens.

That makes more sense, but as I said, there's no point reinventing the wheel. I think you might enjoy books such as http://amzn.com/0123848695. Best of luck to you!