Steve Jobs famously said that to be successful, you’ve got to start with the customer experience and work backwards to the technology. Customer experience is all about understanding customer needs — and addressing those needs with products and services.
How do you gain customer insights? Two basic approaches are gathering feedback by asking questions and collecting direct usage data from customers engaging with products or services.
Both approaches have their strengths and their limitations. Interviews and surveys provide attitudinal data (what people say), while analytics yields behavioral data (what people do). The real power lies in their combination.
Want to Learn About Customers? Ask Them What You Want to Know
Yes, it really can be that straightforward.
Traditional customer feedback approaches like interviews and surveys are best when you want to dig deep into people’s perspectives and motives around past, present and future scenarios: “What happened when …?” “What do you think about …?” “Imagine if …”
Directly asking these types of questions is how you understand the why: the root causes behind customers’ actions, preferences, expectations and objectives.
But what people say and what they do are not always the same.
The decisions customers make are often instinctive or subconscious rather than conscious and deliberate, so when they’re asked to describe the “why” behind them, their answers can be more of an interpretation of their own actions, rather than the actual motive behind them. In other words: people don’t always know why they do what they do.
It can also be difficult for people to imagine future scenarios or product features and provide an accurate assessment of how they might interact with them and why.
To offset these challenges, include a mix of closed and open-ended questions. (Surveys naturally tend to have more closed questions than conversational interviews do.) The resulting data will be both structured and unstructured. Structured data tends to be quantitative and can be directly processed by software to derive insights through counts and other descriptive statistics. Unstructured data has no predefined format or organization and therefore needs to be further processed in order to derive conclusions, e.g. through a structured method like qualitative analysis.
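To make that distinction concrete, here is a minimal sketch, assuming a hypothetical survey export with one closed rating question and one open-ended comment field (all field names, responses and theme keywords are illustrative, not from any particular tool). The closed answers can be summarized directly; the open-ended comments need a first pass of qualitative coding before they yield counts.

from collections import Counter

# Hypothetical survey export: each response has a closed 1-5 rating
# and an open-ended comment (names and data are illustrative only).
responses = [
    {"ease_of_use": 4, "comment": "Search works well, but exporting reports is confusing."},
    {"ease_of_use": 2, "comment": "I couldn't find the report export at all."},
    {"ease_of_use": 5, "comment": ""},
]

# Structured data: descriptive statistics fall out directly.
ratings = [r["ease_of_use"] for r in responses]
print("Rating distribution:", Counter(ratings))
print("Average rating:", sum(ratings) / len(ratings))

# Unstructured data: a crude first pass at qualitative coding,
# tagging comments that mention themes an analyst has defined.
themes = {"search": "findability", "export": "reporting", "report": "reporting"}
for r in responses:
    codes = {theme for keyword, theme in themes.items() if keyword in r["comment"].lower()}
    print(codes or "(uncoded)", "->", r["comment"] or "(no comment)")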
Use Analytics to Understand What Customers Did — and What They Might Do
When customers engage with a product or service, they leave traces. For example, when a customer uses software, we can track and trace his navigation through the various screens, how long he stayed on each screen, what strings he entered into a search field, and so on. The resulting data is typically structured.
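As an illustration, here is a minimal sketch of what such structured usage data might look like, using made-up event and field names rather than the schema of any real analytics tool. Because each event has a fixed shape, simple aggregations such as time spent per screen come straight out of the data.

from dataclasses import dataclass
from datetime import datetime
from collections import defaultdict
from typing import Optional

# Hypothetical click-stream events; field names are illustrative.
@dataclass
class UsageEvent:
    user_id: str
    screen: str
    timestamp: datetime
    search_query: Optional[str] = None

events = [
    UsageEvent("u1", "dashboard", datetime(2024, 1, 5, 9, 0)),
    UsageEvent("u1", "reports", datetime(2024, 1, 5, 9, 2)),
    UsageEvent("u1", "search", datetime(2024, 1, 5, 9, 6), search_query="quarterly revenue"),
]

# Because the data is structured, simple aggregations answer
# "what did the customer do?" questions directly.
time_on_screen = defaultdict(float)
for current, nxt in zip(events, events[1:]):
    time_on_screen[current.screen] += (nxt.timestamp - current.timestamp).total_seconds()

print(dict(time_on_screen))          # seconds spent per screen
print([e.screen for e in events])    # navigation path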
The focus of analytics is to understand what customers do by investigating objective usage data, though it’s not always (or even usually) clear from telemetry data alone why customers behaved a certain way. Say, for instance, a user lingered on a screen without any interaction and then shut down the software. Is it because she didn’t know what to do on that screen and got frustrated? Or was it because she was interrupted by something that needed her attention? Or did her device simply run out of power?
Because analytics report on customer behavior that already happened, its focus is on the past. Yet, historic data can be used to predict future behavior. Machine learning techniques like Markov chains or Sequential Pattern Discovery using Equivalence classes (SPADE) allow the prediction of navigation steps.
Recurrent neural networks or Seasonal Auto-Regressive Integrated Moving Average (SARIMA) models can predict time-series events from past data.
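To make the Markov-chain idea tangible, here is a minimal sketch that learns screen-to-screen transition counts from made-up navigation sessions and predicts the most likely next screen. SPADE, recurrent neural networks and SARIMA models would typically come from specialized libraries and are not shown here.

from collections import Counter, defaultdict

# Made-up navigation sessions (lists of screens, in the order visited).
sessions = [
    ["dashboard", "reports", "export"],
    ["dashboard", "search", "reports", "export"],
    ["dashboard", "reports", "settings"],
]

# First-order Markov chain: count how often each screen follows another ...
transitions = defaultdict(Counter)
for session in sessions:
    for current, nxt in zip(session, session[1:]):
        transitions[current][nxt] += 1

# ... then predict the most likely next screen given the current one.
def predict_next(screen):
    counts = transitions.get(screen)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

print(predict_next("dashboard"))  # 'reports' (2 of 3 sessions went there next)
print(predict_next("reports"))    # 'export'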
When Feedback and Analytics Join Forces
It’s probably clear by now why these two forms of insights must work in tandem to intelligently inform customer experience decisions during the product design process.
There are three primary ways to unite this CX power couple for maximum insight, and each approach can benefit customer experience design in different ways:
1. Run customer feedback efforts and usage analytics simultaneously. A primary benefit of this approach is that it allows you to analyze insights from both data sources with a focus on identifying commonalities and discrepancies.
Commonalities are findings that are backed up by both customer sentiment and usage data (e.g., what a customer says she did matches up with what she actually did), while discrepancies are findings that are in conflict.
An example of a discrepancy is when a user says in an interview that she only rarely navigates to a certain module in a software product, but the analytics data shows that she visits that module quite frequently.
Identifying this type of contradiction should inform a follow-up round of feedback to further investigate the disconnect; a minimal sketch of such a cross-check follows this list.
2. Run analytics first to discover usage patterns. What do customers do? What are their navigation paths? What screens do they spend their time on? Gathering this type of data first lets you determine which usage patterns you want to understand in more depth during the feedback-gathering process. Sample questions informed by these usage patterns might be: Why do they navigate the way they do? Does the organization of screens fit their needs? How could it be improved?
3. Carry out feedback first to uncover themes in customer attitudes, expectations and objectives. This approach is great for determining whether people will behave the way they say (or predict) they will. For example, a customer might say that when he is using a content management system to find a certain piece of information, he is fine following a navigation path that you have identified as having more steps than necessary and therefore no longer optimal. In his mind, his chosen path makes more sense, so he’s fine taking the extra steps.
Reviewing analytics data about his usage afterwards lets you verify whether he indeed keeps choosing the non-optimal path rather than adopting the new one, or whether he, viewed here as a proxy for other users, will naturally adopt the new path over time.
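The cross-check referenced in the first approach above can be sketched very simply. Assuming hypothetical interview answers about visit frequency and hypothetical visit counts from analytics (the user IDs, answers and ranges below are illustrative), a few lines are enough to flag which users show commonalities and which show discrepancies worth a follow-up conversation.

# Hypothetical data joining both sources: what users said in interviews
# about how often they visit a module vs. what the usage logs recorded.
stated = {"u1": "rarely", "u2": "daily", "u3": "rarely"}        # from feedback
observed_visits_per_week = {"u1": 14, "u2": 9, "u3": 1}         # from analytics

# Crude mapping from a stated frequency to an expected visit range per week.
expected_range = {"rarely": (0, 2), "weekly": (1, 7), "daily": (5, 50)}

for user, answer in stated.items():
    low, high = expected_range[answer]
    actual = observed_visits_per_week[user]
    status = "commonality" if low <= actual <= high else "discrepancy"
    print(f"{user}: said '{answer}', visited {actual}x/week -> {status}")

# u1 is a discrepancy worth a follow-up interview; u2 and u3 are commonalities.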
The more angles from which you can look at your customers, the better. After all, if not even your customers know why they do what they do, it’s unlikely that you’ll be able to predict what they’ll respond to best without doing your data due diligence.