👀 Q&A Research and Discovery Part 2: Methods and mindset

This is part two of a Q&A from last fall. Part 1 can be found here.

Remember that these are things that have worked for me, and experiences I have gained over the years of working with user and customer research. They are not universal truths, and every organisation and project is unique. As always, remember to try, fail, learn and do it all over again.

In this article we will cover:

  1. How do you get honest answers from the respondents?

  2. Any question you always ask?

  3. What is the right cadence, and how do I know we’re getting it right?

Next time I will cover how I work with both quantitative and qualitative data; that article will be released next week.

🗣️ How do you get honest answers from the respondents?

Original question: What is the best way to understand what customers want, in order not to make mistakes because the customer says something that is not what they want?

⏱️ Short answer: There is no short answer to this question.

🦥 Long answer: OK, this is another tricky one, because this is what you aim for all the time. Deep down, everyone wants to do good and do the right thing, which means all respondents will consciously or unconsciously try to please you. Some people also have a much harder time expressing themselves verbally, and it can be difficult to interpret what they actually mean by what they do or say. When planning a study, you think about this constantly: what kind of questions you should ask, and how to ask them. Everything revolves around this, and below is what I have found to work fairly well.

Spot the bullshit approach

What I mean by this is planning your study so you can detect discrepancies in the respondent's story. In the same way you work with reliability in other studies, you basically ask the same questions in different ways and see if you get the same answers. In this case, asking the respondents to tell a story rather than asking a straightforward question generates better and more reliable results, because the respondents don't feel that you expect specific answers: you haven't asked a question, you have just asked them to tell a story. This, of course, requires a bit more from you as a moderator. You need to make sure your question gets answered by the story, which means you have to really listen, pay attention and make them elaborate.

A few examples of spotting the bullshit

Your objective is to understand more about their reading habits, and you know there is an underlying wish to read as much as possible, because there is a societal norm that the more you read, the more educated and sophisticated you are. So you start by asking them something like “tell me more about how you incorporate reading into your everyday life”, and you let them elaborate in detail: in which format they read, what kind of texts or books they read, why they choose to do it like that, and so on. Later on, you ask “how many times a week do you read?”. You notice really fast if those two answers don't go together, and you can start digging a bit deeper into why their stories don't match. That is often because they feel there is a right or wrong answer, or that they might be judged by the answer they give you. But when they tell a story, there isn't really a right or wrong.

Another example is from when I did plenty of interviews about people's personal finances. This is a subject where it is quite challenging to get honest answers: people want to appear responsible, so they tend to tweak the truth a bit. So start by asking questions about how they manage their savings, not how much or how regularly they save.

Another way to explore the truth beyond what the respondents actually say is to ask them how they would explain something to someone else. This is especially suitable when you are investigating something fairly complex. What I once noticed was that when respondents got the task of explaining something to someone else, they explained it very factually, using facts they had read online or got from experts. But when they told me the story of how they had done it in real life, their actions were based only on earlier experience and knowledge. So the conclusion was that earlier knowledge and experience have a stronger impact on future behaviour and decision-making than reading about how to do it in a book or online.

♾️ Any question you always ask?

⏱️ Short answer: Yes, no and it depends

🦥 Long answer: Of course there are plenty of questions I always ask regarding consent and so on, but if we skip those, it depends on what organisation I have worked in and what my overall objective has been at that company. But let's say I work as a researcher in an organisation for a longer period. Then I always set up a set of qualitative KPIs: things which are important for your organisation to achieve.

These are mostly value words like “simple”, “easy” and “safe”: things you say you want to achieve with your product. Then I try to detect these words, or words that relate to these values, in interviews and tests. This makes it possible to see how mindsets and values related to your product or service change over time.

Other KPI questions relate to how our customer base or users mature with our product, or alongside society. When I worked at a bank, I saw considerable changes in the acceptance of a fully digital bank with no physical or personal service. When I started asking about it seven years ago, numerous people would not accept a bank that didn't have physical offices and personal service. That group decreased heavily during those seven years. This could be identified just by asking “what do you value the most when choosing a bank for your mortgage?” every time I had an interview or did a test, which made it possible for me to monitor that KPI over time.

So I think most research departments would do well to have some of these standardised questions that can be analysed over time: changes in mindset towards your product, or changes in mindset in general, which could affect how people feel about or interact with your product.

🚴🏻‍♀️ What is the correct cadence for doing user research, and how do I know it is the correct one?

⏱️ Short answer: There is no formula for finding the correct cadence for how often you should do user research or tests with users. It depends on budget and resources, and on the balance between what you put into it and how much you can get out of it. But what I can say is that it is like working out: it is better to do 15 minutes 4 days a week than 3 hours once a month, both to make the most of the insights and to create long-term habits.

🦥 Long answer: First things first: what is a cadence? We can call it a rhythm. In cycling, cadence is how fast you pedal; the faster you pedal, the higher the cadence. If you have a high cadence, you might not get as far with every stroke as you would with a low cadence, but the two serve different purposes.

Doing big studies, such as interviewing fifteen people for 60 minutes each, requires more planning and much more time in analysis, which means you might do that once a month or every second month. But goal-oriented usability tests, which require less planning and only take about 20 minutes, you might do every week.

So how do you know you have found the right cadence? You will probably always set the bar a bit higher, never be fully satisfied, and have the feeling you should do more. But when something starts to feel like a habit, and when you get much more out of it than you put into it, you have probably come close to the correct cadence for your product, given your organisation's maturity at that moment. That doesn't mean it will be the right cadence in the future: when the team has grown and the product is much more complex, you might need those big studies to become much more frequent. Set a realistic ambition to start with, prepare it well, and try to make it a habit before you evaluate it and start changing it. You need to feel comfortable with the cadence you have chosen before ramping it up.
