The human brain is a marvel, but it is not perfect. Its imperfections riddle our perception with what are called cognitive biases.
What is a cognitive bias?
A cognitive bias describes a replicable pattern of perceptual distortion, inaccurate judgment, or illogical interpretation. Here are 10 common cognitive biases that can affect how we design, interpret, and use data from primary market research (surveys, usability tests, focus groups, etc.):
- RECENCY BIAS: Tendency to weight recent events more heavily than earlier events. When we ask people about their past behavior, their answers are more likely to reflect their latest actions, the ones they remember best. Question wording can help to minimize this bias, but sometimes it is inevitable, so factor it into your data interpretation.
- HINDSIGHT BIAS: Tendency to see past events as having been predictable at the time they happened. Sometimes research results yield such obvious insights that even if we didn’t think of them at first, once we see them they feel so logical that we suddenly believe we had “a hint” of it from the beginning (e.g. regular customers having a more positive attitude towards our brand). This can be dangerous in the management suite, where executives may conclude that no more research is actually needed since they “knew it all along.” Document whatever knowledge exists in the organization, use it to avoid repeating research already done and to compare with current research, and unveil the hindsight bias when it sneaks into the discussion of the results.
- EMPATHY GAP: Tendency to underestimate the influence or strength of feelings, in either oneself or others. This bias can lead to over- or under-statement of certain behaviors when people are asked about them in a research setting (e.g. “How likely are you to buy brand X?”). To fill the gap, we need to strive to provide a realistic context for the questions we ask, so that respondents can see themselves in the situation and give more accurate answers.
- CONFIRMATION BIAS: Tendency to search for or interpret information in a way that confirms one’s preconceptions. This common bias is why we must be very careful about how questions are worded in surveys, interviews, or focus groups. Unfortunately, it is harder to combat this bias in data interpretation. Internal politics, personal goals, or simply lack of knowledge can turn research users into cherry pickers: they will consider certain results and ignore others. I see this all the time in focus group settings, when clients behind the mirror get hung up on the response of a particular participant who confirms what they already believe about an issue.
- SOCIAL DESIRABILITY BIAS: Tendency to over-report socially desirable behaviors and under-report socially undesirable ones. If you ask questions in a way that makes people feel they may be judged, they will give you socially desirable answers. Good questionnaire design is key to avoiding this bias. Trade-off exercises, such as MaxDiff, can also help.
- ANCHORING: Tendency to rely too heavily, or “anchor,” on one piece of information when making decisions. This is an issue in conjoint studies when we ask respondents to make a choice considering too many or too complex variables. The required cognitive effort is so great that respondents end up using one or two variables to make decisions and ignoring the rest of the information. In some situations that is how people in fact make decisions, and there is no fault in mimicking such scenarios. However, sometimes choice tasks are made unnecessarily cumbersome for respondents in an effort to measure as many things as possible at one time, for either timing or budgetary reasons. Give fewer choices and control the amount of information presented if you want to understand the effect of different variables.
- FRAMING EFFECT: Drawing different conclusions from the same information depending on how the information is presented. We can never escape this one, so we need to be aware of how it affects research participants’ answers to information provided in a question. For example, respondents will reply with a different set of brands if we ask them which clothing retailer versus which online clothing retailer comes to mind. The framing effect is also omnipresent during research reporting: using different perspectives to present the results can lead to different conclusions. Be aware of it.
- IRRATIONAL ESCALATION: Tendency to justify increased investment in a decision based on the cumulative prior investment, despite new evidence suggesting that the decision was probably wrong. This one is big in the management suite. Sometimes senior management, brand managers, or product managers get so invested in a product idea that, regardless of what the research says, they will still do what they intended to do. They often dismiss the research and try to poke holes in the methodology, but deep down they simply feel they can’t back out of an idea they have already spent so many resources on. They look back to what has been invested, not forward to what they could save by cutting their losses. Try to make a strong case for what they stand to gain by letting go of an idea that is bound to fail.
- KNOWLEDGE BIAS: Tendency to choose the best-known option rather than the optimal one. Generally, we are wired to avoid risk, and a known option (e.g. brand, product, service, provider) is often preferred over a better-looking but unknown one. We should include questions that capture the inertia effect this bias introduces, to minimize over-statement of certain behaviors (e.g. likelihood to buy, switch brands, etc.) in a research setting.
- CURSE OF KNOWLEDGE: Failure to understand how things affect people who don’t have the knowledge we have about a particular subject. There is such a thing as being “too close to an issue,” which is why independent research is necessary to identify problem areas for users of a product or service that do not constitute a problem for the maker or provider. We see this all the time in usability testing, particularly with websites or apps developed by engineers who are unable to put themselves in the users’ shoes.
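As an aside on the MaxDiff trade-off exercise mentioned under social desirability bias: in its simplest form, MaxDiff results can be summarized with best-minus-worst count scores. The sketch below uses entirely hypothetical response data and a basic counting approach (more rigorous analyses use hierarchical Bayes or logit models); item names and the `maxdiff_count_scores` helper are illustrative, not from any particular tool.

```python
from collections import defaultdict

# Hypothetical MaxDiff responses: each task shows a subset of attributes
# and records which one the respondent picked as "best" and "worst".
tasks = [
    {"shown": ["price", "quality", "brand", "eco"], "best": "quality", "worst": "eco"},
    {"shown": ["price", "quality", "service", "eco"], "best": "price", "worst": "eco"},
    {"shown": ["brand", "service", "price", "quality"], "best": "quality", "worst": "brand"},
]

def maxdiff_count_scores(tasks):
    """Best-minus-worst counts, normalized by how often each item was shown."""
    best, worst, shown = defaultdict(int), defaultdict(int), defaultdict(int)
    for t in tasks:
        for item in t["shown"]:
            shown[item] += 1
        best[t["best"]] += 1
        worst[t["worst"]] += 1
    # Scores range from -1 (always picked worst) to +1 (always picked best).
    return {item: (best[item] - worst[item]) / shown[item] for item in shown}

scores = maxdiff_count_scores(tasks)
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```

Because respondents must trade attributes off against each other rather than rate each one on a scale, everything cannot be “very important,” which is what makes the exercise resistant to socially desirable answering.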
There are many other cognitive biases that plague us all, but these 10 are some of the most relevant to market research. Ignoring them could result in badly designed research and misleading conclusions.