In the autumn of my year at a medical device incubator, my team, a group of scientists, engineers and doctors, was convinced of the utility of our product idea. Then our first pilots showed very different results from what we had expected; our customers did not think like we did. We responded by tweaking the UI, making sure the consumers really understood our value proposition, and adding features to the product. We launched another set of pilot tests, hopeful for a better result. Yet we ended up finding much the same user response. It turned out that consumers fully understood our value proposition, but just did not respond to it as we had hoped. Stories like this are commonplace for those of us involved in product design. But facing up to the fact that actual users just weren’t into what a team of smart, expert Ivy League fellows thought was perfect for them turned out to be my most significant learning experience in user-centric design.
We are humans, designing for other humans. With that comes the reality of the peculiarities, irrationalities and idiosyncrasies, collectively called biases, that we all share. We all know that sometimes our eyes deceive us with optical illusions. But our biases also deceive us when we design products. What I learned during the process of failing with our product idea was this: before uncovering the unmet needs of our consumers, it is important to uncover the filters and lenses we use in our own discovery processes. We were not simply listening to our customers; we were listening to them through a filter of our own inherent biases.
The first and most pervasive of these is Confirmation Bias: the tendency to interpret new evidence as a confirmation of our existing beliefs or theories.
We humans are built to be pattern finders. We look for patterns in our environments and very quickly form an expectation and hypothesis around those initial patterns. Once we have a mental hypothesis, we tend to pick and choose the evidence that fits our initial pattern. Psychologists and economists have studied this phenomenon extensively in contexts ranging from finance and criminal justice to beliefs about health and policy.[1]
For product design, confirmation bias can result in a vicious cycle of non-innovation, in which we end up chained to the often arbitrary and uninformed initial beliefs we started off with.
As we begin our discovery process for new products, our brains are hyperactively finding patterns and making mental models. We are then off on a mission to find evidence that supports our initial patterns and models. Without realizing it, we let those first few customer sites and observations shape and slant our product design far too much.
We all exist in the context of companies that sell products. When we observe customer realities, say through field visits, we try to funnel those realities into the product portfolio we currently have or plan to have in the future. This tendency to reshape the unmet need into the vocabulary of our existing product patterns and portfolio distorts reality.
Finally, the group of people to whom we present our findings, observations and arguments usually already shares our mental models and holds the same beliefs about the market and the customer. Hence, they are more receptive to certain arguments than to others. The level of evidence required to change a colleague’s mind is significantly greater than that needed to support what he already believes. While we may all want to build disruptive products, our organizations are much more receptive to product design ideas that support the status quo than to those that disrupt it.
Confirmation bias can slant our observations and distort our findings to fit our current product mix. Moreover, it produces easy answers that are more convenient for the organization. But, in the long run, we are setting ourselves up to be the victims of disruption, not the victors.
So now that we understand some of the ways confirmation bias can affect our design process, what can we do to design around this bias? And how might we reduce it in our organizations?
- Multiple perspectives in the discovery process: Instead of making one person or one function responsible for discovery, bring in teams that do not share the same mental model. Adding different perspectives to the observation process may not eliminate bias, but it surfaces competing readings of reality that can then be examined more deeply. Having several people independently summarize the same customer visit will reveal differences in observation and allow a deeper analysis to get closer to the truth.
- Push teams to prove that their assumptions are wrong: Having teams explicitly present data or observations that don’t fit the current hypothesis, and allowing some degree of amorphousness in the initial discovery process, is crucial to making sure we don’t push our teams toward a simple, easy solution. Some companies even split product teams in two and have each half argue opposing viewpoints to refine and check the assumptions.
- Prototype and test aggressively. Allow for many failures: When we understand that our first observations will not be correct and plan for failure, we allow teams to find the correct answer instead of going with the first hypothesis.
- Pick initial customer sites to be representative, and refrain from forming any active product hypotheses too soon. Knowing that the initial sites can slant our perspective should force us to be extra careful in sampling from a larger pool of sites.
There could have been many reasons why my team’s initial product idea failed to resonate with our customers, especially since human biases exist on both sides of the design equation, in human designers and human users alike. The important thing is to be aware that these biases exist, to understand how they affect our product design, implementation and use, and then to make sure we either design around bias or, sometimes, use bias to help with design.
Citations:
[1] Nickerson, R.S., 1998. Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), p. 175.