How to avoid cognitive bias in research and design.
By Aditi Kant
April 20, 2020
Listed below are examples of popular biases and ways to overcome them. By asking better questions and becoming a more empathetic, creative and open-minded researcher, you will get the best possible data from your subjects. Whether you are managing clients or drawing better data from existing research, it is crucial for both researchers and designers to overcome their biases and choose their words and actions thoughtfully.
By definition, the bandwagon effect is a psychological phenomenon whereby people do something primarily because other people are doing it, regardless of their own beliefs, which they may ignore or override. In other words, it’s the herd effect.
When I was growing up, I often asked my mum for things because all of my friends had them. Can you guess my mother’s response? “If everyone is jumping in the well, will you jump too?”
Why do people prefer popular opinions? To learn what is correct, we look at what other people are doing. A second reason others influence us is that humans are social creatures: we have survived because of our ability to come together, and following the group is built into our minds. How often, as designers, do we find ourselves simply following the current trends?
Until recently, ‘every’ designer added those dreaded onboarding screens to apps, fully expecting users to read and educate themselves before use. In reality, people simply ignored them and got started. At Digital of Things we experienced a similar case with a recent client: all participants skipped the onboarding screens and landed on the home screen knowing nothing about the app. The client now runs a real risk of losing these customers. A simple solution would be to provide some context, e.g. coach marks on the relevant screens. By following the bandwagon, the business could stand to lose a lot.
As designers, it is our duty to create something relevant and ‘useful’ for our users, knowing that they will be blind to many of the elements we spent precious time creating. Let’s promise ourselves not to be victims of the ‘bandwagon effect’ and to continue contributing meaningful design solutions. Always think twice and validate your designs: even if ‘everyone’ is doing it, it may not be the optimum solution.
As the name suggests, functional fixedness is a bias that limits a person to using an object only in the way it is traditionally used. It is one of the biggest deterrents for designers exploring solutions. To solve problems and create innovative solutions, we need a clear and open mind; a designer who is blind to the alternate uses of an object limits their problem-solving capabilities and stifles their creativity. The way to overcome functional fixedness is to generate outside-the-box ideas through abstract thinking. Only this will help us identify the correct problem and create a relevant solution.
First of all (and most importantly), stay creative! Find a balance between creativity and breaking conventions for no user gain. While creating a solution demands a lot of creative thinking, we may end up simply following conventions; existing component libraries can restrict our design thinking. Don’t get me wrong, I am the biggest proponent of following standards and reducing the cognitive load on end-users. The issue arises, however, when we put our thinking in a box and don’t look beyond it. For example, button-like components are now used instead of radio buttons or checkboxes.
The above example is a great way of providing a bigger target area on touchscreen interfaces (and is probably the only time you should ‘box your thinking’). If designers had restricted button-like components to form submission alone, traditional radio buttons would still be the norm on forms!
People with information bias tend to seek more information than they need to make a decision. I’m sure many of us have been asked by clients to shorten the timelines for the discovery phase for this very reason: it can even feel like a waste of time and money, as no tangible deliverable comes out of this phase. Here we must ask: “How much information is enough?” “How do we decide what information is relevant and what is unnecessary?” Discovery is for discovering, not validating. This means we have to be authentic in our approach to learning through discovery, and not just look to have our existing ideas confirmed.
As researchers, we do not have to test everything – in fact, we shouldn’t! We should identify the areas of high risk or low certainty and test those. This enables focused feedback from users and a more manageable list of changes for the designers.
As designers, we are not blocked if we don’t have every piece of information at the start of a project. ‘Lean UX’ (which aims to reduce waste and provide value, combining the solution-based approach of design thinking with the iterative methods of Agile) introduced the great practice of generating assumptions in a workshop at the outset of the project, or as frequently as needed. It gives designers a direction to start in and the validation needed to move the project forward.
The curse of knowledge is the cognitive bias that occurs when you are communicating with others and unknowingly assume they have the background to understand you. Does this sound familiar? Don’t worry, it happens to the best of us! ‘Designers are not users’, and we should abide by this saying to avoid this bias. We often assume users are aware of all the details when, in fact, they may have no idea what’s going on! This bias can also affect the way you conduct knowledge-sharing sessions with colleagues if you assume everyone is on the same page.
The curse is, at bottom, our inability to unlearn something and imagine how someone without that knowledge might feel. In other words, it reduces our ability to empathise fully with our users. Naturally, there are two obvious ways to enhance empathy towards the users.
Confirmation bias is the tendency to interpret or recall information in a way that confirms or strengthens your own hypotheses. Simply put, when analysing research findings, you unknowingly try to confirm your own thoughts, or develop a sort of tunnel vision that leads you to interpret the data in a certain way. Confirmation bias occurs when people would like a certain idea or concept to be true, so they end up believing it actually is true! If you’ve ever heard the term ‘wishful thinking’, or experienced it yourself, you know what confirmation bias is and how hard it can be to overcome! Usually research is initiated when there are points of contention between departments or too many assumptions from the design team; in that case we rely on user feedback to make decisions.
This justification often happens at work: we try to justify a design decision and start looking for trends and research to support it. While there might be a ton of research in favour of your position, all this proves is that your bias led you to gather ample evidence that your thoughts were correct.
It is best to keep an open mind and a neutral stance towards your own ideas. Peer reviews fail when we are so attached to our own ideas that we keep finding reasons to stick to them. If possible, have another designer validate your design from a neutral perspective; other designers can often analyse your findings better and ask the right questions needed to prove your hypothesis correct – or not.
Staying open-minded is the key to combating this bias. The focus should be on finding problems from the end-user’s perspective rather than confirming your own beliefs. Remember, if we do not overcome our confirmation bias, or our need to be ‘correct’, we might trigger more biases like those above and arrive at some not-so-user-centric solutions. Another helpful tip is to gather a larger number of data points than just a small group provides. The more data you have to interpret, the easier it will be to separate outliers from trends.
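To see why more data points help, here is a minimal sketch in Python. The ratings are invented purely for illustration: with only a handful of responses, a single outlier drags the average and can seem to confirm (or refute) whatever you already believed, while a larger sample lets the outlier stand apart from the trend.

```python
# Hypothetical usability ratings on a 1-5 scale (numbers invented for illustration).
small_sample = [4, 4, 1]                          # one frustrated participant skews the picture
large_sample = [4, 4, 1, 5, 4, 4, 3, 4, 5, 4]     # the same outlier, in a bigger sample

def mean(ratings):
    """Average rating across participants."""
    return sum(ratings) / len(ratings)

print(mean(small_sample))  # 3.0 -> reads as a mediocre experience
print(mean(large_sample))  # 3.8 -> the 1 is clearly an outlier, not the trend
```

With three data points, the lone 1 accounts for a third of the evidence; with ten, it is visibly an exception, so your interpretation leans less on what you hoped to find.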
Implicit stereotyping is a type of bias that leads one to assume an individual’s attributes based on their demographics alone. Unfortunately, these thoughts usually arise without conscious intention. This can be seen in numerous UX examples, such as the assumption that the primary users of a mobile gaming app will be a certain demographic, only for the opposite to turn out to be true! For games such as Candy Crush, the primary user group is women over 35 years of age – yet many think it is a game built for children.
Living in a multicultural place like Dubai, I’ve observed many stereotypes at play in the design field. When we pigeonhole users into a group, we tend to address their issues from a perspective we can only assume they have, which carries a higher risk of producing an irrelevant solution. Very recently in our lab, we conducted a study to identify whether tourist family groups would benefit from an app with information about animals. One obvious stereotype we held was that ‘young children like animals and would love to read about them’. Upon investigation, however, we found that the people most likely to use such an app were 60+ years of age.
It is important not to let research be influenced by this type of stereotyping bias.
Ensure you are validating the attributes of your primary user group through research. Let the research dictate your end results, not your existing bias. Not all Western expats are tech-savvy, and not all women love to cook.
Recruiting the right participants for your study is critical; you don’t want to gather inaccurate data because the participants did not reflect the target audience. Next time someone mentions demographics as the basis for finding participants, question the traits you are actually looking for in users: tech-savviness does not equate to gender, having a Snapchat account does not make someone a millennial, and knowing how to operate a MacBook does not make you a designer. Better to recruit right from the start than to have to conduct the study again!