Removing Bias From Questions
Writing survey questions that are well structured and free of bias may be the trickiest aspect of product research for beginners to get right. We all bring conscious and subconscious bias to any engagement, whether it’s a cocktail party conversation or a market research survey. For product managers undertaking their first survey, it can be difficult to recognize when bias creeps into their questions and threatens the quality of their research.
Fear not, novice product researchers! In this third blog in the Beginner’s Series, we’ll help you write questions like a market research professional. We’ll describe eight simple principles to follow when writing questions and response options. When you adhere to these guidelines, you can be much more confident that the research data you receive is an accurate reflection of consumer sentiment.
To start at the beginning of the blog series, read A Beginner’s Guide to Product Research.
What is bias?
First, let’s examine the definition of bias in the context of market research. A broad definition was published in the Journal of Market Research by Ray Tortolani: bias is “any force, tendency, or procedural error in the collection, analysis, or interpretation of data which provides distortion.” In other words, bias can occur throughout the market research process, not just in the way questions and responses are worded.
Sample and non-response biases are examples of biases that can occur in other parts of the market research process. Sample bias happens when the audience being surveyed is not truly representative of the larger market under study, and non-response bias occurs when the people who complete a survey differ systematically from the broader population being studied.
Both of these biases are important concerns for professional research teams that assemble their own audiences. But if you’re using a CX platform like DISQO’s, these risks are largely mitigated for you. Research providers take the steps needed to minimize sample and non-response biases by selecting representative samples of the consumer market under study.
For example, if you’re studying adult consumers who live in New York, the platform sources a sufficient quantity of randomized participants meeting these criteria to minimize sample and non-response bias.
Guidelines for minimizing response bias
Now, let’s dig into the specific rules for removing what professional researchers call response bias. To do this, we need to pay attention to the way we phrase questions and to the response selections that we offer. In addition, the order in which questions are presented can also introduce bias.
All researchers need to be concerned with response bias. DISQO Experience Suite users have an advantage because the platform offers survey templates designed to minimize bias. Templates built for specific research objectives include the questions frequently asked by product teams, ordered and phrased in a way that minimizes bias. All you need to do is customize them to your product and industry.
Before you dive in and start working with the templates, we strongly recommend reviewing the guidelines below. This will help you avoid accidentally modifying the questions in a way that encourages biased responses.
Avoid leading questions
Leading questions use biased language to suggest a specific response.
For example, consider these two questions:
- How disappointing is it when you can’t easily find the “contact us” page of the website?
- How easy or difficult is it to find the “contact us” page of the website?
Notice that question #1 suggests a specific response and likely reveals the survey’s hypothesis. Respondents tend to confirm the premise presented in a question, hiding their true sentiment. Question #2 is more likely to prompt a truthful response because it allows participants to rate their experience, positively or negatively, using a scale.
To avoid leading questions, keep them simple and eliminate any phrasing that reveals your intent. Offer response options that enable participants to disagree with your hypothesis.
Don’t use double-barreled questions
A double-barreled question merges two questions into one and asks for a single response, producing results that are impossible to interpret.
Here are two examples:
- How important and necessary is the ability to check your credit score? (1-5 scale)
- Is this video clear and interesting? (Yes/No)
For both questions, it’s impossible to tell whether the participant is responding to the former or latter portion of the question, or maybe a combination.
While participants may find the video both clear and interesting, they may also find it clear but uninteresting, or unclear but interesting nonetheless. Gluing together the qualities of “clear” and “interesting” into a single question makes it difficult to get useful feedback about either.
You can detect double-barreled questions by searching for the words “and” and “or” in your survey document. This isn’t foolproof, but it’s a good first step. When you find a double-barreled question, break it into separate questions.
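If your draft questions live in a script or spreadsheet rather than a document, a few lines of code can run the same check. The sketch below is a rough Python heuristic with a hypothetical list of questions: it simply flags anything containing “and” or “or” for manual review, so expect false positives on legitimate phrasings like “easy or difficult.”

```python
import re

# Hypothetical draft questions; replace with your own.
questions = [
    "How important and necessary is the ability to check your credit score?",
    "Is this video clear and interesting?",
    "How easy or difficult is it to find the 'contact us' page of the website?",
]

# Whole-word match on "and"/"or" -- a rough first-pass heuristic only.
pattern = re.compile(r"\b(and|or)\b", re.IGNORECASE)

for question in questions:
    if pattern.search(question):
        print(f"Possible double-barrel, review manually: {question}")
```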
Avoid absolutes
Absolutes are terms like “Always,” “Never,” “Every,” and “All.” These terms push respondents toward exaggerated opinions. Instead, ask questions that allow people to express nuanced opinions, lifestyles, and behaviors. Don’t ask questions with absolutes, and avoid offering only “Yes” or “No” answers.
For example, the question below offers absolutes not just at the ends of the scale, but also in the middle responses, which are close to absolutes and don’t allow respondents to indicate their actual habits.
How often do you drink coffee?
- Every day
- Often
- Rarely
- Never
Here is a revised version of the question with responses that capture a wider range of drinking habits:
How often do you drink coffee?
- Most days
- A few times per week
- Once a week
- A few times per month
- Once a month
- Less than once per month
Offer balanced answer options
Scales and answer options should always be balanced, offering the same degrees of intensity on both sides of a neutral response. This helps you avoid projecting opinions onto survey participants. An unbalanced response scale creates bias.
For example, the response scale offered for the question below is biased because three of the five response options are varying degrees of “Agree.” It offers more possibilities for participants to agree than disagree.
How much do you agree or disagree with the following statement? Renters insurance makes me feel secure.
- Agree strongly
- Agree somewhat
- Agree slightly
- Neutral
- Disagree
A balanced approach is shown below. If you use a research technology platform, the templates should include balanced scales like this.
How much do you agree or disagree with the following statement? Renters insurance makes me feel secure.
- Agree strongly
- Agree somewhat
- Neither agree nor disagree
- Disagree somewhat
- Disagree strongly
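If you assemble scales programmatically, a quick symmetry check can catch unbalanced ones before a survey goes out. This is a minimal sketch, assuming a scale is represented as an ordered list of labels with a single neutral midpoint; the function and data below are illustrative, not part of any platform’s API.

```python
def is_balanced(scale, neutral="Neither agree nor disagree"):
    """Return True if the scale has equal options on each side of its neutral midpoint."""
    if neutral not in scale:
        return False  # no neutral option offered at all
    midpoint = scale.index(neutral)
    return midpoint == len(scale) - midpoint - 1

unbalanced = ["Agree strongly", "Agree somewhat", "Agree slightly", "Neutral", "Disagree"]
balanced = ["Agree strongly", "Agree somewhat", "Neither agree nor disagree",
            "Disagree somewhat", "Disagree strongly"]

print(is_balanced(unbalanced, neutral="Neutral"))  # False: three "Agree" options, one "Disagree"
print(is_balanced(balanced))                       # True: two options on each side
```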
Offer neutral responses
Don’t force participants to express an opinion when they don’t have one. Instead, offer them the option to say they don’t care.
Imagine the previous question without the middle response option. Participants who feel indifferent would be forced to agree or disagree, eliminating important data from the survey. What if the team evaluating the data output plans to incorporate security into their messaging? Without data for a neutral response, that hypothesis could never be refuted.
Provide an opt-out response
Similar to the neutral response described above, we want to offer responses that capture all possible participant feedback, including when a question doesn’t apply.
In the example below, there are two problems. First, we offer redundant opt-outs (“Other” and “None of the above”); both tell us only that the participant’s most frequent soft drink isn’t one of the listed brands. Second, we don’t allow the participant to indicate that they don’t drink soft drinks at all.
What soft drink do you drink most often? Choose one.
- Coke
- Diet Coke
- Pepsi
- Dr. Pepper
- Diet Dr. Pepper
- Sprite
- Mountain Dew
- Other (Specify)
- None of the above
Instead, we should replace “None of the above” with “Do not drink soft drinks.” Together with “Other (Specify),” this covers the full range of participants’ soft drink habits.
In addition, be careful not to combine opt-out responses. For example, offering “None of the above/Do not drink soft drinks” would lose an important distinction.
Use clear, understandable terminology
Beware of jargon, acronyms, and fancy language that may be familiar to you but isn’t understood by consumers. Phrase your questions and responses in language consumers will immediately understand. This ensures the data you collect reflects genuine participant feedback rather than guesses.
Consider the question below, posed by an insurance company. The responses may be perfectly understandable to a company employee, but consumers don’t think about insurance every day, and most are likely to be unsure what the listed coverage options include.
Which of the following does your current homeowners' insurance policy cover? Select all that apply.
- Dwelling
- Structure
- Personal Liability
- Loss of Use
- Personal Property
When in doubt, spell it out and/or provide examples. However, you need to balance completeness against keeping surveys short enough to minimize respondent fatigue.
Limit lists or options
Keep your response options as tight as possible to minimize the cognitive load placed on participants. We recommend a maximum of eight response options per question. Fewer is better. Don’t overwhelm participants by giving them a list of 30 responses to sort through.
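If you manage many surveys, this limit is easy to enforce automatically. The sketch below assumes a hypothetical survey definition stored as a dictionary mapping question text to its response options; it simply flags any question whose option list exceeds the recommended ceiling.

```python
MAX_OPTIONS = 8  # recommended ceiling from the guideline above

# Hypothetical survey definition; replace with your own questions and options.
survey = {
    "Which of these brands have you heard of?": [f"Brand {i}" for i in range(1, 31)],
    "How often do you drink coffee?": [
        "Most days", "A few times per week", "Once a week",
        "A few times per month", "Once a month", "Less than once per month",
    ],
}

for question, options in survey.items():
    if len(options) > MAX_OPTIONS:
        print(f"'{question}' has {len(options)} options; consider trimming or regrouping.")
```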
The “Other” response with an open text field is a double-edged sword. On one hand, it lets you eliminate a batch of low-frequency responses, shortening your list, and it may surface responses you didn’t anticipate. On the other hand, it takes additional time and effort for participants to write in their responses. Don’t offer an open text option if you don’t plan to use the data.
Unbiased questions reveal true consumer sentiment
Market research beginners are typically unaware of these eight guidelines, but they amount to common sense once you know them. In fact, writing unbiased questions and responses is relatively easy once you’re aware of the ways bias can color survey results.
If you are vigilant about removing bias, you’ll be rewarded with research that reveals true consumer sentiment.