6 common survey biases and what to do about them

Your data needs to be reliable if you want to make evidence-based decisions with confidence. Thinking proactively about the sources of bias in your survey or evaluation can dramatically increase the quality of the data you collect and enable accurate and meaningful insights to be drawn.

As much as possible, you want to be confident that you have captured a survey sample that is representative of your broader audience and that your survey questions have been answered accurately.

We asked our Data Scientist Tom McKenzie to weigh in on the ways in which bias can sneak into surveys and skew results – and what you can do to recognise and combat these biases to collect the best data possible.

What is survey bias?

Survey bias is an implicit or explicit influence on data collected via a survey. It may not be possible to completely eliminate all sources of bias in your data collection; however, there are several practical steps you can take to minimise its effects.

It’s important to note that bias can be present irrespective of sample size, so the solution is not always as simple as collecting more data. Sending out a biased survey is akin to taking a train on the wrong track – no matter how many responses you collect, you will still be heading away from the true result!
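To see why more responses can’t fix a biased collection method, consider a small simulation. The numbers below are entirely hypothetical: a population split 50/50 on some opinion, surveyed by a method that is twice as likely to reach one side.

```python
import random

# Hypothetical setup: 50% of the population holds opinion A, but our
# collection method is twice as likely to reach people who hold it.
def biased_sample(n):
    results = []
    while len(results) < n:
        holds_a = random.random() < 0.5          # true population split
        reach_prob = 0.8 if holds_a else 0.4     # biased collection method
        if random.random() < reach_prob:
            results.append(holds_a)
    return results

for n in (100, 1_000, 100_000):
    estimate = sum(biased_sample(n)) / n
    print(f"n={n:>7,}: estimated share holding opinion A = {estimate:.2f}")

# The estimate settles around 0.67, not the true 0.50 -
# more responses don't fix a biased collection method.
```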

Common causes of bias

Sampling bias
Sampling bias
A good sample is one that is representative of the population. Selection bias can result in audience groups being under-represented in the sample – known as undercoverage. Various factors can contribute to selection bias. How the survey is administered might affect who responds – for example, if the survey is sent out via email, then respondents must have access to a computer.
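One practical check for undercoverage is to compare the demographic make-up of your respondents against known population figures, such as census data. A minimal sketch in Python – the age bands, population shares, and respondent data are hypothetical:

```python
from collections import Counter

# Hypothetical population proportions (e.g. from census data)
population = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Hypothetical age band recorded for each survey respondent
respondents = ["18-34", "35-54", "18-34", "55+", "18-34", "35-54", "18-34"]

counts = Counter(respondents)
total = len(respondents)

print(f"{'Age band':<10}{'Sample':>8}{'Population':>12}")
for band, pop_share in population.items():
    sample_share = counts[band] / total
    # Arbitrary threshold for illustration: flag groups at less than
    # half their population share
    flag = "  <- under-represented" if sample_share < 0.5 * pop_share else ""
    print(f"{band:<10}{sample_share:>8.0%}{pop_share:>12.0%}{flag}")
```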

Even when a representative sample is selected, non-response bias can creep in. This occurs when participants choose not to complete the survey, break off early, or skip certain questions. When respondents differ in meaningful ways from non-respondents, the data will be affected by this bias.
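If you know something about everyone you invited – not just those who answered – you can compare respondents with non-respondents directly. A minimal sketch, assuming a hypothetical invite list that records each person’s membership status and whether they completed the survey:

```python
# Hypothetical invite list: (is_member, completed_survey)
invited = [
    (True, True), (True, True), (False, False),
    (False, False), (True, False), (False, True),
    (True, True), (False, False),
]

responded = [member for member, done in invited if done]
did_not = [member for member, done in invited if not done]

print(f"Members among respondents:     {sum(responded) / len(responded):.0%}")
print(f"Members among non-respondents: {sum(did_not) / len(did_not):.0%}")
# A large gap suggests non-response bias: the people who answered
# differ systematically from the people who didn't.
```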

Voluntary response bias typically occurs when respondents are self-selected (e.g. in voluntary surveys). This tends to over-represent respondents with strong opinions on the survey subject. A classic example is callers to talk radio stations – you’re never going to hear from fence-sitters or those who don’t feel strongly one way or the other. Random sampling can help to reduce voluntary response bias.
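Rather than surveying whoever volunteers, you can invite a random subset of your full contact list, giving every person the same chance of selection. A minimal sketch using Python’s standard library (the contact list is hypothetical):

```python
import random

# Hypothetical full contact list for the audience of interest
contact_list = [f"person_{i}@example.com" for i in range(1000)]

# Invite a simple random sample rather than relying on volunteers.
# random.sample draws without replacement, so no one is invited twice.
invitees = random.sample(contact_list, k=100)
print(invitees[:5])
```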

Keep in mind that targeting a specific audience to collect survey responses from is not necessarily a case of sampling bias in and of itself – it depends on how the collected data is analysed and presented. If insights are drawn within the context of the audience then they can still be valid and accurate. Just think, are we trying to learn about the audience as a whole? Or are we just interested in the responses of a certain subset?

Response bias
Even if you have taken steps to minimise sampling bias, other biases can occur in the form of response bias. This is a bias in how the questions are answered and can result from numerous contributing factors, such as leading questions, a respondent’s desire to appear positive, or even the order in which the questions are asked. For example, when presented with multiple-choice questions, respondents tend to select either the first option (known as primacy bias) or the last option presented to them (recency bias).
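A common mitigation for primacy and recency effects is to randomise the order of answer options for each respondent, so no single option systematically benefits from appearing first or last. Many survey tools offer this as a built-in setting; the sketch below shows the idea with a hypothetical question about transport choices:

```python
import random

options = ["Car", "Bus", "Train", "Bicycle", "Walking"]

def options_for_respondent(options):
    """Return a freshly shuffled copy of the options for one respondent."""
    shuffled = options[:]        # copy, so the canonical order is untouched
    random.shuffle(shuffled)     # in-place shuffle of the copy
    return shuffled

print(options_for_respondent(options))
```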

Respondents’ answers can also be swayed by social desirability bias: the desire to provide answers in line with what they believe to be culturally or socially acceptable. This is most common when respondents have been asked for their email address or name and don’t believe the survey is anonymous.

Response bias can also appear as respondents’ tendency to agree with statements regardless of their content, known as acquiescence bias. Generally, agreeing with or being positive about survey questions or statements takes less mental effort than disagreeing, and this tendency increases once respondents have already agreed with multiple statements in a row. For example, a respondent might be asked “Do you agree that there were ample food and bar stalls at today’s event?”, and later “Do you agree that there should be more food and beverage options available?” Agreement with both statements seems contradictory – this is likely the result of acquiescence bias.
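One way to spot possible acquiescence in collected data is to flag respondents who agree with almost every statement (“straight-lining”), especially across items that shouldn’t all point the same way. A minimal sketch, assuming hypothetical 5-point Likert responses where 4 and 5 indicate agreement:

```python
# Hypothetical responses: respondent id -> 1-5 Likert answers across items
responses = {
    "r1": [5, 5, 5, 5, 5, 5],
    "r2": [4, 2, 5, 3, 1, 4],
    "r3": [4, 4, 5, 4, 4, 5],
}

AGREE = {4, 5}  # answers counted as agreement

for rid, answers in responses.items():
    agree_rate = sum(a in AGREE for a in answers) / len(answers)
    # Arbitrary threshold for illustration
    if agree_rate >= 0.9:
        print(f"{rid}: agreed with {agree_rate:.0%} of items - possible acquiescence")
```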

Minimising the impact

Once potential sources of bias have been identified, we can take steps to try to minimise their impact. Here are some things we can try:

  • Obtain responses via more than one collection method (if possible). For example, use an online survey emailed out to participants in addition to intercept interviews.
  • When interacting with respondents, try to remain impartial and neutral. When conducting interviews in person, respondents who feel they are being watched will often overstate positive responses.
  • Ensure that all demographics of interest are well represented in the sample. Random sampling can help to avoid undercoverage of certain groups (see the stratified sampling sketch after this list).
  • Collect responses at different points in time. Responses captured in the moment might differ from those captured at a later date, so collecting both will give a more holistic view of respondent sentiment.
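As referenced in the third bullet, stratified random sampling is one way to ensure each demographic group is represented in proportion to the population. A minimal sketch using the standard library – the group names, sizes, and contacts are hypothetical:

```python
import random

# Hypothetical contacts grouped by a demographic of interest
strata = {
    "18-34": [f"young_{i}" for i in range(300)],
    "35-54": [f"mid_{i}" for i in range(350)],
    "55+":   [f"older_{i}" for i in range(350)],
}

sample_size = 100
total = sum(len(group) for group in strata.values())

sample = []
for name, group in strata.items():
    # Allocate invitations in proportion to each group's share of the population
    k = round(sample_size * len(group) / total)
    sample.extend(random.sample(group, k))

print(len(sample), "invitations drawn across", len(strata), "strata")
```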

The list above offers just a few tips for reducing bias during survey data collection. The exact means of minimising bias will depend on the nature of your survey population and what insights you hope to extract from the data. It’s an iterative process of understanding the kinds of bias you might be encountering – and ensuring that your attempts to minimise bias aren’t introducing new sources of bias themselves.

Testing and interrogating your surveys, and the data you collect from them, is an important step towards understanding your data and the audiences you serve, and towards ensuring best-practice evaluation.

About the author
Tom McKenzie is a Data Scientist at Culture Counts.