Survey writing. It is something everyone might think they can do, but it takes a professional to do it correctly.
There are several common mistakes amateur survey writers unknowingly make. Without a background in survey design and programming, they are easy to overlook.
You wouldn't hire an amateur to work on your vehicle. You wouldn’t ask a first-year med student to perform open-heart surgery. And, you wouldn’t try to fly a plane without any training.
Yes, I know these are extreme cases – but you will not regret partnering with a market research company to assist with your survey.
Choose one that has the experience and the know-how to write an actionable survey script. One that provides you with data for recommendations, action items, and next steps.
Here are 6 common survey writing mistakes that are easy to make without experience in the field.
Mistake 1: Double-barreled questions
A double-barreled question is exactly what it sounds like. This type of question attempts to cover 2 separate objectives in a single question.
By using a double-barreled question, it is impossible to isolate each factor independently. This may be one of the most common survey writing mistakes, if not the most common.
Here is an example of a double-barreled question.
"Using a scale of 1 to 5 where "5" indicates very satisfied and "1" indicates not at all satisfied, how satisfied are you with the customer service and the price of the plan you purchased?"
In this example, the respondent is forced to choose one rating even though ratings for customer service and price may differ significantly.
To correct this, the survey writer needs to isolate factors and ask each separately.
- Question 1: Using a scale of 1 to 5 where "5" indicates very satisfied and "1" indicates not at all satisfied, how satisfied are you with the customer service?
- Question 2: Using a scale of 1 to 5 where "5" indicates very satisfied and "1" indicates not at all satisfied, how satisfied are you with the price of the plan you purchased?
Mistake 2: Leading the respondent
Careful word choice in a survey is essential to receiving unbiased and accurate results. Any attempt to influence an answer, directly or subtly, can have a major impact on reliability.
The goal of market research is to remain objective and to collect unbiased results, so as a survey writer, you need to make absolutely certain this holds true.
If there are issues with biased wording or phrasing in a survey, there will be errors in the data collected and the results cannot be deemed reliable.
Here is an obvious example of leading the respondent.
"Don't you agree this concept would be appealing to customers?"
- Yes
- No
The question is clearly trying to lead the respondent to agree that the concept is appealing, and the "Don't you agree" framing is unnecessary.
Here is a more subtle example of leading the respondent.
"Our customer service is available 24/7. How would you rate our accessibility?"
Although the statement may be factual, reminding the customer of this availability is unneeded.
What matters is their perception, uninfluenced by a preceding statement.
Also, be cautious of order bias.
Order bias is a type of bias that occurs within the answer choices of a question. Respondents are more likely to select answers shown at the top of a list.
To alleviate this, answer choices should be flipped (reversed) or randomized so order bias does not affect the data; a quick sketch of this logic follows the example lists below.
For example, respondents are asked, “Which ice cream flavor is your favorite?”
Respondent A selects from this list:
- Mint chocolate chip
- Vanilla
- Strawberry
- Rocky Road
- Other
While Respondent B selects from this list:
- Strawberry
- Rocky Road
- Vanilla
- Mint chocolate chip
- Other
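Most survey platforms handle this rotation automatically, but the underlying logic is simple. Here is a minimal Python sketch (the function name, respondent IDs, and the shuffle-versus-flip choice are illustrative assumptions, not any particular platform's feature) of how answer order could be randomized or reversed per respondent while keeping the catch-all "Other" choice anchored last:

```python
import random

# Illustrative flavor list from the example above; "Other" is kept out of the
# rotation so it always appears at the bottom of the list.
FLAVORS = ["Mint chocolate chip", "Vanilla", "Strawberry", "Rocky Road"]

def answer_order(respondent_id: int, shuffle: bool = True) -> list[str]:
    """Return the answer choices in a randomized (or flipped) order."""
    choices = FLAVORS.copy()
    if shuffle:
        random.shuffle(choices)       # a fresh random order for each respondent
    elif respondent_id % 2 == 1:
        choices.reverse()             # or simply flip the list for every other respondent
    return choices + ["Other"]        # the catch-all stays anchored last

print(answer_order(respondent_id=1))
print(answer_order(respondent_id=2))
```

Either approach keeps any single flavor from always sitting at the top of the list, which is what drives order bias.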
Learn more about the different types of bias in market research.
Mistake 3: Closed-ended answers are not mutually exclusive
First, let's define closed-ended answers for those who are new to market research survey writing terms.
Closed-ended answers occur in any type of question where respondents are given answer choices to select from.
Now onto another new market research phrase: the MECE rule.
The first half of the classic MECE rule, the "ME," stands for "mutually exclusive."
This means all of the answer categories offered in the survey are mutually exclusive of one another. No overlap.
Along with mistake #1, this is one of the most common mistakes made in survey writing.
Here is an example of not being mutually exclusive.
"Which of the following best describes your age?”
- Under 18
- 18 to 25
- 25 to 35
- 35 to 45
If the respondent is 25 years of age, or 35 years of age, which answer category do they select? The overlapping ranges show the answer options are not mutually exclusive.
Here's how the answer choices should read:
- Under 18
- 18 to 24
- 25 to 34
- 35 to 44
- 45 or older
Now, all respondents will be able to select the appropriate answer choice no matter what their age is. *Sigh of relief*
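If you want a quick sanity check before fielding, the overlap test is easy to automate. Here is a minimal Python sketch (the bracket boundaries and the 200-year cap are illustrative assumptions) that flags overlapping numeric answer ranges:

```python
# Each bracket is an inclusive (low, high) age range in years.
AGE_BRACKETS = [(0, 17), (18, 24), (25, 34), (35, 44), (45, 200)]

def is_mutually_exclusive(brackets: list[tuple[int, int]]) -> bool:
    """Return True if no value (e.g. an age of 25) falls into two brackets."""
    ordered = sorted(brackets)
    return all(prev_high < next_low
               for (_, prev_high), (next_low, _) in zip(ordered, ordered[1:]))

print(is_mutually_exclusive(AGE_BRACKETS))                    # True
print(is_mutually_exclusive([(0, 17), (18, 25), (25, 35)]))   # False: 25 fits two ranges
```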
Mistake 4: Not being collectively exhaustive
The second half of MECE stands for "collectively exhaustive." Collectively exhaustive means all possible answer choices in a series are covered.
Omitting responses in a survey can wreak havoc on your results because you may not have covered all potential answers.
Here is an example of not being collectively exhaustive.
“Which of the following best describes your total annual household income?”
- $25,000 to $49,999
- $50,000 to $74,999
- $75,000 to $99,999
- $100,000 to $149,999
If respondents make less than $25,000 or more than $149,999, what answer do they select?
Should respondents skip the question or pick the answer choice that's the closest to their actual total annual household income?
Either way, there will be gaps in the data collected.
Here's how the answer choices should read.
- Less than $25,000
- $25,000 to $49,999
- $50,000 to $74,999
- $75,000 to $99,999
- $100,000 to $149,999
- $150,000 or more
The updated answer choices will ensure there are no gaps in the data collected and respondents will not be frustrated when completing the survey.
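The companion check to the overlap test above is a coverage test. Here is a minimal Python sketch (the dollar boundaries and the arbitrary ceiling are illustrative assumptions) that confirms a set of income brackets leaves no gaps between a floor and a ceiling:

```python
# Each bracket is an inclusive (low, high) dollar range.
INCOME_BRACKETS = [(0, 24_999), (25_000, 49_999), (50_000, 74_999),
                   (75_000, 99_999), (100_000, 149_999), (150_000, 10_000_000)]

def is_collectively_exhaustive(brackets, floor=0, ceiling=10_000_000):
    """Return True if the brackets start at the floor, end at the ceiling,
    and each bracket picks up exactly where the previous one left off."""
    ordered = sorted(brackets)
    if ordered[0][0] != floor or ordered[-1][1] != ceiling:
        return False
    return all(prev_high + 1 == next_low
               for (_, prev_high), (next_low, _) in zip(ordered, ordered[1:]))

print(is_collectively_exhaustive(INCOME_BRACKETS))         # True
print(is_collectively_exhaustive(INCOME_BRACKETS[:-1]))    # False: the top end is missing
```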
Another recommendation here is to always include "other(s)" as an option.
"Which of the following social media sites do you use?”
What if the respondent only uses Instagram or only uses Snapchat? No answer selections exist for these.
This is an example of why you would want to include "other(s)" as an option.
This at least ensures the respondent has an option to type in omitted responses.
Reviewing these early in fieldwork will help the analyst understand if anything was missed.
Mistake 5: Using a multiple response for a single response question
When writing a survey, make sure you understand whether you want to force a single response or a multiple response.
Each will produce different results. We have seen survey writers ask respondents to choose their single most important factor but mistakenly allow for multiple responses.
As a result, some respondents select one answer while others select several to indicate what they believe to be the most important factor.
Here is an example of allowing multiple responses for a single response question.
"Which of the following advertisements is the most appealing one to you?"
- Advertisement Blue
- Advertisement Red
- Advertisement Yellow
If the survey writer allows for multiple responses to this question, it will create a mess of data.
Half of the respondents may select “Advertisement Blue” while the other half select “Advertisement Red” and “Advertisement Yellow.”
This makes it very hard to understand which advertisement is truly the most appealing.
If 1,000 respondents completed your survey, it would be nice to have exactly 1,000 selections for the most appealing advertisement. It's cleaner.
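Catching this after the fact is straightforward if the data is already collected. Here is a minimal Python sketch (the respondent IDs and data layout are made-up assumptions) that flags respondents who selected more than one answer on a question that should have forced a single response:

```python
# Collected selections for a single-response question, keyed by respondent ID.
responses = {
    "R001": ["Advertisement Blue"],
    "R002": ["Advertisement Red", "Advertisement Yellow"],   # invalid: two picks
    "R003": ["Advertisement Yellow"],
}

def invalid_single_response(data: dict[str, list[str]]) -> list[str]:
    """Return respondent IDs with anything other than exactly one selection."""
    return [rid for rid, picks in data.items() if len(picks) != 1]

print(invalid_single_response(responses))   # ['R002']
```

The real fix, of course, is to program the question as single-response in the first place so the platform enforces one pick per respondent.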
Mistake 6: Not using skip patterns
This is often a result of not having advanced survey software.
Survey programs allow survey writers to build in skip logic so respondents are not asked questions that do not apply to them.
Survey respondents are strapped for time. Any additional work on their end increases the likelihood they will drop out of the survey.
Here is an example of not using skip patterns.
"Did you visit store ABC in the past 3 months?"
Let's say the respondent answers "No."
The next question asks, "What product(s) did you purchase from Store ABC in the past 3 months?"
This question obviously should not be asked, based on the prior response.
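Under the hood, a skip pattern is just conditional routing. Here is a minimal Python sketch (the question IDs and answer keys are made-up assumptions) of the kind of logic a survey program applies after the store-visit question:

```python
def next_question(answers: dict[str, str]) -> str:
    """Route the respondent to the next applicable question."""
    if answers.get("visited_store_abc") == "No":
        return "Q_DEMOGRAPHICS"           # skip the purchase questions entirely
    return "Q_PRODUCTS_PURCHASED"         # only shown to respondents who said "Yes"

print(next_question({"visited_store_abc": "No"}))    # Q_DEMOGRAPHICS
print(next_question({"visited_store_abc": "Yes"}))   # Q_PRODUCTS_PURCHASED
```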
Bonus Tip: Don’t forget about the respondent
Surveys should take respondents less than 5 minutes to complete, which is typically 15 to 20 questions. Remember that people are likely to lose interest in completing a survey that runs much past this.
Also, when possible, try to make surveys more engaging with creative questions and different forms of answer choices such as sliding scales, pictures/graphics, and more.
Drive Research is a full-service market research company headquartered in Upstate New York. Our experts have decades of survey writing and programming experience.
Interested in conducting an online survey with our team? Contact us through any of the four ways below.
- Message us on our website
- Email us at [email protected]
- Call us at 888-725-DATA
- Text us at 315-303-2040
George Kuhn
George is the Owner & President of Drive Research. He has consulted for hundreds of regional, national, and global organizations over the past 15 years. He is a CX certified VoC professional with a focus on innovation and new product management.
Learn more about George here.