The most critical step of a market research project is the upfront work you do in designing a survey and preparing it for data collection. The old adage “garbage in, garbage out” applies here: the quality of the survey instrument determines the quality of the insights the survey generates.
After you’ve put careful consideration into each question you plan to ask, the next step is to program the script into your survey software and prepare it for data collection. As part of this preparation, it is essential to thoroughly proof the final survey before it goes live. This will help identify potential problem areas, ensuring that the money and time you’ve invested don’t go to waste.
Below are a few key items to look for when proofing your survey.
Item #1: Grammar/Typos
While this is an obvious proofing item, we see basic spelling and grammar errors far too often in live surveys. Whether you use software or a cloud-based service to develop your survey, you don’t want to assume that spelling and grammar errors will be automatically flagged. At best, these errors will make your survey appear unprofessional. At worst, the errors will result in a misinterpretation of the question by respondents, and the data will need to be disregarded during analysis.
Item #2: Question & Response Types
Single/Multiple Response Questions – A common issue we see with DIY surveys is mixing up single selection and multiple selection questions. If a question says “select all that apply”, you’ll want to ensure that multiple responses can actually be selected. Additionally, when proofing your survey, you’ll have one last chance to consider if you’ve chosen the most appropriate question type (single or multiple response) in the first place.
Open-Ended Questions – It is also important to review the open-ended questions in your survey. You’ll want to confirm that the comment boxes allow respondents to type as many words or characters as they would like. Additionally, if you’re using data validation rules (e.g. forcing numeric responses, validating e-mail address formatting, etc.), you’ll want to make sure these rules are applied appropriately and aren’t overly strict.
For example, if you ask for a phone number, respondents will be frustrated if a validation error prevents them from proceeding simply because they left out hyphens or used spaces in their formatting.
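Most survey platforms handle this kind of validation through their settings, but the underlying idea is easy to illustrate. Below is a minimal Python sketch of a lenient check that normalizes formatting instead of rejecting it; the function name and the assumption of a 10-digit US number are ours, purely for illustration.

```python
import re

def normalize_phone(raw: str) -> str | None:
    """Return a 10-digit US phone number, or None if the answer can't be salvaged.

    Hyphens, spaces, dots, and parentheses are all accepted rather than
    triggering a validation error.
    """
    digits = re.sub(r"\D", "", raw)            # keep only the digits
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]                    # drop a leading country code
    return digits if len(digits) == 10 else None

# Each of these should pass, even though the formatting differs.
for answer in ["315-303-2040", "(315) 303 2040", "315.303.2040", "3153032040"]:
    assert normalize_phone(answer) == "3153032040"
```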
Item #3: Randomization
Order bias occurs when respondents disproportionately select the first or last response option for a question. A common way to combat this bias is to randomize the response options within your questionnaire. You’ll want to confirm that this logic has been implemented properly. Randomization should be applied to all questions, unless the responses are intended to appear in a particular order (e.g. Likert scales, age brackets, income brackets). Also, certain response options like “None of the above” or “Other, please specify” should be excluded from the randomization and remain at the bottom of the answer list.
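If your survey platform exposes randomization as a setting, this is just a checkbox to verify. For anyone scripting a survey by hand, the logic looks roughly like the Python sketch below; the function name and answer options are hypothetical examples.

```python
import random

# Options that should stay anchored at the bottom of every answer list.
ANCHORED = ("None of the above", "Other, please specify")

def randomize_responses(options):
    """Shuffle the answer options while keeping anchored choices last."""
    shuffled = [o for o in options if o not in ANCHORED]
    anchors = [o for o in options if o in ANCHORED]
    random.shuffle(shuffled)                   # a fresh order for each respondent
    return shuffled + anchors

options = ["Price", "Quality", "Customer service", "Brand reputation",
           "None of the above"]
print(randomize_responses(options))
# e.g. ['Quality', 'Brand reputation', 'Price', 'Customer service', 'None of the above']
```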
Item #4: Respondent Pathways
In many surveys, logic or branching is applied, allowing you to ask different questions of different audiences based on their previous responses. In addition to ensuring that this survey logic works as intended during the pre-test, you’ll want to assign a different survey pathway to each pre-tester. This ensures that the full survey, and every question within it, is seen by the reviewers.
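One way to make sure pre-testers collectively cover every branch is to enumerate the pathways up front and hand one to each reviewer. The branching map, question IDs, and answers below are entirely hypothetical; this is just a sketch of the idea in Python.

```python
# Toy branching map: each question maps an answer to the next question
# ("*" means any answer), and None marks the end of the survey.
BRANCHES = {
    "Q1": {"Yes": "Q2", "No": "Q3"},
    "Q2": {"Under 35": "Q4", "35+": "Q4"},
    "Q3": {"*": "Q4"},
    "Q4": {"*": None},
}

def all_paths(question="Q1", taken=()):
    """List every respondent pathway through the branching map."""
    if question is None:
        return [list(taken)]
    paths = []
    for answer, nxt in BRANCHES[question].items():
        paths.extend(all_paths(nxt, taken + (f"{question}={answer}",)))
    return paths

# Assign one pathway to each pre-tester so every question gets reviewed.
for tester, path in enumerate(all_paths(), start=1):
    print(f"Pre-tester {tester}: {' -> '.join(path)}")
```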
Item #5: Survey Flow & Question Ordering
Lastly, it’s important to take a high-level look at your survey from the respondent’s perspective. Does the order of the questions make sense? Does the survey segue smoothly from one topic to the next? All of these things need to be considered to ensure that the respondent experience is a good one.
Additionally, you’ll want to ease respondents into the survey by starting off with a few easy-to-answer questions (such as a yes/no question, rather than a question with 15 different response options). Questions that may be more personal in nature (such as income, race, or gender) or difficult to answer should be asked toward the end of the survey to minimize respondent drop-off.
Final Note
The respondent experience is tremendously important in the market research world. Having a survey with errors or asking irrelevant questions will lead to significantly higher drop-off rates due to respondent frustration. If you don’t take your survey design seriously, the respondent won’t either.
Contact Drive Research
Drive Research is a market research company located in Syracuse, NY that uses data insights to help organizations achieve their key operational and marketing objectives. Interested in working with the team at Drive Research?
Send us an e-mail at [email protected] or call us at 315-303-2040.