During my time at The Quirk's Event in Chicago this past April, I sat in on a fascinating presentation about something not often considered in market research: doing research on evolving market research best practices. Why not use your own expertise to optimize your market research processes?
Comcast partnered with Phoenix MI to understand the current state of online survey respondents and how the rapidly changing world is impacting their survey-taking behaviors. Their study aimed to answer questions about sampling best practices, the impact of scale setup, and the impact of device type.
Market research is not a perfect science. The best way to conduct an online survey for one researcher may look quite different from another's. On the qualitative side, two moderators could have completely opposite styles for leading focus groups. Nevertheless, both approaches may very well produce quality data.
With so many styles and choices, how do you know your market research is unbiased, accurate, and high quality? How can you be sure age-old market research best practices still produce the best results? Studies like that of Comcast and Phoenix MI help answer these questions.
In this post, I will reflect on some of the most interesting takeaways from the presentation.
There's a lot to be learned from researching the research practices we use every day. Respondents are evolving, and survey writers need to adapt.
There are three levels of data cleaning
Cleaning survey data will always be a necessary step in an online survey study. A good way to organize all the steps within data cleaning is to funnel them into three categories. In order of completion, they are:
- Machine cleaning
- Distracted cleaning
- Art cleaning
Machine cleaning is the first step to making sure researchers are keeping only the quality respondents in their data. Examples of machine cleaning include identification of duplicates, robots, or professional survey takers. These processes are largely automated because they flag responses based on paradata, or administrative data, from the survey.
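To make this step concrete, here is a minimal sketch of machine cleaning in Python with pandas. The file name, column names, and speed threshold are hypothetical stand-ins for illustration, not details from the Comcast-Phoenix MI study.

```python
import pandas as pd

# Load the raw survey export (file and column names are hypothetical).
df = pd.read_csv("survey_responses.csv")

# Duplicates: the same respondent ID appearing more than once.
dup_flag = df.duplicated(subset=["respondent_id"], keep="first")

# Speeders, caught via paradata: completion in under a third of the
# median survey duration. The threshold is illustrative, not a standard.
speeder_flag = df["duration_seconds"] < df["duration_seconds"].median() / 3

# Keep only respondents who pass every machine-cleaning check.
machine_clean = df[~(dup_flag | speeder_flag)]
```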
The next step is distracted cleaning, which focuses on weeding out respondents who are disengaged or not paying attention. A common tool used during this step is a red herring question that gives an obvious instruction such as "Select the color blue" or "Select a rating of 6." The idea is that if a respondent does not follow the instruction, they are not giving the survey their full attention.
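Continuing the hypothetical pandas example, scoring a red herring question might look something like this; the attention_check column and the expected value of 6 are assumptions for illustration.

```python
# Red herring check: the survey instructed "Select a rating of 6,"
# so any other answer suggests the respondent was not paying attention.
# The column name and expected value are illustrative assumptions.
failed_red_herring = machine_clean["attention_check"] != 6

# Remove distracted respondents before the final, manual cleaning pass.
distracted_clean = machine_clean[~failed_red_herring]
```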
Lastly, art cleaning is used to prepare the remaining data for analysis. This stage removes respondents who provided irrelevant answers or answers that conflict with one another. It is the most manual step of cleaning, as it usually requires someone to read through responses for irregularities.
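Art cleaning resists full automation, but a script can at least surface candidates for that human read-through. Here is a rough sketch under the same hypothetical assumptions as above, flagging one kind of conflicting answer and queuing suspiciously short open ends for manual review:

```python
# Conflicting answers: e.g. a respondent who says they never use the
# product yet rates their recent experience with it. Column names and
# rules here are illustrative assumptions, not the study's criteria.
conflict = (
    (distracted_clean["usage_frequency"] == "Never")
    & distracted_clean["experience_rating"].notna()
)

# Very short open ends ("asdf", "n/a") are often irrelevant, so queue
# them for a human read-through rather than dropping them automatically.
too_short = distracted_clean["open_end"].fillna("").str.len() < 5

review_queue = distracted_clean[conflict | too_short]
```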
With all three of these cleaning steps complete, you can feel confident that your analysis and reporting will be based on high quality data.
Check out a full checklist of data cleaning procedures here.
Consistency with similar questions is key
There is always going to be more than one way to ask a question in market research. It would be silly to say there is one definitive approach to every type of question. The reality is that question wording varies from one survey writer to another.
In some cases, the wording of a survey question can directly impact the results. Take, for example, a "Select all that apply" question. In their research, Comcast and Phoenix MI tested the difference in responses between listing answers with a Yes/No option and listing them with a blank check box.
Their research showed that more answers were selected when respondents had to choose either Yes or No, while fewer answers were selected when respondents had to add a check mark.
What about rating scale questions? Which is better: check boxes or a continuous slider scale?
The research performed by Comcast and Phoenix MI showed that, when asked the same question, respondents are more likely to give higher scores (good and very good) when the answers appear in a check box grid.
Digging deeper into slider scale questions: answers may run from left to right as very poor to very good, or very good to very poor. You can expect higher average scores when the slider scale runs from very poor to very good.
What does this mean for your online surveys?
Be consistent with scales and similar types of questions. For example, when asking "Select all that apply" questions, make sure the answers follow the same format (either all blank check boxes or all Yes/No options).
The only real threat to your data here is switching up your approach mid-survey. If you stick with one approach for each question type, your results will carry just as much weight as those of someone who consistently uses a different approach.
Mobile respondents behave differently
Mobile optimization for online surveys is now just as important as desktop functionality, if not more so. Our market research firm ensures that every survey we distribute displays and flows on a mobile screen as well as it does on a full-size screen. Unfortunately, this is not always as easy as copying and pasting the experience from desktop to mobile.
One of the insights from the Comcast-Phoenix MI study was that data cleaning outcomes differ between desktop and mobile survey respondents. They found mobile respondents were more likely to be flagged for removal than those on desktop.
It is difficult to pin down why this is the case, but part of the reason is likely that younger audiences use mobile devices more often. Younger survey takers, especially those in the 18-34 age group, are more likely to be tossed from the data.
Another quirk of mobile surveys is the need to be conscious of the number of questions being asked. On a smaller screen, asking fewer questions is a good idea to maintain a positive survey experience for the respondent.
Also worth noting is the added difficulty of typing out longer responses on a mobile keyboard. Two to three open-ended questions per mobile survey is optimal to keep respondents happy and engaged.
Here are 4 tips to make your online survey more mobile-friendly.
Drive Research is a market research company located in Syracuse, NY. Our team uses past experience and industry best practices to conduct the best possible market research project for our clients.
Interested in learning more about our market research services? Reach out through any of the four ways below.
Message us on our website
Email us at [email protected]
Call us at 888-725-DATA
Text us at 315-303-2040