As the winter holidays approach, you might be rushing to finish a research project. But if you run it too close to Christmas, do you risk poor data quality ruining your results? Or is it safe to proceed with the study?
If you’re curious about how the holidays might affect survey responses and data quality, read on to get answers from SurveyMonkey researchers.
How did we do it?
Our research team ran two identical surveys on SurveyMonkey Audience to study how survey responses might differ during a major holiday. We launched one the week before Thanksgiving (Nov 19th, 2019) and one on Thanksgiving (Nov 28th). Each survey included over 1,000 respondents and took about one day to field.
Each survey contained questions on a variety of topics, including new product ideas, familiarity with various companies, product use, respondent characteristics, and data quality questions. We then compared respondents’ survey responses, their demographics, and data quality.
What did we find?
Overall, we found no substantial differences in survey answers or demographics. Out of the 32 questions we asked, only three had significantly different results during Thanksgiving. Keep in mind that, at the conventional 95% confidence level, we expect roughly one out of every 20 comparisons to turn out statistically significant due entirely to chance.
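To make that expectation concrete: at a fixed significance threshold, the expected number of chance-significant results is simply the threshold times the number of comparisons. The 0.05 threshold below is an assumption about the level used in this analysis.

```python
# Expected count of "significant" results arising purely by chance
# when many independent comparisons are run at a fixed threshold.
alpha = 0.05       # conventional 5% significance level (assumed, not stated)
comparisons = 32   # number of questions compared in this study

expected_by_chance = alpha * comparisons
print(expected_by_chance)  # → 1.6: roughly one to two chance hits
```

So finding three significant differences out of 32 comparisons is only slightly more than chance alone would produce.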
However, we did find evidence of slightly lower data quality during Thanksgiving compared to the week before the holiday. Read on for more detailed findings and what this means for you.
Respondent demographics (gender, age, education, and race/ethnicity) were very similar before the holiday and during Thanksgiving. We also found no differences on six additional self-reported characteristics (e.g., having green eyes, being right-handed).
The only significant difference was in the percentage of Black/African American respondents, and it was in line with the amount of variability we'd expect from chance alone.
| Education | Before Thanksgiving | Thanksgiving | %-pt Difference |
|---|---|---|---|
| High school or less | 22% | 22% | 0% |
| Some college or 2-year degree | 35% | 37% | 2% |
| College or more | 43% | 41% | -2% |
Similarly, there were no differences on a variety of questions, including familiarity with companies, volunteering, and interest in new product ideas. The one significant difference was in the percentage of respondents who had heard of Airbnb.
| Responses | Before Thanksgiving | Thanksgiving | %-pt Difference |
|---|---|---|---|
| % internet on mobile | 70% | 73% | 3% |
| Open to ideas | 69% | 67% | -2% |
| Would try new product | 47% | 50% | 3% |
| Interested in a specific new product idea | 34% | 33% | -1% |
| Volunteered in last year | 63% | 64% | 1% |
| Heard of Airbnb | 84% | 80% | -3% |
| Heard of Dropbox | 75% | 73% | -3% |
| Heard of Doordash | 84% | 83% | -1% |
| Heard of Workday | 14% | 17% | 2% |
We computed the following data quality (satisficing) measures for both surveys:
- Speeding: Respondents who completed the survey in less than half of the median completion time were coded as having sped through the survey, which means they might not have responded thoughtfully.
- Two trap questions: One trap question appeared in a question matrix and asked participants to select “somewhat disagree.” The other asked participants to select all images with flowers out of 6 images. Those who did not select “somewhat disagree,” or who did not select all of the flower images, were coded as having failed that measure.
- Straight lining: Respondents who selected the same answer for each of 8 items in a current-events matrix were coded as straight lining.
- Open-end response validity: Responses that were nonsense, contained profanity, or clearly did not answer the question were coded as invalid.
- Fake companies: We included three made-up companies in addition to the four real ones. Any respondent who reported 1) having heard of one of these fake companies or 2) having used one of these fake companies' products was coded as satisficing on that measure.
Using the data quality measures above, we created a composite measure of satisficing. Respondents who failed three or more of these seven data quality measures were coded as satisficers.
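The seven checks and the composite flag described above can be sketched in Python. Everything here is an illustrative assumption — field names, the example respondent, and the 180-second median are made up for the sketch, not SurveyMonkey's actual pipeline.

```python
def satisficing_flags(resp, median_seconds):
    """Return the seven pass/fail data-quality flags for one respondent.

    Field names are hypothetical; True means the respondent failed that check.
    """
    return {
        # Finished in under half the median completion time
        "speeding": resp["seconds"] < median_seconds / 2,
        # Matrix trap: was instructed to pick "somewhat disagree"
        "trap_matrix": resp["trap_matrix_answer"] != "somewhat disagree",
        # Image trap: was instructed to pick exactly the flower images
        "trap_images": set(resp["images_selected"]) != set(resp["flower_images"]),
        # Same answer on all 8 items of the current-events matrix
        "straight_lining": len(set(resp["matrix_answers"])) == 1,
        # Open-end judged nonsense/profane/off-topic (coded by a reviewer)
        "invalid_open_end": resp["open_end_invalid"],
        # Claimed familiarity with a made-up company
        "heard_fake_company": resp["heard_fake"],
        # Claimed to have used a made-up company's product
        "used_fake_company": resp["used_fake"],
    }

def is_satisficer(resp, median_seconds, threshold=3):
    """Composite measure: failing 3+ of the 7 checks."""
    return sum(satisficing_flags(resp, median_seconds).values()) >= threshold

# Example respondent who speeds, straight-lines, and claims to have
# heard of a fake company: 3 failed checks, so they get flagged.
example = {
    "seconds": 40,
    "trap_matrix_answer": "somewhat disagree",
    "images_selected": [1, 4, 5],
    "flower_images": [1, 4, 5],
    "matrix_answers": ["agree"] * 8,
    "open_end_invalid": False,
    "heard_fake": True,
    "used_fake": False,
}
print(is_satisficer(example, median_seconds=180))  # → True
```

Summing boolean flags like this makes the composite threshold easy to tune — lowering `threshold` to 2 would flag more borderline respondents.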
We found that data quality on Thanksgiving was somewhat lower on three of the seven measures (speeding, having heard of a fake company, and having used a fake company), and on the composite satisficing measure as well. It makes sense that respondents were a bit more distracted on Thanksgiving than the week before, possibly because they were taking the survey in a noisy environment surrounded by family, or while traveling. Note, however, that there were still no differences on the majority of data quality measures (both trap questions, straight-lining, and open-end validity).
| Measure | Before Thanksgiving | Thanksgiving | %-pt Difference |
|---|---|---|---|
| Satisficed on 3+ items out of 7 | 9% | 13% | 4% |
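To see why a 4-point gap like this registers as meaningful, here's a quick two-proportion z-test sketch. The per-wave sample size of 1,000 is an assumption based on "over 1,000 respondents," and the article doesn't state which test its analysts actually used.

```python
from math import sqrt, erf

def two_prop_z(p1, n1, p2, n2):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    # Pooled proportion under the null hypothesis of no difference
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 9% satisficers before Thanksgiving vs. 13% on Thanksgiving,
# assuming roughly 1,000 respondents per wave.
z, p = two_prop_z(0.09, 1000, 0.13, 1000)
print(round(z, 2), round(p, 4))  # → 2.86 0.0042: significant at the 0.05 level
```

With samples this large, even a few percentage points of extra satisficing clears the conventional significance bar.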
What does this mean for you?
So, should you run a survey at the end of the year or during another holiday?
For the most part, yes! We found almost no differences in responses on survey questions or on demographics the week before versus on Thanksgiving. While data quality was slightly lower on Thanksgiving than the week before, the difference in data quality did not substantially affect survey responses.
Here are three key takeaways from our research:
1. Consider question difficulty
Your decision to run a survey during the holiday season might depend on the difficulty of the questions you’d like to ask. Are you primarily interested in respondents’ opinions? If most of your questions are easy for respondents to answer, then it’s totally fine to squeeze in that research project and run it over the holidays.
If, on the other hand, the data you’re collecting might be particularly sensitive to data quality or require respondents to be very attentive, you might want to hold off running your survey until the holidays are over.
Consider the questions where we found the greatest disparity in answers before and during Thanksgiving: reporting having heard of and having used fake companies. Answering these questions correctly requires respondents to think carefully about each company name and override any vague feeling of familiarity in order to accurately report that they have not heard of the company. Compare those to asking respondents their age or gender, or how much they like a certain product idea. Those questions are intuitive and much easier to answer accurately even if respondents are a bit distracted.
2. Understand your data precision needs
Also, consider the data you’re looking to collect and its purpose. Does your market research project require precise estimates, for example, of the percentage of respondents who have used a particular service or bought a certain product? If so, it’s probably best to wait until after the holidays. Are you asking multiple questions about respondents’ degrees of familiarity with different brands or products? Also probably best to wait.
3. Always design for mobile
Regardless of when you survey, many of your respondents will be taking your survey on mobile. Make sure to keep your survey short, avoid matrix questions, and include no more than two open-ends. In addition, be sure to thoroughly test your survey on mobile to make sure that all images are appearing and videos are playing. Since videos can be hard to load, you might want to hold off on concept testing surveys with multiple videos to make your survey even more mobile-friendly.
Sending a survey during the holidays is a decision that really depends on your desired results. If you’re primarily interested in respondent sentiment, we say go for it, and run your survey over the holidays.
However, if your results are sensitive to data quality, perhaps hold off until after the holidays.