Whether you’re collecting customer feedback, performing employee evaluations, or planning an event, the first step toward creating an effective survey is to brush up on the basics of survey science. Check out our resources for online survey tips and best practices to make sure your next survey is a success!
To keep your survey on the right track, here are our top 10 tips and tricks:
What does a clear, attainable goal look like? Let’s use an example. Say you want to understand why customers are leaving your business at a high clip. Instead of a vague goal like “I want to better understand customer satisfaction,” your goal should be something like: “I want to understand the key factors that are leading our customers to leave—whether these reasons are caused by internal or external forces.”
Once you’ve come up with your goal, you can use it as a reference to prioritize the top questions you want to ask.
Treat your survey like a conversation.
Would you start any exchange by asking someone how old they are? Probably not. Instead, you’d engage in small talk first, and gradually move on to more personal topics. Similarly, keep your early set of questions light and straightforward, and then slowly move towards more personal questions (often taking the form of demographic questions).
In most cases, your respondents are doing you a favor by taking your survey. What better way to respect their time than by not taking up too much of it? You’ll be rewarded with a higher completion rate as well as more thoughtful responses for the questions you end up including.
What do we mean by closed-ended questions? We’re talking about questions that use pre-populated answer choices for the respondent to choose from—like multiple choice or checkbox questions. These questions are easier for respondents to answer and provide you with quantitative data to use in your analysis.
Open-ended questions (also known as free response questions) ask the respondent for feedback in their own words. Since open-ended questions can take much longer to answer, try to only include 1-2 of them at the end of your survey.
If you’re keen on getting a lot of responses, an incentive in some form can prove helpful. Potential incentives range from entering respondents into a sweepstakes drawing to giving respondents a gift card if they answer all of your questions. To learn more about the different types of incentives you can use, and how to make the best use of them, check out this article.
In other words, try not to put your own opinion into the question prompt. Doing so can influence the responses in a way that doesn’t reflect respondents’ true experiences.
For example, instead of asking, “How helpful or unhelpful were our friendly customer service representatives?” ask: “How helpful or unhelpful were our customer service representatives?”
Using answer choices that lean a certain way can result in respondents providing inauthentic feedback.
Let’s revisit our prompt: “How helpful or unhelpful were our customer service representatives?”
Here’s how a set of unbalanced answer choices (that lean towards being too positive) can look for that question:
a. Very helpful
b. Helpful
c. Neither helpful nor unhelpful
And here’s how they’d look once balanced:
a. Very helpful
b. Helpful
c. Neither helpful nor unhelpful
d. Unhelpful
e. Very unhelpful
Absolutes use words like “every,” “always,” or “all” in the question prompt. Essentially, they make the respondent either agree or disagree with a strongly worded question without allowing for more nuanced opinions.
For instance, take the question:
“Do you always eat breakfast?”
Your respondents might eat breakfast most of the time, half of the time, or on occasion, but you wouldn’t know the difference once the responses come back.
Double-barreled questions ask for feedback on two separate things within a single question.
Here’s an example:
“How would you rate the quality of our product and support?”
How would the respondent answer this question? Would they address the quality of the product? The quality of support? Maybe they’d skip the question or (worse) leave your survey altogether.
You can fix a double-barreled question by either choosing one thing to ask or by breaking the question up into 2 separate ones.
Imagine sending your survey only to realize that you forgot to add a question. Or that you didn’t include a few essential answer choices for one of the questions you asked. In either case, you’ll probably end up being frustrated and get results that fall short of what you need.
To prevent any mishaps in your survey design, preview your survey. Even better, share it with others so they can catch any mistakes you might not find on your own.
Looking for more best practices in writing your survey? We’ve got plenty of resources that can help turn you into a survey pro!
Once you’ve written your clear, well-formatted survey, it’s time to get people to take it. But where do you begin? You know who you want to take your survey, but how do you get it to them?
To make sure your data is statistically significant, you first need to figure out how many people should take your survey–and what you can do to get a representative sample of the population. In order to reach the right people, you’ll also need to choose the appropriate survey mode (phone poll, paper questionnaire, in-person interview, or online survey) for your target population.
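As a rough sketch of the math behind that first step: the number of respondents you need for a given margin of error can be estimated with the standard sample-size formula, adjusted for the size of your population. The example below assumes a 95% confidence level and maximum variability (p = 0.5); the function name and defaults are illustrative, not from any particular survey tool.

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Estimate how many completed responses you need.

    Uses the standard formula n = z^2 * p * (1 - p) / e^2, then applies
    a finite-population correction (smaller populations need fewer responses).
    """
    n = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    adjusted = n / (1 + (n - 1) / population)
    return math.ceil(adjusted)

# A population of 1,000 people, with a ±5% margin of error at 95% confidence:
print(sample_size(1000))  # 278
```

Note how the correction matters: a population of 1,000 needs about 278 responses, while a population of millions still only needs around 385 for the same margin of error.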
But wait! Before you send your survey, take it for a test drive. Make sure your questions are clear and that skip logic, question randomization, and the overall design are in working order. Send your survey to a friend—or do a practice run with real respondents in your target population—for a smarter approach to collecting survey data.
To test your survey, share it with others, and ultimately send it, you’ll first need to create it in SurveyMonkey. See how below!
Here are more guidelines for collecting the data you need:
Success! You’ve got survey results. Now what? Because you need quality data to make accurate assessments and predictions, make sure the data you have is reliable, then slice and dice it to develop insights.
Were all your respondents completing your survey? Did they skip enough questions to taint your results? Did they really try to answer, or did they satisfice by picking easy but inaccurate answers? Look for irregularities to make sure your results are accurate.
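One way to screen for these irregularities is to flag responses with too many skipped questions or with straight-lined answers (the same rating picked for every question). Here’s a minimal sketch with hypothetical question names and thresholds:

```python
# Each response maps question -> answer; None means the question was skipped.
responses = [
    {"q1": 4, "q2": 5, "q3": 4, "q4": None},        # looks genuine
    {"q1": 3, "q2": 3, "q3": 3, "q4": 3},           # straight-liner
    {"q1": None, "q2": None, "q3": 2, "q4": None},  # mostly skipped
]

def is_suspect(response, max_skip_rate=0.5):
    answers = list(response.values())
    skipped = sum(a is None for a in answers)
    if skipped / len(answers) > max_skip_rate:
        return True  # skipped too many questions to trust the result
    given = [a for a in answers if a is not None]
    # Straight-lining: an identical answer to every rating question.
    return len(given) > 1 and len(set(given)) == 1

clean = [r for r in responses if not is_suspect(r)]
print(len(clean))  # only the first response survives the screen
```

The thresholds here are judgment calls: what counts as “too many skips” depends on your survey’s length and how critical each question is.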
Then try to answer the questions you had when you started the survey. Do text analysis to draw conclusions from open-ended questions where people gave written answers. Filter and cross-tabulate your results to understand how different segments (like women and men) answered your survey.
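Filtering and cross-tabulation can be sketched in a few lines. The data below is hypothetical, one (segment, answer) pair per respondent, just to show the mechanics:

```python
from collections import Counter

# Hypothetical results: each respondent's segment and their answer
# to a yes/no satisfaction question.
responses = [
    ("woman", "yes"), ("man", "no"), ("woman", "yes"),
    ("man", "yes"), ("woman", "no"),
]

# Cross-tabulate: count each answer within each segment.
crosstab = Counter(responses)
for (segment, answer), count in sorted(crosstab.items()):
    print(segment, answer, count)

# The same idea as a filter: look at one segment in isolation.
women = [answer for segment, answer in responses if segment == "woman"]
share_yes = women.count("yes") / len(women)
```

With real survey exports you’d typically do this in a spreadsheet or a dataframe library, but the underlying operation is the same: group responses by a segment, then compare answer distributions across groups.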
And once you’ve found the data you’re looking for, find an effective way to present it. Whether you’re writing a big report that will inform your company’s marketing strategy, or fishing for Facebook likes with fun survey results, you want your report to be accurate and well-informed. Avoid analysis pitfalls like generalizing or misrepresenting the data–and consider alternate explanations for why respondents answered the way they did.
Finally, you’ll want to keep track of your process–from start to finish–so people can replicate your survey in the future. Repeat your survey to perform longitudinal analysis (or benchmarking) and see changes in people’s responses over time.
Sounds like a lot, right? Try diving into these articles to get all the answers about, well, how to get answers:
And for all you visual learners out there, this video can help:
Our powerful online survey tools make it easy to create surveys, collect responses, and turn your data into insights. To get started today, take the tour or sign up!