
Here’s How Many People Should Take Your (Longer) Survey


It’s not surprising that the length of a survey can have a negative impact on your completion rate. It’s pretty intuitive: the longer a survey takes to complete, the more likely people are to drop out before finishing.

So it’s a delicate balance of achieving your research goal, while also ensuring that people take your survey from beginning to end.

In general, keeping your survey concise can improve survey completion rate. However, there are situations where more questions are critical to a survey’s success.

For example, when companies are taking part in a big internal initiative like measuring culture, their employee engagement surveys tend to be on the longer side because there’s a lot to cover. So if you know you have a longer survey to send out, just how many people should you ask to take it in order to get the number of completed responses you need?

Well, we are of course big fans of analyzing data at SurveyMonkey. Here’s some completion rate data to help you estimate how many survey takers you need to reach when sending longer surveys.

We analyzed the completion rates of 50,000 random surveys conducted by SurveyMonkey users and aggregated the data by the number of questions in each survey. You can use the table below to estimate how many respondents you need to reach out to based on the length of your survey:

[Table: average completion rate by number of survey questions]

In order to figure out how many people you need to send your longer survey to, first look at the completion rate based on the number of questions in your survey. Then you can calculate the total number of people you need to send your survey to like this:

Number of Respondents You Need = Desired Completes / Expected Completion Rate

For example, if you have a long survey with 46-50 questions and you want to collect at least 100 completed responses, you will need, on average, about 118 survey takers (100 / 0.852 ≈ 117.4, rounded up) to start your survey.
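If you’d rather not do the division by hand, here’s a minimal sketch of the same calculation in Python (the function name is just an illustration, and 0.852 is the example completion rate from above; plug in the rate that matches your own survey length):

import math

def respondents_needed(desired_completes, completion_rate):
    """How many people need to start the survey to hit the target number
    of completed responses, given an expected completion rate."""
    if not 0 < completion_rate <= 1:
        raise ValueError("completion_rate must be a fraction between 0 and 1")
    # Round up: a fractional person still means one more real respondent.
    return math.ceil(desired_completes / completion_rate)

# Example from above: 46-50 questions, ~85.2% completion rate, 100 completes wanted.
print(respondents_needed(100, 0.852))  # -> 118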

There you have it, data fans. We hope this comes in handy the next time you’re designing a survey. And don’t be shy: let us know if you have any questions about this chart in the Comments section below.

As always, subscribe to our blog for even more tips and best practices like these. And if you’re looking for more, head over to Surveys 101!



  • lorendn

    The concept behind this post is very helpful but I strongly question the validity of your data. It would be a very skewed population that would give a 98% return on anything. You could ask for an RSVP to a free weekend in the Islands and would be unlikely to have a response over 75%. In my wildest dreams I cannot imagine completing any survey longer than 20-25 questions much less an 85% response on a 50 question survey. I would like to know what the driving motivation was for the surveys you reported from.

    • TGV

      I believe this is referring to the start-to-complete rate, rather than the overall response rate. It looks like this is the percentage of people you can expect to finish the survey once they have started it.

      • KTsurveymonkey

        Ah, see, TGV was on it! Thanks for helping us point that out 🙂

    • KTsurveymonkey

      Hi Lorendn-

      We can see how this can be a little confusing! We reached out to Ashley about this, and she had this to say:

      There is a difference between a completion rate and a response rate. Completion rates are calculated only on survey takers who have already opened and started the survey. For instance, if a user sent a survey to 200 people by email, 100 of them opened the survey link, and 80 of those people actually completed the survey, then the response rate would be only 80/200 = 40%, while the completion rate in this discussion would be 80/100 = 80%.

      You can also check out our help center article on response rates (different from completion) here:

      We hope that helps out!

  • JRHyde

    This is very helpful, as we are always looking for data to help our clients understand how to plan and budget appropriately. One “best practice” that likely could not be measured in this study is to manage expectations by being transparent and telling prospective survey takers the true length of the survey before they start. Of course, this can affect your response rate: if you tell people the survey will take 10, 15, or more minutes, many will not be willing to even start it. (That just means there must be an appropriate incentive to encourage participation.)

    It’s our opinion that many surveys encounter attrition because the surveyor did not properly manage the survey taker’s expectations at the beginning or, even worse, was not honest about the length. Asking someone to “please take a short survey” and then presenting 30+ questions is dishonest and unethical. A good rule of thumb is that an online survey taker can complete three closed-ended questions or one open-ended question in a minute.

    Finally, the quality of the survey also plays a role in completion. Questions that do not apply, that the survey taker cannot answer, or that are confusing can all cause people to drop out along the way.

    • KTsurveymonkey

      We couldn’t agree more, JR! Thanks so much for your tips and participating in this topic. Glad you liked the article!

  • Julia Kuznetsov

    50K random surveys, but you don’t mention anything about whether these are incentivized surveys (e.g., a gift card or sweepstakes entry for completion), which I find is a huge factor in both response and completion rates.

    Additionally, did you control for whether these were distributed to an opt-in mailing list or whether the respondents were funneled to the survey via a survey panel provider? I’d love more nuance around these figures to make them applicable to myself and my team. Thanks!

    • KTsurveymonkey

      Thanks for your comment, Julia! We reached out to Ashley for a bit of background on this. The data you’re referring to regarding incentives is actually data we don’t collect from our customers because we respect our customers’ privacy. For instance, we don’t ask our customers whether their surveys are incentivized. Likewise, after our customers get the URLs (Weblink) of their surveys (Weblink is our default collector type, and 80% of users deploy surveys this way), we don’t ask them whether they will send the URLs to an opt-in mailing list or post the link on a website. So that’s why this sort of information isn’t included, but it’s certainly a great question to ask!

      If you’d like further insights, you are welcome to contact our Audience team at: Thanks!
