# Does Adding One More Question Impact Survey Completion Rate?

It will come as no surprise that the more questions you ask, the fewer respondents who start a survey will finish it. If you need responses from a certain number of respondents and have a limited audience or sample to draw from, the fewer questions you ask, the better. Not every survey can be condensed into just a few questions, but when response rates matter, keeping surveys succinct helps.

We analyzed response and drop-off rates in aggregate across 100,000 random surveys conducted by SurveyMonkey users to understand how drop-off rates change as surveys get longer. We looked at both the number of questions and the number of pages, measuring the drop-off rate from start to finish for surveys of 1 to 50 questions to see how it correlated with each incremental question. The one qualifier was that the respondent had to submit a response on at least one page. To conduct our study, we sampled 2,000 random surveys with 1 question, 2,000 with 2 questions, 2,000 with 3 questions, and so on, all the way up to 2,000 surveys with 50 questions.

So what did we find?

As expected, the more questions per survey, the higher the respondent drop-off rate from start to finish. However, as the graph below shows, the relationship between survey length and drop-off rate is not linear. Our data suggests that once a respondent begins answering a survey, the sharpest increase in drop-off rate comes with each additional question up to 15 questions. If a respondent is willing to answer 15 questions, the drop-off rate for each incremental question, up to 35 questions, is lower than for the first 15 questions added to a survey. Respondents willing to answer more than 35 questions appear largely indifferent to survey length and will complete a long survey (within reason, of course: we limited our analysis to surveys of 50 questions or fewer, and didn't tackle the really, really long surveys we've seen this time around).

The chart above is based on random, aggregated survey and completion-rate data from surveys deployed between January 2009 and September 2010. 2,000 random surveys at each question count (1-50) were included (100,000 surveys in total). A survey was considered "started" if a respondent submitted at least one page of data containing at least one response to a question, hence the 100% completion rate for any survey with 1 question. A survey was considered "completed" if the respondent reached the end of the survey and clicked Submit; skip logic may have meant that respondents did not answer the maximum number of questions in a given survey.
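The aggregation described above (group surveys by question count, then divide completions by starts within each group) can be sketched in a few lines. This is an illustrative reconstruction, not SurveyMonkey's actual pipeline, and the records below are made-up numbers chosen only to show the shape of the calculation:

```python
from collections import defaultdict

# Hypothetical records: (num_questions, surveys_started, surveys_completed).
# The real study used 2,000 random surveys at each question count from 1 to 50;
# this tiny synthetic sample is for illustration only.
surveys = [
    (1, 40, 40),    # 1-question surveys: one page, so starting == completing
    (10, 50, 42),
    (10, 30, 26),   # multiple samples at the same length are pooled
    (25, 60, 45),
    (50, 80, 55),
]

def completion_rates(records):
    """Pool start/completion counts by question count and return the
    completion rate (completed / started) for each survey length."""
    starts = defaultdict(int)
    completes = defaultdict(int)
    for n_questions, started, completed in records:
        starts[n_questions] += started
        completes[n_questions] += completed
    return {n: completes[n] / starts[n] for n in starts}

rates = completion_rates(surveys)
# A 1-question survey always shows a 100% completion rate under the
# "submitted at least one page" definition of a start.
```

The drop-off rate at each length is simply one minus the completion rate, so plotting `1 - rates[n]` against `n` reproduces the kind of curve discussed above.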

So what does this mean, from a practical perspective?

• If you are trying to optimize for completed survey responses, try to keep your survey short.
• The incremental value of each question should be worth the possible drop in response rates.
• Skip logic can help shorten a survey by letting respondents navigate only to the questions relevant to them.

Of course, each survey, audience, and distribution method is different, and results will vary widely between surveys, but keep these metrics in mind as you design your next survey.

Have you experienced higher or lower drop-off rates in your surveys as you varied the number of questions? We’d love to hear your thoughts on this data, as well as any other data you’d find helpful.

Written by Brent Chudoba, SurveyMonkey’s Vice President of Business Strategy & Business Intelligence. SurveyMonkey is the largest source of online survey distribution and response collection in the world; our data insights come from reviewing aggregate data across millions of new responses per day on hundreds of thousands of live surveys. Our goal is to share aggregate analysis and insights to help our customers be more effective and successful in conducting surveys.

### Inspired? Create your own survey.

• Paul Lara

Ever vigilant about minimizing respondent time in my surveys, I take advantage of skip logic as much as possible, so follow-up questions are relevant to the most recent answer.

• http://www.surveymonkey.com Anne R

Paul–that’s great to hear.

• http://www.environmetrics.com.au Gillian King

I’d guess that the drop-out rate declines on longer surveys because some will be aimed at captive audiences (e.g. employees) while others will offer an incentive to participate. So maybe the respondents are not quite ‘indifferent to survey length’?

• http://www.rickyleepotts.com Ricky Potts

The results of this survey do not surprise me at all. But thank you for taking the time to gather and share all of this. When it comes to creating surveys, you must assume that people won’t fill out any more than they have to. People are lazy, which is why blogs (like my own) that are flooded with a ton of content have high bounce rates. People don’t want to read, let alone fill out personal information about themselves. That is why video is so popular, and has such a higher conversion rate when placed on a website. If a user can see (or hear) what you have to offer, why bother reading about it?

This made me think, and the next time I create a survey it will help me to evaluate the number of questions that I include. Great read and something that I hope hits home to a lot of people.

• Sicco Jan

Did you also research how the drop rate correlates with the type of question (i.e., is the drop rate higher on open-ended questions than on yes/no?) and with making answers compulsory?
I would expect that having a lot of must-answer questions affects both the completeness of the response (number of answered questions) and the response rate (number of submitted surveys). I also expect that the issuer can get away with a lot of yes/no questions but should be moderate with open-ended questions (or make them yes/no/other style).

• http://myindigolives.wordpress.com/ Ellie Kesselman

Nice chart, Anne! Thank you for a good post. I noticed your response to my comment from a few months ago.

I also liked the comment made by Ricky Potts above. I didn’t expect to get insight about weblogs here (only was thinking about statistical survey methods), yet I did. Thanks for that too.

• http://www.surveymonkey.com Anne R

Hi Ellie–thanks for all of your comments and feedback. It’s been wonderful to hear from you as we build out our blog. Please definitely keep the feedback coming and let us know what other topics would interest you.


• http://allisonnastoff.wordpress.com Allison Nastoff

I just used SurveyMonkey to distribute surveys to students at my college for my senior thesis research and wish I would have read this beforehand. I had an extremely difficult time getting the required 100 respondents. Other classmates reported having difficulty too which could be due to the fact that a lot of seniors choose the survey route and the campus is bombarded with survey requests. But at 19 questions, all of them compulsory, the length of my survey could have been a factor as well. This is excellent insight that I will share with younger friends to save them the frustration I had!


• Michael Pinnegar

The graphic for this page is broken :*( and it’s kind of the core of the article. Can you fix the page? or can you provide the graphic? I’d love to see the graph on questions / response rate.

• Hanna J

Hi Michael – Yikes, you’re right! Thank you so much for pointing that out. We’ve fixed it, so it’s up and running for your viewing pleasure. Thanks again!

• Dougpy

The idea that across 100,000 surveys the RESPONSE rate is no lower than 11% is incredibly hard to believe. Then reading “the respondent had to submit a response on at least 1 page” means that the data is not useful.

Should I have a one question survey or a two question survey? Both take up only one page…

• Jim Martin

I think “not knowing” is an issue in response drop-off. I’ve run a big internal survey (not on SurveyMonkey) for several years and while there has always been a gradual drop in responses across the whole length of the survey, there are pronounced “break points” at the end of each page (these are multi-page surveys with 2-5 questions per page). I’d liken this to the experience of climbing up a hill, only to find that when you reach the top, there’s another hill. That’s the point at which people give up.

In my first large SurveyMonkey survey we decided to have each question on a separate page, partly for formatting reasons and partly to hide the skip logic from responders. On the welcome page, we put a message: “This survey has a maximum of five questions” (everybody got either four or five, depending on the skip logic). That seemed to work pretty well and virtually everybody who started the survey followed through to the end. Obviously, it was a short survey, so you’d hope for minimal drop-off, but the feedback on the welcome page suggested that people liked that feature.

• Kayte K

Hi Jim, thank you so much for the insightful comment. This will certainly be helpful for other customers! Please let us know if you ever need any help with any of your survey projects!