
How Much Time are Respondents Willing to Spend on Your Survey?


How much time would you be willing to spend completing a customer satisfaction survey about a recent shopping experience? On a survey distributed by your Human Resources team regarding employee satisfaction? From your friends who hosted an event that you attended and want your feedback?

Understanding your audience when constructing a survey is important and can help inform decisions on survey length and content granularity. Since survey respondents have different motivations for responding to surveys, their tolerance for how long a survey is will vary.

What we analyzed.

We wanted to understand how the length of a survey (as measured by its number of questions) affects the time respondents spend completing it. To understand this relationship, we took a random sample of roughly 100,000 surveys that were 1-30 questions in length and analyzed the median amount of time respondents took to complete them.
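The grouping step described above can be sketched in a few lines of Python. The records and values below are hypothetical placeholders for illustration, not the actual sample:

```python
from statistics import median

# Hypothetical records: (number_of_questions, completion_seconds).
# Illustrative values only -- not the actual survey data.
responses = [
    (5, 150), (5, 180), (5, 210),
    (10, 290), (10, 310), (10, 330),
    (25, 540), (25, 600), (25, 660),
]

# Group completion times by survey length, then take the median
# per length, as described in the analysis.
by_length: dict[int, list[int]] = {}
for n_questions, seconds in responses:
    by_length.setdefault(n_questions, []).append(seconds)

median_times = {n: median(times) for n, times in sorted(by_length.items())}
print(median_times)  # {5: 180, 10: 310, 25: 600}
```

Using the median rather than the mean keeps a handful of abandoned-then-resumed responses from skewing the per-length figures.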

In ideal circumstances and over a large, randomized sample of responses, the average time it takes to answer a question should not vary based on the length of the survey, so a linear relationship between the number of questions in a survey and the time it takes to complete a survey should exist.

What we learned.

It may come as no surprise, however, that the relationship between the number of questions in a survey and the time spent answering each question is not linear. The more questions you ask, the less time your respondents spend, on average, answering each question. When your respondents begin "satisficing" (the methodological term for speeding through a survey), the quality and reliability of your data can suffer. On average, we discovered that respondents take just over a minute to answer the first question in a survey (including the time spent reading any survey introduction) and spend about 5 minutes in total answering a 10-question survey. However, respondents take more time per question on shorter surveys than on longer ones:

Can we always assume that longer surveys contain less thorough answers? Not always: it depends on the type of survey, the audience, and the respondents' relationship to the surveyor, among other factors. However, our data shows that the longer a survey is, the less time respondents spend answering each question. For surveys longer than 30 questions, the average amount of time respondents spend on each question is nearly half that of surveys with fewer than 30 questions.
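The two figures reported above (just over a minute for a 1-question survey, about 5 minutes for a 10-question one) are enough to sketch the sublinear pattern with a simple power-law fit. Both the functional form and the rounded anchor values are assumptions for illustration, not the model used in the analysis:

```python
import math

# Anchor points taken from the post: ~75 s for a 1-question survey
# (including intro reading) and ~300 s for a 10-question survey.
t1, t10 = 75.0, 300.0

# Fit a power law total_time = a * n**b through the two anchors.
# This functional form is an assumption, not the post's actual model.
b = math.log(t10 / t1) / math.log(10)   # ~0.60 < 1, i.e. sublinear
a = t1

def estimated_total_seconds(n_questions: int) -> float:
    """Rough total completion time under the fitted power law."""
    return a * n_questions ** b

for n in (1, 10, 30):
    total = estimated_total_seconds(n)
    print(f"{n:>2} questions: ~{total / 60:.1f} min total, "
          f"~{total / n:.0f} s per question")
```

Because the fitted exponent is below 1, the per-question time falls as surveys get longer, consistent with the halving described above.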

In addition to the decreased time spent answering each question as surveys grew in length, we saw survey abandon rates increase for surveys that took more than 7-8 minutes to complete, with completion rates dropping anywhere from 5% to 20%. The tolerance for lengthier surveys was greater for surveys that were work- or school-related and lower for those that were customer-related.
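As a rough illustration of that drop-off, a completion-rate estimate might apply a 5-20% penalty once a survey's expected length passes the 8-minute mark. The threshold and the linear ramp below are assumptions for illustration, not fitted values:

```python
def estimated_completion_rate(base_rate: float, est_minutes: float) -> float:
    """Apply an assumed penalty band once a survey exceeds ~8 minutes."""
    if est_minutes <= 8:
        return base_rate
    # Ramp the penalty from 5% at 8 minutes up to 20% at 20+ minutes.
    penalty = min(0.20, 0.05 + 0.15 * (est_minutes - 8) / 12)
    return base_rate * (1 - penalty)

print(round(estimated_completion_rate(0.80, 5), 2))   # 0.8  -- under the threshold
print(round(estimated_completion_rate(0.80, 20), 2))  # 0.64 -- full 20% penalty
```

A back-of-the-envelope check like this can help decide whether trimming a few questions is worth the extra completed responses.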

What this means for you.

Take survey completion time into consideration as you design your next survey. Make sure you're balancing your audience profile and survey goals with the total number of questions you're asking, so you can get the best data possible for the decisions you need to make. And if you expect a survey to have a low response rate, make sure you send it to enough recipients to get a good sample size.

Have questions about our analysis or interested in other data and insights we can share? Let us know in the comments below.


40 thoughts on “How Much Time are Respondents Willing to Spend on Your Survey?”

  1. Much-needed research when you are trying to keep everyone happy and they all want input, with lots of questions “so nothing is missed out”. This gives us a reason to say “no”!

    It would be interesting to review incentivised v non-incentivised.

  2. Michael Culbertson says:

    Hmm, interesting. But, item type surely affects how long it takes to complete the survey, and I would guess that very short surveys tend to have more open-ended items, which take longer to answer, than longer surveys. A more-detailed analysis incorporating item type would be quite instructive.

  3. Judith Waite Allee, author/speaker says:

    Interesting–thanks for sharing the information.

    I am guessing that 30-question surveys would tend to include multiple choice or other question types that are quick and easy to answer, and quick and easy to quantify, but may miss the heart of the real question that open-ended questions can capture.

    Smart marketing move on your end: sharing info of value to your best customers. It inspires me to think about survey topics that motivate the right people to participate, publicizing the survey itself, and then publicizing the results.

    Judith Waite Allee
    P.S. I’d love to see comparisons of question types as they affect the completion rates. Also, comparisons of various incentives, how they affect completion, and how they affect the data.

    1. Anne R says:

      Hi Judith–thanks so much for the feedback as well as ideas for future analysis that you’d like to see. We definitely plan on sharing more data, so appreciate hearing what is most interesting to you!

  4. Laura Schild says:

    Longer surveys are more likely to have skip patterns based on survey answers. When counting the total questions per survey, were skip patterns taken into consideration?

  5. Jean-Pierre Calabretto says:

    Thank you for this information; very interesting and useful.
    I would also like to see comparisons of question types, in particular conditional questions, e.g. “if yes, please comment …” These are clearly very useful for gathering additional qualitative data but possibly affect completion rates.
    Jean-Pierre Calabretto

  6. Dr Derek Sequeira says:

    I manage our University’s online evaluation program, which has 11 mandatory core items on a 7-point Likert scale. Academic staff may add items from a categorised item bank if they wish. Respondents can provide comments in 2 text boxes.

    This affords significant variations in the time taken for respondents to complete their surveys. Those who only respond to the 11 mandatory items will spend a much shorter time than those who work through additional items and provide their comments in the 2 text boxes.

    The level of motivation among respondents will also impact on the time taken to complete a survey. Some will speed through the radio-button clicks while others will make comments that are detailed and carefully thought through.

    Having said this, a succinct survey that students find relevant would yield higher response rates irrespective of the time taken for its completion.

    1. Michelle Brooks says:

      This is great insight – thanks for sharing. It will certainly help when ‘wrestling’ with clients over what number of questions is optimal.

  7. Rod Clark says:

    Interesting … just before I read this blog entry I received a survey email from a hotel at which I recently stayed. The email came the day after I got home. They asked me a number of questions and then asked for responses in text boxes. I was interested in the survey personally, but the more questions that got asked that were driven off of previous responses started to irritate me. Since I am a market researcher by training, if even I can get irritated by the questions, think what the general populace must feel about this.

    1. John Whaite says:

      Thank you so much for this – facts are hard to find. I now know that if I want to call it a 5 minute survey, I should have only 10 questions, and if I don’t want to lose people, I should have only 15 questions. These are good figures to have to guard against “question creep” – the urge to add just one more little question.
      It would be interesting to know if the length of a survey influenced participation in following surveys – a lot of questions could be annoying, but answering them could make someone feel they have contributed and so be more committed in future.

  8. dropletform says:

    Similar experience. In my case they (JW Marriott) sent the survey to me while I was still in the hotel (in China).

    I responded telling the management team that I was still in the hotel; they invited me for a drink to discuss!

  9. Elizabeth Jones says:

    Thanks for the info. I personally hate long customer relation surveys. I think around 5 questions is ideal for these. Certainly no more than 10.

    I will be taking your research into consideration when making up our first survey.

  10. Anna Kadric says:

    Interesting input. Thanks to you I now have ‘hard facts’ about survey abandon rates in relation to number of questions asked.

  11. Neil Alexander says:

    It’s interesting, but I want to get much more.

  12. suad says:

    Thank you for sharing this helpful information; I hope it will help us.

  13. vijay kumar v says:

    It is a very interesting one, thank you.

    1. Other Paul says:

      I was quite intrigued by the claim. But then I looked at your ‘Response Times’ graph. The data points on it could just as equally support a linear hypothesis – the fall-off seems to occur only near the end. Though I don’t doubt that the actual statistics you have (but which you don’t present) favour the non-linear model, and I can see that the heavy line is a better fit than the dashed line, it seems to me that the linear model is quite ‘good enough’ for the range in question.
      One thing that could counter the time-per-question decrease is the thought – in the answerer’s head – that in a large questionnaire, the respondent knows they’re in for the long haul and, accepting this, they give their time. That would pull the curve back up again and it’s maybe what keeps the data points not too far removed from linear.
      Are all the data from questionnaires where the number of questions is known in advance?

  14. Sally says:

    Is there any information on whether an incentive changes these results? For example, are the abandon rates lower if there is an incentive to complete?

  15. Prasenjit Dasgupta says:

    What I have seen is that, other than the questionnaire’s length and number of questions, response style also plays a major part, especially in qualitative surveys that have open-ended questions.

    1. Louis Pace says:

      @Other Paul: Since we do not have access to the data, we do not know that the linear fit is at all appropriate, let alone “good enough.” The strength of a trend depends on a lot of factors, including sample size, none of which we have. But a sufficiently large sample size can show that even a small difference is significant.

      IMHO, the data “appears” to follow a strong trend, and this trend “appears” to deviate from the linear fit by 30 seconds to up to a minute. And remember, those times are per question, so this can have an impact on your survey. But even this is just an opinion based on appearances, which, especially in statistical graphs, can be deceiving.

      What I’d really like from Survey Monkey is a little more quantitative information, like the model used for the trend and some additional statistics specific to that study. Meanwhile, the information presented is very useful for helping us design future surveys. Thanks!

  16. Bob says:

    It does not surprise me if it is true that people spend less time per question the longer the survey is. However, the time it takes to read the directions does not appear to be taken into account here, and that would appear to be significant. If, for example, it takes 75 seconds to answer a 1-question survey and 80 seconds to answer a 2-question survey (you say 40 seconds on average, even though you say 2 minutes total), that implies it takes 70 seconds to read the directions and 5 seconds to answer each question. In that case, the amount of time spent answering a question actually increases from there, gradually, up to 23 seconds per question by the time you have 15 questions. And in longer surveys, do users usually come across another set of instructions for another section of the survey? Is this taken into account? For me, this is too little information right now to trust the validity of the conclusions.

    1. Mike Stempo says:

      There are a ton of variables either left unconsidered here or unmentioned in this analysis. Where to begin? The incentive of a gift, what its value is, and the chances of winning it are simply not mentioned. Are we to assume no survey gift?

      For starters, the survey has to actually be something that applies to the survey taker. I will sometimes completely ignore a survey if it looks too long and really has nothing to do with anything I care about, or will haphazardly fill it out trying to understand the gist of why it was sent to me in the first place. That sure skews the results. Some surveys are extremely poorly structured and ask very asinine questions. I chuckle to think what the surveyor really expects to get, of any accuracy and quality, out of such ill-conceived surveys.

      I get these surveys asking me to evaluate “impressions” I have of companies and whether I would recommend them to others. Some survey makers are so isolated in their tiny little worlds that they think those of us out “there” in survey land really pay that much attention to mundane offerings and the mundane companies that offer them. Most companies do not command that much attention from us beyond the first one or two.

      I particularly chuckle at surveys that try to “herd” you to an answer, or that do not cover all answers and give no space to comment under “Other”. I deep-six those at the first discovery of that ill-thought-out structure. Some surveys even try to beg the question.

    2. newboys2013 says:

      Thank you for your super site on the internet.

  17. Sometimes in a short survey, 1-5 questions, the questions are more complex. In a survey of 25-30 questions, the complex questions are broken down into simpler queries that require less thought. For example, if I asked you ‘What do you find the most difficult aspect of purchasing a new car?’ and provided you with 3 options and an ‘Other’ with comment option, it might take you quite a while to develop a valid response. If, however, I asked you 5 questions with yes-or-no type responses, I might be able to get the info I want and you, the respondent, could get on with your life.

  18. From my experience, YES/NO survey answers tell you nothing that will help you to improve any aspect of the issue in question. Making every question open-ended will draw the respondent
    into some focus on the issue, whether positive or negative.
    If YES or NO options are presented, so should a “Please comment why”!
    Incentives also go a long way toward getting participants interested. Customer service and product quality must be surveyed frequently if a business is really serious about improvement. A note to food businesses who leave surveys on dinner tables in restaurants: do you ever ask your staff what they do with them?

    1. Dolores says:

      Yes, there is a lot of missing information, e.g. completers compared with non-completers.

      1. Doug Shaw says:

        This is interesting and useful, thanks. Long surveys do seem to bore people, and when it comes to staff surveys it feels like the more questions you need to ask, the worse the employer/employee relationship already is. “You want to ask me how many questions?? Wow, you don’t know a thing about me, do you?” And very often the useful stuff is in the written feedback, not the numbers, so keeping things short encourages folk to write more back to you too. And if you have to give an incentive to complete a survey, well, I think that signals even bigger problems.

        Keep it as short as you can. Put yourself in the respondent’s shoes and ask: do we really need to include all these questions?

        Cheers – Doug

  19. Bruni Brewin says:

    When I am asked to complete questionnaires or surveys where there is no option to voice an opinion (other than in the way the survey stipulates you should answer), I will opt out rather than contribute to supposed research that I do not agree with. (Usually university-type surveys.)

    These types of surveys will give you a three-answer selection without the option of stating something different. This then gives the survey only the answers that ‘they want’, without it being an accurate reflection of the thoughts of the people responding. How helpful is that? Or is it a play at politics to get one’s own ideas supposedly accepted?

    1. Tim Sudderth says:

      I completely agree with Bruni. At the first sign that the survey is ‘leading’ me by only allowing the responses desired, I quit. On top of that, I don’t participate with that sender again.

    1. Michael says:

      “For surveys longer than 30 questions, the average amount of time respondents spend on each question is nearly half of that compared to on surveys with less than 30 questions.”
      Given the shape of the curve in the chart immediately following this assertion, wouldn’t this be essentially true for any number all the way down to 4 questions or so? There is no uniquely large gain in time per question at the 30 question mark.
      Am I missing something?

  20. Russell says:

    What then is one supposed to do when, for reasons of difficulty recruiting “live” research participants, one has a large number of questions to ask, say 300-400?

    Are there any research surveys that ask people to fill out a number of “linked” surveys breaking the block of questions down to smaller blocks?

  21. jodie says:

    A lot of long surveys lead to forgotten places for yes-or-no responses and kick a person trying to finish the survey out. Some ask redundant questions over and over, which becomes very irritating: asking which paper towels have the most sheets and which are the best type to buy, then repeating the questions with a change of one word.
    What is really irritating is trying to finish the survey and getting a message that “sorry, the quota has been filled”.

  22. Scott says:

    I really appreciate that you guys are doing this sort of research. It’s super helpful! Just wanted to mention that the images all appear to be broken right now…

    1. Bennett P says:

      Scott – oh dear! Not sure what happened to the images! Thanks for flagging and we’ll fix. And, thanks for reading our blog! Glad it helps

  23. kaytek says:

    Hi there! Not clear what app you’re referring to?
