
How Much Time Are Respondents Willing to Spend on Your Survey?


How much time would you be willing to spend completing a customer satisfaction survey about a recent shopping experience? A survey distributed by your Human Resources team regarding employee satisfaction? One from friends who hosted an event you attended and want your feedback?

Understanding your audience when constructing a survey is important and can help inform decisions about survey length and content granularity. Because respondents have different motivations for taking surveys, their tolerance for survey length will vary.

What we analyzed.

We wanted to understand how the length of a survey, as measured by its number of questions, affects the time respondents spend completing it. To examine this relationship, we took a random sample of roughly 100,000 surveys that were 1-30 questions in length and analyzed the median amount of time respondents took to complete them.

In ideal circumstances, and over a large, randomized sample of responses, the average time it takes to answer a question should not vary with the length of the survey; in other words, the relationship between the number of questions in a survey and the time it takes to complete the survey should be linear.
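
Concretely, the analysis described above can be sketched in a few lines of pandas. This is an illustrative reconstruction, not SurveyMonkey's actual pipeline; the file and column names are assumptions:

```python
# Illustrative sketch of the analysis described above; "responses.csv",
# "num_questions", and "seconds" are assumed names, not real artifacts.
import pandas as pd

# One row per completed response: number of questions in the survey
# and total seconds the respondent took to finish it.
responses = pd.read_csv("responses.csv")  # columns: num_questions, seconds

# Median completion time for each survey length (1-30 questions).
medians = (
    responses[responses["num_questions"].between(1, 30)]
    .groupby("num_questions")["seconds"]
    .median()
)

# Under the linear hypothesis, seconds per question should be roughly
# constant across survey lengths; a downward slope signals satisficing.
seconds_per_question = medians / medians.index
print(seconds_per_question)
```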

What we learned.

It may come as no surprise, however, that the relationship between the number of questions in a survey and the time spent answering each question is not linear: the more questions you ask, the less time your respondents spend, on average, answering each one. When your respondents begin "satisficing" (in methodological terms) or "speeding" through a survey, the quality and reliability of your data can suffer. On average, we found that respondents take just over a minute to answer the first question in a survey (including the time spent reading any survey introduction) and spend about 5 minutes in total answering a 10-question survey. However, respondents take more time per question when responding to shorter surveys than to longer ones:

[Figure: Response Times – median time to complete a survey, by number of questions, with a linear fit shown for comparison.]
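
To make the non-linearity concrete, here is the arithmetic on the two figures just quoted, treating them as rough approximations rather than the underlying data:

```python
# Back-of-the-envelope check on the figures quoted above (approximate
# values from the post, not the raw data).
first_question = 75            # seconds for the first question, incl. intro
ten_question_total = 5 * 60    # ~5 minutes total for a 10-question survey

avg_per_question = ten_question_total / 10
print(avg_per_question)        # 30.0 s -- well under the ~75 s first question
```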

Can we always assume that longer surveys contain less thorough answers? Not necessarily: it depends on the type of survey, the audience, and the respondents' relationship to the surveyor, among other factors. On the whole, however, the data show that the longer a survey is, the less time respondents spend answering each question. For surveys longer than 30 questions, the average amount of time respondents spend on each question is nearly half that of surveys with fewer than 30 questions.

[Figure: Average time spent per question, by survey length.]

In addition to the decrease in time spent per question as surveys grew in length, we saw survey abandon rates increase for surveys that took more than 7-8 minutes to complete, with completion rates dropping anywhere from 5% to 20%. Tolerance for lengthier surveys was greater when surveys were work or school related, and lower when they were customer related.
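
You can turn that threshold into a crude design check. The sketch below assumes a flat pace of about 30 seconds per question (derived from the 10-question/5-minute figure above); real pacing varies with question type and audience, so treat it as a rough guide only:

```python
# Rough design check against the 7-8 minute abandon threshold above.
# The per-question pacing is a coarse assumption, not a measured model.
SECONDS_PER_QUESTION = 30               # ~5 min / 10 questions
ABANDON_THRESHOLD_SECONDS = 7.5 * 60    # midpoint of the 7-8 minute range

def estimated_minutes(num_questions: int) -> float:
    """Estimated completion time under the flat-pace assumption."""
    return num_questions * SECONDS_PER_QUESTION / 60

def may_see_dropoff(num_questions: int) -> bool:
    """True if the survey risks the 5-20% completion-rate drop."""
    return num_questions * SECONDS_PER_QUESTION > ABANDON_THRESHOLD_SECONDS

print(estimated_minutes(10), may_see_dropoff(10))  # 5.0 False
print(estimated_minutes(20), may_see_dropoff(20))  # 10.0 True
```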

What this means for you.

Take survey completion time into consideration as you design your next survey. Make sure you're balancing your audience profile and survey goals against the total number of questions you're asking, so you can get the best data possible for the decisions you need to make. And if you expect a survey to have a low response rate, make sure you send it to enough recipients to get a good sample size.
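
For the last point, a quick way to size the send list; the response rate and response target below are illustrative assumptions, not figures from this analysis:

```python
# Sizing sketch: invite enough people to hit your target sample despite
# a low response rate. The 20% rate and 400-response target are made up.
import math

target_responses = 400
expected_response_rate = 0.20   # fraction of recipients who complete

recipients_needed = math.ceil(target_responses / expected_response_rate)
print(recipients_needed)        # 2000 invitations
```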

Have questions about our analysis or interested in other data and insights we can share? Let us know in the comments below.



  • Much-needed research when you are trying to keep everyone happy and they all want input, with lots of questions "so nothing is missed out". This gives us a reason to say "no"!

    It would be interesting to review incentivised vs non-incentivised.


  • Michael Culbertson

    Hmm, interesting. But item type surely affects how long it takes to complete the survey, and I would guess that very short surveys tend to have more open-ended items (which take longer to answer) than longer surveys do. A more detailed analysis incorporating item type would be quite instructive.

  • Judith Waite Allee, author/speaker

    Interesting–thanks for sharing the information.

    I am guessing that 30-question surveys would tend to include multiple choice or other question types that are quick and easy to answer, and quick and easy to quantify, but may miss the heart of the real question that open-ended questions can capture.

    Smart marketing move on your end–sharing info of value to your best customers. Inspires me to think about survey topics that motivate the right people to participate, publicizing the survey itself, and then publicizing the results.

    Thanks,
    Judith Waite Allee
    P.S. I’d love to see comparisons of question types as they affect the completion rates. Also, comparisons of various incentives, how they affect completion, and how they affect the data.

    • Hi Judith–thanks so much for the feedback as well as ideas for future analysis that you’d like to see. We definitely plan on sharing more data, so appreciate hearing what is most interesting to you!

  • Laura Schild

    Longer surveys are more likely to have skip patterns based on survey answers. When counting the total questions per survey, were skip patterns taken into consideration?

  • Jean-Pierre Calabretto

    Thank you for this information, very interesting and useful.
    I would also like to see comparisons of question type, in particular for conditional questions, e.g. "if yes, please comment …". These types are clearly very useful for gathering additional qualitative data but may well affect completion rates.
    Thanks
    Jean-Pierre Calabretto

  • Dr Derek Sequeira

    I manage our University's online evaluation program, which has 11 mandatory core items on a 7-point Likert scale. Academic staff may add items from a categorised item bank if they wish. Respondents can provide comments in 2 text boxes.

    This affords significant variations in the time taken for respondents to complete their surveys. Those who only respond to the 11 mandatory items will spend a much shorter time than those who work through additional items and provide their comments in the 2 text boxes.

    The level of motivation among respondents will also impact on the time taken to complete a survey. Some will speed through the radio-button clicks while others will make comments that are detailed and carefully thought through.

    Having said this, a succinct survey that students find relevant would yield higher response rates irrespective of the time taken for its completion.

    • Michelle Brooks

      This is great insight – thanks for sharing. It will certainly help when 'wrestling' with clients over what number of questions is optimum.

  • Interesting … just before I read this blog entry, I received a survey email from a hotel at which I recently stayed. The email came the day after I got home. They asked me a number of questions and then asked for responses in text boxes. I was personally interested in the survey, but the growing number of questions driven off my previous responses started to irritate me. Since I am a market researcher by training, if even I can get irritated by the questions, think what the general populace must feel about this.

    • John Whaite

      Thank you so much for this – facts are hard to find. I now know that if I want to call it a 5 minute survey, I should have only 10 questions, and if I don’t want to lose people, I should have only 15 questions. These are good figures to have to guard against “question creep” – the urge to add just one more little question.
      It would be interesting to know if the length of a survey influenced participation in following surveys – a lot of questions could be annoying, but answering them could make someone feel they have contributed and so be more committed in future.

  • dropletform

    Similar experience – in my case they (JW Marriott) sent the letter to me while I was still in the hotel (in China).

    I responded telling the management team that I was still in the hotel – they invited me for a drink to discuss!

  • Elizabeth Jones

    Thanks for the info. I personally hate long customer relation surveys. I think around 5 questions is ideal for these. Certainly no more than 10.

    I will be taking your research into consideration when making up our first survey.

  • Anna Kadric

    Interesting input. Thanks to you I now have ‘hard facts’ about survey abandon rates in relation to number of questions asked.

  • Neil Alexander

    It's interesting, but I want to get much more.

  • suad

    Thank you for sharing this helpful information, and I hope it will help us.

  • vijay kumar v

    It is a very interesting one, thank you.

    • Other Paul

      I was quite intrigued by the claim. But then I looked at your 'Response Times' graph. The data points on it could just as easily support a linear hypothesis – the fall-off seems to occur only near the end. Though I don't doubt that the actual statistics you have (but which you don't present) favour the non-linear model, and I can see that the heavy line is a better fit than the dashed line, it seems to me that the linear model is quite 'good enough' for the range in question.
      One thing that could counter the time-per-question decrease is the thought, in the answerer's head, that in a large questionnaire they are in for the long haul; accepting this, they give their time. That would pull the curve back up again, and maybe that is what keeps the data points not too far removed from linear.
      Are all the data from questionnaires where the number of questions is known in advance?

  • Sally

    Is there any information on whether an incentive changes these results? For example, are the abandon rates lower if there is an incentive to complete?

  • Prasenjit Dasgupta

    From what I have seen, beyond the questionnaire's length and the number of questions, response style also plays a major part – especially in qualitative surveys that have open-ended questions.

    • Louis Pace

      @Other Paul: Since we do not have access to the data, we do not know that the linear fit is at all appropriate, let alone “good enough.” The strength of a trend depends on a lot of factors, including sample size, none of which we have. But a sufficiently large sample size can show that even a small difference is significant.

      IMHO, the data "appears" to follow a strong trend, and this trend "appears" to deviate from the linear fit by 30 seconds to up to a minute. And remember, those times are per question, so this can have an impact on your survey. But even this is just an opinion based on appearances, which, especially in statistical graphs, can be deceiving.

      What I'd really like from SurveyMonkey is a little more quantitative information, like the model used for the trend and some additional statistics specific to that study. Meanwhile, the information presented is very useful for helping us design future surveys. Thanks!

  • Bob

    It does not surprise me if it is true that people spend less time per question the longer the survey is. However, the time it takes to read the directions does not appear to be taken into account here, and that would appear to be significant. If, for example, it takes 75 seconds to answer a 1-question survey and 80 seconds to answer a 2-question survey (you say 40 seconds on average, even though you say 2 minutes total), that implies it takes 70 seconds to read the directions and 5 seconds to answer each question. In that case, the amount of time spent answering a question actually increases from there, gradually, up to 23 seconds per question by the time you have 15 questions. And in longer surveys, do users usually come across another set of instructions for another section of the survey? Is this taken into account? For me, there is too little information right now to trust the validity of the conclusions.
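
    A minimal sketch of the fixed-overhead model Bob describes, using his hypothetical 75-second and 80-second figures (assumptions for illustration, not measured values):

    ```python
    # Fixed-overhead model: total = directions_overhead + per_question * n.
    # The 75 s and 80 s inputs are hypothetical figures from the comment.
    t1, t2 = 75, 80                 # seconds for a 1- and a 2-question survey

    per_question = t2 - t1          # 5 s to answer each additional question
    overhead = t1 - per_question    # 70 s spent reading the directions

    def total_time(n_questions: int) -> int:
        """Predicted completion time (seconds) under this simple model."""
        return overhead + per_question * n_questions

    print(total_time(10))  # 120 s, far below the ~300 s the post reports --
                           # Bob's point: per-question time would have to rise
    ```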

    • Mike Stempo

      There are a ton of variables left either unconsidered or unmentioned in this analysis. Where to begin? The incentive of a gift, what its value is, and the chances of winning it are simply not mentioned. Are we to assume no survey gift?

      For starters, the survey has to actually be something that applies to the survey taker. I will sometimes completely ignore a survey if it looks too long and really has nothing to do with anything I care about, or will haphazardly fill it out while trying to understand the gist of why it was sent to me in the first place. That sure skews the results. Some surveys are extremely poorly structured and ask very asinine questions. I chuckle to think what the surveyor really expects to get, of any accuracy and quality, out of such ill-conceived surveys.

      I get these surveys asking me to evaluate "impressions" I have of companies and whether I would recommend them to others. Some survey makers are so isolated in their tiny little worlds that they think we out "there" in survey land really pay that much attention to mundane offerings and the mundane companies that offer them. Most companies do not command that much attention from us beyond the first one or two.

      I particularly chuckle at surveys that try to "herd" you to an answer, or do not cover all answers and give no space to comment on "Other". I deep-six those at the first discovery of that ill-thought-out structure. Some surveys even try to beg the question.

    • Thank you for your super site on the internet.

  • Sometimes in a short survey, 1-5 questions, the questions are more complex. In a survey of 25-30 questions, the complex questions are broken down into simpler queries that require less thought. For example, if I asked you 'What do you find the most difficult aspect of purchasing a new car?' and provided you with 3 options and an 'Other' with a comment option, it might take you quite a while to develop a valid response. If, however, I asked you 5 questions with yes-or-no-type responses, I might be able to get the info I want, and you, the respondent, could get on with your life.

  • From my experience, YES/NO survey answers tell you nothing that will help you improve any aspect of the issue in question. Open-ending all questions will draw the respondent into some focus on the issue, whether positive or negative. If YES or NO are presented, so should a "Please comment why"!!
    Incentives also go a long way to getting participants interested. Customer Service and Product Quality must be surveyed frequently if a business is really about improvement. A note to food businesses who leave surveys on dinner tables in restaurants – do you ever ask your staff what they do with them?

    • Dolores

      Yes, there is a lot of missing information, i.e. completers compared to non-completers.

      • This is interesting and useful, thanks. Long surveys do seem to bore people, and when it comes to staff surveys it feels like the more questions you need to ask, the worse the employer/employee relationship already is. "You want to ask me how many questions?? Wow – you don't know a thing about me, do you?" And very often the useful stuff is in the written feedback, not the numbers, so keeping things short encourages folk to write more back to you too. And if you have to give an incentive to complete a survey, well, I think that signals even bigger problems.

        Keep it as short as you can – put yourself in the respondent's shoes and ask: do we really need to include all these questions?

        Cheers – Doug

  • When I am asked to complete questionnaires or surveys where there is no option to voice an opinion (other than in the way the survey stipulates you should answer), I will opt out rather than contribute to supposed research that I do not agree with. (Usually university-type surveys.)

    These types of surveys give you a three-answer selection without the option of stating something different. This gives the survey only the answers 'they want', rather than an accurate reflection of the thoughts of the people responding. How helpful is that – or is it a play at politics to get one's own ideas supposedly accepted?

    • I completely agree with Bruni. At the first sign that the survey is ‘leading’ me by only allowing the responses desired, I quit. On top of that, I don’t participate with that sender again.


    • Michael

      “For surveys longer than 30 questions, the average amount of time respondents spend on each question is nearly half of that compared to on surveys with less than 30 questions.”
      Given the shape of the curve in the chart immediately following this assertion, wouldn’t this be essentially true for any number all the way down to 4 questions or so? There is no uniquely large gain in time per question at the 30 question mark.
      Am I missing something?

  • Russell

    What then is one supposed to do when, for reasons of difficulty recruiting “live” research participants, one has a large number of questions to ask, say 300-400??

    Are there any research surveys that ask people to fill out a number of “linked” surveys breaking the block of questions down to smaller blocks?

  • jodie

    Long surveys often have easy-to-miss required yes-or-no responses, and missing one kicks you out when you try to finish the survey. Some ask redundant questions over and over, which becomes very irritating – asking which paper towels have the most sheets and which are the best type to buy, then repeating the questions with one word changed.
    What is really irritating is trying to finish the survey and getting a message that "sorry, the quota has been filled".

  • Scott

    I really appreciate that you guys are doing this sort of research. It’s super helpful! Just wanted to mention that the images all appear to be broken right now…

    • Bennett P

      Scott – oh dear! Not sure what happened to the images! Thanks for flagging and we’ll fix. And, thanks for reading our blog! Glad it helps

  • Cheepoii

    I really appreciate what you guys are doing. But the survey said to download this app, so I downloaded it, and it still didn't work. Please help me.

  • kaytek

    Hi there! We're not sure which app you're referring to – could you clarify?

  • Martien Schriemer

    There is an online environment where respondents were monitored for a couple of years, and the findings there are completely different from those mentioned in this article.

    I would be very interested to match our data with yours. Would that be possible?
    Awaiting your response,
    Martien Schriemer
    University of Applied Sciences Leiden, Netherlands

    • KTsurveymonkey

      Hi Martien,

      Could you give us a little more insight on the project you’ve mentioned here?

      Thanks!

      • Martien Schriemer

        Hi,

        There has been a study about aborting online surveys published in the Netherlands by Van der Zee (2007). The study covers a period of several years (2004-2007). There haven't been many publications on this topic; solid information about it is rare.

        I studied this data. The graph I added pictures the percentage of respondents aborting the survey in relation to the number of questions in the survey. It's apparent there is no linear relationship. On average, 3% of respondents will abort, even if the survey is only 10 questions long. This percentage climbs slowly with the number of questions asked. It peaks, on average, at 6.1% when the survey consists of 41 to 50 questions. If the survey has more questions, the percentage of aborting respondents diminishes, to an average of 2.1% for surveys containing 81-90 questions. But when the survey has more than 91 questions, the percentage of aborting respondents soars to 16.4% and higher.

        The interpretation I made from this data is the following: there will always be respondents who abort, so be sure your sample always allows for +3%. Once a respondent has decided to fill in the survey, he may become disillusioned with the topic, or distracted by external factors (e.g. the doorbell rings and he aborts the survey), so he will abort; all the "normal" reasons to abort apply in this situation. But when the survey runs longer than 50 questions, the respondent becomes more committed to completing the survey, probably because the topic interests him (e.g. the doorbell rings, he pauses the survey and completes it afterwards). By and large, all the "normal" reasons for a respondent to respond to a survey apply (note: the "normal" reasons are well documented in the literature and not repeated here). The large number of respondents aborting when the survey has more than 91 questions is probably due to the poor quality of the questionnaire: if a researcher isn't able to make his point within the scope of 90 questions, the survey isn't worth completing. It's my opinion that this behaviour of respondents is in line with the theory of reasoned action of Ajzen and Fishbein (1980). These authors state that the intention to behave in a certain way is determined by two factors: the attitude toward the behaviour and beliefs about how other people would like you to behave.

        Would it be possible to analyse your data?

        Fishbein, M. & Ajzen, I. (1980). Understanding Attitudes and Predicting Social Behavior. Reading, MA: Addison-Wesley.

        Zee, F. van der (2007). De enquête: Het maken van een goede vragenlijst [The survey: Making a good questionnaire]. Groningen: Grafisch bedrijf Letsch. ISBN 978-90-78421-05-4.

        • KTsurveymonkey

          Wow, very interesting and insightful, Martien! Thanks so much for sharing that information. Unfortunately we do not provide our data to external sources, but we love that this has got your wheels turning and led you to question both your previous analysis and our current findings!

        • Quincy

          Wow! Thank you.

  • This is an awesome article. Believe it or not, I really was in need of this article, because this information is really rare to find. Thank you.

    • KTsurveymonkey

      That’s so awesome! Glad we could help out. Happy surveying!

  • Alex González

    I can’t see the images. I think that is a very important article. Please fix it! :(

    • KTsurveymonkey

      Hi Alex,

      Thanks for bringing this to our attention. We are working on updating some images that broke on our blog. Very sorry for the inconvenience.

  • Steve Buchanan

    Found Brent's other article on survey question volumes also very helpful. Would love to see the graphics back up on this one as well, as the article is equally interesting (but the message is incomplete without the graphics).

    • KTsurveymonkey

      Hi Steve,

      Thanks for mentioning this. We are in the midst of trying to fix it! We moved our blog and some of the images were sadly lost. We're so happy you liked the article, and we'll try to get these images back for you. Thanks for flagging the other article as well!

  • Bhanu Priya

    Hi,
    I am currently making a survey to identify the causes of accumulation of waste in the community around my school. We have narrowed it down to 20 questions. We will be taking this survey to a sample set of 500-600 people. The people are not very educated and my school students are going to go door-to-door to take the survey. They will be reading the questions to the sample set. What should be the ideal number of questions in this case?

    • MFsurveymonkey

      Hi Bhanu! Thanks for using our tool for your project! That sounds like an awesome topic by the way :)
      Since the survey will be taken in person, door-to-door, you might want to make it as simple and short as possible so that people will be more inclined to take the time to answer. 20 questions is okay, but if you can make it between 12 and 15, it may increase your respondents' willingness. I hope this helps!
      Also, after you collect your responses you can then manually enter them into our platform to analyze the results. See more here:
      http://help.surveymonkey.com/articles/en_US/kb/How-do-I-manually-input-responses-data-entry
