The double-barreled question and other common survey mistakes

It’s no secret that survey creators want their respondent experience to be as close to perfect as possible. After all, that’s how you get accurate results and uncover insights that will drive more impact for your organization. However, to accomplish all that, you need to steer clear of things like inaccuracies, ambiguous questions, and one of the most common survey mistakes: double-barreled questions.

The double-barreled question, also known as a double-direct question, is basically a trick question. (Or, if you’re feeling fancy, an informal fallacy.) It’s when respondents are asked for feedback on two different issues or topics within one question. Since they can only respond with a single answer, the results will end up skewed—never a good thing when it comes to survey data. 

Imagine a customer has purchased a brand new microwave and has a hard time figuring out how all the programming works. The manual is confusing, and they become so frustrated that they call the company’s customer support line. The representative they speak to is extremely helpful, taking the time to walk the customer through everything they need to know and responding to every question they have. After the call, the customer receives a survey with the question:

“How would you rate the quality of our product and customer support?”

See the tricky part here? The customer may want to provide a negative rating for the product, since the user experience was less than stellar, but give the customer support experience a positive rating. Because both topics are squished into one question, they’ll have to choose which to address. Then, when the survey creator analyzes the results, they won’t be able to parse out a clear answer or recognize that the customer actually holds two very different opinions on these two issues.

Of course, there is another way that this scenario could go. The customer might avoid the problem entirely and either skip the double-barreled question or quit the survey. Either way, that double-barreled question is negatively impacting both the respondent experience and the final survey results.

So how do you fix a double-barreled question and make sure you’re following survey question best practices?

The key is to not attempt to do too much with a single question. Multitasking may be beneficial in other areas of life, but it doesn’t work well for survey questions. If you realize you have inadvertently created a double-barreled question, break it up into two separate questions. With the example above, the survey that the new microwave owner received would instead ask two questions:

  • How would you rate the quality of our product?
  • How would you rate the quality of our customer support?

The microwave company could even take the survey a step further and follow up with open-ended questions that ask the respondent to explain the reason for their ratings. (These text responses would provide an opportunity for the customer to share details about the excellent customer service and confusing user experience.) Once a double-barreled question is teased apart into separate questions, you’ll get the clear answers you need to take action on respondents’ feedback.
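
If it helps to picture the payoff, here’s a minimal sketch in Python (the ratings, comment text, and field names are purely illustrative, not real survey data) of how splitting the double-barreled item into two rated questions lets you score product quality and customer support separately:

```python
# Illustrative only: two separate 1-5 ratings plus an optional open-ended comment.
responses = [
    {"product_rating": 2, "support_rating": 5, "comment": "Great rep, confusing manual."},
    {"product_rating": 3, "support_rating": 4, "comment": ""},
    {"product_rating": 2, "support_rating": 5, "comment": "Programming modes are unclear."},
]

def average(scores):
    return sum(scores) / len(scores)

# Because the two topics are separate questions, each gets its own clear score.
print("Product quality:", average([r["product_rating"] for r in responses]))   # ~2.3
print("Customer support:", average([r["support_rating"] for r in responses]))  # ~4.7
```

With the original combined question, those two very different scores would have been collapsed into a single, muddled number.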

Even as you keep double-barreled questions out of your surveys, remember that they aren’t the only kind of survey question mistake. Leading questions are close cousins to double-barreled questions. Since they generally confuse respondents and muddy results, you should do everything possible to keep them out of your surveys.

Here’s how you can spot a leading question: It has bias, opinion, or non-neutral language that can potentially sway a respondent to a particular way of thinking. When respondents’ answers are influenced by leading questions, you can’t count on accurate survey results that truly reflect their opinions and experiences.

Let’s say a company is sending an employee engagement survey to its employees. Here’s an example of a leading question around diversity, equity, and inclusion (DEI):

“Our company has been rated one of the most inclusive in our industry. How would you rate our dedication to diversity and inclusiveness?”

In this case, the sentence that appears prior to the question has set a certain level of expectation for respondents and could ultimately affect how they view (and rate) DEI at their organization.

Sometimes leading questions include unnecessary adjectives or descriptors. For instance, let’s say a post-event feedback survey asked:

“How likely is it that you would recommend this popular event to a friend or colleague?”

Saying “popular event” rather than simply “event” guides the respondent to consider the event’s prestige, which may alter how likely they are to recommend it. The best way to keep leading questions out of your surveys is to make sure the wording of your questions is neutral and focused on the exact issue you want respondents to address.

Okay, so double-barreled questions and leading questions are definite no-nos. What else do you need to know about problematic survey questions and related survey mistakes? Here’s a quick cheat sheet to keep in mind:

Loaded questions are another survey don’t. They make an assumption about the respondent and force them to provide an answer they may not agree with or find applicable to them. For example:

  • Have you painted the exterior of your house in the past year?

Since this question assumes that the respondent lives in and owns a house, a response from a renter or a condo owner would likely not lead to the kind of survey data that the survey creator needs.  

The best way to eliminate loaded questions and create more inclusive surveys is to ask preliminary screening questions and use skip logic to ensure that respondents are only seeing and answering questions that apply to them. 
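
As a rough illustration of how screening plus skip logic works (this is a generic sketch in Python, not any particular survey platform’s feature or API, and the question wording and flow are hypothetical), a screening question decides which follow-up questions a respondent ever sees:

```python
# Illustrative skip logic: respondents only see questions that apply to them.
def next_question(answers):
    """Return the next question to show, based on earlier answers (hypothetical flow)."""
    if "owns_house" not in answers:
        return "Do you own a house? (Yes/No)"  # preliminary screening question
    if answers["owns_house"] == "Yes":
        return "Have you painted the exterior of your house in the past year? (Yes/No)"
    return None  # renters and condo owners skip the painting question entirely

# A homeowner sees the follow-up; a renter never does.
print(next_question({}))                      # -> screening question
print(next_question({"owns_house": "Yes"}))   # -> painting question
print(next_question({"owns_house": "No"}))    # -> None (question is skipped)
```

The point is simply that the painting question is only ever asked of the people it applies to, which is what skip logic handles for you inside a survey builder.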

You should always strive for clarity and specificity in your survey questions. Ambiguous language (i.e., any wording that isn’t immediately clear) will not only slow respondents down as they puzzle over what you mean; it will also impact the accuracy of their answers. For instance, it’s better to ask customers specific questions about how your product’s quality, price, or user experience compares to other brands than to ask them to agree or disagree with a more ambiguous statement like “Our product is better than our competitors.”

Nobody likes being backed into a corner, even in the context of survey responses. Absolutes in survey questions include words like “always,” “never,” “every,” and “all,” and they typically force respondents into rigid yes/no answer options. For example: 

  • Do you always get at least 8 hours of sleep? (Yes/No)

The “always” in this question, along with the yes/no answer choices, creates an extremely rigid survey experience for respondents. To avoid this, just say no to absolutes in your survey questions.

We touched on how bias is a big part of leading questions. It can also show up in your surveys as research bias, which spans your surveying methodology, your target population (or lack thereof), and whether your questions’ answer options are exhaustive and inclusive. Whether you’re keeping an eye out for bias in the wording of your questions or in the pre-planning stage of your survey, it’s important to carefully examine how your survey questions could be perceived and whether you’re staying true to your survey’s purpose and goals.

Double-barreled questions, along with the other survey mistakes we’ve outlined here, can happen to the best of us. Luckily, you can counteract the question chaos by keeping these tips in mind and taking advantage of resources like our expert-written survey templates and SurveyMonkey Genius.