Survey Science

Don’t know?

Don’t know whether to include a “Don’t Know” (or “Not Applicable”) response option for one of your questions? Don’t worry, you’re not alone. Here are some do’s and don’ts.

The debate over whether to include a “Don’t Know” (DK) option starts with a philosophical trade-off between valuing “clean” data and valuing the quantity of data; both positions are valid. The latter is grounded in the reality that survey data is, in most cases, expensive to come by, given the time required to collect it and the costs of fieldwork. In that light, researchers who offer a DK option may be throwing away what could otherwise have been useful answers from respondents drawn to the ease of a satisfactory, but ultimately unverifiable, response.

But assuming we can spare a few of these satisficers, one advantage of a DK option is that it reduces noise from people who pick a closed-ended response despite having little basis for it. Respondents who haven’t thought much about a topic, or who lack the relevant experience or attitude, are indistinguishable from those who have it because, after all, selecting a radio button in a survey looks the same in both cases.

So the answer to whether to include a DK response option is: it depends. It depends on whether you can afford to throw out a bit of data in order to make the data you keep cleaner. Two other factors help with the decision: the demographics of the people being surveyed and the nature of the question.
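The trade-off described above can be sketched in a few lines of code. This is a minimal illustration with made-up responses, not anything from SurveyMonkey's platform: dropping DK answers gives a cleaner estimate of opinion among people who actually have one, while the DK rate tells you how much data you gave up to get it.

```python
# Hypothetical responses to a single closed-ended question with a DK option.
responses = ["Agree", "Don't Know", "Disagree", "Agree", "Don't Know", "Agree"]

# Keep only substantive answers; the rest is the data we "throw out."
substantive = [r for r in responses if r != "Don't Know"]

# Share of responses discarded, and the cleaner estimate computed on what remains.
dk_rate = 1 - len(substantive) / len(responses)
agree_share = substantive.count("Agree") / len(substantive)

print(f"Usable responses: {len(substantive)} of {len(responses)}")
print(f"DK rate: {dk_rate:.0%}")
print(f"Agree, among substantive answers: {agree_share:.0%}")
```

Whether a DK rate like the one above is wasted sample or a finding in its own right is exactly the judgment call the next example turns on.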

For example, if your survey question is “What is the GDP of China?” and your sample is made up of second graders, you might want to include a DK option. If, on the other hand, the sample is made up of American adults, then a result like “20% of respondents answered ‘Don’t Know’” is a real finding that may have important implications for decisions about school curricula. In short, if you expect that respondents ought to know the answer to a question (like “How tall are you?”), don’t include a DK option. But if not knowing is plausible, and even important to learn, then include one.

Now you know.