Survey Science

Frequent surveys make a big difference

You’re about to release a new product and you’re itching to find out what people will think of it. A quick survey of your customer base shows that 95% of people love the idea of your product. Your boss tells you to increase production and advertising, since you’re now expecting massive sales. Unfortunately, when you finally get the first sales numbers, you’re shocked—sales are way below expectations. What’s going on?

Well, let’s say that product is a super-high-powered air conditioning unit, that the first survey asking whether people would buy it went out in August, and that production on the new units finished in September and they were in stores in October. Okay, so air conditioning units don’t sell so well in October; maybe you don’t need a survey to tell you that, but it’s not always that easy to spot. Now let’s say you sent out the survey in June instead, production managed to spit out the units a month later, and there were still no sales. It’s still summer and it’s still hot. Now what’s going on?

Well, a quick survey of your customer base might let you know that people are worried about their electric bills, or about the possibility of blackouts during heat waves, or about polluting the environment because a new documentary just came out. Or maybe they just think the new units look ugly or too big. How do you know which problem it is? Ask!

When it comes to surveys, you want to ask early and often. Asking early gives you a running start to address any problems or concerns, and continuing to survey helps you avoid unwanted surprises.

You should survey repeatedly for three reasons: to check reliability, to measure change, and to build a benchmark.

When you measure something more than once, you can check that your measurement tool is consistent. For example, if you know that you’ve been eating the same amount of food lately, you’d feel more confident in your scale if it read roughly the same weight each time (within a few pounds or so). The same goes for surveys. If you send out the same survey multiple times and get the same or similar results each time, you can be more confident that the survey really is getting at satisfaction with the product than if you only sent it out once. You’ll be able to put more stock in your survey as a legitimate tool to find out what people think of your product.
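If it helps to see that idea in action, here’s a minimal sketch in Python of comparing results from repeated waves of the same survey; the wave numbers and the “within a few points” tolerance are invented for illustration, not real data.

```python
# Hypothetical results ("% who are satisfied") from sending the same
# survey in three separate waves -- the numbers are made up for illustration.
waves = {"June": 62.0, "July": 65.0, "August": 63.5}

scores = list(waves.values())
spread = max(scores) - min(scores)

# Like a scale that reads roughly the same weight each time, a small spread
# across waves suggests the survey is measuring consistently.
TOLERANCE = 5.0  # assumed "within a few points" threshold
if spread <= TOLERANCE:
    print(f"Spread of {spread:.1f} points across waves: results look consistent.")
else:
    print(f"Spread of {spread:.1f} points across waves: worth a closer look.")
```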

As much as we all would like surveys (and scales) to be reliable, sometimes they aren’t. Sometimes you feel like you haven’t been eating more but your scale says you’re gaining weight—and it’s not the scale that’s the problem. Surveys are the same way. A change in your survey results doesn’t necessarily mean that your survey is the problem. Attitudes and preferences do change! People are more excited about the prospect of a vacation on a Tuesday than on a Friday. People are more likely to want steak at 7 PM than 9 AM. People are more likely to buy a certain kind of deodorant after an advertisement for it goes viral on YouTube. Change doesn’t always mean the questions being asked are bad—it might only mean that something has actually changed. A good survey will be able to capture that shift. Still, the bottom line is that the only way to be able to measure a change is if you are measuring more than once. You don’t know if you’ve gained weight in the past year if you don’t know how much you weighed both in 2011 and in 2012.

The last reason is a dose of relativity. Let’s say you want to know what people think of your company’s sponge product right now, and you’ve never asked before. You find out that 64% of people like the sponge. But how do you know if that’s good or bad? What if most people just don’t like sponges in general, which means 64% is a great number? Or what if most people adore sponges, so 64% is terrible? You could compare your results to how many people like your major competitor’s sponge, but what if your competitor’s sponge is a regular sponge, while yours is a robot sponge that can clean on its own? People might hate a regular sponge but love a sponge that cleans for you. In that case, benchmarking against a competitor’s product won’t help, since yours is completely different. Instead, you can send out multiple surveys asking about your product, so that over time, you build a baseline for satisfaction with your product. Then, any time you want to gauge satisfaction, you have a handy baseline to compare your results against.
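To make the baseline idea concrete, here’s a small Python sketch of building a baseline from past waves of the same survey and comparing the newest result against it; all of the figures are invented for illustration.

```python
# Hypothetical "% who like the sponge" results from past waves of the same survey.
past_waves = [61.0, 63.0, 66.0, 64.0]  # invented numbers
latest = 64.0                          # this wave's result

# Build the baseline from your own history rather than from a competitor's
# (very different) product, then compare the newest wave against it.
baseline = sum(past_waves) / len(past_waves)
change = latest - baseline

print(f"Baseline satisfaction: {baseline:.1f}%")
print(f"Latest wave: {latest:.1f}% ({change:+.1f} points vs. baseline)")
```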

The bottom line? Send out the same survey multiple times and you can make better decisions with higher confidence. That’s a win-win scenario. (A third win would be creating a robot sponge.)

How often are you sending out surveys? Have you noticed any trends in the data you’ve collected? Let us know in the comments section below.