Panels. An entire industry exists solely for the purpose of providing interviewees to survey researchers. Each firm has its own way of recruiting people to take surveys, and those differences do, in fact, cause differences in the data. So the proverbial question for researchers buying interviews is "who is right, and how do we know?"
Well, we know what the truth is, in theory: probability samples represent populations. In the absence of probability sampling, though, what would we have to do to get close to the results a probability sample would yield, without all of the costs and lengthy field times? The first step is to identify a source with great coverage, that is, a mechanism for inviting people whereby everyone is eligible for the study. The most common tools are addresses and phone numbers; nearly everyone has one or the other, but data collection through those channels is time-consuming and expensive. Thankfully, there are other touch points that might come close to the coverage of those channels; the question is whether they are suitable for inviting people to take surveys. For example, Google and Facebook have very broad reach, but being served a survey before you conduct a search or update your wall would seem a bit out of place.
Here at SurveyMonkey, we have a great deal of reach, and people come to us already in survey-taking mode. We process more than a million survey responses each day: opinions given to surveyors that, collectively, reflect hundreds of thousands of research objectives. The sheer diversity of the surveys and their content, from bake sales to business strategy, all but guarantees that the people responding are just as diverse. In a way, this model is the proverbial wisdom of the crowd, and a really big crowd at that. Take a look. The two maps below show the population density of the United States and SurveyMonkey traffic by location for one month:
Our traffic seems to reflect population patterns pretty well. To determine just how diverse SurveyMonkey traffic is, we put it to a test. Each day for three weeks during the summer of 2010, we invited a random subset of people who had just completed a survey on our system to answer one more question: "Do you approve or disapprove of the way Barack Obama is handling his job as president?", the American National Election Study's classic measure of presidential approval. Fully 87,000 people responded (a 46% response rate), from more than 8,300 of the 19,000 American cities, and their answers matched the results of Gallup's RDD studies within the margin of error nearly every day, without the use of statistical weighting.
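For readers curious what "within the margin of error" means at this scale, here is a minimal sketch of the standard 95% margin of error for a sample proportion. Note the usual caveat: this formula assumes a simple random sample, so for a nonprobability panel it is only a nominal figure; the sample size of 87,000 is from the study above, while the 50/50 split is just the worst-case assumption for variance.

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a sample proportion p from n responses,
    under the simple-random-sample assumption: z * sqrt(p(1-p)/n)."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) with the 87,000 responses pooled across the study:
moe = margin_of_error(0.5, 87000)
print(f"+/- {moe:.4f}")  # roughly +/- 0.0033, i.e. about a third of a point
```

The practical point is that with tens of thousands of responses, sampling variability is tiny, so any daily gap between two surveys is driven almost entirely by who gets sampled, not by sample size.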
This map shows the location of responses to our presidential study and, again, the locations look pretty good. Now, for one last test. It’s one thing to be able to show people are where they ought to be, but do the attitudes of the people in certain locations match our expectations?
The map above is a picture of approve/disapprove responses by location and, compared to the map next to it (2008 voting patterns), the "anti-Obama" folks are exactly where we'd expect them to be.
These results gave us the confidence that we could create a great panel. And so we did! That’s how SurveyMonkey Audience was born. It’s been up and running for a few months now, and has proven invaluable to customers. No middleman necessary. Ask your questions. Get your answers. For us, it’s a way to return the wisdom of crowds back to the crowds, if you will.