Our intrepid survey scientist is back on the blog as promised to share more of her findings from 3MC, a conference that brings survey researchers from around the world together to talk shop and discuss all things survey research.
Whether you’re designing a survey to send around the world or just to your friends and family, this conference proved to me that it’s essential to keep cultural differences and contexts in mind.
Here are three things I learned this week that might help you out the next time you’re creating a survey on SurveyMonkey.
1. Tell a story
…to make your response options comparable. A lot of surveys have questions that ask respondents to give a rating on some verbal or numerical scale. But how do you know one person’s rating of 8/10 is the same as another’s?
For surveys of diverse, multi-cultural groups, this is a particular problem, as plenty of research has shown that different cultures have different norms about the acceptable or typical ways of responding to survey questions. For example, people in Asian countries tend to choose the middle option of a five-point scale, while people in the US are more likely to choose one of the extreme (either-end) options.
A team of researchers at the University of Michigan wanted to see if they could correct for this variability. They asked respondents in Sweden, China, and the US to rate the level of their own pain using the following scale: None, Mild, Moderate, Severe, Extreme. Respondents in Sweden and China had similar distributions of responses, with about half of respondents saying “None” and the rest distributed across the other four categories.
In the US, however, about 50% of respondents said “Mild,” with the rest about evenly distributed among the remaining categories. Were these differences in self-rated pain real? Probably not!
The researchers were able to correct for this variation by presenting respondents with short vignettes demonstrating what each level of self-rated pain means (e.g. “Mild: Tim has a headache that goes away shortly after taking an aspirin”). Once respondents had read the vignettes, the cross-country bias in responses disappeared.
2. Choose your language wisely
Are you running a survey in a country with a diverse, multi-lingual population? (Hint: if you’re sending your survey in the US, you definitely are.) You might think it’s best to offer your survey in whatever language your respondents prefer, but keep in mind that language cues a person’s cultural framework—and often her responses to survey questions.
Emilia Peytcheva at RTI presented research examining responses to survey questions by Spanish-English bilingual respondents who answered either in Spanish or English. She found that some questions elicited different responses depending on the language of the survey itself.
For example, in the National Latino and Asian American Survey, bilingual respondents who answered in Spanish gave higher ratings of family pride than bilingual respondents who answered in English. Similarly, in the New Immigrant Survey, bilingual respondents answering in Spanish reported higher numbers of biological children than bilingual respondents answering in English.
These results indicate that bicultural respondents may answer the same question differently, depending on the language of the interview and the way they perceive the intent and sensitivity of that question in that language.
3. Don’t rely on social media
…because its users aren’t representative of the people you actually need!
If you’re conducting an international survey, you might think Facebook is an easy way to recruit respondents. Everybody has a Facebook account, right? Not so fast.
Researchers at Hokkaido University in Japan wanted to test whether they could get nationally representative estimates by recruiting responses through Facebook advertisements.
They ran advertisements on Facebook to recruit people from around the world to participate in the World Relationships Survey, which measures the freedom people have in choosing their interpersonal relationships. The ads were targeted to each country, with culturally appropriate graphics and text that emphasized the benefit of participating: the ability to compare your own responses to everyone else’s.
They found that Facebook ad recruiting was very cost effective in most countries: on average they paid about $1 per valid response. However, the responses that they got were not particularly representative of the national populations. Due to Facebook’s targeting algorithm, respondents were younger and more female than a truly representative sample from each country would be.
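One standard remedy for this kind of demographic skew (not something the Hokkaido team described, just a common survey-statistics technique) is post-stratification weighting: each respondent gets a weight equal to their group’s population share divided by that group’s share of the sample, so under-represented groups count for more. Here’s a minimal sketch in Python, with made-up sample counts and population targets:

```python
# Post-stratification weighting: reweight an unrepresentative sample so its
# demographic mix matches known population proportions.
# All numbers below are hypothetical, for illustration only.

def poststrat_weights(sample_counts, population_props):
    """Weight for each stratum = population share / sample share."""
    total = sum(sample_counts.values())
    return {
        stratum: population_props[stratum] / (count / total)
        for stratum, count in sample_counts.items()
    }

# Hypothetical Facebook-recruited sample: skews young and female.
sample = {"young_f": 450, "young_m": 250, "old_f": 200, "old_m": 100}

# Hypothetical census targets: each of the four strata is 25% of the population.
population = {"young_f": 0.25, "young_m": 0.25, "old_f": 0.25, "old_m": 0.25}

weights = poststrat_weights(sample, population)

# Older men are under-represented in the sample, so they get the largest weight.
print(weights["old_m"] > weights["young_f"])  # True
```

The catch is that large weights inflate the variance of your estimates, so a handful of hard-to-reach respondents can end up dominating the results; it’s usually better to recruit a balanced sample in the first place than to fix a badly skewed one afterward.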
Instead of relying on social media to recruit respondents, might we recommend SurveyMonkey Audience? Our non-probability panel allows you to purchase responses from around the world, and you can specify the exact balance of age, gender, and other characteristics that you want. Plus, we’ve already demonstrated that it yields reliable and valid data.
There you have it: three tips gleaned and you didn’t even have to spend a perfectly beautiful summer week in a conference room like I did.
Questions? Comments? Leave them below!