Survey research, one of the highest-paying social science careers, revolves around designing surveys to gather data and analyzing the results. These surveys can take a variety of forms. Although it may seem like the form of a survey would depend purely on the convenience and preferences of the researcher, that’s not the case. The method by which a survey is conducted can affect the way participants answer it, even if the questions posed use precisely the same language, according to the Pew Research Center.
Different Ways of Conducting Research Surveys
The term “survey” may conjure up images of a sheet of paper printed with questions and multiple-choice answers, but research surveys can be conducted in many different ways. Written surveys can be mailed, administered in person or conducted electronically through an online platform. Survey interviewers often call potential participants on the phone to administer surveys. In person, a survey might also take the form of a one-on-one interview or a focus group.
Besides figuring out what questions to ask and how to word them, a big part of designing a research survey is determining the mode to use for conducting it.
More Honest Answers Without an Interviewer
To get the information they’re looking for, survey researchers must sometimes ask highly personal questions. Answering questions of this nature – such as queries about the participant’s physical or mental health, financial state, sexual history or history of drug or alcohol use – can make the respondent feel uncomfortable. After all, these are topics they would be unlikely to discuss with a stranger in a regular setting, and in this instance, the stranger is recording their answer for further use and study.
When it comes to these sensitive questions, the more the respondent feels that an “interviewer” is asking the question, the less likely they are to answer completely truthfully. That means that, for research questions in these areas, online survey research may be more likely to yield accurate results than a survey conducted over the phone by a human interviewer.
Phone survey data may still be more accurate than data gained through surveys given in the format of an in-person interview, in which the respondent is sitting face-to-face with the interviewer and being watched.
The Impact of Answer Choice Order
Even the order in which your survey lists answer options can sway how a participant responds to a question. Whether the question is presented in writing or asked orally by an interviewer over the phone or in a face-to-face setting matters in determining which answer is more likely to get a participant’s attention.
When taking a written survey, as most online surveys are, respondents are usually more likely to choose the first answer they read, according to the Pew Research Center. On the other hand, humans process spoken words differently than they do written words. Rather than being drawn to the first option they read, participants in a phone or in-person interview are more likely to cling to the last answer they heard. This option is the freshest option in their minds, so they are likely to choose it, particularly if they don’t have a strong opinion on the available options.
The longer the list of options and the more complex the language of each answer, the more difficult it may be for participants in a phone or face-to-face interview to recall the full language of each choice, rather than settling for the choice they heard most recently.
The Impact of Satisficing
When survey takers choose an answer that is just “good enough,” it’s not so good for the quality of the researcher’s data. This practice of choosing an answer that is satisfactory or acceptable, but not necessarily the best or most accurate, is known as satisficing. Researchers investigating this phenomenon have collected data suggesting that satisficing may be a more common practice in online surveys and that the result could be a decline in the quality of the survey results, according to an article published in Public Opinion Quarterly.
Specifically, when participants take a survey online, they’re more likely to choose an answer like “don’t know,” which isn’t very telling for research purposes, or to skip survey questions altogether, researchers reported.
Choosing a Survey Mode
If there are so many drawbacks to each different kind of survey mode, you may wonder how any researcher ever gets accurate information. Part of making sure you get the best data possible is being aware of the limitations and potential pitfalls of each mode of interviewing and crafting questions in such a way that they minimize these issues.
For example, you may decide to limit the number and complexity of answer options to questions that are posed over the phone in an effort to make it easier for participants to consider each option equally. If you wish to ask about a sensitive topic, you might decide to use an online survey to avoid the interviewer effect that reduces the honesty of answers. However, you also might make that answer field required to continue the survey, so participants don’t skip the question. Thinking critically about what answer options to include, and avoiding choices like “don’t know,” can also help limit the effects of satisficing for online survey respondents.
How you choose to word a question is another factor that influences the quality of your data, regardless of whether the question is asked online, by a phone interviewer or in person. Language that is vague or seems negative may result in poorer quality of data.