What if We Treated Online Surveys like Interactions?

Survey researchers could draw richer conclusions by learning from discourse scholars

by Jena Barchas-Lichtenstein, Joshua Raclaw, and Abby Bajuniemi
Oct 2, 2020

Many researchers assume that the “content” of open-ended survey responses can be separated from the “form” of those responses. But this assumption may clash with the expectations of survey-takers, who often write their responses conversationally. How, then, should researchers approach survey responses? The key may lie in principles and methods from discourse linguistics. In this paper, three researchers from different linguistic traditions explore open-ended survey responses. They suggest simple techniques for survey researchers, evaluators, and anyone who designs surveys -- with or without a background in linguistics.

Let's Put It to Work

This research is particularly useful for survey designers and analysts. Here are two ways that you can approach survey responses as interactions:

Look for places where people write more than expected or required. We encourage survey analysts to treat a sequence of survey questions and respondents' answers as an interaction, and consider the way in which respondents answered, not just the answers they provided. One way to do this is to look carefully at responses that include more than "minimal" information, and consider what information can be gained from these responses. If respondents are volunteering information that’s not directly asked for, this suggests that they find it important and relevant to the topic of the survey. When this happens, researchers might consider adding more open-ended questions to the survey.
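For analysts who work with survey exports in a spreadsheet or in code, a rough first pass is to flag responses that run longer than the question strictly requires, and then read those responses closely. The sketch below is only an illustration of that idea, not a method from the paper; the column names, example responses, and five-word threshold are all assumptions.

```python
import pandas as pd

# Illustrative data: the column names, responses, and word-count
# threshold below are assumptions for this sketch, not drawn from the study.
responses = pd.DataFrame({
    "respondent_id": [1, 2, 3],
    "q1_share_news": [
        "Yes.",
        "Probably not, I don't really trust that outlet.",
        "Well, it depends on who I'm talking to that day.",
    ],
})

# Flag answers that volunteer more than "minimal" information,
# approximated here as more than five words.
WORD_THRESHOLD = 5
responses["q1_word_count"] = responses["q1_share_news"].str.split().str.len()
responses["q1_elaborated"] = responses["q1_word_count"] > WORD_THRESHOLD

# Pull out the elaborated responses for closer qualitative reading.
print(responses.loc[responses["q1_elaborated"], ["respondent_id", "q1_share_news"]])
```

A length threshold only surfaces candidates; the analytical work is still in reading what respondents chose to add.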

Analysts should also be mindful of signs that a question is difficult to answer. Responses that start with “well,” for example, may indicate that respondents are having trouble: they might be rejecting the premise of the question, or rejecting what they see as the expected answer.
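A similarly rough filter can surface responses that open with “well” or another hedge, so that an analyst can read them by hand. Again, this is a hedged sketch: the column name, example responses, and marker list are illustrative assumptions, not the paper’s method.

```python
import pandas as pd

# Illustrative data; the column name, responses, and marker list are
# assumptions, not drawn from the paper's data sets.
responses = pd.DataFrame({
    "respondent_id": [1, 2, 3],
    "q2_relevance": [
        "Very relevant to me.",
        "Well, I guess it matters if you live there.",
        "Not relevant.",
    ],
})

# Flag responses that open with "well" or a similar hedge, which may
# signal that the respondent is having trouble with the question.
responses["q2_trouble_marker"] = (
    responses["q2_relevance"]
    .str.strip()
    .str.lower()
    .str.match(r"(well|hmm|i guess)\b")
)

# Read these responses by hand: they may reject the question's premise,
# or reject the answer the respondent believes is expected.
print(responses.loc[responses["q2_trouble_marker"], "q2_relevance"])
```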

Read across, not down. When survey results are in spreadsheets, each person’s responses typically appear in one row, while all participants’ responses to a given question appear in one column. Researchers often pay attention to the columns -- they analyze each question’s responses separately. But there is useful information to be gained from a different approach. Instead of considering responses to each open-ended question separately, analyze the rows of responses. In particular, consider groups of related open-ended questions together. Doing so makes it possible to identify patterns that would otherwise stay hidden. It also supports one of the key goals of social science research: shifting the unit of analysis from the response to the respondent.
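Here is one way this might look in practice, as a hedged pandas sketch: the survey columns below are hypothetical, and joining related open-ended answers into a single text per respondent is just one simple way to shift the unit of analysis.

```python
import pandas as pd

# Hypothetical survey export: one row per respondent, one column per question.
df = pd.DataFrame({
    "respondent_id": [1, 2],
    "q3_why_share": ["Well, only if it's local news.", "My family likes sports stories."],
    "q4_who_with": ["Mostly coworkers.", "My brother, who coaches."],
    "q5_anything_else": ["", "Sports bring us together."],
})

# Reading down: each question's responses analyzed on their own.
for question in ["q3_why_share", "q4_who_with", "q5_anything_else"]:
    print(question, df[question].tolist())

# Reading across: combine a group of related open-ended questions so the
# unit of analysis becomes the respondent rather than the single response.
related = ["q3_why_share", "q4_who_with", "q5_anything_else"]
df["combined_open_ended"] = df[related].agg(" | ".join, axis=1)
print(df[["respondent_id", "combined_open_ended"]])
```

Reading the combined column respondent by respondent makes it easier to notice when the same concern surfaces across several questions.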

Get the Data. We used two data sets for this study, which are free to use. The News Sharing data set features questions about people's willingness to share news stories with other people. The News Relevance data set focuses on people's assessments of relevance in news stories.

About the Researchers & this Study

Joshua Raclaw is an Assistant Professor in the Department of English at West Chester University, Jena Barchas-Lichtenstein leads media research at Knology, and Abby Bajuniemi is a Human Factors and UX Research consultant currently partnered with Gomoll Research + Design. While the writing itself was not funded, the data analyzed for this paper were collected with support from the National Science Foundation under Grant No. 15163471 and the National Institutes of Health under Grant No. 1R25OD0202212-01A1.
