A new hybrid approach to surveys developed by Princeton researchers combines the data-gathering advantages of interviews with the lower cost and analytical simplicity of traditional surveys, yielding insights that would be difficult to obtain with other methods.
These “wiki surveys” — inspired in part by the constant, user-fueled evolution of websites like Wikipedia — begin with questions and a set of potential answers as in a traditional survey. The wiki surveys then continue to change as participants offer new potential answers, which are shown to future participants.
“On the one hand, we have methods that are open to new information, like interviews, but these are generally slow, expensive and hard to quantify,” said Matthew Salganik, a Princeton professor of sociology and one of the researchers. “On the other hand, we have methods that are good at quantification, like traditional surveys, but they tend to be closed to new information. So this wiki survey project combines the quantifiability you get from a survey with the openness you get from an interview.”
The new approach to surveys and two examples of “pairwise” wiki surveys — in which participants are presented a question and two possible answers — are described in an article, “Wiki Surveys: Open and Quantifiable Social Data Collection,” published in May in the journal PLoS ONE. The authors are Salganik and Karen Levy, who earned her Ph.D. at Princeton and is now a postdoctoral associate at New York University.
One of the examples is a survey created by the New York City Mayor’s Office of Long-Term Planning and Sustainability as part of its effort to update the city’s sustainability plan, PlaNYC 2030. City officials wanted to incorporate ideas from residents, so they created a wiki survey that asked: “What do you think is a better idea for creating a greener, greater New York City?” The officials generated a list of 25 possible answers.
Participants saw the question and were presented with two possible answers. They could choose whichever they preferred (or “I can’t decide”) and then would see another pair, continuing for as long as they liked.
Instead of choosing either supplied answer, though, participants could offer their own. After being screened by survey organizers for relevance, that answer would be added to the stock of possible responses and be shown to future participants. Almost 1,500 people took the survey over four months, and they submitted 244 alternative responses that other participants could choose.
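The mechanics described above — showing random pairs of answers, recording choices, and folding screened new answers into the pool — can be sketched in a few lines of Python. This is a minimal illustration with hypothetical names, not the researchers' actual implementation (their software ranks answers with a statistical model rather than the simple win ratio used here):

```python
import random

class PairwiseWikiSurvey:
    """Minimal sketch of a pairwise wiki survey: participants see two
    answers, pick one, and may contribute new answers that are screened
    and then shown to future participants."""

    def __init__(self, question, seed_answers):
        self.question = question
        self.answers = list(seed_answers)          # current answer pool
        self.wins = {a: 0 for a in seed_answers}   # times chosen
        self.shown = {a: 0 for a in seed_answers}  # times presented

    def next_pair(self, rng=random):
        # Present two distinct answers drawn at random from the pool.
        return rng.sample(self.answers, 2)

    def record_vote(self, winner, loser):
        # A participant choosing "I can't decide" would simply skip this.
        self.wins[winner] += 1
        self.shown[winner] += 1
        self.shown[loser] += 1

    def contribute(self, new_answer, approved=True):
        # New answers enter the pool only after organizers screen them.
        if approved and new_answer not in self.wins:
            self.answers.append(new_answer)
            self.wins[new_answer] = 0
            self.shown[new_answer] = 0

    def ranking(self):
        # Naive score: fraction of appearances won. Illustrative only;
        # the published method uses a model-based estimate instead.
        return sorted(self.answers,
                      key=lambda a: self.wins[a] / max(self.shown[a], 1),
                      reverse=True)
```

In use, each participant would repeatedly call `next_pair`, vote via `record_vote`, and optionally submit a new answer through `contribute`; over many participants, `ranking` surfaces the answers that win their pairwise matchups most often.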
In the end, eight of the top 10 answers were contributed by participants.