Online vs. Telephone: A Tale of Two Survey Methodologies
Question: Which type of survey is “better” – telephone or online (internet-based)?
In the earliest days of scientific survey research, nearly all surveys were conducted face-to-face. When telephone service became nearly ubiquitous in the U.S., telephone interviewing began to overtake face-to-face and mail as the preferred mode. Now a growing number of academic institutions, media outlets, and research firms are migrating to online surveys, viewing them as faster, cheaper, and (some would say) better than telephone.
Declining Telephone Research Quality
Prime Group was a very early adopter of online survey research. Around the year 2000 we observed (and experienced) the growing challenges to the use of telephone interviewing. The explosion of mobile-only households (now at nearly 50% of the U.S. population), greater use of caller ID, and the proliferation of telephone surveys and telemarketing exerted combined downward pressure on response rates, which today stand at roughly 5-10% and are still falling. The significantly higher costs and lower response rates associated with mobile phone interviewing have been exacerbated by federal regulations prohibiting the use of predictive dialing with mobile phones. It was clear to us that the future of scientific survey research was NOT in telephone interviewing.
The Rise in Quality of Online Research
Meanwhile, conducting surveys via self-administered online interviews was becoming increasingly feasible, reliable, and attractive. For many clients, such as membership organizations and higher education institutions that maintain their own comprehensive email lists, drawing representative samples from the target population, or even conducting a "census" of the population, became a very feasible and attractive option. For other populations, such as the general public, voters, opinion leaders, and professionals like physicians and teachers, sophisticated approaches to the use of research panels came online and grew increasingly reliable.
With the ability to conduct reliable, projectable online surveys, we set about developing a much more innovative and powerful approach to strategic message testing. The result is our unique M3 design — MaxDiff Message Modeling. Built on a MaxDiff (maximum difference scaling) platform of the kind commonly used for commercial product testing, M3 employs a "forced-choice" methodology that requires respondents to make a series of iterative choices among multiple message options. With a sample of n=1,000, the exercise generates more than 40,000 unique data points, producing the precision and differentiation so sorely lacking in traditional telephone-based message testing.
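To illustrate the mechanics, the sketch below simulates a generic MaxDiff exercise and the simple "counting" analysis often used to summarize one. All parameters (eight messages, 10 screens per respondent, 4 messages per screen, the noise model) are hypothetical illustrations, not the actual M3 design: each respondent repeatedly picks the most and least appealing message from a small set, and each screen contributes multiple choice observations — which is how a sample of 1,000 quickly accumulates tens of thousands of data points.

```python
import random
from collections import defaultdict

random.seed(42)

# Hypothetical setup -- not the actual M3 configuration.
MESSAGES = [f"Message {c}" for c in "ABCDEFGH"]
N_RESPONDENTS = 1000
TASKS_PER_RESPONDENT = 10   # screens shown to each respondent
ITEMS_PER_TASK = 4          # messages per screen

# Hidden "true" appeal, used only to simulate respondent behavior.
true_appeal = {m: i for i, m in enumerate(MESSAGES)}

best_counts = defaultdict(int)
worst_counts = defaultdict(int)
shown_counts = defaultdict(int)

for _ in range(N_RESPONDENTS):
    for _ in range(TASKS_PER_RESPONDENT):
        screen = random.sample(MESSAGES, ITEMS_PER_TASK)
        # Forced choice: pick the best and worst message (with noise).
        noisy = {m: true_appeal[m] + random.gauss(0, 2) for m in screen}
        best = max(screen, key=noisy.get)
        worst = min(screen, key=noisy.get)
        best_counts[best] += 1
        worst_counts[worst] += 1
        for m in screen:
            shown_counts[m] += 1

# Counting analysis: best-minus-worst score, normalized by exposure.
def score(m):
    return (best_counts[m] - worst_counts[m]) / shown_counts[m]

for m in sorted(MESSAGES, key=score, reverse=True):
    print(f"{m}: {score(m):+.3f}")
```

Even this toy design records 20,000 explicit best/worst choices (1,000 respondents × 10 screens × 2 picks), and each pick also implies several pairwise comparisons among the messages on the screen, so the total number of usable observations grows quickly with the number of screens and items per screen.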
So Which Approach is Better?
Occasionally, though less and less frequently, we are asked whether online surveys are reliable and how their results compare to the results of telephone surveys. The short answer is: it's complicated (as a brief run through the attached academic literature on the subject will demonstrate). But one way to measure the relative accuracy of telephone and online surveys is to see how effective each type was in predicting the outcome of a single election. After the 2012 presidential election, Nate Silver (whose FiveThirtyEight blog was then hosted by the New York Times) did just that. Silver evaluated and rated the accuracy of 23 polling organizations that had been regularly tracking the presidential race. Four of the seven most accurate organizations conducted their surveys exclusively online, while Gallup, which used only telephone interviewing, came in last. More recently, online surveys consistently predicted the outcome of the British Brexit vote more accurately than did telephone polling.
To put it another way: no single research approach is perfect. Researchers are constantly evolving and adapting to fit the changing preferences of the populations we are trying to reach. Trade-offs must be made, but the bottom line is that high-quality research is, and (we believe) will always be, achievable. At Prime Group we generally find that the positives of online research outweigh its limitations, while that is less and less often the case for telephone research.
Online vs. Telephone Modes
- Wells, Anthony. 2016. "What We Can Learn from the Referendum Polling." UK Polling Report: Survey and Polling News from YouGov's Anthony Wells.
- Wang, Wei, David Rothschild, Sharad Goel, and Andrew Gelman. 2015. "Forecasting Elections with Non-Representative Polls." International Journal of Forecasting 31(1): 980-991.
- Matthijsse, Suzette M., Edith D. de Leeuw, and Joop J. Hox. 2015. "Internet Panels, Professional Respondents, and Data Quality." Methodology 11(3): 81-88.
- Laaksonen, Seppo and Markku Heiskanen. 2014. "Comparison of Three Modes for a Crime Victimization Survey." Journal of Survey Statistics and Methodology 2(1): 459-483.
- Silver, Nate. 2012. "Which Polls Fared Best (and Worst) in the 2012 Presidential Race." FiveThirtyEight: Nate Silver's Political Calculus.
- Ansolabehere, Stephen and Brian F. Schaffner. 2010. "Does Survey Mode Still Matter? Findings from a 2010 Multi-Mode Comparison." Harvard University: 1-37.