NewsWire: 6/30/21

  • While the public complains that polls are less accurate, pollsters complain that they are becoming too expensive. As polling companies switch from phone polls to online polls, typically to save money, experts worry about how to maintain survey accuracy. (The Wall Street Journal)
    • NH: On November 8, 2016, national polls showed Hillary Clinton easily winning the presidential election. But on November 9, the nation awoke to the shocking news that Donald Trump had won. Two years later, a survey by The Hill showed that public trust in surveys had significantly eroded. Fully 52% of Americans said they doubted polls cited by the media.
    • Things didn't get much better in 2020, after another election in which the pollsters didn't exactly cover themselves with glory. (See "Did the Pollsters Get it Wrong?") One analysis from The Washington Post shows that presidential polls last year were the least accurate since 1996.
    • So what's going wrong? One big problem, according to polling experts, is the rising rate of nonresponse.
    • During the Great Depression and the American High, almost every American answered a pollster when called upon. It was seen as both a civic duty and a polite habit. Today there's a lot less of both. Over the last several decades, ever fewer people have bothered to respond to pollsters. A Pew study found that, as recently as 1997, 37% of households contacted from a random sample actually completed an interview. By 2016, that number had fallen to 9%.
    • Large nonresponse rates increase the risk of what statisticians call nonresponse bias: the possibility that the people who respond to polls are systematically different from those who don't. In that case, a "random survey" no longer represents a true random sample, and it is no longer an unbiased reflection of the public mood.
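      To make the mechanics concrete, here is a minimal Python sketch of how differential nonresponse skews an estimate. The population size and response rates below are invented for illustration, not drawn from any real survey:

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical population of 1 million: exactly 50% hold opinion A.
        holds_a = rng.random(1_000_000) < 0.50

        # Assume A-holders are somewhat less likely to answer the pollster:
        # an 8% response rate versus 12% for everyone else.
        response_prob = np.where(holds_a, 0.08, 0.12)
        responded = rng.random(holds_a.size) < response_prob

        print(f"True share holding A:  {holds_a.mean():.1%}")             # ~50%
        print(f"Survey estimate of A:  {holds_a[responded].mean():.1%}")  # ~40%

      The roughly ten-point miss has nothing to do with sample size: collecting ten times as many responses under the same response pattern reproduces the same bias.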
    • So what do we know about the people who do respond to polls? For one thing, they are more likely to volunteer and be politically engaged. They are also more likely to be politically "extroverted." Pew acknowledges that its surveys often overrepresent levels of civic engagement.
    • Many pollsters try to correct for nonresponse bias by reweighting each subgroup by its share of the population. If a survey panel only manages to include a few young adults, for example, pollsters will inflate their responses so that they represent the relative national size of that age bracket. When crosstab sample sizes are small, this can lead to trouble: unequal weights inflate the variance of the estimate and thus its standard error. What's worse, weighting won't help if nonresponse is correlated with the answer even within a population segment. This may have played a role in pollsters' underestimate of the Hispanic vote for Trump in the last election.
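      As a rough illustration of that variance penalty, here is a short Python sketch using the Kish design-effect approximation; the panel composition is hypothetical:

        import numpy as np

        # Hypothetical panel of 1,000: young adults are 20% of the population
        # but only 5% of the sample, so each one is weighted up by 4x.
        n, n_young = 1_000, 50
        w_young = 0.20 / 0.05            # 4.0
        w_other = 0.80 / 0.95            # ~0.84
        weights = np.concatenate([np.full(n_young, w_young),
                                  np.full(n - n_young, w_other)])

        # Kish approximation: unequal weights inflate sampling variance by
        # deff = E[w^2] / E[w]^2, shrinking the "effective" sample size.
        deff = (weights ** 2).mean() / weights.mean() ** 2
        print(f"Design effect: {deff:.2f}")              # ~1.47
        print(f"Effective sample size: {n / deff:.0f}")  # ~679 of 1,000

      And the weights only help if the 50 young respondents resemble the young adults who didn't respond; if they don't, the 4x weight simply multiplies their idiosyncrasies--which is the within-segment correlation problem described above.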
    • What explains the secular rise in nonresponse? As we've often explained in these NewsWires, the decline is directly linked to declining civic trust and engagement by generation. (See "Are Western Democracies on the Verge of Becoming Ungovernable?") The sociologist Robert Putnam has written extensively on this generational dynamic--for example, in his classic Bowling Alone.
    • Pollsters are adapting as best they can. Many are transitioning from phone surveys to online questionnaires. This includes the Pew Research Center. Since 2019, Pew has conducted most of its American Trends Panel through the web. And just this year, it switched the US portion of the Global Attitudes Survey to online.
    • The primary reasons for the switch are cost and time. Cell phone surveys are twice as expensive to conduct as landline polls. And it now takes approximately 40,000 cell phone numbers to get just 800 responses--a completion rate of only 2%. Why waste money and time on a phone survey when so few are bothering to answer?
    • But the rising popularity of online polls has further muddied the waters of survey accuracy. Sure, it solves one problem: You no longer squander resources trying to get a response from the nonresponders. But it doesn't solve the underlying problem, which is to discover what the nonresponders think or how they are going to behave.
    • Even if you try to administer an e-poll like a phone poll--and send the questions to individual email addresses--you cannot draw a random sample of addresses the way random-digit dialing can sample phone numbers. Also, many people still lack high-quality internet service. This makes it difficult to poll Americans who are older, reside in rural communities, or live off the grid.
    • Pew does attempt to deal with these issues. It mails a random subset of the population an invitation to join its long-term online survey panel. Pew will then provide internet service and a free tablet to those who lack access. But if people aren't answering calls, surely even fewer are responding to email or mail solicitations. So you're still left with the intractable riddle of nonresponse bias.
    • Nonresponse isn't the only problem. There is also the puzzle of systematic bias among those who do respond. Apparently, this bias is also growing. It emerged after the 2016 election in the so-called "shy Trump voter" theory, which suggested that many respondents chose to give an uncontroversial answer to a stranger rather than admit their support for Trump.
    • A variant of this bias is arising now that so many pollsters are switching survey methods. Pollsters initially figured that Americans would answer online surveys the same way they had always answered phone surveys. It turns out they were wrong.
    • Pew recently did a study comparing online responses to phone responses for 78 questions. The vast majority of answers differed by at least four percentage points (pp), and 19 questions varied by over 10 pp. Only 21 responses differed by 3 pp or less. (See "What's The Matter With Polling?" in the May 3 NewsWire.)
    • How exactly did the answers differ? The responses mainly diverged by intensity, not direction. People taking the survey by phone were more likely to give extreme reactions, especially about public figures. And when asked to rate the quality of their family relationships and social lives, phone respondents were more likely to answer positively. Studies suggest that the less emotional the question, the more negligible the difference in response. Some surveys have also shown that phone respondents favor the last answer choices they hear (a recency effect), while web respondents favor the first choices they see (a primacy effect).
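      For concreteness, here is a toy Python tabulation of the kind of mode-effect comparison Pew ran. The paired percentages are invented; a real analysis would use the published phone and web figures for each question:

        import numpy as np

        # Invented paired results: % giving a positive answer to the same
        # question when asked by phone versus by web.
        phone = np.array([72, 55, 88, 40, 63, 51, 77, 69])
        web   = np.array([61, 52, 75, 38, 49, 50, 70, 57])

        gap = np.abs(phone - web)
        print(f"Questions differing by > 10 pp:  {(gap > 10).sum()}")
        print(f"Questions differing by 4-10 pp:  {((gap >= 4) & (gap <= 10)).sum()}")
        print(f"Questions differing by <= 3 pp:  {(gap <= 3).sum()}")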
    • So where does this leave us? Clearly, polls remain very useful. Remember, while the results of the 2020 presidential election were closer than the polls suggested, the final national count was still within a respectable margin of error. On the other hand, there's no reason to expect, as many experts once did, that polling accuracy will keep getting better over time.
    • Yes, technological progress has been enormously helpful to pollsters. They can now perform the most sophisticated statistical analyses of vast quantities of attitudinal and behavioral data in an instant and at practically no cost. George Gallup and Elmo Roper had no such advantages back in the 1950s. What they did have, however, cannot be bought today at any price: access to a citizenry that trusted the establishment enough to divulge its honest feelings about the most important issues of their time.