How Does Partisanship Shape Voters’ Perception of the Facts?
IPR’s John Bullock examines how voters respond to survey questions
Pollsters blanket the country every two years, if not more often, with the mission of taking American voters’ temperature on issues from healthcare to climate change. But when voters respond to polls, how much do their answers reflect their genuine factual beliefs rather than their partisan bias, and what are the implications for our democracy?
In a recent article for the Annual Review of Political Science surveying existing research on the topic, IPR political scientist John Bullock and his co-author attempt to find out. They review a series of studies, including one of Bullock’s own, which find that when Republican and Democratic voters are given small financial incentives to answer factual questions about issues like the state of the economy, their answers tend to converge. In other words, absent such incentives, survey responses may often reflect partisanship more than actual belief.
“It seems clear that partisan gaps in answers to ordinary survey questions about factual beliefs don't always reflect sincere, thought-out differences of belief between the parties,” Bullock said.
According to Bullock, whether those gaps reflect an actual lack of information or conscious partisan “cheerleading,” they could have repercussions for political scientists, pollsters, and lawmakers themselves.
“If voters don't have accurate information about the performance of politicians, their choice of politicians won't have much to do with performance,” Bullock said. “For example, if their beliefs about how well a president is managing foreign policy have little to do with how he is actually managing it, they won't be able to reward or punish him for his actual performance in that domain.”
When voters give inaccurate answers, Bullock sorts their responses into two key categories: the aforementioned “cheerleading,” in which voters consciously give a false answer to bolster their party, and “congenial inference,” in which voters who lack confidence in their knowledge fall back on assumptions that favor their party. He writes that although it is difficult to identify and distinguish between the two, evidence suggests that a large number of respondents simply don’t know the answers to factual questions about politics and rely on partisanship to guide them.
Bullock concludes that more research is needed to determine both the extent of voters’ true knowledge and how they choose to express it. He says researchers should focus on determining whether, on average, voters enter the ballot box with enough incentive to learn the facts to hold policymakers truly accountable. He also offers suggestions for pollsters looking to draw a more accurate portrait of voters’ beliefs.
“[Pollsters] should provide ‘don't know’ options when they ask factual questions about politics, and they should tell the people who take their surveys that it's OK to say ‘I don't know,’” Bullock said. “At present, few pollsters do both of these things… the finding on this point is enormous. When you give people even the slightest encouragement to say ‘I don't know,’ the partisan gap collapses.”
John Bullock is an associate professor of political science and an IPR fellow.
Published: October 21, 2019.