Can You Believe Polls? An Insider Tells the Truth

Nobody would argue that political polls are infallible. Think of the legendary 1948 Chicago Daily Tribune headline “Dewey Defeats Truman.” And we’re supposed to look to Gallup or Harris or Zogby as oracles for another close presidential race, like McCain versus Obama?

For more than seven decades, however, polls have been used to chronicle changing American attitudes, cultural movements and political leanings. The media and politicians regularly point to polls as evidence of public outrage or support for public policy or presidential candidates. Faulty or not, polling is here to stay.

That can be dangerous, says David W. Moore, who spent 13 years as a senior editor of the Gallup Poll. In his new book The Opinion Makers: An Insider Exposes the Truth Behind the Polls, Moore explains why the polls we read about in newspapers or hear about on television yield incomplete and even inaccurate conclusions. His wry, clear-eyed critique posits that the way pollsters ask questions distorts measures of public opinion. [Read an excerpt of Moore's book.]

And those erroneous results can be used to justify public policy decisions, even when the public is ambivalent about an issue, or to drum up support and financing for political candidates, even when most of the electorate remains undecided. “It’s an ironic tragedy that one of the 20th century’s most celebrated social inventions, widely anticipated as a means of enhancing democracy, has turned out to do the opposite,” Moore writes.

In an interview with AARP Bulletin Today, Moore explains why bad polling threatens American democracy and how pollsters can fix their practice.

Q: First things first. Can we believe polls?

A: I believe that polls that deal with people’s personal experiences can be valid representations of what the public at large is thinking. For example, a question in a recent Gallup Poll asked respondents if they thought they were financially better off or worse off now than they were a year ago. In that type of question, there’s no reason not to believe what people tell the pollsters. That kind of poll has been tremendously important in helping us understand American culture, showing how people live, what their concerns are, what makes them happy. My reservation comes from polls in two specific areas: preelection polls and polls that are designed to measure people’s opinions about public policy matters. In those areas, I have real reservations about the validity of polls, and I don’t think we can trust the polls to give us an accurate representation of what the public is thinking.

Q: What’s so different about such polls?

A: Pollsters design their polls to minimize the number of people who appear undecided about whom they are going to vote for, or who are simply unengaged and have no opinion on a given policy issue. The whole paradigm of polling among the news media is to get everybody to have an opinion. And if people don’t have an opinion on something, because the issue itself is too obscure, pollsters give them information about the issue and then immediately ask them to form one. In that way, pollsters essentially manufacture an opinion that appears to be representative of a majority.

Q: Give us a recent example of a “manufactured” public opinion.

A: It happened in September when three major media polls measured the public’s opinion about the economic bailout. The Pew Research poll found that the public supported the bailout by a 27-point margin; the Los Angeles Times/Bloomberg poll, by contrast, found that the public opposed the bailout by a 24-point margin; and the Washington Post-ABC poll concluded that the public was essentially evenly divided.

Now, how could three polls, all conducted in exactly the same time frame, come up with completely different results? One reason is that most people probably didn’t have a strong opinion about the bailout. After all, it is a very complicated matter, and we didn’t even know much about the bill at the time. But the media polls wanted to present the appearance of a public that had made a decision, so they essentially manufactured this opinion.

The Los Angeles Times/Bloomberg poll did it by asking a question that stressed the problems and costs, and so we got an opinion that was against the bailout. Pew Research talked about investing in the economy—a very positive way of phrasing the issue—and in that phraseology their respondents showed a strong margin of support. And then the Washington Post-ABC poll was more neutral, but it still didn’t give people the option of saying that they didn’t know, and so they found that the public was evenly divided. In all three cases, the failure to measure the percentage of people who really didn’t have an opinion one way or the other essentially led to the manufacturing of three entirely different opinions. It wasn’t real opinion, in the sense that it didn’t reflect what the general public at large was thinking.

Q: Why don’t pollsters measure how many people are undecided about an issue?

A: A poll that shows a strong majority of the public in one direction or another makes a more interesting news story than one that says a large number of people have no opinion. For example, that Los Angeles Times/Bloomberg poll about the bailout really dominated the political news media. I heard a lot of pundits refer to it to show how angry Americans were about the bailout. Taking a look at the poll, you can see that maybe some people were angry, but certainly not a majority. So it’s a more interesting news story, but not necessarily an accurate one.

Q: Preelection polls are frequently lambasted for inaccuracy. What accounts for the errors?

A: Election polls ask whom people would vote for if the election were held today. The problem is that, early on, a lot of people don’t have any clue which candidate they’re going to vote for, so when polled months and months before the election, they pick the name that is most recognizable. That’s what happened with Rudy Giuliani. He was well known across the country, so when Republicans were pressed to choose a name from those listed, they mentioned Giuliani because his was the most familiar. Yet at the very point that all the major media polls were showing Giuliani as the national front-runner, he was trailing in all of the early contests. The notion that he was the front-runner was really a figment of the polling industry. For one poll, in October, Gallup changed its format: instead of asking, “Who would you vote for if the election were held today?” the poll started out by asking Republicans whether they had decided whom they would support for president. In that context, more than 70 percent of the Republicans said they had not made up their minds, and among those who had made up their minds, no candidate got more than 5 percent of the vote. That was the true state of what Republicans nationally were thinking.

Q: What was the fallout?

A: Well, Giuliani was able to solicit millions of dollars in contributions for a campaign that ultimately didn’t win one single delegate to the Republican National Convention—simply because the polls showed him ahead. So the early polls can really distort the election process. I’m sure that they made it very difficult for the other Republican candidates to get much money. It almost killed McCain’s candidacy!

Q: In what way can presenting inaccurate polls as fact endanger democracy?

A: The coverage of an issue can be distorted, because issues are usually framed by a source within the administration, then that information is given to the media, and the media use it to conduct these polls. Sometimes you only hear an issue framed in one way—the way sanctioned by the administration.

For example, the issue of whether there was torture of prisoners at Abu Ghraib: When this story broke, the Bush administration immediately indicated that this was not a matter of torture, that there were some isolated cases of “abuse” of prisoners. Then the news media conducted polls asking the public whether it was torture or abuse, but the way they posed the question influenced the outcome. The Washington Post-ABC poll, for example, asked several questions about the “apparent abuse” of prisoners at Abu Ghraib. Then, after repeated references to “apparent abuse,” the poll asked whether respondents thought that what happened to the prisoners was abuse or torture. Since the poll itself used the term “apparent abuse,” by the time the respondents were asked whether it was abuse or torture, naturally the majority of people said, “Well, it must have been abuse, because that’s how you’ve been referring to it.” So the poll showed that the public agreed that the Abu Ghraib incidents were abuse, not torture, and created public support for the administration.

Q: How can the public evaluate poll results?

A: Well, frequently polls conflict with one another, especially very early preelection polls. In those cases you can go to websites like Pollster.com or Fivethirtyeight.com, which try to average all the poll results.
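For readers curious what that averaging looks like, here is a minimal sketch in Python of the simplest possible case, an unweighted average across polls. The pollster names and numbers are hypothetical, and the real aggregation sites use far more elaborate models that weight polls by sample size, recency and pollster track record; this is only an illustration of the basic idea.

```python
# Illustrative sketch: an unweighted average of several polls.
# All pollster names and figures below are hypothetical examples,
# not data from Pollster.com or Fivethirtyeight.com.

polls = [
    {"pollster": "Poll A", "candidate_x": 48.0, "candidate_y": 44.0},
    {"pollster": "Poll B", "candidate_x": 45.0, "candidate_y": 47.0},
    {"pollster": "Poll C", "candidate_x": 46.0, "candidate_y": 45.0},
]

# Average each candidate's support across all polls.
avg_x = sum(p["candidate_x"] for p in polls) / len(polls)
avg_y = sum(p["candidate_y"] for p in polls) / len(polls)

print(f"Candidate X: {avg_x:.1f}%   Candidate Y: {avg_y:.1f}%")
print(f"Average margin: {avg_x - avg_y:+.1f} points")
```

The point of averaging is that each poll’s question-wording and sampling quirks push its result in a different direction, so combining many polls tends to smooth out the kinds of house effects Moore describes, which is why conflicting polls often look less contradictory in aggregate.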

Q: Though flawed, is polling still the best way we have to measure public opinion?

A: I’m very much a supporter of polls that ask the right questions in the right way and come up with valid results.
