
According to The Latest Poll...

With Election Day approaching, polls are making headlines. But it can be tricky figuring out what they really mean—if they mean anything at all.

By Patricia Smith


As soon as the Republican Convention in St. Paul was over last month, pollsters scrambled to measure its impact (as they had for the Democratic Convention in Denver the week before).

Gallup released a poll showing 50 percent of registered voters favoring Senator John McCain, the Republican candidate, compared with 46 percent for Senator Barack Obama, the Democrat; an ABC News poll found Obama leading McCain 47 percent to 46 percent; and a third poll by CBS News had McCain leading Obama, but by a closer margin of 46 percent to 44 percent.

So what's going on here? Why are there so many different results in the same time frame? The short answer: Polls are complicated.

With the November election just weeks away, the news is filled with polls handicapping the presidential race. Not only are people fascinated by everyone else's opinions; politicians use polls to help tailor their messages to voters, and journalists use them to find out what people care about and why.

Michael Traugott, who studies election polls at the University of Michigan, says political polls have both direct and indirect effects on voters—especially in a tight contest like this year's race for the White House.

"They can stimulate people to give money or to volunteer for a campaign," Traugott says. "They can make people think that their vote is worth more if the margin is close, meaning they have an impact on turnout."

Not a Crystal Ball

It's just as important, however, to know the limits of what polls can tell us.

A poll tells us about the present, not the future: It's not a crystal ball, but a snapshot of how the public is thinking at a particular moment—and not a perfectly sharp snapshot at that.

Although a large portion of the American public is committed to one party or the other and holds firm positions on a wide range of issues, there is always a segment that wavers, or stays unconvinced (or uninterested) until late in a campaign. Those "swing voters" ultimately push elections back and forth, and their views can keep changing up to the very last minute.

One of the most embarrassing errors in polling history came in 1948, when the major polling organizations all declared that Thomas E. Dewey, the Republican Governor of New York, would defeat the Democrat, President Harry S. Truman. All of them had stopped interviewing the public several weeks before the election and missed a late swing toward Truman. In addition, a study after the election found their samples had too many middle-income and wealthy people, who were more likely to vote Republican.

Even on election night, political experts were convinced Dewey was going to win—and the photograph of a jubilant Truman holding up a copy of the Chicago Daily Tribune that wrongly declares "Dewey Defeats Truman" has become a political and polling icon.

Today, surveys are conducted right up to Election Day—and at the polling booth itself for "exit polling," in which voters are interviewed just after voting.

But even exit polls have pitfalls. In Florida, in the 2000 presidential race, a combination of bad polling data in a handful of precincts and mistakes in the actual vote counts led most TV networks to call a race—first for Democrat Al Gore, and then for Republican George W. Bush—that was in fact too close to call. Florida ended up determining the election's outcome, which was ultimately decided by the Supreme Court in favor of Bush.

More recently, the polls got it wrong when they predicted that Obama would win the 2008 New Hampshire primary; in fact, Senator Hillary Clinton won decisively.

David M. Moore, a longtime pollster for Gallup, is skeptical about many current polls. "The polls measure the way the wind is blowing at a given moment and then treat that as a serious response," Moore says.

No poll is perfect, but the basic premise behind public-opinion polling is that by asking the opinions of a surprisingly small number of people, you can get a good sense of what an entire city or state or country is thinking.

Most reputable political polling today relies on a method called "probability sampling": If you select people at random from a whole population, no matter how large or small, you have a good chance of reflecting, within a few percentage points, the opinions of everyone.
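That idea can be demonstrated with a small simulation. The numbers below are made up for illustration (a hypothetical electorate of one million voters, 52 percent of whom favor "Candidate A"), but the mechanism is the one the article describes: a random sample of just 1,000 people usually lands within a few percentage points of the true figure.

```python
import random

random.seed(7)  # fixed seed so the illustration is repeatable

# Hypothetical population: 1,000,000 voters, 52% favor Candidate A.
population = [1] * 520_000 + [0] * 480_000  # 1 = favors Candidate A

# Draw a simple random sample of 1,000 people, as a typical poll might.
sample = random.sample(population, 1_000)
sample_share = sum(sample) / len(sample)

print(f"True share: 52.0%   Sample estimate: {sample_share:.1%}")
```

Re-running this with different seeds gives slightly different estimates, which is exactly the "few percentage points" of wiggle the next paragraph describes.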

But those few percentage points can make a difference. A survey of 1,000 people—which is a typical number for a news-media survey—will have what's known as a "margin of sampling error," usually plus or minus 3 percentage points, 95 percent of the time.

Truth or Fib?

The margin of sampling error is one of the most important things to know when trying to understand the significance of a poll. With a 3-point margin of error, the CBS poll after the Republican Convention that showed McCain leading Obama 46 to 44 percent means that McCain might actually be at 43 percent and Obama at 47 percent—which obviously tells a very different story.
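The arithmetic behind that plus-or-minus-3-point figure comes from the standard formula for the sampling error of a proportion. A minimal sketch, using the CBS poll's numbers from the article (the 1.96 multiplier corresponds to the 95 percent confidence level):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of sampling error for a proportion p
    observed in a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

n = 1000  # a typical news-media sample size
for name, share in [("McCain", 0.46), ("Obama", 0.44)]:
    moe = margin_of_error(share, n)
    low, high = share - moe, share + moe
    print(f"{name}: {share:.0%} plus/minus {moe:.1%} "
          f"(plausible range {low:.1%} to {high:.1%})")
```

The two ranges overlap heavily, which is why a 46-to-44 result with a 3-point margin of error is really a statistical tie rather than a lead.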

This illustrates one of the most common mistakes people make in reading polls: They tend to think of the percentages as precise points, when they're actually more like fuzzy blobs.

There are other possible sources of distortion as well. How questions are worded—and even the order in which they're asked—can sway the results.

For example, polls about capital punishment generally show that about two thirds of Americans support the death penalty. However, if the question is asked differently—with life without parole presented as an alternative—the portion who support the death penalty drops to about half.

Studies have found that people largely tell the truth to pollsters, although what they say is based on their understanding of events, which is not always perfect. Sometimes they'll give an opinion about something they didn't even have an opinion about until the pollster called.

And sometimes social pressures or other factors will lead people to fib. For example, more people report that they are registered to vote than actually are, probably because voting is considered a good thing to do. This year, with Obama on the ballot, race could come into play: In contests where one candidate is black, polls have sometimes overstated the black candidate's strength.

It's known as the "Bradley effect," in reference to Tom Bradley, the black Mayor of Los Angeles who lost the 1982 California Governor's race even though polls indicated he would win. The theory is that because of the social unacceptability of racial prejudice, some respondents say they support the black candidate even though they don't.

But Democratic pollster Mark Mellman is skeptical of that happening this year: "I don't think we see any evidence of that. If it ever existed, it's largely disappeared."

Another wild card this year is the millions of newly registered voters, many of them young people. During the primaries, voter turnout increased dramatically. But will those new voters show up in November?

"The polls could be inaccurate because it's hard to know how to weight first-time voters," Darrell West of the Brookings Institution told The Christian Science Monitor.

"Most of the traditional polling methods are based on likely voters, and the likelihood of voting is measured through participation in past elections," West explains. "Generally, that's not a big problem because there aren't that many new voters, but this year, we've had a big spike in interest by first-time voters."

Most surveys are still conducted by phone, although efforts are under way to use the Internet for polling. For now, however, most "polls" you see online are not scientific and not worth the screens they're displayed on. Pollsters have long had to adapt to changes in technology. When scientific surveys began in the mid-1930s, most polling was done face-to-face. It wasn't until the late 1970s that most polling companies switched to phones.

Cell Phones

For now, the big challenge facing pollsters is the growing number of people who have only cell phones, estimated at 13 percent of the U.S. population. Both Gallup and the Pew Research Center have begun calling cell phones to include these people, but most pollsters still call land lines only.

The question is whether people with land lines have different opinions from those with cell phones only. A Pew study published in January concluded that "cell phone only" people are not significantly different in terms of political attitudes. But this could change as more young people shift to a cell-only lifestyle. By next year, it's estimated that 40 percent of adults under 30 will have cell phones only.

"The polling industry is on the cusp of having to bite the bullet and call people on cell phones," says Moore.

Another continuing issue for pollsters is the number of people who refuse to respond to surveys or can't be reached: How do they differ from the people who will answer the phone and take the 15 or 20 minutes that most surveys require?

"Fewer and fewer people are willing to talk to us," Mellman says. "So far, that has not been debilitating; the polls have stayed reasonably accurate. But at some point, that could change."

Despite these cautions, properly conducted polls are still the best way to find out what people are thinking at any moment, and in a democracy, that's important to know. So be skeptical when candidates say they don't pay attention to polls, or that "the only poll that matters is the one taken in the voting booth." Often they're the ones who think they're behind.