Leading Pollster Predicts Obama Wins with 344 Electoral Votes

As the presidential race enters its final days, what do the polls say about the election outcome? We speak to Nate Silver, whose website FiveThirtyEight.com has become a must read for hundreds of thousands of political junkies. [includes rush transcript]

Transcript
This is a rush transcript. Copy may not be in its final form.

JUAN GONZALEZ: Our first guest today, Nate Silver, was recently called “The Spreadsheet Psychic” by New York Magazine. His website FiveThirtyEight.com has become a must read for hundreds of thousands of political junkies. Using the most recent polling and demographic trends, the site simulates the election 10,000 times each day. The site is now projecting Barack Obama will receive 344 electoral votes, seventy-four more than the 270 needed to win the election. John McCain is projected to win 193 electoral votes.
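
FiveThirtyEight's actual model is not described in this segment, but the basic shape of a simulation like the one Juan describes, jittering each state's polling margin by some error term and summing electoral votes over many trials, can be sketched roughly as follows. All of the state margins, safe-state totals, and the error size are illustrative assumptions, not Silver's inputs.

```python
import random

# Illustrative inputs only, not FiveThirtyEight's data: Obama-minus-McCain
# polling margins (points) and electoral votes for a few contested states.
STATES = {
    "Ohio":         (2.0, 20),
    "Florida":      (1.5, 27),
    "Virginia":     (5.0, 13),
    "Colorado":     (5.5, 9),
    "Pennsylvania": (8.0, 21),
}
SAFE_OBAMA_EV = 255    # assumed electoral votes from states not simulated here
POLL_ERROR_SD = 4.0    # assumed standard deviation of state polling error

def simulate_once():
    """One simulated election: jitter each state's margin, sum Obama's electoral votes."""
    ev = SAFE_OBAMA_EV
    for margin, votes in STATES.values():
        if random.gauss(margin, POLL_ERROR_SD) > 0:
            ev += votes
    return ev

def simulate(n=10_000):
    """Run n simulated elections; return mean electoral votes and win probability."""
    results = [simulate_once() for _ in range(n)]
    return sum(results) / n, sum(ev >= 270 for ev in results) / n

mean_ev, win_prob = simulate()
print(f"Mean Obama electoral votes: {mean_ev:.0f}, chance of reaching 270: {win_prob:.1%}")
```

A fuller model would also let state errors move together, which is the concern Silver raises later about pollsters tending to miss in the same direction.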

AMY GOODMAN: Nate Silver launched the site FiveThirtyEight.com earlier this year in his spare time from being a professional baseball statistician. He is the managing partner of the influential publication Baseball Prospectus. Nate Silver joins us now from Chicago.

Welcome to Democracy Now!, Nate.

NATE SILVER: Yeah, good morning, guys.

AMY GOODMAN: It’s good to have you with us. OK, let’s start out with the poll that the New York Times is talking about that puts Barack Obama one percentage point ahead of John McCain. What do you think of this poll? How should we analyze it?

NATE SILVER: You know, I think you have to take that one poll from the AP along with the dozen or so other polls that are out that give Obama, you know, larger leads, I think as large as fourteen points in certain cases. I mean, there are different philosophies about how to evaluate this election. It is maybe a difficult election, when you probably have a lot of new people who are voting. But, you know, one thing that some of these pollsters are doing is applying what’s called a likely voter model. So that will take, you know, the voters that say they want to vote and basically kick some of them out of their sample if they haven’t voted in the past or they’re from demographics that don’t traditionally turn out in high numbers. They’re kind of basically making an assumption that the election will look like it did in 2004. If it doesn’t — and remember, the Obama campaign is counting on getting new voters to the polls, young voters, blacks, Latinos — then these kind of more optimistic scenarios, the double-digit leads, might come into play instead.
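
Silver does not spell out any particular firm's screen, but the gist of a likely voter model, dropping respondents without a past-vote record from the sample, can be sketched like this. The respondent fields and the screening rule are hypothetical.

```python
# Hypothetical respondent records; a real survey carries many more fields.
respondents = [
    {"candidate": "Obama",  "voted_2004": False, "age": 22},  # new voter
    {"candidate": "McCain", "voted_2004": True,  "age": 61},
    {"candidate": "Obama",  "voted_2004": True,  "age": 45},
]

def margin(sample):
    """Obama-minus-McCain margin, in percentage points, for a list of respondents."""
    obama = sum(1 for r in sample if r["candidate"] == "Obama")
    mccain = sum(1 for r in sample if r["candidate"] == "McCain")
    return 100.0 * (obama - mccain) / len(sample)

registered_voter_margin = margin(respondents)
# A crude likely voter screen: keep only people who voted last time.
likely_voter_margin = margin([r for r in respondents if r["voted_2004"]])

print(f"Registered voters: Obama +{registered_voter_margin:.0f}")
print(f"Likely voters (screened): Obama +{likely_voter_margin:.0f}")
```

On these toy numbers the screened margin moves away from Obama because the new, young respondent is excluded, which is exactly the 2004-style turnout assumption Silver is questioning.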

AMY GOODMAN: But can you go further on this, how a poll then affects perception? Because the Times writes a piece about momentum for John McCain, and you’re saying that just to select out this one poll is not representative of, say, the eleven most recent polls?

NATE SILVER: Yeah, I don’t think this poll is particularly good. It had a fairly small sample size. You know, it had a very conservative — lowercase c — likely voter model. You know, you can kind of cherry-pick any number of polls you would want. I mean, right now we have something like, you know, twelve national polls coming out every day. And we have maybe twenty state polls coming out every day. You know, there’s a poll that came out this morning that had Obama fourteen points ahead in Ohio. You know, I don’t believe he’s fourteen points ahead there, but if you wanted to tell a story around that and say, oh, McCain’s done for, you could do that just as well. So I think, you know, you have to look at the big picture. If you’re going to have that many polls coming out, you’re going to have some outliers, almost by definition. You know, you’ll have some polls outside the margin of error if you have thirty polls coming out every day. You’re going to have a couple that just randomly end up on one side or the other.

So I think the responsible thing to do is to take an average or an aggregate of some kind, to educate yourself about the fact that polling is not an exact science, that these polls can differ a lot from survey to survey — even, you know, on the same day, different surveys can produce very different numbers — and then kind of proceed from there.
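
A bare-bones version of the averaging advice looks like the snippet below: pool the day's polls, take the mean, and treat anything far from it as the expected occasional outlier rather than news. The margins are invented, and real aggregation also weights by sample size, recency, and pollster quality.

```python
from statistics import mean, stdev

# Hypothetical national polls, Obama-minus-McCain margin in points.
polls = [7, 5, 9, 6, 1, 8, 7, 14, 6, 5, 7, 6]

avg = mean(polls)
spread = stdev(polls)
# Flag polls far from the consensus; with this many surveys, some are expected.
outliers = [p for p in polls if abs(p - avg) > 2 * spread]

print(f"Average margin: {avg:.1f} points (spread {spread:.1f})")
print(f"Polls more than two standard deviations from the average: {outliers}")
```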

JUAN GONZALEZ: Now, Nate, how do you work your analysis of these polls? Are you doing basically a meta-analysis, putting them all together? Or are you adding particular ingredients of your own to make sense of them?

NATE SILVER: Well, one thing — one of the main things we try and do is to try and discern whether there is any kind of a trend line and who it’s moving toward. And so, we look not just at the national polls, but also at the state polls. So, you know, for example, if we have a poll from the same polling firm and, you know, Obama improves from, say, a plus-five to a plus-eight, we find that a meaningful data point, whereas maybe if you have polling from different firms, it doesn’t tell you quite as much. But we try and mesh that all together and really kind of see where the election is headed.

And right now, we see it as being fairly flat. Obama made a lot of gains over the course of the past six weeks, since the kind of Lehman Brothers collapse. And since then, it’s kind of plateaued at about a six- or seven-point lead. But we consider demographic information. We consider kind of as much as we can to kind of, you know, deal with this imperfect data, because polls are certainly an imperfect data set.
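
The same-firm comparison Silver describes can be illustrated with a toy trend calculation that only pairs surveys from the same pollster, so that a firm's persistent lean cancels out. The firm names and margins below are invented.

```python
from collections import defaultdict

# Hypothetical readings: (polling firm, week, Obama-minus-McCain margin).
readings = [
    ("Firm A", 1, 5), ("Firm A", 2, 8),
    ("Firm B", 1, 9), ("Firm B", 2, 10),
    ("Firm C", 2, 6),   # no earlier Firm C survey, so it contributes no trend point
]

by_firm = defaultdict(dict)
for firm, week, margin in readings:
    by_firm[firm][week] = margin

# Week-over-week movement, measured only within the same firm so that
# house effects cancel out of the comparison.
changes = [weeks[2] - weeks[1] for weeks in by_firm.values() if 1 in weeks and 2 in weeks]
trend = sum(changes) / len(changes) if changes else 0.0
print(f"Average same-pollster movement: {trend:+.1f} points toward Obama")
```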

AMY GOODMAN: Now, Nate Silver, these numbers that Juan was just talking about at the beginning, Obama getting 344 electoral votes, seventy-four more than the 270 needed to win the election, McCain projected to win 193 electoral votes, way more than anyone else is predicting. How did you arrive at this number?

NATE SILVER: Well, it’s just looking at the individual states. And one thing about this year’s map is the Obama campaign is active in so many places that if they win the national popular vote by six or seven points, that means they’ll turn a lot of states blue. They’ll probably win, you know, Ohio and Florida. They would certainly win at that level Virginia and Colorado, which are two states they’re very fond of, I think. But even Missouri and Indiana, you know, North Carolina, these are pretty big electoral vote — double-digit electoral vote states. And so, you know, it looks like a landslide when you have a number like 350 electoral votes, but you can get that with even just a five-point win in the popular vote because of the way, you know, the Obama campaign is engaged in so many kind of battleground states this year.

JUAN GONZALEZ: Now, you drew quite a bit of attention during the primary season, specifically when you were in Indiana and North Carolina. You were making predictions that others had not made and were more accurate. Can you talk about the development of your site and why you decided to move into political analysis from sports?

NATE SILVER: Well, I think partly it was because I was frustrated with some of the kind of prevailing media narrative. You know, for example, when you do have one poll saying McCain’s only one point behind Obama, that might generate a lot of attention, but it’s not necessarily the best indicator of where the race is at. So that was a kind of thing that bothered me.

You know, sometimes people oversimplify demographic questions. For example, during the primaries, there was a lot of attention about, well, you know, Obama doesn’t perform well with Latino voters. And really, that wasn’t about, you know, skin color; it was about economic class. And Hillary Clinton did fantastically well with working-class voters, both Latino and white, but there was nothing about brown people won’t vote for black people. And then, once we got to the general election, you saw Obama doing very well with Latino voters.

So it was things like that where I think there was a lot of sloppy analysis in the mainstream media, where I wanted to kind of have my say and also to kind of be — serve to put some public pressure on these pollsters, who might not tell you why they’re doing what they’re doing, who might not really have thought their methodology through, and to kind of expose pollsters that do a bad job and aren’t very accountable for when they just kind of throw a bunch of numbers together. So that was really kind of the goal, I think.

AMY GOODMAN: We’re talking to Nate Silver. Now, why should people believe you? Especially people who don’t follow politics, so they might be getting into it the way the media covers it, as a game. But you developed PECOTA, the Player Empirical Comparison and Optimization Test Algorithm. Explain what this is. This is for baseball. You predicted the Rays would get into the World Series?

NATE SILVER: We predicted the Rays would win eighty-nine or ninety games this year and be a playoff contender. We actually had them, you know, behind — them and the Red Sox and the Yankees all kind of in a tie in the American League East.

But PECOTA is a system I developed five or six years ago. It uses comparable players. If you have a guy like Derek Jeter, you look at, you know, maybe Phil Rizzuto, players in the past who were similar to Jeter, and you look at how their different kind of careers went down different kind of branches of the probability tree, so to speak.
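
PECOTA's real inputs are far richer, but the comparable-player idea reduces to a nearest-neighbor search: find the historical players whose statistics most resemble the current player, then look at how their careers turned out. Everything in this sketch, the players, the stats, and the distance measure, is a toy illustration rather than the actual algorithm.

```python
import math

# Toy "historical" players: (name, batting average, home runs, age) plus what
# happened to them the following season. Not real PECOTA data or methodology.
history = [
    ("Player A", 0.310, 12, 28, "improved"),
    ("Player B", 0.305, 10, 29, "declined"),
    ("Player C", 0.250, 30, 27, "improved"),
]

def distance(stats_a, stats_b):
    """Plain Euclidean distance over a few stat dimensions (no unit scaling; toy only)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(stats_a, stats_b)))

def comparables(target_stats, k=2):
    """Return the k most similar historical players and how their next season went."""
    ranked = sorted(history, key=lambda p: distance(target_stats, (p[1], p[2], p[3])))
    return [(p[0], p[4]) for p in ranked[:k]]

# A hypothetical current player hitting .308 with 11 home runs at age 29.
print(comparables((0.308, 11, 29)))
```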

But, you know, I did move into politics this year. And, you know, as we mentioned before, I got some things right, like Indiana and North Carolina, where the polling wasn’t very accurate, really. You know, if you looked at the polling in the primaries, Barack Obama routinely over-performed his polls in Southern states by seven or eight or nine points. So we looked at the demographics, as well. We had so much information about the primaries, when we had had contests in twenty-five or thirty states, that we almost didn’t have to look at the polling there.

But, you know, it’s just trying to be as rigorous as possible. I do like sweating the small stuff and the details and, you know, getting into this level — this kind of granularity. It’s kind of my methodology and my brand, I suppose. But we want to be as thorough as possible, but also have some fun with it and realize that there is — it is a horse race. The polls aren’t the most serious thing in the world. You don’t have to — you know, you don’t have to have a bad day if your candidate loses a couple of points in the Gallup tracker or the Zogby tracker. So try and balance that —

AMY GOODMAN: Can you talk about Gallup and Zogby? What do you think of these other pollsters? And then, I know you have this running battle with ARG.

NATE SILVER: Yeah. I mean, you know, we have ratings of maybe the twenty-three most prolific polling firms. And, you know, obviously, the firms that we rank toward the top, I’ve heard from, and they’re very friendly. And some of the ones toward the bottom are not quite as friendly.

You know, there are different things that kind of go into being a successful pollster. I think part of it is obviously your track record. And we can go back and see who’s been the closest, you know, in previous elections, and not just kind of the one election — how did they do with the Bush-Kerry number in 2004 — I mean primary elections, I mean Senate elections, I mean looking back to 2000, 2004, individual states, so really looking at kind of a long track record.

And sometimes they would surprise you. Some of the brand names, like Gallup and like Zogby, actually have average to slightly below average track records, whereas some of what I call boutique polling firms, like SurveyUSA, which is out in New Jersey, or like Rasmussen, firms where all they do is polling, actually do quite well. Some of the academic firms, like Quinnipiac University, are good pollsters, whereas, again, you know, some of the major media names — you know, CNN’s polls haven’t been great, Fox’s haven’t been great, CBS’s polls haven’t been especially good. So, don’t buy these brand names. Kind of really look at who’s had the best track record and who kind of explains the most about what they’re doing and why.

JUAN GONZALEZ: And do you have any concern about the increasing reliance by political leaders on all these polls that are coming out every day — in essence, constantly trying to monitor how the American people are feeling on any given day toward candidates or issues — and the impact that this has on the actual political positions and ideas that these candidates put forth?

NATE SILVER: Well, what I’m worried about is that this reliance is happening at the same time the polling itself is getting more and more difficult to do. You know, it used to be considered good if you could get 40 percent of the people to pick up the phone when you called and respond to your survey. That used to be kind of the benchmark. And now, if you get 20 percent of people actually responding, or even 15 percent, that’s considered good to adequate. You know, people screen their phone calls a lot more now. People who have cell phones aren’t called, in the first place, by most of the pollsters; there are some exceptions. You know, people have Vonage, these services on the internet, where they don’t even have a telephone. So it’s hard to reach people who are busy, who are distrustful of a random number coming up on their landline.

And so, the non-response problem is becoming, you know, very serious. We might have to move into some kind of a hybrid, where you do try some kind of a survey instrument on the internet and combine that with a telephone sample. But these polls, you know, as we saw numerous times in the primaries this year, are not terribly accurate. It’s not because the pollsters are dumb. There are a lot of very smart people in the industry. It’s just inherently very hard to do when you’re trying to do a survey, and, you know, four out of five or five out of six people won’t take your phone call. It’s kind of very hard to balance everything to kind of make up for that fundamental problem.

JUAN GONZALEZ: And is there any concern, or any analysis that has been done, about the people who are refusing to be polled, whether they have any particular political perspective or viewpoint that might end up skewing the total results of a poll?

NATE SILVER: There are different kinds of groups. Usually older people still pick up the phone more than younger people, women still more than men. You know, there’s some thought that maybe people who are quite conservative won’t pick up the phone as much as people who are more liberal. So it does depend.

In this election, those different kinds of things tend to offset each other, where you might get too few young people, but maybe also too few conservatives. That would kind of balance out.

But, you know — but there definitely is what’s called non-response bias. What the pollsters try and do is say, you know, if a woman answers the phone, to say, “I want to speak with the person having the next birthday,” for example. But you can’t always correct for that. Some of these polls use automated scripts; they’re so-called robopolls. And so, they won’t have a chance to check who’s actually answering. Someone can say they are a woman when they’re, in fact, you know, not a woman, if they feel like trying to mess with the polling script and stuff like that. So there are kind of numerous problems with your kind of selection and with your non-response bias.
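
The standard partial remedy for the skews Silver mentions is demographic weighting: respondents from underrepresented groups count for more, so the sample matches the population on known traits. A minimal sketch with invented shares is below; note that it cannot correct for bias among people who never answer at all.

```python
# Hypothetical population vs. sample shares by age group.
population_share = {"18-29": 0.20, "30-64": 0.55, "65+": 0.25}
sample_share     = {"18-29": 0.10, "30-64": 0.55, "65+": 0.35}

# Weight for each group: its share of the population divided by its share of
# the sample. Underrepresented young respondents get weights above 1.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Hypothetical Obama support within each sampled group.
obama_support = {"18-29": 0.65, "30-64": 0.50, "65+": 0.42}

unweighted = sum(sample_share[g] * obama_support[g] for g in sample_share)
weighted = sum(sample_share[g] * weights[g] * obama_support[g] for g in sample_share)
print(f"Obama share, unweighted: {unweighted:.1%}; after weighting: {weighted:.1%}")
```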

AMY GOODMAN: Nate, CNN has this poll of polls that says it has no margin of error. As the sister of a statistician, I ask you, is this possible?

NATE SILVER: Yeah, it’s kind of a grandiose claim that CNN makes there. You know, it’s like, oh, there’s no — it’s the perfect — you know, no margin of error at all. I mean, of course this has a margin of error. And one problem is that even in states where you have, you know, tons and tons of polling data, you know, as many people as you’d want to interview — for example, in New Hampshire this year for the primary, there were literally twelve or thirteen polls with, you know, hundreds of people each that were in the field at the same time. You probably had 10,000 or 20,000 people who were interviewed by these pollsters. It’s like ten percent of the people who actually voted. And yet, the polls there missed Hillary Clinton by eight or ten points, missed badly. The problem is when polling firms make mistakes, they often make them all in the same direction. So the mere fact that you’re compiling polls together doesn’t really help when they kind of make the same mistakes, you know?
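
Silver's point, that averaging only cancels errors which are independent, is easy to demonstrate numerically. In the toy simulation below, each poll carries its own sampling noise plus a shared bias that hits every poll the same way; averaging twelve polls shrinks the first but leaves the second untouched. All the numbers are invented.

```python
import random

TRUE_MARGIN = 6.0      # hypothetical true Obama-minus-McCain margin, in points
SAMPLING_SD = 3.0      # each poll's own sampling noise
SHARED_BIAS_SD = 2.5   # industry-wide error that hits every poll the same way
N_POLLS = 12

def poll_of_polls_error():
    """Error of a 12-poll average when all polls share one common bias."""
    shared_bias = random.gauss(0, SHARED_BIAS_SD)
    polls = [TRUE_MARGIN + shared_bias + random.gauss(0, SAMPLING_SD)
             for _ in range(N_POLLS)]
    return abs(sum(polls) / N_POLLS - TRUE_MARGIN)

errors = [poll_of_polls_error() for _ in range(5000)]
print(f"Typical error of the 12-poll average: {sum(errors) / len(errors):.1f} points")
# Comes out around 2 points despite averaging, because the shared bias never cancels.
```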

AMY GOODMAN: Well, Nate Silver, I want to thank you very much for being with us. And a shout out to your sister Rebecca. In the interest of disclosure, Rebecca Silver was our graphics designer for a number of years, and she’s now at graduate school at University of Michigan. Thanks so much, Nate.

NATE SILVER: Of course. Thank you, Amy.

AMY GOODMAN: Nate Silver is founder of the polling analysis site FiveThirtyEight.com — it’s getting something like 600,000 hits a day now — and managing partner of Baseball Prospectus.
