
March 28, 2008



Someone needs to do a really really big meta-study of all these polls, and figure out if there are trends in the selection bias between people who are willing to take the surveys and those who hang up.

It sure seems like the Rasmussen poll has 10% undecided, even if they failed to report it explicitly.

D. Cupples


I'm soooo embarrassed. My mind played tricks: I added 44 and 46 and came up with 100 (though I really do know that it = 90).

Thanks for pointing out the error so I could correct it quickly.

I'm not sure HOW they could figure out sampling biases given that they need input from the very people who refuse to participate.

It's been years since I took stats. I'd have to look up how to calculate a z-score at this point.

I do remember thinking (while in class) that some questionable assumptions were involved in the process of calculating error margins. This troubled me.
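For what it's worth, the textbook margin-of-error formula the commenter is recalling is short enough to sketch. This is a minimal illustration of the standard simple-random-sample formula, with made-up poll numbers; the baked-in assumptions (a truly random sample, full response, no lying) are exactly the "questionable assumptions" at issue here.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of a confidence interval for a poll proportion.

    p: reported proportion (e.g. 0.46 for 46%)
    n: sample size
    z: z-score for the confidence level (1.96 for ~95%)

    Assumes a simple random sample with full, honest response --
    assumptions real phone polls rarely satisfy.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical poll: 46% support among 800 respondents
moe = margin_of_error(0.46, 800)
print(f"+/- {moe * 100:.1f} points")  # roughly +/- 3.5 points
```

Note that the formula says nothing about who refused to pick up the phone, which is why the reported margin can understate the real uncertainty.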

I have no idea why the polls were generally right about SC and the Potomac states but conflicted re: the states I list in this post.

Personally, I wouldn't join a big-stakes betting pool based on political polling.


My guess on the reasons for the bad polling in those four states:

California and Iowa: poor assumptions on who was a "likely voter"

New Hampshire: a big news story right on the eve of the election threw things off.

Texas: last minute ads had a significant impact, plus the "Rush effect".

Off the top of my head, a meta-study would look at things like the demographics of those who answer the surveys compared to those who do not. You then match this up against the demographics/results of exit polls, and finally the actual results. If you have enough data, you can probably tease out significant demographic and voting trends in those who answer surveys versus those who don't. You may figure out that, say, older voters who support Obama are more likely to answer the survey than older voters who do not, but no such trend exists among middle-aged voters. Stuff like that.

The thing about any sort of betting pool is, you don't need to be right all the time, you only need to be right more than average. I can do well at things like CNN political market just by taking the obvious bets.

D. Cupples


WHAT assumptions about likely voters? The surveyors ask people whether they're likely voters, no?

As for the other states, did you even look at the tables that I'd linked to in the post? NOT all polls were wrong about CA, TX, OH.... Only some were.

That was one of my points: when polls taken around the same time conflict, some of them are necessarily wrong. That's because of inherent problems with polling (or sampling) as a whole.

Unfortunately, I don't know precisely what those inherent problems are.

Tangent: another fundamental problem with phone surveys is that they rely on self-reporting. Some people say things that aren't true because they don't want to look stupid or bad to the surveyor. Some just lie for the hell of it. Nobody knows the percentages.

I realize that self-reporting is the best we have. At the same time, I'm ever aware of some of the inherent flaws of polling generally.


As I understand it, some polls do just ask whether you plan to vote, but others ask correlative questions (e.g. "did you vote in the last primary?") and throw out or discount answers from people they consider unlikely to vote. Different polls use different recipes in their secret sauces, hence varying results.

So, for example, I'd guess that there's something about Zogby's determination of a likely primary voter that has tended to favor Obama. Hence the wacky Texas/Ohio numbers. Everybody else was within margin of error, I think.
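A rough way to check the "within margin of error" claim is to ask whether two polls' gap exceeds their combined sampling error. This is a simplified sketch using the same textbook formula as above, with hypothetical Texas numbers; it assumes independent simple random samples, which real polls only approximate.

```python
import math

def differ_significantly(p1, n1, p2, n2, z=1.96):
    """Rough test: do two polls' estimates of the same candidate's
    support differ by more than combined sampling error?

    Assumes independent simple random samples (rarely true in practice).
    """
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return abs(p1 - p2) > z * se

# Two hypothetical polls: 44% of 900 respondents vs. 50% of 800
print(differ_significantly(0.44, 900, 0.50, 800))  # True: a real conflict
# 46% vs. 48% on the same samples is within the noise
print(differ_significantly(0.46, 900, 0.48, 800))  # False
```

When this test comes back True for two contemporaneous polls, at least one of them has a problem beyond random sampling error.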

You're absolutely right that there are other biases in polling. There are some famous cases of this. For instance, polls, including exit polls, showed David Duke (the former KKK wizard) getting blown out in the Louisiana governor's election, but he ended up only losing by a very narrow margin. People were embarrassed to admit they supported Duke.


I just found another poll-aggregating website that is focusing on the general election. This site has a good handle on basic statistical analysis, which makes for some interesting insights.

D. Cupples

GREAT site, Adam. Thanks.
