Learning from the Research 2000 Polling Mess
Monday, July 5, 2010 at 10:31PM
steve

This might sound strange, or even ironic, coming from an operative, but politics, and especially coverage of politics, has become too focused on the nuts and bolts, or process, of elections.  Virtually every story these days has to do with who works for whom, what so-and-so raised, and my favorite, reporting on public polls.

Over the past few years, the public sphere has been inundated with polling, and many in the media are addicted to it, publishing every ounce of data reported with little or no regard for the quality of the survey instrument, the record of the pollster, or any methodological considerations.  When I asked a member of the press about this, the response was "we just put it out there and others can decide if it is important or meaningful."  The problem is, when the media publishes polling in print or online, it immediately becomes meaningful.

It isn't that polling is a bad thing; in fact, quite the contrary.  Polling is an important tool for understanding the nature of political races and the mood of the electorate.  But most of the polls done today aren't done with that layer of context; rather, they focus on one thing:  the horse race.  Often this data comes from groups and companies with little to no public track record, yet the data is treated as fact.

This whole issue recently came to light after the founder of a leading liberal blog, Daily Kos, announced his intention to sue his pollster, Research 2000, for fraud after a number of investigations called into question the reliability of its data.  It was the right thing to do.  Many folks, including me, would shake their heads at some of the data coming out of that shop.  But here is the problem: for over a year, the press had been reporting on these polls without a shred of concern as to their accuracy.  Even here in Florida, we have felt the impact.

Flash back to November 2009: Marco Rubio's campaign was all the talk, but the race still appeared to be Crist's for the taking.  All but one poll to that point had Crist over 50% (and that one had him at 49), no poll had the race inside 15 points, and most had the margin in the mid-20s.

But the narrative began to change in a meaningful way when the Daily Kos/Research 2000 poll was released in late November.  It was the first poll to have the race at 10 points or less, showing Crist at 47 and Rubio at 37, and the first to show Crist's approval under 50 among Republicans.  But that wasn't the only major thing to pop out of that poll:  at the same time, they released data showing Crist winning a three-way Senate race as an independent, fueling not only the "would he do it" talk, but also the "can he win" debate.  Except now we have no idea if any of that data was collected in a scientific way.

Research 2000 isn't the only firm to have problems; it is just the most recent. 

One of the things I do in my spare time, and one I probably shouldn't admit to, is trying to replicate public data.  More often than one would expect, I find that the data released by company or organization X was based on a model or methodology that bears little resemblance to reality, such as over- or under-sampling primary voters, or using vote models that don't take into account Florida's changing diversity. 
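To give a feel for what that kind of check looks like, here is a minimal sketch in Python of re-running a poll's crosstabs under a different turnout model.  Every number here is made up for illustration; it is not from any real survey.  The point is just that the same group-level support, weighted by two different assumptions about who shows up, can produce very different toplines:

```python
# Hypothetical example: reweight a poll's party-ID crosstabs under a
# different turnout model and see how the topline shifts.
# All numbers are illustrative, not from any real survey.

# Candidate support within each party-ID group (from the crosstabs)
support = {
    "Dem": {"A": 0.80, "B": 0.15},
    "Rep": {"A": 0.10, "B": 0.85},
    "Ind": {"A": 0.45, "B": 0.40},
}

# Two competing turnout models: each group's share of the electorate
pollster_model = {"Dem": 0.42, "Rep": 0.36, "Ind": 0.22}
alternate_model = {"Dem": 0.37, "Rep": 0.40, "Ind": 0.23}

def topline(model, candidate):
    """Weighted average of group-level support under a turnout model."""
    return sum(share * support[group][candidate]
               for group, share in model.items())

for name, model in [("pollster", pollster_model),
                    ("alternate", alternate_model)]:
    a = topline(model, "A")
    b = topline(model, "B")
    print(f"{name}: A {a:.1%}  B {b:.1%}  margin {a - b:+.1%}")
```

In this toy case, shifting five points of the electorate from one party to the other flips a one-point lead into a several-point deficit, which is exactly why the composition of the sample matters as much as the horse-race number on top.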

In my perfect world, a public poll wouldn't land on a blog or in the paper unless the full methodology was explained.  Good scientists (which is what pollsters should be) should have no problem letting others mess around with their data and test its veracity.  It shouldn't take much: just tell us what your sample looked like and how you modeled your voters, and release your crosstabs. 
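Even the most basic of those details, the sample size, lets a reader do a quick sanity check.  Here is a short sketch of the textbook margin-of-error calculation; note it assumes a simple random sample, so a real weighted poll's effective margin is somewhat larger:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# e.g. a 600-person statewide sample
print(f"+/- {margin_of_error(600):.1%}")  # roughly +/- 4 points
```

So when a pollster with a 600-person sample reports a one-point swing from last month as news, the honest answer is that the poll can't actually see a move that small.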

But assuming that isn't going to happen, take two steps when you find a poll posted on a blog:  go to Nate Silver's FiveThirtyEight.com and check out his rankings of pollster accuracy, then go to the website of the organization that did the polling and check out the data for yourself.  

Many factors determine how elections turn out, but too often media coverage is focused on just one or two of them.  So take all of this data in context.  If polling taken a year before an election were solid gold, we'd be talking about President Giuliani or President Clinton.  Clearly, history decided to go in a different direction.  

 

Article originally appeared on Steve Schale -- Veteran Florida Man Politico (http://www.steveschale.com/).
See website for complete article licensing information.