
23 May 2013

Polling Voters #nlpoli

If you are still mulling over the British Columbia election result and the polls, take a look at this post by Eric Grenier at threehundredeight.com.  It includes a link to his piece in the Globe on Wednesday on the same topic.

Pollsters tend to weight their samples to match the population as a whole.  Problem:  that isn’t the same as the demographic profile of voters.

Grenier shows how Ipsos, for example, weighted a poll equally across three age groupings.  In the 2013 election, those age groupings didn’t turn out equally.  The over-55s made up half the total voter turn-out, not one third.
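To see why that weighting choice matters, here is a rough sketch in Python.  The party-support and turnout figures are made up purely for illustration; they are not the actual Ipsos or Elections BC numbers, only the kind of shift Grenier is describing.

```python
# Illustrative only: hypothetical support for "Party A" by age group.
support = {"18-34": 0.30, "35-54": 0.40, "55+": 0.55}

# Equal thirds, the way the poll in Grenier's example was weighted...
population_weights = {"18-34": 1 / 3, "35-54": 1 / 3, "55+": 1 / 3}
# ...versus a turnout in which the over-55s cast half the ballots.
turnout_weights = {"18-34": 0.20, "35-54": 0.30, "55+": 0.50}

def weighted_support(support, weights):
    # The topline number is each group's support times that group's weight.
    return sum(support[group] * weights[group] for group in support)

print(f"Weighted to the general population: {weighted_support(support, population_weights):.1%}")
print(f"Weighted to actual turnout:         {weighted_support(support, turnout_weights):.1%}")
# Same raw responses, roughly a four-point gap in the headline number.
```

The raw interviews never change; only the assumption about who shows up to vote does, and that assumption alone can move the headline number by several points.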

As Grenier put it, rather colourfully,

To put it in the context of the market research that is the bread-and-butter of polling firms, the failure to identify voters and how they felt about the campaign was equal to a failure to identify a company's likely customer base, and how they feel about an advertising campaign. A poll is of little use to a diaper company if it is identifying the shopping habits of childless adults, and especially adults who have no intention of having children.

Missing the target market is a pretty big problem, but polling firms do it fairly regularly in their election work.  Back in 2007, for example, SRBP noted a huge problem with CRA polls:  they are off by as much as 20 percentage points in some respects.

A quick review of CRA’s methodology pointed out a distortion in the firm’s practice of tossing aside anyone who is undecided and reporting party choice as a share only of the people who picked a party.  There’s no valid reason for doing that.  American pollsters expressly reject the practice.  And there’s absolutely no reason to believe that the resulting numbers correspond in any way to actual voting behaviour.  Take the 2007 or 2011 general elections and you’ll see how far out of whack some of CRA’s reported results were.
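A small, hypothetical example shows what that re-basing does to the reported numbers.  The figures below are invented for illustration and are not CRA’s actual results.

```python
# Illustrative only: invented raw responses from 1,000 respondents.
raw = {"Party A": 450, "Party B": 200, "Party C": 50, "Undecided": 300}

total = sum(raw.values())
decided = total - raw["Undecided"]

for party in ("Party A", "Party B", "Party C"):
    share_of_all = raw[party] / total
    share_of_decided = raw[party] / decided
    print(f"{party}: {share_of_all:.0%} of all respondents, "
          f"{share_of_decided:.0%} of decided respondents")

# Party A: 45% of all respondents, 64% of decided respondents.
# Dropping the undecided 30% makes every party look stronger than the
# full sample supports, and it says nothing about how those undecided
# respondents will eventually vote, if they vote at all.
```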

But the root of the whole problem of polling accuracy goes back to the fact that pollsters don’t look for people who will actually vote when they do these public polls.  We don’t know what they do for private clients because they keep that information confidential.  If they deliver poll results based on the same mistaken weighting – population as a whole versus voters – then Grenier’s got a point.

This isn’t the only cause of polling error compared to voting results.  As Grenier notes, some people change their minds at the last minute or only make them up at the last minute.  Polling for news reporting won’t catch that in most cases.

Then there is the issue of who gets polled.  Polling firms will happily tell you it is getting harder and harder to get people to do a telephone survey of any kind.  The people they do reach may not reflect the population as a whole, even though they may wind up being closer in profile to actual voters than not.  If they do, it might be a coincidence more than the result of sound methodology.

Many firms have switched from the old method to some form of volunteer panel from which they pick a sample.  They may contact some by phone and others by e-mail to get them to respond to an online questionnaire.  Those new methods bring their own problems.  No method is foolproof.

And of course, if the polls are of a general population, then you can pretty much miss any significant changes happening at the district level.  In 2011, we had more polling in a general election than in any election in recent memory.  None of the polls – not one of them – picked up the NDP surge in the metro region that produced a historic political turn-around.

Environics polling in early October did pick up a strong second-place showing for the NDP and a Conservative number close to the actual result.  An NTV/Telelink poll in early October had the Tory number very close to the final result.  As much as anything else, though, that accuracy was a result of their reporting method.  Few local media – if any – picked up on the numbers as a clue that something might be going on besides a Tory majority government.

Grenier makes a good suggestion when he encourages firms to release more information about their methodology. SRBP said much the same thing at the end of a lengthy series after the last provincial election.   At the very least, that increased disclosure would help outsiders understand what the firms are reporting.  Even if the firms did nothing more than follow their own industry association guidelines, Canadian polling firms would be offering infinitely greater transparency than most of them do currently.

The next election might well be coming in Nova Scotia, as Grenier notes.  He thinks that’s a good thing because he believes Corporate Research Associates “has a good track record”.  Your humble e-scribbler would disagree strongly, at least as far as Newfoundland and Labrador is concerned.  CRA gets some things pretty close, but others – like the Conservative vote and the “will not vote” number – tend to be off by huge amounts.

-srbp-