20 October 2011

PPM: Controversy, Accountability and Disclosure #nlpoli

On October 3, Liberal leader Kevin Aylward issued a news release in which he claimed that the second MQO poll released the Friday before had been “bought and paid for by the Tories.”

In the release, the Liberals also claimed that “[t]he Dunderdale Government has bought and paid for this online survey.” 

The Liberals had no evidence to back their claims that the poll was bought and paid for.  They offered no evidence to refute the poll’s findings.

In 1989, faced with a partisan poll scam, the Liberal campaign released its own internal data, which proved far more accurate than the numbers released by the Conservatives’ pollster.

In 2011, the Liberals didn’t have polls, let alone ones that could give numbers different from the ones MQO produced.  The release served only to keep a bad-news poll alive for the Liberals into the third week of the campaign. It looked like the work of a disorganized campaign fumbling about for anything that could stave off collapse.  The release reeked of desperation.

That desperation became all too apparent as NTV, Environics and then CRA polls appeared, all conducted around the same time as the MQO survey and all showing the Liberals in more or less the same place.

Industry controversy, too

After the campaign, CRA president Don Mills complained publicly about campaign poll reporting.  The Telegram quoted Mills:
“There’s a lot of people who say online research is just as good as telephone research. That has not been proven to be true and we have recent examples in Atlantic Canada where a competitor of ours has used an online methodology and have not got it within the margin of error they quoted,” he said.
Of course, Mills’ poll didn’t come any closer, but his comments did point to problems with the publicly released polls.

Industry Standards

The Market Research and Intelligence Association represents Canadian market research and public opinion firms.  MRIA has established standards for the public release of polls by firms.  The standards include these provisions:
1) Please include the following key facts in the report:
  • Sample size, and population surveyed (who was included)
  • Sponsor of study (who commissioned the research)
  • Survey method (e.g. telephone, on-line, intercept)
  • Timing (when the survey was done)
  • Statement of sample error/margin of error (i.e. "+/- 2.5% 19 times out of 20")
2) Please make the following facts available to the public upon request (if not included in report):
  • Name of practitioner (company conducting research)
  • Sampling method (e.g. random, custom list)
  • Weighting procedures (statistical weights, if used)
  • Exact wording and order of questions
3) Always differentiate between scientific (most public opinion polls) and non-scientific studies (reader/viewer polls or other "self-selection" methodologies). 
4) Where appropriate, use the caveat that research is not necessarily predictive of future outcomes, but rather captures opinion at one point in time.
What’s the problem?

Both the Aylward and Mills criticisms are rooted in one problem:  a lack of disclosure of details of the polling not just by MQO but of all the firms that released polling during the election.

MQO only confirmed its research was done independently of its clients once Aylward raised the issue.  But even for the sake of its own public image, MQO’s two releases didn’t offer a great deal of information that would have avoided that controversy or the issue Mills raised.

How do the firms stack up?

A simple reading of the first MQO release suggests that it met those MRIA standards exactly, as limited as they are, with respect to what must be included: 
The poll was based on a sample of 413 Newfoundlanders and Labradorians, with a margin of error of +/- 4.9%.  It was conducted between Friday, September 16 and Sunday September 18, 2011.
The release did not indicate the type of survey (telephone, online, etc.).  It did identify the company that conducted the work (part of the “on request” information) but didn’t release any of the other information in that section.

If this release complied with the MRIA standard, the poll was a sample of all people in the province regardless of whether they could vote or not (“Newfoundlanders and Labradorians”). That would make it difficult to compare the poll to any others or even to the voting population as a whole.

There’s no indication of how MQO selected the sample, either.  Was it a randomly selected sample or was it something else?
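The “19 times out of 20” margins quoted in these releases follow from a standard worst-case formula for a simple random sample: 1.96 × √(0.25/n). A quick sketch (my own illustration; the firms haven’t published their exact calculations):

```python
import math

def margin_of_error(n, z=1.96):
    """Worst-case 95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(0.25 / n)

# Sample sizes quoted in the MQO and CRA releases
for n in (413, 464, 400):
    print(n, f"+/- {margin_of_error(n):.1%}")
```

This reproduces CRA’s ±4.9 for n = 400 and MQO’s ±4.6 for n = 464; n = 413 works out to about ±4.8, a shade tighter than the ±4.9 MQO quoted. Note that the formula presumes random selection; for a self-selected panel it has no statistical justification, which is the nub of Mills’ complaint.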

The second MQO release disclosed some additional information, but that just raised questions about whether or not the second poll could be compared to the first:
The poll was based on a sample of 464 residents of Newfoundland and Labrador, with a margin of error of +/- 4.6 per cent. The research was conducted via phone and online between Wednesday, September 28 and Friday, September 30, 2011, using MQO’s research panel iView Atlantic. The sample was weighted regionally to ensure proper representation of the province as a whole.
Again, the sample appears to have been drawn from people resident in the province, regardless of age.  The second release includes information on how MQO collected the data – “via phone and online” – but it isn’t clear how the sample was selected if MQO used a previously screened research panel.

The other firms aren’t necessarily any better, though. Consider how Canadian Press described the Environics poll:
Unlike traditional telephone polling, in which respondents are randomly selected, the Environics survey was conducted online from Sept. 29 to Oct. 4 among 708 respondents, all of whom were chosen from a larger pool of people who were recruited and compensated for participating. Environics then adjusts the sample to reflect a broad spectrum of the population.
The non-random nature of online polling makes it impossible to determine statistically how accurately the results reflect the opinions of the population at large.
That is more information in many respects, right down to the fact that participants in the panel were recruited and paid for taking part. And to be fair to Environics, this is the CP version, not necessarily the exact description the polling firm gave of its sample and population (who they were studying).
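The CP caveat about non-random selection can be illustrated with a toy simulation (entirely my own sketch, with invented opt-in rates, not anything drawn from Environics or MQO):

```python
import random

random.seed(7)

# Toy population: 40 per cent support a party (a made-up figure).
population = [1] * 40_000 + [0] * 60_000

# Probability sample: every member has an equal chance of selection.
srs = random.sample(population, 700)

# Self-selected online panel: supporters opt in at three times the rate
# of non-supporters (invented rates, purely to show the mechanism).
panel = [p for p in population if random.random() < (0.03 if p else 0.01)]
opt_in = random.sample(panel, 700)

srs_share = sum(srs) / len(srs)
opt_in_share = sum(opt_in) / len(opt_in)
print(f"random sample: {srs_share:.1%}, self-selected: {opt_in_share:.1%}")
```

With these made-up opt-in rates, the random sample lands near the true 40 per cent while the self-selected one drifts toward two-thirds. No margin of error attached to the second figure would describe that systematic error, which is why statisticians balk at quoting one for non-probability panels.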

And then there’s how CRA typically describes its quarterly omnibus:
These results are part of the CRA Atlantic Quarterly®, an independent survey of Atlantic Canadians, and are based on a sample of 400 adult Newfoundland and Labrador residents.  The survey was conducted from August 15 to August 31, 2011 with overall results for the province accurate to within +/- 4.9 percentage points in 95 out of 100 samples.
Some information is missing – presumably it was a random sample – but CRA appends to the release a table of data, the wording of questions and comparative data for several previous polls.

Even with industry standards, and even though the firms involved are all MRIA members – MQO and CRA are Gold Standard members – the polling firms don’t follow the same practices in how they report polling information. 

Why Disclosure is Important

For those who might think Mills’ criticism about the online panel is accurate, check Geoff Meeker’s post on the controversy.  Meeker put the question of the panel to MQO and got a written reply.  Included was this description of the panel composition that – from MQO’s standpoint – justified quoting a margin of error where Environics did not:
iView Atlantic is a probability-based panel, meaning that every member of the population has an equal chance of being selected for participation.
These are not esoteric issues.  The MRIA news release accompanying the disclosure standards included these comments by the past president of MRIA’s predecessor organization:
"All of these reporting items create transparency in the polling process and help the public establish an informed opinion about the results of a poll…If we want people to take part in the democratic process, we must be certain they have confidence in the way we conduct our business, from the time they answer our questions through to the results being published."
Ensuring public awareness of opinion research techniques and issues also fulfills objectives set out in the MRIA Code of Conduct. Members of the public should be able to have complete confidence in the poll results they see as well as in their ability to compare results from different research methods.

Given both the controversy in the recent election and the variation in the amount of information polling firms released – even with established standards – the MRIA has some work to do.

The American Example

With a larger polling industry, more election polling and a much wider experience with controversy, the American polling industry has much more stringent disclosure standards than those of MRIA. None of the polls released during the Newfoundland and Labrador election came even close to the American standards. 

Few Canadian polling firms typically come close to the American standards, but then again, the local controversy is merely one part of a much larger issue in the Canadian industry.  Perhaps it is time for MRIA and its members to review their existing standards with an eye to making them more prescriptive.

The American Association of Public Opinion Research also promotes education for journalists on polls and polling in order to improve knowledge of industry standards and practices. AAPOR has also developed an online course for journalists run in co-operation with the Poynter Institute.

The National Council on Public Polls has also compiled a list of 20 questions journalists should ask about polls.  The information contained in the questions and answers includes plain language discussions of margins of error and sources of error.

- srbp -

1 comment:

rod said...

Anybody with a brain knows that polls can be easily manipulated to produce the results desired by the people who sponsored the survey. You might as well be shouting at the moon.

Keep digging at Muskrat Falls Ed. It's the bigger issue.