20 October 2011

Who leaked the amalgamation report? #nlpoli

Was it a city councillor with an axe to grind?

Do the neighbouring municipalities have a mole at Tammany at Gower?

- srbp -

The Dysfunctional Dunderdale Administration #nlpoli

Kathy Dunderdale thinks the provincial legislature is dysfunctional.

Well, if the House sat more often than it has since Dunderdale’s been a member, they might be doing better.

Dunderdale as premier is carrying on the tradition of her predecessor of avoiding the province’s legislature as much as possible. As labradore pointed out on Wednesday,  Newfoundland and Labrador has had the shortest election campaigns in the country and gone the longest period between voting day and when the legislature sat next.

In 2007, the Tories – Dunderdale was deputy premier – went 153 days after the election before they showed up in the legislature.

SRBP has been pounding on this issue since 2006. Dunderdale is continuing her predecessor’s ignoble tradition.

Uppity-date:  labradore produced another picture that shows the time lapse between voting day and Throne Speech.  The results for some provinces changed dramatically.

But guess what?  Newfoundland and Labrador still drags its ass across the finish line in the democracy marathon.  Or sprints effortlessly to an easy win in the anti-democracy dysfunction.

It all depends on how you look at it.

Either way, it’s nothing to be proud of.

- srbp -

PPM: Controversy, Accountability and Disclosure #nlpoli

On October 3, Liberal leader Kevin Aylward issued a news release in which he claimed that the second MQO poll released the Friday before had been “bought and paid for by the Tories.”

In the release, the Liberals also claimed that “[t]he Dunderdale Government has bought and paid for this online survey.” 

The Liberals had no evidence to back their claims that the poll was fabricated.  They offered no evidence to refute the poll’s findings.

In 1989, faced with a partisan poll scam, the Liberal campaign released its own internal data, which proved to be far more accurate than the numbers released by the Conservatives’ pollster.

In 2011, the Liberals didn’t have polls, let alone ones that could give numbers different from the ones MQO produced.  The release served only to keep a bad news poll for the Liberals alive into the third week of the campaign.  It looked like a desperate effort by a disorganized campaign fumbling about for anything that could stave off collapse.  It reeked of desperation.

That desperation became all too apparent as NTV, Environics and then CRA polls appeared, all conducted around the same time as the MQO one and all showing the Liberals in more or less the same place as MQO did.

Industry controversy, too

After the campaign, CRA president Don Mills complained publicly about campaign poll reporting.  The Telegram quoted Mills:
“There’s a lot of people who say online research is just as good as telephone research. That has not been proven to be true and we have recent examples in Atlantic Canada where a competitor of ours has used an online methodology and have not got it within the margin of error they quoted,” he said.
Of course, Mills’ poll didn’t come any closer, but his comments did point to problems with the publicly released polls.

Industry Standards

The Market Research and Intelligence Association represents Canadian market research and public opinion firms.  MRIA has established standards for the public release of polls by firms.  The standards include these provisions:
1) Please include the following key facts in the report:
  • Sample size, and population surveyed (who was included)
  • Sponsor of study (who commissioned the research)
  • Survey method (e.g. telephone, on-line, intercept)
  • Timing (when the survey was done)
  • Statement of sample error/margin of error (i.e. "+/- 2.5% 19 times out of 20")
2) Please make the following facts available to the public upon request (if not included in report):
  • Name of practitioner (company conducting research)
  • Sampling method (e.g. random, custom list)
  • Weighting procedures (statistical weights, if used)
  • Exact wording and order of questions
3) Always differentiate between scientific (most public opinion polls) and non-scientific studies (reader/viewer polls or other "self-selection" methodologies). 
4) Where appropriate, use the caveat that research is not necessarily predictive of future outcomes, but rather captures opinion at one point in time.
What’s the problem?

Both the Aylward and Mills criticisms are rooted in one problem:  a lack of disclosure of details of the polling not just by MQO but of all the firms that released polling during the election.

MQO only confirmed its research was done independently of its clients once Aylward raised the issue.  But even for the sake of its own public image, MQO‘s two releases didn’t offer a great deal of information that would have avoided that controversy or the issue Mills raised.

How do the firms stack up?

A simple reading of the first MQO release suggests that it met those MRIA standards, as limited as they are, with respect to what must be included:
The poll was based on a sample of 413 Newfoundlanders and Labradorians, with a margin of error of +/- 4.9%.  It was conducted between Friday, September 16 and Sunday September 18, 2011.
The release did not indicate the type of survey (telephone, online, etc.).  It did identify the company that conducted the work – part of the “on request” information – but didn’t release any of the other information in the same section.
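One thing the releases did include – sample size and margin of error – is easy to check.  For readers who want to verify the arithmetic, here is a minimal sketch; it assumes the conventional worst-case formula for a simple random sample at a 95 per cent confidence level, since none of the firms published their actual calculations:

```python
import math

def margin_of_error(n, z=1.96):
    """Worst-case (p = 0.5) margin of error, in percentage points,
    for a simple random sample of size n at 95% confidence."""
    return 100 * z * math.sqrt(0.25 / n)

# Sample sizes reported during the campaign
for firm, n in [("MQO #1", 413), ("MQO #2", 464),
                ("NTV/Telelink", 511), ("Telegram/CRA", 800)]:
    print(f"{firm}: n = {n}, MoE = +/- {margin_of_error(n):.1f} points")

# Prints 4.8, 4.5, 4.3 and 3.5.  The firms quoted 4.9, 4.6, 4.3 and 3.5;
# the tenth-of-a-point gaps on the MQO polls are consistent with using
# z = 2 instead of 1.96, or simply rounding up.
```

The point is not the decimal places; it is that a reader can only do even this much checking when a firm discloses its sample size in the first place.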

If this release complied with the MRIA standard, the poll was a sample of all people in the province regardless of whether they could vote or not (“Newfoundlanders and Labradorians”). That would make it difficult to compare the poll to any others or even to the voting population as a whole.

There’s no indication of how MQO selected the sample, either.  Was it a randomly selected sample or was it something else?

The second MQO release disclosed some additional information, but that just raised questions about whether or not the second poll could be compared to the first:
The poll was based on a sample of 464 residents of Newfoundland and Labrador, with a margin of error of +/- 4.6 per cent. The research was conducted via phone and online between Wednesday, September 28 and Friday, September 30, 2011, using MQO’s research panel iView Atlantic. The sample was weighted regionally to ensure proper representation of the province as a whole.
Again, the sample appears to have been drawn from people resident in the province, regardless of age.  The second release includes information on how MQO collected the data – “via phone and online” – but it isn’t clear how the sample was selected if MQO used a previously screened research panel.

The other firms aren’t necessarily any better, though. Consider how Canadian Press described the Environics poll:
Unlike traditional telephone polling, in which respondents are randomly selected, the Environics survey was conducted online from Sept. 29 to Oct. 4 among 708 respondents, all of whom were chosen from a larger pool of people who were recruited and compensated for participating. Environics then adjusts the sample to reflect a broad spectrum of the population.
The non-random nature of online polling makes it impossible to determine statistically how accurately the results reflect the opinions of the population at large.
That’s more information in many respects, right down to the fact that participants in the panel were recruited and paid for participating. And to be fair to Environics, this is the CP version, not necessarily the exact description the polling firm gave of its sample and population (who they were studying).

And then there’s how CRA typically describes its quarterly omnibus:
These results are part of the CRA Atlantic Quarterly®, an independent survey of Atlantic Canadians, and are based on a sample of 400 adult Newfoundland and Labrador residents.  The survey was conducted from August 15 to August 31, 2011 with overall results for the province accurate to within + 4.9 percentage points in 95 out of 100 samples.
Some information is missing – presumably it was a random sample – but CRA appends to the release a table of data, the wording of questions and comparative data for several previous polls.

Even with industry standards and even considering the firms involved  are all MRIA members – MQO and CRA are Gold Standard members – the polling firms don’t follow the same practices in how they report polling information. 

Why Disclosure is Important

For those who might think Mills’ criticism of the online panel is accurate, check Geoff Meeker’s post on the controversy.   Meeker put the question of the panel to MQO and got a written reply.  Included was this description of the panel composition that – from MQO’s standpoint – justified including a margin of error while Environics did not:
iView Atlantic is a probability-based panel, meaning that every member of the population has an equal chance of being selected for participation.
These are not esoteric issues.  The MRIA news release accompanying the disclosure standards included these comments by the past president of MRIA’s predecessor organization:
"All of these reporting items create transparency in the polling process and help the public establish an informed opinion about the results of a poll…If we want people to take part in the democratic process, we must be certain they have confidence in the way we conduct our business, from the time they answer our questions through to the results being published."
Ensuring public awareness of opinion research techniques and issues also fulfills objectives set out in the MRIA Code of Conduct. Members of the public should be able to have complete confidence in the poll results they see as well as in their ability to compare results from different research methods.

Given both the controversy in the recent election and the variation in the amount of information polling firms released – even with established standards – the MRIA has some work to do.

The American Example

With a larger polling industry, more election polling and a much wider experience with controversy, the American polling industry has much more stringent disclosure standards than those of MRIA. None of the polls released during the Newfoundland and Labrador election came even close to the American standards. 

Few Canadian polling firms typically come close to the American standards, but then again, the local controversy is merely one part of a much larger issue in the Canadian industry.  Perhaps it is time for MRIA and its members to review their existing standards with an eye to making them more prescriptive.

The American Association of Public Opinion Research also promotes education for journalists on polls and polling in order to improve knowledge of industry standards and practices. AAPOR has also developed an online course for journalists run in co-operation with  the Poynter Institute.

The National Council on Public Polls has also compiled a list of 20 questions journalists should ask about polls.  The information contained in the questions and answers includes plain language discussions of margins of error and sources of error.

- srbp -

19 October 2011

Amalgamation #nlpoli

The only way anyone should agree to amalgamation on the northeast Avalon is if the entire crowd at Tammany on Gower - council and senior staff alike - have absolutely nothing to do with the administration of the new city.

As a townie, there is no way your humble e-scribbler could sleep at night if the crowd on Gower Street currently wreaking havoc on common sense got the chance to spread their ways to anyone else in the province.

- srbp -

PPM: The Polls and the Local Media #nlpoli

All daily media in the province reported polls released by polling firms during the 2011 general election.

They reported the polls as the firms released them.  That is, they did not question or alter the presentation of the numbers, nor did they discuss the different methodology used to generate them. This CBC report is typical.

The second MQO poll, released on September 30 stands out in particular as a result of the way it was reported.  This poll prompted a news release from Liberal leader Kevin Aylward that we’ll discuss in greater detail in the final segment of this series.

What’s most interesting about the second MQO poll is the way the firm reported the results for a question it asked on the leaders’ debate. 

One can make a theoretical argument about eliminating undecideds in the presentation of a party choice question. Some pollsters would claim this would allow you to match their poll result with the popular vote on election day.

But there’s no reason to alter the presentation of the results in such a way for any other sort of question.  Yet that is exactly what MQO did with the leaders’ debate question.  For some inexplicable reason, MQO reported the results in such a way that they placed the emphasis on those who watched the debate.  They then presented their responses as if those who reportedly watched the debate made up 100% of the sample.

When asked about the leaders’ debate, 34 per cent of those polled said they watched the televised leaders’ debate on Wednesday, September 28. Of those respondents who watched the debate, 36 per cent felt Kathy Dunderdale won the debate, while Lorraine Michael was seen as the winner by 22 per cent, and six per cent said Kevin Aylward came out on top. The remainder of respondents said there was no clear winner of the debate.

The result was this sort of presentation in news stories:

The poll poses serious questions for Kevin Aylward and the Liberals who have been struggling through the campaign with an accumulated debt and difficulty with recruiting candidates.

Of respondents who had watched the leaders' debate on Wednesday night, only six per cent said Aylward won it. By contrast, 36 per cent chose PC Leader Kathy Dunderdale and 22 per cent chose NDP Leader Lorraine Michael.

"The remainder of respondents said there was no clear winner of the debate," MQO said in a statement.

In the rush to write a news story, no reporters caught that 66% of the respondents didn’t watch the debate.  They didn’t adjust the rest of MQO’s numbers accordingly. The result is that MQO’s misleading presentation of its results survived into newscasts.
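Rebasing the published figures to the whole sample is a simple exercise.  Here is a rough sketch using only the percentages in the MQO release; the exact numbers would depend on raw counts MQO did not publish:

```python
watched = 0.34   # share of the whole sample who said they watched the debate

# MQO's reported shares among watchers only
among_watchers = {
    "Dunderdale won": 0.36,
    "Michael won": 0.22,
    "Aylward won": 0.06,
    "No clear winner": 1 - (0.36 + 0.22 + 0.06),   # "the remainder"
}

# Re-express each figure as a share of the whole sample
whole_sample = {k: round(v * watched * 100, 1) for k, v in among_watchers.items()}
whole_sample["Did not watch"] = round((1 - watched) * 100, 1)

print(whole_sample)
# Prints roughly: Dunderdale 12.2, Michael 7.5, Aylward 2.0,
# no clear winner 12.2, did not watch 66.0 (per cent of all respondents)
```

Presented that way, the headline would have been that two-thirds of respondents did not watch the debate at all – a rather different story.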

Aside from the MQO and Environics polls released by the companies themselves to all media, two of the province’s major media outlets commissioned polls for the 2011 election.

NTV commissioned its usual pollster, Telelink, to conduct a poll similar to ones the two have done together on several occasions over the past six years.

The poll served the usual purpose.  It gave NTV exclusive news content as well as a marketing opportunity.  NTV was also able to ask a question on Muskrat Falls similar to one from an NTV/Telelink poll in February.  As such, NTV was able to describe not only the results of the new poll but the trend – in this case a decline in support – from the earlier poll.

The Telegram commissioned CRA to conduct a poll with what turned out to be the largest sample of any poll conducted during the election. 

Consistent with experience elsewhere, the Telegram used the poll to generate large amounts of original content in addition to reaction pieces.  The poll was the centrepiece of the paper’s election coverage over three days, beginning on October 6. 

The poll, however, did not add significant new information beyond the other poll results.  The Saturday edition – October 8 – featured responses to questions on the so-called rural-urban divide.  However, the questions – which party do you think best deals with rural issues, and so on – basically mirrored the satisfaction questions without adding significant depth or colour to public understanding of the issues and public opinion. (“Satisfaction” questions ask respondents to indicate how satisfied they are with government performance on a given topic.)

The Saturday edition of the Telegram also reported on another key question.  The Telegram asked respondents which party they would choose if Danny Williams was still the Tory leader.

Curiously,  the Telegram found that health care was overwhelmingly the major issue for respondents, just as CRA polls for the provincial government reported for the past year.  However, the Telegram did not probe any dimensions of that opinion.  What is it about health care that people are so concerned about?

The most significant aspect of the media and polls during the 2011 election was not what the media reported of specific polls.  Rather, it was the conclusion that news media drew from the CRA quarterly omnibus results and then from the subsequent polls.

The Tories were assured of a massive majority, so the interpretation went.  The only thing potentially worth watching was a race for second place in poll results.

You can see the theme in national media – CTV or the Globe for example – and you can also see it in local coverage.  The CBC interpreted the second MQO poll with a particularly strident emphasis on the supposed loss of ground by the Liberals in the poll.  The decline, incidentally, was well within the margin of error.  The CBC characterised the change in numbers as “freefall.”

What this interpretation missed as a result was the dramatic battle between the New Democrats and the Conservatives in St. John’s.  No daily media in the province reported it before the election results on October 11.  In the October 8 edition, for example, the Telegram election coverage made no mention of the battle beyond the NDP’s confidence that seats would change hands.

If they missed entirely a pitched battle right under their noses, it is no surprise that they also missed the Liberal campaigns in western Newfoundland that resulted in the party winning enough seats to continue as the official opposition in the House of Assembly.  The Liberals, as one wag noted, refused to follow the media script and die on cue.

By following the polls – marketing devices for the polling firms and for some news outlets – the news media missed the news in the local election.

- srbp -

 

18 October 2011

Follow the money: political finance edition #nlpoli

If you have eyes, be prepared to have them popped by labradore’s latest comparison of party financing.

He looks at the pattern of corporate political donations in metro Halifax and metro St. John’s from 2005 to 2009.  The results are startling.  In a region with a smaller population, the corporate sector in St. John’s gave more cash and they gave it disproportionately to the party in power.

Add that bit of information to a post on Monday that showed just how much the corporate sector gave in just a single year, namely 2010.

In 2010, the governing Progressive Conservatives raised $690,000 in reportable contributions, versus the Liberals $31,000 and the NDP's $59,000. That is the highest amount the Tories have ever raised in an off-election year.

Of the PC total that year, fully $383,000 — over 55% — came from business donors in the greater St. John's area.

And just to further refine those numbers, bear in mind that of the $690,000, the Tories got $235,000 or thereabouts from one sector:  the construction industry.

- srbp -

Handling the Undecideds #nlpoli

Opinion polls conducted in Newfoundland and Labrador that ask about party choice measure the opinion of the entire population of eligible voters.

As such, discarding the undecided responses (anything other than a party choice) or reallocating the undecideds according to some pre-determined policy tends to distort the poll results. It doesn’t matter whether the question is about a theoretical election “tomorrow” or one that will actually occur two or three weeks in the future.

What to do with the undecideds is a contentious issue among pollsters themselves.  The technique CRA or MQO used in their election poll reports has the effect of allocating the self-identified undecideds according to the same breakdown as those already decided.
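Here is a minimal sketch of what that renormalisation does, using the raw CRA figures that appear in the table below:

```python
# Raw CRA shares of all respondents, as reported by the Telegram (per cent)
raw = {"PC": 44, "NDP": 18, "Liberal": 12, "Undecided/will not vote": 26}

# Drop the undecideds and treat the decided responses as 100% of the sample
decided_total = raw["PC"] + raw["NDP"] + raw["Liberal"]
decided_only = {party: round(100 * share / decided_total, 1)
                for party, share in raw.items()
                if party != "Undecided/will not vote"}

print(decided_only)
# Prints roughly PC 59.5, NDP 24.3, Liberal 16.2 -- the published "decided"
# figures (59/25/16) differ only because of rounding in the inputs.
# Every party is scaled up by the same factor (100/74): in effect, the 26%
# who were undecided or not voting are assumed to split exactly like the decideds.
```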

But the experts at pollster.com, for example, will insist that late undecideds tend to break for the challenger.  That’s the opposite of what MQO and CRA do.

Other pollsters handle the undecideds differently.  As Mark Blumenthal noted in a 2004 post, the Pew Institute and Gallup sometimes allocate undecideds evenly among the other choices.

What’s interesting to note, though, is that the undecideds in the American poll results tend to be less than 10% of respondents.  The decideds typically split between two choices, each of which is four or five times larger than the undecided share.  Reallocating that small a percentage or discarding it entirely does not necessarily skew the picture of public opinion that greatly.

In the recent provincial general election, MQO reported the lowest level of undecideds in its polling at 18% for its second poll and 20% for the first one. But at least one of these was not the traditional random sample pollsters historically use.

The “undecided” category appears to capture those who said they were undecided, those who said they did not intend to vote and those who refused to answer the question.

In the telephone polls during the election that apparently used random samples – Telelink and CRA – the lowest reported undecided/will not vote/refused was 26%, for CRA in both its poll for the Telegram and its August quarterly omnibus.

Telelink hit 42%. 

Environics had an undecided of 30% using its online panel survey method.

With undecideds at those levels, reallocating them can significantly distort the perception of what opinion the public actually holds.

This is no small point when the polls are apparently intended to describe the opinions held by all adults aged 18 and over, not just those who may – from election to election – decide to go to the polls.

In the most recent general election, the percentage of eligible voters who didn’t go to the polls was larger than the percentage that supported the winning party.

You can see the distorting effect when you compare the poll results to the actual vote result as a share of eligible voters. All the polls cited below were conducted between September 30 and October 3.

 

        Vote   CRA   ENV   MQO   NTV
PC       32    44    38    44    35
ND       14    18    22    27    15
LIB      11    12    09    11    08
DNV      42    26    30    18    42

(DNV:  did not vote)

The distortion can then lead people to draw erroneous conclusions.  Take, for argument’s sake, the Telegram editorial on October 17:

Cheers: to more fun with numbers. The provincial Liberal party has made much hay with claims that by landing six seats and staying as the provincial opposition, it was somehow proving pre-election critics wrong. (The claims, of course, ignore the fact the party had the lowest share of the popular vote in its history.)

Numbers can be lots of fun, if you understand what they mean.  Election results in a first-past-the-post system depend very much on which party can get its voters to the polls in each district.  Even with overwhelming voter support – according to the distorted presentation of poll results that discounts UND/will not vote – the Tories should have easily swept every district.

But they didn’t.

They lost seats.

The Tories lost seats in their heartland of St. John’s and came close to losing a bunch of others.

When voter turn-out drops, as in the election just finished, the actual share of eligible vote becomes more important.  The Telegram editorial ignores the fact that turn-out in the most recent election hit a historic low.  If voter choices had actually looked like the numbers their pollster reported, the whole election would have turned out differently!

But  - for some reason - their pollster missed a huge chunk of public opinion.  He wasn’t alone.  Only NTV/Telelink hit the number, even if it wound up being mislabelled.

In the end:

a poll released near to an election with a relatively high number of undecided voters is an indication that the questionnaire was not designed properly, and/or that the screening of voters was not conducted with enough rigor. Well-designed screening questions and well-written “who will you vote for questions” should, as a natural byproduct, produce lower undecideds in a final pre-election poll, all other things being equal. The solution is not, as some have recommended, for the pollster to make up numbers on election eve for the purpose of eliminating the undecideds, but rather to craft the survey instrument in such a way that it naturally results in fewer and fewer undecideds as the election draws near.

- srbp -

 

  • Monday:  “Politics, Polls and the News Media”

  • Tuesday:  “PPM:  The Polls in the 2011 Election”

  • Wednesday:  “PPM:  The Polls and the Local Media”

  • Thursday:  “PPM:  Controversy, Accountability and Disclosure”

PPM: The Polls in the 2011 Election #nlpoli

Polling firms released more election polls in the 2011 provincial general election than in any recent provincial election.

Corporate Research Associates, Environics, MQO (MarketQuest Omnifacts) and Telelink produced a total of five polls.  MQO issued two polls 10 days apart.  Telelink and CRA produced polls for NTV and the Telegram, respectively, and Environics issued a single poll.

The Polls and What They Reported

News stories reported what the polling organizations reported to them.

MQO issued its first poll on September 20, one day after the campaign formally started.

They reported the responses to questions on party choice (“if an election were held today”), leader choice, government satisfaction and top issue.  The news release gave the results for party choice and leader choice as percentages of decideds, but with the percentage undecided – described as “noncommittal” in the leader choice paragraph – included.

The poll was based on a sample of 413 Newfoundlanders and Labradorians, with a margin of error of +/- 4.9%.  It was conducted between Friday, September 16 and Sunday September 18, 2011.

Aside from the news release, MQO apparently did not release anything else about the poll, including information on how the sample was chosen, any weighting, whether the poll was done alone or as part of a larger survey, or details of question wording and order.

MQO issued its second poll release on September 30.

The release reported on questions on party choice, leaders’ debate,  government satisfaction and major issue. As with the earlier release, the results for the party choice and leader questions gave results as a percentage of decided respondents and included the percentage undecided.

The poll was based on a sample of 464 residents of Newfoundland and Labrador, with a margin of error of +/- 4.6 per cent. The research was conducted via phone and online between Wednesday, September 28 and Friday, September 30, 2011, using MQO’s research panel iView Atlantic. The sample was weighted regionally to ensure proper representation of the province as a whole.

The release did not include information on the research panel or how MQO combined the panel sample with the telephone.  The release only indicated the sample was weighted regionally but gave no indication of what standard was used to determine the appropriate weighting.

The release did not give any information on question wording, order or other information related to how the poll was conducted. The two releases do not indicate if MQO conducted both polls using the same methodology such that the results could be compared.

This release included two graphics showing some of the responses for the leader choice and party choice questions.

On October 3, NTV reported a poll it commissioned from Telelink. NTV reported on party choice and leader choice.  Telelink conducted the poll on October 1 and 2, with a sample comprising 511 residents of the province. Reported margin of error was plus or minus 4.3 percentage points.

NTV reported the results for the party choice question (how the respondent intended to vote) as a percentage of all respondents, including don’t know/no answer.

Telelink probed the undecided/refused and produced a second set of party choice numbers combining decided plus leaning.

On October 4,  NTV reported on a question on Muskrat Falls following up on an earlier Telelink poll it had commissioned in February.

Environics issued a poll result on October 5 through Canadian Press.

Thirty-eight per cent of respondents backed the incumbent Progressive Conservatives, compared to 23 per cent for the NDP and nine per cent for the Liberals.

Thirty per cent were undecided.

The online poll was conducted by Environics Research Group and provided exclusively to The Canadian Press.

Unlike traditional telephone polling, in which respondents are randomly selected, the Environics survey was conducted online from Sept. 29 to Oct. 4 among 708 people.

The respondents were chosen from a larger pool of people who were recruited and compensated for participating.

The non-random nature of online polling makes it impossible to determine the statistical accuracy of how the poll reflects the opinions of the general population.

Neither Environics nor Canadian Press released any other information on the poll.

The Telegram commissioned Corporate Research Associates to conduct a poll exclusively for the newspaper. 

The Telegram will roll out results of the wide-ranging poll — which has a large sample size of 800 — in the coming days.

The poll was conducted between Sept. 29 and Oct. 3 and has a margin of error of plus or minus 3.5 percentage points with a confidence level of 95 per cent.

The Telegram reported results of the poll from October 6 to October 8 for questions on major issue, party choice, second choice, leader choice and government satisfaction. 

The October 6 edition reported the party choice question as percentages of all respondents.  The Telegram reported on regional results for some questions but did not indicate separate margins of error for the regional results. CRA poll reports obtained from the provincial government under access to information laws typically do not show margins of error for these sub-sample breakouts.

The Telegram did not release any other details on the poll including specific wording of questions, sequencing or weighting. 

What the Party Choice Question Measured

The party choice question is the one question asked by pollsters for which a genuinely objective confirmation exists.

The problem comes, however, in determining what the pollsters intended to measure when they posed the party choice question.  There is what may be called a standard question – “if an election were held tomorrow…” – but at least one of the polls apparently asked a non-standard question along the lines of “who will you vote for next week?”

None of the news releases or news stories indicated what the poll results for the party choice question would show.  That is, one can read the releases or news stories and not see clearly that the numbers were intended to show a party share of vote on election day.

Some pollsters, such as CRA, report results that discard all undecided responses and treat the specific party choices as if they were 100% of the responses.

The Telegram story on October 6 included figures reported that way at the end of the front page story:

Among decided voters across all regions, 59 per cent of them said they would vote PC,  25 per cent NDP and 16 per cent Liberal.

It also included the results of a probing question put to the undecided/refused respondents (26-27% of the sample):

Among the undecideds or those who refused to state their preference, 26 per cent are leaning towards the PCs, while 21 per cent are leaning towards the NDP and 14 towards the Liberals.

Some 38 per cent said they don’t know.

The Telegram did not report its own results for decided + leaners.

Based strictly on the limited information provided in the news stories and/or news releases during the campaign, it is impossible to say for certain what the pollsters intended to measure.  As such it becomes very difficult to compare the polls – as reported – for accuracy and consistency.

Take, for example, the CRA report for the Telegram.  According to the newspaper account, there are three separate potential sets of figures in response to the single party choice question.  There are the raw percentages reported on the front page of the Thursday edition (PC = 44%, for example).  Then there is the decideds-only figure reported in the third-last paragraph of the story (PC = 59%).

And then there would be the possible decided-plus-leaning response.  To figure this one out, you’d have to do some math to calculate what 26% of 26% is.  That’s the number of leaners within the undecided/don’t know/refused group from the first set of percentages.  Do the math, though, and you’d get 44% + 7% = 51%.
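In code, that back-of-the-envelope calculation looks like this, using the rounded figures from the Telegram story (the exact result would depend on unrounded data CRA did not release):

```python
raw_pc = 44          # PC share of all respondents (front-page figure, per cent)
undecided = 26       # undecided/refused share of all respondents (per cent)
leaning_pc = 0.26    # share of the undecideds leaning PC (probing question)

pc_leaners = undecided * leaning_pc            # 26% of 26%: about 6.8 points
decided_plus_leaning = raw_pc + pc_leaners

print(round(pc_leaners, 1), round(decided_plus_leaning))   # 6.8 51
```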

But what are these numbers supposed to represent? 

There’s the rub.

Based on information SRBP obtained several years ago, it appears that none of the polling firms screens respondents to exclude non-voters. They do not identify voters, specifically.  Pollsters in Newfoundland and Labrador simply poll a sample of those eligible to vote. 

That means that the correct comparison for their polling numbers is not the share of people who show up and vote on polling day but a comparison with the entire population of eligible voters.

You can see this in the CRA question reported by the Telegram on October 6.  The initial question asked respondents which party the respondent was most likely to vote for. If CRA had screened out non-voters, none of the replies should have been “I won’t vote”.  But, in fact, 3% indicated they did not plan to vote.

The same is basically true of all the poll results, to one degree or another.  They measure party choice by eligible voters.  In the process, they capture  - or are supposed to capture -  people who are genuinely undecided, people who claim to be undecided and those who will not vote.

All are important in presenting a clear picture of public opinion as it actually was at the time the poll was taken.

Anything else is a distortion.

- srbp -

17 October 2011

When is “nutbar” an unacceptable term? #nlpoli

CBC’s ombudsman has ruled on complaints about Kevin O’Leary’s on-air use of the word “nutbar”.  According to the Globe and Mail:

The watchdog says hundreds of complaints were filed after Mr. O’Leary called [Chris Hedges,] the Pulitzer Prize-winning journalist “a nutbar” during CBC News Network’s The Lang & O’Leary Exchange on Oct. 6. The remark came during a seven-minute segment about the Occupy Wall Street protests unfolding in the United States.

“There is room at the inn for a range of views, but there is no room for name-calling a guest,” CBC ombudsman Kirk LaPointe writes in a decision dated Oct. 13.

Interesting.

In November 2010, O’Leary had a few choice words about the Old Man shortly after he announced he’d be skedaddling from provincial politics.

“Nutbar Factor 6” is how O’Leary described Williams’ economic policies, which included seizing private property using the power of the provincial legislature.

Maybe CBC didn’t get complaints about that one.

- srbp -

A study in fiscal responsibility contrasts #nlpoli

Alison Redford, Alberta’s new Conservative premier, promises she will deliver some long overdue attention to the provincial government’s heritage fund.

That’s a stash of oil and gas cash created in 1976 as a kind of rainy day fund.

Previous Conservative governments took cash out, stopped putting cash in and otherwise neglected a fund that could be worth $100 billion today according to former premier Peter Lougheed, the guy who created the fund.

Meanwhile, Newfoundland and Labrador’s Conservative premier is promising “fiscal responsibility”. 

Unfortunately, Kathy Dunderdale and her colleagues have been promising that since 2003 and they still haven’t delivered.

What’s worse,  they have explicitly rejected every responsible fiscal idea that’s been tossed their way.

- srbp -

The NDP Rise #nlpoli

On Sunday, the always provocative labradore dissected the NDP performance in the last general election.

The NDP gains in votes, and seats, came almost entirely at the expense of the Tories. The Liberals stubbornly refused to believe their own obituaries.

According to the script, this wasn't supposed to happen.
And it's not just a matter of raw vote- and seat-counts. Straits and White Bay North released itself from the Liberal clutches it fell into in the late by-election, but did so without rushing back to the Tory fold. Clyde Jackman saw his political career flash before his eyes, and the NDP was strong enough to be competitive in several of the St. John's area seats it didn't win. Apart from raw margins, another indicium of the potential winnability or convertability of a district is whether the second-place party is strong enough to win polls within the district. The NDP did so in at least four Tory-held St. John's seats Tuesday night, and enough of them that it was leading in suburban Cape St. Francis in the first hour of the count.

Whence the source of the nervous-nelliness.

Labradore’s colourful charts don’t convey the full sense of the shock on the ground. 

Your humble e-scribbler has had way too many direct and indirect accounts over the past week of good Tories who watched the telly gobsmacked on Tuesday last as the NDP ate Shawn and Ed and Bob.

They never saw it coming.

The shock is profound.

They really didn’t see the NDP second-place strength in the other metro seats; otherwise they’d be flinging themselves out the nearby windows.

The Tories will have a very hard time dealing with the NDP insurgency.  The NDP policies are the same ones the Tories have been pushing since 2003.

Since the NDP voters are apparently former Tory voters – for the most part – the old Tory scare tactic of NDP financial irresponsibility just won’t find any purchase with those who will likely be looking to change votes and parties next time out.

- srbp -

Politics, polls and news media #nlpoli

For the past 65 years,  public opinion polls have been an integral feature of news media reports on politics and elections.

The reasons are pretty simple to understand.  Most public opinion polls are conducted by professional firms using scientific methods.  As such, they are considered to be inherently impartial, accurate and fair representations of what the public thinks about candidates and parties. 

The firms that poll during an election are usually independent of the political parties.  This gives the news media a source of independent information about the campaign.  Polls, especially ones exclusive to the news organization, can give the media outlets a direction for coverage.

When news media commission polls, they also gain a marketing boost.  Don’t discount the business imperative in news.  Tom Rosenstiel is executive director of the Pew Research Center’s Project for Excellence in Journalism.  Rosenstiel began a 2005 article on political polling and news media by recounting a meeting at the Los Angeles Times in 1991 to plan coverage of the 1992 election.

“Polls are a form of marketing for news organizations,”  Rosenstiel wrote. “Every time a Los Angeles Times poll is referred to in other media, the paper is getting invaluable positive marketing for its name and credibility.”

Presenting information in an entertaining way has always been a part of news. Poll results typically come in a form that lends itself to a horse-race story format.  That injects some energy into what might otherwise be a dull story of numbers.

Reporters usually have an easy time summing up a poll report.  That’s an increasingly important factor in newsrooms operating on tight budgets and facing heavy demands for content.

Rosenstiel marked that pressure in 2005 as a key feature of modern newsrooms.  But in truth, the need to produce news stories quickly has been a feature of news media for a long time, especially electronic media. Political scientist Everett Carll Ladd wrote in 1980:

For the most part, the press… must work quickly to do its mandated job.  This observation obviously applies somewhat less to magazines than to the daily newspaper or the nightly television news broadcast, but it holds generally. The story must be promptly brought to the audience.

What’s changed more recently is the increased demand for content as smaller numbers of news organizations produce material for print, radio, television and the Internet, sometimes from the same newsroom.  Often this is simply the repackaging of material, as Rosenstiel noted.  And that makes apparently simple stuff – like reporting a horse-race poll – that much more attractive.  If the news organization commissions a poll of its own and delves into more than just the “who’s on first” question, it can generate new content for days.

Controversy

None of the media’s use of polls has come without controversy.

In the run-up to the spring federal general election, the seemingly wide variation in poll results generated news stories about the reliability of polling.

At a conference on the May federal election, people representing eight polling firms debated the impact of polls on the election.  Opinions varied, as they do, on what impact poll reporting had on the public.  According to a Canadian Press story, Frank Graves of Ekos Research said that post-election polling found that Canadians didn’t believe poll reporting affected the outcome of the election.

Environics' Kevin Neuman was doubtful.

"People may say that (polls) don't influence, but it would influence the media and how the media cover the story and frame the story," he said, adding that the CROP poll "may have completely changed the media coverage."

In the recent Ontario general election, some pollsters complained about the publication of polls from different sources, often without any apparent concern for their accuracy.

“We are distorting our democracy, confusing voters and destroying what should be a source of truth in election campaigns — the unbiased, truly scientific public-opinion polls,” wrote Darrell Bricker and John Wright of Ipsos Reid.

Bricker said most research firms are accurate. But some are “so ridiculously inaccurate” he wonders how they got into the business. And elections bring out the carpetbaggers or those trying out untested, and dubious, methodology.

Still, the biggest question for him is not research firms. “I have to ask the question, what are the media thinking?”

Closer to home, Corporate Research Associates’ Don Mills complained in the Telegram on Saturday about the accuracy of some polling released during the recent provincial election campaign. MQO released two polls during the campaign that relied on a combination of telephone polling plus online surveys:

“There’s a lot of people who say online research is just as good as telephone research. That has not been proven to be true and we have recent examples in Atlantic Canada where a competitor of ours has used an online methodology and have not got it within the margin of error they quoted,” he said.

“They are not even supposed to quote margin of error in online polls.”

Industry critics

Not all pollsters are as enthusiastic about the proliferation of polls and the increasingly close relationship between the media and opinion research firms.

In April, Allan Gregg – perhaps the country’s most famous researcher – and Frank Graves of Ekos spoke out in an article by Canadian Press.

There’s broad consensus among pollsters that proliferating political polls suffer from a combination of methodological problems, commercial pressures and an unhealthy relationship with the media.

Start with the methodological morass.

“The dirty little secret of the polling business . . . is that our ability to yield results accurately from samples that reflect the total population has probably never been worse in the 30 to 35 years that the discipline has been active in Canada,” says veteran pollster Allan Gregg, chairman of Harris-Decima which provides political polling for The Canadian Press.

The increased use of cell phones and changing lifestyles have made traditional telephone surveys less reliable, according to Gregg.  Online polling may produce more reliable results in some instances but not in others.

Still, according to Gregg, polling firms are producing margin of error calculations “as if we’re generating perfect samples and we are not anymore.” 

Pollsters continue to generate horse race polls for their marketing value, according to both Gregg and Andre Turcotte, a pollster and communications professor quoted in Joan Bryden’s Canadian Press story from April.

Turcotte says political polls for the media are “not research anymore” so much as marketing and promotional tools. Because they’re not paid, pollsters don’t put much care into the quality of the product, often throwing a couple of questions about party preference into the middle of an omnibus survey on other subjects which could taint results.

And there’s no way to hold pollsters accountable for producing shoddy results since, until there’s an actual election, there’s no way to gauge their accuracy.

Not surprisingly, the association representing polling firms disagrees.  The Market Research and Intelligence Association (MRIA) took out a full-page ad in newspapers across Canada when the polling controversy first sprang up in February.  The ad affirmed the association’s “confidence in the results of our polling and the value that we provide to Canadians.”

Politics, polls and the media

The 2011 provincial general election in Newfoundland and Labrador brought with it both an unprecedented number of horse race polls and a certain level of controversy.

In the second part of this series – on Tuesday -  we’ll take a look at the polls, the polling firms, what they reported, and what the polls measured.

- srbp -

The Series:

  • Monday:  “Politics, Polls and the News Media”

  • Tuesday:  “PPM:  The Polls in the 2011 Election”

  • Wednesday:  “PPM:  The Polls and the Local Media”

  • Thursday:  “PPM:  Controversy, Accountability and Disclosure”


16 October 2011

Cheap power for Nova Scotians #nlpoli

No, this is not about the official policy of both the Conservative and New Democratic parties in Newfoundland and Labrador, although both parties have that as their major energy goal in the near future.

Rather, it’s a report in the Chronicle Herald this weekend that a coalition of environmental groups in Nova Scotia wants Nova Scotia Power to buy cheap power from Hydro-Quebec so the province can stop burning coal sooner rather than later.

"Quebec power is immediately available and the Newfoundland and Labrador project is an idea," Neal Livingston with Black River Hydro Ltd. said Friday, referring to the Muskrat Falls megaproject.

Because Quebec hydro is readily available, it’s "an opportunity for Nova Scotia to get off coal burning quickly," he said.

The story quotes independent energy analyst Tom Adams.  Regular readers will recognise his name.  Adams said that HQ’s average revenue from exports to the US this past summer was 6.5 cents per kilowatt hour.  The average American wholesale price was 4.9 cents a kilowatt hour.

Electricity is cheap, in other words.  Hydro-Quebec has more than enough of it and – as the story notes – HQ could wheel the power to Nova Scotia through New Brunswick.

The Nova Scotians also raised the need for an independent review in Nova Scotia of the Muskrat Falls project:

Brendan Vogel with the Ecology Action Centre echoed her concerns, saying the province needs an independent analysis of the Lower Churchill Falls development, which includes the Muskrat Falls project.

"We need an arms-length analysis that thoroughly scrutinizes the benefits of the Churchill project for ratepayers," he said. "The cost-effectiveness of this project needs to be weighed against the other options on the table."

- srbp -

Whither the Liberals #nlpoli

[revised and edited 4:45 PM]

The tale is not told in the view of columnists  - Stephen Maher, Chantal Hebert and Susan Delacourt - who try to link a series of different events into one explanation.

The tale is told in the comment of one long-time Liberal who bumped into another in St. John’s recently.

The Liberal Party doesn’t speak to me any more, said one.

Exactly, exclaimed the other.

The Liberal Party may have won six seats in last Tuesday’s general election but it stands at an historic low.  Only 11% of the electorate in Newfoundland and Labrador voted Liberal on Tuesday.

Voters in Newfoundland and Labrador looking for something other than the ruling Conservatives opted for the New Democrats last Tuesday and they did so in record numbers.

They did it in St. John’s for the most part but also in Burin-Placentia West,  Labrador West and The Straits-White Bay North. 

While the New Democrat resurgence is a subject for another day, the key thing for this post is which party voters chose last Tuesday and it was not the party that dominated politics in this province for so much of the post-Confederation period.

The reason is simple:  the Liberal Party does not speak to them any more.

A decade or so ago, the dominant voices in the party shifted it to an increasingly rural focus.  The Blame Canada commission with its pile of old axes reground was symptomatic of the shift.  So too was the resurgence of make-work as a core government policy for rural parts of the province and the transfer of government offices to major centres outside Capital City.

In this most recent election, ruralism took centre stage in the party’s platform.  And the leader the party executive overwhelmingly chose was not just committed to the ruralist agenda: he started out the election by loudly proclaiming his fierce “nationalist” sentiment.

Some may blame the Liberal fortunes on the last-minute change of leadership.  Others will focus on the impact of what appeared to be the most ineptly run campaign in provincial political history. 

Both had their part to play, but both the campaign and the focus were already in train before the executive board picked Kevin Aylward.  And, if anything, Aylward did not apparently want to shift the dominant internal party trends so much as reinforce them.

Aylward is scarcely any different from Yvonne Jones, who fixated on the idea that building a Stunnel to Labrador was the winning party policy.  Party insiders fought to keep it out of her convention speech, and her Facebook posting during the campaign was nothing more than a last-ditch effort to push the stunnedest of stunned ideas.

Beyond the ruralist core, the Liberal Party simply does not know what it stands for. 

In the last election, the party became nothing more than a political sideshow.  There were plenty of contortionists: cast-offs from other parties abounded.  There was a star of the open line shows.  A perennial favourite of the political fringes stage-mothered a couple of her current charges through their political appearances on the ballot rather than run herself.  A few students came along for good measure, as did staffers hounded relentlessly until they agreed to be names on ballots at their own expense.

The only thing missing was the sword swallower.

The Liberal party does not speak to anyone, anymore.

The people running the party seem to have no desire to speak to anyone other than themselves out there on the tattered edges of the provincial political landscape.

They are so far out in the political woods, they’d have to come in to hunt.

What’s worse, though, is that they seem to have lost the desire to hunt.

You can see that in the party after the election.

The leader disappeared.

The party president popped up to do a couple of interviews about the latest leadership crisis.

But while political life carried on, and issues and targets abounded, the party fell completely silent.  Shameful comments by the Premier about the legislature went unchallenged by Liberals. 

They said nothing about anything that truly mattered in the province and in the stuff that mattered only to the people involved in the party, they said little.

The Liberal party no longer speaks to the people of the province.

And, as it seems, the party doesn’t even speak to itself any more either.  Maybe the few of them still out in the political woods need to take heed of that. 

The rest of us [in the province] already have.

- srbp -

15 October 2011

An excess of chutzpah: pollster attacks colleagues over methods, accuracy #nlpoli

Corporate Research Associates president Don Mills is criticising his professional colleagues for their use of online surveys to conduct opinion polling.

CRA uses telephone surveys.  In its two election polls released in September, MQO reportedly used a combination of telephone and online surveys to prepare its results.  Environics used an online method and Telelink used telephone surveys.

According to the Telegram:

An in-depth poll by CRA conducted for The Telegram came closest to election night results, Mills said.

The Telegram noted:

CRA, using a telephone poll based on a sample size of 800, predicted 59.5 per cent support for the PCs among decided voters, 24.7 per cent for the NDP and 15.8 per cent for the Liberals.

The actual election results were 56.1 per cent for the PCs; 24.6 for the NDP; and 19.1 per cent for the Liberals — a total difference of 6.8 per cent from the poll prediction.

The only problem is that claim isn’t true.

Like all of the firms that released opinion polls during the campaign, Mills and CRA polled eligible voters.  They did not report screening for likely voters, nor did they indicate any method for determining whether the opinions they surveyed came only from people who would actually vote.

By surveying all eligible voters, Mills and CRA should have reported all their responses, including those who indicated they would not vote or had no opinion.

That’s what the Telegram did in its front-page story on Thursday.  The numbers cited in the Telegram on Saturday disregard some responses and therefore present a distorted and misleading impression of what CRA’s polling found.

Here’s what the Telly reported compared to the actual reported vote result on Tuesday as a share of eligible vote:

                      Telegram          Actual Vote   CRA Apparent
                      Sept 30 - Oct 3   Oct 11        Error*

PC                    44                32            + 12
LIB                   12                11            + 1
NDP                   18                14            + 4
UND/Will not vote     26                42            - 16

Note:  The figures do not add to 100% everywhere due to an apparent minor rounding or typographical error in the results as reported by the Telegram.  SRBP adjusted the UND by one percentage point from what the Telegram reported.  When SRBP contacted the Telegram for more information on the poll, the newspaper management refused to discuss the results at all beyond what was in the published stories.

Even allowing for that one percentage point, the published CRA results are significantly different from the actual result.
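A small sketch makes the two comparisons concrete.  The first block reproduces the “6.8” figure from the Telegram story; the second reproduces the apparent errors in the table above.  Only the arithmetic is new here – the figures are the ones already quoted:

```python
# Decided-only comparison the Telegram and CRA used (per cent of decided voters
# versus per cent of votes cast)
cra_decided = {"PC": 59.5, "NDP": 24.7, "Liberal": 15.8}
actual_cast = {"PC": 56.1, "NDP": 24.6, "Liberal": 19.1}

total_diff = sum(abs(cra_decided[p] - actual_cast[p]) for p in cra_decided)
print(round(total_diff, 1))   # 6.8 -- the "total difference" the Telegram reported

# Comparison against all eligible voters (per cent of the electorate)
cra_raw     = {"PC": 44, "LIB": 12, "NDP": 18, "UND/will not vote": 26}
actual_vote = {"PC": 32, "LIB": 11, "NDP": 14, "UND/will not vote": 42}

for group in cra_raw:
    print(f"{group}: {cra_raw[group] - actual_vote[group]:+d}")
# PC: +12, LIB: +1, NDP: +4, UND/will not vote: -16 -- the apparent errors above
```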

SRBP compared most of the polls in a pre-election post.

Compared to CRA, MQO** was off by about the same proportions using its hybrid method. CRA was off by the same country mile in 2007.

Environics was closer to the final actual result than either of those two.

Of all surveys released during the campaign, Telelink came closest to the actual result, just as they did in 2007.

SRBP will have more on the polls in the recent general election in a series starting on Monday.

We’ll look at:

  • the polls themselves, what they reported and how they reported it,
  • compare the poll findings with the actual results,
  • tackle the comments by Liberal leader Kevin Aylward,
  • look at poll reporting standards in the news media and in the polling industry, and
  • look at the way the local media used polls in the past two elections.

- srbp -

Related: 

  • “Comparing polls,” by Horizon Research, a New Zealand opinion research firm that uses online polling.  Horizon questions the validity of discounting upwards of 30% of responses when reporting survey results.
  • “Two wrongs and you get a news story” discusses the way CRA’s reporting of decideds produces a misleading impression – in this case from 2009 – of an increase in support for one party when it actually declined.
  • CRA has been known to engage in controversial practices, like releasing a poll just before a by-election vote, significantly ahead of its usual schedule for releasing its omnibus for that quarter.

*  Apparent error refers to the discrepancy between the CRA poll result reported by the Telegram on the front page of its Thursday edition compared to the actual vote result. 

All polls contain error. Researchers strive to reduce known and possible sources of error. 

** Edit to correct the comparison.

Election Week Traffic #nlpoli

The ballots are in and counted.

People are still trying to grasp the enormity of what happened.

Next week, SRBP will have a series on polls during the election.   We’ll look at the polls themselves, compare their findings with the actual results, tackle the comments by Liberal leader Kevin Aylward, and look at the way the local media used polls.

If that isn’t enough for the election junkies out there, we’ll also turn attention to some of the issues connected to the historic low turn-out in last week’s general election.

We’ll take a closer look at the NDP and Liberal campaigns and the seats they won plus there’ll be the usual collection of comments about and observations on public life in the province.

There’s always lots to chew over at SRBP.

In the meantime, to keep you going, here are the top 10 stories SRBP readers were poring over last week:

  1. Here’s what an opposition party looks like
  2. Globe and Shitemail
  3. The Morning After the Night Before
  4. Williams set to offer comms director plum patronage job
  5. The way not to change
  6. She can’t handle the truth
  7. Whom the gods destroy
  8. Motivation and demotivation
  9. A house divided
  10. What does sanction really mean?

- srbp -

14 October 2011

She can’t handle the truth #nlpoli

Premier Kathy Dunderdale is like her benefactor, Danny Williams.  Neither liked the province’s legislature, where they could be held to public account for their actions.

So they have treated the legislature  - and by extension the people of Newfoundland and Labrador  - with contempt.

Dunderdale told CBC:

Most of my issues are around the quality of debate and the research and the fact that you can pretty well get up in the house of assembly and say whatever it is you like. You don't have to be concerned with truth.

Kathy Dunderdale should stop projecting her own behaviour on others.

As noted in this corner before, Kathy Dunderdale has a problem.

Kathy Dunderdale says things that are not true.

She says things that are at odds with the facts.

She says things in a way that suggests she does not understand the issue or the explanation she is trying to give.

Kathy Dunderdale does not look like she knows what she is talking about, sometimes.

She gets caught out on these occasions and it must be embarrassing for her.

However, the failing is entirely hers.

Kathy Dunderdale does not like the legislature because it shows up her numerous shortcomings.

 

- srbp -

13 October 2011

What does sanction really mean?

Here’s what Kathy Dunderdale told CBC’s David Cochrane about Muskrat Falls:

"We're looking at sanction, at the earliest at this point in time, would be in the spring and the house will be in session before we sanction Muskrat Falls," said Dunderdale.

That’s an interesting timeline.

It’s way beyond when it was supposed to happen, as labradore pointed out on Wednesday. The whole thing was supposed to be sanctioned in 2009 and up and running by 2015.

Now project sanction won’t happen until the second quarter of 2012, at the earliest.

So if you’ve been following this along, the Lower Churchill is costing millions with tons of design and engineering work started.  The thing is rolling along through review after study and yet no one has approved the project yet.

That’s what sanction is, right?

Approval to go ahead and do something.

Bit late by next spring, it would seem.

If nothing else, Dunderdale is supposed to have an agreement with Emera no later than November 30 or the term sheet Danny signed as he ran from the Premier’s Office last fall goes up in a puff of pixie dust.

That would mean the project should be ready to launch.

And by that time, Nalcor would pretty much have the approval to launch or they’d be so far along in the process stopping wouldn't really be much of an option.

So what would the “debate” in the House mean?

Not very much at all.

In reality, Muskrat Falls is already “sanctioned” in all but name.  All that’s left to come are the huge bills and the monstrous political fallout.

- srbp -

The fine art of cabinet making #nlpoli

One of these people will replace Shawn Skinner as Capital City’s man in cabinet:

  • Tom Osborne
  • John Dinn
  • Dan Crummell

- srbp -

Some Day the Sun Will Shine: Oil and the End of Newfoundland History #nlpoli

Some Day the Sun Will Shine: Oil and the End of Newfoundland History
Time: 8 p.m.-10 p.m.
Location: Hampton Hall, Marine Institute, Ridge Road
Description: The Newfoundland Historical Society will hold a free public lecture. This month's lecturer will be Dr. Jerry Bannister. Refreshments to follow. Parking is free and everyone is welcome to attend!

Sponsor: Newfoundland Historical Society

Originally from Newfoundland and Labrador, Bannister is associate professor of history at Dalhousie University.  
- srbp -