
There has been a bit of debate on Twitter about this post at The Political Scientist, which argues that National’s rise in the polls is merely the result of Labour voters switching to “undecided” and reducing the denominator.  There were also a few requests for an explanation, and Thomas at StatsChat has obliged here (also, I stole his title).  I have a few points I want to add or expand on.

Firstly, as an aside, many of the mistakes are the same as those made in this post on Kiwiblog, which argues, based on correlation coefficients, that in 2011 it was mainly National voters who stayed home, not Labour voters.

Secondly, we don’t actually know the number of undecided voters.  As pointed out in the comments on the StatsChat rebuttal, many of the raw numbers are weighted by demographics, probability of voting, and other factors (whether or not respondents also have a cellphone, perhaps).

Thirdly, the results for the correlation coefficients are very susceptible to the number of polls.  On first read this particular table from The Political Scientist stood out:

Correlation coefficients, from The Political Scientist.

The table shows the correlation coefficients with the undecided vote for four parties for all nine Fairfax polls from July 2012 to June 2014 (top), and for only eight polls, with the June 2014 poll results excluded (bottom).  You can see that the correlation coefficient for National changes from 0.7 to 0.3 with the addition of a single poll!  Obviously the results aren’t particularly robust, and that is equally true for the other three parties as well, even if they just happened to show smaller changes in the table above.

Taking this a step further, it should be reasonably obvious that you can’t trust estimates of correlation coefficients based on a small number of data points.  When you have only two data points to work with you must get a correlation coefficient of ±1 even if there is no actual correlation between the things you are measuring, because it is always possible to draw a straight line through any two given points (or, rephrasing, two points define a straight line).  Adding more data points will move your estimate of the correlation coefficient closer to the true value, but with a small number of polls you can never be very confident.
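The two-point case is easy to check numerically.  Here is a minimal sketch (standard library only, with made-up random data): any two points give |r| = 1 exactly, and with eight or nine independent-noise “polls” the estimated coefficient still swings wildly even though the true correlation is zero.

```python
import random
import statistics

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(1)

# Any two distinct points lie on a straight line, so |r| is exactly 1
# (up to floating-point rounding) even for pure noise.
print(abs(pearson_r([random.random(), random.random()],
                    [random.random(), random.random()])))

# With eight or nine "polls" of independent noise, the true correlation
# is zero, but the estimate still varies a lot from sample to sample.
for n in (8, 9):
    xs = [random.random() for _ in range(n)]
    ys = [random.random() for _ in range(n)]
    print(n, round(pearson_r(xs, ys), 2))
```

(Python 3.10+ has `statistics.correlation` built in; the hand-rolled version above is just to keep the arithmetic visible.)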

As another aside, always be suspicious when you see results quoted to a large number of significant figures.  There’s nothing wrong with it in principle, but it raises the question of how accurate they really are.  In this particular case, if the addition of a single poll moves National’s coefficient from 0.7 to 0.3, then there’s no point quoting more than one decimal place, if that.

Fourthly, there seems to be confusion between different coefficients.

Thomas covers this point, the difference between correlation coefficients and regression coefficients, in paragraphs 2-3.

More intuitively, though, the correlation coefficient shown in the table above between NZ First and undecided voters (0.8) is almost the same as Labour’s.  Does the drop in NZ First support cause the increase in undecided voters?  In the last two years the number of respondents supporting NZ First fell from 32 to 24 (see the linked table below), while the number of undecided respondents went from about 110 to 230.  Would you argue that the 8 former supporters per 1000 lost by NZ First turned into 120 new undecided voters?  Of course not!

Poll results, and estimated number of respondents, from The Political Scientist.

You may argue that the real evidence is that the number of supporters lost by Labour is (roughly) equal to the increase in the number of respondents who are undecided, and that correlation coefficients have nothing to do with it.  And that’s fine.  But then why bother publishing the correlation coefficients at all?
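To see concretely that the correlation coefficient is blind to the size of the changes, here is a small sketch using made-up counts per 1000 respondents, loosely shaped like the figures discussed above: a series that falls by 8 correlates almost perfectly with one that rises by 120, and rescaling either series leaves r completely unchanged.

```python
import statistics

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Made-up counts per 1000 respondents over nine hypothetical polls.
nzf       = [32, 31, 30, 29, 28, 27, 26, 25, 24]           # falls by 8
undecided = [110, 118, 135, 150, 170, 185, 200, 215, 230]  # rises by 120

# Near-perfect (negative) correlation despite the very different magnitudes.
print(round(pearson_r(nzf, undecided), 2))

# Multiplying a series by a positive constant doesn't change r at all,
# so r carries no information about how big the changes are.
print(round(pearson_r([10 * n for n in nzf], undecided), 2))
```

That second property is exactly why a strong correlation says nothing about whether 8 lost supporters could account for 120 new undecideds.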

Fifthly, correlation does not imply causation (see also, xkcd).  When dealing with correlation effects you have to be very careful to avoid false causation.  Even assuming the changes aren’t just a statistical fluctuation, we still can’t say whether Labour voters are really becoming undecided.  As Thomas says:

You could fit the data just as well by saying that Labour voters have switched to National and National voters have switched to Undecided by the same amount — this produces the same counts, but has different political implications.

If you’re a frequentist, then Thomas’ alternative explanation is just as convincing.  If you’re a Bayesian, then now might be a good time to break out Occam’s Razor and say that you thought Labour voters were switching to undecided anyway, so you believe the first hypothesis.  Which is fine.  But in that case, was there any value in the analysis?

The only way to figure out what is really going on is to do a longitudinal study, where you use the same sample of voters for each poll.

Sixthly, in their conclusion The Political Scientist says:

Without taking into account the level of the undecided vote this presents a misleading picture of how many voting age New Zealanders support each party.

Of course, by limiting reporting only to those who have declared party support and are likely to vote the reporting may very well reflect what could happen in an election.

This is sort of hedging both ways.  If the point of the Fairfax poll is to gauge support and/or to try to predict the result of an election “if it were held today”, then the pollsters must do something with the undecided voters.  Unless you have evidence that they break differently from decided voters (which could be the case), it seems sensible to just ignore them when publishing the headline results.  It’s not “a misleading picture” at all.
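For what the two reporting choices actually do to the numbers, here is a sketch with a single invented poll of 1000 respondents (the counts and party labels are made up purely for illustration):

```python
# Made-up responses from a single hypothetical poll of 1000 people.
responses = {"National": 450, "Labour": 280, "Green": 50,
             "Other": 40, "Undecided": 180}

decided = {k: v for k, v in responses.items() if k != "Undecided"}
decided_total = sum(decided.values())     # 820
everyone_total = sum(responses.values())  # 1000

# Headline style: undecideds excluded, so decided shares sum to 100%.
headline = {k: round(100 * v / decided_total, 1) for k, v in decided.items()}

# "Voting age" style: shares of all respondents, undecideds included.
of_everyone = {k: round(100 * v / everyone_total, 1)
               for k, v in responses.items()}

print(headline)     # National at ~54.9% of decided respondents
print(of_everyone)  # but only 45.0% of all respondents
```

Both tables are computed from the same raw counts; the only question is which denominator answers the question the poll is trying to ask.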

Bonus: check out this excellent diagram from Wikipedia showing the difference between regression coefficients and correlation coefficients.  All the graphs in the middle row (except the centre one) have the same absolute correlation coefficient.

Correlation coefficients.



A busy week for political polls.  After the release of the Roy Morgan Research poll on Friday (already covered in the previous update on Sunday), there were two additional polls released on Sunday night by TV3 (Reid Research poll) and One News (Colmar Brunton poll).

The Colmar Brunton poll shows increases in support for National (54%, up 1%) as well as Labour (34%, up 3%) relative to the last Colmar Brunton poll published in late November 2009. The big loser appears to be the Green party (4.7%, down 2.3%). None of the changes are statistically significant on their own. The TV3 poll also shows no significant changes relative to the last TV3 poll published in mid December 2009.

It’s been a slow start to the year for NZ political polling, but now that we finally have a handful of polls it’s interesting to look at some of the trends.  Please see below the graphs for the analysis.

As usual, the two graphs below summarise the polling averages for the party vote after the new poll. The horizontal axes represent the date, starting 60 days before the 2005 NZ General Election, and finishing 60 days from the present. The solid lines with grey error bands show the moving averages of the party vote for each party, and circles show individual polls with the vertical lines representing the total errors.

Party vote support for the eight major and minor NZ political parties

Party vote support for the eight major and minor NZ political parties as determined by moving averages of political polls. Colours correspond to National (blue), Labour (red), Green Party (green), New Zealand First (black), Maori Party (pink), ACT (yellow), United Future (purple), and Progressive (light blue) respectively.

Party vote support for the six minor NZ political parties

Party vote support for the six minor NZ political parties as determined by moving averages of political polls. Colours correspond to Green Party (green), New Zealand First (black), Maori Party (pink), ACT (yellow), United Future (purple), and Progressive (light blue) respectively.

As always, please check the Graphs page for further simulation results.
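For readers curious how a poll-of-polls line like the ones above is built, here is a minimal sketch of a rolling average over a fixed window.  The real graphs presumably weight polls by date, sample size, and pollster, none of which is specified here, so the window length and the poll values below are purely illustrative assumptions.

```python
from datetime import date, timedelta

# Made-up illustrative polls: (publication date, party support in %).
polls = [
    (date(2010, 1, 10), 53.0),
    (date(2010, 1, 24), 55.0),
    (date(2010, 2, 14), 52.5),
    (date(2010, 2, 28), 54.0),
]

def rolling_average(polls, at, window_days=60):
    """Unweighted average of all polls in the `window_days` before `at`."""
    window = [v for d, v in polls
              if timedelta(0) <= at - d <= timedelta(days=window_days)]
    return sum(window) / len(window) if window else None

# All four polls fall inside the 60-day window ending 1 March 2010.
print(rolling_average(polls, date(2010, 3, 1)))  # 53.625
```

A weighted version (down-weighting older polls and smaller samples) changes the numbers but not the basic shape of the calculation.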

Since it’s been a while, I’ll discuss the results for each party below.

  • Green Party: In previous posts I’ve pointed out that “there appears to be a roughly 6% chance that the Green party will fail to clear the 5% party vote threshold and therefore get no seats in Parliament.” In fact, at one stage in the last month the simulation was predicting an 11% chance this would happen. In hindsight, this appears (as predicted in the relevant post) to be the result of statistical errors on the polling averages blowing up due to the small number of polls released early this year. With the release of three polls in the last week the Green Party’s polling averages have moved from 5.7% +- 0.5% in early February to 6.3% +- 0.3% currently. The small increase coupled with the tighter margin of error now means the Green Party are predicted to have a greater than 99.9% chance of getting seats in Parliament.
  • New Zealand First: I’m not sure if anybody picked up on it, but at one stage the New Zealand First party were predicted to have a roughly 1% chance of clearing the 5% party vote threshold and winning six seats in Parliament.  The reason for this appears to be the same as that given for the Green Party above: statistical errors on the polling averages blowing up due to the small number of polls.  After rerunning the simulation with the latest polls added to the calculations, I can confirm that at no stage were the New Zealand First party predicted to win any seats in Parliament.
  • National: A couple of recent polls have shown National in the low-50% range, a reasonable drop from their polling highs in the middle of 2009. At one stage I was almost ready to call time of death on the National Honeymoon (to the extent that the term “Honeymoon” has any meaning). The last two polls from One News and TV3 have bumped National back up to the mid-50% range, however. Current averages have the National party on 54.0% +- 1.1%; a significant, although small, decrease relative to the post-election high of 56.3% +- 0.9% recorded on the 10th of October 2009.
  • Labour: Labour’s moving average currently has them on 31.9% +- 0.8%; a small but significant rise compared to their post-election low of 28.9% +- 0.7% on the 23rd February 2009.
  • Others: The ACT Party and the Maori Party have been consistently polling around the 2% mark for the last six months. Current polling puts support at 1.7% +- 0.3%, and 2.4% +- 0.3% respectively. The Progressive Party and United Future NZ have been consistently polling around the 0.2% mark for the last six months. The relative errors on these averages for the Progressive Party and United Future NZ are the largest of any of the parties due to significant rounding errors in the polls; most pollsters only report results rounded to the nearest 0.5%. Both parties are polling below 0.6% at the 90% confidence level.
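As a rough cross-check on the threshold figures above: if the simulation treats a polling average of a ± b as a normally distributed quantity (an assumption on my part; the post doesn’t describe the simulation’s internals), the chance of a party falling below the 5% threshold needs nothing more than the error function.

```python
import math

def prob_below(threshold, mean, err):
    """P(support < threshold) for a normally distributed polling average."""
    return 0.5 * (1 + math.erf((threshold - mean) / (err * math.sqrt(2))))

# Early February: Green average 5.7% +- 0.5% -> roughly an 8% chance of
# missing the 5% threshold, in the same ballpark as the 6-11% quoted above.
print(round(prob_below(5.0, 5.7, 0.5), 3))

# Currently: 6.3% +- 0.3% -> well under a 0.1% chance of missing it,
# consistent with the quoted ">99.9% chance of getting seats".
print(prob_below(5.0, 6.3, 0.3))
```

This also shows why the earlier, wider error band mattered so much: halving the distance to the threshold in standard-error units moves the tail probability by an order of magnitude or more.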
