User talk:Teratix/Archives/2023/September


Opinion Polling for the 2023 Australian Referendum - Aggregates

Hey! I have also been really appreciative of your opinion poll plots over the last few weeks (months?). I noticed you've recently removed the plot which includes the undecideds. This is really important to include, as it reveals the underlying volatility of the vote over time and also suggests the way soft/uncertain voters are trending (e.g. your previous graphs revealed that although 'yes' was declining, the increase in the 'no' vote was largely from undecideds breaking to them, suggesting a much greater trend towards 'no' within the larger undecided population than the Y/N plot reveals).

A number of polling companies (notably Essential) which had originally only been asking yes/no forced choice questions have recently begun asking yes/no/undecided for exactly this reason. So I think it's really useful to retain going forward. Danielcstone (talk) 13:06, 25 July 2023 (UTC)

Hello @Danielcstone: thanks for the kind words. As I mentioned in my edit summary, I removed the plot with the undecideds because firms' methodologies for producing an "undecided" figure have been so diverse that it's unclear whether they're really comparable, and any apparent trends in the data may just be down to house effects.
For example, some pollsters include a "leaner" question – pressing people who say they're unsure with another question to get them to say which way they're inclined. Obviously, this will drive down the number of undecideds compared to polls which don't have leaners. When reporting undecideds, YouGov/Newspoll and (now) Essential use leaners, Resolve, Morgan, Newgate and The Australia Institute don't, JWS reports both, and Freshwater reports both but (confusingly) its leaner only as a binary.
So already we're going to have to split YouGov and Essential, two of the most prolific pollsters on this issue, from the rest of the group. Unfortunately, the dissimilarities don't end there. Morgan, Freshwater and The Australia Institute use a pretty standard yes/no/unsure question, but Resolve and Newgate use a Likert-like question, which also measures degrees of support: something like strong Y/weak Y/weak N/strong N/neutral. Particularly in Newgate's case, this has tended to produce more undecideds. For its non-leaner question, JWS uses a standard three-option format, but this includes a "need more information" option which has also tended to drive up the number of undecideds (though, looking again, there was a significant drop in June).
And if you're asking about how soft/uncertain voters are trending, that opens a whole other can of worms – apart from Resolve and Newgate, already mentioned, Newspoll's question used to ask for strength as well, Essential sometimes asked for strength after its binary, and Freshwater has apparently asked about strength, but for some reason its figures are only available second-hand...
So my point is it's difficult to settle on exactly what data ought to be included or excluded in the graph. If we are too inclusive, we end up comparing data which isn't really comparable and get meaningless results. If we are too exclusive, we end up only using data from a select few pollsters and become vulnerable to their quirks and house effects, plus we're running the risk of doing original research if we're making too many judgement calls.
However, you have said you've found it important and useful, and I don't think there's anywhere else on the internet where an aggregation with undecideds is available, so I will take another look and try to figure out if there's a way forward, and I'll put a notice up on the talk page to see if anyone else has ideas for a solution.
(For more information, Murray Goot has an excellent piece ("Undecided" on the Voice) in Inside Story). – Teratix 19:43, 25 July 2023 (UTC)