There has been much conversation and discussion in multifaith and interfaith circles, along with formal responses, about an Essential Report poll on Muslim immigration. Professor Andrew Markus of the Scanlon-Monash Index of Social Cohesion examines the poll, the various methods of polling, and the results different types of polling deliver. In light of his nation-building work on social cohesion, we reproduce (with permission) his examination of the poll.
An Essential Report poll finding that 49% of Australians want to ban Muslim immigration received extensive media coverage last week. In addition to general reporting, Essential’s executive director, Peter Lewis, wrote in The Guardian:
The result floored me.
Less surprised was commentator Ray Hadley in The Daily Telegraph:
The left-leaning café latte sippers were left scratching their heads this week when an Essential poll revealed …
Senior journalists, including some from Fairfax Media, and politicians took the findings at face value. Labor’s deputy leader, Tanya Plibersek, saw the survey as proof that:
We’re not doing a good enough job as national leaders to bring harmony and cohesion to our community.
Among the few to question the result was new Labor MP Anne Aly. She asked whether public opinion was really so adverse.
A second questioner was One Nation senator Pauline Hanson, who said the poll understated the degree of opposition:
I believe it’s a lot higher than that. Because people … have been in fear to answer the question … because they don’t know who’s taking the call.
Some aspects of the Essential findings are worthy of critical scrutiny. One relates to methodology.
There are two main approaches to surveying. One is a sampling of the population based on randomly generated telephone numbers. The other utilises an online panel of respondents who complete surveys out of interest and for reward.
Contrary to Hanson’s claims, no-one was “taking the call” in the Essential survey: it utilised an online panel.
Surveys employing online panels are much cheaper and quicker to run. They have a proven record on a number of issues, notably predicting election outcomes, as over a period of years they develop weighting formulas for their panel calibrated against election results. But there are no formulas of the same level of precision when surveys deal with social issues.
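The weighting described above can be sketched as a post-stratification adjustment: each stratum of the panel is reweighted to match its known share of the population. The strata, shares and response rates below are hypothetical, invented purely for illustration; they are not Essential’s or the Scanlon Foundation’s actual formulas.

```python
# A minimal sketch of post-stratification weighting. All figures are
# hypothetical: one age dimension, with the panel over-representing
# older respondents relative to the population.

# Hypothetical shares of each age stratum in the population vs. the panel
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
sample_share     = {"18-34": 0.15, "35-54": 0.35, "55+": 0.50}

# Hypothetical proportion in each stratum answering "yes" to some question
yes_rate = {"18-34": 0.30, "35-54": 0.45, "55+": 0.60}

# Unweighted estimate: every sampled respondent counts equally,
# so the over-sampled older stratum dominates
unweighted = sum(sample_share[s] * yes_rate[s] for s in yes_rate)

# Post-stratification weight: population share / sample share per stratum
weight = {s: population_share[s] / sample_share[s] for s in yes_rate}

# Weighted estimate: each stratum now contributes its population share
weighted = sum(sample_share[s] * weight[s] * yes_rate[s] for s in yes_rate)

print(round(unweighted, 4))  # 0.5025 — biased towards older respondents
print(round(weighted, 4))    # 0.4575 — matches the population mix
```

The catch the article points to is that weights like these can only be calibrated against a known outcome, such as an election result; for social-attitude questions there is no equivalent benchmark, so the correction is far less precise.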
An extensive review of online survey methodologies found that:
Computer administration yields more reports of socially undesirable attitudes and behaviours than oral interviewing, but no evidence that directly demonstrates that the computer reports are more accurate.
Major organisations seeking the highest level of reliability continue to employ random population sampling, despite the cost involved.
To test the impact of different methodologies, in 2014 the Scanlon Foundation administered the same questionnaire to both a random sample of the population and an online panel. It found 44% of Australia-born online panel respondents whose parents were born in Australia indicated they held “very negative” or “negative” views toward Muslims. The same demographic in the random sample had a much lower percentage (28%).
There is a second issue, just as important, with the Essential finding.
Surveys do not simply identify a rock-solid public opinion; they explore it, and the questions asked have the potential to distort as much as to reveal. Essential chose not to present respondents with a range of options on Muslim immigration. Rather, it was a yes/no choice:
Would you support or oppose a ban on Muslim immigration to Australia?
The product was easy-to-understand copy for the media, but arguably also a gross simplification. Public opinion on social issues defies binary categorisation. It is more accurately understood in terms of a continuum, with a middle ground on some issues in excess of half the population.
For example, with regard to asylum seekers, nine polls between 2001 and 2010 using various methodologies asked respondents if they favoured or opposed the turning back of boats. The average for these surveys was 67% in favour of turnbacks.
But, in 2010, the Scanlon Foundation survey tested opinion on this topic by offering four policy options, ranging from eligibility for permanent settlement to turning back of boats. In this context, a minority of just 27% supported turnbacks.
Minorities and Australian opinion
Survey findings are typically considered in isolation in the media, with little sense of context: of what falls within the expected range and what lies beyond it.
The Essential survey of attitudes to Muslims is hardly the first in the field. Several random population samples since 2010 have found that when respondents are asked for attitudes to minorities, by far the highest level of negative opinion is towards Muslims.
A random population sample by Roy Morgan Research in October 2015 asked respondents if they “support or oppose Muslim immigration”. It found that a minority, 36%, were opposed, while 55% were in support. Among Greens-voting respondents in the Morgan poll, just 1% indicated opposition, a marked contrast with the Essential finding of 35%.
A last issue concerns broad context. If the Essential finding is a sound reflection of Australian opinion, is it beyond the realm of previous findings? We cannot be certain, because past surveys rarely raised the zero option – the banning of a specific group – without establishing the range of opinion.
Between 1984 and 1988, however, when there was considerable public discussion of Asian immigration, ten surveys asked if the number of Asian immigrants was too high. On average the surveys found 58% were of that opinion, with a peak of 77% obtained by Newspoll in 1988.
And, in 1996 – at the time of Hanson’s maiden speech in the federal parliament – an AGB McNair telephone poll found 53% of respondents agreed that Asian immigration “should be reduced”.