Telum Talks To… Pureprofile Data Scientist Dr Uwana Evers

Research is used in a multitude of industries in a multitude of ways. Sometimes it is used to make informed business decisions, while in PR and the broader media industry it is used to gain an understanding of people, and the results are often newsworthy.

When research produces a high-profile failure such as the Australian Federal Election last month, questions are raised about its legitimacy. Indeed, in the wake of the election, The Sydney Morning Herald and The Age no longer have an active polling or research agency.

Telum Media spoke to Dr Uwana Evers, a data scientist at data and insights firm Pureprofile, about what this means for the future of polls and research in media.

If the political polls are broken, why aren’t other surveys / research?
Political polls aren’t broken. However, as technology and the population change, so too does the way pollsters gather and interpret data.

We know some of the bigger polling companies use landlines only, while others rely on landlines plus online panels to gauge voter sentiment. Polling companies need to keep up with the times, and this means adding modern communication channels to the mix. For example, most millennials have mobile phones, not landlines, so this demographic can naturally be missed.

The smartphone penetration rate in Australia is almost 85 per cent, according to Statista.

Modern research-gathering methods use online as a primary tool. At Pureprofile we have a mobile-first strategy to ensure all our surveys are readable, accessible and functional on any portable device.

We’ve found most people tend to respond to surveys at their chosen time – much like how people can now bank when they want to or watch movies on demand.

The perception that baby boomers aren’t online or aren’t internet savvy is wrong – our panel (and those of other research firms) has shown we can reach a wide yet targeted demographic, across generations.

Polling companies also weight, or “tweak”, the raw data to ensure the sample accurately represents the intended population. They also take preferences into consideration when deciding what the polls will reflect.
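Weighting of this kind can be sketched in a few lines. The age brackets and population shares below are purely illustrative assumptions, not real census or panel figures:

```python
# Sketch of post-stratification weighting (illustrative figures only).
from collections import Counter

# Each respondent's age bracket; this raw sample deliberately skews older.
sample = ["18-34"] * 20 + ["35-54"] * 40 + ["55+"] * 40

# Assumed population shares (hypothetical, not real census data).
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

counts = Counter(sample)
n = len(sample)

# Weight for each group = population share / sample share, so answers
# from under-represented groups count for proportionally more.
weights = {g: population_share[g] / (counts[g] / n) for g in counts}

print(weights)
```

Here the under-sampled 18-34 group gets a weight of about 1.5, while the over-sampled older groups are down-weighted to about 0.875 each.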

Political polls have a greater potential to predict the wrong outcome because the margins are so narrow: in recent cases, less than one per cent. Most other kinds of research do not have such narrow margins and are therefore less likely to make inaccurate predictions.

How does research differ from a political poll?
A political poll is a particular type of research. Broadly speaking, research can have different goals: a company might want to better understand consumer behaviour or create brand awareness, whereas a PR agency might want to generate a strong headline for a good media pitch. Political polls are conducted to gauge voter intentions.

How can Australians trust surveys / research, when the political polls – which is what the average punter might think of when talking about surveys – seem to be broken?
The answer is we can’t trust them in an absolute sense; they should be used as a guide only. We’ve seen with the 2015 UK General Election, the 2016 Brexit referendum and the 2016 US election that polls can and do get it wrong. The problem is not unique to Australia.

There isn’t a silver bullet when it comes to accurately gauging voter sentiment. Political parties need to have an arsenal of tools and technologies at their disposal – both physical and virtual – and this includes using surveys, polls, social listening, sentiment analysis, focus groups, door knocking and more.

What is it about polling or survey methods that leave it open to such inaccuracies as we’ve seen in this election, and how do we go about fixing it?
Polling tries to get a representative sample, but getting a representative sample of the entire voting population can be challenging. With people no longer having landlines or being in the phone book, old methods of random selection no longer work.

In politics, the margin of error can be within one per cent. If we wanted to predict the election from an Australian sample with 95 per cent confidence and a one per cent margin of error, based on a population size of 25 million, we would need to ask around 9,600 randomly sampled Australians.

We would also need to do this at the electorate level to be accurate, as seats are won and lost at this level. The sample sizes required at the electorate level are not much smaller: for Warringah’s 104,449 voters, for example, we would still need roughly 8,800 randomly sampled people. It would be prohibitive (impossible!) to pre-poll all electorates with any level of accuracy before an election.

This doesn’t even take into account the effect of swing / undecided voters.
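The sample sizes above follow from the standard formula for estimating a proportion, with a finite population correction applied. This is a sketch assuming the worst-case 50/50 split and 95 per cent confidence (z ≈ 1.96):

```python
import math

def required_sample(population, moe=0.01, z=1.96, p=0.5):
    """Sample size needed to estimate a proportion within `moe` at the
    confidence level implied by z (1.96 ~ 95%), assuming the worst-case
    split p = 0.5, with a finite population correction."""
    n0 = (z ** 2) * p * (1 - p) / moe ** 2   # infinite-population sample size (~9,604)
    n = n0 / (1 + (n0 - 1) / population)     # finite population correction
    return math.ceil(n)

print(required_sample(25_000_000))  # national sample -> 9601
print(required_sample(104_449))     # Warringah-sized electorate -> 8796
```

Note how little the required sample shrinks for a single electorate: the finite population correction only matters once the population gets close to the sample size, which is why pre-polling every electorate accurately is so expensive.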

Pollsters can utilise different sampling and data collection techniques to ensure different voices in the population are represented in the sample. It’s not just about including people across generations and from different locations, but also about having non-English language options and incorporating multiple methods of reaching these diverse people.

These polls come out on a fortnightly or monthly basis – is this frequent enough to get an accurate cross-section of the population? Is there a correct amount of time to gauge a change in opinion or attitude?
Tracking shifts in sentiment over time can be beneficial. Taking the pulse of the same pool of voters can help guide policy settings before and after they’re introduced.

There will never be a “right” amount of time to gauge a change in opinion or attitude, especially if an electorate has a large number of swing voters – people who don’t decide until they step into the voting booth – who can overturn any previously measured outcome.

That’s why it’s important (in the political arena) to use a myriad of methods to determine as closely as possible how people will vote.

Read more at https://www.telummedia.com

Dr Uwana Evers is a Data Scientist at ASX-listed data and insights firm Pureprofile. She is also a research fellow at the University of Western Australia. Prior to her academic post, she worked for Thomas International in London as its Psychology Research Analyst. Uwana is a BPS Chartered Psychologist and has a PhD in Psychology from the University of Wollongong.
