Ipsos: The trouble with research surveys - and why they won’t die

By Andrew Green | 16 December 2014

We still rely on surveys to tell us how many people are reading newspapers and magazines. The Ipsos emma survey, for example, questions a representative sample of around 50,000 Australians every year about what they read, what they watch, what they own and what they buy, and projects the results to the whole population.

But why do we continue to do this? The great thing about the internet is that everything can be tracked, recorded and measured. Every keystroke is captured and every word we write or web page we visit is recorded. Why depend on the not-always-reliable memories of a few thousand people when everything they do is already being automatically recorded?

One reason, of course, is that not everything people do in life is online. Life very much endures in ‘analogue’: in Australia, 17.3 million printed newspapers are sold every week.

Another is that data captured automatically about our online behaviour suffers from many limitations. For a start, most internet traffic is not human, so this has to be filtered out of the audience data. Some traffic comes from outside Australia, so this too must be removed. Some of it, as we also know, is fraudulent.

A visit to a newspaper or magazine webpage can last as little as a second, yet a web analytics supplier will count it in exactly the same way as a page viewed for a minute. Content may or may not be viewable. And the various tracking methods used to try to identify ‘unique’ page views (people accessing a web page for the first time), such as cookies or advanced ‘fingerprinting’ of computers, all have their flaws, including the obvious one that they measure machines rather than people.

So to gain a complete picture of people’s reading behaviour, it remains essential to use survey research, ideally in combination with other kinds of data, including web analytics and online and mobile panels. Non-survey methods can only ever offer part of the readership measurement story.

Surveys and their flaws

Of course surveys have their flaws too. People may not remember exactly what they are being asked to remember. They may be untruthful in their responses to certain questions, or they may be in a hurry to finish and so rush their answers.

In recent years, it has become much harder to recruit people to take part in surveys, particularly when interviewers knock on their doors without appointments. Market research companies have employed a number of measures to address this, including being more persistent (calling back more often), offering more attractive incentives and offering more convenient ways for respondents to complete the surveys, such as doing them online at their convenience rather than when an interviewer comes to the door or calls.

What is now clear is that surveys, panels and web traffic data work best in combination with one another rather than as standalone information sources. The smart approach today is to develop statistical methods for combining the information from these different sources to present a complete picture of reading behaviour.

Each of these components of the measurement solution – the survey, the panels, the web traffic capture and the statistical integration methods used – needs to be of the highest possible quality.

The Four Pillars of Market Research Surveys

At the heart of good survey research are four key pillars underlying the process:

1. Making sure we talk to the right people;
2. Persuading them to take part in a survey;
3. Asking them the right questions in the right way;
4. Ensuring they are engaged and honest in their answers.

Getting to the right people

It is obviously important that the people we do talk to in a survey are representative of the people we are trying to measure. We need to ensure that they properly represent people living in different parts of Australia and that they reflect the various gender, age and income sections of the population. Where there are differences between the characteristics of a recruited sample and the population, various kinds of survey weighting are used to bring the sample into line.
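To make the idea concrete, the sketch below shows the simplest form of this kind of weighting, where each demographic cell is weighted by its population share divided by its sample share. The age bands and figures are illustrative only, not drawn from the emma survey.

```python
# Minimal sketch of cell (post-stratification) weighting.
# The categories and figures below are illustrative only.

# Share of the target population in each age band (e.g. from census data)
population_share = {"14-29": 0.27, "30-49": 0.35, "50+": 0.38}

# Share of the achieved survey sample in the same bands
sample_share = {"14-29": 0.20, "30-49": 0.36, "50+": 0.44}

# Weight for each cell = population share / sample share, so under-represented
# groups count for more and over-represented groups count for less.
weights = {cell: population_share[cell] / sample_share[cell]
           for cell in population_share}

print(weights)  # e.g. {'14-29': 1.35, '30-49': 0.97..., '50+': 0.86...}
```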

One particular innovation has been the use of ‘propensity’ weighting to help ensure the sample correctly typifies the behavioural characteristics of the population as a whole. Some people do not habitually read printed newspapers or magazines, for example, and may therefore feel they do not ‘qualify’ to be asked questions about an activity they rarely participate in. But they do. A survey needs to represent readers and non-readers to be properly representative (just as, ideally, an online panel needs to represent both heavy and light internet users).

emma™ survey participants do tend to be heavier print readers than non-participants. We know this because people who decline to take part in our survey when contacted are asked briefly about their reading habits before we thank them and close the call. The many thousands who do answer these short questions tend to be lighter readers than the people agreeing to take the survey. And these propensities in the population are changing as time goes on. This is now fully taken account of in the published survey results – a world first in readership measurement.
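As a rough illustration of how a propensity adjustment of this general kind can work (the sketch below is an assumption-laden simplification, not emma's actual published method), a simple model can estimate each contacted person's likelihood of agreeing to take the survey from their answers to the short screening questions, and agreeing respondents can then be weighted by the inverse of that likelihood:

```python
# Rough sketch of inverse-propensity weighting using scikit-learn.
# The data, variables and model choice are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# One row per contacted person: 1 = agreed to take the full survey, 0 = declined.
agreed = np.array([1, 1, 0, 1, 0, 0, 1, 0])

# Answers to the short screening questions asked of everyone contacted,
# e.g. number of newspaper/magazine titles read in the past week.
titles_read = np.array([[5], [3], [1], [4], [0], [1], [6], [2]])

# Model the propensity to respond as a function of reading behaviour.
model = LogisticRegression().fit(titles_read, agreed)
propensity = model.predict_proba(titles_read)[:, 1]

# Respondents whose profile made them less likely to take part
# (here, lighter readers) receive larger weights.
weights = np.where(agreed == 1, 1.0 / propensity, 0.0)
print(weights.round(2))
```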

Convincing people to participate

People are busier and less willing to spend lengthy periods of time answering surveys than they used to be. So research companies need to do all they can to maximise response.

Enabling respondents to complete the survey at their own convenience is also important: they are offered the choice of filling it out online in their own time or having an interviewer call. As an incentive for taking part, respondents are offered the opportunity to win cash vouchers in a quarterly draw.

We demand a lot of respondents. Our goal is to make it as easy as possible to take part and to engage them in the process.

Asking the right questions in the right way

The exact way in which questions are asked is very important. In many cases, even quite minor changes to the wording of a readership survey questionnaire have been found to change reported results significantly.

But it is not just wording. How titles are presented to readers also matters: are they, for example, shown as typewritten lists for people to pick from, as mastheads with supporting descriptions, or as full front covers? emma™ has drawn on work from around the world on this and employs what we consider to be best practice – using colour logos for newspapers and recent front covers for magazines to prompt people to recall their reading.

We also ensure that what we call ‘reading’ is clearly defined – in this case as ‘reading a printed publication for at least two minutes’.

Ensuring they are engaged and honest

The fourth pillar in creating a high quality readership survey is to ensure that, once we have recruited a representative sample of people and presented them with questions they are able to understand and answer, they do so honestly and conscientiously.

To help meet this goal, we have designed a high quality visual interface for online survey takers. An algorithm built into the survey helps us to identify and, if necessary, eliminate those who rush through the questions or answer in an obvious pattern that indicates an unconsidered set of responses.
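The specific rules built into the emma survey are not spelled out here, but a minimal sketch of two common checks of this kind – ‘speeding’ and ‘straight-lining’ – might look like the following; the thresholds and data are purely illustrative:

```python
# Illustrative sketch of two common data-quality checks: "speeding"
# (finishing implausibly fast) and "straight-lining" (giving the same
# answer to every item in a grid of questions).
from statistics import median

def flag_low_quality(respondents, speed_factor=0.33):
    """Return IDs of respondents to review or exclude. Thresholds are illustrative."""
    times = [r["seconds"] for r in respondents]
    cutoff = median(times) * speed_factor  # e.g. under a third of the median time
    flagged = []
    for r in respondents:
        speeding = r["seconds"] < cutoff
        straight_lining = len(set(r["grid_answers"])) == 1
        if speeding or straight_lining:
            flagged.append(r["id"])
    return flagged

respondents = [
    {"id": 1, "seconds": 1250, "grid_answers": [4, 3, 5, 2, 4]},
    {"id": 2, "seconds": 310,  "grid_answers": [3, 3, 3, 3, 3]},  # straight-liner
    {"id": 3, "seconds": 90,   "grid_answers": [5, 1, 4, 2, 3]},  # speeder
]
print(flag_low_quality(respondents))  # [2, 3]
```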

Bringing it all together

As noted, all research methods have their weaknesses. Page traffic data measures machines rather than people and offers limited detail about them. Online panels help us to draw a deeper picture of website audiences, but are limited by their small sample sizes. Surveys rely on a representative sample of people accurately recalling their behaviour.

But by combining these data sources – as emma™ has begun to do with its statistical ‘fusion’ of Nielsen Online panellists onto its large and highly robust sample of survey respondents – we can get nearer to the truth than we can by reliance on a single approach.
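For illustration, the sketch below shows the general shape of a nearest-neighbour fusion of this kind, in which each survey respondent is matched to the most similar panellist on variables measured in both sources and ‘inherits’ that panellist's observed online behaviour. It is a simplified sketch under assumed variables and data, not the actual emma/Nielsen fusion procedure:

```python
# Simplified nearest-neighbour "fusion" sketch: donate a panellist's
# observed web behaviour to the most similar survey respondent.
# Variables, distance measure and data are illustrative assumptions only.

panellists = [
    {"age": 24, "web_hours": 30, "news_sites_visited": 12},
    {"age": 47, "web_hours": 10, "news_sites_visited": 4},
    {"age": 63, "web_hours": 5,  "news_sites_visited": 2},
]

survey_respondents = [
    {"id": "r1", "age": 26, "web_hours": 28},
    {"id": "r2", "age": 60, "web_hours": 6},
]

def distance(resp, panellist):
    # Simple distance on the variables common to both sources.
    return (abs(resp["age"] - panellist["age"])
            + abs(resp["web_hours"] - panellist["web_hours"]))

for resp in survey_respondents:
    donor = min(panellists, key=lambda p: distance(resp, p))
    # The fused record now carries both the survey answers and the
    # donor's measured online behaviour.
    resp["news_sites_visited"] = donor["news_sites_visited"]

print(survey_respondents)
```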

Andrew Green
Global CMO
Ipsos MediaCT
