
Getting ONS Social Surveys online

A key feature of the transformation of household and social data collection in ONS is a move towards using multiple data sources, with the balance shifting increasingly away from traditional surveys towards non-survey sources. Surveys will of course still be required in some capacity, in particular to answer the ‘why’ policy questions, but in the future we expect non-survey data to be the primary source for meeting our requirements.

When we do need surveys, the expectation is that they will be mixed mode but ‘digital by default’, that is, online first. There has been a huge amount of activity in the last 12 months to understand the impact of moving social surveys online, using a prototype transformed Labour Market Survey as a testing vehicle.

A lot of this development work has been qualitative in nature, focussing on respondent documents and questionnaire design using an innovative respondent-centric approach (discussed here).

Additionally, two major quantitative tests have been undertaken:

Test 1 – July 2017

We mailed out to 38,000 households using an experimental survey design to answer some key questions: what proportion of households would take part online? What mailing strategies and materials would help maximise take-up? Maximising online take-up matters because it increases the proportion of data collected in a relatively inexpensive collection mode.

We were encouraged that overall around 20% took part online (with no monetary incentive), and that if we sent invite letters on a Wednesday in a brown envelope (rather than white), and sent three mailings about the survey, this could be increased to around 22-23%.
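For readers who want to check whether a take-up difference like this is more than noise, a simple two-proportion z-test is one way to compare experimental arms. The sketch below is illustrative only: the headline rates are from our test, but the arm sizes and counts are hypothetical (the actual allocations are in the Ipsos MORI report).

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sample z-test for a difference in proportions (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided
    return z, p_value

# Hypothetical arm sizes; the real split across the 38,000 households
# is described in the Ipsos MORI report.
z, p = two_proportion_ztest(x1=1140, n1=5000,   # ~22.8% take-up (e.g. brown envelope arm)
                            x2=1000, n2=5000)   # ~20.0% take-up (e.g. white envelope arm)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With arms of this size, a 20% versus 22–23% difference would be comfortably detectable, which is one reason large mailouts are needed for this kind of experiment.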

More detailed results of this test can be found in a report authored by Ipsos MORI (who managed the survey mailings and hosted the questionnaire online).

Test 2 – September 2017

Next challenge: what happens if we introduce an incentive? What is most effective in maximising response, and, more importantly, what is cost effective? We mailed out to a fresh 40,000 households, again with an experimental design, fully described in the report for this test authored by Ipsos MORI. This test demonstrated that a heavily incentivised online survey could achieve in excess of 30% response, but (against expectations) we found that sending a non-monetary incentive, in the form of a tote bag posted to households in advance, was the most cost-effective approach.
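To illustrate how a cost-effectiveness comparison like this works, here is a minimal sketch with entirely hypothetical unit costs (the real figures are in the Ipsos MORI report). The key design point it captures is that an unconditional advance incentive goes to every sampled household, while a conditional incentive is only paid out for completed responses.

```python
def cost_per_complete(n_mailed, response_rate, mailing_cost,
                      incentive_cost, conditional):
    """Total outlay divided by achieved online responses."""
    completes = n_mailed * response_rate
    # Conditional incentives are paid only to responders;
    # unconditional ones (e.g. an advance gift) go to everyone mailed.
    incentive_units = completes if conditional else n_mailed
    total_cost = n_mailed * mailing_cost + incentive_units * incentive_cost
    return total_cost / completes

# All unit costs and rates below are hypothetical, for illustration only.
advance_gift = cost_per_complete(40_000, 0.30, mailing_cost=1.50,
                                 incentive_cost=2.00, conditional=False)
voucher = cost_per_complete(40_000, 0.32, mailing_cost=1.50,
                            incentive_cost=10.00, conditional=True)
print(f"advance gift: £{advance_gift:.2f} per complete")
print(f"conditional voucher: £{voucher:.2f} per complete")
```

Under assumptions like these, a cheap unconditional gift can beat a larger conditional incentive on cost per complete even if it produces a slightly lower response rate, which is consistent with the pattern we saw.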

So, a lot of testing activity, but we now have a clear picture of how to maximise online response and how to use incentives cost-effectively. This will be vital learning as we move towards understanding more about how to integrate a face-to-face collection mode with the online mode, and how to retain households in later waves of social surveys.

Other testing and learning

Along the way we have learnt a lot about who takes part in surveys online, and busted a few myths in the process. For example, some of us thought introducing an online mode would mean a more balanced sample in terms of age, with proportionately more young people taking part online. Not so! We have learnt that young people are a challenge to get to respond regardless of mode, and it is actually those aged 50–70 who are the most likely to respond online.

We have also been able to run a small test to see what proportion would complete a ‘Wave 2’ survey: 6 in 10 did so, which we’re really pleased with and which bodes well for future longitudinal testing.

Overall, gathering these metrics has been of vital importance to inform further development. We could have just relied on our perceptions of the best thing to do, but when perceptions and facts are out of line we know that survey design is not going to be as good as it could be.

The next challenge is to develop the prototype survey further and properly scope out the impact of a mixed-mode survey: what response rate might we get, and are the data we get biased? More evidence needed!

Denise Sexton