Respondent engagement for push-to-web social surveys

Policy details

Publication date: 1 February 2021
Owner: Social Surveys Transformation - Research and Design Team
Who this is for: Social researchers, survey designers, survey managers, content designers
Type: Report
Contact: Research.And.Design@ons.gov.uk

Document downloads

Respondent engagement for push-to-web social surveys: qualitative research report (DOCX 2MB)

Templates:

These files may not be suitable for users of assistive technology. If you require any of these files in an alternative format, please email: GSSNet@statistics.gov.uk


Acknowledgements

This research has been completed over several years and has involved input from numerous researchers and experts across the Office for National Statistics (ONS).

Main contributors: Natalia Stutter, Laura Wilson, Sophie Nickson, Tara McNeill

Design: Thais Faria, Grace Ellins, Andy Budd, Rachel Price

Research support: Alex Nolan, Charlie Hales, Emma Dickinson, Vicky Cummings, Emma Timm and their teams

Additional thanks to our colleagues in Social Survey Transformation, the Survey Operations Research Team and the Materials Strategy Advisory Group.


Executive summary

The purpose of this report is to:

  • share the learning from our qualitative research so that it can be used and built on by others
  • provide advice and recommendations for those developing respondent communication materials for their own push-to-web surveys

Over the last few years, the Social Survey Transformation Research and Design Team, as part of the Census and Data Collection Transformation Programme (CDCTP), has conducted a series of qualitative and quantitative research activities to develop a suite of push-to-web respondent materials. Online completion is offered in the first instance as the primary mode of data collection.

In 2012 the government introduced the digital by default strategy, which supports the design and delivery of digital services. Alongside this, standards and guidance on how to design digital services were released. Following this guidance, the approach to this work is respondent (user) centred.


Introduction

The ONS collects data by running large-scale household surveys. Traditionally, many of these surveys are conducted in person or over the telephone. In line with the government's digital by default strategy, and as part of the Census and Data Collection Transformation Programme (CDCTP), the ONS has been exploring online-first data collection (also referred to as a push-to-web approach) alongside alternative sources of data, such as administrative data.

A push-to-web approach to data collection means that respondents (referred to in this document as ‘users’) will first be asked to complete the survey online before any other mode is offered. If the user is unable to complete online, an alternative follow-up mode of data collection, such as face-to-face or telephone, will be introduced.

In the push-to-web approach, the first touchpoint users have with the ONS is the materials they receive in the post. This means that, compared with traditional data collection methods, additional reliance is placed on the communications to convince people to take part.

Understanding who our users are

The users of ONS social surveys are potentially anyone living in the UK. Most of the ONS’ surveys are for people aged 16 years or over, but sometimes children are invited to take part too. It is important that the household surveys are representative and unbiased. To achieve this, addresses are sampled from a database, usually the Postcode Address File (PAF) or AddressBase. This means that households are selected at random and nothing is known about the person or people who live there. Therefore, it is important that initial contact materials are inclusive, accessible and meet the needs of a range of different users.
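
As a minimal sketch of what this kind of frame-based random selection involves (the file name and sample size are illustrative assumptions, not the ONS sampling design), an equal-probability sample of addresses could be drawn like this:

    import random

    def sample_addresses(frame_path: str, n: int, seed: int = 2021) -> list:
        """Draw a simple random sample of n addresses from a sampling frame.

        The frame is assumed to be a text file with one address per line,
        for example an extract from the Postcode Address File (PAF).
        """
        with open(frame_path) as f:
            frame = [line.strip() for line in f if line.strip()]
        rng = random.Random(seed)    # fixed seed makes the draw reproducible
        return rng.sample(frame, n)  # equal-probability sampling without replacement

    # Example: select 1,000 households at random; nothing is known about who lives there
    households = sample_addresses("paf_extract.txt", n=1000)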

Although the materials are an offline part of the respondent’s journey, they ask users to complete an action online, if they are able. The UK Government accessibility regulations mean that all public sector organisations have a legal duty to make their websites accessible to everyone, including those with disabilities. We have extended our approach to accessibility to consider the end-to-end journey, including both online and offline touchpoints, and in collaboration with our graphic designers have applied the principles of accessibility to printed products. The ONS Style Guide includes guidance on web accessibility.

We explain more about how we have designed our materials for accessibility later in this report.

What we need people to do

ONS needs households to take part in voluntary online surveys so that it can collect data that are used in the production of accredited official statistics. The aim is to get as many different people from sampled households across the UK as possible to take part in these surveys to help produce quality statistics. Although users can complete on behalf of another person in their household (known as responding by ‘proxy’), the best outcome of a household survey is where each individual aged 16 years or over in the household completes their own section.

Unlike other GOV.UK services which a person might be obliged to engage with, such as obtaining a driving licence or paying tax, all ONS social surveys are voluntary.
In practical terms, for a user to take part we need them to:

  1. Open the envelope
  2. Read the letter
  3. Go to the website
  4. Complete the survey
  5. Return for a subsequent wave, if taking part in a longitudinal survey

This research explores the role of communications at and between all five steps and the different user journeys that might be involved in reaching the completion step.
This document outlines the several years of research and development carried out to understand user needs and create respondent communications that are integrated as part of the whole survey experience.

Our wide target audience, and feedback from ONS field interviewers, suggest that many people asked to complete an ONS survey are unlikely to consider it a priority. Acknowledging this potential lack of propensity to respond (and the evidence of current response rates), we set out to explore different strategies to try and ‘nudge’ people into acting as soon as possible in their user journey. We aimed to learn about their motivations for, and barriers to, taking part and to develop solutions to address them.

This report explains the qualitative research carried out to develop each of the respondent engagement products in the user journey. This includes the envelopes, contact letters, between-wave engagement and interviewer stationery used in mixed-mode data collection. Each piece of material or set of materials is referred to in this document as a ‘product’. The research conducted for the online respondent materials follows the three agile development phases used by the Government Digital Service (GDS) in product development: discovery, alpha and beta.

Discovery

The purpose of a discovery is to understand the problem. In this research we have used a discovery to start understanding user needs and to conduct conceptual exploration of topics. Typical discovery research activities include desk research, workshops with staff, focus groups with users and pop-up testing. These insights are used to inform the next stage of research and development: alpha.

Alpha

In the alpha phase, research involves testing prototype products and ideas that have been developed using the insights gathered in the discovery research. It allows you to explore assumptions you might have about the best way to meet user needs. The alpha allows you to try things out and to drop or iterate on them if you find they don’t work.

Beta

Product development in the beta phase involves taking the findings from alpha and rolling out a product into a practice live environment (a private beta), exploring whether the product meets user needs in the way that you expect. It allows you to assess how the product performs in this environment and identify where improvements may still need to be made.
The qualitative research conducted through the discovery and alpha phases was used to develop prototype products to be trialled in large-scale quantitative tests as part of a private beta.

The high-level results of these trials will be covered in each product section, along with links to the full reports. This report will begin by discussing the main contact letters.
Although these products have been developed for push-to-web surveys, many of the learnings from our work, and the design features in the final products, can be applied to any mode.


Summary of findings and recommendations

The user centred design approach taken in this research revealed not only the consistent user needs, described throughout this report, but also important features that can help address these needs.

  1. Branding: including official logos and any supporting branding, such as the Royal Coat of Arms, adds authenticity and legitimacy to the letter, providing reassurance for users. We would advise thinking about how your users interact with your brand across their journey and how you can ensure consistency at different touchpoints.
  2. Tone: in our research we explored different tones, from overly friendly to authoritative. For ONS, users expected the tone of our communications to be friendly but professional. This will depend upon the type of organisation you are and the image you already have, so it is worth exploring in context.
  3. Icons: using iconography next to the text was found not only to make the letter more visually engaging but also to help skim readers find the information they needed quickly. Icons also help guide users with low levels of literacy, breaking up the text and indicating what the adjoining paragraph is about. Although helpful, icons should only be used as a support tool, and not to replace text as the means of conveying information.
  4. Sub-headings: using sub-headings to divide up text helps users navigate the text and understand what to expect in each subsequent paragraph. This allows the user to assess whether they want to read the section or not.
  5. Infographics: these are useful for helping to explain the survey process. For example, the three-step instructions for completing the online survey and the five-step diagram on our leaflet include both pictorial elements and text. In our research, infographics were found to engage users by making the process look simple, and they drew attention to the action required of users.
  6. Layout: in the research we found that ‘less is more’. Users told us that they would be put off by having too much information or being faced with a wall of text. Some users also mentioned a reluctance to turn the page or open a leaflet. To minimise any potential friction in the users’ experience, we suggest limiting the content to what is needed by the user at each point in time. This will help create more white space, providing breathing room for the text.
  7. Sign-off: there was an expectation from our users that the letter is signed off by a named person with a signature. This provides a degree of authority and legitimacy to the letter.
  8. Leaflet: when presented alongside a letter, the leaflet is considered as secondary material by users. It should be used for any supplementary information that has not been prioritised for the letter. The content should still be user-focused and assist in meeting the needs identified in section 11.

As a result of this research we have produced a suite of generic materials; these can be found on the report web page, alongside three letter templates that can be used as a starting point for creating your own letters.


Contact letters

Discovery

Because there is no UK address register containing population contact details, the only way to make initial contact with people and invite them to take part in an online survey is by post. The letter therefore needs to be easily understood and persuade people to take part.

Note: After first contact we can use other media, as we will have collected alternative contact details, such as email addresses. These alternative approaches are integrated into our transformed communication strategy and are detailed later in this report.

Exploring and experimenting with contact strategies for push-to-web surveys is not a new concept. There are many past studies in the UK (Taylor and Lynn, 1998; Lynn, 2016), the USA (Bandilla, Couper, and Kaczmirek 2012; Dillman, 2011; Messer and Dillman, 2011; Dillman 2017; Kaplowitz, 2012) and Australia (personal correspondence from Australian Bureau of Statistics) that illustrate the role that communications have in influencing engagement with surveys.

However, each of the studies and strategies explored are context specific to the sample population, country, culture, organisation, survey topic and mode. Therefore, if possible, we always recommend conducting your own research to mitigate contextual risk.

Furthermore, the ONS’ position as an independent government department is unique to the UK, and we know from internal research by our communications department that engagement is likely to be influenced by audience perceptions of, and attitudes towards, politics, government and large-scale data collection more generally.

The move towards push-to-web in the context of household surveys also presents new challenges (Couper, 2000). It is easier for a user to ignore a letter or exit a survey than it is to avoid answering the door to an interviewer at their home or to ask them to leave mid-interview.

The Discovery phase of this work started in 2016 during the ONS’ Electronic Data Collection (EDC) programme, where initial contact letters were developed alongside the questionnaire. The letters went through multiple iterations of cognitive and user testing. This section summarises the research and insights from this Discovery work on those initial contact letters. The EDC insights were used to inform the development of the subsequent CDCTP communications work.

Main findings

The Discovery research involved focus groups and one-to-one interviews (reports available on request). Across several iterations of the materials, the main user needs identified for the initial contact letters were:

  • understanding what the data is used for and by whom
  • knowing what to expect when completing the study and how long it would take
  • making it clear that the study was being offered online
  • clarifying which types of device the study could be completed on
  • providing reassurance about data protection and confidentiality
  • offering a telephone number to call for help
  • a website address to complete the study
  • clear login details and instructions

In addition to the main user needs, the following insights were obtained:

  • navigation from the letter to the web should be as simple and straightforward as possible to enable those with low digital skills or other accessibility requirements to access the study independently
  • URL and login codes should not contain ambiguous characters, for example ‘l’ and ‘I’ or ‘O’ and ‘0’ (see the code sketch after this list)
  • bolded text stood out, and bold should be used to highlight important messages
  • short paragraphs of information were easier to understand than longer ones
  • the inclusion of the ONS logo and Royal Coat of Arms stood out to users
  • using colour in the letter design made it look more friendly
  • the quick response (QR) code stood out, but this didn’t necessarily mean users would use it
  • users felt that it was important to know that their contribution was valued
  • using “Dear Sir/Madam” to address the reader was as expected (see Contact letters – Alpha for a later update on how we address the letters)
  • the use of “Dear Resident” was considered spam or junk mail, and users would prefer the letter to be personalised with their name
  • users felt that including a signature on the letter made it more personal
  • the purpose of the prenote was understood to be an introduction to the study and encouragement to take part
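
As a minimal sketch of how this recommendation can be applied when generating login codes (the alphabet and code length here are illustrative assumptions, not the ONS specification), the ambiguous characters can simply be excluded from the generator’s alphabet:

    import secrets

    # Uppercase letters and digits with ambiguous characters removed:
    # no 'I' or '1' (confusable with 'l'), and no 'O' or '0'.
    SAFE_ALPHABET = "ABCDEFGHJKLMNPQRSTUVWXYZ23456789"

    def generate_access_code(length: int = 12) -> str:
        """Generate a random access code containing no ambiguous characters."""
        return "".join(secrets.choice(SAFE_ALPHABET) for _ in range(length))

    code = generate_access_code()  # for example, a 12-character household code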

Following these high-level insights, there was also feedback on specific aspects of the prototype letter:

Headline “Play your part”:

  • the words “play your part” made respondents feel that taking part could help and this was encouraging
  • it also made users feel important and gave the letter a personal feel

Sub-heading “Your household has been selected”:

  • this phrase had mixed reviews from users, including some negative comments relating to it sounding a bit “bogus” and “a scammy type thing”

Sub-heading “We value your contribution”:

  • users felt positively towards this subheading because it promoted a feeling of having real input which was found to be encouraging
  • users felt that this was important to include

The phrase “a few days from now”:

  • users understood the sentence and expected to receive another letter
  • it was important to users that context was given before reading this information about what would happen next

The phrase “Each year about half a million…”:

  • initially, users felt that this was a lot of people, but when compared with the UK population it seemed small
  • the large sample for the survey indicated legitimacy and suggested to users that the results would be reliable
  • because half a million people take part, some users felt that it was less special to be taking part, and this took away from the importance of their participation

The phrase “The ONS is the country’s largest provider…”:

  • users felt that the short introduction to ONS was important to have at the beginning of the letter
  • the reference to the Census was appreciated to help users understand what the ONS does

The insights gained from this Discovery work during the EDC programme informed later design phases and laid the foundation for the CDCTP materials product work. Some of the EDC wording was re-tested and re-developed in the later phases of this work to confirm that the findings still held.

The ONS’ CDCTP approach to transformation follows the Government Digital Service (GDS) standards and principles, which are underpinned by human-centred design. In this approach to design, the user’s needs direct the work and content. Insights are gathered through user testing which considers the whole end-to-end journey for the respondent. This joined-up, user-centred way of working is a more holistic and inclusive way of considering survey design. It includes engagement contact strategies alongside the data collection instrument for different user groups and needs. All legal and ethical commitments are retained and met by the content.

Alpha

To invite users to take part in a study it is traditional to send an invitation letter. In face-to-face or telephone modes this is often referred to as the ‘advance letter’. Research has shown that prenotification letters can be beneficial for online surveys (Dillman, Smyth, and Christian, 2014). A prenotification letter is sent ahead of the survey going live for data collection to prepare users. In developing the suite of letters, these three products – prenotification, invitation and reminder – were the primary focus.

The EDC Discovery insights prompted us, in the CDCTP work, to think more about the psychological tools that could be used to encourage response. Recognising the recent successes of behavioural insights within policy and government, this is what we turned to next.

Behavioural insights research

Behavioural insights, or ‘nudge theory’, is a psychological technique used to encourage behaviour change. Traditionally used in health interventions, nudges involve implementing small changes to try to produce a desired outcome or effect. In 2010 the Nudge Unit, later renamed the Behavioural Insights Team (BIT), was set up by the Cabinet Office with the aim of applying behavioural economics to public policy. Successful trials have been carried out within UK government; for example, HMRC sent tax letters stating, “most people pay their taxes on time”. This social norming nudge helped significantly increase payment rates (Halpern, 2015).

The framework used to develop these techniques or interventions is known as ‘EAST’, which is based around the principles of making the desired behaviour easy, attractive, social and timely.

We used this framework to think about how behavioural insights could be applied to online social survey letters. Below are some of the ideas from the behavioural insights literature that we suggested could be explored in the ONS survey context.

To apply the make it easy principle:

  • ensure text is clear and simple
  • avoid complicated or specialist language
  • reduce volume of text and remove clutter
  • use bold for key statements; people are more likely to believe statements written in bold

To apply the make it attractive principle:

  • use the right messenger for the audience; the ‘messenger effect’ can add credibility and build in personalisation and reciprocity
  • use the word ‘you’ to grab attention, promote personal responsibility and reduce diffusion of responsibility
  • use the word ‘our’, for example, ‘our society’ to increase relevance
  • frame the user’s decision to not respond to the initial ask as an oversight, allowing their participation at the next touchpoint to be perceived as an opportunity to do something positive (i.e. make it attractive) and their decision not to take part conversely, as unattractive
  • use loss aversion – frame the chance to take part as something to be lost, for example ‘don’t miss out on your chance to be counted’
  • explore envelope colour; white envelopes with personalisation have been shown to be effective at improving engagement with mail
  • give a reason for taking part; using ‘because’ increases compliance
  • incentivise behaviour as a friendly thank you, not as a payment for doing the right thing
  • make it salient by personalising or referring to local area

To apply the make it social principle:

  • use descriptive social norming – what people actually do versus what they should do, for example ‘most people answer our surveys’
  • diffusion of responsibility – focus on the importance of their selection and avoid mentioning sample sizes
  • skewed estimates of prevalence influence behaviour in a self-reinforcing way – avoid specific figures in favour of terms like ‘most people’
  • people can be influenced by the behaviour of others who are similar or from the same area; draw on this rather than on people in general
  • encourage a ‘good mood’ to improve open-mindedness – preface with an incentive (for example, a gift) and use pleasant graphics, photos and colours
  • include a photo of the messenger to increase compliance; a photo can also increase reciprocity

To apply the make it timely principle:

  • appeal to hyperbolic discounting by offering an incentive up front and mentioning the time cost as soon as possible
  • temporal decay – reward participation upfront (also making it attractive), describe how long the task will take and ask people to plan the time into their schedule later
  • use the endowment effect, which means that when something is ‘yours’, it has greater worth than the same object belonging to someone else – use this to make the survey and its responses belong to the participant – ‘your responses’ makes them seem valuable

Focus groups with public

After desk research reviewing previous Discovery work and the behavioural insights literature, three prototype prenotification letters using different tones and techniques were developed and tested with members of the public in focus groups (see below). These focus groups were conducted in England, Wales and Scotland.

Letter one
Headline: Play your part in this important ONS Study
Tone: Friendly, personal
Behavioural insights or design feature: a behavioural commitment device, designed as a cut-out for the fridge, that states: ‘Remember to play your part online in an important ONS study’

Letter two
Headline: Play your part in Shaping Society – Take part in this National Online Study
Tone: More formal, direct and professional
Behavioural insights or design feature: icons per paragraph (house with a tick; computer, tablet and smartphone; house with a family) and headed sections

Letter three
Headline: Play your part in this important ONS Study
Tone: Formal
Behavioural insights or design feature: photo of the sender, the Director General

All letters were printed on the ONS social survey business-as-usual headed paper. This was important because we wanted users to experience the letters as they would if selected to take part; it also allowed us to gather feedback on the current branding. The footer is printed on white paper, with icons integrated into a turquoise banner: a heart to indicate health, a piggy bank to indicate savings, a shopping trolley, a car, a mortarboard, a nuclear family, an aeroplane and a wheelchair, in the same colour or navy blue (see Appendix 1 of the Word version of this report, DOCX 2MB).

Alpha main findings

The letters developed and tested as part of this Alpha phase of research were for the online Labour Market Study (LMS). However, some learnings from this research are generic and could be applied to any online household survey.

The main high-level user needs identified from this research – the things users said they needed to know in their decision-making process – include:

  • what the study is about – to help users gauge relevance
  • how long it would take them – so they could plan when to do it
  • whether users would be “good at it” – to determine whether to take part or not
  • how they could do it – mode options for participation
  • the study deadline – to plan and avoid the study sounding indefinite

All the letters developed for push-to-web social surveys at ONS aim to include this information at least once across the ONS-user touchpoints.

The letters tested took three distinct tones: friendly, professional and authoritative.

The friendly approach using the voice of the Director General in letter one was considered “overly familiar” and conflicted with perceptions of how a Director General usually communicates. Meeting these expectations of the sender emerged as an important user need. Users felt the opening paragraph should focus on the study itself, rather than on the Director General.

However, once all the letters had been seen, on reflection users felt that having the voice of the Director General added personality to the letter and made it seem more personable compared with the others. A balance should be struck between the messenger effect and meeting user needs and expectations. In this case the tone needed formalising slightly.
The slightly more formal, but less personalised, tone used in letter two was found to be clear and friendly and more in keeping with what users expected from the ONS.

Non-native English speakers understood the information in the letter well enough to comprehend what was being asked of them. This feedback was important because of the need for the materials to be inclusive. GDS best practice recommends writing in plain English and aiming for a reading age of 9 years because of the way people learn to read and the vocabulary built up by that age. You can read more about this on the content design pages of the GDS website.
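
As a rough way of sanity-checking draft content against a reading-age target (a sketch only: it uses the standard Flesch-Kincaid grade formula with a crude syllable heuristic, and is not part of the GDS guidance), you could score letter text like this:

    import re

    def count_syllables(word: str) -> int:
        # Crude heuristic: count groups of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def fk_grade(text: str) -> float:
        """Approximate Flesch-Kincaid grade level of a piece of text.

        A reading age of about 9 corresponds very roughly to grade 4,
        so draft content scoring well above that may need simplifying.
        """
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

    print(fk_grade("We would like you to take part in a short online study."))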

Some users expected a formal, traditional letter. However, the more authoritative tone was found to be off-putting and intimidating to some users, considering the voluntary nature of the survey.

Action: As a result, the tone developed for the letters was pitched somewhere between professional and formal to meet users’ expectations. The voice of the sender (Director General) was used to make their presence known but kept light. Tone was adjusted at the different stages of the user journey and depending on the call to action. For example, at the prenotification stage it is important to build rapport with the user, whereas by the reminder a more direct and authoritative tone is used.

The branding used on the letter was found to play an important role in adding authenticity and building trust. Including the ONS branding and the Royal Coat of Arms reassured users that the letter was from the ONS, adding credibility and making the letter look official. They were placed in the letterhead, which we found was where users expected to find them.
In the body of the letter, we learned that icons helped break up the text, making it easier for users to read. Skim readers used the icons to gain an indication of what each paragraph was about without having to read it all. While icons should never be used to replace words, these findings illustrate the important role icons can play in supporting interpretation of letter content and in making the letter more inclusive.

There were some instances where users felt that the icons made the letter look too busy, which detracted from its official appearance.
The aqua blue footer used in these letters was understood by those who knew the ONS, but not by those who were unfamiliar with it; to some, the design made the letter appear unimportant. It was noted that the aqua blue in the footer did not match the colour scheme of the ONS logo at the top of the letter, and the colour scheme reminded some users of British Gas and implied junk mail. This supports findings on earlier iterations of the letter during the discovery phase. Users preferred the darker ONS blue to the light aqua blue; although the aqua was more eye-catching, it was considered less professional.

The important takeaways from this research are:

  • branding should be consistent and joined up and convey the right level of professionalism
  • icons are considered in context and when used they should be in line with the brand and overall look and feel of the letter

Action: These findings were taken forward in the next stage of research. Icons were applied to the letter and the header and footer were re-designed using the ONS logo colours.

The headline used in letter one was not received well by users; the use of ‘ONS’ in the title was not helpful for those who did not know who the ONS was. Those unfamiliar with the ONS related the acronym to corporations, a negative connotation, which resulted in the letter being perceived as less important. There was a lack of association between the ‘ONS’ acronym and ‘Office for National Statistics’, which suggests a lack of brand awareness among potential users. It also highlighted that we should not take for granted that people recognise who ‘ONS’ are. This is supported by internal research by the communications team, who regularly run research into the brand awareness of the organisation.

The header used in letter two was considered too elaborate and big, disrupting the flow of the letter. It was noted that the headline repeated information in the introductory paragraph, which made it redundant.
The headline used in letter three was attention-grabbing because it was printed in blue to match the footer; however, it was not seen as being as serious as headers in a black font.
As a result of the feedback received on all three letters, headlines at the top of the letters were not pursued.

The three letters had slightly different layouts. Those that did not use sub-headings were considered more traditional and formal, and to some extent met users’ expectations. In the letters that used them, sub-headings were found to enhance clarity by breaking up the text and drawing the attention of users, supporting previous Discovery research. However, it was important to users that each section focused on just one topic and was directly related to the sub-heading.

Action: Sub-headings were taken forward in the next iterations of the letters.

A photo of the sender was included as a behavioural reciprocity-building nudge. The photo that was tested qualitatively was of the Director General at the time, a white middle-aged man. The feedback about including a photo in the letter was either neutral or negative. Users commented that it was political, unnecessary and reminded them of charity solicitations.

Action: For these reasons the use of a photo in the letter was not pursued, despite the literature suggesting that it could encourage response.

The prenotification letter gives users advance warning that they will receive a follow-up letter with instructions for accessing the online survey. In testing, users suggested that sending two letters would be unnecessary and wasteful. However, to explore the difference between what people say and what the behavioural impact might be, it was recommended that this be explored as part of a quantitative trial (see Contact letters – private beta).

In the quantitative trial, a prenotification letter was found to significantly increase response rates and improve the timeliness of data collection. We found that users took part sooner, which maximised the online data collection and suited the short completion window. For full details, download the report authored by Ipsos MORI (DOCX 2.2MB).

In one of the letters we explored the concept of having a section that could be cut out and put on a fridge, for example, to remind the user to take part. The design feature was illustrated with a dotted line and instructions about what to do. This feature drew attention, and some users understood it to serve as a reminder; however, others were confused and alienated by it.

Action: As this feature was deemed to reduce the perceived credibility of the whole letter for some users, it was not pursued further.

The phrase ‘unique access code’ was used in the letter to describe the log-in code to be used in the process of accessing the online study. An important observation was made in this testing – the word ‘unique’ led to users being confused about whether the code could be shared with others in the household or whether they would receive individual codes.

Mentioning the code in the prenotification letter also led to some confusion for users. It was unclear what this would be. Some users expressed unhappiness about the thought of having to wait to enter the system and having to keep hold of the letter or remember the code. Where users interpreted the letter to mean they would need to set up an account, the reaction was overwhelmingly negative, and users were put off by the suggestion of a long-term commitment.

The wording about there being a log-in, on the other hand, made the process sound safe and important. However, users deemed that the data being collected would not be sensitive, and therefore the log-in approach might be excessively secure.

Action: Following this feedback, the wording of the letter was amended to explain that the invitation letter will contain a more detailed explanation of how the log-in will work.
As a result of these insights, the phrase used to refer to the log-in code was amended to make it clear that it is for use by everyone in the household. The word ‘unique’ was removed and replaced with ‘household’ to better clarify its use.

The word ‘selected’ had negative connotations, reminding readers of spam. This is supported by earlier discovery research where similar comments were made. Furthermore, the word ‘selected’ led users to think that they had been selected on purpose, when in fact it was random, and at that point we did not know anything about them.

Action: As a result of these findings, the word ‘selected’ was replaced with ‘chosen’.

Addressing the letter to ‘the resident’ implied mass mail to some users. It was also noted that it conflicted with the fact that they had been selected. This implies that using ‘the resident’ in a letter is not appropriate, which is supported by previous discovery testing that concluded ‘sir/madam’ is more acceptable.

Action: Although sir/madam was initially adopted, we later reverted to addressing the letter to ‘Dear Resident…’ to ensure gender inclusivity.

Using the word ‘study’ has connotations of a longer piece of research, where the data being collected were considered to have greater importance for users, as opposed to being collected “for the sake of it”. It also implied that taking part would be voluntary.

Action: Due to the greater perceived value in providing data to a ‘study’ than to a ‘survey’, we continue to use the word ‘study’ in respondent-facing materials.

The materials made reference to how users’ data are used. In the testing, we learned:

  • mentioning businesses and charities led to some worries (for example about cold calling or targeting by charities). If included in respondent communications, careful consideration should be made to the wording
  • an example of the data being used to create a well-being centre was too general and abstract for users
  • despite recognising that well-being is important there was no understanding of what the well-being centre was or what purpose it might serve
  • examples of impact should be tangible and relevant for users to show how statistics have influenced decisions
  • local examples were seen as a potentially successful method of drawing users in
  • users suggested that they would appreciate references to their local area in general, even outside the domain of examples, such as saying their local area will be underrepresented without their input
  • users would be willing to accept high-level examples and noted that a specific example should only be present if it is well known
  • across all the letters, users thought the example that was presented was the survey topic

Action: Relevant local examples need identifying, and the framing of examples needs exploring further. Using a local example would help make the study more relevant to the user; however, if local examples are not available, high-level examples would be an acceptable method of achieving widespread perceptions of relevance.

The prototype letters were limited to one page, based on the ‘less is more’ finding, and content was kept to a minimum so as not to overwhelm the user. Within the text, signposts were given to supplementary information, which users found helpful: for example, directing users to the webpage (which was overlooked in the header) and signposting the leaflet containing important additional information (see Contact letter – private beta). Including further information and contact details in the body of the text also gave users an action to complete while waiting for the second letter to arrive.

Users also expressed a desire to have a specific webpage, as opposed to the ONS website in general. This would make it easier for the user to find the information they need and reduce friction across the different touchpoints in their journey. This also adhered to the behavioural insight principle of ‘Make it easy’.

After reading the letter, users appeared to be under the impression that we would be asking for their opinions as opposed to their information. When asked why they thought this, it emerged that this is what they expected of the ONS. This led to concerns over proxy responses (that is, someone answering on another person’s behalf) and over sixteen-year-olds completing (that is, that their opinions would not be ‘adult’). Letting people know there are questions with set answers was perceived to make the task easier, although others saw this as tick boxes without “getting a chance to have anything to say”.
Action: To address these insights, an explicit statement was developed and tested stating that questions will be based on facts, as opposed to opinions.

Alpha summary

The research conducted in the Alpha phase used qualitative methodologies and therefore gathered qualitative insights. One of the challenges of this type of data is distinguishing between what people say they want or will do and what they actually do when faced with the situation in a natural environment. For example, we didn’t pursue a photo on the letter because of the strong feelings it evoked in some users and its associations with scam mail, which we wanted to avoid. In contrast, feedback about a potentially unnecessary letter didn’t play on those same sentiments and negative emotions, and this was therefore explored further in beta testing as part of a larger-scale quantitative trial.

Private beta

Private beta: Quantitative test summary – contact strategy

The first private beta of the Labour Market Survey (LMS) was an online-only quantitative uptake test of different engagement strategies to determine which was most effective and cost-appropriate.

Using the insights gained from the qualitative testing the following conditions were trialled:

  1. Prenote, invite, reminder
  2. Invite plus reminder
  3. Invite plus two reminders

The other conditions in this study included posting on different days of the week (Wednesday versus Friday), brown versus white envelopes and branding on envelopes (see Envelopes). No incentives were given to respondents during this trial.

To find out more about this private beta experiment, download the report authored by Ipsos MORI (DOCX 2.2MB).

The response rates achieved by each strategy in this quantitative research were:

  1. Invite plus two reminders = 21.4%
  2. Prenote plus invite plus reminder = 20.2%
  3. Invite plus reminder = 17.9%

Although the invite plus two reminders strategy produced a higher response rate, the addition of a prenotification letter led users to complete the online study sooner. Given the importance of timeliness in the collection of labour market data, the prenote plus invite plus reminder strategy is the one that ONS has taken forward in subsequent private betas and the early launch of the live LMS beta.
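
For readers wanting to make similar comparisons in their own trials, differences in response rates between two conditions can be checked with a standard chi-squared test. The counts below are illustrative assumptions only (1,000 issued cases per condition, matching the rates above); the actual sample sizes are in the Ipsos MORI report.

    from scipy.stats import chi2_contingency

    # Hypothetical counts of responding vs non-responding households
    # in two of the conditions above.
    table = [
        [214, 786],  # invite plus two reminders: 21.4% of 1,000
        [179, 821],  # invite plus one reminder: 17.9% of 1,000
    ]
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi-squared = {chi2:.2f}, p = {p:.4f}")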

The letters developed for the ONS private betas were achieving acceptable response rates; however, we became aware that as the content of the letters had increased, font size and white space had reduced, which posed a risk to the usability and accessibility of the materials. The Digital Accessibility Centre (DAC) is an organisation that offers a service reviewing digital media to ensure that it meets best practice and accessibility standards.

Given their familiarity with ONS digital services and the development of online surveys, DAC were well placed to assess our materials; they kindly conducted an informal review of our letters and highlighted areas for improvement.

To ensure the products were accessible and to keep up with digital standards we conducted further research to review the content of our letters. In our efforts to improve the accessibility of the letters we wanted to ensure all user needs identified in earlier research were still valid and still being met after the content was reduced.

To do this we conducted a series of research activities, each of which will be discussed in turn.

In order to reach a wide range of people across the UK we conducted this research remotely. The first step to validating existing user needs and working out where content could potentially be refined or removed was to conduct a highlighter exercise.

Participants were sent packs in the post with the letters presented as a series of tasks. The tasks asked them to use different colour pens to highlight the content that they:

  • thought was important
  • needed in order to be able to do the online survey
  • did not need to know

They were also given a short set of follow-up questions to answer. The participants had approximately one week to complete the task and post back their materials for us to analyse. A total of 24 participants completed this task and the analysis was completed in multiple stages.

First, a workshop was held with other trained researchers and selected stakeholders where the main needs were identified from each stage of the user’s journey, based on the highlighted content, and then grouped. The next stage was to look across all four pieces of material and draw out the needs at each step in the engagement journey. This was first done manually with the physical data and then transferred to a digital representation using Excel.

The findings from the remote highlighter test largely validated the existing needs identified in earlier research, predominantly:

  • what the letter is about i.e. an invitation to take part in an online study
  • who ONS are and what we do
  • that ONS is independent
  • that ONS is the organisation that runs the census
  • what to expect next
  • that their data will be collected securely
  • who should take part, and that if they can’t, someone may answer by proxy
  • what types of questions they’ll be asked – i.e. facts not opinions
  • who needs to see the letters and leaflets
  • how to find out more

At the prenote stage, our research found that users did not need:

  • an explanation that the letter was sent from the Director General (opening paragraph), or the statement that “at the ONS we value those who take part in our studies”
  • to know that they would be sent a household access code
  • to know the survey is about employment and unemployment
  • the information contained in the footer

At the invitation stage, our research found that users did need the following information:

  • what they are being asked to do
  • what the study will ask about
  • instructions on how to take part, including the household access code, devices they can complete on and the website address they need to go to
  • where they need to enter their access code
  • how long the study is expected to take
  • a reminder on who should take part and whether they can complete on behalf of someone else
  • that other household members should see the information sent
  • types of questions asked
  • telephone number to find out more

At the invitation stage, our research found that users did not need the following information:

  • “Most people we invite to our studies take part”
    • this phrase was included in the letter to draw on the behavioural insight technique of social norming (see Contact letters – discovery)

The actions taken forward following this piece of research were:

  • reposition the contact information
  • reduce content without removing the important information, including the infographic with the instructions

At the reminder stage, our research found that users did need the following information:

  • that not everyone in the household may have completed the study
  • that they can ignore the letter if the study has been completed in the last few days
  • this is a reminder to do the study
  • a recap of how to complete the study, including their access code and the website address
  • how long the survey will take
  • the deadline for taking part
  • that everyone in the household aged 16 years or over needs to take part
  • what the consequences are for not taking part
  • that no specialist knowledge is needed
  • the statistics ONS publishes will not be disclosive

At the reminder stage, our research found that users did not need the following information:

  • why they should take part
    • although users didn’t see this as a need, research using other methods has shown us that including the ‘why’ helps users understand the reason they are being asked to take part
    • similarly, other references to local and national policies in the letter were not seen as relevant by users

The insights gained during the remote highlighter exercise were used to develop further iterations of the letters that were prototyped and researched with users again in focus groups.

The highlighter exercise allowed us to confidently validate the existing user needs that we had identified in earlier research. However, we concluded that the method did not allow us to effectively reduce the amount of content required to meet accessibility standards. The next step of the accessibility review was to use the findings from this research and test alternative (shorter and redesigned) content for the letters with users, alongside existing content that we knew met users’ needs.

The focus group activity involved what we are calling a ‘Frankenstein’ test, whereby users were given an explanation of the context of the exercise and task packs. The packs contained component parts of the letter (both new and previously tested content), and users were asked to piece together what they thought they would need and how they thought it should look. Users were also provided with blank sections of paper to add their own suggestions. This activity was carried out for the three stages of the user journey: prenote, invite and reminder. After each stage, users were invited to feed back on what was important to them and why they had set their letter out as they had.

In the analysis we looked for similarities and differences between how users had put together their letters, how they had amended them and the reasons they gave. The main insights from this research are:

  • the letter was considered the primary material and should therefore contain the important information, with the leaflet secondary
  • although the users recognised that the letter was from an authority figure at ONS, they didn’t see it as necessary for the writer to introduce themselves at the start of the letter
  • many users included the signature and sign-off of the sender at the end of the letter, suggesting that this is what they expect to see
  • the main user needs identified in the prenote are: the why, who, what’s next, how to find out more; reassurance of confidentiality and finding out what’s in it for them (incentive – whether simply motivational or a gift)
  • users used the headline statements and questions to structure their letter and didn’t always read the text underneath, despite having time available
  • in putting together pieces for the invitation letter users opted to include the who, when, how, confidentiality and a signpost to the ONS
  • when information about how long the study takes was included as part of the three-step diagram on how to complete the study it was not picked up on by users
  • the three-step diagram about how to take part was provided in both horizontal and vertical format (see Appendix 2 of the Word version of this report DOCX 2MB), however there was no clear preference for use of one over the other
  • at the reminder stage users picked out the following items to be included in the letter: the why, the deadline, confidentiality, how to do it, about the study, what happens if they don’t do it and how long the study takes
  • on the reminder letter, users chose the more direct statement headers rather than the softer questions

Another aspect of the beta stage of materials product development has been to review the comments that have been received by the Survey Enquiry Line (SEL). The SEL is operated by ONS staff and the telephone number is provided on the materials for users to call for support. In addition to SEL, we also reviewed comments left in the open text feedback question at the end of the questionnaire.

Over the multiple quantitative trials of the online survey, a wealth of data has been collated by our SEL. As calls come in, these are coded into different categories, allowing us to identify relevant areas for investigation. Analysing these comments after each test has allowed us to identify user needs from real-life subjects and explore ways to improve or meet that need. In the LMS trials, the main themes identified through this method were:

  • tone
  • privacy
  • sequence of letters
  • timing of reminder letter

Another method we used in the beta phase was conducting post-survey debrief interviews with respondents who had agreed to be re-contacted to take part in future research.
Debrief interviews were conducted with users to understand more about their experience and what their needs might be for future waves of the study. The Labour Force Survey is traditionally five waves long; households are invited to take part in the survey five times, three months apart.

In this exercise, debrief interviews were carried out with respondents who completed all three waves of the LMS online-only attrition test.

The debrief exercise told us:

  • users had a positive experience of taking part, they described it as “easy” and “straightforward”
  • users noticed the repetition of questions and queried why they personally had to do it multiple times, although there was some understanding of measuring change over time
  • although statistically the between wave engagement (BWE) seemed to have an impact, respondents did not, generally, remember seeing it
  • users felt that the amount of information they received from us was “just about right”
  • users needed to know why we collect the data and what it is used for and the subsequent uses and impact of the data

At the ONS, the graphic design team has grown since this work began, and with it the knowledge and expertise available. In addition, accessibility became a legal standard for digital services, and as an end-to-end experience it was important that all parts of the user journey were joined up. To deliver quality products, it was necessary to work across disciplines, forming a multi-disciplinary project team.

The ONS print designers established their principles and helped us understand the main criteria for ensuring materials are accessible:

  • contrast of colours
  • font type and size
  • alignment and styling of images
  • creating white space
  • writing in plain English
  • providing access and options for alternative formats

We worked closely with the print designers to incorporate our research findings and develop the final suite of letters.

Summary: Letter research

The beta phases of research and development of these materials have shown us how to make our respondent communications more accessible. This was achieved through re-iterating, re-testing and re-validating our user needs, and making decisions on what to keep and what to remove based on evidence.

Here is a list of nine tips to consider when developing a respondent letter for a push-to-web survey:

  1. seek to understand your users’ needs
  2. test and re-iterate your design and content
  3. only include information your users need
  4. keep to one A4 page
  5. identify the tone suitable for your organisation
  6. use branding and official logos
  7. break up the text and create white space
  8. write in plain English
  9. use minimum font size 12 in a legible font

For further advice, download the templates on this page.


Leaflet

Discovery

In our face-to-face surveys, a leaflet has traditionally been sent alongside survey invitation letters to provide additional information for users. In the ONS Electronic Data Collection programme, two types of leaflet were developed and tested with users.

Focus group and cognitive interview insights

The first leaflet tested was the ‘1 in 60 million’ leaflet, which included generic content as it was used across all the social surveys. The front page said, “you have been selected from 60,587,000 to help shape Britain today and tomorrow”. The inside of the leaflet included the title “Make sure your voice is heard…” and provided information for respondents about the ONS and why they had been selected. The second was an instructional ‘Getting Started’ leaflet, which explained how to log on to complete the online survey.

The ‘1 in 60 Million’ leaflet – main findings

On the front cover, users were engaged by, and liked, the use of the people within the numbers. However, respondents questioned the message, associated it with the lottery, and thought it seemed more like a sales approach.

Feedback on the visual design and layout was as follows:

  • users thought the leaflet had the opportunity to be more aesthetically and visually pleasing
  • the inside was described as a “wall of text” and an “old fashioned way of presenting information”
  • users thought the layout of the text was cramped
  • users suggested taking the colour away to make it seem more official, and there were associations made with a glossy takeaway leaflet

Through the analysis, this research revealed users need to know:

  • more information about ONS
  • what will be done with statistics and why the study is important
  • how they take part
  • what the study is for
  • why they should take part
  • when they need to do it
  • more about security and confidentiality information (for example, use a padlock icon)
  • about the process for completion and what their role will be (for example, a diagram of the process the respondent goes through)
  • about the stats we produce (for example, infographics, ‘did you know?’ type facts or current data on the topics of the survey)
  • “should you wish to see the results they will be available in X months’ time here”
  • how to get more information, for example, include a QR code taking the respondent to the website or social media
  • a QR code for the Twitter and Facebook links

The recommended actions from this discovery research were to:

  • remove the phrase “you have been selected from 60 million”
  • review the title “make sure your voice is heard”
  • review the wording of the first subheading, “Please take part…”, to avoid sounding like pleading
  • review the ‘why have you been chosen’ section, which talks about the address but not the person, to include more information about the importance of the study and why the household has been selected
  • amend the sentence “your address has been selected from the royal mail list of addresses…” to explain that this is a publicly available resource, not just a government one, to reduce suspicion
  • change the line “if you are not in when we call…” as it does not apply to online completion
  • review the language and use a warmer tone to explain why the respondent is important and cannot be replaced, other than for sampling reasons
  • review the mention of the history of the wartime survey
  • simplify “If you have questions about the study you have been chosen for, go to our website www.ons.gov.uk and click on ‘Taking part in a Survey?’ on the right-hand side. Or ring us on 08000 298 5313” to “For more information, go to our website www.ons.gov.uk/surveys or give us a call”, because at this stage the specific study is not explained and the user will not know which survey they are taking part in

The ‘Getting Started’ leaflet – main findings

(see Appendix 3 of the Word version of this report DOCX 2MB)

Icons and visual design

The following observations were made about the visual design of this leaflet:

  • users responded well to the front cover
  • users noted the lack of consistency in the icons between the letters and the leaflet, and said this affected the legitimacy of the process
  • the fact that the disabled icon was present on the letters but missing from the leaflet was picked up
  • the blue colour palette was considered modern and formal

The recommendations following these insights were to:

  • continue to use the green icons to ensure consistency with other survey branding (for example, the online questionnaire design banner)
  • investigate the possibility of including the disabled icon in the survey-specific icons

Basic steps

This leaflet provides instructions for accessing the online survey. In the testing:

  • it was suggested to change ‘Have you got your letter, this contains’ to ‘…it contains…’, ‘the letter you have received contains…’ or ‘inside you’ll find’
  • there was some confusion over the use of ‘XXX – XXX …’ to indicate the enrolment code in Step 1
    • instead of a dummy code, ‘XXX etc’ was used in the testing to avoid users mistaking an illustrative alphanumeric code for their login details
  • users liked the basic steps but thought it was unclear what they should do once they had entered the questionnaire
  • users who were not used to using computers said it gave them the confidence to go online and follow the steps to log on
  • the images were observed to be too small for respondents to discern the content clearly

The recommendations following these insights were to:

  • change the text in Step 1 to make it clearer
  • change the design of the example enrolment code
  • provide some further guidance and investigate possibly including a next step explaining what to do once the code is entered
  • reduce the green space in the margins by having Steps 1, 2 and 3 to the left rather than above the detail, allowing the images to be enlarged

QR Code and Frequently Asked Questions (FAQ)

This leaflet included a QR code. In testing:

  • users noticed the QR code on the reverse but were unclear where it would go
  • the use of the QR code in Step 2 was suggested for those using a mobile device
  • users suggested having a link to the outputs of the study and when the results would be available
  • users suggested including information on what they should do if they do not have access to the internet

The recommendations following this testing were to:

  • explore whether the ‘getting started’ leaflet would be questionnaire specific and, if so, continue to use colours and icons, include a link to the outputs and possibly include some statistics
  • include information in the FAQ on the reverse of the leaflet to address what users should do if they do not have internet access

Note: At the time this research was done (around 2014) a recommendation was made to continue to use the QR code within the basic steps if the questionnaire was compatible with tablets. More recent research (around 2020), however, has identified that whilst QR codes and the technology that supports their use have become more common in recent years, they should be used with caution. Qualitative insights suggest they should be used only to support, and not in place of, any crucial information. Users expect to have all the important information in the letter and do not want to have to take an extra step to find it.

Following the testing of the two leaflets, they were re-iterated and taken out for further user testing. The ‘1 in 60 million’ leaflet became the ‘Your household has been chosen’ leaflet, adopting a more informal approach. Although the ‘1 in 60 million’ concept worked when it was originally introduced, it had become outdated and didn’t fit with expectations of government branding. For the ‘Your household has been chosen’ leaflet the amount of information was reduced, infographics were included, and a greater amount of colour was used.

The ‘Your household has been chosen’ leaflet – main findings

Initial observations:

  • users usually read the leaflet straight after the letter
  • some users left the leaflet in the envelope when they pulled out the letter and needed prompting to find it during probing about the letter
  • the information in the leaflet provided answers for many of the questions that users had after reading the letter, about why they had been selected
  • users were drawn to the leaflet because it was brightly coloured and was “pretty eye catching”, and some users first noticed and commented on the colour of the leaflet
  • other features that were salient included the box at the bottom of the page that included reference to statistical releases

Front cover observations:

  • the title was considered appropriate for the leaflet and sufficiently explained what would be found in the leaflet
  • the word ‘chosen’ made users feel special; it was also mentioned that the word made the respondent “want to get involved”
  • users favoured the use of ‘chosen’ over ‘selected’, because “chosen has many positive connotations” and “implies you’ve done something good”
  • it was suggested that ‘selected’ seemed more formal whereas ‘chosen’ was “a bit softer”
  • the title reminded users of junk mail that says you have won something, and ‘your household has been chosen’ was compared with the phrase ‘you’re the lucky winner’
  • the title of the leaflet was also considered threatening, and it was suggested that we add “…to take part in a survey” to clarify what the user has been chosen for
  • users thought the subheading ‘here is what you need to know’ was clear and informative: “it’s very informal but very clear I know that when I open this it is going to talk me through what is going on, which is great”

Headings feedback:

  • users were attracted to the headings within the leaflet and stated that they were all clear and easily understood
  • users thought the headings covered all of the main points and were presented in a good order
  • the use of the acronym in ‘who are the ONS’ was not liked by some users

Infographic feedback:

  • users responded positively toward the use of the image and thought it was better than using just text, as it “makes it less boring” and is helpful
  • there was some confusion about what the infographic was supposed to be representing
  • it was not clear what the first part of the infographic was showing and the faint icons behind the figure were difficult to see and often unnoticed
  • it was suggested that an image of what the survey would look like could be included in the section about confidentiality
  • users understood the images on the right as different household types that the ONS talk to for their studies

Feedback on the phrase “why has your household been selected?”:

  • most users understood that their household had been chosen randomly
  • ‘at random’ was not always noticed, and it was suggested that we change ‘why has your household been selected’ to ‘your household has been selected at random’ to make it clearer
  • users commented that this section was meaningless and did not fully explain why another household close to theirs could not be used instead if they were not willing to take part, questioning “what difference does it make?”
  • the reference to the Royal Mail was thought to be misleading when attempting to understand the sentence about not being able to replace their household
  • some users would like more information on why they had been selected, and the Royal Mail section was referred to; however, this would not greatly affect respondents’ decisions to take part
  • it was mentioned that the size of the list that users were selected from should be stated
  • there was also a query about the age for participation

Feedback on who the ONS are and what they do:

  • most users knew who the ONS were and said they were satisfied with the amount of information that was provided
  • if they wanted to find out more information users said they would use the website or telephone number
  • users liked the mention of the weekly shopping basket in this section of information

Feedback on descriptions of how data is used:

  • some users had a clear understanding of who the data was being used by, mentioning government, charities, and members of the public
  • some users were unsure exactly how their information was being used
  • users mentioned that if the information was helping people, the community and services this would encourage participation
  • however, not all users were concerned and thought they had enough information; one said “it doesn’t actually tell me what you’re going to do with it” but was not bothered by this

Feedback on the use of the phrase “we are not a commercial organisation”:

  • this information was often missed when the leaflet was read
  • users suggested that the phrase was missed due to its position in the fold of the leaflet and because the text was too small
  • this information was considered essential as without this knowledge users may think information was being provided to political parties and may be discouraged from taking part
  • it was suggested that this information should be made more prominent and could be part of the main paragraph ‘who are the ONS?’
  • other users thought the sentence should remain in the sub-box, but the box should make up a whole line rather than two thirds of the space

Feedback on the phrase “your answers will be treated confidentially…”:

  • there were mixed views over the importance of confidentiality and how much information should be provided
  • users were generally reassured by the information about confidentiality and thought no further content should be added
    • this included some of those that were most concerned about the protection of their data, but not all
  • there were concerns that users’ data would be sold on to other companies, specifically sharing with third party marketing organisations
  • there were positive responses to the use of bold text and the padlock icon
    • these features drew users’ attention to the area and made them more willing to take part

Feedback on the phrase “our statistical releases are available…”:

  • users were happy with the information provided about statistical releases and thought it was “useful to know”
  • some users were surprised to learn that they could access the statistics online and said it would be interesting to see, although there was uncertainty about what they would find online
  • this sentence was missed by some users and was noticed on the back of the leaflet before it was seen at the bottom of the page inside

Feedback on the back page of the leaflet:

  • users liked the information about how their data is used and were surprised to find out that it wasn’t only the government that uses the data but charities and universities too
  • some users mentioned that they liked the use of the informal tone
  • some users disliked certain parts of the sentence that referred to being ‘rich’ or ‘poor’, saying that it was “quite offensive” and could be considered “slightly depressing”
  • a lack of ethnic diversity was picked up in terms of representation and should be addressed
  • another suggested change was that ‘living up north’ could be changed to ‘living in the north’ instead, or that maybe it should say north, east, south or west
  • users found it helpful to include the contact details and opening times on the back
  • it was suggested that it should be mentioned that the enquiry line is a free number

Views on visual design:

  • users liked the colour used in the leaflet, commenting that it “catches your eye”
  • users also mentioned the contrast of the text on the white background
  • users commented that as the leaflet was split into chunks it made it easy to read
  • users were happy with the icons for Twitter and Facebook and understood what these were showing, although not everybody was prepared to use these websites

Length and amount of materials:

  • as there is limited space to provide information, a decision should be made on what respondents should be most aware of, such as how to complete the survey and the return date
  • the most important information should be highlighted, bolding text where necessary
  • large chunks of text should be avoided, making use of infographics to improve the visual appeal

Alpha

The next steps for developing the leaflet for online surveys came alongside the work developing the contact letters as part of the DCTP (see Contact letter – private beta). Using the insights gained in the prior discovery work for the EDC programme, iterations were made to the leaflet and further testing was conducted.

Alongside the testing of the letters described in section 2, a leaflet was also included. The leaflet tested in this focus group was a horizontally folded ‘Dimension Lengthwise’ (commonly referred to as ‘DL’) size leaflet (one third of A4), using the aqua blue colour and icon design to match the letter.

The front headline read “Play your part in Shaping Society” with the subheading underneath, “Here is what you need to know”. Inside were a section on ‘what we do’, a horizontal infographic showing five steps to producing statistics, and information on sampling and confidentiality. On the back, printed in dark blue with white text, were sections on ‘we value your contribution’, ‘find out more’ and ‘you can call us for free’, along with contact details and an offer of large print and braille on request.

In general, the leaflet answered lots of the questions users had; however, they said that they would want to know what the study is about and how long it would take. As this was repeatedly highlighted as important, this need should be met in the letter as opposed to in the supplementary leaflet.

The leaflet, as described above, is broken into sections. The main feedback concerned the main title, “play your part in shaping society”. Users suggested that it should be removed as it was too dramatic, repetitive, and did not seem accurate to the reader. The section titled ‘here’s what you need to know’, on the other hand, was highlighted as important as it explains what the leaflet is about and what to expect.

One of the sections of the leaflet contains the ONS confidentiality pledge. On this:

  • users neither knew nor cared about the code of practice mentioned in the pledge
  • the paragraph was considered too long and wordy, with a desire for a brief summary instead
  • it was also poorly understood, for example users were unaware of a difference between survey information and statistics
  • users also expressed concerns that the ‘tax man’ would get hold of information about their earnings from us, which mainly seemed to arise from reading the pledge

The leaflet explains the sampling strategy for the survey. The research revealed:

  • that users felt the title of the paragraph “why your household has been selected” implies it will explain ‘why’ they’ve been selected; instead, the content is more about ‘how’
  • this highlights the importance of meeting users’ needs by providing the information they expect to receive
    • similar feedback was also noted on the letters; section headings are only useful when relevant to the body of the text below
  • there was a divergence between users wanting more information, less information, and being happy with what was provided
  • the ‘how much information on sampling to provide’ paradox would ideally be solved by keeping the information but making the paragraph more concise
  • there should be consistency in whether materials state it is the ‘household’ or ‘address’ being selected, as the inconsistency was noted by users
  • despite evidence suggesting that the paragraph on sampling was understood, comprehension of the sampling technique appeared poor
  • there was also poor understanding as to whether we know about them and have selected them based on their characteristics, or whether we selected addresses without knowing anything of the residents

Although some suggestions were made to just say that the users have been randomly selected, this would be advised against given the flexible definition some users had of the word ‘random’. When it was suggested, some users were quite adamant that ONS wouldn’t randomly select participants.


There was some feedback from users around the sentiment of providing reassurance about taking part, the main takeaways were:

  • the sentences on the back of the leaflet about not needing special knowledge and ‘whoever you are’ etc, were highlighted as important, and participants expressed a desire for them to be more prominent
  • the sentence regarding no need for special knowledge was seen by others as somewhat patronising, even though many expressed a desire for reassurance that they would be good at completing the survey
  • increasing the prominence of the ‘whoever you are’ sentence could be achieved by moving it somewhere more obvious, rather than on the back of the leaflet

To illustrate the user’s role, the importance of their participation was stressed in the leaflet. The feedback suggests:

  • telling people that participating makes them important was negatively received
  • some users felt that people should be important just for existing, and that it is important that their views get out as opposed to it being important that they participate
  • other users wanted it to be clear that their participation was important because only some households were selected

The recommendation made at this stage was to change the sentence so that it either states that their participation is important (rather than them), or not include it.

In the testing, users highlighted that there are issues other than needing large text that prevent completion. It was recommended that the sentence could be changed to ‘if you need help’ to make it more inclusive of different accessibility requirements.

In the leaflet there is a statement saying our statistics are free and available to all. This was positively received because it demonstrated to users that once they take part, the impact can be viewed in the form of the statistics.

Comments from users on the layout and design of the leaflet revealed:

  • users felt that the leaflet does not look important
  • users mentioned that an unfolded leaflet would reduce the effort of opening a folded leaflet, with a preference for having the information presented up front
  • orientation, layout, icons, gloss paper and colour were all highlighted as reasons why it looks like junk
  • users felt that it was ‘trying to be cool’, confirming previous findings that plain and official materials would be preferred
  • although, given the appearance of the leaflet, they did not expect it, users felt that the contents were more important than what is on the letter
    • Action: this highlights the need to re-consider the design or placement of the content considered most important by users
  • users also said that they found the leaflet uncomfortable to read because of too many different font sizes and a lack of structure
  • users expected to see more official logos on the leaflet
    • Action: although the ONS logo is on the front, no one commented on its presence, demonstrating that it could be made more prominent, and the Royal Coat of Arms added
  • the text was noted as difficult to read where light text sat on a dark background, and black font was suggested to be easier to read
    • Action: light backgrounds with darker text should be taken forward
  • users expect contact details, whether phone number, website or social media, to be presented together in one place, so that they can choose the appropriate method for them

The five-step diagram (see Appendix 4 of the Word version of this report DOCX 2MB) illustrates at a high-level how survey statistics are produced.

The findings from testing are:

  • the infographic on the leaflet was said to be concise, informative, and to show the relevance of the study
  • users wanted it to be more prominent, even suggesting moving it to the letter
  • requested changes to the infographic were:
    • more information on how data informs decisions
    • to say ‘statistics’ where it refers to data being released
    • to state that these statistics are anonymous (no names)
  • formatting the infographic in steps had both positive and negative impacts; inclusion of the first two steps highlights that people are stuck on step one until they receive a second letter, which users felt could be frustrating
  • providing an action for the interim may be useful in alleviating this frustration
  • showing only five steps implies that participation is simple, especially given that those who want to take part only need to complete step two to be involved in the process
    • Action: it is thought that playing on this would increase perceived ease of participation, adhering to the behavioural insight principle ‘make it easy’, so it should be pursued in further testing

Further findings:

  • relating to the general comment that users want less to read, respondents mentioned that they would like to have the leaflet or the letter, and not both
  • the leaflet could be cut down, based on the desire for less to read, to focus on meeting the main user needs; for example, there was a suggestion that the second and third paragraphs are irrelevant
  • for users who want more information, a web link should be provided

Beta

GDPR requirements

In May 2018 the General Data Protection Regulation (GDPR) was introduced. This meant updating respondent materials with the relevant information. Whilst ensuring we met the requirements of GDPR by including the compulsory information to satisfy business needs, we needed to ensure that the new content was clear and understood by users too.

The changes to the leaflet were minor adjustments to the confidentiality statement, signposting users to the Data Protection Officer (DPO) and the Information Commissioner (ICO). All this information is presented together on the back of the A5 ‘What you need to know’ leaflet.

One-to-one cognitive testing was carried out on the leaflet and the main findings were:

  • the confidentiality statement was scanned, rather than read line by line
  • users felt reassured by the presence of the confidentiality statement
  • the lock icon was interpreted to mean the information was going to be kept secure
  • the confidentiality statement was considered clear and easy to understand
  • users understood their data would be anonymised and used for statistical purposes
  • there was general trust in how their data would be used because the ONS is an official organisation
  • users understood that they could contact the DPO if they had further queries

The leaflet in isolation does not meet GDPR requirements; rather, a combination of factors across the suite of materials ensures our compliance.

In section 2.2 on the letters we describe how we conducted research as part of the accessibility review. Included in this suite of materials under review was the ‘What you need to know’ leaflet. The next section describes the findings on the leaflet from the highlighter exercise, followed by the insights gained in the ‘Frankenstein’ focus group.

In this exercise we provided the leaflet alongside the prenotification letter to replicate what we had done in previous quantitative tests. The analysis of the highlighter exercise revealed that the information users felt they need to know at this stage was:

  • ONS is not a commercial organisation
  • ONS do not work for political parties
  • what ONS is asking of them
  • that the study informs government decision making, with an example needed (for example, the winter fuel allowance)
  • that their information would be treated as confidential
  • that the statistics produced will not identify them or anyone in their household
  • their data will be treated according to the Code of practice
  • that their data will only be held for as long as it’s being used to produce statistics
  • who their data could be given to
  • how to contact the DPO via email and telephone if they have further queries regarding their personal data

In terms of the information they definitely did not need, there was a consensus from the exercise that users did not need to know that they could read our statistics on the website for free.

As mentioned, and in line with earlier research, the leaflet is considered a supplementary piece of material. In the ‘Frankenstein’ focus group exercise (see Contact letter – private beta), it was found that:

  • reassuring users of data confidentiality was considered a strong need; however, when presented with an option of a shorter or longer version, the former was perceived as adequate
  • an alternative icon was tested for step three; however, it did not convey to users that data is collected from multiple units
    • as a result, the original design with two people and multiple dots to represent data was kept
  • users responded well to the content around their participation ensuring their local area is properly represented in statistics, and the fact that these go on to impact us all
  • users also responded positively to the phrase ‘Whoever you are, whatever you do, we are keen to hear from you’ as it promotes inclusivity (and addressed a need to know that everyone should take part, not just select people)
  • users included the box highlighting their role in the statistical process, ‘To take part, all you need to do is complete step 2’, an existing behavioural nudge that had been applied
  • contact information and opening hours were identified as a need so that users don’t get frustrated if they try to call out of hours

Back to top of page

Envelopes

Discovery

As outlined in the introduction (section 1.0) of this report, one of the main barriers to engaging users with an online survey is getting them to open the envelope. As this is the first step in the user’s journey, we wanted to explore what needs users had at this stage and what we could potentially do to encourage response. To start, we looked back at what had been done in previous ONS survey work.

Insights from Electronic Data Collection (EDC) Programme

The envelope used in the EDC testing was a generic plain white envelope, except for the Royal Coat of Arms and the words “On Her Majesty’s Service” printed in black across the top, and the ONS logo in the bottom right-hand corner. In testing it was found that these official markers made the letter look important and encouraged respondents to open the envelopes.

Other discovery activity included looking at the testing conducted on previous Census envelope designs.

In this research the findings showed that:

  • images, particularly maps were favoured by respondents
  • users responded positively towards the idea of being ‘counted’ because it makes them feel included
  • some users were put off by forceful language such as ‘act now’
  • users didn’t always know what ‘ONS’ or ‘Census’ meant
  • users in Wales responded positively to regional designs

On one of the envelopes the phrase ‘Act now – login inside’ with an image of a post-it note with ‘Census Day! 21st March’ was included in the front bottom right hand corner. On this, users felt the tone was a bit forceful and the inclusion of a post-it note made it look unofficial.

Another envelope included the environmental nudge “Save money, save time, save the environment… do your Census online”. Users fed back that the call to action was too long-winded and something snappier would be better. The environmental message was not universally appealing to users, and it was suggested that saving money or helping the community would provide more motivation.

Using these insights in combination with the desk research, further testing was conducted as part of the alpha phase.

Alpha

Pop-up testing

Pop-up testing (also known as guerrilla testing) was used to gather feedback on several prototype envelope designs. The aim of this research was to gather people’s reactions to the different envelopes. Users were asked ‘which envelope jumps out at you?’ and the researcher explored the reasons why. Further follow-up questions were asked about the envelopes, for example colour, design and layout, and how the presence of the Royal Coat of Arms made them feel. The order in which the envelopes were presented was alternated to provide a counterbalance.

Four types of envelope were included in the testing:

  1. plain envelope
  2. various regionally branded envelopes for England, Wales and Scotland
  3. a ‘word jumble’ envelope containing survey topics
  4. a call to action ‘play your part’ envelope

Pop-up testing helped identify which envelopes to take forward and which to drop. For example, the map of Wales was dropped from further testing because it was found that only people born in Wales recognised it. However, the evidence showed that there was merit in regional branding, so other options were later explored.

Focus group testing

Using the pop-up testing insights, envelopes were further tested in focus groups. For brevity, the feedback across two iterations is summarised for each element tested.

Although the ‘word jumble’ design drew some people’s attention, it reminded users of junk mail, marketing material or charity solicitations, and made the envelope look unofficial and therefore unimportant. The words included in the word jumble attempted to summarise the contents of the study; however, not all users picked up on this and some were confused by it. Those that did understand the words were curious about what would be inside and felt that it could be important. As a result, this style was not pursued further.

There was feedback from users suggesting that slogans on envelopes make them look less official. However, where the ‘you’ was in bold in the slogan, it was positively received, as people expressed the feeling that it meant they needed to do something. In the context of the regional envelope, feedback suggested combining the bold ‘you’ with the word ‘Wales’. In comparison to ‘play your part’, the phrase “make sure you are counted” was considered more honest.

This was possibly because findings from the letters showed ‘play your part’ to be a phrase that was understood only after reading all the material, whereas ‘make sure you are counted’ seemed to make sense in isolation. In contrast to the preference for ‘make sure you are counted’ over ‘play your part’, other comments suggested using the ‘play your part’ wording in combination with the word ‘Wales’.

The ‘play your part’ phrase was included as the main sentence on one of the envelopes. Users said that it made it clear what the contents would relate to, and others felt that it had a community feel and was an inviting sentiment. Users felt that the envelope implied action, and those who held this belief expressed that this action would have a positive impact. This text also implied that the contents of the envelope would be important.

However, others suggested that it was trying too hard, which would be off-putting. Other views were that it was patronising and looked political. Despite this misconception, some felt it was less vague than ‘make sure you are counted’ in terms of reflecting survey-related contents. Similar to the ‘word jumble’ envelope, the use of ‘your’ here made users feel good as it made the reader feel relevant.

A sub-sentence, referring to an important letter from the ONS being enclosed, was used beneath the ‘play your part’ line. The use of this wording led to mixed views. Saying a ‘letter’ is enclosed implied passivity for some, although those who felt this had two trains of thought:

  1. it can be seen as prompting no action
  2. it can be seen as doing something easy or of little work

It also implies importance; however, some believed pushing the idea of importance could lead to the opposite. Users commented that when things are important, they don’t usually feel the need to announce their importance. Although some users found the reference to the ONS sending them an important letter honest, in that it clearly shows who it is from, others perceived this to be threatening.

When asked about the potential contents, people noted this would hold a ‘more important’ survey, compared to the other envelopes. Despite strong views that it looks important, people suggested that they would not open it urgently.

Envelopes that contained lots of different imagery and statements, like the ‘play your part’ design were felt to look too busy. They were thought of as not being in line with expectations of official correspondence, which is expected to have more blank space. Further envelope iterations took this finding into consideration, as officialdom is important in building trust with users.

Images of the English lions and rose grabbed attention over anything else on these envelopes and reminded users of sports. On the rose specifically, users associated this icon with rugby, the Labour Party and commercial brands. Users didn’t understand why the rose was there and thought the Royal Coat of Arms suitably represented England and the UK in this context. The England-centric text used on the envelope prompted responses reflecting negative associations (such as racism or far-right political parties). The text also seemed passive, in that people did not feel as though they had to do anything as a result of it.

The initial dragon graphic used on the Wales-centric envelope was criticised in terms of its appearance, in that it was over-sized, making the envelope look less formal. However, more generally, Welsh users did relate to the Welsh dragon and could not come up with a more representative icon. A second iteration of the dragon design evoked both local and national pride, and the feeling that it is the land talking to the reader. Another positive comment was that it highlights the fact that participating won’t only affect England, which plays on the theme that Wales is often left out.

The Wales-centric text had largely negative reactions. By talking to ‘you’ as in Wales, as opposed to ‘you’ as in a person, users expressed that it felt less personal. It also implied that everyone in Wales would receive the letter, and that it was only for Wales, thereby making it less important for some than an envelope which suggested a wider reach. Saying that Wales needed to be ‘counted’ made some suggest that it was not currently being counted, or that there was something wrong with Wales.

This led to the comment that a Welsh graphic alone, without Welsh text, would be enough. In a further iteration the word ‘Wales’ was removed from ‘make sure you are counted’, but this was interpreted the same as before, due to the presence of the Welsh dragon. However, it was also interpreted as potentially encouraging response so that Wales doesn’t miss out on the repercussions of the survey.

Whereas in Wales a clear distinction was seen between Wales and the UK, in England the two were used synonymously. This seemed to somewhat defeat the point of regional materials. Therefore, the idea of pursuing regional envelopes in each country of the UK was abandoned, for lack of a suitable icon. However, regional envelopes were pursued in Wales based on user insights as part of a quantitative trial (see Private beta: Quantitative test summary – envelope testing).

The use of the Royal Coat of Arms on the envelope suggested importance, legitimacy and official contents, and that not opening the envelope could have serious consequences. This finding is similar to the research conducted during the discovery phase on plainer envelopes.

The text across the envelope that reads “On Her Majesty’s Service” (OHMS) drew people’s attention to the envelope and made it look official and important. The OHMS text was considered important and government related, and some said that it alone would be enough to get them to open the envelope. For others, the phrase prompted expressions of fear, leading some people to believe the contents would be mandatory. When this sentence was on the back of the envelope, strong views were expressed regarding its placement, with users suggesting that they expected to see OHMS and the royal crest adjacent.

Inclusion of a return address was considered an important need by users. Its presence was considered honest and using an actual address rather than a PO box was expressed as a reason to think it is not junk mail. A return address also suggested to users that the contents are important, indicating that the office does not want the letter to go astray. When the ONS logo is paired with the return address it reassures users who the sender is and where it was sent from, giving users the information they need to make a judgement about its legitimacy.

In the Wales-centric envelope, there was an expectation that the return address would be in Wales, rather than Hampshire (where the ONS field operations are based). The return address should continue to be present on the back of the envelope to help satisfy user expectations and provide confidence in the mail.

The use of the ‘UK mail’ stamp reminded participants of junk mail. A recommendation was made to use an alternative provider where possible to minimise the risk of the letter being mistaken for unsolicited mail.

In the testing, brown envelopes were perceived as more formal, but in some instances threatening. Whilst this may encourage engagement with the material, a quantitative experiment combining the insights above was recommended. The results of this quantitative trial are summarised in section 4.2.

A theme that came up across all materials was that official correspondence is expected to be plain and with little text. This was very strongly expressed, so should be a pillar of creating envelopes that are user centred. A further expectation from users was the need for the contents of the envelope to match the perceived level of importance in order for them to take the request to participate seriously; otherwise there is a risk of disappointing or irritating the respondent, which would discourage response. In terms of statements or slogans on the envelope, it was important to users that these were active phrases to encourage action.


Private beta

Quantitative test summary – envelope testing

The first private beta (in this instance a large-scale quantitative experiment using prototype materials) of the Labour Market Survey (LMS) was an online-only uptake test of different engagement strategies. Two elements of the trial involved envelopes; these tested:

  1. white versus brown (manila) envelopes
  2. regionally branded envelopes versus non-branded envelopes

Using the insights obtained in the qualitative testing, both white and brown envelopes sent out across England contained:

  • the Royal Coat of Arms logo
  • the phrase ‘On Her Majesty’s Service’
  • a slogan at the bottom-right hand corner saying “Play your part in shaping the future of the UK”
  • the ONS return address printed on the back of the envelope

In Wales and Scotland the envelopes adopted regional branding in addition to ‘OHMS’ and the Royal Coat of Arms. In Wales, the slogan was ‘Wales, make sure you are counted’ (in English and in Welsh) and included a greyscale image of a dragon. In Scotland the envelope said ‘Scotland, make sure you are counted’ and contained a printed picture of a map.
The other conditions in this study included posting on different days of the week (Wednesday versus Friday), and different types and numbers of contact letters (see Contact letters). No incentives were given out during this trial.

Conclusion

The results of the envelope trial showed that brown envelopes achieved a better engagement rate with the survey, but the difference compared to the white envelope was not statistically significant. In terms of regional branding, the Welsh envelope appeared to improve response but the results were not statistically significant. The same was not seen for Scotland, although it’s worth noting this experiment was conducted around the time of the Scottish referendum.
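
For readers who want to reproduce this kind of check on their own response figures, a minimal sketch of a two-proportion significance test is given below. The counts and sample sizes are hypothetical placeholders, not figures from this trial, and the use of Python with SciPy is our illustrative choice rather than the method used in the original analysis.

    # Hypothetical example: comparing response rates for brown versus white
    # envelopes using a chi-square test of independence (illustrative only)
    from scipy.stats import chi2_contingency

    mailed = {"brown": 1000, "white": 1000}    # assumed number of envelopes sent
    responded = {"brown": 230, "white": 205}   # assumed number of responses

    # 2x2 table of responders and non-responders per envelope colour
    table = [
        [responded["brown"], mailed["brown"] - responded["brown"]],
        [responded["white"], mailed["white"] - responded["white"]],
    ]
    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
    # a p-value above 0.05 would be reported as 'not statistically significant'

With these placeholder counts the test returns a p-value of roughly 0.2, illustrating how an apparent difference (23% versus 20.5%) can still fall short of statistical significance at these sample sizes.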

To find out more about this private beta experiment, download the report authored by Ipsos MORI (DOCX 2.2KB).

Back to top of page

Interviewer calling cards

Discovery

The LMS Statistical Test was a mixed-mode (online with face-to-face follow-up) study requiring interviewer stationery. In the event that the user did not respond to the invitation to take part or the follow-up reminders, or if they could not take part online, they would be followed up with an in-person interviewer visit. If the door is not answered, interviewers will often leave a calling card to show that they have tried to make contact and to offer a way for the respondent to get in touch. To develop the calling cards for this mixed-mode test we first sought to understand the current processes.

Desk review

To develop these materials, the first step was to understand what items interviewers currently use on the doorstep in the face-to-face surveys and understand which items were most useful. This involved looking at metrics from our operations division to evaluate which items to focus on. These were found to be:

  • called today cards
  • appointment cards
  • broken appointment cards
  • impediment to entry letters

These items were reviewed to establish their role in the user journey and explore how we could bring them in line with the letters and leaflets to ensure a consistent look and feel across the suite of materials received by potential respondents.

We also looked at research by the Behavioural Insights Team (BIT) to see how we could apply learning from the 2017 Census Test, which trialled different types of nudges on two different calling cards.

The first card applied the concepts of:

  • implementation intentions – the cards asked respondents to tick when they would complete their Census Test questionnaire; research shows that people are more likely to complete an action when they have planned to do so
  • local – the cards referred to their local census officer
  • monitoring – the cards said that we had noticed that they hadn’t filled in their questionnaire, indicating that they are being monitored
  • reciprocity – the card said “we designed this reminder to help you out”, invoking a sense of reciprocity

The second card made use of the above, plus an additional behavioural nudge:

  • interviewer personalisation – to personalise the card, the field worker had space to sign their own name and referred to the individual as ‘neighbour’.

The two cards designed by BIT were trialled, using the existing ONS calling card as the control group in the 2017 Census Test. The results showed that both BIT calling cards outperformed the control, but the addition of the interviewer personalisation did not significantly differ from the one employing only the implementation intention. For more information about this trial contact the team at behavioural_insights@ons.gov.uk.

These learnings were applied to create a ‘nudge’ calling card as part of an operational test for the LMS. The test was only conducted with a small sample, but findings indicate that the cards helped improve engagement with the survey by approximately 10% compared with those who did not receive a nudge calling card (no statistical testing was applied due to the small sample size).
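
The decision not to apply statistical testing to such a small sample can be illustrated with a rough power calculation, sketched below. The figures assumed (response rates of 30% versus 40%, a 5% significance level and 80% power) are purely illustrative and do not come from the test itself; the statsmodels functions are our own choice of tool.

    # Rough sample size needed per group to reliably detect a ten-point lift
    # in response rates (30% versus 40%); all figures are illustrative
    from statsmodels.stats.proportion import proportion_effectsize
    from statsmodels.stats.power import NormalIndPower

    effect = proportion_effectsize(0.40, 0.30)  # Cohen's h for the two rates
    n_per_group = NormalIndPower().solve_power(
        effect_size=effect, alpha=0.05, power=0.80, alternative="two-sided"
    )
    print(f"approximately {n_per_group:.0f} cases per group needed")

Under these assumptions, around 180 cases per group would be needed, which helps explain why a small operational test could only report an indicative improvement rather than a statistically significant one.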

In addition to the nudge calling card, traditional calling cards were developed with interviewers and members of the public.

Interviewer workshops

Users of the calling cards include interviewers as well as the recipient (respondent), so it was important that feedback was obtained from both types of user. Research with interviewers on this product sought to find out about current behaviours, for example how and when they use each of the available calling cards. It also provided an opportunity to get feedback on the current cards and to see what ideas and suggestions they had for any potential improvements. We carried out two research activities: firstly, a face-to-face workshop with interviewers; and secondly, circulating the calling cards via interviewer managers to collate feedback via correspondence.

In this research, we found:

  • interviewers don’t necessarily all use the cards in the same way, with the same frequency or at the same point in time
  • not all interviewers include the time on the card they leave
  • there was an appreciation for standardising as much of the card as possible, to minimise the amount of ‘write-ins’ required on each card, save time and provide consistency
  • use of the word ‘study’ was preferred over ‘survey’ to convey importance
  • interviewers need the card to make clear it is not a cold call
  • users responded positively to, and recommended, a polite and friendly tone over a more authoritative approach
  • ‘Sorry I missed you’ was deemed appropriate for the ‘called today’ card, with mention of interviewers already using that phrase or writing it on the current card

The actions taken away on the prototype cards we shared with interviewers included:

  • creating more space for interviewers’ details
  • adding space for day of the week on the appointment card
  • linking to ONS website to provide legitimacy
  • simplifying some of the language used
  • ensuring consistency across layout and design of different components across the three cards

The next step was to test these cards with the public.

Alpha

Public perception testing

After re-iterating the designs for the calling cards based on feedback from an interviewer perspective, we then took them out for further testing with members of the public. It is important that the calling cards also meet the needs of the public and are not perceived negatively. The findings of this round of testing are summarised below.

Main insights from the ‘called today’ card (see Appendix 5 of the Word version of this report DOCX 2MB):

  • users didn’t like the thought of being approached in person to take part as they would feel under pressure
  • users were surprised that they would be followed up face-to-face if they had not responded to the online survey invitation
  • users understood why they were receiving the card
  • users said they would call or recognised they should phone the interviewer either to arrange an appointment or to opt out
  • users expected that they could still complete the survey online at this point if they preferred
  • users demonstrated a mixture of preferences between phoning and going on the website for further information
  • users did not always appear to notice the website straight away, even if it would be their preferred mode of finding out more
  • users felt reassured by the ability to call up and verify the interviewer identity

Main insights from the ‘Your appointment’ card (see Appendix 6 of the Word version of this report DOCX 2MB):

  • the appointment card met users’ expectations
  • users found comfort in the fact that the option to call and re-arrange or cancel is included
  • some users mentioned that it was helpful as they would put it on their fridge as a reminder
  • other users said that they would prefer something electronic
  • a handful of participants needed to know how long the appointment would take
  • fewer mentioned being given a heads-up of what they might need to prepare or have available whilst doing the survey
  • users understood that the appointment would take place as scheduled and the interviewer would come at the agreed time (unless they had called to cancel or re-arrange)
  • there was some expectation from users that a reminder message would be received the day before, for example a text or email
  • users said that they would call the interviewer to tell them if they could no longer make the appointment
  • there was mention of sending a text or email, but it was clear that the card was asking them to phone the interviewer

Main insights from the ‘broken appointment’ card (see Appendix 7 of the Word version of this report DOCX 2MB):

  • users understood why they received the card and felt the onus would be on them to call the interviewer to re-arrange, apologise or cancel
  • in the ‘I’ll try again’ box users expected it to be filled in with another date and time and if they couldn’t make it, they would call to say so and re-arrange
  • however, other users said they would prefer it to be on them to initiate contact if they’d missed their appointment, rather than have an appointment re-arranged without their consultation
  • users were happy with the tone of the card; it didn’t appear to make them feel bad for missing the appointment and users empathised that the interviewer would probably be feeling frustrated by this point

Interviewer Calling Card: Alpha

A version of these cards was also used in the LMS statistical test by our delivery partner. The cards were adapted so that they were dual branded with the third-party contact details and logos included.

Calling cards summary

The calling cards developed for the LMS met the needs of both field interviewers and the public. The next step is to trial the calling cards as part of a live beta to assess their effectiveness at a larger scale with ONS interviewers, and to identify any improvements to be made before rolling out.

Back to top of page

Between wave engagement (BWE)

Discovery

One of the main challenges with voluntary longitudinal surveys is retaining the sample and minimising attrition (drop off). One strategy which can potentially help with this is to engage with users in between the waves of the survey to keep their interest (Cleary and Balmer, 2015; Lynn, 2017). However, it is important that the content adds value for the respondent and meets users’ needs and expectations. To develop this content, a series of research activities was conducted.

First, questions were added to the current business as usual LFS survey to establish what users might want to receive, if anything. We also spoke to previous ONS social survey respondents to find out their thoughts on engaging with them in-between waves of the survey. Following the insights from both these activities and some desk research, prototypes were developed and tested over a series of iterations in focus groups and then as part of end-to-end cognitive testing.

The findings from each of these activities will be discussed in turn.

Dress rehearsal BWE questions on the business-as-usual Labour Force Survey (LFS)

In the 2017 Labour Force Survey (LFS) dress rehearsal, respondents were provided with information about the survey and asked questions about being followed up between waves.

The results of the survey found that of the total sample of 730, 49% said ‘no’ to being followed up, 30% said ‘don’t know’ and 21% said ‘yes’.

Of those who said ‘yes’ (n=149) in answer to an open text question, most people were keen to hear about: findings from the study, use of their information and when the study had contributed to news headlines.

Respondents were also asked about the specific items they would be interested in receiving.

Options included:

  • a thank you
  • a reminder about the time of the next interview
  • some findings from the survey
  • how the information is being used
  • a way to tell us about moving house
  • newspaper headlines of things we have done
  • information tailored to age group

112 respondents wanted to receive findings and 108 wanted to know how the information is being used (see Figure 1). Over half of respondents said they would prefer this information via email, with post being the second most popular choice (see Figure 2).

Figure 1: Items respondents would be most interested in receiving between waves

Figure 2: Preferred communication methods between waves

Of the people who said ‘no’ to being contacted (n=360), 148 said they were not interested, 59 cited too much post and 176 suggested other reasons. The main themes were that it was unnecessary and that they didn’t have time or were too busy.

A full summary of the results is available on request.

To develop insights on users, a workshop was held with field force interviewers to draw on their first-hand experience interacting with the public and to understand what they thought should be included in a between wave engagement. Their suggestions relevant to the online-first mode are summarised below.

Interviewers felt that:

  • users should be told that the 5th contact is the final time they’ll be asked to take part
  • users should be thanked for their time
  • we should reinforce that users’ data is anonymised
  • users should be informed about the topics, which include socio-demographic questions (users are told it is about work and so on, but there is a lot of personal content first)
  • we should explain why we ask the same questions
  • users could be told about news headlines from our data
  • we could include information about what our statistics have led to, for example, the introduction of the minimum wage or the winter fuel allowance
  • we should explain why users are asked to confirm answers to repeat questions, even if their situation hasn’t changed
  • we should remind users who the ONS is, for example, mention that we run the census
  • we should remind users they have taken part before
  • we could explain that they are only required to do their own section and can pass the rest on to other household members to complete their bit
  • social media links should be included
  • a link to the Nomis website, which is a source of up-to-date labour statistics at the local level, could be included so users can find more personalised data
  • the ONS Visual website could be sign-posted, for statistics that are presented in a more engaging and accessible way (note: this resource has since been decommissioned)
  • we should make sure it is clear we will be contacting them again

Telephone interviews were conducted with respondents who had taken part in all five waves of the Labour Force Survey. They were asked retrospectively for their thoughts on between wave engagement. As these users had already completed all five waves of the LFS without BWE, their feedback should be considered with this caveat in mind. In these interviews:

  • one of the users expected contact between the interviews
  • users thought that a BWE wasn’t necessary
  • these users expected to receive letters or phone calls from ONS
  • users without a computer prefer a letter or a phone call
  • for some users, email and SMS were deemed the best form of contact
  • users thought a thank you note was appropriate
  • users said they were interested in hearing the results of the study
  • users like the idea of hearing about news headlines
  • users were in favour of a newsletter
  • users appeared to dislike items that were for promotional purposes, such as calendars, celebration cards or certificates for participating

A focus group with another group of field interviewers was carried out to gather initial feedback on some prototyped ideas and further insights.

In this focus group interviewers said:

  • the level of information provided at the close of the face-to-face interview was variable (for example, whether they mentioned future waves)
  • they felt that respondents would be more interested in what the statistics are used for, rather than the figures themselves
  • they were concerned about the prototyped postcard design disclosing that the respondent had taken part in the survey, compromising their anonymity
  • the concept didn’t have to be sent between waves but could go alongside the advance letters

Discovery summary

In the discovery phase of this research, clear themes were identified. The insights suggest that communication about study results and how the data is used is the most appropriate content to provide to users, rather than promotional items that users did not find meaningful. Email and post were considered suitable forms of contact and in line with what users might expect from the ONS.

In the next stage of the BWE development these insights were taken forward: concepts and content were tested, and products were prototyped.

Alpha

The research objectives of the alpha phase were to:

  • find out what information users would find most engaging
  • establish the best ways of presenting the data
  • explore the best medium for disseminating the information

The qualitative insights would be used to inform a trial containing three experimental conditions exploring the impact of an offline and digital BWE against a control group, which would receive no BWE.

Concept development and testing – public focus groups

The content developed and tested in focus groups and one-to-one interviews was based on data from the current Labour Force Survey. Labour market data was used as this makes up the core of the online LMS. The concepts of employment, unemployment and those not looking for work were broken down into different categories to show how users’ data are used in the production of statistics.

In developing the prototypes, our considerations included:

  • accessibility of content for online and offline
  • content should be the same across email and postcard
  • timeliness of data release
  • no ability to personalise email

In the focus group several different concepts were explored, using a postcard as the medium to display the information. As part of the exercise, users were asked to rank the different ways of presenting the data on a scale. The first scale they were given was for clarity, with ‘clear’ at one end and ‘unclear’ at the other along a horizontal line. A vertical line was then added to create an axis; the vertical scale was labelled from ‘interesting’ to ‘not interesting’. In groups, users were asked to sort the different types of data presentation and feed back where and why they placed each one on the axis.

The findings from this research are summarised below:

When information from the study was presented in a bar chart (see Appendix 8 of the Word version of this report (DOCX 2MB)), users fed back that they felt it was:

  • uninteresting
  • unclear
  • boring
  • vague

On the axis scale, the bar chart was placed in the quadrant of unclear and uninteresting when compared against the other prototypes.

The feedback on the postcard which illustrated employment statistics in a doughnut chart (see Appendix 9 of the Word version of this report (DOCX 2MB)) can be summarised as:

  • providing users with more of a breakdown of the data (compared to the bar chart)
  • making it more difficult for users to imagine how many people the chart is referring to because of the use of percentages (rather than raw numbers)

In the axis exercise, the doughnut chart was placed in the middle of the unclear-to-clear axis relative to other designs, and above the middle line on the engagement scale.

In testing, the map showing employment rates across the whole of the UK (see Appendix 10 of the Word version of this report (DOCX 2MB)) was found to be:

  • unclear
  • confusing
  • difficult to comprehend, because the figures were presented as percentages and users did not know the raw numbers
  • too high-level

On the axis, this map was felt to be unclear and uninteresting.

The map (see Appendix 11 of the Word version of this report (DOCX 2MB)) which drew on just the regional employment rate received mixed reviews. On one hand, placing the map of Wales in the context of the UK was felt to be unclear and a “waste of space”; users felt that more could have been done with the space available. Others, however, felt that because it focused on just one nation it was clear. This split was reflected in the placement of the map during the axis exercise.

To provide an alternative to presenting data in charts, breakdowns and comparisons were drafted in the form of ‘factoids’, short statements illustrating through words how users’ data is used.

The facts presented (see Appendix 12 of the Word version of this report (DOCX 2MB)) were descriptive and traditional, but users considered them “wordy” and felt they “jump around too much” in terms of the topics covered. This highlighted the importance of consistency within the content.

To make the alternative version of the ‘did you know’ facts using the LFS data more engaging for users, statements were developed that drew on more abstract concepts (see Appendix 13 of the Word version of this report (DOCX 2MB)).

The feedback on this version was that:

  • it was the most engaging
  • the Welsh reference was appealing and relevant to Welsh users
  • there was confusion over the use of other reference points (for example, Australia and the UK) and over the consistency of the facts chosen

This postcard, which laid out the labour market statistics using the idea of the UK as just 100 people (see Appendix 14 of the Word version of this report (DOCX 2MB)), was well received.

In the axis exercise, relative to the other postcards, this design was placed in the top right-hand corner indicating that users felt it was both clear and interesting.


The content presented on this postcard (see Appendix 15 of the Word version of this report (DOCX 2MB)) shows the same concept of representing the UK labour market as 100 people. It includes the employment rate broken down into four high-level categories, and then offers a further breakdown of the ‘economically inactive’ into five subcategories.

The feedback on this was positive; it was considered:

  • easy to visualise the 100 people
  • on the clear side (but not as clear as version one)
  • to contain a lot of information

Summary of public focus group

The research showed us that often the simple ideas work best, such as representing the UK as just 100 people. In the next stage of one-to-one testing, we took this concept forward alongside the idea of regionalising the statistics.
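
To illustrate the arithmetic behind the ‘100 people’ concept, the sketch below shows one way category estimates could be scaled to whole numbers of people out of 100, using the largest remainder method so the figures sum to exactly 100. This is illustrative only, not the team’s actual tooling, and the category counts are made up.

    def as_100_people(counts):
        """Scale raw counts to whole numbers of people out of 100, using the
        largest remainder method so the result sums to exactly 100."""
        total = sum(counts.values())
        exact = {k: v * 100 / total for k, v in counts.items()}
        floored = {k: int(v) for k, v in exact.items()}
        shortfall = 100 - sum(floored.values())
        # Give the leftover units to the categories with the largest remainders.
        for k in sorted(exact, key=lambda k: exact[k] - floored[k], reverse=True)[:shortfall]:
            floored[k] += 1
        return floored

    # Made-up category counts, in thousands:
    counts = {"employed": 32800, "unemployed": 1300, "economically inactive": 8700, "other": 900}
    print(as_100_people(counts))
    # {'employed': 75, 'unemployed': 3, 'economically inactive': 20, 'other': 2}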

In the next stage of testing, the ideas were developed into an email and tested in that medium. The testing was conducted alongside testing of the questionnaire. At the end of the cognitive interview, the user was sent an email in real time. This helped set the context by replicating the process of taking part and being followed up with an email, except in a much shorter timeframe.

The focus of this testing was to:

  • check the ‘UK as 100 people’ idea in the email design
  • explore different subject lines (see the A/B test section)
  • explore the regionalisation of content

In general:

  • users expect the email to be personalised by being addressed to them by name, rather than ‘Dear Resident’
  • the use of the logo and ‘Office for National Statistics’ added legitimacy to the email and promoted trust, helping users identify that it was not spam

Although the email (see Appendix 16 of the Word version of this report (DOCX 2MB)) is quite long, the testing revealed that the content felt short and digestible to users. In summary:

  • users found the information clear
  • users preferred the simplicity of this email, particularly in comparison to the email made up of ‘did you know’ facts and statements
  • users found the information interesting for the most part and could identify where they or people they know fit into the categories
  • users recognised that the information was generally relevant to have, but not necessarily information they would use or that would have an impact on their lives

The facts produced for the content of this postcard (see Appendix 17 of the Word version of this report (DOCX 2MB)) drew on the concept of the Welsh facts that was positively received in the previous focus group, and applied it to England. However, desk research made it tricky to find suitable reference points that worked with the figures. The feedback from users in one-to-one testing told us that:

  • the reference to the Tate Modern did not appeal to, or feel relevant for, all users, and in some instances it was associated with political agendas, which is best avoided
  • not all users recognised the Tate Modern reference
  • users felt the reference to the population of Greater Manchester did not make sense without first knowing what that population is
  • the use of Greater Manchester as a reference point felt arbitrary
  • the top two facts were convoluted, making them difficult for users to comprehend

The bottom two factoids were more palatable for users; they were considered more important, straightforward and to the point, helping users understand how their data might be used. The findings from this testing told us that concepts may not work when recycled in different contexts, and careful consideration should be given to ensuring the ONS maintains neutrality.

Other insights from the BWE work, by theme

The feedback around using numbers or fractions to explain the data was mixed. Large numbers, for example ‘4 million people’, were difficult for some users to comprehend, whereas reframing this as ‘1 in 10’ was more useful. However, some users find fractions vague and prefer a more direct figure. In relation to percentages, users do not find them useful or clear when the base, that is, the population the percentage refers to, is missing.

To overcome this, providing both, by setting the scene with numbers and then breaking them down into a fraction or percentage where possible, is likely to help meet the needs of more users.
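
As a minimal illustration of this recommendation (not content taken from the tested materials), the sketch below pairs a raw count with a ‘1 in N’ framing; the figures are hypothetical.

    def describe(count, population):
        """Return a statement giving the raw number and a '1 in N' breakdown."""
        one_in = round(population / count)
        millions = count / 1_000_000
        return f"{millions:.1f} million people - around 1 in {one_in} of us"

    # Hypothetical example: 4 million people out of a population of 40 million.
    print(describe(4_000_000, 40_000_000))  # 4.0 million people - around 1 in 10 of us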

Front of the postcard: ‘Shape tomorrow’ versus ‘Thank you’

In the first iteration of the postcard prototype, we recycled the ‘Shape tomorrow’ branding from the tote bag incentive on the front of the postcard. However, this did not work in this context: users felt that they had already shaped tomorrow by taking part in the survey and read it as a call to action. That said, users found the style of the ‘Shape tomorrow’ design engaging, particularly the images that represented parts of everyday life.

The comments on the alternative postcard front which simply read ‘thank you’ were that it was more appropriate and enabled them to connect the dots between what they had done, i.e. the survey, and the postcard they were now receiving.

Incorporating feedback from both designs, a ‘thank you’ was developed in the same style as the ‘Shape tomorrow’. This allowed us to make the communication more engaging for users and to join up the branding between different touchpoints in the user’s journey over multiple waves.


As part of alpha testing the email version of the between wave engagement, it was important to make the most of the subject line to encourage users to open the email.

We tested two different subject lines in face-to-face testing:

  1. ‘Thanks for taking part: An update from the ONS’
  2. ‘You’re helping shape the future of the UK – thank you’

The feedback from this face-to-face testing was that option one was clear and sounded legitimate. Option two, on the other hand, was unclear, and users said they would be less inclined to open it or would skim read it.

Following the one-to-one testing, we decided to run an A/B test using the email platform GovDelivery, which was managed internally by the Communications Team.

Desk research on the best time and day to send an email was inconclusive. Insights from commercial organisations and from ONS communications with users of our statistics suggested midday would be appropriate.

We also consulted the ONS behavioural insights team, who suggested that we trial a subject line that could potentially hook people in against the more straightforward thank you that was perceived well in qualitative testing.

In the A/B test the subject line and corresponding preheader (the short preview text shown beneath the subject line in the inbox) were:

Subject line A: Thanks for taking part: An update from the Office for National Statistics
Preheader: You’re helping shape the future of the UK
Subject line B: You’ve been counted
Preheader: Thank you from the ONS

The sample used for this small test came from previous LMS respondents. Users were randomly allocated to receive email A or email B. Emails were sent on Thursday 4 April at 11am and all contained the same content; only the subject line and preheader differed.

The results of the test are shown in Table 1.

Table 1: Email subject line A/B test results

Email version                       Delivered   Opened   Open rate
Email A - Thanks for taking part    235         148      63%
Email B - You've been counted       232         161      69%
Total                               467         309      66%

Further analytics available from the platform used told us:

  • of the email addresses given 97% were valid
  • only 2.3% of emails bounced back
  • for email B, the highest number of opens by an individual email was 13
  • for email A, the highest number of opens by an individual was 8
  • those who responded online were more likely to open the email than those who responded face to face
  • there were no unsubscribes
  • no phone calls were made to the survey enquiry line following the email

Although the sample for this experiment was small, we took forward the ‘You’ve been counted’ subject line.
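
The report does not describe a formal significance test of these open rates. As a rough check (ours, not the original analysis), a standard two-proportion z-test on the Table 1 counts gives a difference that falls short of conventional significance, consistent with the caution about the small sample:

    from math import erf, sqrt

    opened = {"A": 148, "B": 161}
    delivered = {"A": 235, "B": 232}

    p_a = opened["A"] / delivered["A"]  # about 0.63
    p_b = opened["B"] / delivered["B"]  # about 0.69
    p_pool = sum(opened.values()) / sum(delivered.values())
    se = sqrt(p_pool * (1 - p_pool) * (1 / delivered["A"] + 1 / delivered["B"]))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided

    print(f"z = {z:.2f}, p = {p_value:.2f}")  # roughly z = 1.47, p = 0.14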

Between wave engagement (two)

The next stage of development involved testing content for a second between wave engagement, continuing the work conducted for wave one and exploring regionalisation of content further.

Findings suggested that people in London wanted the national picture, whereas users in different parts of the country, for example the West Midlands, Wales and Scotland preferred hearing about statistics more local to them, even if at the country level.

Summary

The qualitative research conducted through the alpha phase of development informed which products were tested in the quantitative LMS trial that explored attrition between waves. The results are summarised in the next section.

Live Alpha: Quantitative test summary – attrition test strategy

The LMS attrition test was online only and consisted of a three-wave experiment exploring the impact of a communication between waves of the study.

In the trial design, respondents were allocated at random to receive either a postcard, an email or no between wave engagement after taking part in each wave of the study, in addition to being allocated to one of two incentive groups (£5 unconditional voucher or no incentive).
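
The report does not include the allocation code. As a minimal sketch, assuming a simple blocked design, random assignment to this 3 x 2 factorial (BWE treatment by incentive group) could look like the following; the household identifiers are hypothetical.

    import itertools
    import random

    random.seed(42)  # a fixed seed makes an allocation reproducible for auditing

    # The six conditions of the 3 x 2 design: BWE treatment x incentive group.
    CONDITIONS = list(itertools.product(
        ["postcard", "email", "no BWE"],
        ["£5 unconditional voucher", "no incentive"],
    ))

    def allocate(household_ids):
        """Shuffle households, then deal them evenly across the six conditions."""
        ids = list(household_ids)
        random.shuffle(ids)
        return {hh: CONDITIONS[i % len(CONDITIONS)] for i, hh in enumerate(ids)}

    # Hypothetical sampled household identifiers:
    for household, condition in allocate([f"HH{n:04d}" for n in range(1, 13)]).items():
        print(household, condition)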

The uptake rate at wave one was 29% (25% completion rate). At wave two, 62.5% of households returned to the survey (56.4% completing the whole survey). The breakdown of the materials groups with no incentive at wave two was as follows:

  • 58.5% of those who received a postal communication between waves completed
  • 62.6% of those who received no communication between waves completed
  • 66.4% of those who received an email between waves completed

In the analysis, the email group was found to be significantly more effective at retaining respondents at wave two than both the postal and the no-communication groups.

At wave three we saw an overall uptake rate of 73.8%, with a 69.9% completion rate. The breakdown of the materials groups with no incentive at wave three was as follows:

  • 70.2% of those who received a postal communication between waves completed
  • 70.2% of those who received no communication between waves completed
  • 74.0% of those who received an email between waves completed

The analysis shows that email was significantly more effective than post at retaining respondents in wave three.

The final versions of the between wave engagement communications that were rolled out, and a full write-up of the results of the study, are available on request.


Helpful tools and guidance

Designing products for everyone is an important principle underpinning this work, ensuring that as many as possible of the people selected to take part are able to do so. There are several tools we recommend to help assess the accessibility of your materials.

Hemingway app

The Hemingway app assesses the reading level of text and makes suggestions on where content could be improved to achieve the recommended reading age of 9 years, which is considered best practice in content design.
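
If you want to check drafts programmatically rather than in the app, the open-source textstat Python package (our suggestion, not a tool named in this report) computes standard readability scores; a sketch:

    # pip install textstat
    import textstat

    draft = ("We are inviting your household to take part in an important study. "
             "Your answers help build a picture of life in the UK.")

    # Flesch-Kincaid grade maps roughly to a US school grade; reading age is
    # approximately grade + 5, so a reading age of 9 means aiming for about
    # grade 4 or below.
    print("Flesch-Kincaid grade:", textstat.flesch_kincaid_grade(draft))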

Plain English Campaign

The Plain English Campaign offers an A to Z of alternative words [PDF 174KB] and guides for writing in plain English.

Collins English Dictionary

Collins English Dictionary allows you to search for a word to see how frequently it is used. The higher the frequency, the more familiar it is likely to be.

Google Trends

Google Trends allows you to explore and compare different search terms; this can help with deciding the most widely recognised way to refer to something.
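
Word familiarity can also be checked programmatically. A sketch using the open-source wordfreq Python package (again our suggestion, not a tool named in this report), where higher Zipf scores indicate more common words:

    # pip install wordfreq
    from wordfreq import zipf_frequency

    for word in ["start", "commence", "buy", "purchase"]:
        print(f"{word}: {zipf_frequency(word, 'en'):.2f}")

    # The everyday words ("start", "buy") score higher than their formal
    # alternatives, suggesting they will be more familiar to readers.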

Seeing AI

Seeing AI is a multi-purpose tool; a recommended feature is the ability to photograph a document and convert it into a file that can then be read out loud.

Accessibility posters

Home Office Accessibility Posters offer guidance on the dos and don’ts of designing for accessibility.

Simulation glasses

Simulation glasses are physical tools which attempt to mimic the effects of visual impairments. They can be used as a basic, initial assessment of how your product may be viewed by users with different visual impairments. However, every individual’s experience will be unique to them.

These tools shouldn’t replace getting feedback from real users. There are professional centres that offer accessibility review of products such as the Digital Accessibility Centre.

Many government departments now have user research teams and content designers embedded within the development of services, so reach out to them if you need support in this area, or visit the user research pages on GOV.UK.


User needs

The research detailed in this report has consistently identified a set of user needs over the course of developing these respondent materials. These needs are summarised below as user stories, which may be helpful to recycle when developing respondent communications for other push-to-web studies.

User stories help frame a need from the point of view of the users you are designing for. You can read more about creating your own user stories in the GDS service manual.

User stories

As a potential survey respondent, I need to know:

  • what I’m being asked to do so that I can decide whether I want to take part
  • why I should do it so that I understand the importance of taking part
  • who in my household needs to take part so that I can complete it on their behalf or ask them to complete their own section of the study
  • what is in it for me so that I can decide if it’s worth my time
  • how long it will take so that I can plan an appropriate time for me to complete the study
  • how I can take part so that I have the right equipment
  • when I need to do it by so that I can make time to complete the study before the deadline
  • that my data will be treated confidentially and handled securely so that I trust my data will not be misused
  • how I can find out more information so that I can make an informed decision about whether I want to take part
  • what happens next so that I know what to look out for or what action I need to take

We have met these needs through the design and content of our letters. Our user-centred design approach puts users at the heart of this, but we also needed to consider business needs, such as GDPR requirements.

