This content is under review.
We are currently developing a Government Statistical Service User Engagement Strategy. This page will be updated when the strategy is released. If you would like more information, or wish to share your experiences of engaging with users of statistics, please email firstname.lastname@example.org.
In the interim, we are adding case studies to this page that we have identified through our engagement with producers of statistics. We hope these are helpful as examples of approaches you might wish to consider.
Publication date: 5 April 2019
Author: Good Practice Team
Approver: Good Practice Team
Who this is for: Members of the Government Statistical Service
Why we need to engage with users
We need to engage with users so that we:
- prioritise our work based on what statistics users want
- understand what users do with our data
- understand what level of quality is acceptable to users
- present statistics when users want them and in a way they understand
- can build trust in our statistics
- maximise the use of our statistics
- can test experimental statistics
- can make users feel a part of the process
Furthermore, the Code of Practice for Statistics says:
“Users of statistics and data should be at the centre of statistical production; their needs should be understood, their views sought and acted upon, and their use of statistics supported.”
How to identify users
Have a think about who might use your statistics.
It might help to think about groups of users:
- Small, medium and large businesses
- International organisations
- Trade unions
- Local government
- Politicians and parliament
- Lobby groups
- The NHS
- The Bank of England
- The general public
Build user personas
User personas are detailed descriptions of “typical” people who use your statistics. They are a highly effective way of capturing essential information about your target audience. You can build them by doing user research.
They can be used to ensure you develop and design ways of communicating statistics in a user-centred way.
The Office for National Statistics has done work on this when developing its website.
Take a look at the user personas they have put together.
Use what you already have
Take a look at what you already know about your users. Look over feedback, public enquiries, Freedom of Information requests, Parliamentary Questions etc.
Use Google Analytics
You can use Google Analytics to look into lots of things, including how many people viewed your publication online and how they found it. How you can use Google Analytics will depend on the website your statistics are published on.
Set up Google Alerts
Google Alerts are emails sent to you when Google finds new online content that matches your search term. They are a simple way to monitor anything on the web.
They can be used to monitor where your statistical publications, or key phrases linked to your statistics, appear online, which makes them a good way to find out how the statistics you produce are being used.
Words of warning:
1) Google Alerts are essentially a text search, so you may get some irrelevant alerts.
2) Acronyms can be problematic because one acronym can stand for several things. For example, UKSA can stand for “UK Statistics Authority”, but also for UK Space Agency or UK Sailing Academy!
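One way to handle ambiguous acronyms is to filter alert snippets for words that signal the right context. The sketch below is illustrative only: the context keywords and example snippets are assumptions, not an official list, and a real filter would need tuning against your own alerts.

```python
# Minimal sketch: keep a "UKSA" alert only if statistical context words appear.
# The keyword list is an illustrative assumption, not an official vocabulary.
CONTEXT_KEYWORDS = {"statistics", "statistical", "ons", "data"}

def is_relevant(snippet: str) -> bool:
    """Treat a UKSA mention as relevant only when context keywords co-occur."""
    words = set(snippet.lower().split())
    return "uksa" in words and bool(words & CONTEXT_KEYWORDS)

# Hypothetical alert snippets for illustration.
alerts = [
    "UKSA publishes new statistics code guidance",
    "UKSA launches satellite programme",
]
relevant = [a for a in alerts if is_relevant(a)]
```

A keyword filter like this will not catch everything, but it can cut down the volume of irrelevant alerts you have to read.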
Use Google Trends
Google Trends allows you to look at what people are searching for on Google. Take a look at Google Trends for the UK.
Use social media
For example, create an account on Twitter. Look at accounts that tweet about your statistics. Look at their followers – these are likely to be users of your statistics.
You could take a sample of users and look into what they do and why they are interested in your statistics.
You can also use online analytics tools:
- If you go to analytics.twitter.com while logged in, you can find some high-level information (followers’ interests, demographics, location and so on) in the audiences tab
- If you use Hootsuite to manage social media, there are some analytics built in (even in the free version)
- Other free tools are followerwonk.com and TweetStats
- Onalytica and Brandwatch are paid-for tools
Talk to people
Ask policy colleagues or your media team if they know of any users you’re not aware of.
Ask users to get in touch:
- Put information on how they can get in touch with you on your publications
- Put adverts in industry and special interest publications or websites
- Ask users to get in touch through social media.
How to engage with users
Users are central to the statistics we produce, and engaging with users should therefore be at the core of what we do.
Consultations and surveys
User engagement through consultations and surveys is common across the GSS. Our case study relating to housing and planning statistics outlines some things to consider when taking this approach.
Esther Sutherland (Social Survey division in the Office for National Statistics) and Ian Boreham (GSS Strategy and Delivery division) are looking to improve the accessibility and coherence of housing and planning statistics.
They conducted an external user engagement survey. The online survey ran for eight weeks in April and May 2019 and asked users for feedback on the housing and planning statistics published across the GSS. Read more about the results of their survey.
The lessons learned
- Planning processes – while designing the survey they sought lots of feedback, spoke with statistics producers, tested the survey with users and got constructive feedback from the GSS Best Practice and Impact division. This effort up front was key to ensuring they collected good quality data
- Promoting the survey – they invested a lot of time promoting the survey and used a variety of communication channels. They also worked collaboratively with other government departments who also promoted the survey
- Design with analysis in mind – they designed their survey with lots of open questions, and were pleased that respondents provided lots of information in the free text boxes – over 15,000 words! However, this comes with an associated cost in analysis, so think about the trade-off between closed and open questions. While closed questions don’t give you such rich insight, they are simpler to analyse and disseminate.
- Collaboration – they engaged with the Data Science Campus at the Office for National Statistics. The campus applied machine learning techniques to the data, which gave them a solid base to start the analysis from and provided results within 24 hours of the survey closing.
- Think through how you will use the results – different sections of their results were relevant to different departments, depending on the statistics they produce. They are now considering how they can package the findings to ensure each department gets the information they need to take the work forward. On reflection, they should have considered how they might do this earlier on. They could have made changes to the survey design and subsequent analysis to make this process more straightforward
- Digital tools – It has never been easier to run a light touch consultation, with many relatively inexpensive products and an increasing ability to automatically analyse free text
- Timing – eight weeks may seem long, but it allows different communication methods to be used and a greater variety of users to be approached. They also had to take account of purdah when planning the release and promotion of the survey.
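Free-text survey responses do not always need machine learning to yield a first insight: a simple word-frequency count can surface recurring themes before any deeper analysis. The sketch below is a minimal illustration with made-up responses; the stopword list is a small assumption and would need extending for real data.

```python
from collections import Counter
import re

def top_terms(responses, n=3, stopwords=frozenset({"the", "and", "of", "to", "a", "be"})):
    """Count the most frequent words across free-text survey responses."""
    words = []
    for text in responses:
        words += [w for w in re.findall(r"[a-z']+", text.lower()) if w not in stopwords]
    return Counter(words).most_common(n)

# Hypothetical free-text answers, for illustration only.
responses = [
    "We need more timely housing data",
    "Housing statistics should be easier to find",
    "More local housing data please",
]
print(top_terms(responses))  # "housing" appears most often
```

A count like this is crude, but it can help you decide which themes deserve a proper read-through of the free text.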
“Though there are many different ways you can do it, it is never a bad idea to conduct user engagement. The investment needed to design and undertake our survey has been easily repaid by the wealth of useful feedback received which we can continue to draw on to maximise the public value of housing and planning statistics.
We hope that our successes have inspired you and our reflections will help you think about how you can engage with your users to ensure that we are enhancing the public value of our statistics across the GSS.”
Want to know more?
Get in touch via email: GSS.email@example.com
You can set up meetings, workshops and seminars and invite users to attend. Invite users you already know about and ask them to spread the word. It is also good to advertise on social media and in statistical and industry publications or websites.
Webinars (which are online seminars) are a good way to get a wider range of users attending. Take a look at using GoToWebinar.
Webinars are good because:
- they are easy to set up and join
- up to 100 people can join
- no specialist software is needed for the host or attendees – you just need a laptop, computer or tablet with internet access and a phone
- you can use a webcam if you wish
- attendees can see the slides of the presentation and hear the voice of the host live, without moving from their desk
- attendees can also ask questions, either by typing them or asking them verbally
- the session can be recorded, meaning you can circulate the presentation with commentary to anyone who could not attend
You might also consider holding a publication launch event and inviting users to it.
The Cabinet Secretary for Social Security and Older People asked the Scottish Government’s Chief Statistician to prepare guidance to support better collection of data on sex and gender by public bodies. Good quality data on sex and gender will help the Scottish Government deliver its vision for Scotland. However, statistics about these characteristics are a sensitive area, and the language and terms used are very important to people.
They set up regular working group meetings and spoke with their analytical and policy colleagues to identify the right people to involve in this group. Membership comprised professionals from across statistical services and key public sector bodies. Once a membership list was established, formal letters were sent with an invitation to join the working group.
They spoke to a whole range of organisations, individuals and groups. They asked about the definitions that people use, the uses of sex and gender data and the different approaches to collecting this data. They were also interested to know what people had seen work well and less well. The working group sought advice from internal colleagues on how best to engage with stakeholders.
Two public events were organised to take the conversation wider. These events were advertised on blogs and via emails to people who had expressed an interest in the work. People signed up to events in Glasgow and Edinburgh via Eventbrite. Around 40 people attended each event. Participants took part in facilitated round table discussions about sex and gender in data. This included academics, members of the public and representatives from public sector organisations.
“The atmosphere at the public events was constructive. People engaged respectfully during the round table activities. This allowed evidence to be gathered from a wide range of individuals and organisations.
The work is still underway, but this is a good case study in engaging with a range of users in an open and transparent way.”
Want to know more?
Contact the Scottish Government’s Office of the Chief Statistician via email: firstname.lastname@example.org.
Invite people to join user groups and set up regular meetings.
Carry out user research. These case studies from the Office for National Statistics (ONS), the Cabinet Office and the Office of Rail and Road outline approaches you might wish to consider.
In 2018, statistical outputs from the Office of Rail and Road (ORR) were spread across three sites: the main ORR website, the ORR’s internal content management system and a dated data portal capable only of creating downloadable Excel data tables.
Users were having difficulty finding ORR statistics, and demand for more flexible outputs was growing. The team at ORR saw an opportunity to give the data portal a much-needed refresh and merge all their outputs into one efficient, user-friendly site.
In the development stage, the team held a series of workshops with different stakeholders to understand their needs. They also conducted an online user survey and did extensive user testing. They engaged with other directorates within ORR, key players across the rail industry and other government departments. They engaged with external stakeholders through the Rail Statistics Management Group.
The team then constructed four user personas and created targeted content on one webpage to meet their specific needs. These needs varied from statistical releases to infographics and factsheets – a lot of ground to cover. The team appointed an IT consultant to build and help design the new platform.
In July 2019 the team launched their new data portal. They received some excellent feedback from users, particularly on the format, accessibility, content and visualisations. They also received positive feedback from the Office for Statistics Regulation in October 2019 and were a runner up for the GSS Presentation and Dissemination Committee award in February 2020.
Examples of improvements the team made:
- new data on delay compensation claims
- new measures published on passenger rail performance
- building team capability in Power BI
- launching new self-service graphics for users, allowing for further interrogation of datasets
- the production of animated charts, using in-house R coding skills to show time-series changes
Going forward, the team plan to improve the user experience and will continue to consult their stakeholders where possible. They launched a user survey in February 2020 and are currently working on a new implementation plan. They also plan to share their improvements with other government departments.
The new portal was successful due to the commitment to best practice guidelines and extensive user engagement.
Want to know more?
If you’d like to learn more about this work, email the Information and Analysis team at ORR: email@example.com
The Connected Open Government Statistics (COGS) project is working to improve the way users can find and use government statistics. They want to transform government statistics into machine-readable, 5-star open linked data. They want to improve the findability, usability and interoperability of GSS statistics.
As part of this work they needed to understand how statistics are being used by the wide variety of end-users who rely on them. By understanding what people are doing with government statistics they can ensure current functionality is maintained.
They decided to use an approach called Jobs To Be Done (JTBD) for their user research, to help them understand end-users’ motivations and needs. This approach focuses on the “jobs” the user is trying to achieve, rather than on user demographics or product characteristics. This allowed them to get straight on with understanding their users and getting insights about what they do. It also ensured that their work would relate directly to what their users are trying to achieve, and make their “jobs” easier, quicker and more straightforward.
A “job” in this context is what a user is seeking to accomplish in a given situation. It’s not necessarily what they’re doing, but their end goal and their drivers for doing so. To help them capture this information one of the tools they used was a Value Proposition Canvas (VPC). The VPC allowed them to get information about the user’s jobs, and also the pains and gains associated with each job. Additionally, they were also able to capture information about how to relieve pains and create gains. This also added context to the information they captured about jobs users were doing.
They started by completing VPCs with user groups, allowing them to create job stories which were helpful to the development team. To support this, they also provided acceptance criteria alongside the job stories to give more context about user needs. They currently have about 40 job stories covering the key themes of:
- finding data
- interrogating data
- data presentation
- harmonisation and standardisation of data
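Job stories in the JTBD approach are commonly written in the form “When [situation], I want to [motivation], so I can [outcome]”, with acceptance criteria attached. The sketch below shows one way to capture that structure; the example story and criterion are hypothetical, not taken from the COGS project.

```python
from dataclasses import dataclass, field

@dataclass
class JobStory:
    """One JTBD job story with its acceptance criteria (illustrative structure)."""
    situation: str
    motivation: str
    outcome: str
    acceptance_criteria: list = field(default_factory=list)

    def as_text(self) -> str:
        # Render in the common "When ..., I want to ..., so I can ..." form.
        return f"When {self.situation}, I want to {self.motivation}, so I can {self.outcome}."

# A hypothetical job story under the "finding data" theme.
story = JobStory(
    situation="I am comparing regions",
    motivation="filter a dataset by geography",
    outcome="find figures for my local area",
    acceptance_criteria=["Geography filters use standard ONS area codes"],
)
print(story.as_text())
```

Keeping stories in a structured form like this makes it easy to group them by theme and hand them, with their acceptance criteria, to a development team.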
The JTBD approach allowed them to quickly get insights about how government statistics are being used. It kept the user research focused on what users need, which is what they have tried to keep the whole project centred on. Keeping the focus on the “job” end-users are doing also helps build a complete understanding of what users are trying to achieve. This enabled them to collect contextual information to share with the rest of the team and make informed decisions about developing their products and services.
Want to know more?
If you have any questions about the Jobs To Be Done approach, or the COGS project e-mail John Lewis, senior user researcher at the Office for National Statistics: firstname.lastname@example.org
Former Prime Minister Theresa May announced the launch of a Race Disparity Audit in August 2016. The aim of the project was to gather and publish data collected by government about the different experiences of the UK’s ethnic groups. The findings from the audit would then be used to influence government policy.
The Race Disparity Audit was published in October 2017. Data from the audit is regularly updated on the Ethnicity facts and figures website. This allows the public to compare the experiences of people from a variety of ethnic backgrounds.
When developing the Ethnicity facts and figures website, the project team used the Code of Practice for Statistics to ensure their statistics were trustworthy, high quality and of public value. They also used the 14 criteria of the Digital Service Standard to ensure their digital service was good enough for public use.
Hundreds of users were interviewed or involved in user research as part of the project. The agile methods the project team employed led to a variety of user research techniques, depending on the stage of the project.
The team first carried out contextual interviews with different user groups. Weekly usability lab sessions were then used to test prototypes of the website and different presentations of the data. Before launch, the team held workshops and ran a private beta phase of the website. Through these, over 70 stakeholders and representative users fine-tuned the information architecture and content design of the final product.
It was important the team understood user needs. They also wanted to understand how these needs are met elsewhere, and how the Ethnicity facts and figures website could solve user problems.
The groups of users the team engaged with included:
- members of the public from diverse backgrounds and ethnicities
- government policy officials and analysts
- non-governmental organisations
- public service managers (for example, headteachers and Jobcentre managers)
- journalists and the media
After launch, the team also gathered feedback on the website content. They used a short online survey and presented the content to other government departments, local authorities and academics. This led them to produce analytical reports such as ethnic group summaries.
The project team said:
“We learned that the needs of users vary a great deal, as does their understanding of statistical data. The content of the website has to be clear and meaningful for everyone, including those who aren’t experts in statistics and data. Users with more expertise in statistics often need more detailed background information and access to the raw data. The website must also be accessible, so that people with disabilities can use the service.”
Want to know more?
Social media and newsletters
Use social media to promote your statistics and start conversations with users. Remember to adhere to civil service and departmental rules on the use of social media.
You could also start a regular newsletter about your statistics and ask people to sign up.
Use Google Trends to find out what sort of questions users are asking about your statistics or, more generally, the area you work in.
Then directly address these questions in your publications.
This BBC Newsbeat article about Brexit does this.
Make use of online user forums like StatsUserNet (an interactive website for communication between users and producers of statistics). These are particularly useful if your users are active on these forums.
This guidance is reviewed annually.