GSS user engagement case studies

These case studies provide you with examples of successful user engagement from across government.

They are grouped by common themes and support our user engagement strategy for statistics.

We will add more case studies as they become available. If you would like to share a case study, please email Engagement.Hub@ons.gov.uk.

Collaborating across boundaries

The task

During the 1990s the Construction Market Intelligence and Housing Statistics Teams in the Department of the Environment, Transport and the Regions (DETR) produced a range of Official Statistics on the housebuilding and construction sectors. The statistics served a wide range of external customers including the Office for National Statistics (ONS), academics, industry analysts and industry professionals. The statistical teams wanted to gather the views of users to improve their statistical products, but there was no established user group covering these topics.

The approach

Statisticians in DETR decided to set up the Consultative Committee for Construction Industry Statistics (CCCIS) to enable them to consult users about the outputs and plans for development. The Committee was administered by the department, with members invited to join, and would focus on topics set by the statistical teams. Meetings were held twice a year.

Over subsequent years, CCCIS has changed its remit and membership to reflect the development of National Statistics and increased focus on user engagement. National Statistics are now also described as accredited official statistics. The committee now covers construction and housing statistics produced by ONS, the Ministry of Housing, Communities and Local Government (MHCLG) and the Department for Business and Trade (DBT). Membership is now open to all users who wish to join. Members can suggest agenda items and present their own research and analysis for discussion.

DBT administers the committee, hosting regular meetings as well as providing information directly to members. CCCIS is chaired by the Grade 6 Head of Business Statistics, with administrative support provided by a junior member of the Business Statistics Team.

The impact

CCCIS has had two significant impacts. Firstly, it has ensured that users are kept informed about changes across the range of official data on construction and housebuilding, and it has provided valuable support for statistical development. In particular, the committee supported the development and accreditation of the ONS National Statistics on construction output, new orders, and prices, and was mentioned in the Office for Statistics Regulation letters confirming National Statistics status for these releases in 2019.

Secondly, the committee has helped to build a network of people interested in analysis of the construction and housebuilding sectors, building relationships and improving analysis and research outside government.

More information

Contact the chair, Frances Pottier, by emailing MaterialStats@BEIS.gov.uk.

The task

Housing is a devolved policy area across the UK’s four nations. Currently, more than 20 departments and public bodies publish housing and planning statistics. This means that finding the right statistics for the right area can be time-consuming. In November 2017, the Office for Statistics Regulation (OSR) published a systemic review of UK housing and planning statistics which highlighted this disparate statistical landscape as a barrier to accessing official statistics.

The approach

Producers across the Government Statistical Service (GSS) agreed on a collaborative approach. A housing and planning statistics working group was formed which involved colleagues from a range of government departments and the Devolved Administrations. This group’s aim was to share insights and engage users in a more coherent way.

Working together, the group identified and collated all official housing and planning statistics produced across the UK. Contextual information was written with oversight from the group to accurately reflect the statistical differences arising from the devolved nature of housing statistics, which had been identified as an important user need. This work culminated in the publication of the GSS interactive tools in September 2019 – a series of web-based interactive tools that empower users to search, filter, and explore the UK housing statistics landscape.

To promote the tools and seek feedback, producers continued to work together, sharing user engagement tips and ideas to cast a wide net. By actively seeking feedback through mailing lists and stakeholder events across the UK, we could include more information and optimise the accessibility and functionality of the tools.

The impact

User feedback has been overwhelmingly positive. We have also been able to analyse the tools’ web metrics to understand how users interact with them. Again, doing this as an exercise with all statistical producers has helped us to collectively identify more effective engagement strategies. This has helped drive a growing uptake of the tools. The approach taken by the GSS Strategy Delivery Team has now been adopted by other groups across the Civil Service that are also looking to improve the accessibility of their statistical areas.

These tools have been cited as exemplars in applying the principles of the Code of Practice for Statistics (V2: Accessibility – housing; V3: Clarity and insight – homelessness) and have been nominated for a GSS Excellence Award. The OSR’s two-year follow-up report on housing statistics identifies the tools as having made a significant contribution to improving the value of UK housing and planning statistics, with other OSR reviews on health and care and on mental health also citing the tools positively.

Working together as part of a cross-government initiative, we were able to achieve what we could not have done in isolation, which was a comprehensive overview of the intricate jigsaw of UK housing statistics. We are dedicated to working collaboratively to ensure that we continue to deliver an outstanding service to users.

More information

If you would like to know more, please get in touch with Alex Amaral-Rogers by emailing Alexander.Amaral-Rogers@ons.gov.uk

The task

Esther Sutherland (Social Survey division in the Office for National Statistics) and Ian Boreham (GSS Strategy and Delivery division) set out to improve the accessibility and coherence of housing and planning statistics.

The approach

They conducted an external user engagement survey. The online survey ran for eight weeks in April and May 2019 and asked users for feedback on the housing and planning statistics published across the GSS. Read more about the results of their survey.

The lessons learned

  1. Planning processes – while designing the survey they sought lots of feedback, spoke with statistics producers, tested the survey with users and got constructive feedback from the GSS Best Practice and Impact division. This effort up front was essential to ensuring they collected good quality data.
  2. Promoting the survey – they invested a lot of time promoting the survey and used a variety of communication channels. They also worked collaboratively with other government departments, who also promoted the survey.
  3. Design to analyse the survey – they designed their survey with lots of open questions, and were pleased that respondents provided lots of information in the free text boxes – over 15,000 words! However, this comes with an associated cost in analysis, so think about the trade-off between closed and open questions. While closed questions don’t give you such rich insight, they are simpler to analyse and disseminate.
  4. Collaboration – they engaged with the Data Science Campus at the Office for National Statistics. The campus applied machine learning techniques to the data, which gave them a solid base to start the analysis from and provided results within 24 hours of the survey closing (see the sketch after this list).
  5. Think through how you will use the results – different sections of their results were relevant to different departments, depending on the statistics they produce. They are now considering how they can package the findings to ensure each department gets the information they need to take the work forward. On reflection, they should have considered how they might do this earlier on. They could have made changes to the survey design and subsequent analysis to make this process more straightforward.
  6. Digital tools – it has never been easier to run a light touch consultation, with many relatively inexpensive products and an increasing ability to automatically analyse free text.
  7. Timing – eight weeks may seem long, but it allows different communication methods to be used and a greater variety of users to be approached. They also had to take account of purdah when planning the release and promotion of the survey.
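
The case study does not say which machine learning techniques the Data Science Campus used, so the sketch below is purely illustrative: it groups open-ended survey responses into rough themes using TF-IDF weighting and NMF topic modelling from scikit-learn. The example responses and the number of themes are invented for the demonstration.

```python
# Illustrative sketch only: automated theme extraction from free-text survey
# responses using TF-IDF and NMF topic modelling (not the Data Science
# Campus's actual pipeline).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

# Hypothetical free-text responses exported from the survey tool
responses = [
    "I use the housing statistics to compare affordability across regions",
    "Please publish the planning data as open CSV files",
    "The bulletins are hard to find on the website",
    "More local authority level breakdowns would be useful",
]

vectoriser = TfidfVectorizer(stop_words="english")
tfidf = vectoriser.fit_transform(responses)

n_themes = 2  # in practice, chosen by inspecting several values
model = NMF(n_components=n_themes, random_state=0)
model.fit(tfidf)

# Print the most heavily weighted words for each discovered theme
terms = vectoriser.get_feature_names_out()
for i, topic in enumerate(model.components_):
    top_terms = [terms[j] for j in topic.argsort()[-5:][::-1]]
    print(f"Theme {i + 1}: {', '.join(top_terms)}")
```

In practice the automatically generated themes would be reviewed by an analyst before being reported; the automation simply provides a starting point for the kind of analysis described above.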

The impact

They said:

“Though there are many different ways you can do it, it is never a bad idea to conduct user engagement. The investment needed to design and undertake our survey has been easily repaid by the wealth of useful feedback received which we can continue to draw on to maximise the public value of housing and planning statistics.

We hope that our successes have inspired you and our reflections will help you think about how you can engage with your users to ensure that we are enhancing the public value of our statistics across the GSS.”

The task

For the first time, the 2021 Census asked the public if they have previously served in the UK Armed Forces. This question was added as it will enable data users and policy makers to meet their commitment to the Armed Forces Covenant, which ensures fair and equal opportunities for veterans. The development of the question fills a data gap outlined in the Help Shape Our Future White Paper, as there was previously no data collected on Armed Forces personnel.

It was essential to develop a consistent question and question guidance to ensure harmonisation of data collection across the UK. The Office for National Statistics (ONS) joined forces with National Records of Scotland (NRS) and the Northern Ireland Statistics and Research Agency (NISRA) to consult with internal and external stakeholders and data users to develop question guidance.

The approach

After creating an initial draft for the Census, ONS, NRS and NISRA hosted a stakeholder meeting. The team drew on relationships with stakeholders established during the Census topic consultation. Attendees included the Ministry of Defence, the Royal British Legion and Poppy Scotland, who had been instrumental in getting a question included in the Census. The aim of the workshop was to discuss the draft guidance, gain expert knowledge and gather feedback.

Following the meeting, the guidance was tested with a diverse range of respondents to ensure the intended data was being collected accurately. Findings from the testing fed into revisions of the guidance, which was then presented again to stakeholders at a follow-up meeting.

The impact

ONS, NRS and NISRA were able to gather critical, insightful feedback from stakeholders to develop effective, implementable, and relevant guidance. The team were able to meet the needs of the intended data users through continual revision and engagement. The collaborative approach meant that the work was valuable and meaningful, as it formed part of the 2019 Census rehearsal and 2021 Census guidance.

The development of the question means that the data is more accurate and comparable and meets the Covenant’s commitments and the government’s legal obligations. The government has committed to honour the Covenant, including by providing equal access to services.

The team have developed and established an enthusiastic network of stakeholders. The network was then called upon to encourage participation in the Census, giving the team wider reach and access.

Please note: NISRA did not include the question and guidance; however, its input was useful in ensuring the representativeness of the data.

Want to know more?

Please get in touch if you’d like to know more: Adam.Kelly@ons.gov.uk. Or, to read more on the UK’s censuses, see the England and Wales Census, the Scotland Census and the Northern Ireland Census.

The task

The Cabinet Secretary for Social Security and Older People asked the Scottish Government’s Chief Statistician to prepare guidance to support better collection of data on sex and gender by public bodies. Good quality data on sex and gender will help the Scottish Government deliver its vision for Scotland. However, data on these characteristics is a sensitive area, and the language and terms used are very important to people.

The approach

They set up regular working group meetings and spoke with their analytical and policy colleagues to identify the right people to involve in this group. Membership comprised professionals from across statistical services and relevant public sector bodies. Once a membership list was established, formal letters were sent with an invitation to join the working group.

They spoke to a whole range of organisations, individuals and groups. They asked about the definitions that people use, the uses of sex and gender data and the different approaches to collecting this data. They were also interested to know what people had seen work well and less well. The working group sought advice from internal colleagues on how best to engage with stakeholders.

Two public events were organised to take the conversation wider. These events were advertised on blogs and via emails to people who had expressed an interest in the work. People signed up to events in Glasgow and Edinburgh via Eventbrite. Around 40 people attended each event, including academics, members of the public and representatives from public sector organisations. Participants took part in facilitated round table discussions about sex and gender in data.

The impact

They said:

“The atmosphere at the public events was constructive. People engaged respectfully during the round table activities. This allowed evidence to be gathered from a wide range of individuals and organisations.”

More information

Contact the Scottish Government’s Office of the Chief Statistician by emailing Statistics.Enquiries@gov.scot. You can access the guidance on collecting and publishing data about sex, gender identity, and trans status on the Scottish Government website.

The task

As part of the annual Measuring Tax Gaps publication, His Majesty’s Revenue and Customs (HMRC) produces estimates of the size of the illicit alcohol and tobacco markets.

In the 2020 publication, a change to a survey in one of the external data sources meant that the 2018 to 2019 tobacco tax gap estimates had to be forecast. Prior to this, estimates for the tax years 2005 to 2006 up to 2017 to 2018 used the latest outturn data.

The Office for Statistics Regulation had recommended a review of an important assumption in the alcohol tax gap methodology. Other potential development areas in the alcohol and tobacco tax gaps were also identified, so a group was needed to focus on this work.

The approach

HMRC spoke to their working-level stakeholders to determine who else could support the development of these complex measurements.

The Tobacco and Alcohol Tax Gap Steering Group was set up, bringing together expertise from colleagues across the department who have an interest and insight into the illicit alcohol and tobacco markets. The group includes colleagues from intelligence, policy and strategy and operations.

The purpose of the steering group was to:

  • set and agree the direction of the development priorities
  • provide expertise on the tobacco and alcohol markets
  • identify data sources and main stakeholders to support delivery of the priorities
  • champion and promote the value of the tobacco tax gap estimates
  • provide high level quality assurance and scrutiny of the methodology changes and estimates before the publication

HMRC also reached out to external industry experts (who they had previously engaged with) to broaden their knowledge of the tobacco and alcohol markets and available data. It was also an opportunity to gather feedback on their suggested development areas.

The impact

This has successfully ensured a joined-up approach to developing the estimates. It has also encouraged an appreciation of the complexity of the modelling and ensured high levels of engagement, leading to productive conversations on topics outside the original scope.

HMRC have gained better insight into additional data sources and methods to produce greater value in the tobacco and alcohol tax gap estimates for future publications.

The steering group has provided a forum for joined-up engagement and will continue to support the substantial programme of development work for both the tobacco and alcohol tax gaps.

More information

For further information, you can contact Rani Nandra at HMRC by emailing Rani.Nandra@hmrc.gov.uk.

Engaging with the media

The task

The Welsh Government has published a wide range of statistics on its response to the coronavirus (COVID-19) pandemic as well as the effect it has had on society in Wales.

The accuracy of media reporting of these statistics plays an important role in wider public understanding. The Welsh Government sought to provide opportunities for journalists to find out more about the detail behind the statistics and how they should be interpreted.

The approach

Technical briefing sessions were held for the media on testing and contact tracing statistics. These were both high-profile weekly statistical releases produced rapidly in anticipation of a demand for accessible data. Press office, policy officials and statisticians worked together to facilitate the sessions.

In both cases, the statistics were derived from administrative data sources with their own inherent quality and coverage issues. The sessions gave statisticians the opportunity to explain how this affects the methodology underpinning certain indicators and the impact on the reporting. Presentations were delivered jointly by policy officials and statisticians which ensured that the operational context of the subject was clearly set out alongside the technical detail behind statistical indicators. Providing a rounded picture linking the statistical reporting back to the day-to-day operation of the systems helped journalists to understand why the statistics are reported as they are and any limitations.

The impact

The sessions were attended by a range of media outlets and were well received. They provided an opportunity for attendees to ask specific and detailed questions to further their understanding of the topics. Welsh Government statisticians were able to address the more common media queries on certain measures, leading to a reduction in enquiries of this nature after the publication of the weekly statistics.

The sessions further strengthened the relationship between statisticians, policy officials and press office that had been built through the rapid development of these statistics. Media outlets were able to understand the detail behind newly published measures which led to accurate and more detailed reporting. The Welsh Government is continuing to hold media technical briefing sessions on statistics in other topic areas.

More information

For more information please email KAS.COVID19@gov.wales.

The task

During the coronavirus (COVID-19) pandemic, NISRA helped users understand the three different definitions used to report the number of COVID-19 deaths. This aimed to avoid misinterpretation, confusion and loss of user trust in the statistics.

The approach

The NISRA Vital Statistics Unit identified the need to work across teams to ensure their messaging was reaching the press. The statistics team embedded two members of the Press Office within the statistical production team for the weekly death statistics. In collaboration with the Press Office, a statistical press notice was developed alongside the statistics. The Press Office organised a closed, virtual, media briefing which gave NISRA statisticians a vital opportunity to talk through the statistics and associated definitions in a controlled environment. The first briefing was attended by around 35 members of the media.

The impact

These were very successful events and had a notable impact on improving the media’s understanding and the subsequent accuracy of reporting. They have also enabled a closer working relationship to develop between NISRA statisticians and journalists, and allowed members of the press to understand and appreciate the complexities of the statistical production process. Due to the success of these events, the approach has been embedded by the Vital Statistics Unit as a more ‘business as usual’ part of the statistical dissemination process, with the most recent event, in December 2020, covering the release of the Covid-19 related deaths and pre-existing conditions in Northern Ireland report.

More information

For more information, please email Deborah.Lyness@nisra.gov.uk

Gathering user insight

The task

Public Health England (PHE) created an online dashboard to provide the public and data professionals with up-to-date COVID-19 data. The dashboard is updated daily, with UK data at national and local levels. The data presented includes cases, deaths, tests and healthcare-related figures. Since the dashboard launched, more features have been added based on user feedback, and as more data becomes available. For example, people can now search for data by postcode.

The approach

PHE used a user-centred approach to design the dashboard. This helped achieve high standards of usability and accessibility.

To enable users to understand what is happening with COVID-19 and to create a coherent narrative, PHE helped users to:

  • find the information they are looking for
  • interpret the data correctly

The main concern has been to understand:

  • who the COVID-19 dashboard users are
  • which devices they use
  • what they want to achieve

PHE found that the primary audience is members of the public, with a skew towards older age groups. There is also a secondary audience of professionals, and more than 63% of dashboard visits are made on mobile phones.

To understand what users need, PHE gathered feedback through qualitative and quantitative user research and made iterative improvements based on that.

Through this process, they have:

  • launched three versions of the design
  • carried out four user surveys (38,000 responses in the last survey)
  • carried out one-on-one user research sessions with members of the public, data professionals (journalists, public health data analysts in local authorities, HR staff, researchers, and others) and users with disabilities – participants (116 in total so far) were recruited from pools of volunteers, with the aim of achieving a good representation of user personas
  • analysed over 23,500 emails sent to the feedback email address

The impact

As a result of the user-centric approach, the dashboard design evolved with the users’ changing needs. Consequently, usability, user satisfaction and trust have increased substantially over time.

The outcomes of the last survey showed that:

  • 89% of respondents found the graphs in the COVID-19 dashboard very easy or easy to understand
  • 93% trust the data in the COVID-19 dashboard
  • 91% are satisfied with the COVID-19 dashboard
  • 92% are very likely or likely to recommend the COVID-19 dashboard

More information

If you would like to know more, please contact Anxo Roibas by emailing Anxo.Roibas@phe.gov.uk.

The task

Following the publication of the 2016 Northern Ireland House Condition Survey (NIHCS), and in response to users’ needs, the Housing Executive commissioned the Building Research Establishment (BRE) to provide up-to-date estimates of fuel poverty and to examine the impact of changes in fuel prices on fuel poverty in Northern Ireland (NI). BRE produced two reports: the ‘Northern Ireland fuel price ready reckoner for fuel poverty’ and ‘Estimates of fuel poverty in Northern Ireland in 2017 and 2018’.

In May 2019 the Housing Executive held two workshops for users of NI fuel poverty statistics. The aims of the workshops were:

  • to help users gain a better understanding of the method used to produce fuel poverty figures for NI, and how it compares with the methods used in England, Scotland and Wales
  • to provide information on how to use the fuel price ready reckoner
  • to provide information on how the fuel poverty estimates for 2017 and 2018 were produced
  • to get feedback from users about how well the statistics meet their needs and to get their views on possible improvements to the current method

Each workshop had three presentations:

  • measuring and estimating fuel poverty
  • fuel poverty ready reckoner
  • 2017 & 2018 estimates of fuel poverty

View the presentation slides.

Our approach

Through ongoing informal discussions with the key users of fuel poverty statistics, we were aware that users would benefit from an in-depth explanation of the Northern Ireland fuel poverty model from the technical experts. We have a number of known key users, who were invited to attend. We also keep a record of any NIHCS data requests and we used this to identify other users. In addition, our key users were invited to nominate other potential delegates. We wanted the workshop to be interactive and for delegates to have the opportunity for discussion in a small group, so we held two sessions (one in the morning and one in the afternoon) with approximately ten delegates at each.

The impact

Following the workshops, delegates were invited to complete an online feedback survey; you can view the results of the survey.

The workshops received very positive feedback from users, with delegates indicating that as a result of attending they had a better understanding of how fuel poverty is calculated in NI and the rest of the UK, how to use the fuel price ready reckoner for fuel poverty, and how the fuel poverty estimates for 2017 and 2018 were produced.

Want to know more?

View the discussion points from the workshop.

For any other information please email NIHCS@nihe.gov.uk

Alternatively, you can contact Donna.Mclarnon@nihe.gov.uk or Jahnet.Brown@nihe.gov.uk

The task

The content design team at the Office for National Statistics (ONS), with the support of a user researcher, wanted to understand how people access and use ONS statistical bulletins’ content, using evidence to identify the topics that matter to users. The team did extensive user research and used this evidence to improve bulletins, by increasing their relevance, making them easier for users to find and simpler to understand.

The approach

The team used a mixture of qualitative and quantitative methods to gather evidence, including:

  • 16 face-to-face interviews with bulletin users
  • surveys for several high-profile releases to understand how people’s needs vary across topics
  • developing new analytics dashboards that highlighted user journeys, the reading age of the bulletin and the time users spent reading it (see the sketch after this list)
  • embedding polls in bulletins to get a high volume of instant feedback from users
  • creating heatmaps to track in granular detail how users are behaving on a page
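
The case study does not say how “reading age” was measured, so the sketch below simply illustrates one common way to estimate it: a readability formula such as the Flesch-Kincaid grade level, computed here in plain Python with a deliberately rough syllable count. The sample text is invented.

```python
# Illustrative sketch of estimating a "reading age" with the Flesch-Kincaid
# grade formula. This is not necessarily the measure used in the ONS
# dashboards; the sample text is invented.
import re

def count_syllables(word: str) -> int:
    """Very rough syllable count: runs of vowels, minimum of one."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59"""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

sample = "Prices rose more quickly this year. This was the fastest rise for three years."
grade = flesch_kincaid_grade(sample)
print(f"Estimated US grade level: {grade:.1f} (reading age is roughly grade + 5 years)")
```

A production dashboard would typically use a tested readability library and handle edge cases such as abbreviations and numbers, but the underlying idea is the same.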

After extensive testing and iteration of prototypes, the team developed templates and guidance on ONS bulletin content to more clearly fit users’ tasks.

This focussed on:

  • providing simple, concise analysis which is useful to all types of user
  • prominently flagging essential issues of uncertainty which affect the commentary
  • improving on-page navigation to help users get to the content that matches their task
  • adding new methodology sections highlighting strengths, limitations and data sources
  • using naming conventions that match users’ search terms

The impact

All new ONS bulletins now follow a consistent structure, approach and format. The word count in these bulletins has been drastically reduced, each section of the bulletin has a clear purpose and there is a greater use of charts to tell users the story behind the data.

User feedback to new-style bulletins has been positive. Around 15 times more users read to the end of the redesigned analysis sections of new-style releases compared to the older versions. On average, there has been a 50% increase in the amount of content that users consume on new-style bulletins.

For some teams, this led to a 41% increase in pageviews, 80% user satisfaction, and a 32% decrease in content nobody saw.

More information

If you’d like to know more please email Content.Design@ons.gov.uk.

The task

The Population and Household Projections team at the Office for National Statistics (ONS) are responsible for producing biennial population projections. In the 2018-based population projections release, it was proposed not to produce 2020-based national population projections (NPPs) and subnational population projections (SNPPs). This proposal was made because the releases would clash with the planned release of provisional Census 2021 results.

The proposal was to base the next population projections on Census 2021 data. Using Census 2021 data would allow more accurate projections, through use of an updated base population from Census 2021 and a revised back series of earlier years of data.

Since the coronavirus (COVID-19) pandemic there has been heightened interest in population changes and how these may have affected the population projections. To understand user needs on the proposal, ONS invited users to give feedback on the proposed approach and timing of the next NPP and SNPP releases.

Our approach

The team considered their approach, before deciding to contact users of population projections. They took a three-step approach to identifying the widest possible network of interested groups and individuals.

Step one involved sifting through the team mailbox to map users who had been in contact. Step two was to map out existing contacts, including other government departments, local authorities, city regions, clinical commissioning groups, academic experts, the GSS Housing and Planning Statistics Working Group and other groups. Step three involved writing to other projections users subscribed to the team’s mailing lists.

Information on the project was published on the website, asking for user feedback on:

  • their timescale preferences for future publications
  • how they would use the projections
  • whether they needed any variant projections
  • whether a shorter release would meet their needs
  • the usefulness of a new set of projections based on trends and assumptions from earlier releases
  • any other general comments about the projections

The team categorised their users into groups based on their intended use of the projections.

The impact

The main benefits of this engagement have been:

  • greater clarity and understanding of users’ needs and uses of statistics
  • better understanding of their wide variety of users
  • further insight into the balance between quality and completeness and user needs for timely data
  • more UK-wide feedback through working with the devolved administrations
  • improved transparency with users as the results were published in a User Feedback report
  • positive feedback from the Office for Statistics Regulation, stakeholders, and users

The team have since been able to adapt their workplan, planned actions and engagement according to their users’ needs. User feedback showed that there is a need for 2020-based national population projections, and a strong desire for integration of Census 2021 data to provide the highest possible quality and timeliness. The team also had the required evidence to finalise the decision not to produce 2020-based SNPPs.

Lessons learned

The team received more responses than expected and had to enlist support for the analysis and quality assurance of the feedback.

To maintain contact with users who responded, the team published a short update after the engagement exercise had closed. This thanked users for their responses, stated the number of responses received and explained the planned next steps. The update reduced the gap in communications ahead of the publication of the final report.

More information

Following this 2020 engagement and the publication of 2020-based and 2021-based interim releases of national population projections, we are conducting a new engagement called User needs from 2022-based national, subnational and household projections. This involves inviting views on output needs to understand what users would like to see in these publications, such as different projection scenarios or new datasets. The results of the 2024 engagement will be published in late spring or early summer. The next projections releases have been preannounced on the ONS release calendar.

For further information or if you have any questions, please email the team at Projections@ons.gov.uk.

The task

The Population and Public Policy (PPP) forums were developed to address the need to engage more with audiences on the Office for National Statistics’ (ONS) outputs. They have three objectives, which are:

  1. to build stronger relationships between ONS and important policy makers/influencers
  2. to gain insight into the requirements of policy makers and influencers
  3. to raise ONS’ profile amongst important stakeholders and position ONS as a thought leader in relevant policy areas

The approach

The forums began in Autumn 2017. They have been on a range of topics including:

  • sustainable development goals and climate change
  • young people
  • human capital
  • ageing
  • equalities and inclusion
  • housing

ONS teams present on their analytical work aligned to these topics. External speakers are also asked to provide insight into how they use government data in effective decision making. These speakers have ranged from analysts in other government departments to directors of research in think tanks.

The use of panel sessions at these events has allowed ONS to gain feedback and insight from the audience on their priorities and interests. At the young people’s forum, the panel consisted of young people talking about their experiences, ranging from social media, to education and the labour market. This was particularly impactful.

Breakout spaces are used to allow attendees to share their views on specific elements of ONS’s work programme, directly with ONS staff.

The PPP forums are focused on an external audience. Invitations are sent to policy officials and researchers from a range of influential organisations. These often include the following sectors:

  • government
  • think tanks and research organisations
  • charities
  • trade bodies
  • academia
  • local government
  • business

The ONS business area responsible for the forum topic helps to develop the aims and ambitions, as well as the content for the forum. The ONS external affairs team provides support on the planning and the invite list.

The impact

The forums have been successful, with positive feedback received and increasing engagement and attendances. Following each forum, ONS teams have continued to engage with contacts, to develop ONS work plans and to provide support and advice on their work. Feedback and insights into users’ requirements have also helped direct future work at ONS.

The forums have been successful at showing ONS’ commitment to engaging with users and highlighting ONS’ analytical plans for these topics.

More information

If you’d like to know more, please contact Nick Mavron by emailing Nick.Mavron@ons.gov.uk.

The task

In 2018, statistical outputs from the Office of Rail and Road (ORR) were spread across three sites: the main ORR website, the ORR’s internal content management system and a tired data portal, only capable of creating downloadable Excel data tables.

Users were having difficulty finding ORR statistics, and demand for more flexible outputs was growing. The team at ORR saw an opportunity to give the data portal a much-needed refresh and merge all their outputs into one efficient, user-friendly site.

The approach

In the development stage, the team held a series of workshops with different stakeholders to understand their needs. They also conducted an online user survey and did extensive user testing. They engaged with other directorates within ORR, important people from across the rail industry and other government departments. They engaged with external stakeholders through the Rail Statistics Management Group.

The team then constructed four user personas and created targeted content on one web page to meet their specific needs. These needs varied from statistical releases to infographics and factsheets – a lot of ground to cover. The team appointed an IT consultant to build and help design the new platform.

The impact

In July 2019 the team launched their new data portal. They received some excellent feedback from users, particularly on the format, accessibility, content and visualisations. They also received positive feedback from the Office for Statistics Regulation in October 2019 and were a runner up for the GSS Presentation and Dissemination Committee award in February 2020.

Going forward, the team plan to improve the user experience and will continue to consult their stakeholders where possible.

The new portal was successful due to the commitment to best practice guidelines and extensive user engagement.

More information

If you’d like to learn more about this work, email the Information and Analysis team at ORR: Rail.Stats@orr.gov.uk.

The task

Former Prime Minister, Theresa May, announced the launch of a Race Disparity Audit in August 2016. The aim of the project was to gather and publish data collected by government about the different experiences of the UK’s ethnic groups. The findings from the audit would then be used to influence government policy.

The Race Disparity Audit was published in October 2017. Data from the audit is regularly updated on the Ethnicity facts and figures website. This allows the public to compare the experiences of people from a variety of ethnic backgrounds.

The approach

When developing the Ethnicity facts and figures website, the project team used the Code of Practice for Statistics to ensure their statistics were trustworthy, high quality and of public value. They also used the 14 criteria of the Digital Service Standard to ensure their digital service was good enough for public use.

Hundreds of users were interviewed or involved in user research as part of the project. The agile methods the project team employed led to a variety of user research techniques, depending on the stage of the project.

The team first carried out contextual interviews with different user groups. Weekly usability lab sessions were then used to test prototypes of the website and different presentations of the data. Before launch, the team held workshops and ran a private beta phase of the website. Through these, over 70 stakeholders and representative users fine-tuned the information architecture and content design of the final product.

It was important the team understood user needs. They also wanted to understand how these needs are met elsewhere, and how the Ethnicity facts and figures website could solve user problems.

The groups of users the team engaged with included:

  • members of the public from diverse backgrounds and ethnicities
  • government policy officials and analysts
  • non-governmental organisations
  • academics
  • public service managers (for example, headteachers and Jobcentre managers)
  • journalists and the media

After launch, the team also gathered feedback on the website content. They used a short online survey and presented the content to other government departments, local authorities and academics. This led them to produce analytical reports such as ethnic group summaries.

The impact

The project team said:

“We learned that the needs of users vary a great deal, as does their understanding of statistical data. The content of the website has to be clear and meaningful for everyone, including those who aren’t experts in statistics and data. Users with more expertise in statistics often need more detailed background information and access to the raw data. The website must also be accessible, so that people with disabilities can use the service.”

More information

The Ethnicity facts and figures website

Email: Ethnicity@CabinetOffice.gov.uk

Reaching new audiences

The task

The Office of Rail and Road (ORR) wanted to maximise the public value of their statistics and provide a forum for all types of users (media, government departments and members of the public) to ask questions about their most popular annual statistical publication, the ‘Estimation of station usage’. ORR decided to host a live Twitter question and answer (Q&A) session.

The statistics team at ORR had previous experience using Twitter in conjunction with the ORR Communications Team and knew users would often look to ORR’s Twitter name (handle) for the latest statistics.

Twitter is now known as ‘X’. The team continue to run Q&A sessions for this publication on X every year.

The approach

Preparation work involved using existing skills within the Statistics Team to create animated charts (GIFs) for the statistical release, as well as collaborative sessions between the responsible statistician and ORR’s Communications Team. Together they created a pre-recorded introductory video and live-stream interview giving information about the publication.

On publication day, the responsible statistician and members of ORR’s Communications Team were in a virtual meeting room (before COVID-19 they sat side by side in the same physical location). The responsible statistician worked closely with the Communications Team before tweeting agreed responses to questions asked. Most questions could be answered straight away, while others that required additional information or support took slightly longer but were generally answered on the same day.

The impact

The main aim of the live Twitter Q&A was to promote the statistics (and ORR) through proactive and direct engagement with users. Infographics and videos were well integrated into the Twitter activity, and ORR received positive feedback on the format and the proactive, timely engagement with users. Furthermore, the Estimates of Station Usage statistical release was assessed in 2020 by the Office for Statistics Regulation for National Statistics accreditation, and this form of user engagement was praised. National Statistics are now also described as accredited official statistics.

The recorded introductory YouTube video posted on Twitter received over 200 views and the live interview had nearly 1,000 views on publication day. The use of an introductory video and live-stream interview on publication day has improved the capability and built the confidence of both the Statistics and Communications Teams to promote other ORR statistics in this way in future.

More information

If you would like to know more, please contact the team by emailing Rail.Stats@orr.gov.uk.

The task

The travel and tourism division within the Office for National Statistics (ONS) wanted to improve communication and engagement with known stakeholders and reach out to a wider audience to gather feedback on their statistics. A new role was established within travel and tourism statistics at ONS – a designated ‘stakeholder engagement’ role.

The approach

To ensure a wide range of stakeholders were being engaged with, ONS approached the well-established International Passenger Survey steering group. The group includes a range of important users across the GSS with huge knowledge and experience of the tourism sector.

Beginning with these users, a ‘snowball’ approach was used to gather as many different requirements and user needs as possible. Face-to-face and phone meetings were held to collect feedback on the statistics ONS publishes on travel and tourism. Wider user needs and ideas on improvements, both to the data sources and to the communication of the statistics, were also collected.

A series of webinars and an online questionnaire were developed and rolled out to further engage with a wider range of stakeholders who had not been contacted before. Subsequently, over a six-month period, a mailing list with regular newsletters and communications has been established, and now has over 1,000 subscribers.

The impact

These methods of communication helped to build sustainable relationships and facilitate open and frank conversations. Stakeholders can tell ONS about their uses for travel and tourism data and freely express any views and feedback. Various stakeholders have welcomed the renewed approach. Additionally, the Office for Statistics Regulation (OSR) has given the team encouraging feedback on how they have tackled users’ ongoing needs.

In response to requests, the team are scheduling webinars as part of their standard communication tools and will run them more often when publications are released. The team is well on the way to a much better understanding of what users want from travel and tourism statistics and will continue to identify areas for improvement.

Stakeholder engagement has become part of business as usual for the team and is worked on alongside other priorities. It is also a regular topic of conversations within team meetings to ensure standards are maintained.

More information

If anyone would like to hear more about the work we have done, please email Travel.and.Tourism@ons.gov.uk.

The task

The Connected Open Government Statistics (COGS) project is working to improve the way users can find and use government statistics. It aims to transform government statistics into machine-readable 5-star open linked data, improving the findability, usability and interoperability of GSS statistics.
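
As an illustration of what publishing a statistic as machine-readable linked data can look like, the sketch below uses the rdflib Python library to express a single, invented observation as RDF and serialise it as Turtle. The namespace, identifiers, property names and figure are placeholders for the example; they are not the COGS project’s actual data model.

```python
# Minimal, illustrative sketch: one invented statistical observation as
# linked data, serialised as Turtle with rdflib. Not the COGS data model.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

EX = Namespace("http://example.org/statistics/")  # placeholder namespace

g = Graph()
g.bind("ex", EX)

obs = EX["observation/housing-completions-2019-wales"]
g.add((obs, RDF.type, EX.Observation))
g.add((obs, EX.area, EX["area/wales"]))                      # reusable area identifier
g.add((obs, EX.period, Literal("2019", datatype=XSD.gYear)))
g.add((obs, EX.measure, EX["measure/new-dwellings-completed"]))
g.add((obs, EX.value, Literal(5000, datatype=XSD.integer)))  # invented figure

# Serialise as Turtle, one of the machine-readable formats used for linked data
print(g.serialize(format="turtle"))
```

Because each dimension points to a shared, resolvable identifier rather than free text, observations published this way can be linked and queried across datasets and producers.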

As part of this work they needed to understand how statistics are being used by the wide variety of end-users who rely on them. By understanding what people are doing with government statistics they can ensure current functionality is maintained.

The approach

They decided to use an approach called Jobs To Be Done (JTBD) for their user research to help them understand end-users’ motivations and needs. This approach focuses on the “jobs” the user is trying to achieve, rather than on user demographics or product characteristics. This allowed them to get straight on with understanding their users and gaining insights into what they do. It also ensured that their work would relate directly to what their users are trying to achieve and make their “jobs” easier, quicker and more straightforward.

A “job” in this context is what a user is seeking to accomplish in a given situation. It’s not necessarily what they’re doing, but their end goal and their reasons for doing it. To help capture this information, one of the tools they used was a Value Proposition Canvas (VPC). The VPC allowed them to record users’ jobs and the pains and gains associated with each job, as well as ways to relieve those pains and create those gains. This added context to the information they captured about the jobs users were doing.

They started by completing VPCs with user groups, allowing them to create job stories which were helpful to the development team. To support this, they also wrote acceptance criteria alongside the job stories to give more context about user needs. They currently have about 40 job stories covering the themes of:

  • finding data
  • interrogating data
  • data presentation
  • harmonisation and standardisation of data

The impact

The JTBD approach allowed them to quickly get insights about how government statistics are being used. It was therefore possible to keep the user research focused on what users need, which is what they have tried to keep the whole project centred on. Keeping the focus on the “job” end-users are doing also helps build a complete understanding of what users are trying to achieve. This enabled them to collect contextual information to share with the rest of the team and make informed decisions about developing their products and services.

More information

If you have any questions about the Jobs To Be Done approach, or the COGS project, email John Lewis, senior user researcher at the Office for National Statistics: John.Lewis@ons.gov.uk.