An update from ONS on talking to our users about their quality and methods information needs.
The Quality Reporting Team at ONS are developing a layered approach to communicating quality and methods information to users. This approach builds on our traditional corporate template, the Quality and Methodology Information (QMI) report, and will add further quality reporting products to make quality and methods information accessible to a broader range of users. Our approach is based on an inverted triangle of communication: most newsworthy information; important details; and other general (background) information.
Over the last year or so we’ve developed:
- Quality Summaries (the Important Points and Overview of the Outputs sections), which help less experienced users reduce the risk of misusing data
- Quality Information within Statistical Bulletins, which embeds quality and methods information in the structure of the bulletin itself.
We are now reviewing the QMI itself. To ensure this product continues to meet the needs of our users, we've partnered with colleagues in our Digital Services Division to conduct user testing.
So far, we've run three user tests:
User test 1
As part of the review of the QMI, we ran a first user test asking for opinions on the usefulness of the topics currently contained within the report. The results of this first test were very interesting: they confirmed some assumptions but also produced a couple of surprises.
What did we test?
The aim of the user test was to understand which areas of the current QMI reports are most important to users. Participants were given a card sorting exercise in which they sorted a pre-defined set of 12 cards into categories to indicate how important each piece of information was to them.
Who were the respondents?
Of the 60 people who responded to the user survey, 31 fully completed the task; this analysis is based on their responses.
- Main users are Expert Analysts (41%)
- A surprising number of Citizen Users (19%) also responded
Further QMI-specific testing will concentrate primarily on the needs of Expert Analysts, but the usefulness to Inquiring Citizens should also be improved, particularly in terms of the accessibility and clarity of the information given.
Main Points from User Test 1
- Combined across all user types, the top five Very Important or Important topics are, in descending order: Relevance, Accuracy, Output Quality, Comparability and Accessibility.
- Expert Analysts categorised all current QMI topics as either Very Important or Important.
- Policy Influencers particularly value Clarity and Output Quality.
- Inquiring Citizens require a wide range of topics, but their highest priorities are Clarity, and Concepts and Definitions.
Quality Information User Profiles
The results of user test 1 gave us breakdowns of interest in each topic by user type. We analysed these to produce User Profiles for Quality Information, which will help us ensure that we successfully communicate quality information to a wide range of users.
User test 2
What was the purpose of the second user test?
To check that we had interpreted the results of the first test correctly when creating the Quality User Profiles, we set up a second survey, targeted at specific persona groups, that asked users whether they agreed with our analysis. We listed content items, and users in each group were asked to agree or disagree that each item was more or less useful to them. Approximately 90 people responded to this user test.
Main Points from User Test 2
The results of this test helped us clarify which topics were most important to each type of user, and to adjust their ranking where necessary. For example, in Test 2, 40% of Policy Influencers felt Relevance to be of higher importance than previously indicated. Similarly, ‘Concepts and definitions’, ‘How the output was created’ and ‘Timeliness’ were all reported to be more useful than we had initially thought.
This test also allowed us to gather comments on how different user types use QMIs, which was particularly valuable for enriching the Quality Information User Profiles. For example, Technical Users’ responses included:
- To check consistency between waves of longitudinal datasets and to gain precise definitions of variables.
- For teaching, to show students the importance of reviewing methodology when evaluating statistics.
- To give readers confidence in the reports that the data support.
These responses indicated that, although only 6% of Test 1 respondents were Technical Users, their impact was greater than we first thought. Therefore, although Expert Analysts remain our main target group for the QMI, we will take Technical Users’ needs into consideration where appropriate in the design of the new QMI.
Quality Information User Profiles
User test 2 gave us further useful information with which to update the Quality Information User Needs document. This version is final for now, but it will continue to be updated as we gather more information.
User test 3
This user test was sent to Expert Analysts only and was designed to gather further detail on topics of interest, including proposed new topics such as:
- The purpose of the QMI.
- Suitable uses for the data.
- A summary of strengths and weaknesses of the data.
- Quality of the administrative data sources.
- Explanations for the use of particular methods.
Results from this test are currently being analysed, but we do know that, of the 40 respondents, around a quarter felt that more detail was needed in the Comparability and Concepts and Definitions sections. We will need to do some further work to explore this.