
Quality guidelines

This item is archived and is no longer current

Policy details

Policy status: Archived
Publication date: 19 January 2016
Author: Quality Centre
Approver: Quality Centre
Who this is for: Members of the Government Statistical Service

Brief description:

This guidance has been developed to enable the Government Statistical Service (GSS) to effectively carry out all aspects of quality management.

This section provides information on measuring and reporting quality of statistical outputs using the European Statistical System’s dimensions of quality. This includes reporting quality in Background Quality Reports (BQRs) and background notes of statistical releases. These can be used to inform users of quality, as well as their use in conjunction with the GSS risk assessment tool to identify potential risks.

As part of the commitment to being an official statistics producer, we must all ensure that our ‘statistics are produced to a level of quality that meets user needs, and that users are informed about the quality of statistical outputs, including estimates of the main sources of bias and other errors, and other aspects of the European Statistical System’s (ESS’s) definition of quality’. In addition, producers are expected to publish details of the methods adopted, including explanations of why particular choices were made.

In practice, official statistics producers can meet the need to make this information available through:

  • The use of their statistical releases’ background notes (monthly and quarterly releases) to detail quality and methodology specific to that release, e.g. sampling error.
  • A background quality report providing other information on the quality of statistics within a publication detailing the strengths, weaknesses, methods and so on against the ESS Dimensions of Quality (pdf, 156KB).

There are a number of guidelines currently available to aid the measurement of statistical quality. These include:

Both sets of guidance provide worked examples of measuring statistical quality against the ESS dimensions of quality; the quality measures and indicators they use are designed for a wide range of statistical outputs. They are therefore relevant to statistical data sources including survey, administrative and census data.

At the moment, the recommended best practice for reporting statistical quality to users depends on the information being reported. Measures of statistical quality that are relevant to a particular release (e.g. coefficients of variation or response rates) should appear where appropriate within that release. For longer-term measures of statistical quality, it is recommended that official statistics producers follow Eurostat and GSS best practice and produce Background Quality Reports (BQRs).
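As a hedged illustration of the release-level measures mentioned above, the sketch below computes a coefficient of variation and a response rate. The function names and all figures are invented for the example, and the coefficient of variation is computed here from a small set of replicate estimates rather than from a full survey design:

```python
import statistics

def coefficient_of_variation(estimates):
    """Relative variability: standard deviation as a share of the mean."""
    return statistics.stdev(estimates) / statistics.mean(estimates)

def response_rate(completed, eligible):
    """Share of the eligible sample that returned a usable response."""
    return completed / eligible

# Made-up replicate estimates and fieldwork counts, for illustration only.
sample_estimates = [102.0, 98.5, 101.2, 99.8, 100.5]
print(f"CV: {coefficient_of_variation(sample_estimates):.3%}")
print(f"Response rate: {response_rate(completed=812, eligible=1000):.1%}")
```

In a real release these figures would come from the survey's estimation system, and the coefficient of variation would normally use the standard error of the published estimate.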

BQRs form the backbone of reporting on the quality of official statistics. They not only allow official statistics producers to provide background information on a statistical output, they also allow users to understand its quality, including strengths, limitations and other key contextual information. There is GSS guidance on completing a BQR (xls, 49KB) and Eurostat guidance.


This section provides information and guidance on methods of continuous quality improvement and quality assurance processes for official statistics producers. This includes use of Six Sigma and Lean Six Sigma.

Quality isn’t just about the statistics themselves. It’s also about the overall production environment and processes required to produce official statistics. While improvements to methods and technology are common, improvements to working practices and quality management are less frequent. This is where the process of Continuous Quality Improvement (CQI) comes into its own. This is particularly true of the current working environment, where costs and resources are a priority.

Carrying out CQI normally starts by understanding the processes that you’re trying to improve by breaking them down into their component parts (however large or small) and mapping them out (known as process mapping). This process mapping doesn’t have to be detailed but does allow you to better understand each step in a process.
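The breakdown described above can even be sketched as data: listing each step with a rough measure of elapsed time makes the slowest step, a natural candidate for improvement, stand out. The steps and timings below are entirely hypothetical:

```python
# A minimal, hypothetical process map for a statistical production run.
# Each step is paired with a rough elapsed time in days.
process_map = [
    ("Receive data returns", 2),
    ("Validate responses", 10),
    ("Impute for non-response", 4),
    ("Produce estimates", 3),
    ("Quality assure outputs", 6),
    ("Publish release", 1),
]

# The step with the longest elapsed time is the first candidate for CQI.
bottleneck = max(process_map, key=lambda step: step[1])
print(f"Slowest step: {bottleneck[0]} ({bottleneck[1]} days)")
```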

One of the many methods currently used to improve process quality in a variety of organisations is Lean Six Sigma. This technique has been employed by a number of official statistics producers and found to be effective in improving processes.

Lean Six Sigma combines Lean’s approach to removing waste from a system with Six Sigma’s approach of targeting the processes that are critical to overall quality. Lean Six Sigma projects tend to focus on specific, known problems that cause delays (at least in the first instance). For official statistics producers this could range from data taking too long to process, to the time taken to respond to parliamentary questions, or validation checks rejecting too many responses.

Once a problem has been identified, Lean Six Sigma uses a similar approach to that used in Six Sigma called DMAIC.

DMAIC stands for:

  • Define – Clarify what the problem is and what it isn’t.
  • Measure – Measure the size of the problem using an established metric, e.g. time or resource.
  • Analyse – Identify how the problem can be resolved.
  • Improve – Implement the solution in order to achieve the improvement.
  • Control – Sustain the improvement through monitoring.


Training for Lean Six Sigma is provided through a belt-based system similar to that of Six Sigma or martial arts. Gaining a belt is achieved through a mix of training and practice: the larger the improvement project undertaken, the higher the belt achieved.

  • Yellow belts – can be achieved through attending a training course. Yellow belts are expected to have a good understanding of Lean Six Sigma but are unlikely to lead projects themselves.
  • Green belts – are achieved through a mix of training and practical application in the form of an improvement project. Green belts generally participate in delivering process improvement projects, and may lead their own projects, but will do this as part of their normal day-to-day wider role within an organisation.
  • Black belts – are individuals at management level who spend a significant amount of time on improvement projects and are often responsible for improvement projects for the whole organisation.

This section provides information on reporting structural and reference metadata within statistical outputs, including use of the Statistical Data and Metadata eXchange (SDMX) format.

Official statistics producers have traditionally made their outputs available in a variety of formats, ranging from statistical releases through to Excel® and CSV (Comma Separated Value) files. Users are, however, increasingly demanding that data are available in formats that are open, directly usable by computers (without the need for human intervention) and contain all the additional relevant information needed to make the data useful. This includes structural and reference metadata.

Structural metadata is the information required in order to directly use data, such as the name of what is being measured, its format (e.g. numerical) and the time period to which the data refer. Reference metadata is different: it provides context similar to that found in quality reports, but in a way that is not independent of the dataset. Reference metadata is concerned with reporting aspects such as the periodicity of a measure (e.g. quarterly) and its definition (not the name of the measure itself).
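The distinction drawn above can be illustrated with a hypothetical dataset: structural metadata describes how to read the values, while reference metadata gives their wider context. All field names and values below are invented for the example:

```python
# Hypothetical dataset carrying both kinds of metadata alongside the data.
dataset = {
    "observations": [("2015-Q1", 101.3), ("2015-Q2", 102.1)],
    "structural_metadata": {
        # Needed to use the data directly:
        "measure_name": "Index of Production",
        "value_format": "numerical",
        "time_period_format": "quarter",
    },
    "reference_metadata": {
        # Context about the measure, not needed just to read the values:
        "periodicity": "quarterly",
        "definition": "Volume of output of the production industries",
    },
}

print(dataset["structural_metadata"]["measure_name"])
print(dataset["reference_metadata"]["periodicity"])
```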

Excel and CSVs – although the Excel format allows for the presentation of structural metadata, it does not enforce consistency of presentation, and this consistency is required for data to be machine-read without prior knowledge of the file’s structure. Excel is also proprietary. Although the CSV format is open, CSV files cannot store much auxiliary information such as metadata, and they suffer from the same inconsistency in how data are laid out within the file.

Standards for metadata – thankfully there are a number of solutions to presenting data and metadata in a consistent, harmonised way within the same file. The two most common of these are the Statistical Data and Metadata eXchange (SDMX) format and the Data Documentation Initiative (DDI) format.

At their simplest, both of these formats are a set of rules (and tools) covering data structures, object classes and variables, which mean that data (both primary data and metadata) are consistently presented irrespective of the source. Both SDMX and DDI can be implemented using eXtensible Markup Language (XML; a markup language which defines a set of rules for encoding documents), with SDMX also available in JavaScript Object Notation (JSON). Importantly, it is the consistency (harmonisation) provided by the SDMX and DDI formats which means that these datasets can be machine-read through the use of an Application Programming Interface (API).
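As a rough sketch of why this harmonisation enables machine reading, the snippet below parses a small, fixed JSON structure loosely inspired by SDMX-JSON. It is not the real SDMX-JSON specification, and the payload is invented; the point is that once the structure is agreed, the same few lines of code can read any dataset published in that shape:

```python
import json

# An invented, SDMX-inspired payload: structure description plus observations.
payload = json.loads("""
{
  "structure": {"dimensions": ["time_period"], "measure": "employment_rate"},
  "observations": {"2015-Q1": 73.9, "2015-Q2": 74.1}
}
""")

# Because the shape is harmonised, this loop works for any such dataset.
measure = payload["structure"]["measure"]
for period, value in sorted(payload["observations"].items()):
    print(f"{measure} {period}: {value}")
```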

The Statistical Data and Metadata eXchange (SDMX) format is primarily designed to standardise the way data are organised and exchanged through defined data structures, object classes and variables. The SDMX format is particularly important to official statistics producers. It is the European Statistical System (ESS) standard and was explicitly designed to aid the exchange of statistics.

In essence, SDMX provides a:

  • logical model to describe statistical data, together with guidelines on how to structure the content.
  • standard for automated communication from machine to machine.
  • technology supporting standardised IT tools that can be used by all parties involved in data exchange and processing.

The latest version of SDMX is 2.1, which is able to take account of the primary data source, reference metadata and structural metadata. SDMX is also required for the transfer of various data between Eurostat and National Statistics Institutes. Since implementing SDMX can be complex, official statistics producers are often encouraged to take a step-by-step approach.


This section provides information and guidance for assuring the quality of administrative data used to produce statistics.

Administrative data is increasingly being used for statistical purposes, meaning that quality assurance of administrative data is an issue that has come to the forefront.

The UK Statistics Authority’s Quality Assurance of Administrative Data (QAAD) guidance helps statistical producers to think about the range of issues to consider and challenge when quality assuring administrative data. It includes:

  • QA toolkit – including the QA matrix and data quality risk/public interest profile
  • Case examples – highlighting different ways that statistical producers have applied the Statistics Authority’s QAAD guidance
  • Frequently Asked Questions (FAQs) document
  • QAAD questions – a prompt for producers when finding out about quality issues


This section provides information on Eurostat, United Nations (UN), Organisation for Economic Cooperation and Development (OECD) and other National Statistics Institute (NSI) practices on quality management and assurance.

All official statistics producers face similar challenges on quality management and assurance, irrespective of the country in which they produce their statistics.
Examples of good international practice on facing these challenges include:

  • The OECD Quality Framework (doc, 600KB) and the associated quality review process was produced in 2003 as part of a wider reform of OECD statistical activities. The framework underwent a review and was enhanced in 2011.
  • The Eurostat Handbook on Data Quality Assessment Methods and Tools (DatQAM) (pdf, 1.46MB) aims to facilitate the systematic implementation of data quality assessment in the European Statistical System (ESS). It provides a concise description of the most important assessment methods and tools currently in use, together with recommendations on how they should be implemented and reasonably combined. It includes numerous successful examples and is primarily targeted at quality managers in the ESS.
  • The UN Quality Assessment Framework was developed following a commission at the UN Statistical Commissions meeting in 2010. The page contains a link to a spreadsheet which maps the National Quality Assurance Framework (NQAF) to other existing frameworks.
  • The Australian Bureau of Statistics (ABS) Data Quality Framework provides standards for assessing and reporting on the quality of statistical information. It also assists with the development of statistical collections to produce high-quality outputs. It was developed using the Statistics Canada Quality Assurance Framework and the European Statistics Code of Practice (pdf, 357KB).

Review frequency:

This guidance is reviewed periodically.
