To enable the GSS to carry out all aspects of Quality Management effectively, the following guidelines have been developed to aid producers of official statistics.
Measuring and Reporting Statistical Quality
Provides information on measuring and reporting the quality of statistical outputs using the ESS dimensions of quality. This includes reporting quality in Background Quality Reports (BQRs) and in the background notes of statistical releases. These can be used to inform users about quality, and in conjunction with the GSS Risk Assessment Tool to identify potential risks.
- As part of the commitment to being an official statistics producer, we must all ensure that our ‘statistics are produced to a level of quality that meets user needs, and that users are informed about the quality of statistical outputs, including estimates of the main sources of bias and other errors, and other aspects of the European Statistical System (ESS) definition of quality’ (Principle 4, Practice 2 of the Code). In addition, producers are expected to publish details of the methods adopted, including explanations of why particular choices were made.
- In practice, official statistics producers can meet the need to make this information available through:
- The use of the background notes of their statistical releases (monthly and quarterly releases) to detail quality and methodology specific to that release, e.g. sampling error.
- A background quality report providing other information on the quality of the statistics within a publication, detailing the strengths, weaknesses, methods and so on against the ESS Dimensions of Quality.
- There are a number of guidelines currently available to aid the measurement of statistical quality. These include:
- GSS Guidelines on Measuring Statistical Quality – Provides a checklist of quality measures and indicators for use when measuring and reporting on the quality of statistical outputs.
- ESS Handbook for Quality Reports – Provides comprehensive guidance on measuring the quality of statistical processes.
- Both sets of guidance provide worked examples of measuring statistical quality against the ESS dimensions of quality, with quality measures and indicators designed for a wide range of statistical outputs. They are therefore relevant to statistical data sources including survey, administrative and census data.
- At the moment, the recommended best practice for reporting statistical quality to users depends on the information being reported. For measures of statistical quality that are relevant to a particular release (e.g. coefficients of variation, response rates), it is recommended that they appear where appropriate within that release. For longer-term measures of statistical quality, it is recommended that official statistics producers follow Eurostat and GSS best practice and produce Background Quality Reports.
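Release-level quality measures such as these are straightforward to compute. The sketch below shows one common formulation (the coefficient of variation of a sample mean, and a unit response rate); the figures are illustrative, not from any real survey.

```python
import statistics
from math import sqrt

def coefficient_of_variation(sample):
    """CV of the sample mean: (standard error / estimate) * 100."""
    mean = statistics.mean(sample)
    standard_error = statistics.stdev(sample) / sqrt(len(sample))
    return 100 * standard_error / mean

def response_rate(responses_received, sample_size):
    """Unit response rate as a percentage of the issued sample."""
    return 100 * responses_received / sample_size

# Illustrative figures, not real survey data
incomes = [21000, 24500, 19800, 30200, 27400, 22100, 25900, 23300]
print(f"CV of mean income: {coefficient_of_variation(incomes):.1f}%")
print(f"Response rate: {response_rate(812, 1000):.1f}%")
```

In practice the standard error would come from the survey's estimation methodology (accounting for design weights and stratification), but the reported ratio takes this form.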
- Background Quality Reports (BQRs) form the backbone of reporting on the quality of official statistics. They not only allow official statistics producers to provide background information on a statistical output, they also allow users to understand its quality, including strengths, limitations and other key contextual information. There is GSS Guidance on Completing a BQR (Excel), together with guidance from Eurostat in the form of the ESS Standard for Quality Reports.
- Good Practice – There are various good practice examples of BQRs from across the GSS, some of which have been included in the examples section.
Quality Assurance through Business Process Improvement
Provides information and guidance on methods of continuous quality improvement and quality assurance processes for official statistics producers. This includes use of Six Sigma and Lean Six Sigma.
- Quality isn’t just about the statistics themselves; it’s also about the overall production environment and the processes required to produce official statistics. While improvements to methods and technology are common, improvements to working practices and quality management are less frequent. This is where the process of Continuous Quality Improvement (CQI) comes into its own, particularly in the current working environment where costs and resources are a priority.
- Carrying out CQI normally starts by understanding the processes that you’re trying to improve by breaking them down into their component parts (however large or small) and mapping them out (known as process mapping). This process mapping doesn’t have to be detailed but does allow you to better understand each step in a process.
- One of the many methods currently used to improve process quality in a variety of organisations is Lean Six Sigma. This technique has been employed by a number of official statistics producers and found to be effective in improving processes.
- Lean Six Sigma is based on a combination of Lean’s approach to removing waste from a system together with Six Sigma’s approach on targeting processes that are critical to overall process quality. Lean Six Sigma projects tend to focus on specific problems which are likely to be known problems that often cause delays (at least in the first instance). For official statistics producers this could range from the fact that it’s taking too long to process data, to time taken to respond to parliamentary questions, or validation checks that are rejecting too many responses.
- Once a problem has been identified, Lean Six Sigma uses a similar approach to that used in Six Sigma called DMAIC. DMAIC stands for:
- Define – Clarify what the problem is and what it isn’t.
- Measure – Measure the problem using an established metric, e.g. time or resource.
- Analyse – Identify how the problem can be resolved.
- Improve – Implement the solution in order to achieve the improvement.
- Control – Sustain the improvement through monitoring.
- Training – Training for Lean Six Sigma is provided through a belt-based system similar to that of Six Sigma or martial arts. Gaining a belt is achieved through a mix of training and practice, with higher belts requiring more extensive training and practical experience.
- Yellow belts – can be achieved through attending a training course. Yellow belts are expected to have a good understanding of Lean Six Sigma but are unlikely to lead projects themselves.
- Green belts – are achieved through a mix of training and practical application in the form of an improvement project. Green belts generally participate in delivering process improvement projects, and may lead their own projects, but will do this as part of their normal day-to-day wider role within an organisation.
- Black belts – are individuals at management level who spend a significant amount of time on improvement projects and are often responsible for improvement projects for the whole organisation.
Reporting Structural and Reference Metadata with Datasets
Information on reporting structural and reference metadata within statistical outputs, including use of the Statistical Data and Metadata eXchange (SDMX) format.
- Official statistics producers have traditionally made their outputs available in a variety of formats, ranging from statistical releases through to Excel® and CSV (Comma Separated Value) files. Users are, however, increasingly demanding that data are available in formats that are open, directly useable by computers (without the need for human intervention) and contain all the additional relevant information needed to make the data useful. This includes structural and reference metadata. Structural metadata is the information required in order to use data directly, such as the name of what is being measured, its format (e.g. numerical) and the time period to which the data refer. Reference metadata is different: it provides context similar to that found in quality reports, but in a way that is tied to the dataset. Reference metadata is concerned with reporting aspects such as the periodicity of a measure (e.g. quarterly) and its definition (not the name of the measure itself).
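The distinction between the two kinds of metadata can be illustrated with a small sketch. All the field names and values below are invented for illustration; they do not come from any formal metadata standard.

```python
# Structural metadata: what a machine needs to read and use the data directly.
structural_metadata = {
    "measure_name": "Retail Sales Index",       # name of what is measured
    "value_format": "numeric",                  # format of the values
    "time_period_format": "YYYY-MM",            # how time periods are encoded
}

# Reference metadata: context about the data, similar to a quality report,
# but attached to the dataset itself.
reference_metadata = {
    "periodicity": "monthly",
    "definition": "Value of retail sales, chained volume measure",
    "collection_method": "sample survey of retailers",
}

print(sorted(structural_metadata))
print(sorted(reference_metadata))
```

A rough rule of thumb: structural metadata answers "how do I parse this?", while reference metadata answers "what does this mean and how good is it?".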
- Excel® and CSVs – Although the Excel® format allows for the presentation of structural metadata, it does not enforce consistency of presentation, and this consistency is required for data to be machine-read without prior knowledge of the file’s structure. Excel is also proprietary. CSV files are an open format, but they cannot store much auxiliary information such as metadata, and they suffer from the same inconsistency in the presentation of data within the file.
- Standards for metadata – Thankfully there are a number of solutions to presenting data and metadata in a consistent, harmonised way within the same file. The two most common of these are the Statistical Data and Metadata eXchange (SDMX) format and the Data Documentation Initiative (DDI) format.
- The Statistical Data and Metadata eXchange (SDMX) format is primarily designed to standardise the way data are organised and exchanged through defined data structures, object classes and variables. The SDMX format is particularly important to official statistics producers. It is the European Statistical System (ESS) standard and was explicitly designed to aid the exchange of statistics. In essence, SDMX provides:
- A logical model to describe statistical data, together with guidelines on how to structure the content.
- A standard for automated communication from machine to machine.
- A technology supporting standardised IT tools that can be used by all parties involved in data exchange and processing.
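The logical model listed above can be sketched with plain Python structures. This is only a conceptual illustration: the dimension and attribute names below are invented, not taken from any real SDMX Data Structure Definition (DSD), and real SDMX exchanges use standardised XML/JSON messages rather than dictionaries.

```python
# A minimal sketch of SDMX's logical model (illustrative names only).

# Data Structure Definition: the dimensions that uniquely identify each
# observation, plus attributes that qualify it.
DSD = {
    "dimensions": ["FREQ", "REF_AREA", "INDICATOR", "TIME_PERIOD"],
    "attributes": ["UNIT_MEASURE", "OBS_STATUS"],
}

# Each observation carries a value for every dimension in the DSD.
observations = [
    {"FREQ": "Q", "REF_AREA": "UK", "INDICATOR": "GDP_GROWTH",
     "TIME_PERIOD": "2023-Q1", "OBS_VALUE": 0.3,
     "UNIT_MEASURE": "PC", "OBS_STATUS": "A"},
]

def series_key(obs, dsd):
    """Build an SDMX-style key: dimension values joined by '.'."""
    return ".".join(obs[d] for d in dsd["dimensions"])

for obs in observations:
    print(series_key(obs, DSD), "->", obs["OBS_VALUE"])
```

Because every producer and consumer shares the same DSD, any observation can be located and interpreted without human intervention, which is what makes machine-to-machine exchange possible.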
- The latest version of SDMX is 2.1, which can take account of the primary data source, reference metadata and structural metadata. SDMX is also required for the transfer of various data between Eurostat and National Statistics Institutes. Since implementing SDMX can be complex, official statistics producers are often encouraged to take a step-by-step approach. For more information, see the Guidance section.
Auditing of Administrative Data
Provides information and guidance for assuring the quality of administrative data used to produce statistics.
Administrative data is increasingly being used for statistical purposes, meaning that the quality assurance of administrative data has come to the forefront.
The UK Statistics Authority’s Quality Assurance of Administrative Data (QAAD) guidance helps statistical producers to think about the range of issues to consider and challenge when quality assuring administrative data. The web page for this guidance includes:
- QA toolkit – including the QA matrix and data quality risk/public interest profile
- Case examples – highlighting different ways that statistical producers have applied the Statistics Authority’s QAAD guidance
- FAQ document – answering questions asked by producers about QAAD
- QAAD Questions – a prompt for producers when finding out about quality issues
- Using Administrative Data: Good Practice for Statisticians
- NISRA Quality Assurance and Quality Review
- Quality Assurance and Audit Arrangements for Administrative Data
- UKSA’s Regulatory Standard for quality assurance of administrative data
- UKSA’s Quality Assurance of Administrative Data (QAAD) guidance
- Guidance for Measuring and Reporting on Quality when Administrative Data is used to Supplement or Replace Survey Data
International Practices on Quality Management and Assurance
Provides information on Eurostat, UN, OECD and other National Statistics Institute (NSI) practices on quality management and assurance.
- All official statistics producers face similar challenges on quality management and assurance, irrespective of the country in which they produce their statistics.
Examples of good international practice in facing these challenges include:
- Organisation for Economic Co-operation and Development (OECD) Quality Framework – The OECD Quality Framework and the associated Quality Review process was produced in 2003 as part of a wider reform of OECD statistical activities. The framework underwent a review and was enhanced in 2011.
- Eurostat Handbook on Data Quality Assessment Methods and Tools – The Handbook on Data Quality Assessment Methods and Tools (DatQAM) aims to facilitate the systematic implementation of data quality assessment in the ESS. It provides a concise description of the most important data quality assessment methods currently in use, with recommendations on how these methods and tools should be implemented and how they can reasonably be combined. It includes numerous successful examples and is primarily targeted at quality managers in the ESS.
- United Nations Quality Assessment Framework – this was developed following a decision at the UN Statistical Commission’s meeting in 2010. The page contains a link to a spreadsheet mapping the National Quality Assurance Framework (NQAF) to other existing frameworks.
- Australian Bureau of Statistics (ABS) Data Quality Framework – The ABS Data Quality Framework provides the standards for assessing and reporting on the quality of statistical information. It also assists with the development of statistical collections to produce high quality outputs. It was developed using the Statistics Canada Quality Assurance Framework and the European Statistics Code of Practice.