This is a past event. The webpage remains for information only.

GSSM25: the 25th GSS Methodology Symposium

4th December 2020 9:00 am to 5:00 pm

We will continue to monitor the appropriateness of this event during the coronavirus (COVID-19) pandemic.


The theme for this year's symposium is statistical methodology evolution: a celebration of new and innovative methodology over the years. This marks 25 years of change and innovation in the Government Statistical Service (GSS) Methodology Symposium (GSSM). The event is a great opportunity both to showcase and to learn from methodological developments from across the GSS and beyond.

In line with the theme of looking at how statistical methods have evolved, we have adapted the format of the day. This includes parallel presentation sessions in the morning and a new unconference session in the afternoon.

Event format

Parallel presentation sessions

There will be 24 parallel sessions, each 15 minutes long, hosted in the morning.

Unconference sessions

The aim of the unconference sessions is to increase networking and learning, initiating steps towards collaborative working.

In line with GSSM25’s main theme we will be adapting the usual format of the GSSM to introduce an afternoon unconference session. The unconference session will be informal and collaborative, allowing delegates to choose what they want to present or discuss on methodological topics.

There will be around 15 delegates per topic group or discussion table. Delegates will be asked to submit their choice of topic for the afternoon unconference session when they register to attend. They will also be prompted to explain why they chose the topic and what they will bring to a table discussion on it (if applicable). Registration opens in September. The themes of the topics included in the unconference session are as follows:

Re-designing surveys in a rapidly changing world
Survey researchers had to respond quickly to the suspension of fieldwork data collection because of the coronavirus (COVID-19) pandemic. This included developing new survey designs and testing questions and concepts remotely. What did we learn from this?

Telling a consistent story when data collection and sources change
Survey designs are changing to save money, to fit better with respondent expectations and to react to crises. Survey data may be supplemented with other sources of data, and tests used to evaluate measurement change. We need to think about how best to tell a consistent story through changing times.

Assessing the quality of statistics based on one or more administrative sources
The quality of the administrative sources that go into a statistic will have an impact on statistical quality. What are the options for measuring the data quality of each source? How can we use that information to assess the quality of the statistics?

Protecting confidentiality of respondents within our statistics
Understanding the methods and building capability across government on privacy and disclosure control. Areas for discussion could include:
- testing data for confidentiality risk
- intruder testing
- reconstruction and other newer attacks
- challenges faced in ensuring confidentiality in extraordinary times
- synthetic data to avoid re-identification and protect privacy
- new approaches to making more outputs available

Building capability in data linkage methods
A discussion about approaches for building capability across government in data linkage methods, for example sharing best practice, training, developing networks and academic collaboration.

Data linkage methods
Showcasing examples of successful and potential data linkage methods in government, creating and maintaining a quality culture when linking data across sources and longitudinally, and producing and communicating data linkage quality metrics.

Extending the analyst's toolbox
Data science techniques like machine learning and natural language processing can complement and improve our analysis, enhancing automation and prediction. What will the future analyst's toolbox look like?

Enhancing the statistical picture with new data sources
Data scientists are extracting and exploring novel data sources. What are the strengths and limitations of these methods and of these novel data sources? How can they best be used to complement and improve statistics and analysis?

What makes a Reproducible Analytical Pipeline (RAP) more, or less, robust?
We build RAPs to implement methods reliably, but in practice they are often run by teams with less experienced programmers and limited development resources. What are the best ways to build and implement RAPs?

Quality assurance of methods
A wide-ranging discussion about quality assurance approaches and choosing the best methods.

Working with time series in a rapidly changing society, when measurement may also be disrupted
Near real-time, high-frequency data provide more timely indicators to complement more traditional data sources. They generally cover only part of the picture and may be biased for the main measures of interest. They may also present technical challenges around temporal disaggregation, benchmarking, decomposition of series to interpret change, and combining data sources. How do we best address these problems?

Significance and substance
The use and misuse of statistical inference has led to debate around how best to describe the patterns we see, taking account of uncertainty. Statistical testing can overstate the importance of significant changes or focus too narrowly on the latest in a series of accumulated changes. What should, and shouldn't, we do?

Modelling to clarify the picture
Statistical models can improve estimates by exploiting linked data, representing the structure in the data and specifying the relationships between variables. They can also give insight by helping to separate the signal from the noise. When can our statistics most benefit from applying models?

Delegates will be asked which topic sessions they are interested in attending, and their responses will shape the plan for the afternoon. Sessions will take place simultaneously.

Panel session

The theme for this session is "Assuring quality in innovation". In it, we will explore which statistical methods have stood the test of time and become widely adopted.


For any queries, please e-mail