GSSM25: the 25th GSS Methodology Symposium
- 4 December 2020, 9:00am to 5:00pm
We will continue to monitor the appropriateness of this event during the coronavirus (COVID-19) pandemic.
Call for abstracts now open
The GSS Methodology Symposium (GSSM) committee are now calling for abstract submissions to present during the parallel presentation sessions at this year’s GSSM. The closing date for abstract submissions is 30 September. Please see details on how to apply.
The theme for this year's symposium is statistical methodology evolution: a celebration of new and innovative methodology over the 25 years of the GSSM. This event is a great opportunity both to showcase and to learn from methodological developments from across the GSS and beyond.
Take a look at information from previous symposiums for examples of the type of projects that we are keen to showcase at the event.
In line with the theme of looking at how statistical methods have evolved, we have adapted the format of the day. This includes parallel presentation sessions in the morning and a new unconference session in the afternoon.
Parallel presentation sessions
There will be approximately 12 parallel sessions, each 30 minutes in length.
Call for abstracts
The call for abstract submissions for the morning presentation sessions is now open and closes at the end of the day on 30 September 2020. We are looking for presentations of innovative work that derives value from data and improves the quality of statistics. Submissions should fall into one or more of the following categories:
- data processing
- data analysis
- data collection
- data linkage
- data mining
- data science or big data
Unconference session
In line with GSSM25's main theme, we will be adapting the usual format of the GSSM to introduce an afternoon unconference session. The unconference session will be informal and collaborative, allowing delegates to choose what they want to present or discuss on methodological topics. The aim of the unconference sessions is to increase networking and learning, and to initiate steps towards collaborative working.
There will be around 15 delegates per topic group or discussion table. Delegates will be asked to submit their choice of topic for the afternoon unconference session when they register to attend. They will also be asked to explain why they chose this topic and what they will bring to a table discussion around it (if applicable). Registration opens in September. The themes of the topics included in the unconference session are as follows:
|Theme discussion titles|Theme discussion descriptions|
|---|---|
|Re-designing surveys in a rapidly changing world|Survey researchers had to respond quickly to the suspension of fieldwork data collection because of the coronavirus (COVID-19) pandemic. This included developing new survey designs and testing questions and concepts remotely. What did we learn from this?|
|Telling a consistent story when data collection and sources change|Survey designs are changing to save money, to fit better with respondent expectations and to react to crises. Survey data may be supplemented with other sources of data, and tests used to evaluate measurement change. We need to think about how best to tell a consistent story through changing times.|
|Assessing the quality of statistics based on one or more administrative sources|The quality of the administrative sources that go into a statistic will have an impact on statistical quality. What are the options for measuring the data quality of each source? How can we use that information to assess the quality of the statistics?|
|Protecting confidentiality of respondents within our statistics|Understanding the methods and building capability across government on privacy and disclosure control. Areas for discussion could include: testing data for confidentiality risk; intruder testing; reconstruction and newer attacks; challenges faced in ensuring confidentiality in extraordinary times; synthetic data to avoid reidentification and protect privacy; and new approaches to making more outputs available.|
|Building capability in data linkage methods|A discussion about approaches for building capability across government in data linkage methods, for example sharing best practice, training, developing networks and academic collaboration.|
|Data linkage methods|Showcasing examples of successful and potential data linkage methods in government; creating and maintaining a quality culture when linking data across sources and longitudinally; and producing and communicating data linkage quality metrics.|
|Extending the analyst's toolbox|Data science techniques like machine learning and natural language processing can complement and improve our analysis, enhancing automation and prediction. What will the future analyst's toolbox look like?|
|Enhancing the statistical picture with new data sources|Data scientists are extracting and exploring novel data sources. What are the strengths and limitations of these methods and of these novel data sources? How can they best be used to complement and improve statistics and analysis?|
|What makes a Reproducible Analytical Pipeline (RAP) more, or less, robust?|We build RAPs to implement methods reliably, but in practice they are often run by teams with less experienced programmers and limited development resources. What are the best ways to build and implement RAPs?|
|Quality assurance of methods|A wide-ranging discussion about quality assurance approaches and choosing the best methods.|
|Working with time series in a rapidly changing society, when measurement may also be disrupted|Near real-time, high-frequency data provide more timely indicators to complement more traditional data sources. They generally cover only part of the picture and may be biased for the main measures of interest. They may also present technical challenges around temporal disaggregation, benchmarking, decomposition of series to interpret change, and combining data sources. How do we best address these problems?|
|Significance and substance|The use and misuse of statistical inference has led to debate around how best to describe the patterns we see, taking account of uncertainty. Statistical testing can overstate the importance of significant changes or focus too narrowly on the latest in a series of accumulated changes. What should, and shouldn't, we do?|
|Modelling to clarify the picture|Statistical models can improve estimates by exploiting linked data, representing the structure in the data and specifying the relationships between variables. They can also give insight by helping to separate the signal from the noise. When can our statistics most benefit from applying models?|
Delegates will be asked which topic sessions they are interested in attending, and their choices will shape the plan for the afternoon. Sessions will take place simultaneously in the booked rooms.
The theme for this session is "assuring quality in innovation": we will explore which statistical methods have stood the test of time and been widely adopted.
Registration and contact
Registration for the event is free and can be done through the GSSM25 Eventbrite page.
For any queries, please e-mail firstname.lastname@example.org