Voluntary Application and 2024 Winner of the Award for Statistical Excellence in Trustworthiness, Quality and Value: Human Fertilisation and Embryology Authority Dashboard

The Human Fertilisation and Embryology Authority (HFEA) launched what is thought to be the first fertility dashboard of its kind in the world. The dashboard offers accurate and customisable UK-wide data, dating back 30 years, from the HFEA’s national fertility register in an accessible format on the HFEA website. It displays information on fertility treatments such as egg freezing, together with birth rates and data on patients, partners, donors and the children born as a result of these treatments. This covers around 1.5 million IVF and 270,000 donor insemination treatments undertaken by around 665,000 patients since 1991.

Judges of the award praised this entry for the value delivered by HFEA’s work, as well as the response to user needs.

Applying the Code: Trustworthiness, Quality and Value (TQV)

The publication of the HFEA dashboard has produced Value by making the HFEA’s 30 years of national UK fertility register data easier to access. The dashboard enabled the HFEA to increase the breadth of data available on its website, compared with the tables and graphs available in statistics publications, while presenting the information in a customisable and user-friendly format.

The aim of this dashboard was to make the data the HFEA holds more easily accessible and to reduce the administrative work required to respond to data enquiries. The idea was driven by efficiency and the benefits of improving data transparency. The dashboard was completed internally within a year by a very small team, supported by a wide range of specialists within the HFEA and by user testing, including initial training in the dashboard software, and came in under budget. Following launch, around a third of data requests have been satisfied by referrals to the dashboard, which received over 30,000 public views in the first two months, leading to improved team efficiency.

In demonstrating Trustworthiness, the HFEA provided caveats to the data in the dashboard, particularly for success rates, where preliminary figures are affected by treatment outcomes not yet reported by clinics. The HFEA included a grey band in the figures to denote where data is preliminary and added caveats in tooltips to mark where data should be interpreted with caution. Further information is available on the landing page and in the Quality and methodology (Q&M) report. Planned updates and a change log are detailed on the landing page, along with an explanation of how patients cannot be identified from the data.

Quality of the data used from the HFEA register is ensured through validation exercises with clinics, inspections and audits. Production of data in the dashboard was quality assured by qualified staff through a multistep process documenting checks on each aspect of the data and presentation, including accessibility, scripts and verification that data matches previously published sources. Limitations in the data and coherence with other publications are detailed in the Q&M report, and key limitations are additionally flagged in information icons in the dashboard.

Peter Thompson, Chief Executive of the HFEA, said:

“We’re delighted that the HFEA dashboard has won this year’s Trustworthiness, Quality & Value award. Our dashboard, which we believe to be the first of its kind in the world, is designed to provide impartial information in an easy-to-use format to help inform the many difficult decisions around fertility treatment. This award is a real tribute to the quality of work of our expert team at the HFEA and a recognition of the huge interest in UK fertility data. The HFEA will continue to build on the work recognised by this award and enable the public, clinicians and researchers to access the data we collect. Thank you to the Office for Statistics Regulation and the Royal Statistical Society for the award.”

Quotes from users:

Prof Adam Balen, consultant in reproductive medicine at Leeds Teaching Hospitals NHS Trust, said:

“The new dashboard enables researchers to access data for study and patients to access information to better inform them on their fertility journey and thereby demystify some of the complexities behind the statistics of treatment outcomes.”

Other useful background

User testing was performed with patients, researchers, clinicians, staff and stakeholder organisations to ensure the dashboard could satisfy the various needs of key audiences. Improvements were made based on feedback, including the addition of information notes and FAQs on the landing page, and the production of a two-minute animated explainer video.

The finalised dashboard also includes a link to a feedback form to ensure the HFEA continues to receive input from users. Accessibility was reviewed by an external company, informing updates including improvements to keyboard-only navigation and the development of dynamic alt-text.

The dashboard has been covered in numerous media publications, with a focus on its user-friendly design and improved data transparency. One patient recently described using the dashboard as a trusted source after meetings with clinics, checking whether the success rates quoted were consistent with the values displayed on the HFEA’s dashboard.

Related Links

Using RAP to strengthen organisational approaches to quality management

The Department for Transport (DfT) has fostered a culture of innovation and improvement which has supported the application of core RAP principles and supported the quality management of its official statistics. This culture has been created through a combination of enthusiastic and driven individuals, strong senior support and strategic direction, and working in an open and transparent way. DfT’s RAP developments have been underpinned by a strategic goal to produce most of its statistics using a RAP approach.

DfT works transparently and openly using GitHub to share code, host materials from DfT’s weekly coding meetings and signpost useful resources online. DfT has developed and published an R cookbook of coding standards that specifies DfT’s minimum requirements for ‘good code’. DfT requires that the master version of a script is not edited without going through a code review and encourages the use of automated testing (Continuous Integration) tools. The R cookbook is community edited, so standards can evolve and change as needed.
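The kind of check these automated testing tools run on every change can be sketched as follows. This is a minimal illustration only, written in Python rather than DfT's R; the function and figures are hypothetical, not DfT code:

```python
# Illustrative unit test of the kind a Continuous Integration service
# would run automatically whenever a script is changed.
# The function and test values are hypothetical, not DfT's actual code.

def year_on_year_change(current, previous):
    """Percentage change between two annual totals, rounded to 1 d.p."""
    if previous == 0:
        raise ValueError("previous-year total must be non-zero")
    return round((current - previous) / previous * 100, 1)

def test_year_on_year_change():
    # A CI tool runs checks like these before changes reach the master copy
    assert year_on_year_change(110, 100) == 10.0
    assert year_on_year_change(95, 100) == -5.0

if __name__ == "__main__":
    test_year_on_year_change()
    print("all checks passed")
```

Because such checks run automatically on every proposed change, they complement the human code review that DfT requires before the master version of a script is edited.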

To introduce RAP principles to its official statistics, DfT has focused on automating data tables and quality assurance processes. DfT identified these as the best areas for development in its existing processes, since they were the most prone to human error.

For example, by using R code to automatically run validation checks and identify issues for further exploration, quality assurance is now carried out in a more standardised and efficient way than it was before for DfT’s Road Safety statistics. DfT ensures that the R code to produce these statistics is peer reviewed, providing an additional layer of quality assurance. Peer review is often carried out by members of the RAP committee, the group which supports RAP developments in the department.
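This kind of automated validation can be sketched as a script that applies a set of rules to each record and collects any failures for analysts to explore further. The sketch below is illustrative only, written in Python rather than DfT's R, and the field names and rules are invented, not DfT's actual schema:

```python
# Minimal sketch of automated validation checks on record-level data.
# Field names and rules are illustrative, not DfT's actual checks.

def validate_records(records):
    """Apply simple rules to each record; return issues for further exploration."""
    issues = []
    for i, rec in enumerate(records):
        # Categorical field must take a known value
        if rec.get("severity") not in {"slight", "serious", "fatal"}:
            issues.append((i, f"unknown severity: {rec.get('severity')!r}"))
        # Numeric field must fall in a plausible range
        if not (0 <= rec.get("age", -1) <= 120):
            issues.append((i, f"implausible age: {rec.get('age')!r}"))
    return issues

records = [
    {"severity": "slight", "age": 34},
    {"severity": "minor", "age": 34},   # unknown category -> flagged
    {"severity": "fatal", "age": 150},  # implausible age -> flagged
]

for row, message in validate_records(records):
    print(f"row {row}: {message}")
```

Running the same rules over every data delivery standardises the checks, so analysts spend their time investigating the flagged issues rather than re-running manual comparisons.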

The committee has developed a template which is used as the basis for all new coding projects. This supports a standardised coding style across the department and results in improved quality, readability and reusability of code.

DfT has a strong community of statisticians and its RAP committee has been instrumental in supporting RAP developments. This includes running internal code clubs, inviting external speakers to share learning, and developing training and tools such as an R project template and an R cookbook which provides comprehensive coding examples (see Case study T5: Developing statisticians’ coding capabilities to meet future organisational needs). DfT has also developed a RAP training session for managers which focusses on quality assurance and gives managers the confidence they need to sign off publications which use a RAP approach.

This example shows how DfT has created a culture that supports RAP developments and continuous improvement. By working openly through GitHub, DfT is transparent about its approach to quality management. It has also established organisational tools that help it manage quality to appropriate standards, strengthening the quality management approach used in the production of its official statistics.

Leading the development of statistics on transport use during the pandemic

To monitor the use of the transport system in Great Britain during the coronavirus (COVID-19) pandemic, the Department for Transport (DfT) rapidly produced new statistics on transport use by mode, from March 2020.

The DfT Head of Profession for Statistics (HoP) and senior statistical leadership team were instrumental in developing the statistics. They led and encouraged collaboration and innovation with organisations outside government to gain access to new data. For example, to develop near real-time indicators, the lead for Travel and Safety Statistics worked with a bus technology company to get information about bus use outside London, and the HoP proactively led discussions with a telecoms provider about the potential application of telecoms data, which formed part of the methodology for providing estimates of cycling. The lead for Road Traffic worked with their team to develop a new approach using existing automated traffic counters.

The production of these statistics involved a coordinated effort across multiple analytical teams, overseen by the HoP and senior leadership team. The HoP put in place a fast but rigorous quality assurance process. Provisional numbers were produced by individual analytical teams and sent to a central team by 3pm each day. The HoP reviewed them, provided feedback as needed and the numbers were then finalised by the individual teams before being signed off by the HoP for inclusion in an updated data dashboard at the end of each day.

The statistics were first presented via slides at a series of press conferences at 10 Downing Street in response to coronavirus. For example, the Transport Secretary presented the statistics in his statement on coronavirus (COVID-19) on 4 June 2020. The statistics were used to show the change in transport trends across Great Britain and give an indication of compliance with lockdown rules. They proved vital for informing the government, the media and the general public, and continued to be valuable as the lockdown rules changed. Statisticians at DfT maintained continuous close engagement with the Cabinet Office to ensure that the data was well understood by their policy colleagues.

OSR carried out a rapid review of the statistics, which highlighted that the data was only sometimes included in the daily briefing slides and therefore only available to the public on those days. The DfT HoP then played a key role getting the data published daily each weekday.

The DfT HoP later determined that changes should be made to the frequency and timing of the publication of the statistics to reflect changes in user demand. This involved weighing the user need against the resource required to produce the data and the impact on staff, and being proactive in anticipating future user interest. As user demand for daily data initially reduced, the decision was made to publish the data weekly on a Wednesday instead. Later, the HoP determined that publication should move back to daily for a set period, to inform users shortly ahead of schools reopening, before reverting to weekly publication when that need again reduced.

This example shows the key roles played by the DfT HoP and wider statistical leadership team in the production of a new and important data source, which has been used to inform the government, the media and the general public during the pandemic. The HoP encouraged collaboration and innovation with organisations outside government to gain access to an important new data source, ensured the rigorous quality assurance of new outputs produced to a tight timescale, and determined changes to the frequency and timing of publication as user demand changed, proactively anticipating future user interest while considering the resource impact on the analytical staff required to produce the data.

Being transparent and orderly when publishing revisions and corrections

The NHS Business Services Authority (NHSBSA) is a new official statistics producer that was added to The Official Statistics Order 2018. 

In April 2020, NHSBSA published a suite of information titled Prescription Cost Analysis (PCA) England as National Statistics. This followed the responsibility for the publication of these statistics transferring to NHSBSA from NHS Digital.  

NHSBSA has used the Code of Practice for Statistics to frame many of the policies and processes that support its production of official statistics. The NHSBSA Revisions and Corrections policy is an excellent example of this. Users are put at the centre of the policy, ensuring that any revisions or corrections are clearly communicated, including the reasons for the change. The policy covers both scheduled and unscheduled revisions and corrections. Three types of revision are defined:

  • scheduled revisions  
  • changes in methodology, and  
  • changes made due to receipt of further data.

NHSBSA commit to maintain a revision history for each statistical release and clearly mark any provisional or revised data.  

NHSBSA also commit to notify users of any significant errors that may occur. The policy states that decisions on how to address any errors in published statistics will be made by the Lead Official for Statistics, in consultation with the National Statistician. In line with the policy, NHSBSA published a correction notice in April 2020. The notice is prominent and clear and includes an explanation for users to understand the reason, nature and cause of the correction.  

More widely, the NHSBSA demonstrates the orderly release of the PCA statistics by using a 12-month release calendar to pre-announce the publication of its official statistics. A Pre-Release Access (PRA) list is also published as part of the PCA release, containing details of all individuals in NHSBSA who have been granted 24-hour pre-release access to the statistics in their final form, as well as those engaged internally in the production and dissemination of the statistics. The 24-hour pre-release access list is commendably small, and we understand NHSBSA has plans to remove 24-hour pre-release access to the PCA statistics entirely in future.

This example shows how the NHSBSA supports trustworthiness in its official statistics by taking a transparent and orderly approach to publishing statistical revisions and corrections. It also highlights other positive aspects, such as NHSBSA’s management of Pre-Release Access and the pre-announcement of the PCA statistics through a 12-month release calendar.

The secure and effective management of pre-release access

This is a case study for Principle T3: Orderly release.

The Welsh Government makes excellent use of its corporate electronic records management software (Objective ECM) to manage pre-release access across the organisation. The system allows secure groups to be set up which means documents can be shared with only specific individuals.

For each output, a group is set up that is bespoke to the specific pre-release access list. Links to those documents are then sent round under pre-release access rules, which means that final versions of releases are never attached to emails as unsecured documents. This is done with clear guidance that no indication of the substance of the statistical release is included in the email text.

This has a number of benefits:

  • If a name is accidentally included on an email distribution list, or it is forwarded to someone else, they cannot actually open the document
  • Pre-release access arrangements can be managed centrally
  • The Objective ECM system has full audit functionality, which allows statisticians to identify who has opened the release (and when). This ensures that they know who has accessed the statistics pre-release, but also allows them to assess whether pre-release access is actually needed in future
  • It provides a safe space for press notices and lines to take to be developed under the same secure conditions as the statistical release
  • It provides a facility for secure external sharing of official-sensitive material through a linked product, Connect, in the event that there are pre-release access recipients in outside organisations
  • It ensures that outputs and associated correspondence are automatically part of the corporate record
  • Functionality can provide built-in sign-off of documents to support quality assurance and internal clearance processes

Since the introduction of this approach, the Welsh Government reports that the number of accidental or near-miss pre-release sharing occurrences has reduced to virtually zero.

This example shows how Welsh Government has developed an effective system to manage the circulation of statistics in their final form, in line with its obligations under pre-release access legislation. The system also securely supports Welsh Government’s quality assurance and press-notice development processes, and supports statisticians in reviewing whether listed individuals should require continued access in future, which helps to keep the number of individuals granted pre-release access to a minimum.

Independent production and the managed handling of statistics and data

This is a case study for Principle T1: Honesty and integrity.

The Office of Rail and Road (ORR) is the authoritative and recognised provider of rail statistics in Great Britain. ORR’s production of official statistics is distinct from its role as a regulator. Its statistical releases do not compare results against targets, or make judgements about how Network Rail, train operators, or other entities should be performing. All commentary is therefore factual, based on the trends in the statistics, without giving an opinion on the results.

ORR’s Information & Analysis (I&A) team is responsible for the production of all of its official statistics. The team consists of statisticians and business intelligence analysts. The principles of the Code and the working practices used to produce its statistics are explained to all ORR staff involved in the production of statistics upon induction.

The I&A team ensure that clear handling instructions are provided to those who access the statistics before they are published as management information, or in the production or pre-release stages. For example, some of the data ORR uses to publish statistics is also used as management information to support ORR’s ongoing regulatory monitoring of Network Rail through a series of internal dashboards. These are provided to users with specific handling instructions that they are for internal monitoring purposes only, and must not be disclosed to third parties.

ORR’s Communications team also have a clear understanding of how the Code applies to their activities, and new joiners are briefed by a senior statistician. The statistics pages are the most popular on the ORR website and the team recognise the value of being the trusted source of rail statistics. ORR’s statistics are one way in which it can demonstrate its authority, professionalism and breadth of knowledge about the rail industry. Communicating them clearly helps the media, stakeholders and public to be well-informed on key aspects of the rail industry. A media handling plan is agreed between ORR’s head of profession for statistics and the Communications team to ensure social media, press releases, blogs, internal briefs, and other communication using official statistics are compliant with the Code whilst still supporting ORR’s role as a regulator.

To further raise awareness of the Code within ORR, statisticians present at staff briefings and regularly offer work shadowing to communications staff and analysts joining other teams.

This example demonstrates some of the ways in which ORR ensures its statistics are truthful, impartial and independent. It also shows how ORR ensures its staff handle and use statistics and data with honesty and integrity and meet consistent standards of behaviour that reflect the wider public good.

Demonstrating transparency when linking and publishing data

The Scottish Government’s (SG) health and homelessness in Scotland project linked local authority data about homelessness between 2001 and 2016 with NHS data on hospital admissions, outpatient visits, prescriptions, drugs misuse, and National Records of Scotland information about deaths.

Transparency around the risk assessment process helps to demonstrate a producer’s Trustworthiness to users, suppliers and the public. One of the ways in which SG demonstrated this was by conducting and publishing their data privacy impact assessment alongside the main analysis report. SG also published the original application for the data, the public benefit and privacy panel application and the correspondence documenting its approval, and details of how to access the data. This approach is now standard practice for all SG publications based on linked data.

Since SG carried out this work, a new tool for risk assessment – the Data Protection Impact Assessment (DPIA) – has been introduced following the 2018 Data Protection Act (DPA), as a requirement of GDPR. DPIAs are mandatory where data are combined from multiple sources, and the Information Commissioner’s Office recommends they are also conducted on a voluntary basis for any large-scale processing of personal data.

The accountability principle in the DPA requires organisations to have appropriate records in place to demonstrate compliance if required. Departments can meet the DPA accountability principle by conducting a DPIA, and publishing DPIAs helps to meet the Code’s requirements for transparency (providing that they are accessibly presented). It isn’t essential to publish a DPIA in full; a summary of the process and the lessons learnt is sufficient to demonstrate transparency.

Another step producers can take to increase transparency is to publish details of all the data share requests made to them and their outcomes. SG publishes details of the data sharing requests submitted to its Statistics Data Access Panel on its website, which also includes details about past decisions made and the justifications for those decisions.

The Department for Education in England has also been publishing details of the data share requests made to it, and their outcomes, in relation to ad hoc National Pupil Data sharing for several years. In December 2017, the Department for Education broadened the scope to cover all routine sharing of personal data and has recently consulted users about further changes to make this information easier to engage with and understand.

These examples show how Trustworthiness can be demonstrated by statistics producers being transparent about their approaches to the management of the data linkage process and data shares, and their relevance to some of the current legislation in this area.

Developing statisticians’ coding skills to meet future organisational needs

The Department for Transport (DfT) has been upskilling its analysts to facilitate the adoption of data science methods in the department. To help with this, DfT has established weekly Coffee and Coding sessions and bespoke R coding workshops, building on successful models used in the Department for Education and the Department for Business, Energy and Industrial Strategy.

Coffee and Coding sessions aim to nurture and encourage a vibrant, supportive and inclusive coding community. They provide a regular opportunity for people to share coding skills, knowledge and advice, and to network and get to know each other. The format is usually a presentation followed by a Code Surgery. Presentations usually demonstrate a tool or technique and/or a show and tell of new work done within the department. Code Surgeries allow people to raise coding queries or ideas with the coding community; there is no such thing as a silly question and it is understood that the quest for knowledge necessarily includes failure.

The R workshops are a suite of sessions designed to train DfT’s statisticians in the basics of R coding. They are mainly based around the use of tidyverse R libraries to maintain consistent standards, and include topics such as data wrangling with dplyr, graphing with ggplot2, and report automation with rmarkdown. DfT’s first cohort graduated in late 2018 and the second is due to start in early 2019.

DfT runs a mentorship programme (akin to the GDS Data Science Accelerator) to provide support to those taking on data science projects using a new tool or method. DfT expects that eventually there will be enough coders in the department that statistical coding advice will be as easy to source as advice on using Excel.

A big part of DfT’s approach is to encourage people to share knowledge, so that pioneers trying methods for the first time generate resources for others to use and adapt. GitHub has become central to this process – DfT uses it to share code, host materials from DfT’s weekly coding meetings and signpost useful resources online. DfT has also developed coding standards that specify DfT’s minimum requirements for ‘good code’ without burdening the developer with lots of extra work. For example, DfT requires that the master version of a script is not edited without going through a code review and encourages the use of automated testing (Continuous Integration) tools. The document is community edited, so standards can evolve and change as needed.

DfT encourages analysts to use consistent coding patterns and to follow a style guide. For data analysis, R and Python have proved popular language choices, but there are also style differences within R and Python. For this reason, DfT suggests default packages in its coding standards and approaches the R workshops with a consistent coding style, encouraging developers to use the tidyverse syntax style. This means that a relatively new coder only has to learn one syntax style to be able to interpret typical code across the department.

DfT collaborates closely with its Digital Services team to ensure that the core functions of the software development tools work, making sure analysts can install packages for Python and R, use Git to version control their code, and use dependency management tools like packrat.

Senior leaders, including the Head of Profession for Statistics and managers responsible for teams of statisticians, have a good understanding of the benefits of RAP. As a result, staff are strongly supported to take time to develop new skills and improve their statistics. DfT’s RAP developments have been underpinned by a strategic goal to produce most of its statistics using a RAP approach. This has been recognised by the wider department – for example, the RAP committee won the Excellence in Learning award at the DfT 2020 Staff Celebratory Event.

This example shows how DfT staff are provided with the time and resources required to develop new coding skills, knowledge and competencies to meet DfT’s future organisational needs and how DfT is developing new quality strategies and standards.

Being transparent about user engagement and quality management approaches

This is a case study for Principle T4: Transparent processes and management.

The Welsh Government produces all official statistics on housing for Wales and engages with a wide range of users and stakeholders on these statistics.

In 2013, statisticians in the Welsh Government refreshed and relaunched an existing user group as the Housing Information Group (HIG). The HIG meets three times a year (two themed meetings and one seminar) and acts as a forum for the Welsh Government, local authorities, housing associations, participating agencies, and the academic housing research community to share views about housing.

The HIG meetings are used to inform stakeholders and users about ongoing developments in policy, data collection and statistics and to discuss future developments to housing statistics. The Welsh Government publishes the agenda, minutes and actions from each meeting on its website.

Discussions at the HIG have influenced decisions about the scope and content of a new Housing Conditions Survey and the direction of work on the Housing Stock Analytical Resource for Wales. They also influenced the direction of work to investigate the feasibility of collecting individual statutory homelessness data.

This demonstrates that the Welsh Government is transparent about its public engagement with users, potential users and stakeholders.

The Welsh Government also publishes a Statistical Quality Management Strategy, which is a helpful document that demonstrates that it is open about its commitment to quality and quality management.

The strategy reflects relevant Code Pillars, Principles and Practices and includes general information about the quality of Welsh Government statistics and the users of its statistics. It describes the Welsh Government’s four Statistical Quality Objectives (quality assurance training, questioning data, publishing quality reports, and reviewing processes and outputs). The strategy explains how it implements the four objectives, contains links to example quality reports, and provides guidance for analysts on checking and validating data.

Adopting a transparent and consistent approach to user engagement and quality management helps the Welsh Government to demonstrate to its users that they can have confidence in its statistical services and products.

The Head of Profession’s role in assessing continued compliance with the Code

In July 2016, the Welsh Government’s Head of Profession for Statistics wrote to the Office for Statistics Regulation (OSR) to request that the National Statistics designation of its Homelessness Statistics in Wales be temporarily suspended, as some local authorities faced difficulty in providing complete and accurate data following changes to the legislation that impacted the data collection.

Statisticians in the Welsh Government engaged extensively with local authorities to improve data collection practices and strengthen the quality assurance of the data used to produce these statistics. This involved issuing new guidance, holding workshops with local authorities to discuss difficulties in data collection and clarify definitions, and carrying out detailed reviews of individual data collection forms.

Welsh Government’s Head of Profession for Statistics wrote to OSR again around a year later. He set out the actions taken by the statisticians to ensure the quality of data recorded by local authorities and requested an end to the temporary suspension, which resulted in the National Statistics status being restored.

In June 2018, the Ministry of Housing, Communities and Local Government’s (MHCLG) Head of Profession for Statistics wrote to OSR outlining planned changes to MHCLG’s statutory homelessness statistics, following the introduction of the Homelessness Reduction Act 2017, which came into force in April 2018.

The Head of Profession highlighted that the change would result in a break in the statistical series, as future homelessness data would be collected on a different basis by English local authorities. She noted that, as similar legislative changes in Scotland and Wales had initially had a negative impact on data quality, new MHCLG homelessness statistics would be published as experimental statistics from the end of 2018 until MHCLG was sufficiently assured about the quality of the new data. The Head of Profession also set out plans to test the new experimental statistics by engaging with users of homelessness statistics to obtain their feedback.

These examples highlight the central role of the Head of Profession for Statistics in assessing continued compliance with the Code and determining the need for statistical developments to reflect changing legislative context and users’ needs. As required by the Code, they also highlight the Head of Profession’s role in reporting concerns they may have about continuing to meet the standards of the Code to the Director General for Regulation.