Voluntary Application and 2024 Winner of the Award for Statistical Excellence: Human Fertilisation and Embryology Authority Dashboard

2024 Winner of the Award for Statistical Excellence in Trustworthiness, Quality and Value for the Human Fertilisation and Embryology Authority Dashboard

The Human Fertilisation and Embryology Authority (HFEA) launched what is thought to be the first fertility dashboard of its kind in the world. The dashboard offers access to accurate and customisable UK-wide data, dating back 30 years, from the HFEA’s national fertility register in an accessible format on the HFEA website. It displays information on fertility treatments such as egg freezing, on birth rates, and on the patients, partners, donors and children born as a result of these treatments. This includes around 1.5 million IVF and 270,000 donor insemination treatments undertaken by around 665,000 patients since 1991.

Judges of the award praised this entry for the value delivered by HFEA’s work, as well as the response to user needs.

Applying the Code: Trustworthiness, Quality and Value (TQV)

The publication of the HFEA dashboard has produced Value by improving the ease of access to the HFEA’s 30 years of national UK fertility register data. The dashboard enabled the HFEA to increase the breadth of data available on its website, compared with the tables and graphs available in its statistics publications, while presenting the information in a customisable and user-friendly format.

The aim of this dashboard was to make the data the HFEA holds more easily accessible and to reduce the administrative work required to respond to data enquiries. The idea was based on efficiency and the benefits of improving data transparency. The dashboard was completed internally within a year and under budget by a very small team, supported by a wide range of specialists within the HFEA and by user testing, with the timescale also covering initial training in the dashboard software. Following launch, around a third of data requests have been satisfied by referrals to the dashboard, which received over 30,000 public views in its first two months, leading to improved team efficiency.

In demonstrating Trustworthiness, the HFEA provided caveats to the data in the dashboard, particularly for success rates, where preliminary figures are affected by missing treatment outcomes from clinics. The HFEA included a grey band in the figures to denote where data is preliminary and added caveats in tooltips to mark where data should be interpreted with caution; further information is available on the landing page and in the Quality and methodology (Q&M) report. Planned updates and a change log are detailed on the landing page, as well as details of how patients cannot be identified from the data.
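
As an illustration of this presentation technique (the HFEA dashboard is built in dedicated dashboard software, so this is only a minimal sketch with hypothetical figures), a chart can shade the preliminary period so users see at a glance which values may still change:

```python
# Minimal sketch of shading a preliminary period; all values are hypothetical.
import matplotlib.pyplot as plt

years = list(range(2015, 2024))
birth_rates = [27, 28, 29, 30, 31, 32, 32, 33, 33]  # hypothetical rates (%)
preliminary_from = 2022  # treatment outcomes still being reported from here

fig, ax = plt.subplots()
ax.plot(years, birth_rates, marker="o")

# Grey band over the preliminary years, mirroring the dashboard's caveat
ax.axvspan(preliminary_from, years[-1], color="grey", alpha=0.3,
           label="Preliminary: outcomes incomplete")
ax.set_xlabel("Year of treatment")
ax.set_ylabel("Birth rate (%)")
ax.legend()
plt.show()
```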

Quality of the data used from the HFEA register is ensured through validation exercises with clinics, inspections and audits. Production of data in the dashboard was quality assured by qualified staff through a multistep process, documenting checks on each aspect of the data and its presentation, including accessibility, scripts and verification that the data matches previously published sources. Limitations in the data and coherence with other publications are detailed in the Q&M report, and key limitations are additionally flagged via information icons in the dashboard.
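
One of those checks, verifying that the data matches previously published sources, lends itself to automation. A hypothetical sketch (file and column names invented for illustration, not the HFEA's actual process) might compare a dashboard extract against a published table and flag any discrepancies:

```python
# Hypothetical sketch: check a dashboard extract against a published table.
import pandas as pd

published = pd.read_csv("published_figures.csv")  # hypothetical file
dashboard = pd.read_csv("dashboard_extract.csv")  # hypothetical file

keys = ["year", "treatment_type"]
merged = published.merge(dashboard, on=keys, suffixes=("_pub", "_dash"))

# Any row where the two sources disagree needs investigating before release
mismatches = merged[merged["cycles_pub"] != merged["cycles_dash"]]
if mismatches.empty:
    print("All dashboard figures match the published source.")
else:
    print(f"{len(mismatches)} discrepancies found:")
    print(mismatches[keys + ["cycles_pub", "cycles_dash"]])
```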

Peter Thompson, Chief Executive of the HFEA, said:

“We’re delighted that the HFEA dashboard has won this year’s Trustworthiness, Quality & Value award. Our dashboard, which we believe to be the first of its kind in the world, is designed to provide impartial information in an easy-to-use format to help inform the many difficult decisions around fertility treatment. This award is a real tribute to the quality of work of our expert team at the HFEA and a recognition of the huge interest in UK fertility data. The HFEA will continue to build on the work recognised by this award and enable the public, clinicians and researchers to access the data we collect. Thank you to the Office for Statistics Regulation and the Royal Statistical Society for the award.”

Quotes from users:

Prof Adam Balen, consultant in reproductive medicine at Leeds Teaching Hospitals NHS Trust, said:

“The new dashboard enables researchers to access data for study and patients to access information to better inform them on their fertility journey and thereby demystify some of the complexities behind the statistics of treatment outcomes.”

Other useful background

User testing was performed with patients, researchers, clinicians, staff and stakeholder organisations to ensure the dashboard could satisfy the various needs of key audiences. Improvements were made based on feedback, including the addition of information notes, FAQs on the landing page, and the production of a two-minute animated explainer video.

The finalised dashboard also includes a link to a feedback form to ensure the HFEA continues to receive input from users. Accessibility was reviewed by an external company, informing updates that included improvements to keyboard-only navigation and the development of dynamic alt-text.

The dashboard has been covered in numerous media publications, with a focus on its user-friendly design and improved data transparency. Recent feedback from one patient described using the dashboard as a trusted source after meetings with clinics, to check whether the success rates quoted were consistent with the values displayed on the HFEA’s dashboard.

Related Links

Using independent reviews to inform assurances around quality

This is a case study for Principle Q3: Assured quality.

HMRC took an in-depth look at its approach to quality after it found a significant error in its Corporation Tax statistics. HMRC’s action in inviting an independent review of its data quality management approach is an example of best practice for other statistics producers.

Corporation Tax is levied on the taxable profits of companies and makes up about 9% of HMRC’s total tax receipts. HMRC’s Corporation Tax statistics provide annual data on receipts and liabilities, obtained from an administrative data source.  

HMRC found an error in its published Corporation Tax receipts statistics in 2019, which led to substantial revisions across these statistics, affecting the period from April 2011 to July 2019. HMRC carried out a range of activities to assess this issue and identify mitigating actions to improve the quality of its statistics, including inviting OSR to carry out a review of the principles and processes underpinning the quality of HMRC’s official statistics. OSR carried out this review of HMRC’s quality management approach and published its findings in April 2020.

The review primarily focused on source data quality and the importance of statistics producers understanding the nature and quality of the data they work with. The review report contained nine recommendations, which included producing process maps of end-to-end processes, developing Reproducible Analytical Pipelines, and reducing HMRC’s suite of publications.  

The expectation is that these changes will improve the quality of HMRC’s statistics going forward, for example by making sure analysts fully understand the data they are working with. All nine recommendations were welcomed by HMRC’s senior leaders, and a programme of work has been designed to implement them in 2021/22 and beyond. This work has already increased HMRC’s level of assurance around the quality of its data and led to further improvements.

In asking OSR to carry out an independent review, HMRC showed a proactive and open approach to strengthening data quality. The review, and HMRC’s response to it, emphasise the importance of having analytical leaders who transparently campaign for and support changes and innovations that can enhance the quality of statistics. This provides an example for other producers who might wish to inform their own assurances around the quality of the data or statistics that they produce through periodic or systematic independent reviews.

Improving quality assurance and its communication to aid user interpretation

This is a case study for Principle Q3: Assured quality.

The Department for Work and Pensions (DWP) publishes statistics on new National Insurance number (NINo) registrations to adult overseas nationals on a quarterly basis. In 2017, following an assessment by OSR, the statistics had their National Statistics designation suspended. Since then, DWP have implemented a range of improvements, leading to re-designation of the statistics in November 2020. One of the areas of greatest improvement is their quality assurance processes and how they communicate these procedures to users.

The team have worked hard to review their quality assurance processes and apply guidance from OSR’s Quality Assurance of Administrative Data (QAAD) toolkit, as discussed in their Quality report. Applying guidance from QAAD supports good practice in monitoring the quality of data over time and in identifying data quality issues, whilst the Quality report itself demonstrates transparency about the quality assurance approach taken.

More recently, the production team have focused on harnessing aspects of Reproducible Analytical Pipelines (RAP). DWP have a separate dedicated team focused on introducing RAP across the department. Working closely with them, the production team – recognising the benefits for quality assurance and for streamlining and automating the production of the publication – have started to adopt many of the RAP principles. This has also involved team members upskilling and developing their programming skills. These are DWP’s first statistics to be produced using RAP, and the team continues to develop and further integrate RAP principles into its production processes.
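
To give a flavour of what adopting RAP principles can look like in practice (this is a generic sketch with invented file and column names, not DWP’s actual pipeline), each production stage becomes a re-runnable, testable function so the publication can be reproduced end to end from the source data with one command:

```python
# Generic RAP-style sketch; file and column names are hypothetical.
import pandas as pd

def load_registrations(path: str) -> pd.DataFrame:
    """Read the raw NINo registrations extract."""
    return pd.read_csv(path)

def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Basic automated QA: fail fast if the data breaks expectations."""
    assert df["registrations"].ge(0).all(), "negative counts found"
    assert not df.duplicated(subset=["quarter", "nationality"]).any()
    return df

def summarise(df: pd.DataFrame) -> pd.DataFrame:
    """Aggregate to the published quarterly series."""
    return df.groupby("quarter", as_index=False)["registrations"].sum()

if __name__ == "__main__":
    tables = summarise(validate(load_registrations("nino_extract.csv")))
    tables.to_csv("publication_tables.csv", index=False)
```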

To further explain the quality assurance processes, the administrative data, and the strengths and limitations of the statistics, an informative background and methodology document is available for users. This includes illustrative figures of the data journey – showing how different data sources fit together – and flow diagrams documenting the steps in the quality assurance process. The inclusion of these helpful diagrams, alongside descriptions and further detail within the text, offers users a clear explanation of the strengths and limitations of the data and enables an understanding of the underlying data and processes.

This case study shows how DWP has made improvements to how it assures the quality of its NINo statistics and how it communicates this assurance transparently to aid users in their appropriate interpretation. The enhanced quality assurance processes, and their clear communication through new documentation, reflect a positive and engaged approach to enhancing the quality of the statistics and their overall public value.

Ensuring source data is appropriate for intended uses

This is a case study for Principle Q1: Suitable data sources.

Legal aid statistics for England and Wales are published quarterly by the Ministry of Justice (MoJ) and draw on a range of administrative data sources from the Legal Aid Agency (LAA), an executive agency of the MoJ. Legal aid statistics were first published independently as Official Statistics in 2013 and were awarded National Statistics status in 2016.

Legal aid is a complex area, and the statistics report on a variety of criminal and civil legal aid schemes, including police station attendance and civil representation. The statistics provide an extensive evidence base on the legal aid system, but the constraints of using administrative data from LAA systems mean that there are some things they do not measure precisely, or at all. To enable user understanding, MoJ publishes a comprehensive Guide to Legal Aid Statistics in England and Wales. The user guide includes considerable detail about the operational context in which the data are recorded, and case studies showing the types of cases where legal aid would be granted and how these would appear in the statistics.

The guide also provides a summary of the team’s professional judgments around the robustness of each data source and, more generally, a clear steer on the sorts of comparisons that the overall statistics allow (e.g. volume and expenditure levels by scheme) or do not permit (e.g. the number of clients or the precise geographic distribution of legal aid clients). The individual data sources used are further detailed in a separate ‘index of legal aid data’. The index and user guide both include a flow diagram which presents the data sources for each of the legal aid schemes.

Many legal aid data sources are subject to minor revisions within each quarterly update from new information being included, or previous information being amended, on the underlying systems. These revisions are clearly flagged in the quarterly statistics.  

The legal aid statistics team were embedded in the LAA until recent years and maintain close links with LAA colleagues, including those responsible for the management and supply of the administrative datasets. These relationships help provide additional insight into the detail of the data sources used and any changes to these. A recent example of this was when a new provider contract for telephone advice services led to a discontinuation of a published time series on costs. These changes were explained by LAA colleagues and subsequently reported in the statistical series. 

There have been numerous other enhancements to the statistics over time, which are also clearly documented in the user guide timeline, and which have continued to improve the comparability and transparency of the data sources used to produce legal aid statistics. 

This example shows how the legal aid statistics team within MoJ ensure that the LAA data they draw on are appropriate for statistical purposes: they have a thorough understanding of the operational context within which the administrative source data are collected, and they maintain close links with LAA data suppliers. It also shows the considerable lengths that the statisticians go to in explaining the relative strengths and limitations of the various data sources used, to ensure the appropriate interpretation of the official statistics, including explaining the impact of changes or revisions to data sources and administrative systems over time.

Developing harmonised national indicators of loneliness

This is a case study for Principle Q2: Sound methods.

In 2018, in response to the manifesto published by the Jo Cox Commission on Loneliness, the Prime Minister called loneliness “one of the greatest public health challenges of our time”. A consistent approach is therefore needed to measure how loneliness affects people’s lives and who is more susceptible to it. The Prime Minister tasked the Office for National Statistics (ONS) with developing the evidence base and national indicators of loneliness, suitable for use on major studies, to inform future policy in England.

The harmonisation of the new loneliness indicators was important for enabling more surveys to measure loneliness in the same way, in order to build a better evidence base more quickly. This is needed to enable a better understanding of what factors are most associated with loneliness, what the effects of loneliness are for different people, and how it can be prevented or alleviated. As this is a devolved matter, ONS took this work forward for England, with scope for future work to harmonise across the Devolved Administrations.

In December 2018, following consultations with key stakeholders and experts, and extensive collaboration with the ONS Quality of Life team, the GSS Harmonisation Team published the Harmonised Principles for measuring loneliness. The principles can be used to measure loneliness using any survey or administrative data source, which ensures a consistent approach can be adopted across major studies to inform future policy in England.

After identifying the need for indicators across all ages, the GSS Harmonisation Team agreed upon two sets of indicator questions and one direct loneliness question. The first set of four indicator questions is recommended for use with adults, while an alternatively worded set is recommended for use with children. The questions were tested and then used in several established surveys using different survey modes, including paper self-completion (English Longitudinal Study of Ageing), online self-completion (Community Life Survey, Good Childhood Index Survey), and telephone interview (Opinions Survey).

All four questions are also due to be adopted on further surveys, and the direct loneliness question is due to be included on additional studies.

Given the important link between health and loneliness, there is also ongoing work with various agencies including Public Health England, NHS England and NHS Digital to include the loneliness measures in key surveys, such as the Health Survey for England. Work is also ongoing to continue harmonisation of the loneliness indicators across the GSS, including consultation with the Devolved Administrations.

This example shows how the GSS Harmonisation Team has worked effectively with statistics producers across government and experts in loneliness measurement, to develop consistent methods for measuring loneliness in both adults and children. These measures can then be adopted in a comparable way across major studies to help inform effective government policy responses in this area of current public debate.

Developing and refining UK House Price Index methods

This is a case study for Principle Q2: Sound methods.

The UK House Price Index (UK HPI) has been published since June 2016 and is produced by HM Land Registry in partnership with the Office for National Statistics (ONS), Registers of Scotland and Land and Property Services Northern Ireland (referred to as HM Land Registry and partners).

The method used to produce the UK HPI was originally published in Development of a single Official House Price Index, which set out the rationale for the approach, the data sources used and how the method complied with international standards. It also considered users’ questions raised during an earlier methods consultation and in a peer review conducted by the Government Statistical Service Methodology Advisory Committee.

Each month, the UK HPI presents a first estimate of average house prices in the UK based on the sales transactions data available for the latest reference period. The first estimate is then updated in subsequent months as more sales transaction data become available for inclusion in the calculation.
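
A worked example with invented numbers may help to illustrate what a revision to the annual change looks like once late-registered transactions are added to the calculation:

```python
# Illustrative numbers only: how the first estimate of annual change to
# average house prices can be revised as late-registered sales arrive.
first_estimate_avg_price = 230_000   # average price at first publication
later_estimate_avg_price = 232_500   # same month, after more registrations
price_year_earlier = 220_000         # final average price a year earlier

first_change = (first_estimate_avg_price / price_year_earlier - 1) * 100
later_change = (later_estimate_avg_price / price_year_earlier - 1) * 100

print(f"First estimate of annual change:   {first_change:.1f}%")
print(f"Revised estimate of annual change: {later_change:.1f}%")
print(f"Revision: {later_change - first_change:+.1f} percentage points")
```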

In March 2017, there was a large increase in the magnitude of revisions between the first and subsequent estimates of annual change to average house prices. This negatively affected some users’ confidence in the UK HPI, as they were unable to understand or explain house price trends using the first estimate with certainty. After investigating, ONS established that the revisions were being driven by volatility in new build property prices, compounded by an operational backlog in HM Land Registry registering new build sales transactions.

HM Land Registry and partners took steps to improve the methods by changing the calculation for the first estimate to reduce its sensitivity to the impact of new build transactions. The approach was developed by GSS methodologists, and several options were tested before a final one was chosen.

HM Land Registry and partners communicated the method change to users prior to its implementation through the About the UK HPI section of the UK HPI release, a blog, and later produced an enhanced Quality and Methodology report which includes details of the impact of the changes and supporting analysis. Details about the HM Land Registry operational backlog have also been included in Section 4.4 of About the UK HPI, with a reference to HM Land Registry’s speed of service and its future plans, which present information about average completion times for new build registrations.

As a result, the scale of revisions to the first estimate of UK HPI annual change to average house prices has been reduced and is more stable over time. HM Land Registry and partners and UK HPI users are now more assured that delays in processing new build registrations are not adversely affecting the robustness of the UK HPI first estimates.

HM Land Registry and partners also compare the UK HPI with other, non-official house price indices to identify and explain any differences between the series, and publish their analyses in an annual article, Comparing house price indices in the UK.

This example shows how HM Land Registry and partners have transparently developed UK HPI’s methods by collaborating with relevant experts during their development, informed users in advance about methods changes with clear reasons and explanations of their impact, and published supporting information that helpfully sets out the rationale behind their various decisions.

Assured quality in the Mental Health Act annual statistics

This is a case study for Principle Q3: Assured quality.

The NHS Digital Mental Health Act annual statistics bulletin contains official statistics about uses of the Mental Health Act in England.

In 2015 NHS Digital announced changes to the way it sources and produces these statistics. Previously these statistics were produced from the KP90 aggregate data collection; they are now produced from the Mental Health Services Data Set (MHSDS). This transition to a new data source delivered cost savings as well as forming part of a programme of work to improve data quality. MHSDS provides a much richer data source for these statistics, allowing new insights into uses of the Act.

For the October 2017 release, NHS Digital published the annual statistics using the new data source and also produced a background data quality report that clearly communicates this assurance to users. The document highlighted the improvements to the data, methods and source, and provided information on data relevance, reliability, coherence, timeliness and clarity. NHS Digital included detailed information on missing data, which it published and was using to identify the most efficient way to increase coverage. The report also includes a helpful section on the trade-offs between output quality components.

The report is a good example of what to include in a background quality document that accompanies a statistics bulletin and its data. The overall quality improvements ensure that the Mental Health Act annual statistics are fit for their intended use.

Publishing information about data quality assurance processes

This is a case study for Principle Q3: Assured quality.

The Consumer Price Index including Owner Occupiers’ Housing Costs (CPIH) is published monthly by the Office for National Statistics (ONS) in its UK Consumer Price Inflation bulletin.

ONS publishes information about the quality of the Valuation Office Agency (VOA) private rents data, which are used to estimate owner occupiers’ housing costs, a key component of the inflation measure.

ONS communicates clearly with VOA to understand the quality assurance of these data. ONS is currently looking into gaining access to the private rents microdata, using the powers granted through the Digital Economy Act 2017. This is expected to help ONS further understand data quality issues.

In addition, ONS has developed several comparative analyses to provide assurance to itself and to users about the behaviour of CPIH:

  • One analysis compared different methods of estimating owner occupiers’ housing (OOH) costs
  • Another analysis compared the CPIH private rents data with other data sources

By publishing clear and detailed information about data quality assurance and embedding quality assurance practices in its production process, ONS provides reassurances to itself and users about the quality of the data used to produce CPIH.

Working effectively with contractors to deliver a survey

This is a case study for Principle Q1: Suitable data sources.

The Scottish Crime and Justice Survey (SCJS) is an annual, large-scale, continuous survey carried out by the Scottish Government that measures adults’ experiences and perceptions of crime in Scotland.

The Scottish Government appointed a consortium of two contractors, Ipsos MORI and ScotCen, to jointly deliver the SCJS from 2016/17 under a single contract. Fieldwork is shared across both organisations, while Ipsos MORI focuses on content and questionnaire development and ScotCen is responsible for data processing, from data cleaning through to delivery of the final data sets.

The Scottish Government maintains a close and effective working relationship with the contractor consortium to manage and deliver the survey:

  • It established appropriate safeguards by specifying the roles, data requirements, delivery arrangements and communication channels between the survey team in Scottish Government and the contractor consortium
  • When setting up the contract, the statistical team modernised the data coding and processing arrangements for the ‘offence codes’ used to classify incidents, to ensure a smooth transition to the new contractors

The Scottish Government carried out a range of activities to decide on the survey design, including meeting with users and potential contractors. It also produces a detailed technical report with information about the survey design and delivery, including the sample design and selection and the survey response.

This approach gives the Scottish Government and users of the statistics confidence in the suitability and quality of the data sources and data collection process.