National Statistician’s Guidance – Management Information and Official Statistics


Better data, statistics and analysis present a radical opportunity for better decision making, both in government and society generally. More data is available now than ever before, in richer and more complex forms. Some of this data forms the basis for official statistics, which are produced to professional standards in compliance with the Code of Practice for Statistics. At the same time, within government there is an increasing appetite to use unpublished management information to help shape policy, operational and management decisions.

The Government Statistical Service is well placed to help unlock the power of data. As well as our important role in producing official statistics, government statisticians can help improve the quality of management information, and provide advice to policy makers on how to use and interpret such data.

Sometimes, unpublished management information will feed into published statistics. It is important that departments handle such data with care so that no action is taken which might damage public trust in the official statistics.

This guidance sets out best practice advice as to how departments should treat management information in order to get maximum value from it while protecting public trust in official statistics.

John Pullinger, National Statistician
17 June 2016

This guidance supersedes the following documents: (1) National Statistician’s Guidance, Use of Management Information, published in February 2011 (2) National Statistician’s Guidance, Identifying Official Statistics, published in February 2010 and updated in December 2012 (3) Authority Statement, Management information and research data as Official Statistics, published in July 2009 and updated in September 2010 and March 2011

Key points

  1. Unlock the value of data. Government statisticians should help organisations to unlock the power of data for policy, operational and managerial purposes. This includes facilitating the use of unpublished management information that will subsequently form the basis of official statistics.
  2. Use proportionate safeguards. Where unpublished management information feeds into official statistics, it should be handled with care. A range of safeguards can be used to reduce the risk of a breach of the Code of Practice. More stringent safeguards are required when data is very close to, or identical to, the final official statistics (see paragraph 5.3).
  3. Understand the difference between management information and official statistics. Decisions on whether data should be treated as official statistics should be taken in consultation with the Head of Profession for Statistics, and should take into consideration the principles set out in this document (see Box 2).
  4. Ensure equality of access. Public statements should not be made based on unpublished management information that feeds into official statistics. If this happens inadvertently, don’t try to cover it up – seek advice from the Head of Profession or from the UK Statistics Authority straight away (see paragraph 5.7). Selective release of favourable data must be avoided.
  5. Publish more data. Publishing data in an orderly way is one of the best ways to ensure equality of access and provide wider public value. The more data that is published routinely, the less likely it will be that your organisation will be compelled to make an unplanned release under the Freedom of Information Act.

Scope and definitions

3.1 Government bodies collect numerical information, carry out research, and make estimates of various kinds in order to run their own businesses, and will often wish to make these numbers public. Such data when made public can sometimes be described as official statistics, and in these cases the Code of Practice for Statistics applies.

3.2 The following terms describe some different types of numerical information.

  • i. Administrative data refers to information collected primarily for administrative reasons (i.e. not initially for statistics or research). This type of data is collected by government departments and other organisations for uses such as registration, transactions and record-keeping, usually as a by-product of delivering a service. Administrative data is often used for operational purposes and its statistical use is usually secondary.
  • ii. Management information describes aggregate information collated and used in the normal course of business to inform operational delivery, policy development or the management of organisational performance. It is usually based on administrative data but can also be a product of survey data. The terms administrative data and management information are sometimes used interchangeably.
  • iii. Official statistics. These are statistics published by a Crown body, or a body listed within an Official Statistics Order. They are sometimes based on administrative data, but also can be based on survey data. Official statistics fall under the remit of the Code of Practice for Statistics.
  • iv. National Statistics are a subset of official statistics which have been independently assessed by the UK Statistics Authority’s regulatory function and have been found to be compliant with the Code. The National Statistics label can only be granted or removed by the Authority. The National Statistics label is a mark that the statistics reach the highest standards of trustworthiness, quality and value.

3.3 Administrative data and management information sometimes feed into official statistics. Sometimes management information is made public in its own right, and this data may or may not subsequently feed into official statistics. Crime statistics are a good example of this.

Box 1

Police recorded crime statistics Crime records begin at the local level as administrative data. They are then collated as management information and used by the police, ministers and officials to inform operations. The records are published regularly to a public website as management information. They are also aggregated and quality assured to create official statistics. These sources have different purposes. The official statistics are used by the public, the media and policy makers to understand trends in crime. The monthly management information is used by ministers and senior officials to identify emerging crime threats that require an immediate policy response. The website data is used by the public to map crime and identify hotspots, and by academia in developing models of crime. The data published there does not cover all crimes and is not in the quality assured form that is published as official statistics.

3.4 This raises the question of which sets of published aggregate management information should, in the public interest, be treated as official statistics, and which may reasonably continue to be produced without full compliance with all aspects of the Code. Decisions on whether or not a particular set of data should be treated as official statistics should be taken in consultation with the Head of Profession for Statistics and should take into consideration the principles in Box 2.

3.5 Not all cases will be clear-cut. There may be some cases in which the Statistics Authority will conclude that data that has not previously been published as official statistics should in future be treated as such, and that, in the interests of maintaining public confidence, the full requirements of the Code should be observed. The Statistics Authority may sometimes take a different view from the organisation that produced the figures, and in some circumstances may consider them to be official statistics, in line with the recommendations of the Bean review.

Box 2

Identifying official statistics Under the Statistics Act, official statistics should comply with the Code of Practice for Statistics and fall within the remit of the Office for Statistics Regulation. While the Act identifies bodies that can produce official statistics in Section 6, it does not set out which data should be classed as official statistics. This remains a matter for judgement. Decisions should be made on a case-by-case basis by the relevant Head of Profession for Statistics (or Lead Official) or based on their advice. Producers should refer to the GSS guidance document “Labelling Official Statistics” when making such decisions.

The guidance sets out eight practical considerations to assess. While the first two are mandatory, it is not necessary for all of the others to be met. This is not a “pass or fail” checklist, and the importance placed on each consideration will depend on the context. Heads of Profession can bring in other relevant considerations that have a bearing on the trustworthiness, quality and value of the statistics.

  1. MANDATORY: The organisation that produced the statistics is within a Crown body or named on an Official Statistics Order, and so the numbers that it produces are within the scope of the Statistics Act.
  2. MANDATORY: The statistics are put in the public domain through a regular output or ad hoc release.
  3. Materiality: The data are used publicly in support of major decisions on policy, resource allocation or other topics of public interest.
  4. Public interest: The data have a high public profile, attract controversy or debate when published, and/or public debate would be better informed if they were classified as official statistics.
  5. Methods used: The data are collected and results compiled using widely accepted statistical methods that are appropriate for the intended use.
  6. Independence: The data are produced by appropriately skilled analysts who are free from conflicts of interest, including political and commercial pressures.
  7. Quality: Data inputs and statistical outputs are of sufficient quality to support the intended uses.
  8. Representative: The statistics are broadly representative of the population that they are designed to measure. Please refer to the guidance document for more detailed information.

3.6 Official statistics are also subject to pre-release access rules set out in secondary legislation. Pre-release access is the practice of making official statistics “in their final form” available in advance of their publication to specific individuals not involved in their production. This concept is more straightforward to define in cases where official statistics are produced from surveys controlled by statisticians, or when statistical techniques are used to combine data sources.

3.7 In cases where official statistics are derived directly from management information, the management information may sometimes differ little, if at all, from the final statistics. There may be concern that sharing the management information in the run up to publication is contrary to pre-release access rules. However, it is appropriate to continue to make use of management information in the normal way, including sharing data between government departments where this is necessary. The important constraint is to ensure that there is no public use of unpublished management information which could undermine the official statistics and thus breach the Code.

Box 3

Definition of ‘Final Form’ The definition of final form official statistics in this context is the statistics and commentary which have been subjected to quality assurance and have been signed off by the responsible statistician as being ready for publication.

3.8 This guidance does not extend to the specific rules governing pre-release access to statistics in their final form.

General principles

4.1 In framing this guidance, we have concluded that four high level principles should be kept in mind.

  • i. Maximum value should be made of information held by the public sector, through its use to inform policy and operational decision making, to help direct economic and commercial activities, and to inform wider public debate.
  • ii. Equality of access. To the greatest extent possible, there should be equality of access to the data from which any public statements are based.
  • iii. Transparency is central to building trust in statistical information generally, and should guide decisions about the use and release of data and statistics.
  • iv. Integrity of official statistics. Not all situations will be clear cut and sometimes the above principles may come into conflict. The Head of Profession for Statistics should make a judgement about how to balance these principles to ensure that no action is taken that might undermine confidence in the independence of related official statistics when released.

Maximum value

4.2 As the availability of near real-time management information has increased, so has the appetite to use it to inform decision making. Government statisticians are there to help organisations make full use of management information for policy, operational and managerial purposes. This includes management information that will form the basis of official statistics in advance of their release, provided that conditions of use are in place to preclude the public use of unpublished management information (see paragraph 5.3), and the statistics are not in their final form for release.

4.3 When making important decisions, the best available data should be used. Sometimes this will be the quality assured official statistics, but other times management information can provide additional insights. Caution is needed when interpreting management information as it is often incomplete, not always quality assured and not necessarily fully representative. Government statisticians can help here by:

  • i. working to improve the quality of the data derived from administrative systems; and
  • ii. providing expert analysis, helping to interpret meaning and insight from the data.

4.4 Organisations should employ the skills of professional statisticians to improve the information derived from administrative systems and to achieve maximum value at all stages of data use.

Equality of access

4.5 To the greatest extent possible, there should be equality of access to the data from which any public statements are based. Public statements should ideally draw on the latest published official statistics. They should not be based on unpublished management information that subsequently feeds into official statistics. Where there is a significant reason to use more up-to-date management information in a public statement, that data should be published before or at the same time as the public statement.

4.6 Where unpublished management information covers similar ground to a subsequent official statistics release, and it is deemed necessary to make a public statement based on that management information, then advice should be sought from the Head of Profession for Statistics on how to mitigate the risk of pre-empting or compromising the official statistics. The Head of Profession will need to be satisfied that such use is justified, and that this will not undermine the planned official statistics release or broader public trust in official statistics. Selective release of favourable data is not conducive to public trust in official statistics.

4.7 Where unpublished management information is the basis for an official statistics release, and there is a pressing need to publish that management information prior to the next scheduled release of the official statistics, then the Head of Profession should review the publication schedule to see whether the official statistics might be brought forward. If this is not feasible then an ad hoc statistical release can be pre-announced and published. If exemptions to the Code are needed, the UK Statistics Authority’s regulatory function will need to be informed and is always available to provide advice in cases of doubt.

4.8 Public statements which reveal unpublished data ahead of the publication of related official statistics can damage public trust in statistics. Depending on the materiality of the statement, appropriate responses might include publishing a statement about the circumstances, or publishing an ad hoc statistical release (see paragraph 5.7).


4.9 Increasingly, in line with the principles of Open Data, departments are making management information publicly available. Publishing data in an orderly way is one of the best ways to ensure equality of access and provide wider public value.

You should:

  • i. consult the Head of Profession about whether to treat the data as official statistics;
  • ii. use the skills of professional statisticians to improve the information derived from management and administrative systems;
  • iii. follow your organisation’s processes for protecting information, for example relating to the Data Protection Act and consulting the relevant Information Asset Owner if necessary; and
  • iv. apply statistical disclosure control methods as required.

4.10 If the management information does not feed into official statistics, the Code of Practice does not formally apply. Nevertheless, voluntary adherence to some elements of the Code is advised. We recommend that the Head of Profession is asked to provide advice on those aspects of the Code that are most relevant to the situation – for example, that trust in the data would be enhanced if it were supported by information about methods and quality. The Head of Profession should also consider whether the need to publish the data might be an indicator of its importance, and hence whether it should be published as official statistics.

4.11 The Office for Statistics Regulation has published a draft guide to voluntary compliance with the Code of Practice. This sets out how organisations that are not formally bound by the Code can still adopt its principles and practices in order to promote trustworthiness, quality and enhance the public value of information. We recommend that departments consider adopting this guidance when publishing information that is not covered formally by the Code.

4.12 The more data that an organisation is able to publish from its administrative or management sources in the form of official statistics, the less likely it is that the organisation will be compelled to make an unplanned release under the Freedom of Information Act.

Scenarios and case studies

5.1 The complexity of different arrangements in place across government makes it difficult to lay down a comprehensive set of rules for all situations. Data owners, analysts and users must often make on-the-spot judgements about propriety and the correct course of action. To assist these judgements, the principles described in the previous section are expanded in a series of scenarios below, illustrated by real-life case studies.

Internal use of management information

5.2 Management information systems are rarely owned or controlled by statisticians, so ensuring best practice is followed will often require coordination and cooperation between various parts of an organisation.

5.3 Where unpublished management information feeds into official statistics, it should be handled with care. A range of safeguards can be used to reduce the risk of a breach of the Code of Practice. Risk based judgements should be made, in consultation with the Head of Profession, about the appropriate level of safeguards required. More stringent safeguards are required when data is very close to, or identical to, the final official statistics.

Such safeguards might include the following.

  • i. Limit access on a need-to-know basis. This should include only those who have a legitimate need to use the data for policy, managerial, operational or other appropriate decision-making purposes in advance of the official statistics publication.
  • ii. Clearly mark data as sensitive.
  • iii. Ensure those with access understand their responsibilities under the Code and give a clear undertaking not to place such data in the public domain.
  • iv. Keep records of who has access to such data. Named individuals are preferable, but if very large numbers of people have access to administrative systems then it may only be possible to record which groups of people have access. For particularly sensitive data, named individuals should be a requirement.
  • v. Apply clear ‘conditions of use’ on access.

5.4 Users of unpublished management information that feeds into official statistics must:

  • i. abide by the ‘conditions of use’ attached to the data. This also applies to any people outside the owner organisation (for example staff in other organisations who may be collaborating on an initiative that spans several organisations);
  • ii. avoid ad hoc or selective comments on, or reporting of, unpublished data; and
  • iii. avoid making any public statement that prejudges or pre-empts the contents of any subsequent statistical release.

Box 4

Case Study: Business, Innovation and Skills – Apprenticeships Official statistics on apprenticeship starts are produced by the Department for Business, Innovation and Skills (BIS). The source of the data is a BIS administrative system designed to support funding and operational functions. A considerable number of people in the Skills Funding Agency (SFA) and BIS have operational access to the dataset. Given the high profile nature of the apprenticeship programme, it is subject to a range of formal boards and groups within government. BIS support these through managing the cascade of information, analysis, performance measures and dashboards.

BIS supply data on apprenticeship starts to the Earn or Learn Taskforce, chaired by the Minister for the Cabinet Office. The taskforce's meetings are treated as operational, and BIS ensure they have the best data available to make decisions. In line with this guidance, BIS share management information containing unpublished numbers that are considered to be either "incomplete and not quality assured" or "not fully representative". This data may give an indication of direction of travel and may occasionally be very near the final official statistics.

When January's monthly management information was supplied, the figures were exactly the same as what statisticians expected the 'final form' official statistics number to be. Following advice from the National Statistician's office, it was agreed that this data, while close to the 'final form' number, was not the same as the final form official statistics release, which would be subject to formal pre-release access (see definition of final form at Box 3).

Box 5

Case Study: Treasury – Public Sector Finances A market sensitive official statistics publication on Public Sector Finances is produced jointly by ONS and HM Treasury. This draws together a wide variety of administrative data sources from across the public sector including HMRC tax data and HM Treasury data from cash monitoring systems. In total about 50 people have access to the administrative datasets within HMRC and HM Treasury. About 30 people have access to compiled management information circulated by the Treasury, including Treasury ministers, senior management and people from other departments.

The main uses of the data are as follows:

  • Source data: Tax teams and senior managers monitor progress on tax receipts, sometimes on a daily basis or on key days that affect forecast judgements. The Treasury and Debt Management Office (DMO) use the data to ensure best value for money decisions are made around the daily financing of activities.
  • Management information: Senior Treasury ministers and staff from the Office for Budget Responsibility (OBR) and DMO monitor progress against fiscal targets and use the data as a real-time indicator of the economy and the government's cash position.
  • Official statistics: Media, commentators and the government use the statistics to monitor the general fiscal position, and the gilt market uses them to assess future supply and demand for gilts.

Some of the data is highly provisional and can be a long way from the final published statistics. Later in the monthly cycle the data is closer to the final statistics, though it can still be subject to considerable change once compiled by ONS.

As the source data and management information underpin market sensitive releases, there is a pressing need to protect the data and to ensure that no public statements are made based on it.

The main mitigations are to:

  • ensure that conditions of use are signed by all recipients of the management information, including ministers and those from external organisations;
  • ensure that all recipients are aware of the restrictions placed on the use of their data; and
  • tightly control circulation to an expected schedule so that ministers and senior stakeholders know when to expect news and do not seek it through other sources.

Box 6

Case Study: Scottish Government – A&E Performance Data

Operational data on waiting times is collected in the administrative systems of hospitals across NHS Scotland and is used for management purposes by hospital managers. NHS Scotland collates this information, and sends weekly aggregated data to the Scottish Government on a defined set of indicators, which are used for discussions with ministers and for management discussions between the Scottish Government and NHS Scotland Boards and Hospitals. Weekly and monthly official statistics are published by NHS National Services Scotland (NSS) based on the same data source.

The exchange of data between NHS Scotland and Scottish Government is critical for the proper management and control of the Scottish health system. One consequence is that those who receive the data are broadly aware of what the official statistics are likely to say. To ensure the official statistics are not undermined, there is agreement that Ministers and officials only pro-actively comment in public on data presented in official statistics. They can however respond to management information published by others – such as individual Health Boards.

When to publish management information

(a) Proactive

5.5 Any public statement should ideally use official statistics, if available. But if someone who has access to internal management information wants to use it in a speech or statement, an assessment needs to be made as to whether it is in the public interest for this information to be used ahead of the scheduled publication of official statistics. In these circumstances the Head of Profession should be asked to advise on options – and the data should be published according to one of the methods set out at paragraph 5.11 below.

5.6 In any case, care should be taken to avoid selective publication of favourable data, or any action that might encourage such a perception. In order to mitigate the need to publish ad hoc data, departments should endeavour to publish official statistics in accordance with a timetable which strikes an appropriate balance between timeliness of release, users’ needs, resource availability, and fitness-for-purpose. Repeated ad hoc releases of similar content should lead to the consideration of a routine official statistics publication.

Box 7

Case Study: Ofsted – Inspection outcomes Shortly after Ofsted moved to a new framework for inspecting schools, which involved the introduction of short inspections, there was public concern about the small number of inspections that had taken place and how the new framework was bedding in.

Ofsted publishes termly statistics about the outcomes of inspections of maintained schools and some monthly management information. But Her Majesty’s Chief Inspector wanted to make a public statement about short inspections and refer to a more up-to-date figure of the number of short inspections that had been completed. The latest published management information showed 207 inspections whereas over 300 had taken place at the point the Chief Inspector requested the data. Details of all of these inspections had already been put in the public domain in the individual inspection reports, but the published data had not caught up with the latest position.

Prior to the Chief Inspector making his speech, a special release of management information was issued. This ensured equality of access to all. While there was already information in the public domain on each of the 300+ inspections, there was no readily aggregated number available.

Box 8

Case Study: Ministry of Justice – Prime Minister’s Speech on Prison Reform

Data on the applications and admissions of mothers and babies into specialised prison units is held on administrative systems. The data can be accessed by all National Offender Management Service (NOMS) staff who have been given access to the Data Performance Hub and signed the terms of use.

In February 2016, Number 10 wished to use the numbers in a speech the Prime Minister was making on prison reform. If the PM had used unpublished data, there was a risk that the figures would have been unverified and that correct release practices would not have been followed. It was decided to issue a one-off ad hoc release so that the data would be in the public domain at the time of the speech.

One difficulty was ensuring the figures were of sufficient quality. The ad hoc release was required at very short notice in order to be published before the speech. Given the short deadline for validating the data, some additional measures that were available were not included because of quality concerns.

Statisticians liaised with Private Office officials to understand more about what the PM wanted to say. Through this action they were able to find a compromise that addressed the issue the PM wanted to raise while ensuring the speech accurately reflected the data.

There was significant public interest in the weeks after the release and speech. This has led to consideration as to whether the management information should be published regularly.

(b) Reactive

5.7 Inadvertent public statements based on unpublished management information that subsequently feeds into official statistics can sometimes occur, despite best efforts to avoid this. Such statements can damage public trust in statistics. Don’t try to cover it up – seek advice from the Head of Profession or from the UK Statistics Authority straight away.

Depending on the materiality of the statement, appropriate responses might include:

  • i. publish a note confirming the source of the data and any relevant context; or
  • ii. publish an ad hoc release.

5.8 It might be appropriate for an audit or investigation to take place to identify whether any additional safeguards should be put in place.

5.9 It may be appropriate to invite statisticians to review the text of speeches or statements to check that the data contained within them is both accurate and appropriate for the public domain. In some departments statisticians are asked to check briefing packs for Select Committee appearances. Despite these measures, if a significant piece of unpublished numerical information is disclosed during a hearing, statisticians should be ready to respond appropriately.

Box 9

Case Study: Select Committee

While giving evidence to a parliamentary select committee, a senior civil servant who was briefed on internal, unpublished, management information inadvertently made a statement based on that information.

The Chair of the select committee subsequently asked a parliamentary question to seek context for these figures, and the relevant Minister confirmed that the information had been derived from management information, which would be included in the next quarterly publication. The Chair of the select committee then wrote to the Chair of the UK Statistics Authority, complaining that it was difficult for the select committee to hold the Government to account if unpublished data was used in evidence.

Following an investigation, the Chair of the Statistics Authority replied publicly, stating that in this case the relevant Head of Profession for Statistics had not been involved in compiling or checking the briefing given to the senior civil servant. This was contrary to the department’s usual practice, where the relevant analyst was usually asked to check and sign off the accuracy of the statistical and management information to be used in briefings for select committee hearings. The Chair expressed regret that the department’s usual practice had not been followed and suggested that the corresponding official statistics could be published on a more frequent basis to help avoid the situation of officials quoting unpublished data.

How to publish management information

5.10 If the data is considered fit for purpose and it is decided to release it, then regardless of whether the data will be considered official statistics, we expect the core principles of the Code to be followed: the data should be accessible, data quality and limitations should be explained, and there should be clear separation between the published data and any policy or political message.

5.11 Broadly speaking there are four ways that management information can be published and these are set out below. It is important that releases are clearly labelled as management information, official statistics or National Statistics, as appropriate.

(a) An ad hoc management information release

Box 10

Case Study: Ministry of Justice – Legal Aid Exceptional Case Funding

A Legal Aid Agency (LAA) team of around 10 caseworkers processes and determines applications for Exceptional Case Funding (ECF) for cases beyond the normal scope of the legal aid system. Managers and legal advisers keep a close watch on the management information because the scheme is relatively new and is still being tested and challenged in the courts.

There is a quarterly statistical release, usually published three months after the end of the quarter to which it relates, allowing two months for applications to be processed and determined, and three weeks for the statistical bulletin to be compiled.

It was necessary to provide more up-to-date data for a court case. After obtaining guidance from the National Statistician’s office, the Head of Profession decided to pre-announce and publish an ad hoc release of early figures, clearly labelled as management information, with caveats about differences from the official statistics. This ensured equality of access while still complying with the deadline set by the court order.

(b) A regular release of management information

Box 11

Case Study: Ofsted – Regular release of inspection outcomes data

Ofsted publishes termly official statistics on the outcomes of school inspections (inspection judgements). It also publishes each inspection report. In the past, the official statistics were the only published aggregated figures on inspection outcomes. However, advances in technology mean that it is now possible to scrape the individual inspection judgements from Ofsted’s website and aggregate the data. Other organisations do this, and at least one non-government organisation publishes its own aggregated data and infographics for school inspection judgements, updating them twice each working day. The statistics it presents differ from the official statistics in both coverage and methodology. Nevertheless, some users, including the media, have referred to this data rather than the official statistics, possibly because more up-to-date information was perceived to be better.

This indicated a user need for more timely data. The Head of Profession considered whether this need could be met by publishing frequent management information, or whether this would risk pre-empting the official statistics, as both rely on the same data set.

Here, timeliness is a user-driven quality measure that takes some precedence over accuracy. Ofsted now publishes management information, which has some bias, every month, making the issues with data quality clear to users. Management information is not held back, even if it pre-empts the official statistics. Ofsted is also looking to update its Data View infographic tool monthly, in order to meet user needs, and will explore the potential for publishing daily updates to the tool, again making issues with data quality clear to users.

(c) Amending the release schedule of planned official statistics

Box 12

Case Study: Student loan applications

Information about the processing of student loan applications in England is published in an annual National Statistics release, but in 2009 there were significant problems with the processing of applications, which attracted considerable press interest. Because of the number of enquiries the Student Loans Company was receiving, and following a discussion with the Head of Profession at the parent department (BIS), it decided to introduce a series of interim official statistics publications on the processing of student loan applications in England, published every two weeks. Increasing the frequency of publication meant that the press office could simply point to the website when asked for information, which ensured a consistent and transparent response.

The series started in October 2009 and finished in February 2010. Similar arrangements were made for the next three academic years, until it was decided that there was no longer demand for a fortnightly or monthly release as applications processing was working smoothly and there was no longer as much public interest.

(d) Ad hoc official statistics release

Box 13

Case Study: DWP Personal Independence Payment

The Department for Work and Pensions brought forward official statistics on Personal Independence Payment cases ahead of a Work and Pensions Select Committee meeting. Had it not done so, the committee would have been discussing official figures that were two months old and presented a different picture of performance – and the departmental representatives might also have been at risk of accidentally mentioning unpublished data. The lead analyst for that area wrote to the Head of Profession seeking permission to release the figures early in an ad hoc release, as withholding them could otherwise prejudice the debate. Only data considered relevant to the committee session was published. The Head of Profession agreed and the information was released four days prior to the Select Committee hearing, instead of two months later as would otherwise have been the case.

Box 14

Case Study: Ministry of Justice – Statistics on public disorder

From 6 August 2011 to 8 August 2011 there were outbreaks of public disorder which began in London and spread to other cities in England. During this period and its aftermath, the Ministry of Justice (MoJ) received daily returns, taken from court registers, from the individual courts that were dealing with defendants identified as being involved in the public disorder.

The courts data met the needs of Her Majesty’s Courts and Tribunals Service (HMCTS), the National Offender Management Service (NOMS) and other criminal justice agencies for demand planning purposes. Work to link this data to the prison population data helped support the planning process, ensuring that there was sufficient capacity within the prison estate to deal with those involved in the public disorder.

There was significant public interest in timely data on the processing of defendants. The need for the data to be released quickly outweighed the benefit of waiting for the additional accuracy and commentary from the National Statistics release.

MoJ worked quickly to publish twice weekly releases of limited analysis of the outcome of the initial court hearings. More comprehensive bulletins were released later covering sentencing outcomes, prison population and previous offending histories of those involved.

The National Statistics on court outcomes are based on completed cases, whereas for the purpose of assessing the response to the disturbances it was important to publish information on initial outcomes at the magistrates’ courts, even where cases were still ongoing. Waiting until all the cases had completed would have resulted in a lag of several months before any information was released to the public, missing the opportunity to illustrate the ongoing response of the criminal justice system to the incidents.