Select Committee on Merits of Statutory Instruments Thirty-Fourth Report


The work of the committee in session 2007-08

3.  The Merits Committee is the only Parliamentary committee that takes a policy-based overview of the quality and effectiveness of the hundreds of statutory instruments (SIs) produced by Government every year, instruments which directly affect all corners and sectors of society. At the end of each Parliamentary session the Committee reviews the material we have seen in order to help Ministers and Departments continue to drive up the standard of statutory instruments. The Appendix continues the series of statistics about our scrutiny of statutory instruments for session 2007-08, setting out the number of instruments that we have seen and the number of times and basis on which we have reported.

FLAWED INSTRUMENTS

4.  Disappointingly, we continue to find that Departments need to check instruments more carefully before laying them. This is clearly shown by the number of corrections or redrafts that have been laid. Chart 1 indicates that 3.5% of negative instruments were corrected, up from last session's 2.5%. Affirmative SIs do not appear in the chart as they are laid in draft and can be withdrawn and re-laid up until the debate is held: this facility appears to have been exploited rather more than usual this session[1]. We have also seen a number of SIs which correct or withdraw legislation a day or two before it is due to take effect because Departmental plans have gone awry (e.g. SI 2007/3149 and SI 2008/2677[2], both of which were urgent amendments to correct a misunderstanding about instructions between the policy official and the drafting solicitor). Such last-minute changes cause confusion in the law and should have been picked up in the clearance process before the original instrument left the Department.

LEVELS OF DEPARTMENTAL ENGAGEMENT IN EXPLANATORY MEMORANDA (EMS)

5.  EMs matter, to Parliament and to the public, as they explain what the law actually means. We continue to observe that changes in key members of staff or levels of Ministerial interest can have a noticeable effect on the quality of EMs produced. Some Departments have shown significant improvement over the year (e.g. the Food Standards Agency), aspects of some Departments' performance have slipped back (MoJ, DCLG) and some Departments show a wide variation in the quality of the EM depending on which division produced it (DWP, Home Office[3]). As a result, we again recommend that senior staff should be more involved in quality control and achieving consistent standards.

QUALITY OF EMS

6.  Standards of EMs generally are improving and a few have been excellent[4], which the Committee has acknowledged. But we are frequently frustrated by an EM's failure to explain, in plain English, what the instrument does and why. Drafters of EMs often assume too much specialist knowledge or lapse into acronyms which obscure an otherwise clear explanation. Three particular weaknesses stood out this session:

  • consultation analysis - the Cabinet Office recently revised its Consultation Code, and a prominent feature of the public comments on the proposals for revision was concern that the results of consultation should be available when the final version of the SI is laid[5]. Departments should also note a recent case in the Court of Appeal where an instrument was quashed due to the Department's failure to undertake the required consultation and impact analysis at the proper time[6]. A clear explanation is particularly important when stakeholders' objections are being overruled[7], and we still receive too many EMs that just cross-refer to a website or say that the analysis will be made available later. This is not good enough. We cannot properly consider an instrument without all the necessary information being available. We are grateful to the Office for Public Sector Information (OPSI) for revising its guidance to those writing SIs so that it now states unequivocally that all necessary documents to support an instrument (that is, the consultation analysis, Impact Assessment and any other associated codes or guidance) should be available at the time the SI is laid.
  • why new legislation is required - although EMs are now better at explaining what the instrument does, Departments frequently fail to produce an adequate explanation of why change is required[8]. This is particularly true in relation to European policy changes (that may not have previously been before the House), or where significant public sector change is being made. In response to our Inquiry Follow-up report, OPSI have added a new monitoring and review section to the EM format that explicitly requires a statement of what the success criteria for the policy will be and when they will be assessed. This is crucial. The Committee cannot be satisfied that an SI is likely to achieve its policy objective if that objective is not clearly stated.
  • evidence for assumptions made - the Committee needs guidance on the size of the problem and what its impact will be to enable us to consider if the measures proposed seem proportionate to the stated policy objective. A good Impact Assessment (IA) reassures us that the proposal has been properly thought through and costed, and that the wider implications have been assessed, including what harm would arise if no action were taken. The Committee's ability to scrutinise regulations has been enhanced by the inclusion, from April 2008, of public sector legislation in the requirement to provide an IA. This new information enables us to question, for instance, when Social Security benefits are changed, whether the consequential costs to organisations like the Citizens Advice Bureaux have been fully taken into account. In the past we have had to delay an instrument while further information is sought or make critical comments in our report[9]. We often find that Departments do have the information, but officials preparing SI documents are sometimes so closely involved in the policy that they cannot take a step back and carefully set out the basic principles and evidence base that others need to understand the policy and how the SI is intended to implement it.

7.  We have been pleased to see an increase in the contributions that we have received from the public this year. These alert us to areas of concern where they believe that the instrument may not work as intended, or may have unforeseen consequences. The Committee considers all such contributions carefully, but we only publish such material when we feel it will add significant value to the House's consideration of the instrument.

TIMING

8.  In our first four sessions the pattern of laying instruments was similar, with discernible peaks in March, July and December. This session has shown a more rolling profile than the established pattern (see Chart 2) and the March peak was much reduced (170 SIs this year, down from 222 in 2007), although it still appeared to include a number that could have been scheduled for a different time. We are monitoring this new profile with interest to see if it is a deliberate change, reflecting a greater awareness by Departments of the need to plan SIs properly.

9.  At the time of our Inquiry Follow-up report[10] several Departments sent us forward plans for their secondary legislation relating either to the next few months or to the implementation of a particular Act, but this practice has not been sustained. It is a practice we would like to see grow, not only for our own purposes but also for the benefit of those who will be required to obey or enforce these new obligations. We regard allowing enough time for the legislation to be disseminated and applied as an important element in its effective implementation. We have also been concerned about the Government's failure to take into account the cumulative effect of several new sets of regulations landing on a particular sector at the same time. Accordingly we are currently looking at SIs applying to schools, to examine this effect in detail, and we hope to study other sectors in the New Year.

CONCLUSION

10.  We welcome the improvements made to SI procedure this session. We are grateful that the Government have issued revised guidance that is helpful to our scrutiny role but will also aid transparency and accountability. These changes will take a while to bed in, but we look forward to seeing the improvements materialise in the next session. However, progress will be limited if certain Departments' clearance processes are not significantly strengthened, and if the drafters of the key supporting documents for SIs do not explain their policy in simple terms and provide strong evidence to support the chosen course of action.

Statistical Appendix

11.  We met 33 times in session 2007-08 and published 34 reports on a total of 1154 instruments (189 affirmative and 965 negative). This is slightly fewer than the total of 1179 instruments considered in 2006-07 but we note that at 16% the proportion of affirmatives was rather higher than in previous sessions (13% in 2006-07 and 11% in 2005-06).

12.  We drew 20 affirmative instruments and 34 negative instruments to the special attention of the House: a reporting rate of 10.6% for affirmative instruments and 3.5% for negative instruments. Of the negative instruments which we reported, 11 were debated or otherwise engaged with by the House: an engagement rate of 33%. (The figures for 2006-07 were: 23 affirmative instruments and 39 negative instruments: a reporting rate of 15% for affirmative instruments and 4% for negative instruments. Of the negative instruments which we reported, 15 were debated or otherwise engaged with by the House: an engagement rate of 38%.) We have held no oral evidence sessions on individual instruments this year, but have welcomed the increase in the number of written submissions that we have received from the public, which have helped to broaden our perspective on a number of instruments.

13.  Using our terms of reference, which are set out in full on the inside cover of all our reports, we have drawn 54 instruments (that is, 4.7% of the total number considered) to the special attention of the House this session.

14.  This is broadly consistent with last session, in which we drew just over 5% to the special attention of the House. As in the last session, the majority of SIs reported have shown flaws, either through a lack of evidence to support the assertions made in the supporting documentation or through insufficient explanation of how the policy will work in practice.

15.  In deciding which instruments to draw to the special attention of the House, we have continued to limit our reports only to those on which we believe the House may wish to take action. In contrast, our use of short paragraphs under the heading "other instruments of interest" has grown. We use this device to alert members to instruments that appear to pursue their stated policy objective accurately, but may be of topical interest. Members have told us that they find this a useful service.

CHARTS

16.  The charts on the following pages cover the period from November 2007 to November 2008:

Chart 1 - Number of correcting instruments laid (the number of negative instruments reprinted free of charge as a result of errors)

Chart 2 - Number of instruments laid by week

Chart 3 - Number of instruments laid by month

Chart 4 - Number of instruments laid each calendar year since 2004.




1   Both through policy flaws, e.g. the draft Local Authorities (Alcohol Disorder Zones) Designation Regulations - see our 8th and 18th reports of this session; and procedural flaws, e.g. the draft Proceeds of Crime Act 2002 (Investigative Powers of Prosecutors in England, Wales and Northern Ireland: Code of Practice) Order 2008: these SIs were both laid 3 times before being debated.

2   SI 2007/3149 Prison (Amendment No. 2) Rules 2007 urgently revoked SI 2007/2954.

SI 2008/2677 National Health Service (Directions by Strategic Health Authorities to Primary Care Trusts Regarding Arrangements for Involvement) (No 2) Regulations 2008 urgently revoked SI 2008/2496.

3   Home Office: the draft Immigration and Nationality (Fees) (Amendment) Order 2007 gave very little information whereas the draft Immigration, Asylum and Nationality Act 2006 (Duty to Share Information and Disclosure of Information for Security Purposes) Order 2008 gave strong persuasive evidence to support the proposed change. DWP: SI 2008/2424 Social Security (Miscellaneous Amendments) (No. 4) Regulations 2008 were criticised in our 28th report of this session; the draft Social Security (Lone Parents and Miscellaneous Amendments) Regulations 2008 were complimented in our 30th report on the support material provided.

4   DCLG laid seven structural change orders under the Local Government and Public Involvement in Health Act 2007: SIs 2008/490-494, 634 and 907. The accompanying EMs gave very thorough accounts of the policy background.

5   http://www.berr.gov.uk/files/file44374.pdf - see particularly paragraphs 4.45-4.49.

6   Secure Training Centre (Amendment) Rules 2007 (SI 2007/1709), reported in the Times on 14 October 2008.

7   A good example is the recent General Medical Council (Constitution) Order 2008 (SI 2008/2554); poor ones include the Education (Local Education Authority Performance Targets) (England) (Amendment) Regulations 2007 - see our 2nd report of this session with correspondence from DCSF apologising for the omission of consultation information from the EM; and the Town and Country Planning (Mayor of London) Order 2008 (SI 2008/580) - see our 16th Report (HL 85) which published additional information about consultation provided by DCLG to supplement the EM.

8   As an example of poor practice see the European Qualifications (Health and Social Care Professions) Regulations 2007 (SI 2007/3101) - our 4th Report of this session published additional information obtained from DH about the background to the decision to bring forward Regulations; or the General Ophthalmic Services Contracts Regulations 2008 (SI 2008/1185) and 3 related instruments - our 20th Report of this session published additional information needed to make the policy clear.

9   For example, the Education (Student Support) (No. 2) Regulations 2008 (SI 2008/1582) - see our 26th Report which published additional information obtained from DIUS about the practical implementation of these Regulations.

10   The Management of Secondary Legislation: follow-up, 13th Report session 2007-08, which included plans from MOD and DWP. We also received forward plans from DCLG, FSA and the Home Office.


 

© Parliamentary copyright 2008