The role of Regional Schools Commissioners


Impact so far—by levels of activity

87.The DfE told the Committee that it was “too early to assess the longer term impact that the RSCs are having on educational outcomes in their regions”, but that “they are introducing new and different ways of working that support increased collaboration and self-regulation of the system”. The focus of RSCs has been on “the sustained growth of the academies market and taking swift and targeted action where it is needed most”. The Department provided details on levels of activity as follows:

Table 1: RSC activities 1 September 2014–1 August 2015

Measures reported: pre-warning and warning notices issued; academies and free schools moved from a trust or sponsor; converter and sponsored academies opened; sponsor applications (approvals); free schools opened.

Sponsor applications (approvals) by region:

East Midlands & the Humber           17 (15)
Lancashire & West Yorkshire          24 (22)
East England & North East London     20 (23)
North                                 9 (8)
South Central & North West London    17 (15)
South East & South London            17 (16)
South West                           19 (10)
West Midlands                        16 (11)
Total                               139 (120)


Source: Department for Education (RSC 28) Figures 2–5

Key Performance Indicators

88.The performance of the RSCs is monitored through a set of Key Performance Indicators (KPIs). These were first published in the newspaper Schools Week following a Freedom of Information request made in December 2014, but are still not available on the Government website. The DfE confirmed for us that the KPIs are:155

(1)The percentage of academies, free schools, UTCs and studio schools below the floor standard, broken down by number of years below the floor. [These schools must have been open at least a year, and alternative provision and special schools are not included.]

(2)The percentage of academies, free schools, UTCs and studio schools in the Ofsted inadequate category, broken down by length of time. [Alternative provision and special schools are included.]

(3)The percentage of

i)schools that are academies or free schools. [UTCs and studio schools are not included as RSCs do not have a role in opening these types of provision.]

ii)eligible schools issued with an academy order, where in this case an ‘eligible’ school is defined as one: that is not already an academy, free school, UTC or studio school; that is not below the floor; and that is not in the Ofsted inadequate category.

(4)The number and percentage of academies below the floor or in the Ofsted inadequate category within the first two years of opening.

(5)The percentage of local authority areas in the region where more schools require a sponsor than there are sponsors available.

(6)The percentage change in sponsor attainment rating. [This rating is calculated using a combination of metrics relating to the performance of the schools managed by the sponsor.]

(7)The percentage of approved sponsors that are active (i.e. that are sponsoring one or more academies).

(8)The number of free schools and percentage of high quality free schools, UTCs and studio schools in the region. [This includes the approval rate, the attrition rate, the percentage of good and outstanding reports after 1st term visits and 3rd term visits, and the percentage of good and outstanding Ofsted inspections (1st inspection only included in this KPI).]
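Purely as an illustration, the eligibility test in KPI 3(ii) amounts to a simple predicate over three conditions. The sketch below is hypothetical: the field names and school records are invented for the example and are not taken from any DfE system.

```python
# Illustrative sketch of the KPI 3(ii) eligibility test. Field names
# ("type", "below_floor", "ofsted_inadequate") are hypothetical.

def is_eligible_for_academy_order(school: dict) -> bool:
    """A school counts as 'eligible' under KPI 3(ii) only if it is not
    already an academy, free school, UTC or studio school; is not below
    the floor standard; and is not in the Ofsted inadequate category."""
    already_converted = school["type"] in {
        "academy", "free school", "UTC", "studio school",
    }
    return (not already_converted
            and not school["below_floor"]
            and not school["ofsted_inadequate"])

# Invented example records, one per case:
schools = [
    {"type": "maintained", "below_floor": False, "ofsted_inadequate": False},
    {"type": "academy",    "below_floor": False, "ofsted_inadequate": False},
    {"type": "maintained", "below_floor": True,  "ofsted_inadequate": False},
]
eligible = [s for s in schools if is_eligible_for_academy_order(s)]
# KPI 3(ii) would then report academy orders issued as a share of the
# eligible schools; here only the first record qualifies.
```

On this reading, a school below the floor is excluded from the KPI 3(ii) denominator because it falls under the separate intervention measures in KPIs 1 and 2.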

89.Pank Patel said that “We may not always be able to meet some of those key performance indicators for various reasons that are completely beyond our control. I do not think we should lose our jobs as a result of it. However I do think that we should be held to account through these key performance indicators”.156

90.The National Association of Headteachers (NAHT) argued that KPI 3(i)—the percentage of schools in the area that are academies or free schools—risked an assumption of academisation and a conflict of interest for RSCs, leading sometimes to “an element of collusion” between RSCs and LAs, with “schools feeling almost bullied into academisation”. Russell Hobby told us that this performance metric was “absolutely the wrong one” and that “it damages the credibility of RSCs”, since “academisation is a means to an end, not an end in itself”.157 Robert Hill told us that this KPI was “crude and inappropriate”.158 The Association of Teachers and Lecturers also noted that it would be “wholly inappropriate” if the RSCs were to begin to judge whether a coasting school’s improvement programme was sufficient to avoid academy conversion, given their potential interest in the outcome.159

91.Lord Nash acknowledged that this KPI “may be inappropriate”,160 and the Government has promised to review the KPIs for RSCs. This work is expected to be completed early in 2016, and will take account of objections to using the number of academies as a performance indicator.161

92.The Government’s review of Key Performance Indicators for RSCs should ensure that the KPIs do not prejudice the decisions made on academisation and changes of sponsor. In particular, we recommend that KPI 3(i), relating to the proportion of schools that are academies, should be removed on the grounds that it constitutes a conflict of interest.

Impact as measured by existing KPIs

93.We asked the DfE to provide data on the RSCs’ performance against the current KPIs. The Department explained that “as each region has its own distinct characteristics, they are not used to compare one region to another. Instead they are compared to the baseline data for the individual region. The baseline data is taken from the beginning of September 2014, when the RSCs came into post”.162

94.Despite extensive discussion of the KPIs during the evidence sessions for our inquiry, and references from Lord Nash, Frank Green and the RSCs themselves to the Department using these KPIs to hold the Commissioners to account,163 the DfE struggled to provide information on RSC performance against these KPIs, taking five weeks to respond to our request.164 The information that we were provided with shortly before publication of this report relates to KPIs 1–4 and 8, and is given in the tables in Appendix 1. We were told that the Department did not have current data in relation to KPIs 5–7.

95.The tables show that all of the RSCs are making some progress against KPI 1—the proportion of academies that are below the floor standard. The information provided also reveals an academisation target level of 28% for most of the regions, which has been exceeded in some cases. KPI 4 relates to the proportion of sponsored and converter academies, opened with the RSC’s involvement, which have moved into the Ofsted inadequate category and/or fallen below the floor standards within their first 12 months to two years of being open. In the North, the figure for sponsored academies is 31%, and in the East Midlands and the Humber 6% of converter academies are in this position.

96.It is troubling that the DfE struggled to provide us with data on the performance of RSCs, given that KPIs were referred to throughout our inquiry and the Department’s written evidence. In particular, the lack of data for KPIs 5–7 undermines the Department’s claim that the impact of RSCs is being monitored and that RSCs are being held to account internally. The Government should produce an annual report on the work of RSCs, showing each RSC’s performance against all of their (revised) KPIs and their targets, and should undertake to publish regular performance monitoring information online as it becomes available. This is an important part of improving the transparency and accountability of RSCs.

How should the impact of RSCs be measured?

97.Lord Nash told us that “It is too early to have clear evidence of impact […] We can see impact in the way you would manage any organisation in terms of their performance, but in terms of the performance overall of the system, it is obviously going to take a few years”.165 Several other witnesses agreed that one year of operation was too soon to see an impact on the wider system.166

98.Pank Patel described the role of RSCs in terms of “improving outcomes for young people and making sure that the experience they have is a quality one”,167 and several witnesses suggested that the impact of RSCs ought to be measured in a way that reflects this. For instance, ASCL told the Committee that:168

The success or otherwise of the RSC model should be judged on the basis of its effectiveness in improving the quality of academies specifically, and in due course the education service more broadly, in each region.

Similarly, Russell Hobby said that “The rationale behind an RSC is to improve the quality of schooling. The measure of their performance should be the quality of schools within their region, both the proportion of good and outstanding schools, the number of schools above the floor standard and the number of schools above the coasting standard”.169

99.While many agreed with such laudable aims for RSCs, Jon Coles questioned whether the Commissioners had sufficient levers to effect significant school improvement: “If you think about what the drivers of school improvement are, the underpinning thing, number 1, is can you get enough good teachers? But they do not have much control over teacher supply in their regions or teacher professional development or leadership development in their regions […]”170 Martin Post agreed that “it is fair to hold us to account, but it is always important to bear in mind that the people who affect the change in schools are good school leaders, good governors and, in the end, excellent teachers”.171

100.Nevertheless, while the Government’s focus is on tackling underperforming schools through academisation and changes of sponsor, it seems fitting to measure the overall impact of RSCs in terms of school improvement.

101.The impact of RSCs should be considered in terms of the improvement in young people’s education and outcomes, rather than merely the volume of structural changes introduced or other levels of activity. This approach would mirror the way in which the effectiveness of local authorities is measured, for example by the number of children attending Good or Outstanding schools, and would increase confidence in the work of RSCs.

155 Department for Education (RSC 42) Annex D

156 Q247

157 Q6

158 Robert Hill (RSC 1) para 11

159 ATL (RSC 37)

160 Q281

161 Qq 274–275

162 Department for Education (RSC 42) Annex D

163 For example Q88 [Dominic Herrington], Q168 [Frank Green].

164 A request was submitted to the DfE Parliamentary Team on 4 December 2015, following Lord Nash’s oral evidence. Partial data was provided on 7 January 2016.

165 Q269

166 For example, Nottinghamshire County Council (RSC 6) para 5.1, The Education Foundation and the Sheffield Institute of Education (RSC 24) para 2.9.

167 Q200

168 ASCL (RSC 29) para 29

169 Q6

170 Q44

171 Q77

© Parliamentary copyright 2015

Prepared 18 January 2016