Children, Schools and Families Committee


Examination of Witnesses (Questions 320-339)

CHRISTINE GILBERT CBE AND MIRIAM ROSEN

6 MAY 2009

  Q320  Derek Twigg: How many do you plan to do this term?

  Miriam Rosen: We are still considering the exact numbers for this term.

  Q321  Mr Slaughter: Do you think this issue of the time between inspections is important? I would have thought that it is fairly fundamental to determining the role of Ofsted. I suspect that, like the tide going in and out, you will move one way until everyone raises their hands in horror that you are inspecting too infrequently, and then move back the other way until you get to the point where people think you are doing it too frequently, and that things will go on like that by the year or by the decade. Five years seems such a long period of time, and not just for primary schools. If you have an inner-city area with very high pupil and teacher mobility, schools are sometimes quite fragile, and the reputation or the actuality could change, or there could be a bad head teacher appointment. Inspecting such schools every five years almost makes Ofsted irrelevant to the process of monitoring whether a school is working.

  Christine Gilbert: The approach that we adopted essentially links right back to where we started this morning—to the nature of inspection, bringing different inspectorates together and the notion that inspections should be proportionate. Although we are talking about once in five years, that is only for schools that were last judged good or outstanding and where the indicators that we have on them suggest that they are still performing well. We are suggesting a number of hard indicators—test results, attendance and so on. We also believe that perception is really important, and we want perception survey results from pupils and parents to be taken into account. When I look at a number of schools, parental dissatisfaction with a school—you have outlined some of the reasons, and things do change quickly—often emerges as an indicator before you start to see changes in exam results or test results. We are looking at a number of indicators that will give us a feel for what is going on in the school. Nevertheless, they will never give us the accuracy that full inspection will. We are aware of the dangers. Even in the reduced tariff inspections that we have been doing, which have essentially involved less time—again, we are looking at schools that were good or outstanding last time and at the indicators and so on—the indicators are right in slightly over 90% of those schools, but they are not right in just under 10%. So there will be a margin of error, but in terms of adopting a system of proportionate inspection, we think that going to five years, rather than using the reduced tariff, will be more effective. At the same time, the focus on satisfactory schools is more intense, as I said earlier. If their capacity to improve does not seem secure, we will go back more frequently, and special measures and notice to improve will continue as now. It is simply a proportionate approach, and the thinking is that good and outstanding schools do not need inspection to the same degree as satisfactory or failing schools.

  Q322  Mr Slaughter: Do you see this as a further withdrawal of the inspectorate from hands-on involvement with schools overall? That seems to be the current trend. As far as parents are concerned, knowing that there is an independent process is the most reassuring thing, because you can get reputation wrong. As you say, reputation can be an early warning sign, but equally there can be reputational lags, with schools having bad reputations that they no longer deserve. Even test results may not be as reassuring to parents as the feeling that somebody with expertise has gone in, looked at the school and given it the okay. Do you feel that you are withdrawing from that process?

  Christine Gilbert: We feel that the system that we are establishing will use the voice of parents as a failsafe to bring the inspection forward. The concern is how we hear the voice of parents and how we get them to fill in questionnaires and so on to tell us about the school. If parents begin to understand that they are filling in a questionnaire to express dissatisfaction—I do not mean just a one-year dip, but an emerging trend—they will use the questionnaires more and tell us more, so that will have more validity. Otherwise, it has been hard to see how we could establish a proportionate system of school inspection. It is not right to keep inspecting every school in the same way all the time. The other thing we will do is continue with our survey inspections. Even now, we will go on a survey inspection if we pick up things that concern us, and that could also trigger a fuller inspection of the school.

  Q323  Paul Holmes: I want to go back to what you said about Key Stage 2. Head teachers are balloting on boycotting the tests; the teaching profession and many parents have long been opposed to them. You are saying that they are essential to what you do. Have you not evaluated what other countries do? The previous Education Committee went to New Zealand and saw how it tests a random sample of 4 to 5% of the kids in each school each year. Because it is small and random, the schools don't teach to the test—they can't. In Ofsted reports in this country, you have criticised schools for teaching to the test, but New Zealand has a different way of doing it that is less disruptive.

  Christine Gilbert: A number of countries have different ways of doing it. Interestingly, a number of countries—I do not know about New Zealand—are moving to the system that we seem to be dismantling. The thing that parents tell me—I hear this really strongly—is that they want to be clear about where their child is at a key phase of education. So, there was no outcry about Key Stage 3. The parents who speak to me—and they spoke to me during our discussions about inspection—feel very strongly that we need some clarity at 11 about where their child is. They do not want the school and the curriculum distorted. They do not want months and months of preparation for these tests, but they want some clarity at a key phase of education. That is what I have experienced and what I have heard.

  Q324  Paul Holmes: But in Ontario, which does very well in the PISA—Programme for International Student Assessment—studies, internal school-based tests are used by local government inspectors and not by a national organisation. They go in and say to schools, "You are not doing well enough." Again, there are no league tables or teaching to the test—it is a whole different system—so there are different and effective ways of doing things.

  Christine Gilbert: I did say that it depends on what gets put in its place. At the moment, we are basing a lot on the results from Key Stage 2. They are nothing like all—but they are an important element—of the judgements that we make about which schools we select and whether a school is dipping or improving its performance.

  Chairman: Thank you. We are moving on to inspectors.

  Q325  Annette Brooke: The Chairman has mentioned the differences between HMI and the regional inspection service providers. I need to take that a bit further, because we heard a number of comments in our previous evidence-taking sessions that expressed concern about inspectors not being experts in the phase of education that they are inspecting. I have asked you about nursery education in the past but it applies equally to primary and secondary. I would also like more detail on special educational needs. Will you expand on what you told us initially just to cover those points?

  Christine Gilbert: I do think that HMI are generally well respected, and there is a long tradition of respect for them. They have the quality assurance role that I was talking about earlier, but the additional inspectors are also good inspectors. If they are not, work goes on with a contractor to remove such inspectors and so on. We therefore have fairly secure—I am not saying that they could not be improved—systems of quality assurance. There is a clear requirement for the contractors to provide training. I did not go into detail—Miriam might want to do so—about the ways in which we talk to the contractors. There is a national board and so on, and a number of things are organised, so we set requirements for training. For example, when community cohesion was introduced to the framework, it was incumbent on us to train our own inspectors. Also, all the inspectors involved in school inspections were trained. The same is true of HMI. I am not saying that the people who inspect schools inspect only in their particular area. You may well get somebody who was primary-trained involved in a secondary inspection and vice versa. They would have had to want to do that and they would have been trained and supported in doing it. I said earlier that I read special measures reports each week. If I do not look at the front cover, I cannot tell you whether the report has been written by an HMI or by an additional inspector—that is the term that we give to inspectors employed by the contractors. We also have a scheme in which we second heads and some deputies. What I am saying is that an HMI doesn't always write better reports than AIs (additional inspectors). I would stick by the brand, but I also think that AIs are good inspectors, and our systems would suggest that.

  Chairman: Miriam, do you want to come in?

  Miriam Rosen: If people are inspecting in two phases, as I think you are suggesting, they will have the necessary expertise and training. We would not put somebody in an area where they were uncomfortable and untrained.

  Q326  Annette Brooke: So the teachers' fears are groundless?

  Miriam Rosen: If something has gone wrong in a particular instance, of course, there is the complaints system. As Christine says, we take that seriously, but we also try to ensure that the inspectors are deployed in a way that fits their training and expertise. We provide top-up training throughout.

  Christine Gilbert: But we also look at the results and evaluate the grades and scores that are given. We had some anxiety a few months ago that non-specialists looking at special needs were making too generous judgements about what they were seeing. We analysed this in some detail and then ran an intensive—I think that it was interactive—training programme devised by Ofsted specialists. Every inspector was to undergo this training. We always look at what we are doing to see if we can improve it. It is the same with community cohesion. We started to look at what is emerging from it and we felt that inspectors—HMI and AIs—could be more specific about some of the things being said, so we introduced additional training for that and so on.

  Q327  Annette Brooke: A point has been made to me very strongly by various special educational needs organisations and representatives that they are concerned that shorter, more infrequent inspections could result in a school's special educational needs aspect not being given enough attention. Clearly, the status of special educational needs within a school can change quite dramatically following a change of key members of staff. There is the query about whether you pick up changes when inspecting less frequently. Should not good special educational needs provision be an absolute requirement for getting a good overall grade?

  Christine Gilbert: I shall ask Miriam to talk in detail about how special educational needs are looked at in a section 5 inspection. We are about to embark on a very large review of special educational needs, and I think that that will be one of the issues that we look at—how to deal with less frequent inspections and so on. However, we feel that the framework that we have devised gives a central role to the evaluation of special educational needs provision in schools. It requires judgements to be made about those areas. We think, therefore, that we have addressed that in the proposals for September. However, as I say, our survey will be very large and extensive and will pick up those issues.

  Miriam Rosen: I agree that the inspection of special educational needs is very important. We are not proposing to move to shorter inspections, so it is not the case that less time will be devoted to special educational needs. At the moment, we look at provision and pupil progress, which involves looking at the teaching and the way in which pupils are assessed—is progress being properly monitored, are they being properly supported and so on? It is correct that with less frequent inspections they will not be looked at as often, but we are not proposing shorter inspections. At the moment, if pupils with special educational needs do not make good progress, the overall effectiveness of the school cannot be good. Our judgement is about the progress that all pupils make. That will be the same as we move forward into the new system, so you will not get a situation whereby the pupils with special educational needs are being badly served, and their provision is inadequate, but the school is judged "good". That could not happen.

  Chairman: John wants to come in on a specific point.

  Q328  Mr Heppell: I want to ask something a bit more specific about deaf children. I have read the briefing from the National Deaf Children's Society. You talked about evaluating grades. Deaf children are 42% less likely to get five A grades at GCSE, including English and maths. I should have thought that that evaluation tells you that there is something wrong there in the first place. You talked about the over-generous marks that inspectors have been giving. An NDCS case study of a school in London, in 2008, said: "'Pupils in the PDC (provision for deaf children) progress well because they are supported by highly experienced staff who ensure that pupils enjoy their work and are fully included in school activities.' However, the unit did not have a teacher in charge who was a qualified teacher of the deaf—or who was even a teacher." It went on: "No evidence was provided to substantiate the claim that deaf pupils were progressing well", and the "acoustics in the classrooms were poor and constitute a hostile listening environment." That may be a one-off, but if you currently have four inspectors who have sensory training—I don't know whether that is from the 200, from HMI inspectors or from the lot in general—isn't that a small number to be doing an evaluation of units in which there are deaf children?

  Christine Gilbert: We do use additional inspectors from the contractors to help us with different specialisms, and so on. One thing that is going to happen, too, from September—this is in terms of special schools—is that we are increasing what we call the tariff. Essentially, we are spending more inspector days in special schools to look at what is going on, so we do build in specialisms where we can.

  Miriam Rosen: We certainly will try to make sure that there are specialists. If there is a particular resource unit in a school, we will try to provide an appropriate specialist. We might not be able to do so all the time, but if there is a unit for deaf children, I would hope that we could provide people who have been specially trained to do that.

  Q329  Mr Heppell: Do you have any figures that you could provide later to show how many units for deaf children were not provided with an inspector who had specific skills in sensory impairment—hearing impairment in this particular case? I am worried about this. In some respects, it is not the quantity, or even the quality, of the inspectors that counts; if the inspector does not have that particular knowledge, they are never going to be able to judge what is necessary. I recognise that there might be some occasions when that happens, because you cannot have a specialist on everything, but there seems to be a rather low number for deaf children.

  Christine Gilbert: We will look at that and get back to you.[10] May I clarify something, though? Was the information read out at the start of your question from an Ofsted report? If it was, I would also be interested in that.

  Chairman: John, what was the origin of your quote?

  Q330  Mr Heppell: It was from the NDCS briefing, and the quote was from an Ofsted report. Apparently, the local authority was aware of the inadequacy of the unit because of a tribunal that was going on that was showing up difficulties in the school. An advisory teacher of the deaf for the local authority had reported that there was not appropriate leadership in the unit and so on, so there were problems with the unit, but it got a good report. I think that the implication is that the person who was doing the inspection may have been a good inspector, but did not understand the special requirements of that particular unit. Can I just ask one further thing very briefly on British Sign Language? I know that you answered a letter just last month, so it is rather early to be asking if there is any progress, but if people were going into a unit or a school where British Sign Language was used, would they either be proficient in British Sign Language or have an interpreter? It seems mad that someone would not have an interpreter on such occasions. The answer you have given seems to be, "We are reviewing that and will get back to you on inspection arrangements for September." Have you made any progress with that?

  Christine Gilbert: I would need to check that. I will get back to you quickly.[11]

  Q331  Chairman: We could do with the full information on that. Miriam, do you know anything about it?

  Miriam Rosen: No, we have not finalised our arrangements for September. We are still looking at that.

  Q332  Chairman: Chief Inspector, isn't this highlighting the problem? It is all right having Ofsted-lite, if I can use that expression, but it has disturbed me that we now have a language that includes a reduced tariff and a health check. If I went to see my doctor, I would not want him to do my health check on the internet. In this area, where we are talking about special needs, everyone tells us—certainly when I visit schools—that they particularly need highly qualified, thoughtful inspectors looking at the SEN provision, and then they say that they would like an HMI. Is it better to have many more HMI, or are you just saving money by having only 100 HMI, and lots of other cheap people from the private sector? Is it a cost saving? Otherwise, why do you not just have 300 or 400 HMI and be done with it?

  Christine Gilbert: The contractual position is helpful because it gives us great flexibility. It gives us flexibility in this area, too. I do think that we need specialisms to do some things, but I also have to say, about the extract that was read out from that report, that you do not need to be a specialist to know that it sounds very strange to have that really positive passage at the beginning and then to say that there is not a specialist teacher in charge.

  Q333  Chairman: Chief Inspector, with great respect, you have not answered the question. Is it a cost saving? Are HMI too expensive?

  Christine Gilbert: No, that absolutely is not the reason for doing this. It has given us much more flexibility in the way that we do things. If we were to employ HMI rather than run these contracts, I do not think that we would be able to manage all the school inspections that we do. The additional inspectors are paid a competitive rate, and some of them are even ex-HMI.

  Q334  Chairman: But do you see our point? People tell us that they prefer an HMI-led inspection, that they would like HMI rather than the people whom you are hiring from these organisations, so it is only fair to ask you why you don't have more HMI.

  Christine Gilbert: We urge the contractors to use head teachers who are then trained as part of the team, and they are good, too. Additional inspectors are good inspectors. They are not second-rate inspectors.

  Chairman: All right. We will perhaps have the organisations in front of us to talk about how they are training. Test data: David and Annette. Who is starting? Annette?

  Q335  Annette Brooke: First, I thank the Chief Inspector for her letter following up the previous meeting when I asked about the correlation between overall Ofsted grading and schools' actual test results. It was quite interesting. I think that 56% of results corresponded to the satisfactory rating, so although it was not a close correlation, there clearly was a connection. I really want to pursue this a bit further, because it seems to me that we can never get away from the fact that the results of the tests are becoming the main criterion for a school's success or otherwise, whether in parents' eyes or Ofsted's eyes, yet we are not really getting the full picture of the school in the round. I really want to tease this out a bit further and ask what you say to people who have not got a balanced view of what is going on in schools because, whatever you say, they are just looking at the headline result figures. We are not really seeing innovation, a balanced curriculum or creativity—all the things that we really want to see in a school so that we know that it is successful.

  Christine Gilbert: We believe that our reports give a much fuller picture of a school—where the school is at, the progress in the school and so on—than just looking at straight test results. You yourself, when you produced your report on testing, recommended that the Ofsted report, for completeness, form part of the profile of things that would be produced and published and so on. We think that our reports give full coverage. That is not to say that they will pick up all of the interesting things going on in a school—that simply could not be replicated in a report. The NFER has done a more longitudinal study of schools and says that, initially, the schools were complaining about an over-reliance on test results and on data. That has absolutely gone. The new system was introduced in September 2005 and gradually, through time, that seems to have eased off and gone. I think that I have said to the Committee before, when I was concerned about people not understanding CVA (contextual value added), that we did a publication for our own inspectors and additional inspectors, but also sent it to all schools, to explain how they use data and so on. So data and test results are important, but they are absolutely far from being the complete picture. We make 30 separate judgements on what we are looking at in schools. Test results are still very important—you do need good results to get jobs or to access the courses that you want to do or should be doing at 16, 18 and so on. I hope that our reports give the broad picture and are not completely data-ridden, which seems to be the gist of what you are saying.

  Q336  Annette Brooke: May I follow that up? We were talking to a group of people, who were all SIPs (school improvement partners), and their evidence suggested that, for the most part, their greatest contribution was helping head teachers with the data. That brings us full circle back to the data. Who is helping with school improvement? Is this true improvement, if they are concentrating on getting the data in the right form for when you come along? Are we in some sort of vicious circle, do you think?

  Christine Gilbert: Until you said what you said at the end there, I thought that the first bit was positive, because helping schools with the data is a real help in terms of the schools understanding where they are and making a really good self-evaluation of their progress and what their needs are. If you look at data properly, you can see all sorts of differences within your school; you are not just comparing yourself with other schools. If the SIPs were doing that, it would be very positive. Just interpreting RAISEonline or CVA for the schools is not a good use of their time. That would not be a sensible thing for them to be doing.

  Chairman: Can we move to David pretty quickly? When I said "quickly", it didn't apply to you, David. We have three sections of questions—two and a half now—to get through before 11.30 am. I have guaranteed the Chief Inspector that we will be finished by then.

  Q337  Mr Chaytor: What is the margin of error on the typical SATs Key Stage 2 test result?

  Christine Gilbert: I don't think I could say with any confidence.

  Q338  Mr Chaytor: Do you accept that there is a margin of error?

  Christine Gilbert: I suppose that there must be. Do you know, Miriam?

  Miriam Rosen: The data package that we use for RAISEonline highlights statistical significance, so we would only say that there is a difference between, let us say, two schools if it were statistically significant. That is pointed out in the data package.

  Q339  Mr Chaytor: So you dismiss out of hand the academics who say that their analysis suggests that 30% of the results are wrong.

  Miriam Rosen: You are talking about the inputs into the test data.

  Mr Chaytor: Yes.

  Miriam Rosen: Well, our data package cannot take that into account. What we do, of course, is supplement that with inspections. We do not look just at test results. That is the whole point of the inspector going into lessons and looking at what pupils are doing, talking to them about their work, seeing whether they understand it, doing work scrutiny and looking at pupils' books over the passage of time to see what progress has been made. That is the whole point of the way in which inspectors triangulate their evidence.


10   See Ev 143-44

11   See Ev 143-44


 