Select Committee on Innovation, Universities, Science and Skills Minutes of Evidence


Examination of Witnesses (Questions 40-59)

MS KAREN DUNNELL AND MR MIKE HUGHES

19 MARCH 2008

  Q40  Dr Gibson: Do you do CO2 and greenhouse gas emissions?

  Ms Dunnell: We do not in ONS, but the GSS does them, yes.

  Q41  Dr Iddon: So things can only get better! Is it possible, using statistics, to measure the quality and output of public services?

  Ms Dunnell: Yes, in many ways. As you probably know, we have a unit in ONS that has been set up to try to measure the productivity of public services, partly because the traditional method in the national accounts is to say that the output of, for example, the NHS is equal to the financial input. A big review was done several years ago which recommended that we should use statistics and research methods much more effectively to look at other ways of doing it. Therefore, a unit has been set up, which has now looked quite thoroughly at education, health, the administration of social security and so on. It is developing this with the departments in question, who also make extensive use of academic colleagues to help them devise much better ways of measuring the output of health or education services and their quality; but it is quite challenging.

  Q42  Dr Iddon: The Statistics Commission has suggested that the emphasis on the use of statistics as performance indicators and targets has politicised your professional area, and has also suggested that we are trying to push the boundaries too far. What would you say to that?

  Ms Dunnell: I certainly do not believe that it has politicised statistics. I think it has led to the development of statistics that may not be national statistics, and the use of statistical information about what is actually happening in public services, which is sometimes quite difficult to interpret—I am not quite sure what your question about boundaries meant, to be honest.

  Q43  Dr Iddon: It is about putting statistics beyond their own capabilities.

  Ms Dunnell: Yes, I think that there possibly are some examples of that. I think much more likely, however, is that administrative systems are set up that produce answers about things, and they are not set up in a proper scientific and statistical way. We would prefer to get involved in the process and set up proper systems to produce statistics that everybody can trust and that are based on proper methodology and codes of practice and so on.

  Q44  Mr Boswell: I think we are familiar with the fact that targets, or statistics, are inevitably used as proxies for an outcome that has happened.

  Ms Dunnell: Yes.

  Q45  Mr Boswell: Do you have any evidence that when a particular statistic or variable is also a target, as a proxy for some kind of quality measurement of a public service, it gets abused? I am thinking of Goodhart's law about a monetary aggregate: once it becomes the objective it breaks down, because people game it. Is there any element of this?

  Ms Dunnell: I think that is part of the danger with it. For example, if you take hospital waiting lists, you can see from a political point of view that it is something that the population truly understands, because it is one of the things that everybody who goes into hospital experiences and it has a great meaning for people. I do think that it is one of those targets where it is possible for people to work extra hard in one sense or another in order to meet a target, and that of course may be the right way to propel different organisational units into higher levels of activity; but it may not be. That is why it is very, very important from a national statistics point of view to make sure that these statistics are produced properly—as Mike was saying, to form some kind of portfolio of statistics about what is going on in hospitals so that people can make a wider judgment about it.

  Q46  Dr Harris: You just said that you thought hospital waiting list statistics were widely understood because everyone experienced waiting lists.

  Ms Dunnell: I am sorry, I did not mean to imply that waiting list statistics were widely understood, but I think that the notion of being on a waiting list is widely understood. It has a lot of political meaning.

  Q47  Dr Harris: If you take someone on a waiting list, what relevance does the number of other people waiting have to how long they themselves wait? They could wait one day while 2 million other people also wait one day. That would be 2 million people waiting for one day, which would show a fantastically high-capacity, excellent service. One hundred thousand people waiting for five years—much lower numbers—a bit of a disaster! The target was the waiting list numbers coming down—or what about maximum waiting time? What relevance is a maximum waiting time for a non-urgent operation if I have an urgent cardiac complaint?
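A minimal sketch of the arithmetic behind this question, using only the hypothetical figures quoted above (they are illustrative, not real NHS data):

    # The size of a waiting list is a poor proxy for how long any individual
    # patient actually waits: compare the two hypothetical scenarios quoted.
    scenarios = {
        "2 million people each waiting 1 day": {"patients": 2_000_000, "wait_days": 1},
        "100,000 people each waiting 5 years": {"patients": 100_000, "wait_days": 5 * 365},
    }

    for name, s in scenarios.items():
        person_days = s["patients"] * s["wait_days"]
        print(f"{name}: list size = {s['patients']:,}, "
              f"wait per patient = {s['wait_days']:,} days, "
              f"total person-days waited = {person_days:,}")
    # The first list is twenty times longer, yet each patient is seen almost
    # immediately; the much shorter second list is a far worse experience.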

  Ms Dunnell: I am sorry, I did not really mean to get into a big debate about it; I am just using it as an illustration that some targets—I am trying to think of one at the moment—

  Q48  Dr Harris: You are not going to find waiting list ones, are you?

  Ms Dunnell: They do not have meaning to the population to whom they apply.

  Q49  Dr Harris: A waiting list is actually probably the worst thing you could think about for an individual patient experience.

  Ms Dunnell: I would probably entirely agree with you. I was just using it as an example of something which at least the public understands. If you have a target—for example, the Treasury will have targets probably about the balance of payments, but not too many people will understand what the figure means or have any understanding. That was the point that I was trying to illustrate.

  Dr Harris: I will come back to this.

  Q50  Dr Iddon: In 2003 the Royal Statistical Society concluded that performance monitoring in public services was poorly conducted, and it called for a number of changes including reporting of measures of uncertainty and of random sampling. Have any changes been made following that report in 2003 and are any projected to happen in the future?

  Mr Hughes: It is not only the RSS that made those sorts of comments; the NAO was saying very similar things, and the Statistics Commission also looked at this. There is now a much stronger engagement of analysts in these sorts of processes. It is a political decision as to what the target should be. Quite often those targets have been in areas where traditionally statistics have not been collected before, so there has been a fairly large learning process about how you can compile statistics from administrative sources. Our traditional mechanisms have been surveys and things of that sort. There is a far higher level of engagement now by analysts across the piece. It may not always be a statistician that is doing it; it could be a researcher. We belong to a large analytical community in Government where jobs are interchangeable at times. The exemplification of this is that certainly with PSAs in the latest 2007 CSR there is a hope that there will be a senior analyst on the boards looking at those indicators to make sure that they do have data integrity.

  Q51  Dr Iddon: Mike, I think the real question is, was that 2003 RSS report relevant to your work and did it help you to see things a little more clearly?

  Mr Hughes: I think it did, Dr Iddon, but not just the RSS one; as I say, there was the Statistics Commission's report, and the NAO was saying very similar things at the same time.

  Q52  Mr Boswell: Can we turn to the user end? I must say that in my limited ministerial career it was useful to have even a rudimentary knowledge of statistics at least once or twice in particular! I know that your official guidelines for measuring statistical quality talk about providing the user with sufficient information to judge whether or not the data are of sufficient quality for their intended use. That is something that you might like to expand on briefly, but are most civil servants, and indeed dare I say Ministers, statistically literate enough to understand the messages your statistics carry; and do you feel it is important that we should encourage them to do that?

  Ms Dunnell: I would have to say that I think statistical literacy generally in the UK—and that applies to civil servants and politicians and most people actually—is very, very low. Nevertheless, our statistics are scrutinised not only by the statistically illiterate; the statistically literate watch everything we do like hawks, and that is why we have a policy for national statistics where measures of quality such as confidence intervals and information about sample sizes, et cetera, are placed on the website and are easily accessible when you are actually using the statistics. We are now trying—we have a goal, without a target, to make sure that when we produce our key statistics we have confidence intervals or some measure around them in press releases and in the publications. A huge number of people still disregard confidence intervals, I have to say.
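For illustration, a minimal sketch (in Python, with invented figures rather than ONS data) of the kind of measure she describes: a 95% confidence interval around a rate estimated from a sample survey, using the standard normal approximation for a proportion:

    import math

    # Invented figures for illustration: an employment rate of 74.5% estimated
    # from a sample of 40,000 respondents.
    estimate = 0.745
    sample_size = 40_000

    # Normal-approximation 95% confidence interval for a proportion.
    standard_error = math.sqrt(estimate * (1 - estimate) / sample_size)
    margin_of_error = 1.96 * standard_error

    print(f"Point estimate: {estimate:.1%}")
    print(f"95% confidence interval: {estimate - margin_of_error:.2%} "
          f"to {estimate + margin_of_error:.2%}")
    # A real survey estimate would also account for the sample design, but the
    # point is the same: publish the uncertainty alongside the headline figure.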

  Q53  Mr Boswell: I think they do. I think you are in two minds on this: professionally you are bound to be cautious because, like a scientist, you are making assertions which are constrained by confidence limits, and so a lot of your users will not necessarily understand what you say, even if you explain it to them; and some of them may have a motive for trying to pick things off the shelf to justify, say, a policy or a target. How are you going to work at getting those messages, fitter for purpose, into the public debate and into the decision-making process? How can we make this a less ill-informed dialogue?

  Ms Dunnell: One of the ways we try to do it when we publish any series is to put it in the context of the past. For example, every month we produce a report on what is happening in the labour market, and every month we include a graph which goes back over a few years, so that you can interpret what are usually tiny changes in employment rates, for example; getting a view on a graph over a period of time does help people to interpret the importance of this month's blip, which is basically a movement around a point in time. We do try to do that. We are doing a lot of work trying to improve the graphs we use and the words we use to describe them. It is by constantly trying to do that that, hopefully, people will get used to it.
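A rough sketch of the presentation described here, assuming NumPy and matplotlib are available; the series below is simulated purely to show the idea of judging the latest month against several years of history, and is not actual labour market data:

    import numpy as np
    import matplotlib.pyplot as plt

    # Simulated monthly employment-rate series over five years: a slow trend
    # plus month-to-month noise, standing in for the real published series.
    rng = np.random.default_rng(0)
    months = np.arange(60)
    rate = 74.0 + 0.01 * months + rng.normal(0, 0.15, size=months.size)

    plt.plot(months, rate, label="employment rate (simulated)")
    plt.scatter(months[-1], rate[-1], color="red", zorder=3, label="latest month")
    plt.xlabel("Month")
    plt.ylabel("Employment rate (%)")
    plt.title("Latest monthly change shown against several years of data")
    plt.legend()
    plt.show()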

  Q54  Dr Harris: Would you say that one justification for the fact that Government Ministers get statistics some time earlier than everyone else is that it gives them an interval to understand them, and therefore not to over-interpret them; and that if they do not use that opportunity there is little merit in their having the statistics in advance?

  Ms Dunnell: I personally believe that any pre-release time should be very short. The proposal at the moment that is being consulted on is 24 hours. From a purely statistical integrity point of view I would prefer there to be no time at all. The notion of pre-release is very much ingrained in our culture, and if we can cut down the time and the number of people that releases go to, we will make significant progress.

  Q55  Dr Harris: I do not blame the Government for this—I would do exactly the same—put a spin on statistics. Is it the case that you use that interval, as the provider of the statistics, to say, "This is what can be said; that is the limit of what can be said", and to give written advice saying, "This is what can be said on the basis of these statistics"? Otherwise, what is the point from your point of view? I know it is not your preference, but given that there is this pre-release period, how are you using it to ensure that you counter unreasonable spin?

  Ms Dunnell: The labour market is quite a good example of that, and we will probably have to adapt it when we get new rules in. Basically, ONS produces the labour market data and has done for ten or so years now, and we produce the monthly release. Then we have a pre-briefing with the statisticians and the economists from departments that are interested in the labour market, which is mainly DWP, Treasury, DBERR and so on. They will come to a briefing with our statisticians, so that everybody can discuss the meaning of the latest trend in the context of the trend over the last year or so, and they all have an opportunity to think about what these latest changes may mean. Then those people go back and do their own briefing to their own Minister. That is something that everybody finds very, very useful at the moment, but it does open up the possibility of (a) leaks and (b) spin. That is why we need to cut the time down.

  Q56  Dr Harris: That was not my question, was it? I was suggesting: why do you not use the time to talk directly to the special advisors and the press officers about what you think could and could not be said about the statistics? Then, if it is ignored, at least you have a record for your integrity—you have made a lot of that—so that you can sleep at night; and if it builds up you can say, "I will resign if this continues". Why do you not do that?

  Ms Dunnell: Because the briefing mechanism we have, in your words, does that, but it is ONS talking to the people in departments who do that.

  Q57  Mr Boswell: It cannot be a negotiation.

  Ms Dunnell: It is not a negotiation. We are trying to get a team of statisticians with a lot of experience of understanding the labour market and policies around the labour market to say, "What do these statistics tell us this month?" Then we get the best advice we can about what the statistics are telling us.

  Q58  Dr Harris: So you do not have any interaction with the people who are doing the spinning and the media briefing and—

  Ms Dunnell: Well, I—

  Q59  Dr Harris: —giving Ministers the line to take; you as statisticians, I mean?

  Ms Dunnell: In my department, yes, I have a relationship with my press office. We have a dialogue about the way that we will present statistics in press releases and so on, and that is exactly the same as the dialogue that people returning from our labour market briefing to their departments will have with their press office.



 

© Parliamentary copyright 2008
Prepared 31 October 2008