

6.28 pm

Mr. David Kidney (Stafford): I was pleased to serve on the Treasury Sub-Committee under the leadership of the right hon. Member for Fareham (Sir P. Lloyd), who was always a wise and good-humoured guide in our deliberations. What he omitted to say about our choice of the Office for National Statistics as the subject for our first inquiry is that the members of the Committee suspected that it would be a fairly straightforward issue--a gentle dipping of our toes into the water. Little did we suspect that the failure of the average earnings index in October would explode into a great national controversy, concentrating the attention, seemingly, of the world on the work of our investigation and on the ONS itself.

In defence of the Government--I ask myself why I should defend them, but I choose to do so--and as regards the timing of the publication of the White Paper, following what happened with the average earnings index there were two investigations into the index, a further review of the efficiency of the ONS and the deliberations that produced our Sub-Committee's report. All of those, including the work of the Select Committee, feature strongly in the White Paper as matters that have been taken into account, which is good to see. That is all pleasing and a reasonable defence of the timing of the production of the White Paper.

Information is power, and statistics are a component of that for policy advisers or makers in the public, private or voluntary sectors and for the public at large. Everyone is interested in possessing accurate statistics and information. I shall use the story of the average earnings index to show what happens when official statistics are compromised and what the possible causes of that might be. The story of the AEI shows two pressing needs for the work of the Office for National Statistics. First, there should be strong management capable of clear strategic thinking. Secondly, the management must have robust independence from political interference.

The right place to start the story of the AEI is in the spring of 1998. The earnings growth figures that spring were worrying. The Monetary Policy Committee had delivered itself of the opinion that average earnings growth in excess of about 4.5 per cent. would be inconsistent with meeting its inflation target without having to raise interest rates. That spring, earnings growth was about that figure. Everyone knew that bonuses influenced the figures but people wanted to know whether they represented one-off payments as a reward for the past 12 months' performance, in which case they could largely be discounted as an inflationary pressure, or whether there was something more to them. In spring 1998, the Bank of England asked the ONS to work on analysing the bonus-related element. In April and May that year, the ONS did some work on that and passed its findings to the MPC.

In June 1998, the MPC announced another rise of a quarter of a percentage point in the base rate, to 7.5 per cent. That decision surprised forecasters and, in explaining its decision, the MPC said that it had had regard to the average earnings growth figures. The figure available was that from the ONS for March 1998, which had risen to 4.9 per cent., clearly above the 4.5 per cent. that the MPC had stated was consistent with not raising interest rates. Given the MPC's thinking, it was no surprise that it raised interest rates at that month's meeting. The MPC was heavily criticised at the time for that rate rise, for giving too much emphasis to only one economic indicator. In defence of the MPC, although the indicator that it was relying on was earnings growth, it did not rely only on the figures from the ONS. It had access to other information such as that produced by the Reward Group, business surveys and its contacts with the business community. Nevertheless, that was its decision.

We returned to the average earnings index in October 1998. The public's attention was attracted to the index by two sets of startling revisions in a short space of time. First, on 6 October, the earnings figures for May, June and July were slightly revised. There were no great shocks there, but on 14 October, the whole back-run of data was heavily revised without any warning to the markets interested in the statistics. It was a stunning event and everyone was shocked. There was huge criticism of the MPC's decision back in June because the revised data suggested that the peak in earnings growth in spring and summer had not been as high as the MPC had been led to believe. One national newspaper carried a banner headline saying that the MPC's blunder had cost the country 10,000 jobs. Little wonder that the Chancellor was described as being incandescent with rage.

The director of the ONS, Tim Holt, announced that he was suspending the index forthwith and that an independent inquiry was to be undertaken by Southampton university into what had gone wrong. He said that it was right to suspend the index from publication because everyone had lost trust in the statistics. At the same time, the Chancellor announced a separate inquiry into what had gone wrong with the index, led by Andrew Turnbull from the Treasury and Mervyn King from the Bank of England. They directed Martin Weale of the National Institute of Economic and Social Research to conduct a second inquiry especially for the Chancellor. Both inquiries produced reports this March, when the average earnings index was rehabilitated and reinstated.

It is interesting now to look back at what the reviews concluded about the state of earnings growth in the spring and summer of 1998. The reviews showed that the original figures, prior to any revisions, were about correct; if anything, they were slightly overstated. It was a case of "as you were". It could be argued that the MPC had been vindicated but, alas, there were no banner headlines to that effect.

During the furore of the reviews, the Treasury also announced a new efficiency review of the ONS. KPMG was to report to a Treasury-appointed steering group with recommendations for making further efficiency savings in the costs of the ONS. That review led to several recommendations, three of which I shall cite as examples. It said that the number of ONS offices could be reduced, that more support services could be outsourced and that there should be changes at the top of the management. The collective result of the recommendations would eventually be annual savings of £20 million. In announcing that it accepted most of the report's recommendations, the Treasury said that the £20 million would be recycled into improving the rest of the ONS's services.


I fear that all this demonstrates the potential for political control of statistics. The Government set the ONS's budget and are its major customer. On this occasion, the Government appointed their own review into what had happened with a particular set of statistics and put in train their own efficiency review. Those are the sort of things that we as politicians should guard against in protecting national statistics. As the Green Paper said, it is a matter of trust.

As the right hon. Member for Fareham said, the ONS has many other customers, not only the Government of the day. As my example showed, the Bank of England is an important customer. The nations and regions of the United Kingdom, local government, business and the public at large also all want accurate statistics.

When the Government responded to the Treasury Select Committee report, it seemed that the Committee and the Government agreed on two things that should be improved in the ONS: the statistical base and capacity, and the prioritisation of services to users of the ONS. That offers some encouragement as to how we go from here.

I want to consider the White Paper. No one has mentioned that our report draws attention to the international comparison of the costs of the ONS with those of similar bodies in other countries. We get our statistics remarkably cheaply. Who am I to say whether those countries are spending too much on the provision of statistics or we too little?

Despite several speeches this evening which showed ways in which doubt could be cast on statistics, on the people producing them or on the Government standing behind them, it has to be said that the evidence received by the Select Committee was that, by and large, the statistics produced by the ONS are among the best in the world in terms of their reliability and the processes by which they are produced. It is important to make that point to redress the balance of the tone of the debate so far. There is no reason why we should not have it as our ambition, White Paper or not, to make the statistics produced in this country world leaders. Indeed, that phrase appears in the foreword to the White Paper.

Some of the vital issues to be discussed as a result of the White Paper have already been mentioned by other hon. Members. In deciding what statistics are national and what are not, I stand by the recommendation of the Select Committee. It ought to be the job of the national statistician, the statistics commission or a combination of the two. It is not the job of Ministers who want to defend their departmental interests. So in that area, the White Paper needs further development.

Mention has also been made of the right of the director of national statistics to go to the top--to the Prime Minister of the day--to complain about any infringement of the integrity of national statistics. As two hon. Members have mentioned already, the Select Committee recommended that in addition to being able to go to the Prime Minister about the integrity of national statistics, the director ought to be able to go to the Prime Minister on the question of resources. Resources are the Achilles heel of the White Paper. There is no explicit mention of the mechanisms by which anyone can decide whether we are spending the right or the wrong amount on producing national statistics. That needs to be corrected.


Four components are essential to our future statistical service. The first is quality management. That is dealt with in the White Paper. The second is political accountability--not interference in management. To some extent, that is dealt with in the White Paper, but not to my satisfaction. I make it clear that that is my own opinion. The third is adequate resources, which I have now mentioned twice. It is important to stress a third time that, when used by some, such as exporters, accurate and timely statistics may have an influence on the wealth of the country. So if we need to spend more money to produce greater wealth, that is a consideration to which we ought to have regard. The fourth component is customer involvement. It is important for the commission to take on the role of ensuring that the views of those who want good statistics are heard right at the top of the tree.

I said at the beginning of my speech that information is power. I hope that what we are about tonight is helping to form a view on how to create a statistical commission and national statistician that produce statistics that contribute to the power and wealth of our country. I suggest that statistics ought to be impartial, reliable, accessible, timely and economic to produce. When right hon. and hon. Members read Hansard tomorrow, they will see that the first letters of those words spell irate. I suggest that Members of this House will be very angry indeed if we do not produce a statistical service of which we can all be proud.

