Select Committee on Science and Technology Second Report


19. For the 2001 RAE, 2,598 submissions (296 fewer than in 1996) were received from 173 higher education institutions, listing the work of 48,022 researchers (50 fewer than in 1996). The RAE 2001 results, announced on 14 December 2001, showed substantial improvements in research ratings. In 1996, 31% of research active staff worked in the 573 departments rated 5 or 5*; in 2001 the figure was 55%, in 1,081 departments. 64% of the research submitted was of national or international levels of excellence (grades 4, 5 or 5*), compared with 43% in 1996.[20]

Table 2: The distribution of staff and departments in RAE grades

[Table not reproduced: for each RAE grade, the number of staff, percentage of staff and number of departments, in 1996 and 2001.]

20. Results in the sciences (as defined by UoAs classified as high cost research; see paragraph 60) were better than the overall figures. 60% of active research scientists now work in 5 or 5* departments (compared with 34% in 1996). A selection of UoAs shows that this improvement is not uniform.

Table 3: Performance of selected science and engineering UoAs

                                               1996                      2001
UoA                                    Staff   5/5*   Depts      Staff   5/5*   Depts
Physics                                1,516    51%     55       1,668    79%     50
Environmental Sciences                   484    23%     38         541    25%     34
Chemistry                              1,369    34%     62       1,300    42%     45
Electrical and Electronic Engineering  1,204    33%     65         863    69%     45
Metallurgy and Materials                 466    43%     38         402    49%     30
Chemical Engineering                     331    40%     22         294    55%     17

(Staff = staff submitted; 5/5* = percentage of staff in departments rated 5 or 5*; Depts = number of departments.)

21. The graph below compares the results in high, medium and low cost subjects in 1996 and 2001. There is a clear upward trend in all areas.

Accuracy of RAE 2001

22. With such a spectacular increase in RAE ratings, it is legitimate to ask whether the improvement is a true reflection of the state of UK academic research and its performance over the last five years. The evidence we have received suggests that most in the science and education communities agree with HEFCE's assertion that it is indeed largely a reflection of reality. The Funding Councils' international benchmarking exercise supports the view that there has been improvement. The ratings of all 5 and 5* departments were validated by 290 overseas experts. All but nine agreed with the judgements of the panels,[21] although the EPSRC argues that "the involvement of international expertise is limited so the thoroughness of the international calibration could be questioned".[22] The improvement in research quality is also supported by citation analysis: the UK's share of the world's 1% most cited papers increased from 11% to 18% over the assessment period (1996-2001) and the average citation rate of UK papers has increased by 12%.[23]

23. There are sceptics. In an article in The Guardian on 15 January 2002, Professor Susan Bassnett, a pro-vice-chancellor at the University of Warwick, wrote "What that 55% represents [the proportion of researchers in 5 or 5* departments] ... is a morass of fiddling, finagling and horse trading. Nobody who works in a university in the UK in 2002 seriously believes that research is improving". Professor Richard Green of the University of Hull told us "the results are starting to lack credibility".[24] The biological science societies state that "some or all of this improved RAE score is undoubtedly due to increased familiarity with RAE exercises and the ability of university departments to play the RAE game".[25] We are aware of a mismatch between the formal institutional view and that expressed by individuals in private.

24. A number of concerns have been expressed. First, there is concern about the non-inclusion of researchers. In 1992, many new universities with little background in research entered the RAE for the first time. HEFCE changed the rules to allow departments to choose which researchers to submit. This was reasonable, given that the ratings system recognises the proportion of quality research in a given department. Departments with a small research capacity would otherwise be penalised, and in any case universities would no doubt split departments to isolate the research element and achieve the same effect. There is greater concern that researchers who are active are omitted for tactical reasons. We are aware of one department that received a 5* in 2001 despite having submitted fewer than 60% of its staff.[26] We recognise that few top-rated departments went to such extremes, but there must be question marks over a mechanism that aims to measure UK research quality yet provides an incentive not to include some of the research conducted.[27] Funding should reflect the actual amount and quality of research across the whole department, not only that of the staff deemed research active. Universities should have no incentive to omit any researchers.

25. Second, there is concern that, by moving researchers between UoAs or by splitting and merging departments, universities can improve ratings without any improvement in quality. HEFCE said universities "are free to divide the staff employed in any one department between several UoAs, or to submit staff employed in more than one department to a single UoA".[28] Universities may, of course, use these techniques to improve their ratings, and no doubt some will do so without any actual improvement in research quality. Very strong departments could be used to protect weaker researchers and maximise the ratings of other departments. To achieve 5* status, a department needs to achieve levels of international excellence in more than half of the research activity submitted, and attainable levels of national excellence in the remainder. There is therefore no additional benefit to a 5* department in achieving international excellence in, say, three quarters or almost all of its research activity. By 'diluting' its best researchers with a few less successful ones, a strong department can increase its income by boosting its active staff numbers while maintaining its rating.
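The incentive described above can be sketched in a few lines of Python. The grading rule is simplified, and the staff numbers and funding rates are hypothetical illustrations, not figures taken from the report:

```python
# Toy model of the 'dilution' incentive. Simplified rule: a 5* rating
# requires international excellence in more than half of the submitted
# research activity; QR income is volume-driven (heads x a grade rate).
# All numbers below are illustrative, not drawn from the report.

RATE_PER_HEAD = {"5*": 100, "5": 80}  # hypothetical funding units

def rating(intl_staff, total_staff):
    """5* if more than half the submitted activity is internationally excellent."""
    return "5*" if intl_staff / total_staff > 0.5 else "5"

def qr_income(total_staff, grade):
    """Volume-driven grant: submitted heads times a grade-dependent rate."""
    return total_staff * RATE_PER_HEAD[grade]

# 20 internationally excellent staff on their own: rated 5*, income 2000 units
print(rating(20, 20), qr_income(20, rating(20, 20)))
# Add 15 weaker staff: 20/35 is still over half, so the rating holds at 5*
# while volume, and hence income, rises to 3500 units
print(rating(20, 35), qr_income(35, rating(20, 35)))
```

The sketch shows why, once the 'more than half' threshold is comfortably cleared, adding staff raises income without risking the grade.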

26. Similarly, HEFCE says "Institutions with a number of poorly­rated departments often seek to increase quality by merging less successful departments with those that were highly­rated, in the expectation that exposure to the new culture and management will raise the overall level of the merged entity".[29] Mergers can have this positive effect. But if a university merges a very strong department, rated 5*, with a good 4-rated department, with some selective staff omissions, the new department could still get 5*. As a result, more staff are working in a 5* department and the university receives a larger QR grant, but not necessarily with any real increase in research quality.

27. Third, there is concern that transfers between institutions can distort the RAE results. Transfers are not in themselves a bad thing, as they may bring dynamism to scientific research. We note that academic movement is lower in the UK than in the United States, for example, and that while the timing of staff movements has been affected, overall movement has not increased as a result of the RAE.[30] Mr Bekhradnia, HEFCE's Director of Policy, suggested that transfers would be neutral since, if one department gained, another lost. This is not quite true. Category A* staff (those who transferred in the year preceding the RAE) need submit only two items of research. Thus if a researcher has produced only two high quality outputs in the assessment period, his or her contribution to the old department's rating would be modest.[31] If that researcher moves to another university, however, the new department need submit only those two outputs. Someone who might be considered a '4-rated' researcher thus becomes a '5*-rated' one by virtue of having moved institution. Again, there would be no increase in research quality, but a possible increase in departmental rating.
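The transfer effect can be illustrated with a short sketch. The grade values and the four-output norm are simplifying assumptions for illustration, not figures from the report:

```python
# Illustrative sketch of the Category A* transfer effect. A researcher's
# outputs are graded 5 (international quality) or 4 (national quality).
# Normally four outputs are submitted; staff who transferred in the year
# before the RAE need submit only two, so only their best two are counted.

def share_international(output_grades, required):
    """Fraction of the researcher's best `required` outputs graded 5."""
    best = sorted(output_grades, reverse=True)[:required]
    return sum(1 for g in best if g == 5) / required

researcher = [5, 5, 4, 4]  # two international-quality papers, two national

print(share_international(researcher, 4))  # 0.5 at the old department
print(share_international(researcher, 2))  # 1.0 after a Category A* move
```

The same four-paper record looks uniformly excellent once only the best two outputs need be shown.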

28. Fourth, there are concerns about the way the panels operated and about their membership. There is a lack of clarity about how panel members and chairmen are chosen, and concern about whether, as academics judging other academics, they are truly objective. We questioned HEFCE about representation from industry. They told us that on science and engineering panels, non-academic membership was between 20 and 26%.[32] We solicited comments from the Chemical Industries Association and the Association of the British Pharmaceutical Industry.[33] The former had identified no problem. The ABPI felt that the volume of work may have discouraged participation from industry and that in future it would suit industry representatives to act as observers on the panels rather than as full members. These concerns are borne out by HEFCE's figures.[34] Of the initial invitations to industry to serve on panels, 29 of 102 (28%) were declined, compared with 52 of 653 (8%) for other groups. The EPSRC believes that there are still too few active industrialists on the panels.[35]

29. There are also concerns about the size of the panels and the number of outputs they had to consider. In its guidance to panel members, HEFCE says "you should not feel that you are required to collect, review or examine all research outputs listed".[36] It is easy to understand why. In Chemistry, for example, 11 panel members had to sift through over 5,000 submitted outputs, many of which must have been outside their area of expertise. The criteria by which sub-panels are set up are not clear. For example, we have been told that plans for a sub-panel in Development Studies were dropped at the eleventh hour.[37] We recognise that panels could call on help in fields where they were weak, but there remains the suspicion that place of publication was given greater weight than the papers' content. It has been suggested to us that the RAE could make better use of the review process employed by the Research Councils at the end of project grants. The burden of work on panel members was considerable, and they seem to have been given inadequate support by HEFCE, for example having to obtain their own copies of submitted papers.[38] We recommend that, in any future RAE, HEFCE provide panel members with more effective administrative support. Ensuring the validity of the ratings is money well spent.

30. On request, HEFCE supplied us with data showing that the panels that received the lowest number of submissions tended to award higher ratings.[39] Such UoAs will not necessarily receive more money, since in England funding is determined by the research volume in each Unit. While HEFCE rightly points out that a correlation is no proof of cause, it does give rise to doubts about the objectivity and reliability of the ratings.

31. With the above reservations, we accept the widespread view that the RAE ratings reflect a genuine improvement in UK higher education research. There are areas of concern, however, and it is in no-one's interest to give a false impression of the state of UK research. HEFCE seemed rather complacent about these concerns. Mr Bekhradnia said "we should celebrate it, not try to denigrate it".[40] Mrs Hodge, unlike HEFCE, was at least prepared to admit that gamesmanship had had an effect. In her evidence she said "Is there an element of the clever old researchers in the HE sector learning how to manage the system that they themselves have put in place? There is probably a slight element but I do not want to overplay that".[41] We think the Minister has got it about right.

20   2001 Research Assessment Exercise: The Outcome. HEFCE 2001 Back

21   Ev 2, para 17 Back

22   Ev 77 Back

23   Ev 2, para 18 Back

24   Ev 59, para 5 Back

25   Ev 108, para 3 Back

26   Ev 64 Back

27   Ev 65, para 5 Back

28   Ev 3, footnote 6 Back

29   Ev 4, para 32 Back

30   Q 6; Ev 5, paras 47-49 Back

31   Ev 59, para 9 Back

32   Ev 17, para 24. We recognise the difficulty in classifying panel members Back

33   Ev 119-121 Back

34   Ev 121, para 3 Back

35   Ev 77 Back

36   Research Assessment Exercise 2001: Panel Members Handbook, HEFCE Back

37   From a meeting at the University of East Anglia attended by the Chairman on 22 March 2002 Back

38   Ev 89, Ev 119 Back

39   Ev 124, para 7 Back

40   Q 60 Back

41   Ev 51, para 164 Back


© Parliamentary copyright 2002
Prepared 24 April 2002