Select Committee on Science and Technology Appendices to the Minutes of Evidence


APPENDIX 39

Memorandum submitted by Professor Gary Craig, Professor of Social Justice, University of Hull

WHAT DOES IT ALL MEAN?

SOCIAL POLICY AND THE 2001 RESEARCH ASSESSMENT EXERCISE

  Before the RAE 2001 results were announced, anxious questions were already being asked about the strength of the discipline, questions which it was felt the exercise might address. From data supplied by HEFCE to the Social Policy Association (SPA) in the autumn, it was already clear that there had been considerable change in the list of those submitting to social policy, reinforcing the picture which had been emerging over the past two years. Nine HEIs which submitted to the 1996 exercise had dropped out of the reckoning; most of these were new universities with a grade of two or less in 1996, and their vice-chancellors had evidently decided that the game wasn't worth the candle. Some of them were institutions where undergraduate social policy teaching had been terminated in the past few years, and some had seemingly submitted social policy staff elsewhere, notably to sociology, which also suggested a downgrading of social policy within the HEIs concerned. A total of about 87 staff, about 13 per cent of all FTE research-active social policy staff submitted in 1996, were "lost" in 2001 as a result of these disappearances (these staff may not have left the discipline but had not been submitted to the RAE). None of this was good news for an area of study which had appeared in recent years to be losing ground and losing its distinct identity. The professional community entertained hopes that the RAE would provide a welcome boost to its image.

  Other early indications were more positive. In 1996 there were 44 submissions to social policy; in 2001 there were 47: apparently not much change there. However, even with the nine HEIs from 1996 no longer present, a substantial number of HEIs which had not submitted to social policy in 1996 did so in 2001, and a total of almost 850 staff were submitted, more than 30 per cent up on 1996. This appeared to suggest a healthy growth in the discipline, contradicting the worrying news of course closures and difficulties in recruiting undergraduates. However, closer inspection also suggested that even these indications were not as positive as they seemed at first sight.

  New submissions in 2001, i.e. those from HEIs which did not submit to the 1996 social policy panel, could be grouped into four clusters as follows:

    (a)  very new to the RAE field: Bolton Institute, Chichester University College (20 staff in all);

    (b)  new to the RAE social policy field: City, Plymouth, Westminster, Paisley: all small staff teams (five to nine), totalling 30 staff;

    (c)  HEIs which appear to have done a straight swap from social work or some other panel in 1996 to social policy in 2001: Birmingham had five staff in social work and eight in social policy in 1996 but submitted 16 social policy and social work staff to the social policy panel in 2001; Bradford (11 to social work in 1996, 11 in 2001); Liverpool (14 to social work in 1996, 18 in 2001); Nottingham (24 to sociology in 1996, 18 in 2001); Nottingham Trent (seven to sociology in 1996, 18 in 2001). These shifts may have reflected changed staff priorities, tactical moves or a combination of both. Overall, these shifts to social policy appeared to bring 73 new staff into the social policy field.

    (d)  Fairly seismic shifts:

(1)  Birmingham put in two separate submissions to the social policy panel, one of which, a new public policy submission, had 51 staff, presumably gathered up from various other places not previously submitted to social policy. This submission sat alongside a separate social policy submission from Birmingham, an apparent case of having your cake and eating it.

(2)  Leeds submitted 22 to sociology in 1996 (grade four), 48 to social policy in 2001.

(3)  Southampton submitted 17 to sociology in 1996 (grade four), 26 to social policy in 2001.

(4)  University of Wales, Swansea submitted 15 to social work in 1996 and 10 in 2001 but none to social policy (or sociology) in 1996, and 15 to social policy in 2001.

        These reconfigurations may also reflect a combination of tactical moves, the outcome of arguments within HEIs (particularly those which faced the difficulty of dealing with staff in joint departments covering two or more separate disciplines), and shifts in staffing profiles, but they are particularly marked here by the numbers of staff involved.

  The overall effect of these "new" submissions was about 260 additional staff recorded as social policy academics in the RAE. The average number of staff per submission was 18 (up from about 15 in 1996). In summary, then, the picture was one of more and bigger submissions: seemingly good news.

  However, in the parallel context of social policy's well-rehearsed difficulties as a discipline, it was hard to avoid asking what proportion of those staff new to the social policy submissions were recognisably social policy academics. In the 1996 RAE, whilst the social policy community were supportive of the attempts of the panel to accept a wide-ranging understanding of research quality, mutterings were heard from colleagues in sociology that they had had a rough ride and it would have been perfectly understandable if some HEIs had rebranded their groupings—as opposed to reorienting their research priorities—in order to submit to social policy. These questions may only be answered in the medium term.

  When the results were announced in mid-December, yet more fundamental questions were posed by social policy's "performance". Of the 47 submissions, 16 (about one third) attained the same grade in 2001 as in 1996; 13 went up one grade, two (North London and Sunderland) went up two, and two (Open and Bangor) dropped a grade. 14 of the 47 had no social policy grade in 1996; in reality, therefore, for half of the 33 HEIs submitting in both 1996 and 2001, the picture was one of standstill. With the exception of Plymouth, none of those new (second submission only) or very new (first submission) to the social policy panel gained a grade higher than 3a, as one might have expected. Amongst the five HEIs which appear to have done a straight swap from social work or some other panel in 1996 to social policy in 2001, four achieved a grade 4 and one a 3a. And of those HEIs newly submitting large numbers of staff to social policy, one gained a 3a, one a grade 4 and two gained a 5; a total of 140 staff, about one in six of the total submitted to social policy, were involved in these last four submissions.

  Of the submissions for 1996 and 2001, the distribution of grades was as follows:

No. achieving grade

Grade        1996         2001
5*           1 (1)        2 (2)
5            4 (3)        8 (6)
4            13 (13)      16 (10)
3a           6 (5)        15 (11)
3b           12 (10)      4 (4)
2            6 (2)        2 (0)
1            2 (0)        -


  This table offers the basis for two types of comparison. The figures in brackets in each column show the spread of grades obtained by those institutions submitting to both the 1996 and 2001 exercises, and probably give the clearest sense of how the core social policy community, on one definition of it, has developed over the five-year period as measured by RAE indicators. These figures do show a very modest upward grade drift, particularly between grades 4 and 5, but there is also a clear bifurcation: half in each exercise were graded 4 and above, half 3a and below. In 2001 there were some notable casualties, the drop of one grade at the Open University and the failure of Brighton to improve on its 1996 grade, both of which caused considerable surprise amongst the community. This picture is in any case muddied considerably by the 2001 "imports": taking all submissions into account suggests a much stronger, if substantially misleading, impression of upward grade drift.

  HEFCE has now announced that it will protect 5* departments, offer less than in 1996 to 5- and 4-rated departments, little to 3a departments and nothing to 3b departments. HEFCE has also announced that it is not to implement any decisions on funding for a year, presumably as a tactic to bid up in negotiations with government over the higher education settlement. This is where the profile of social policy vis-à-vis other cognate disciplines, and indeed all other disciplines, gives particular cause for alarm and raises yet further questions about the 2001 results. What do they mean to the community and, of equal long-term significance, how will they be interpreted by those responsible for funding research?

  Fifty-five per cent of all staff submitted to the RAE across the whole exercise were in 5 or 5* rated units. In social policy, however, only 35 per cent of all staff submitted (including the submissions "new" to social policy described earlier) were in 5 or 5* departments (up from 21 per cent in 1996), compared with 38 per cent for social work (21 per cent in 1996) and, ironically, given the apparent flight of a substantial number of submissions from sociology to social policy, 52 per cent for sociology (24 per cent in 1996). In law, an apparently astonishing 84 per cent of submitted staff are to be found in 5 or 5* departments, but this appears to be because the law panel took a narrow definition of quality: particular kinds of cited research output. Only six other units of assessment, out of the total of 69, registered a smaller proportion of staff in 5 and 5* departments than social policy.

  Does this really reflect a poor performance by the social policy community, and what will be the impact on funding for social policy research through the RAE? Any interpretation of these results can only be speculative at this moment. This contribution is made by someone who, although he read several RAE submissions in draft from HEIs other than his own, has no privileged information and is, along with colleagues elsewhere, searching for a positive interpretation of results which appear to fly in the face of his own experience of the research profile of social policy. The SPA hopes that other members might like to add to the debate in succeeding issues of SPAN, to give shape to a policy campaign for the Executive.

  Explanations offered to the present author to date include the following; in the light of detailed engagement with the panel members, with members of other panels and with HEFCE's own views, more may present themselves as plausible explanations as time goes by.

    (i)  the quality of social policy research really is poor, in the sense that it has fallen well behind virtually every other discipline as measured by improvements in RAE ratings. It is difficult to imagine many within the social policy community adhering to this view, but some self-reflection is clearly appropriate;

    (ii)  other panels have adopted a much more strongly protective approach to their disciplines, adopting ways of working, and judgments of quality, which led to substantial hikes in RAE ratings across the piece. Without knowing how all other panels worked, it is difficult to say how this might have happened, but clearly the social policy panel will want to be as open as possible with its own constituency about its ways of working: how these might have differed, if at all, from those of the 1996 social policy panel, for example in adopting the approach of being a reading panel (as opposed to the "reading-across" approach of some panels, which took publication in particular outlets as a de facto indicator of quality), and any insights the panel may have on ways in which other cognate panels may have taken differing approaches;

    (iii)  the problem of the common definition of 5 and 5* quality, which involves a requirement of high international standing. As my report on the 1996 exercise (Craig 1996[10]) argued, one of social policy's great strengths is its local policy relevance, and there can be little doubt that this has grown in a period when a huge range of government initiatives have needed research and evaluation and when government has pressed the urgency of its social policy agenda on the community. Quite apart from this contemporary agenda, much social policy research is necessarily national because of its empirical, policy and theoretical base. This was used in my report as an argument against the further concentration of resources into relatively few HEIs; but in a context where there have apparently been remarkable improvements in other disciplines judged by this international standard, members of the social policy panel may have felt structurally constrained to judge research quality by inappropriate criteria. What this would lead to, de facto, is the penalising of HEIs with a relatively "weak" international/comparative research record, something which the use of high-profile international referees would simply accentuate.

  On the face of it, and without further information, the 2001 RAE results may appear to have damaged the profile of social policy, and a sustained campaign by the Social Policy Association, acting in concert, one hopes, with members of the RAE panel, will be needed to place an appropriate interpretation on the results and defend our corner in the forthcoming discussions on research funding. The alternative may be that the overall pot of money for social policy will be inadequate, even given the generally parsimonious constraints within which HEFCE has to work, and that its distribution within the social policy community will be even more inequitable than hitherto. Few in the social policy community would claim that a piece of high-profile comparative research is necessarily of better quality than a report for the Department of Health or a local study funded by a charitable body or a consortium of local authorities. Yet that continues to be the implication of the general measures used by HEFCE, and social policy may well suffer as a result. The trend, re-emphasised by the 2001 results, towards two tiers of social policy departments, in which only the top tier is seen as undertaking research worthy of HEFCE funding, may suit some of those in the top tier but would do nothing for the long-term sustainability of the discipline as a whole. Social policy has always argued that it needs both high-quality and locally relevant research; but it also needs a critical mass, in terms of its presence in a wide range of HEIs, if it is to be sustainable in terms of attracting resources. The alternative, as appears to be happening already to some degree, is that social policy will become submerged within other departments or disciplines.

  There has been a growing feeling for some time that many aspects of the RAE are indefensible: not least that it allows substantial game-playing (for example over which panel to submit to, how many staff to enter, and so on); that it allows for no appeal save on procedural grounds, leaving substantially aggrieved departments to live with the consequences of unchallengeable judgments for five years; that it does not encourage risk-taking and developmental work, or indeed the development of new disciplinary areas; that it reinforces the position of the haves against the have-nots; and that it is inappropriate to use a single set of criteria by which to judge the quality of research in more than sixty different disciplinary areas with different histories, traditions and cultures. The 2001 outcomes appear to have done little to address these fears, and the Social Policy Association now has a substantial job of work to do to defend the interests of all of its members.

  The 2001 RAE raises questions about the measures used within the exercise, and most of all about the usefulness of a single set of performance indicators applied uniformly across all disciplines, questions which it is hoped the Select Committee may address in its discussions.

24 January 2002




10  Craig, G. (1996), Quality First? The assessment of quality in social policy research, London: Social Policy Association.


 
