ADMINISTRATION
Cost
2.45 The administrative
costs of the Framework Programmes are high, with an average of
8 per cent[19]
of programme budgets set aside for administration. The OST said
that "though the Commission themselves claim that their costs
compare well with those of similar organisations in Member States,
comparable costs for UK Research Councils are well under 5 per
cent and are subject even then to continual downward pressure"
(pp 8-9). Far from guaranteeing effective management, these
high costs have failed to prevent a constant stream of criticisms
about the administration of the programmes.
2.46 The European Court
of Auditors reported in November 1996[20]
that eight research projects revealed, on random audit, irregularities
amounting to 28 per cent of their total cost to the EU (1.7 MECU
out of 6.3 MECU). The Commission has responded by trebling
the internal financial audit staff of DG XII from 4 to 12.
The Court of Auditors has now begun a major "systems audit"
of the whole of EU research activity.
Failure rate
2.47 Most witnesses
drew attention to the waste of time and effort expended by unsuccessful
applicants for Framework funding (QQ 22, 174, 178, 204-206,
212, 215, 245, 309, 313). Inventing Tomorrow (p 10)
said that there was a "continual increase in the number of
proposals not accepted: on average only 1 in 6 has received
funding"; in written evidence to us, the Commission gave
the figure of 1 in 5 (p 126); according to the
1995 Monitoring Panel, "most of the programmes had success
rates in the range 20-30 per cent" (i.e. between 1 in 5 and 1 in 3).
Inventing Tomorrow concluded that "these preliminary
figures clearly indicate a need for better targeting of calls
for proposals and for more concentrated efforts as a way of reducing
the dispersal of resources and the administrative burden. A detailed
evaluation of projects will accompany the formal proposal for
FP5". For comparison, the success rate of new project grant
applications to the United Kingdom Medical Research Council for
1996-97 was just under 1 in 5; the BBSRC success rate
is similar (Q 311).
2.48 The CBI was one
of a number of witnesses to comment that the success rate was
"very much worse" than 1 in 6 in certain areas
(Q 174). According to the Royal Academy of Engineering (RAEng),
in the Developing Countries Programme, the success rate was as
low as 1 in 12 (p 88). The ESRC said that 548
proposals had been submitted for the Targeted Socio-Economic Research
Programme, out of which 38 projects had been awarded, a success
rate of only 1 in 14, which they attributed in part to
the insufficient targeting of the call for funding applications
(Q 245). Another particularly oversubscribed area is life
sciences, where the success rate is 1 in 15 (Q 344).
In 1995 the TMR programme saw considerable oversubscription;
the acceptance rates were below 1 in 10 for PhD grants
and under 1 in 15 for funding of research networks.
Professor McCleverty (p 217) blamed this on the high level
of awards, about 1.3 MECU per network.
2.49 The BBSRC spoke
of the discouragement encountered by the large numbers of unsuccessful
applicants (Q 313). Professor Routti of the Commission thought
that response times for processing applications were important
in relation to the impact of failure: "in this field sometimes
we say that there is nothing worse than 'slow - no'"
(Q 343).
2.50 Professor Tom
Husband of Salford University put a more positive light on the
Framework Programme failure rate (Q 366). "Actually,
1 in 6 is not too bad compared to some of the research
council programmes in Great Britain. Also in most cases the costs
which are associated with an individual university making a bid
to Europe tend to be marginal costs for a very good reason in
that very commonly the bid that is being made by an academic plus
his industrial collaborators is lifted in part from other bids
for other funding sources, from industry perhaps in the past or
a recycled bid from a previous project or from a rejected research
council bid or whatever. It is very unusual actually to start
ab initio and say, 'I shall write a bid for a particular
project'." Professor Amman of the ESRC made the same
point (Q 255).
2.51 The 1995 Monitoring
Panel found that, as a consequence of oversubscription, the Commission
often puts successful bidders under pressure either to reduce
their budget or to merge with other groups. They deprecated both
approaches, recommending instead the reallocation of budget from
under-subscribed to oversubscribed programmes.
Selection process
2.52 DG XII submits
all Framework project proposals to peer review, i.e. assessment
by independent experts in the relevant field, but not in the same
way as (for example) the United Kingdom Research Councils. In
the United Kingdom, a Research Council committee receives proposals,
sends them out to appropriate experts for comment on quality,
receives their observations and decides against its own priorities
which alpha-rated proposals will receive funding. The Commission
described their procedure for us (p 129): proposals are submitted
to an evaluation panel of independent experts, who award each
a numerical mark. The Commission then selects proposals for funding,
taking into account not only the mark, but also the objectives
of the programme and the size of the budget. The Commission's
selection is submitted for approval to a programme management
committee of representatives of the Member States.
2.53 The University
of Edinburgh approved of the Commission's reliance on external
evaluators to assess proposals, but thought that "the timescales
they are given to do their evaluations are very short and can
lead to insufficient review of certain proposals. This is ultimately
a waste of time and resources" (p 221). Professor McCleverty,
who has served as an evaluator for FP3 and FP4 in the TMR field,
is content with the process and the timescale, but considers that
the large number of applications and the absence of focus reduce
selection to the level of a "lottery draw" (p 218).
Sir Dai Rees considers the system a highly inefficient use of
money (Q 434); and Sir William Stewart told us, "I
see researchers winning grants in the Framework Programme who
would not stand a chance in the peer review system in the UK"
(Q 405). Sir Dai suggested that the Commission might do
here what it does in other areas, and delegate the peer review
function to national bodies with the necessary expertise; but
he acknowledged that this would run counter to the prevailing
culture of central management of research (Q 446).
2.54 The OST believed
that all Member States shared the United Kingdom's concern about
the "excessive length" of the initial selection process
and the lack of transparency in a number of aspects (Q 25,
p 9). The Commission had proposed setting itself targets
"that will be excellent (if they meet them) for increasing
the speed of flow-through of project applications which is a source
of great frustration to applicants at the moment". A number
of improvements were in train to make the peer review process
more transparent. The Government were seeking better feedback
to both the applicants and the management committees, and improvements
to the contract negotiation process following the initial
selection, "both to make it quicker and to try to avoid some
of the ... unnecessary chopping and changing of projects at that
stage, which again is a source of aggravation and inefficiency"
(Q 25). One recent innovation which was now available to
applicants for Framework funding was "a kind of self-help
manual in which you can check your own proposal against the evaluation
criteria" (Q 30). We learn from the report of the 1995
Monitoring Panel that some programmes are experimenting with electronic
"on-line" application systems and pre-screening of applications;
the Panel were not convinced that either would improve the situation.
The OST would welcome, as a step towards greater transparency,
identification of the members of peer review panels (Q 31).
The Royal Society went further and called for independent observers
to be present at panel meetings, saying that "Secrecy does
not engender confidence within the scientific community and is
not required to protect panel members from lobbyists" (p 213).
2.55 The 1995 Monitoring
Panel's report echoes the general dissatisfaction with the selection
process. With nearly 5,000 experts involved, the process is very
complicated to organise. Evaluators are recruited at short notice,
sometimes as little as four weeks. "Industry is generally
poorly represented in the evaluation panels, even in the industry-oriented
programmes." The evaluators' reports are of variable quality
and consistency. The final selection by the Commission is "the
least transparent part of the application process".
2.56 Professor Routti
agreed that the Commission needed to process proposals more quickly.
He thought that the minimum length of time needed to satisfy
rules of impartiality and transparency was three to four months
(Q 335).
Issuing of contracts
2.57 The long time
between a funding decision being taken and the issuing of a contract
was a cause of particular complaint (QQ 309, 449). The
RAEng identified some specific problems.
First, although the process of receiving approval for Framework
funding was a lengthy one ("anything from nine to 12 months
or even longer", Q 214), the Commission often gave successful
applicants only very short notice of the start dates for projects.
"The rule of the game is that the contract will start on
the first day of the month after signing the contract. Contracts
are only signed by very senior people in Brussels and if they
happen to be out of the country and turn up on the 30th of the
month they sign it then and you are expected to start a day after
or two days after." This caused problems for universities
in particular, as it was usually necessary to recruit research
assistants to work on the projects "and you cannot do that
at a moment's notice". Second, it was "extremely difficult
to get an extension to a contract to compensate maybe for a late
start consequent upon you wishing to recruit staff" (Q 213).
The RAEng called for a notice period of about three months between
definite approval of a project and its start date, with "some
degree of flexibility" (Q 214). The Commission told
us (p 127) that this was already the case: "Many contracts
are subject to specific agreed start dates, up to a year beyond
the time of contract negotiation".