DIUS's Departmental Report 2008 - Innovation, Universities, Science and Skills Committee


2  Style and content of the Departmental Report

Introduction

Guidance to departments on the preparation of departmental annual reports in 2008 was issued in December 2007 by the Treasury. It stated:

Departmental reports are the main vehicle for departments to explain to Parliament and the public how they are organised, what they are spending their money on, what they are trying to achieve, and how they are performing.[8]

The reports should continue to be both forward and backward looking, setting out plans and information about performance, and drafted to present a clear picture of the department's aims, activities, functions, and performance, linking performance delivered over the past year with resources consumed.[9]

Style of the Departmental Report

During the evidence session with officials in DIUS we selected at random and read the following extract from the Departmental Report to Mr Watmore:

An overarching national improvement strategy will drive up quality and performance underpinned by specific plans for strategically significant areas of activity, such as workforce and technology. The capital investment strategy will continue to renew and modernise further education establishments to create state of the art facilities.[10]

Mr Watmore was unable to explain the meaning of the passage. He conceded that "documents written by people in senior positions can often be very inaccessible to the public" and he undertook that for next year DIUS would "get the plain English people in earlier".[11]

The inaccessibility of the prose was not the only problem we encountered. It was compounded by jargon-riddled phrases, assumptions unsupported by clear evidence but apparently designed to lend the Report a positive tone, and euphemisms deflecting likely failure. Here are some examples we noted:

a)  "creating a vision of success";[12] This is the first heading in the first substantive chapter of the Departmental Report.

b)  "For our customers, there are three particularly important ways in which we can demonstrate the success of the [Departmental] blueprint. These are first to develop true insight into their needs and requirements…";[13] The phrase "true insight" prompts the question: what sort of insight did the Government and the former DfES have of these "customers" previously? (We consider further the use of the term "customer" at paragraphs 9 and 10 below.)

c)  "challenging growth trajectory to 2010";[14] This phrase is used in a passage reporting on the prospect of achieving the target of "Reducing the number of adults in the workforce who lack Level 2 qualifications". We construe it to mean that the likelihood of achieving the target is slim.

d)  "Our reputation for innovative policy-making approaches, fresh policy insights, bold points of view".[15] The effect of this phrase is to add weight to assumptions about DIUS's innovative approach, which, as far as we can see, has yet to be established.

TERMINOLOGY

Many terms used in the Departmental Report come from business and the private sector. We raised two examples with Mr Watmore and DIUS officials: "customers"; and the DIUS "brand". The term "our customers" is used 19 times in the Departmental Report and DIUS appears to want those who use its services to see themselves as its customers. The Report explained that to:

ensure that we have a shared understanding of our customers' needs, we are producing Customer Intelligence Packs for each of our major customer groups. This will provide the common foundation for developing the specific insights we will need for each area of policy, service or communications. Our policy and communications teams will build these specific insights in the planning stages of their work, using methods like customer experience or journey mapping.[16]

We think this passage means: in drawing up policy DIUS takes account of the views of those who will be affected. This interpretation was borne out by the reply DIUS gave when we asked about the packs. Ms Etheridge, Director, Strategy and Communications at DIUS, explained that "to take quite a simple example […] if you are designing a policy where you are trying to get support for training to employers then you can design a much better policy if you know what employers are looking for and what they want".[17] Such an approach is common sense. We had doubts, however, about the use of the term "customers". A customer in common usage is someone who, having weighed up the choices on offer, decides to spend money on goods or services. We questioned whether the concept fitted well with DIUS's main activities, many of whose "customers" receive money from DIUS and cannot take their business elsewhere. Mr Watmore explained:

One of the criticisms […] of Whitehall departments is they are not quite sure who their customers are, what are they doing it for. In our case, is it for the minister, is it for a university, is it for the college sector? We took the view that the customer base was individual learners, adult learners and businesses, businesses as employers and businesses as innovators in the economy.[18]

When we pressed Mr Watmore on the term "customer" he accepted that it might not be correct[19] and considered that "maybe 'client' for us is not a bad word".[20]

In his introduction to the Departmental Report Mr Watmore cites as evidence of progress "developing an increasingly strong DIUS 'brand'".[21] During the evidence session Mr Watmore explained:

One example of being new that is difficult is on day one all of your staff are coming from different parts of Whitehall and they have all got different backgrounds and different understandings. On the other hand, the advantage of that is you can move them all to something new and different if that is what you want. In our particular case we laid out our vision for the Department on one piece of paper, our so-called blueprint, and have tried to get all of those pieces of the blueprint now in action. Just to pick two or three examples of what we mean by the brand. […W]e have tried to make our Department very much focused around our tagline […] of investing in our future. We thought that phrase resonated with everything that we did in the Department: investment, a cost in for reward out, not necessarily in the same timeframe because quite often what we are doing is putting cost into a system to get the benefits some years later, and often it can be decades later, so it is a true investment cycle. Secondly, it is in "Our Future". "Our Future" is a phrase that we are using in our advertising campaign for things like Train to Gain and the Skills Agenda.[22]

The Oxford English Dictionary defines brand-image as "the impression of a product in the minds of potential users or consumers" and brand awareness as the extent of "consumer familiarity with the name, image, or distinctive qualities of a particular brand of goods or services".[23] A brand is therefore a symbolic embodiment of the information connected to an organisation, creating associations and expectations about its products and services. We have no doubt that it is sensible for the senior management of DIUS to ensure that those working for the new department work together in pursuit of a common set of aims and objectives. In addition, we accept that the attributes of a brand may assist the Learning and Skills Council or Train to Gain, which have a more narrowly defined role or function than DIUS. But we were not convinced that what is an internal process to DIUS needs to be projected as the DIUS brand outside the department.

CONCLUSIONS

We acknowledge that some parts of the Departmental Report are informative and we quote them later in this Report.[24] Other parts are nearly impossible to read and understand. The Departmental Report is especially frustrating when addressing the broader objectives of DIUS and setting out how the Department is going to meet them and measure progress. It may be that this is a product of uncertainty as DIUS feels its way towards the very ambitious objectives that the Prime Minister has set. It may also be that the imprecise language fills a vacuum because, as Mr Watmore noted, the effects of decisions and investment made now may not be realised for years, even decades.

We conclude that the DIUS Departmental Report is by most standards a poor read. It is written in an impenetrable style and is peppered with jargon, unsupported assumptions and claims designed to promote DIUS. It does not meet the terms of the Treasury's guidance to present a clear picture of the department. We recommend that DIUS's 2009 departmental report be written in plain English, be shorter than the 2008 Report and use terminology appropriate to its functions.

Use of statistics

We asked a number of questions—both oral and written—about the statistics in the Departmental Report. We are grateful for, and on the whole satisfied with, the written replies provided by DIUS. The same could not be said of the presentation of the statistics themselves in the main body of the Departmental Report. The Treasury Guidance on departmental annual reports advises that when reviewing progress against targets "clear, transparent, and comprehensive reporting is essential".[25] Our concerns can be illustrated with an example. The Departmental Report stated that

Progress is being made towards achieving fairer access to higher education: between 2002/03 and 2005/06, the gap in participation among young people from higher and lower socio-economic classes closed by 3.5 percentage points. Table 1 [reproduced below] shows the proportion of young UK-domiciled entrants from state schools and disadvantaged groups to full-time, first degree courses at universities in England.[26]

Table 1

Young UK-domiciled entrants from state schools and disadvantaged groups to full-time, first degree courses at universities in England (1997-98 to 2005-06)
Year      State schools   Lower social classes   Lower socio-economic   Low participation
                          (IIIM, IV, V)*         classes (4-7)*         neighbourhoods
1997-98        81.0              24.7                   N/A                  11.4
1998-99        84.4              24.9                   N/A                  11.6
1999-00        84.1              25.1                   N/A                  11.7
2000-01        85.0              25.3                   N/A                  11.8
2001-02        85.2              25.5                   N/A                  12.4
2002-03        86.4              N/A                    27.9                 12.5
2003-04        86.1              N/A                    28.2                 13.3
2004-05        85.9              N/A                    27.9                 13.1
2005-06        86.9              N/A                    29.1                 13.5
Source: Performance indicators in higher education (published by the Higher Education Statistics Agency)

*The national statistics socio-economic classification was introduced in 2002-03 to replace the social class groupings. The two classifications are not directly comparable.


We set out the evidence on the 3.5 percentage point claim at length both because achieving fairer access to higher education is a key objective of the Government and because it highlights the problems we encountered in DIUS's use of statistics. Table 1, which appears on page 69 of the Departmental Report, neither mentions higher socio-economic classes nor identifies the closure of a 3.5 percentage point gap. (Nor does the more detailed data at Annex 1 of the Departmental Report, which reports performance against PSA targets.)[27] Subsequently, in a written memorandum,[28] DIUS explained that the claim was based on the Full-time Young Participation by Socio-Economic Class (FYPSEC) measure, which provided three figures for each year:

(a)  The proportion of English-domiciled 18, 19 and 20 year olds from the top three socio-economic classes, who participate for the first time in full-time higher education courses at UK higher education institutions and English, Scottish and Welsh further education colleges.

(b)  The proportion of English-domiciled 18, 19 and 20 year olds from the bottom four socio-economic classes, who participate for the first time in full-time higher education courses at UK higher education institutions and English, Scottish and Welsh further education colleges.

(c)  The gap between these two participation rates.

At the time of publication of the Departmental Report, FYPSEC figures were available for 2002/03 to 2005/06 as follows:

Table 2
                                            2002    2003    2004    2005
Participation rate for NS-SECs 1, 2, 3      44.6%   41.5%   41.5%   43.3%
Participation rate for NS-SECs 4, 5, 6, 7   17.6%   17.9%   17.7%   19.9%
Difference                                  27.0%   23.6%   23.8%   23.4%
(Total drop in gap: 3.5 percentage points)[29]
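The rounding point in footnote 29 can be checked by redoing the arithmetic on the figures printed in Table 2. The sketch below is illustrative only: the dictionaries and variable names are ours, and the inputs are the rounded values from the table, not DIUS's underlying data.

```python
# Recompute the FYPSEC gap from the rounded figures printed in Table 2.
rate_high = {2002: 44.6, 2003: 41.5, 2004: 41.5, 2005: 43.3}  # NS-SECs 1-3
rate_low = {2002: 17.6, 2003: 17.9, 2004: 17.7, 2005: 19.9}   # NS-SECs 4-7

# Gap between the two participation rates, year by year.
gap = {year: round(rate_high[year] - rate_low[year], 1) for year in rate_high}

# Total drop in the gap between 2002 and 2005.
drop = round(gap[2002] - gap[2005], 1)

# The rounded figures give a drop of 3.6 points, whereas DIUS's unrounded
# data give 3.5: differences should be taken before rounding, not after.
print(gap)   # {2002: 27.0, 2003: 23.6, 2004: 23.8, 2005: 23.4}
print(drop)  # 3.6
```

This is why footnote 29 infers that the 3.6 obtained from the published figures, against the 3.5 claimed, is a rounding artefact rather than an inconsistency.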

In its memorandum in November 2008 DIUS was able to update the figures:

Since the Departmental Report was published, FYPSEC has been revised according to changes in underlying datasets (including revisions to the population estimates and the Labour Force Survey by the ONS)[30] and updated to 2006/07 as follows:

Table 3
                                            2002    2003    2004    2005    2006
Participation rate for NS-SECs 1, 2, 3      44.1%   40.9%   41.2%   42.8%   39.5%
Participation rate for NS-SECs 4, 5, 6, 7   17.5%   17.8%   17.4%   19.8%   19.0%
Difference                                  26.5%   23.1%   23.7%   22.9%   20.5%
(Total drop in gap: 6.1 percentage points)[31]

Turning back to Table 1 in the Departmental Report, DIUS explained that the performance indicators in the Table were "not intended as evidence for the 3.5 percentage points claim, but as supplementary evidence of progress in widening participation".[32] It explained that the Table "showed the proportion of young, UK domiciled entrants from state schools and disadvantaged groups to full-time first degree courses at universities in England" and that the key difference between FYPSEC and the Performance Indicators was as follows:

(d)  FYPSEC provides the proportion of the English young upper/lower socio-economic class populations who participate in higher education: i.e. a population basis. This allows the measure to account for changes to the socio-economic breakdown of the underlying population of England.

(e)  The Performance Indicators show the proportion of UK-domiciled young full-time first degree entrants who are from lower socio-economic classes, i.e. a student basis. This takes no account of the socio-economic breakdown of the underlying population, nor any year-on-year changes to this.[33]
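The difference between the population basis at (d) and the student basis at (e) can be made concrete with a toy calculation. All numbers below are invented for illustration; none come from FYPSEC or the Performance Indicators.

```python
# Invented cohort: the young population in each class group, and the
# number from each group entering higher education.
pop_high, pop_low = 300_000, 400_000            # upper / lower class groups
entrants_high, entrants_low = 120_000, 70_000   # HE entrants from each group

# (d) Population basis (FYPSEC-style): the share of each group's
# population that enters higher education, and the gap between the rates.
participation_high = entrants_high / pop_high   # 0.40
participation_low = entrants_low / pop_low      # 0.175
gap = participation_high - participation_low    # 0.225, i.e. 22.5 points

# (e) Student basis (Performance Indicator-style): the share of all
# entrants who come from the lower group.
student_share_low = entrants_low / (entrants_high + entrants_low)
```

If the lower group's population grows while both participation rates stay fixed, its entrant numbers, and hence the student-basis share, rise even though the participation gap is unchanged; this is the sense in which FYPSEC "account[s] for changes to the socio-economic breakdown of the underlying population" and the Performance Indicators do not.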

When we put the question to DIUS at the oral evidence session, we were mindful that Professor Sir Martin Harris, Director of Fair Access to Higher Education, had said in evidence to us in June 2008 that "broadly speaking, my understanding is that, after a dip in the year when fees were being talked about, […] the numbers of applicants and students admitted has edged up slightly each year and that within that the proportion in the lowest social groups has stayed stable."[34] Figures from UCAS[35] appear to bear this out. We therefore put these figures—set out in Table 4 below—to DIUS.

Table 4

Socio-economic classification of candidates accepted to higher education degree courses

% of accepted applicants from socio-economic group

Groups: 1 Higher managerial and professional; 2 Lower managerial and professional;
3 Intermediate; 4 Small employers and own account workers; 5 Lower supervisory
and technical; 6 Semi-routine; 7 Routine

Year      1        2        3        4       5       6        7
2002      23.3%    31.2%    15.6%    7.3%    4.6%    12.5%    5.6%
2003      22.7%    31.3%    15.2%    7.4%    5.0%    12.9%    5.5%
2004      22.5%    31.6%    15.2%    7.3%    4.8%    13.0%    5.5%
2005      21.7%    31.4%    15.2%    7.4%    4.8%    13.9%    5.6%
2006      22.4%    31.2%    14.5%    7.7%    4.8%    13.5%    5.9%
2007      22.9%    31.1%    14.3%    7.6%    4.7%    13.6%    5.8%
Note: Proportion based on home accepted applicants with a known classification

Source: UCAS annual datasets

DIUS responded that the

UCAS figures provide a different perspective […] looking at the socio-economic breakdown of the higher education applicant group. But […] this does not take account of changing size of socio-economic groups.

No single background type (e.g. socio-economic class, home area, school type, income) can comprehensively convey that somebody is from a deprived background. Differences in coverage of higher education data sources have led to the production of a number of different measures and indicators of progress with widening participation, each focusing on a particular background type (e.g. socio-economic class) and a particular group of students (e.g. young full-time). Therefore using a basket of measures/indicators gives us more confidence in the overall story being told.[36]

CONCLUSIONS ON THE PRESENTATION AND USE OF STATISTICS

If we start with the text on page 69 of the Departmental Report, the informed reader would expect the claimed 3.5 percentage point improvement in participation in higher education to be justified in Table 1, which immediately follows the claim. Instead, the reader is left baffled. In our view it is neither tenable nor helpful to claim, as DIUS has done, that Table 1 was "not intended as evidence" for the 3.5 percentage point claim but as "supplementary evidence of progress in widening participation". We consider such an approach to be unacceptable. We recommend that where a statistic is cited in the Departmental Report, the evidence to support the statistic be set out in full, if necessary in a footnote. We also recommend that in the material supporting the statistic DIUS provide information on the quality of data used, the source and the baseline, and also provide a commentary on past performance.

The exchange raises a broader question about how statistics are used to measure the effectiveness of government policy: in this instance, how the participation of socio-economic groups in higher education is measured. It may well be that the approach adopted by DIUS is the most helpful. But other approaches are possible, such as the use of UCAS figures, and a key authority, the Director of Fair Access, told us that there had been no change. Unless departments specify, before seeing the data, the metrics they will use, the danger is that they will simply choose the dataset that fits their conclusions. In our view the way to resolve such issues is to introduce independent review of the figures cited in the Departmental Report. We therefore recommend that future departmental reports be reviewed before publication either by the UK Statistics Authority or by an independent person such as an academic statistician, that the reviewer's opinion on the statistics be included in the report, and that the appropriate metrics be specified in advance.

DIUS SPENDING ON SERVICES BY COUNTRY AND REGION

Finally, we noticed that Table 7 of the Departmental Report showed that "total identifiable Departmental spending on services by country and region"[37] fell in every English region, except London, between 2006-07 and 2007-08 and we asked DIUS for the reasons. When it replied DIUS explained that in

looking to respond to the Committee's question, we identified inconsistencies in the allocation of spend in compiling the three Country and Regional Analysis (CRA) tables in the Departmental Annual Report. These affect all three CRA tables. The key issue related to the allocation of spend by the Learning and Skills Council from 2007-08. Addressing this means that DIUS spend in all regions increased between 2006-07 and 2007-08, and the movement of London spend is now in line [with] that of the country as a whole.[38]

DIUS supplied revised tables, which we have reproduced:[39] the total figures for England for 2008-09 have been corrected from £14 billion to £16 billion. We commend DIUS for owning up to the errors in the three tables in the Departmental Report setting out country and regional data and for supplying corrected tables. But we must put on record our concern that significant errors in those tables were not noticed before publication.

CAPABILITY REVIEW OF DIUS

As a postscript to this issue, we note that the Capability Review of DIUS, published on 11 December 2008, found that, whilst "analytical capability is strong in some areas [in DIUS], it is relatively undeveloped in others, and staff and stakeholders within Whitehall query whether there is a consistent method for ensuring that policy is evidence-based."[40] We recommend that DIUS, as a matter of urgency, put in place a consistent method for ensuring that the policy it develops is soundly based on evidence.


8   HM Treasury, Public Expenditure System: Guidance for the Spring 2008 Departmental Reports, PES (2007) 21, 21 December 2007, para 5 [paper deposited in the House of Commons Library] Back

9   HM Treasury, Public Expenditure System: Guidance for the Spring 2008 Departmental Reports, PES (2007) 21, 21 December 2007, para 1 [paper deposited in the House of Commons Library] Back

10   Q 27; the extract comes from the Departmental Report, p 45. Back

11   Q 27 Back

12   Departmental Report, p 13 Back

13   Departmental Report, p 13 Back

14   Departmental Report, p 91 Back

15   Departmental Report, p 14 Back

16   Departmental Report, p 18 Back

17   Q 20 [Ms Etheridge] Back

18   Q 20 [Mr Watmore] Back

19   Q 22 Back

20   Q 23 Back

21   Departmental Report, p 5 Back

22   Q 8 Back

23   Oxford English Dictionary, September 2008 Back

24   For example, see below, paragraphs 93 and 98. Back

25   HM Treasury, Public Expenditure System: Guidance for the Spring 2008 Departmental Reports, PES (2007) 21, 21 December 2007, para 8 [paper deposited in the House of Commons Library] Back

26   Departmental Report, p 69 Back

27   Departmental Report, pp 91-93; see also below, chapter 7, which considers wider participation. Back

28   Ev 79 Back

29   The difference recorded between 2002 and 2005 is 3.6%; although DIUS did not supply a footnote as it did for Table 3, it is assumed that due to rounding the correct figure is 3.5%. Back

30   Office for National Statistics, the executive office of the UK Statistics Authority Back

31   DIUS pointed out that "the figures suggest a narrowing of the gap of 6.0 percentage points rather than 6.1 percentage points. This is due to rounding and the correct figure is 6.1 percentage points." (Ev 80) Back

32   Ev 80 Back

33   Ev 80 Back

34   Oral evidence taken on 2 June 2008, HC (2007-08) 598-i, Q 34 Back

35   Universities and Colleges Admissions Service Back

36   Ev 80 Back

37   Departmental Report, p 106 Back

38   Ev 81 Back

39   Ev 82-86 Back

40   Cabinet Office, Civil Service Capability Reviews Department for Innovation, Universities and Skills: Baseline Assessment, December 2008, p 10; see also below, para 45 and following. Back


 

© Parliamentary copyright 2009
Prepared 20 January 2009