Education Committee

Further written evidence submitted by OCR Examinations Board (Annex A)

APPENDIX A

BREAKDOWN OF EXAMINER CHARACTERISTICS

Date appointed     % Currently   % Never   % Retired/       % Supply         % Lecturers
                   Teaching      Taught    Left Teaching    Teacher/Tutor

October 2010       57            2.5       26               5.5              9

May 2011           71            5         8                7                9

September 2011     55            2.5       29               4                9

APPENDIX B

EXCERPT FROM OFQUAL CODE OF PRACTICE

Qualitative Evidence

(i) copies of question papers/tasks and final mark schemes;

(ii) principals’ reports on how the assessment functioned;

(iii) samples of current candidates’ work distributed evenly across key boundary ranges for each component, with enough representing each mark to provide a sound basis for judgement so far as the size of entry and nature of work permit. The material should be selected from a range of centres and/or consortia where work has been marked/moderated by examiners/moderators whose work is known to be reliable;

(iv) archive scripts and examples of internally assessed work (including, in appropriate subject areas, photographic or videotaped evidence) exemplifying grade boundaries for previous awards, together with the relevant question papers and mark schemes; and

(v) in the case of a new specification, pertinent material deemed to be of equivalent standard from other examinations in the subject or other relevant subjects may be considered.

Where Available

(vi) any published performance descriptions, grade descriptions and exemplar material.

(vii) any other supporting material (such as marking guides for components where the evidence is of an ephemeral nature).

Quantitative Evidence

(viii) subject-level expectations, when available;

(ix) information on candidates’ performance in at least two previous equivalent series, where available;

(x) details of changes in entry patterns, choices of options and prior attainment, where available;

(xi) information about the relationship between component/unit level data and whole-subject performance, where available;

(xii) technical information, including mark distributions relating to the question papers/tasks and individual questions for the current and previous series, where available;

(xiii) item-level statistics; and

(xiv) information on centres’ estimated grades for all candidates, including:

qualification-level estimates for linear (including linear unitised) specifications; and

unit-level estimates for externally assessed units in all other unitised specifications.

Instructions from the Regulators

(xv) any written instruction from the regulators specifying particular evidence that must inform the awarding process for a particular series; and

(xvi) relevant evidence from the regulators’ monitoring and comparability reports.

APPENDIX C

OCR METRICS

Figures below are rounded and taken annually, unless stated otherwise.

Location: three sites (Birmingham, Cambridge, Coventry)
Staff: 600+ employees
Assessors: 16,000 examiners and moderators
Customers: 7,600 schools, colleges and other institutions
Qualifications: 470 general; 1,050 vocational
Turnover: £116 million

Centre Approval and Visits

Centre approval requests: 244 general; 190 vocational; 142 Asset
Interchange centres: 8,000 (90% of active centres)
CPC inspections: 270
CAST visits: 250

Entries

Entries: 7.7 million (general); 1.9 million (vocational)
Candidates: 1.3 million (general); 1.5 million (vocational)
Centres making entries: 5,350 (general); 6,500 (vocational)
Private candidate investigations: 126,000
Prior achievement requests: 164 candidates
Transfer candidates: 1,500

Assessors

14–19 and post-19 assessor agreements: 12,500
Active (allocated) assessors (VQ): 1,200
Allocations made (VQ): 40,000
Centre/assessor amendments processed (VQ): 7,000
Awarding and marking review-related meetings (GQ): 910 (attended by 3,690 assessors)

Assessment Production

Assessment materials produced and published: 6,500 items (including post-19, 14–19 and Asset)
Assessment materials printed: over 3,000 print orders placed for ~9.5 million copies
Mark schemes produced and published: 2,000 items
Examiner reports published: 1,000 items
scoris® zoned papers: 770

Exam Stationery Production

Personalised attendance registers: 400,000
Examiner mark sheets: 160,000
Personalised internal assessment mark sheets: 145,000
Forecast grade forms: 600,000
Examiner address labels: 100,000 sheets
Candidate answer sheets: 30,000

Logistics

Question papers despatched: 8.5 million
Non-confidential items despatched: 17 million
Orders for publications received: 6,500
Publication items despatched: 45,000

Special Requirements

Special consideration requests: 86,000
Access arrangement requests: 9,000 modified paper requests; 4,000 referred access arrangement requests

Processing

Scripts marked traditionally: 2 million
Multiple choice answers keyed: 27,000
Supplementary marks keyed: 260,000
Centre authentication forms processed: 3,400
Coursework mark adjustments and moderator reports processed: 60,000
SEM forms (checks on examiners’ marking) processed: 9,850
Visit reports processed (NVQ and Nationals): 14,000
Pay claims checked and authorised (VQ): 24,000

ESM

Scripts for e-marking: 70% of total examined 14–19 qualifications
Scripts scanned and marked on screen: 3.75 million
Scanned (A4) images: over 60 million
scoris® concurrent user peak: 1,400
Peak marking/marks return (per day): over 100,000

Results and Certificates

Statements of results issued (GQ): 2.23 million
Certificates issued (GQ): 1.3 million
Results issued (VQ): 3.4 million

Post-results Services

Enquiries about results: 49,000
Access to Scripts requests: 95,500
Missing or incomplete results requests: 4,100
Late certification requests: 12,200

Archives

Certifying statement requests (GQ): 7,351
Archives requests (VQ): 22,000

APPENDIX D

RPI AND HISTORICAL OCR FEE CHARGES—GCE & GCSE

                     OCR Fee % Variance                         Actual Fee (£)
Year       RPI       GCE 6 Unit   GCE 4 Unit   GCSE      GCE 6 Unit   GCE 4 Unit   GCSE

2007–08    4.10%     3.75%        n/a          4.35%     74.70        n/a          24.00

2008–09    4.80%     3.61%        −10.04%      3.33%     77.40        67.20        24.80

2009–10    −1.30%    3.10%        2.98%        4.84%     79.80        69.20        26.00

2010–11    4.70%     3.01%        2.89%        3.08%     82.20        71.20        26.80

2011–12    5.20%     3.65%        3.65%        3.73%     85.20        73.80        27.80

The annual RPI quoted is for August, calculated as the change from August of the previous year.
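
Assuming the fee-variance columns are calculated the same way, as a simple year-on-year percentage change (a worked check against the fees listed above, not a formula stated in the submission), the GCSE figure for 2011–12 follows from the table:

\[
\left(\frac{27.80}{26.80} - 1\right) \times 100\% \approx 3.73\%
\]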

APPENDIX E

APPENDIX F

FEE COMPARISON BETWEEN AWARDING BODIES

Selected A2 Full Certification Fees (£)

                                 2009-10                   2010-11                   2011-12
Subject               Unit Req   AQA     OCR     Edexcel   AQA     OCR     Edexcel   AQA     OCR     Edexcel

Chemistry                        70.20   79.80   101.40    74.20   82.20   103.80    75.30   85.20   105.60

English                          70.20   69.20   79.60     74.20   71.20   81.60     75.10   73.80   82.80

History                          70.20   69.20   82.40     74.20   71.20   84.40     75.10   73.80   85.60

Design & Technology              70.20   69.20   155.20    74.20   71.20   158.80    95.30   73.80   161.20

Selected GCSE Full Certification Fees (£)

                      2009-10                   2010-11                   2011-12
Subject               AQA     OCR     Edexcel   AQA     OCR     Edexcel   AQA     OCR     Edexcel

Chemistry                     26.10                     27.80             28.15   28.20   31.50

English                       26.00                     26.80             28.10   27.80   27.30

History                       26.00                     26.80             28.10   27.80   29.60

Design & Technology           26.00                     26.80             28.10   27.80   27.20

Notes:

AQA, OCR and Edexcel all quote unit fees for GCE. The unit fees shown are standard fees, although there are exceptions for specific subjects where the assessment method warrants a higher fee (e.g. a visiting examiner).

September 2008 saw the introduction of the “4 unit” GCE, with the exception of some subjects that remained as 6-unit qualifications (sciences, music). Fees for the newly introduced 4-unit specifications were reduced by 10% from the previous specification fees charged in 2007–08. Fees for those subjects remaining as 6-unit qualifications were increased by 3.61%.
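
As a brief arithmetic check against Appendix D (the submission does not show the working itself), the 2008–09 fees listed there are consistent with these percentages applied to the 2007–08 six-unit fee of £74.70:

\[
74.70 \times 1.0361 \approx 77.40 \;\text{(6-unit fee)}, \qquad 74.70 \times (1 - 0.1004) \approx 67.20 \;\text{(4-unit fee)}
\]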

Analysis of 2011 Advanced GCE unit revenue shows that 20% (circa £6.2 million) was generated from resits.

Following the unitisation of many GCSE Short Courses in 2009, OCR opted to reduce the overall cost of gaining a Short Course, whereas competitors typically charge an additional £5 to £6 to certificate.

APPENDIX G

EXTRACT FROM TERMS AND CONDITIONS FOR ASSESSMENT TASKS

Confidentiality and Disclosure

5.4 The assessor shall not, without the prior written permission of the Head of Assessor Management of OCR, use his/her name in association with that of OCR whilst carrying out the assessment services. The assessor will not use the OCR name for the assessor’s own commercial or non commercial purposes or whilst carrying out services under any other agreement with OCR, or allow it to be so used, whether expressly or by implication. For the avoidance of doubt this restriction shall apply during the agreement and at any time after the termination of the agreement howsoever the termination comes about.

5.5 To ensure the integrity of OCR’s assessments the assessor is required to make written declarations if the assessor has any interest in or with any person taking or involved in any way with an OCR assessment to the Head of Assessor Management at any time during the period of this agreement and for two years following expiry of this agreement. The assessor has an interest in a person if that person is a close relative, or is a person where the assessor’s interest (whether professional or not) could compromise the integrity of OCR’s assessments, or the assessor’s integrity, if the relationship were not disclosed.

5.6 The assessor is required to notify OCR of any potential conflicts of interest or any previous or existing relationship with any OCR centre in which the assessor as an individual has been required to provide any services in any capacity.

5.7 The assessor is required to disclose in writing whether the assessor is preparing candidates for the specification for which the assessor is providing assessment tasks to the Head of Assessor Management at any time during the period of this agreement that this becomes relevant.

5.8 OCR retains the right to determine whether a conflict of interest exists and any such judgement shall be final.

November 2011
