Supplementary memorandum submitted by
the Crown Prosecution Service
Question 83 (Mr Richard Bacon): Cracked and ineffective
trials
The Committee sought additional information
on cracked trials and, separately, on ineffective trials, on the same
basis as that used in Appendix 3 of the NAO Report.
A cracked trial is one which ends on the day
of the trial listing without the trial going ahead; that is, where
the prosecution offers no evidence or where the defendant pleads
guilty. An ineffective trial is one which has to be adjourned
to another day. The three tables overleaf provide information
on cracked trials and, separately, on ineffective trials,
on the same basis as that used in Appendix 3 of the NAO Report.
COMMENTS ON THE DATA
In 2004-05, 10.2% of all trials were cracked
under the cracked trial reasons identified by NAO. Individual
CJS Area figures ranged from 3.8% in Durham to 14.5% in North
Yorkshire.
In 2004-05, 4.2% of all trials were ineffective
under the ineffective trial reasons identified by NAO. Individual
CJS Area figures ranged from 1.1% in Warwickshire to 7.1% in Greater
London.
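For illustration only, the percentages above are simple proportions of all listed trials in the period, split by CJS Area. A minimal sketch of that calculation is set out below; the record layout, category strings and function name are assumptions made for the example and do not describe the HM Courts Service recording system.

```python
# Illustrative sketch only: hypothetical record layout and category labels,
# not the HM Courts Service data capture system.
from collections import Counter

# NAO-selected cracked trial categories (as listed later in this memorandum)
CRACKED_CATEGORIES = {
    "Defendant bound over: first time offered by defence",
    "Defendant bound over: previously rejected by prosecution",
    "Prosecution end case: insufficient evidence",
    "Prosecution end case: other",
}

# NAO-selected ineffective trial categories
INEFFECTIVE_CATEGORIES = {
    "Prosecution not ready (disclosure problems)",
    "Prosecution witness absent: police",
}

def area_rates(trials):
    """trials: iterable of (area, main_reason) pairs, one main reason per listed trial.

    Returns {area: (cracked_rate, ineffective_rate)} as percentages of all
    trials listed in that CJS Area.
    """
    totals, cracked, ineffective = Counter(), Counter(), Counter()
    for area, reason in trials:
        totals[area] += 1
        if reason in CRACKED_CATEGORIES:
            cracked[area] += 1
        elif reason in INEFFECTIVE_CATEGORIES:
            ineffective[area] += 1
    return {
        area: (100.0 * cracked[area] / n, 100.0 * ineffective[area] / n)
        for area, n in totals.items()
    }
```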
All Areas have received a common set of guidance
on the cracked and ineffective trial monitoring process; that
guidance was developed jointly by HM Courts Service and the CPS.
The guidance is amended and improved from time to time, and the latest
version was issued in September 2005. The cracked
and ineffective trial reasons were also amended in September 2005.
Despite the provision of common guidance, local
interpretation or practices could have skewed results in some
Areas, leading to local variances.
Data integrity could be an issue. The data is
captured by HM Courts Service staff, and care has to be taken to
ensure it is recorded accurately. Local pressure on staff could
affect the accuracy of data recording.
Ineffective trials are the responsibility of
all members of local criminal justice agencies, and the action
of one part of the system can often help or hinder other parts:
for example, listing cases very quickly or at short notice can put
undue pressure on the prosecution to be ready.
When the prosecution is "not ready"
for trial, the underlying reason is not captured in a structured
way as part of the monitoring (although it may be recorded as a
free-text comment). These underlying reasons vary: for example,
forensic or doctors' evidence not being available, late service
of alibi evidence still being investigated, or a CCTV tape not being
produced at court. Repeated failings need to be addressed at Area level.
Prosecution reasons for variances in cracked
and ineffective trial rates include:
witness warning issues, particularly
where English is not a first language and interpreters are required;
defence practitioners intent on delaying
proceedings;
local listing issues, where cases
are double listed or transferred to other courts;
high agent use, where agents do not
understand the monitoring arrangements;
relatively high incidence of police
officer witnesses failing to attend court;
approaches to having defendants "bound
over" at court;
difficulties in the prosecution being
ready for trial, causing ineffective trial performance and prompting
further local investigation.
Data is reviewed at regular local meetings
between interested parties and performance officers, and strategies
should be developed to ensure that the data is reliable and that
performance improves.
In the CPS, ineffective trial data is used as part
of an established Area performance review process, under which
senior Area managers have to account for their performance
at a series of quarterly meetings with senior headquarters
personnel.
IMPROVING PERFORMANCE
The CPS is working with other agencies to improve
performance; the steps being taken include:
applying more scrutiny to "prosecution"
failings causing cracked and ineffective trials through the Area
performance review process (using the NAO-selected categories),
which involves written reporting followed by periodic face-to-face interviews;
encouraging local monitoring of high
level data results to prompt more detailed analysis of individual
cases;
increasing the proportion of CPS
in-house advocates undertaking magistrates' court sessions and
decreasing the reliance on agents;
working with HM Courts Service colleagues
to review and improve the quality of guidance;
continuing to develop Witness Care
Units to improve communication with witnesses and improve
court attendance levels;
embedding charging arrangements to
ensure that more charges are right at the outset and there is
less opportunity for late plea or bind over discussions;
greater use of case progression officers,
as and when resources allow, to ensure timely trial readiness
is achieved;
closer working between the CPS and the courts
to improve listing arrangements; and
continuing the work of local implementation
teams to support the Effective Trial Management Programme process.
EXAMPLES OF THE CRIMINAL JUSTICE AGENCIES WORKING TOGETHER
Some examples of good practice in promoting
more effective trials are outlined below:
in Area A, Trial Readiness Assessment
Meetings are held twice weekly. These are used to review each
case that has been listed for trial in the coming three weeks
to ensure that there are no difficulties with the trial proceeding
on time. Monthly inter-agency meetings are also held where each
ineffective trial is reviewed and analysed;
in Area B, meetings are held between
project teams, area co-ordinators, and colleagues from across
the CJS. This is an opportunity to identify areas of good practice
and to share them with others. The Area is also making use of
Citizens' Panels, asking the public for their opinions on appropriate
standards of care for victims and witnesses and on how they would
like Witness Care Units to operate;
in Area C, case progression officers
are working in partnership with cross-agency colleagues to prevent
cases from becoming ineffective before the trial date. In conjunction
with this, the local delivery board has allowed the Area to review
its targets and tackle poor performance more effectively. The
Board identifies and examines the reasons behind ineffective and
cracked trials and takes remedial action;
analysis of the reasons for ineffective
trials is part of the inter-agency performance management arrangements
within each criminal justice Area and forms part of their regular
self-assessment.
Graphs 1-3 provide information on cracked trials
and, separately, on ineffective trials, on the same basis as that used
in Appendix 3 of the NAO Report. The information covers trials
that did not go ahead because either the prosecution (CPS, police or
both) was not ready, or the prosecution dropped the case on the trial
date, but excludes trials which did not go ahead because of problems
with civilian witnesses.
The categories of cracked and ineffective trials
used by NAO to capture the information are:
Cracked trial categories:
Defendant bound over: first time offered by defence
Defendant bound over: previously rejected by prosecution
Prosecution end case: insufficient evidence
Prosecution end case: other
Ineffective trial categories:
Prosecution not ready (disclosure problems)
Prosecution witness absent: police
The data used in Appendix 3 is collected
by the courts. The main reason for the trial not proceeding as
planned is recorded before the parties leave the court on the
direction of the presiding chair of the Bench or the judge. The
parties agree a main reason for the trial not proceeding as planned
to aid the decision by the presiding chair of the Bench or judge.
Where more than one reason is identified, or there are different
reasons for different defendants, the presiding chair of the Bench
or judge determines the dominant cause, as for statistical purposes
only one reason can be recorded.
Data returns are monitored locally on a regular
basis by the court, CPS, police, LCJB Performance Officers and
the Legal Services Commission (LSC), so that CJS staff can work
together to resolve any problems with a view to increasing the
proportion of "effective" trials including resulted
cracked trials. If there is a sudden drop in performance in any
month then this data should be carefully checked before it is
submitted to the HMCS Performance Directorate or entered on the
appropriate computer system. The HMCS Performance Directorate
monitors all monthly data and queries returns that appear to warrant
further investigation.
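To illustrate the kind of pre-submission check described above, a simple month-on-month comparison is sketched below; the data layout and the three-percentage-point threshold are assumptions made for the example, not HMCS practice.

```python
# Illustrative sketch only: flags months where the recorded rate moves sharply,
# so the return can be checked before submission to the HMCS Performance
# Directorate. Threshold and data layout are assumed for the example.

def flag_sudden_changes(monthly_rates, threshold=3.0):
    """monthly_rates: list of (month, rate_percent) tuples in date order.

    Returns (month, previous_rate, current_rate) for each month whose rate
    moved by more than `threshold` percentage points from the month before.
    """
    flagged = []
    for (_, prev_rate), (month, rate) in zip(monthly_rates, monthly_rates[1:]):
        if abs(rate - prev_rate) > threshold:
            flagged.append((month, prev_rate, rate))
    return flagged

# Example: a sudden rise in the ineffective trial rate in June is flagged.
rates = [("2005-04", 4.1), ("2005-05", 4.3), ("2005-06", 8.0)]
print(flag_sudden_changes(rates))  # -> [('2005-06', 4.3, 8.0)]
```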

Question 127 (Mr Sadiq Khan): Information on the
number of unrepresented defendants
The DCA, the LSC, the CPS and the other prosecuting
agencies do not collect information on how many defendants pay
for their own defence, which makes it impossible to calculate
accurately the number of unrepresented defendants in the Crown
Court and the magistrates' courts. There is therefore no hard
data on unrepresented defendants in either the Crown Court or
the magistrates' courts.
In the magistrates' courts in 2004-05, 573,473
defendants were granted representation under a representation
order.
In 2004, 121,345 defendants or appellants were
represented under Legal Aid in the Crown Court.
There is no hard data on the consequences of
defendants being unrepresented and this is a subject which would
require original research to produce results.