Committee Members present: Bob Neill, Bambos Charalambous, David Hanson, John Howell, Victoria Prentis, Ellie Reeves, Marie Rimmer, Andy Slaughter
Seminar participants: Dr Joe Tomlinson (King's College London); Dr Natalie Byrom (Legal Education Foundation); Dr Meredith Rossner (London School of Economics); Professor Abi Adams-Prassl and Professor Jeremias Adams-Prassl (University of Oxford); Dr Jack Simson Caird (Bingham Centre for the Rule of Law)
1) While evidence to the Justice Committee’s inquiry into court and tribunal reforms strongly supported the need for evaluation, few submissions provided specifics on how evaluation ought to be carried out. The Committee therefore convened an expert seminar to provide assistance on this issue. Introducing the session, Dr Joe Tomlinson explained that it would focus on one central question: what steps should the Government take to ensure proper and effective evaluation of the courts and tribunals reform programme?
2) Dr Natalie Byrom suggested that the Government should ensure that the reform programme is evaluated against the minimum standard of access to justice found in the law of England and Wales. This was defined as: (i) access to the formal legal system, e.g. the ability to initiate a claim; (ii) access to a fair and effective hearing; (iii) access to a decision; and (iv) access to an outcome.
3) Dr Byrom thought that this definition of access to justice should be institutionalised as part of the reform programme. Her primary suggestion was that assessments of the programme’s impact on access to justice ought to be undertaken, based on an evaluation that explores the progression of a full range of cases and individuals through the system from claim initiation to outcome (e.g. settlement, withdrawal, etc.). If HMCTS/Ministry of Justice wished to depart from the legal definition of access to justice in evaluating reform, they should be required to explain publicly how and why they wished to do so, and to make any such departure through primary legislation.
4) Dr Byrom further proposed that, for each individual service (e.g. Civil Money Claims Online), HMCTS should publish the underpinning logic models and intended outcomes for individuals at each stage of the process, and should:
a) appoint a Senior Responsible Owner for the ongoing project-level evaluation;
b) confirm that project evaluation is monitoring the impact of services on vulnerable people and access to justice;
c) detail the specialist resource dedicated to project-level evaluation; and
d) commit to making the findings of these evaluations public.
At a wider level, it was suggested that HMCTS should publish estimates of the impact of cross-cutting projects (such as court closures and fully video hearings) on each of the service-level projects.
5) Dr Byrom also proposed that HMCTS be required to model the impact of court closures on court users rather than the general population, as the two groups have different characteristics. In addition, HMCTS should be required to publish the modelling used to calculate what level of travel costs will be deemed “too expensive” for people, and what level of “supplementary provision” has been costed to deliver the business case. The need to monitor court closures, as a key part of the wider programme of reform, was supported by the whole panel.
6) Dr Byrom further proposed that the transfer to digital systems should be seen as an opportunity for HMCTS to collect detailed data on the operation of the courts and tribunals systems, including the experience of users. This proposal was supported by all seminar participants, and various specific suggestions were made about how such a requirement could be operationalised.
7) Professor Abi Adams-Prassl and Professor Jeremias Adams-Prassl suggested that the evaluation agenda within the reforms needs to be conceived of as a long-term programme, into which resources should be invested in addition to those assigned to the reform process itself. This was supported by others on the panel.
8) In line with Dr Byrom’s proposal, they also argued that evaluation must consider the outcomes that individuals achieve within the justice system. Moving to an online system is expected to have an ambiguous impact on access to justice: while the digitally capable should find it easier to launch claims, the same does not necessarily hold for the vulnerable and digitally excluded. This means that overall caseload and measures of average user satisfaction are insufficient to determine the impact of the reforms on the full run of litigants; success rates, withdrawal rates, and the nature of remedies awarded must also be analysed. It was suggested that evaluation should consider how reform changes the distribution of outcomes achieved and which types of users are affected. This is likely to require a combination of commissioned surveys to form an adequate baseline, ongoing data on users to provide a comparable yardstick across time, and the use of randomised evaluation in the piloting of new programmes.
9) Professor Adams-Prassl and Professor Adams-Prassl suggested that it is helpful to conceive of evaluation at three levels:
10) As regards the appropriate standard by which to assess access to justice, Professor Adams-Prassl and Professor Adams-Prassl suggested that case law provides a framework for understanding access to justice, further reflecting Dr Byrom’s proposals. Specifically, there are two key risks to take into account in any assessment: the risk of futility and the costs of futility. In addition, vulnerability is a key consideration. Questions were raised in discussion about whether a universal standard might become a “blunt instrument”, but there was agreement that clarity, for the purposes of evaluation, was preferable.
11) It was further suggested by Professor Adams-Prassl and Professor Adams-Prassl that the iterative agile design and testing process being adopted by HMCTS must be properly documented and transparently communicated to external stakeholders. Specifically, the design iterations that are trialled, the population of claimants drawn upon in this iterative trial process, and the precise evaluation metrics used to establish success in user testing must all be documented and consulted on with external stakeholders.
12) Dr Meredith Rossner spoke about her experience of undertaking evaluation research with HMCTS. Reflecting points raised by other participants, she suggested that evaluation needs to cover both the process (or implementation) and the outcome (or impact) of a given reform. However, she cautioned that if an outcome evaluation is conducted too early in the implementation process, there is a risk that it will evaluate the implementation of the reform rather than the reform itself. Dr Rossner suggested that indicators of successful outcomes could include well-established procedural justice measures, access to justice (including an analysis of barriers to access), participant satisfaction, cost efficiency and decision-making outcomes (where applicable).
13) Dr Rossner reiterated that robust evaluation methods are essential. She thought it was vital to have a meaningful comparison group for a robust outcome evaluation; the ideal way to achieve this is through a randomised controlled trial. Echoing other participants, she proposed that a clear description of the methods and measures adopted for evaluation should be made available on the HMCTS website.
14) Reflecting specifically on her own process evaluation of the HMCTS video hearings pilot in the First-tier Tribunal (Tax Chamber), Dr Rossner outlined that there were positive findings in terms of the user experience. Hearings similar to the ones under study, where users are professionals (as in most HMRC appeals) able to participate from their homes or workplaces, and where there is little documentation and evidence to examine, may also be suitable for video hearings. However, Dr Rossner cautioned that findings from evaluations such as this cannot be generalised to video-enabled hearings or to video hearings in other jurisdictions, such as Criminal or Immigration and Asylum. She pointed out that earlier research on video-enabled hearings consistently reports that vulnerable users, such as defendants appearing from a custody suite or migrants in detention, may be at a disadvantage; this demonstrates the need for high-quality research and evaluation to be embedded within the reform programme.
15) Dr Jack Simson Caird highlighted the wider constitutional context of the reform programme. He suggested that the reform programme is fundamentally re-designing the justice system of England and Wales, and that this is of constitutional significance; the institutional silo in which the programme has so far operated, and the agile development process with which it has been designed, have served to disaggregate the programme from the broader constitutional context in which it exists. According to Dr Simson Caird, this situation generates two major concerns: first, that there is a democratic deficit, as the reforms are not being properly debated and scrutinised; and second, that the long-term impact of these reforms on access to justice (and, more broadly, the Rule of Law) has been insufficiently acknowledged.
16) To alleviate these concerns, Dr Simson Caird suggested that the Committee should examine the institutional framework in which the programme operates and identify mechanisms by which to improve support, communication and collaboration across all governmental departments, including between HMCTS and the Ministry of Justice. He further suggested that the Government ought to bring forward legislation to enable the overall effect of the programme to be democratically scrutinised and debated.
17) Dr Simson Caird also pointed to difficulties in scrutinising digital processes. He suggested that existing forms of scrutiny are unlikely to give MPs or the public a sense of how the justice system will work in practice. To ensure the transparency and accountability of the justice system, he suggested that parliamentarians and the public should be able to scrutinise the digital interface as well as the law which underpins it. To this end, the programme should increase openness and transparency by making more early-stage ‘draft’ versions of digital system designs available, thus increasing evaluation opportunities and mitigating concerns about clarity, intelligibility, and predictability.
18) In discussion, the issue was raised of how any evaluation process could be made effective, especially in view of the passage of time, changes in HMCTS personnel and so on. It was suggested that there may be a need for a “fixed” mechanism of evaluation, such as independent reports on the reforms being statutorily required at certain points in time. While GDPR had been cited as a potential barrier to data collection, it was felt that its effect had been overstated. Committee Members asked about the lack of HMCTS baseline data; while this problem was recognised, participants thought it was more realistic to focus on assessing the access to justice impact of the reforms as they are rolled out.
19) In conclusion, the Chair thanked all the participants for giving up their time to attend the event, and for sharing their expert views with Members.
Published: 31 October 2019