23.Since the Committee’s 2018 Report, the Biometrics Strategy has finally been published (in June 2018), five years after it was first promised. Both our Chair and the Biometrics Commissioner have questioned whether the document amounts to a strategy. The Biometrics Commissioner, Professor Paul Wiles, told us:
I thought that it was a slightly confusing and disappointing document […] It starts off with what I might describe as a very good prologue for a strategy. It then simply becomes a list of some of, but by no means all, the things that the Home Office is doing on the use of biometrics—and then stops. I thought that that was disappointing. It was a missed opportunity to lay out a strategy.
In addition, the Commissioner made a public statement in 2018 shortly after the Strategy was published saying:
It is disappointing that the Home Office document is not forward looking as one would expect from a strategy. In particular it does not propose legislation to provide rules for the use and oversight of new biometrics, including facial images. This is in contrast to Scotland where such legislation has been proposed. Given that new biometrics are being rapidly deployed or trialled this failure to set out more definitively what the future landscape will look like in terms of the use and governance of biometrics appears short sighted at best.
24.By comparison, the Scottish Government held a consultation on its proposals for second generation biometrics legislation and, as a result, is pursuing a principles-based approach. The Scottish Biometrics Commissioner Bill: business and regulatory impact assessment (BRIA) was published on 31 May 2019. The Biometrics Commissioner said, “It is an interesting example of attempting to legislate in a way that will cope with future technical change in this area”. No similar public consultation has taken place in England and Wales, despite the Biometrics Strategy stating that there would be a consultation on the governance of biometrics. Whether the Home Office has privately consulted relevant organisations remains unclear. The Government’s position on discussions with Scottish counterparts appeared confused, with the Minister asking her officials whether they had been talking to their Scottish counterparts about legislation.
25.The Surveillance Camera Commissioner (SCC), Microsoft and the Information Commissioner’s Office highlighted the need for legislation regarding second generation biometrics, such as automatic facial recognition. The SCC said:
A clearer legal framework outlining how Automatic Facial Recognition (AFR) can be deployed would support law enforcement practitioners […] AFR is arguably more invasive than some covert surveillance techniques.
The Biometrics Commissioner also made this point:
I think, most of the police service would like to see a proper legal framework for the new biometrics; otherwise, as each of these different biometrics is trialled, they will be challenged and will find it very difficult, but it is also more difficult to develop and evaluate new technology of this kind if you do not know what the rules are that will apply once you have done it. If they have to be bolted on afterwards, that makes life more complicated, so I think the police service would like to see a framework within which to operate.
26.A Home Office review of the governance of biometrics is currently underway. The Biometrics Commissioner said that he had been only minimally involved, merely questioning the review’s scope. Dr Prince, a Home Office official, told the Committee that the review would be
advising on a variety of options. Clearly, one of those might be legislation, but there are other mechanisms by which we could improve the clarity and simplicity of the way in which biometrics are governed going forward.
The Minister informed us that there was already a legal framework in place, including a data protection component, and that it was largely principles-based. However, although there is a legal framework for two biometrics—DNA and fingerprints—under the Protection of Freedoms Act 2012, it does not cover ‘new’ biometrics such as facial or voice recognition.
27.The Government’s 27-page biometrics strategy was not worth the five-year wait. Arguably it is not a ‘strategy’ at all: it lacks a coherent, forward-looking vision and fails to address the legislative vacuum that the Home Office has allowed to emerge around new biometrics. Ultimately, it represents a missed opportunity for the Government to set out a principles-based approach for the use and oversight of second generation biometrics. Simply establishing an oversight board, with no legal powers, is not good enough given the highly intrusive nature of the technologies. Further, the development and use of biometric technologies must be transparent and involve as much public awareness and engagement as possible, to ensure that there is public trust in the technologies. Unfortunately, public engagement has been sorely missing from the Home Office’s approach to date. Its ongoing ‘consultation’ on the governance of biometrics has no published terms of reference and there is no obvious way for interested parties to participate. This is not good enough.
28.The UK Government should learn from the Scottish Government’s approach to biometrics and commission an independent review of options for the use and retention of biometric data that is not currently covered by the Protection of Freedoms Act 2012. The results of the review should be published along with a Government Response, and a public consultation on the Government’s proposed way forward should follow. This process should culminate in legislation being brought forward that seeks to govern current and future biometric technologies.
29.The development and application of automatic (or ‘live’) facial recognition (AFR), notably by some police forces, encapsulates a number of the problems that have arisen due to the lack of a clear legislative framework for the technology.
30.AFR is the automated one-to-many ‘matching’ of near real-time video images of individuals against a curated ‘watchlist’ of facial images. Since 2016 it has been trialled by the Metropolitan Police and South Wales Police. The Metropolitan Police told us that it had:
now concluded its trial of Live Facial Recognition Technology (LFR). The last formal trial was on 14th February 2019. The MPS is pleased that the technology–even though in a trial phase–helped to show how it can keep people safe. In all, with very limited deployments, eight arrests were made directly from the technology with many relating to serious violence.
The London Policing Ethics Panel conducted a review of the trial by the Metropolitan Police and explained that:
Live facial recognition enables the police to conduct identity checks assisted by an automated recognition system, in real time and in public places. Facial features are scanned as people pass by cameras utilising specialised software. These are automatically checked against facial images on a ‘watch list’. These are images drawn from custody photographs and other police sources.
The Metropolitan Police told us that:
LFR is only an intelligence tool to potentially identify those who are wanted/of interest to police. Once anyone is subject to ‘a match/alert’ then police still need to verify identification by traditional means e.g. personal documents, Police National Computer (PNC) or other mobile technology such as fingerprints. LFR sits separately to any investigation–however, it may well be mentioned in individual witness statements but only to describe the events up until the person of interest was stopped.
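The one-to-many matching step described above can be sketched in outline as follows. This is a hedged illustration only, not the Metropolitan Police’s actual system: the embedding vectors, the cosine-similarity measure, the threshold value and the watchlist structure are all assumptions made for the example.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face-embedding vectors (illustrative)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_against_watchlist(probe, watchlist, threshold=0.8):
    """One-to-many matching: compare a probe embedding (from a live camera
    frame) against every embedding on a curated watchlist. Scores at or
    above the threshold raise an alert; per the process described above,
    a human officer must then verify identity by traditional means."""
    scores = [(identity, cosine_similarity(probe, reference))
              for identity, reference in watchlist.items()]
    return sorted((s for s in scores if s[1] >= threshold),
                  key=lambda s: s[1], reverse=True)

# Illustrative embeddings: real systems use high-dimensional vectors
# produced by a trained face-recognition model.
watchlist = {"person-A": [1.0, 0.0, 0.0], "person-B": [0.0, 1.0, 0.0]}
alerts = match_against_watchlist([0.9, 0.1, 0.0], watchlist)
```

An alert in this sketch is only a candidate match: as the Metropolitan Police noted, verification by traditional means is still required before any action is taken against the person identified.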
The London Policing Ethics Panel’s final report on Live Facial Recognition proposed that LFR should only be deployed where specific conditions were met and under certain provisions. It also recommended:
i)that its ethical framework be incorporated into any future trials;
ii)that, if LFR is adopted, a public attitudes survey should be conducted 12 months later; and
iii)that the Home Office should simplify and strengthen regulation.
31.The evaluation of the trials in both London and Wales by the Biometrics and Forensics Ethics Group, in its interim report, identified a number of questions arising from the AFR trials.
The Group concluded that there had been a “lack of independent oversight and governance of the use of [Live Facial Recognition]” in these trials and recommended that, pending the development of a legislative framework, the police trials should comply with the “usual standards of experimental trials, including rigorous and ethical scientific design”. It proposed in its report “a number of ethical principles that can be used to inform these deployments and frame policy-making”.
32.The Biometrics Commissioner has since called for the trials to be conducted in a more consistent, standardised and robust way, thereby allowing proper academic evaluation while avoiding the risk of ‘function creep’ from trial to extended deployment. He suggested that this had not always been the case in the trials to date.
33.The Information Commissioner is so concerned about the use of AFR by police forces in public spaces that she has opened a priority investigation “to understand and investigate the use of AFR by law enforcement bodies in public spaces. This will include considering the legal basis, the necessity, proportionality and justification for this intrusive processing.” The Commissioner is also concerned about:
i)the ambiguity of the trials;
ii)the more general roll out of AFR; and
iii)whether the trials demonstrate full compliance with the Data Protection Act 2018.
The investigation will therefore be looking at the trials as well as other deployments. It will be completed later in 2019.
34.The Minister did not share concerns about the lack of regulation of AFR, and stated that the:
very positive purpose of trials was to see where you have got it right and where improvements could be made. It has to be said that the police are operating within their powers when they do these trials […] I am not personally concerned about whether it is lawful, given the parameters within which the police have operated. I am quite comforted by the fact that the Information Commissioner is using her remit to scrutinise the basis on which they are operating.
36.Since we took evidence, a three-day case was heard at the High Court in Cardiff in May 2019 regarding South Wales Police’s Home Office-funded trials of live facial recognition. Liberty, on behalf of its client, a Cardiff resident, challenged South Wales Police’s use of live facial recognition on the grounds that it breached the right to privacy, equality laws and data protection laws. An expected date for the judgment has not been set. As the Biometrics Commissioner said in his 2018 Annual Report:
This judgment will be significant not just for South Wales Police and their ongoing trials and/or deployment of this technology but for all police forces, the wider use of the technology and its future governance.
In addition, Big Brother Watch is currently bringing a legal challenge against the Metropolitan Police and the Home Secretary regarding the use of live facial recognition in public spaces.
36.There is growing evidence from respected, independent bodies that the ‘regulatory lacuna’ surrounding the use of automatic facial recognition calls the legal basis of the trials into question. The Government, however, seems not to realise, or will not concede, that there is a problem.
37.We reiterate our recommendation from our 2018 Report that automatic facial recognition should not be deployed until concerns over the technology’s effectiveness and potential bias have been fully resolved. We call on the Government to issue a moratorium on the current use of facial recognition technology; no further trials should take place until a legislative framework has been introduced, guidance on trial protocols has been issued, and an oversight and evaluation system has been established.
38.We recommend that the Home Office issue guidance on automatic facial recognition trials when it introduces a legislative framework for them, and that trials must be conducted to a scientific standard.
39.The retention of custody images is of concern because these images can form the basis of ‘watchlists’ for automatic facial recognition technology when used by police forces in public spaces. This issue was raised in our predecessor Committee’s 2015 Biometrics Report. First and foremost, the then Committee criticised the lack of a “robust governance regime” and of a statutory basis for holding images of innocent people in order to detect crime. This followed a 2012 High Court case—R (RMC and FJ) v MPS (Metropolitan Police Service)—in which the Court ruled that the indefinite retention of innocent people’s custody images was “unlawful”. Big Brother Watch argued that:
the number of custody images currently held on the Police National Database [is] 23 million. This is an increase of 4 million since the previous figures were updated just one year ago. According to the Biometrics Commissioner, a staggering 10 million of these images have now been made biometrically searchable by facial recognition technology, following an upgrade to the system in 2014 which occurred without parliamentary or public scrutiny. With sub-sets of this database being used at police deployments of live facial recognition, innocent people are increasingly at risk of being wrongfully stopped or even arrested. This also completely blurs the line between the innocent and the guilty, and makes a mockery of the presumption of innocence.
40.The Government’s 2017 Custody Image Review—its response to the 2012 High Court ruling in RMC and FJ v Commissioner of Police for the Metropolis and Secretary of State for the Home Department [2012] EWHC 1681 (Admin) (‘RMC’)—said that individuals could request to have their images deleted but that there was no ‘automatic weeding’ of the images of unconvicted individuals. The Biometrics Commissioner noted that:
a system was put in place where any custody images were kept for six years and then should be reviewed. During those six years anybody, but especially those who had been unconvicted, could apply to have their images removed, and the guidance was that there should be a presumption that the police would do that unless there were good reasons why not.
He went on to explain that “the fact is that the six-year review has somehow not been taken into routine police practice” and that there was a:
very poor understanding of that six-year review period and little evidence that it was being carried out. Very few applications were being made to chief officers […] [there is a need] to bolster and remind police forces that they should be carrying out reviews at the end of six years and deciding whether they need to keep those images and, if not, to delete them. Secondly, there needs to be much greater public visibility of people’s right, if they want to, to apply to a chief officer to have their images removed.
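The retention regime the Commissioner describes can be expressed as a simple decision rule. The sketch below is purely illustrative: the field names, the ‘good reason’ flag and the treatment of the six-year period as 6 × 365 days are assumptions made for the example, not the actual police policy logic.

```python
from datetime import date, timedelta

SIX_YEARS = timedelta(days=6 * 365)  # illustrative approximation

def review_custody_image(captured_on: date, convicted: bool,
                         good_reason_to_retain: bool, today: date) -> str:
    """Sketch of the review regime described above: images are held for
    six years and then reviewed; for unconvicted individuals the
    presumption at review is deletion unless there is a good reason to
    retain."""
    if today - captured_on < SIX_YEARS:
        # Within the retention period. Anyone, especially the unconvicted,
        # may still apply to a chief officer for early deletion.
        return "retain"
    if not convicted and not good_reason_to_retain:
        return "delete"
    return "retain-with-recorded-reason"
```

The evidence above suggests the difficulty in practice is not the rule itself but that the six-year review was rarely carried out and few individuals knew they could apply.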
The ICO had similar concerns:
It is unclear how those individuals would know that they could make a request and we are aware that there have not been a significant number of requests, indicating a lack of awareness. So, the position remains, that there are potentially thousands of custody images being held with no clear basis in law or justification for the ongoing retention.
As we highlighted in our previous Report, a Press Association investigation in 2018 revealed that only 67 applications for deletion had been made between February 2017 (when the Home Office review concluded that unconvicted individuals should have the right to apply for the deletion of their custody image from all police systems) and October 2017, and that only 34 applications had been successful.
41.Minutes from the September 2018 meeting of the Biometrics Advisory Board recorded that “most forces” were struggling to comply with the Management of Police Information policy (MOPI) when weeding the custody images that they held. It was also clear that the Home Office had provided no earmarked additional resources to assist police forces with compliance.
42.In 2018, the Home Office told the Committee that it expected the “new platform being delivered by the National Law Enforcement Data Programme to […] enable a considerably more flexible approach to automatic deletion than is possible at present”. In 2019, however, the Committee heard that there had been delays to implementing this new technology. The Minister said that it had “not been procured” and that the Home Office “cannot commit to a date” for an automatic deletion system to be in place. In a further letter to us, Baroness Williams said that:
There are a number of challenges to overcome but we are working up a detailed technology and business change plan, which will entail substantial investment.
However, she confirmed that she had asked her officials “to work with the police to determine how manual deletion might be enhanced without placing an unmanageable burden on policing”.
43.Concerns were also raised by the Biometrics Commissioner and the ICO about the lawfulness of the current situation. The Biometrics Commissioner told us: “at the time the custody image review was published I was not at all sure this would meet further court challenges. I still think that”. The ICO, meanwhile, stated that delays to implementing the new National Law Enforcement Data Service would mean “that many images are potentially being held longer than necessary and this will therefore not comply with the DPA18”. The Minister promised that another custody image review would take place in 2020.
44.Since the Committee published its Report in 2018, progress has stalled on ensuring that the custody images of unconvicted individuals are weeded and deleted. It is unclear whether police forces are unaware of the requirement to review custody images every six years, or if they are simply ‘struggling to comply’. What is clear, however, is that they have not been afforded any earmarked resources to assist with the manual review and weeding process. The Minister previously promised improvements to IT systems that would have facilitated automatic deletion. Such improvements now appear to have been delayed indefinitely. As such, the burden remains on individuals to know that they have the right to request deletion of their image. As we stated in 2018, this approach is unacceptable and we agree with the Biometrics Commissioner that its lawfulness requires further assessment.
45.Police forces should give higher priority in the allocation of their resources to ensuring a comprehensive manual deletion process for custody images, in compliance with national guidance. In turn, the Government should strengthen the requirement for such a manual deletion process and introduce clearer and stronger guidance on it. In the long term, the Government should invest in automatic deletion software, as previously promised.
Published: 18 July 2019