Data Protection and Digital Information (No. 2) Bill

Written evidence submitted by the Equality and Human Rights Commission (DPDIB38)

House of Commons – Committee Stage

 

1.0 Introduction

The Equality and Human Rights Commission is a statutory body established under the Equality Act 2006. The Commission has been given powers by Parliament to advise the Government on the equality and human rights implications of laws and proposed laws and to publish information and provide advice, including to Parliament, on any matter related to equality, diversity and human rights. Find out more about the Commission’s work on our website.

1.1 The Data Protection and Digital Information (No. 2) Bill (the Bill) is an opportunity to review the way data collected in the UK is governed and to ensure that the regulation of data works for those gathering and using data, while preserving protections for people whose data are collected. The Government wishes to ‘simplify the UK’s data protection framework’. The Commission welcomes this aim but is concerned about the equality and human rights implications of some of the Bill’s provisions, particularly as they relate to Articles 8 (the right to a private and family life) and 14 (freedom from discrimination) of the European Convention on Human Rights (ECHR).

1.2 The role of technology in society continues to expand at speed. Those deploying tools using Artificial Intelligence (AI) are collecting more data about people than ever before. The unchecked collection and use of people’s data risks replicating or exacerbating bias and discrimination that can already occur in decision-making. We consider that the proposals in the Bill do not offer strong enough safeguards against this.

1.3 In addition to the proposals in the Bill, the Government published a White Paper on AI on 29 March 2023 with the intention of creating a ‘pro-innovation’ approach to regulating AI. Given the speed at which AI technology is developing, we are keen for regulatory frameworks to evolve similarly quickly to ensure effective oversight and protection against risk or misuse. We caution against measures in the Bill that may weaken protections, safeguards or governance around the use of data until these risks are better understood.

2.0 Summary of our key concerns

2.1 Of the equality and human rights implications of the Bill’s proposals, we focus our recommendations on the following provisions:

· Automated decision-making (Clause 11) proposals do not offer sufficient safeguards to protect individuals from unfair or discriminatory outcomes of decisions with no meaningful human oversight.

· Replacing the requirement to produce Data Protection Impact Assessments with new Assessments for High-Risk Data Processing (Clause 17) risks reducing the quality and robustness of assessments of the impact of data processing. This increases the risk of violation of individuals’ data rights.

· Personal data rights (Clauses 2, 5, 7 and 9) changes collectively risk weakening individuals’ right to choose how their personal data are processed and for what purposes, thereby engaging Articles 8 and 14 of the ECHR, with potential implications for those with certain protected characteristics.

· The Government has not provided enough information on the necessity and proportionality of a National Security Exemption (Clause 24) for law enforcement data processing.

· Changes to the oversight of biometric data (Clauses 104 and 105) risk reducing the consideration of broader human rights and equality principles in the oversight of surveillance cameras and biometric data.

3.0 Automated decision-making (Clause 11)

3.1 Clause 11 of the Bill would change Article 22 of the General Data Protection Regulation (GDPR), broadening the range of contexts in which solely automated decision-making can be used. Currently, Article 22 is a general prohibition on fully automated decision-making that has a legal or ‘similarly significant’ effect on individuals, with exhaustive exceptions. The Bill seeks to reframe Article 22 to generally allow automated decision-making, subject to some safeguards.

3.2 We are concerned that the proposed changes do not offer sufficient safeguards to protect individuals from unfair or discriminatory outcomes of automated decision-making. Because the data used to help AI-based tools make decisions may contain existing biases, individuals with particular protected characteristics may be unfairly impacted by automated decision-making. For example, an AI system used to monitor employee productivity may make automated decisions that do not take account of the legal requirement to make reasonable adjustments in respect of an individual’s disability, resulting in a person being penalised for not meeting productivity requirements. Under the current framing of Article 22, there are already contexts where unfair and discriminatory outcomes from automated systems occur. The Bill should seek to improve protection for individuals by ensuring that automated systems and algorithms are monitored for bias in relation to the protected characteristics in the Equality Act 2010.

3.3 More and more public bodies, including local authorities and NHS bodies, are adopting AI and automated decision-making tools. The proposals to amend Article 22 of the GDPR will not help public authorities to use automated decision-making in ways that respect human rights, in particular the Article 8 right to privacy and the Article 14 right to non-discrimination (read with Article 8). Indeed, the Government has accepted that its proposals may increase interference with Articles 8 and 14 given the greater range of decisions that will be permitted under the proposed reforms. [1]

3.4 We support the recommendation by the Information Commissioner’s Office (ICO) [2] to extend the definition of automated decision-making to cover partly, as well as solely, automated decision-making. Many decisions are significantly informed by AI, but have an element of human involvement. Yet human actors may have little understanding of how the AI-based tool has made a decision and little ability to question its output. Expanding the definitions in the Bill to include partly automated decisions will extend opportunities to challenge decisions that may be discriminatory or jeopardise human rights. This approach would align with the principles of transparency, explainability, contestability and redress in the Government’s AI White Paper.

3.5 The proposed definition in the Bill of a ‘significant decision’ removes references to profiling, which is currently explicitly referenced in Article 22 of the GDPR and defined by the ICO as a decision which significantly affects or produces an adverse legal effect on an individual and is authorised by law. [3] The use of profiling can lead to discriminatory decision-making if it draws on limited data sets that contain existing stereotypes and biases; it can also restrict individuals’ options based on assumed preferences, or even deny them access to services or opportunities. Analysis by Citizens Advice found that some ethnic minorities pay £250 more on average than White people for car insurance, with different pricing across ethnicities. [4] For these reasons, we recommend maintaining explicit inclusion of profiling in the Bill.

3.6 Proposed Article 22D would allow the Secretary of State to make regulations setting out what is or is not meaningful human involvement, and what is or is not a ‘similarly significant effect’ in relation to significant decisions. These regulations will be subject to the affirmative procedure, which we welcome. We recommend detailed parliamentary scrutiny of the regulations, given that the decisions made by AI tools could have significant implications, including in relation to social security, immigration and asylum decisions.

· We recommend that the Government removes its proposals in relation to solely automated decision-making (Clause 11) and retains Article 22 of the GDPR in its current form.

· We recommend including an explicit reference to profiling in the definition of a ‘significant decision’ (Clause 11) as this will determine the contexts in which fully automated decision-making is prohibited.

· We support the ICO’s recommendation to extend the definition of automated decision-making to cover partly, as well as solely, automated decision-making (Clause 11).

4.0 Data Protection Impact Assessments (Clause 17)

4.1 Clause 17 replaces existing provisions for Data Protection Impact Assessments with new Assessments for High-Risk Data Processing. The new assessment has fewer requirements to record information on processing operations, purposes and proportionality checks, and is less clear about when these assessments need to take place. There is a further proposal to remove the requirement to consult the Information Commissioner’s Office (ICO) prior to the processing of high-risk data.

4.2 We are concerned that this proposal will reduce the quality or robustness of the assessment of the impact of data processing, thereby increasing the risk that individuals’ data protection rights may be violated, and that organisations may breach data regulations. Data Protection Impact Assessments are an important tool for demonstrating GDPR compliance in the processing of individuals’ personal data and for identifying and minimising any risks. We support the ICO’s recommendation [5] to retain the requirement for organisations to consult the ICO about the impacts of high-risk data processing. This will safeguard individuals’ data protection rights and help organisations to comply with the Human Rights Act 1998 and the Equality Act 2010, including the Public Sector Equality Duty.

· We recommend that the Government retains the requirement for Data Protection Impact Assessments (Clause 17) or otherwise ensures that high-risk data processing requirements are more robust and clearer than currently proposed.

· We recommend retaining the requirement for organisations to consult the ICO about the impacts of high-risk data processing.

5.0 Personal data rights (Clauses 2, 5, 7 and 9)

5.1 There are several proposals in the Bill that relate to personal data rights. These include: changes to the legal bases on which personal data can be processed for research and statistical purposes to include commercial purposes (Clause 2); an extension of the recognised list of legitimate interests for processing which removes the need to carry out a balancing test to consider the impact on data subjects (Clause 5); and an exemption to the requirement to provide the data subject with fair processing information for reusing their data beyond its original scientific or research purposes (Clause 9). These clauses collectively risk weakening individuals’ right to choose how their personal data are processed and for what purposes, thereby engaging Articles 8 and 14 of the ECHR, with potential implications for those with certain protected characteristics. This is particularly concerning given the increased use of AI and automated decision-making that relies on personal data.

5.2 We are pleased that the Government recognises that the current system under Article 12 of UK GDPR, which allows individuals to access their personal data through subject access requests, functions as a ‘critical transparency mechanism’ under the right of access (Article 15 UK GDPR). [6] We are therefore concerned that proposals in the Bill to lower the threshold for refusing subject access requests from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’ (Clause 7) will have an adverse impact on individuals’ ability to access their personal data. Subject access requests are an important tool for individuals to understand when their personal data have been inappropriately obtained, processed, or shared. We are concerned that the proposals in the Bill might limit data subjects’ ability to understand whether they can challenge the use of their data under Articles 8 (privacy) and 14 (non-discrimination) of the ECHR.

· We urge caution in weakening personal data rights as set out in Clauses 2, 5, 7 and 9, in the context of the increased use of AI and automated decision-making.

· We recommend that the current level of provision for subject access requests set out in Article 12 of the GDPR is retained to support people’s ability to challenge misuse of their data, under Articles 8 and 14 of the ECHR (Clause 7).

6.0 National Security Exemption (Clause 24)

6.1 Clause 24 would make changes to section 26 of the Data Protection Act (DPA) 2018 in relation to law enforcement data processing in order to minimise differences between the DPA 2018 and the GDPR. The proposal would provide law enforcement agencies with a national security exemption from safeguards when processing personal data. We are concerned that this may undermine data protection safeguards in broad areas of decision-making, as it is unclear how law enforcement data controllers may use these exemptions. This raises concerns about whether these provisions are necessary and proportionate.

· We recommend that the Government provide further information on its proposals to introduce a national security exemption for law enforcement data processing, and the safeguards that will be put in place to guard against data misuse.

7.0 Oversight of biometric data (Clauses 104 and 105)

7.1 Clause 104 would abolish the office of the Commissioner for the Retention and Use of Biometric Material. Clause 105 would abolish the Office of the Surveillance Camera Commissioner. These roles are currently combined in the Biometrics and Surveillance Camera Commissioner (BSCC). Clause 105 would also repeal the statutory Surveillance Camera Code of Practice (issued under the Protection of Freedoms Act 2012). This Code of Practice is the only statutory code on the appropriate use of surveillance camera systems by local authorities and the police in England and Wales.

7.2 Biometric casework and reporting functions will transfer to the Investigatory Powers Commissioner. General oversight of biometrics and surveillance cameras will fall solely to the ICO under its existing powers. The Government considers that this will reduce duplication in oversight of biometrics and surveillance cameras.

7.3 We agree that duplication in oversight is unsatisfactory. However, given the expansion of the use of biometrics, including facial recognition technology, by the police and others, we are concerned that the proposals in the Bill risk reducing the consideration of the full range of equality and human rights risks in the use of this technology. The use of this technology could lead to breaches of Articles 8 (privacy), 10 (freedom of expression), 11 (freedom of assembly) and 14 (non-discrimination) of the ECHR, as well as the Equality Act 2010.

7.4 If the Surveillance Camera Code of Practice is repealed and the dedicated role played by the BSCC abolished, we ask the Government to provide assurance that consideration of the equality and human rights implications of biometric and surveillance technologies, currently in the Code, is maintained. Specifically, if the Code of Practice is repealed, it must be replaced by a statutory document of at least equal strength and weight to the Code.

· We recommend that the Government maintains a statutory consideration of equality and human rights obligations, as currently set out in the Surveillance Camera Code of Practice, in the oversight of the use of biometrics, including facial recognition technology.

 

Appendix

1.1 Article 8 Right to respect for private and family life, the European Convention on Human Rights

1. Everyone has the right to respect for his private and family life, his home and his correspondence.

2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.

1.2 Article 10 Freedom of Expression, the European Convention on Human Rights

1. Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers. This Article shall not prevent States from requiring the licensing of broadcasting, television or cinema enterprises.

2. The exercise of these freedoms, since it carries with it duties and responsibilities, may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law and are necessary in a democratic society, in the interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.

1.3 Article 11 Freedom of assembly and association, the European Convention on Human Rights

1. Everyone has the right to freedom of peaceful assembly and to freedom of association with others, including the right to form and to join trade unions for the protection of his interests.

2. No restrictions shall be placed on the exercise of these rights other than such as are prescribed by law and are necessary in a democratic society in the interests of national security or public safety, for the prevention of disorder or crime, for the protection of health or morals or for the protection of the rights and freedoms of others. This Article shall not prevent the imposition of lawful restrictions on the exercise of these rights by members of the armed forces, of the police or of the administration of the State.

1.4 Article 14 Prohibition of discrimination, the European Convention on Human Rights

The enjoyment of the rights and freedoms set forth in this Convention shall be secured without discrimination on any ground such as sex, race, colour, language, religion, political or other opinion, national or social origin, association with a national minority, property, birth or other status.

22 May 2023


[1] Department for Science, Innovation and Technology, et al., Data Protection and Digital Information (No. 2) Bill: European Convention on Human Rights Memorandum, 2023, p. 10.

[2] Information Commissioner’s Office, Response to DCMS consultation “Data: a new direction”, 2021.

[3] Information Commissioner’s Office, Right not to be subject to automated decision-making.

[4] Citizens Advice, Discriminatory Pricing: One Year On, 2023.

[5] Information Commissioner’s Office, Response to DCMS consultation “Data: a new direction”, 2021.

[6] Department for Digital, Culture, Media & Sport, Data: A new direction, 2021.

 

Prepared 23rd May 2023