Data Protection and Digital Information (No. 2) Bill

WRITTEN EVIDENCE SUBMITTED BY THE OPEN RIGHTS GROUP TO THE DATA PROTECTION AND DIGITAL INFORMATION (NO.2) BILL: CALL FOR WRITTEN EVIDENCE (DPDIB06)

The DPDI Bill will introduce new barriers to exercising data protection rights, lower protections around AI and automated decision-making, weaken rights against online manipulation and discrimination, and create new obstacles to lodging a complaint.

The DPDI Bill will lessen accountability, encourage off-shore data laundering, and reduce rights of UK residents against overseas organisations.

The DPDI Bill will allow commercial exploitation under the guise of scientific research.

The DPDI Bill will politicise the ICO, expand government control over data uses, and empower the Secretary of State to authorise international data transfers arbitrarily.

We need a new Bill for Data Rights that exceeds existing data protection standards and provides UK residents with world-leading personal data protection.

About Open Rights Group (ORG): Founded in 2005, ORG is a UK-based digital campaigning organisation working to protect individuals’ rights to privacy and free speech online. ORG has been following the UK government’s proposed reforms to data protection since their inception. In June 2022, we organised an open letter signed by a coalition of over 30 organisations that highlighted the failure of the DCMS to properly engage with civil society groups about the proposed reforms, and in March 2023, we delivered a letter signed by 25 CSOs to Michelle Donelan, highlighting our serious concerns with the Government’s draft legislation.

1 The Data Protection and Digital Information (DPDI) Bill will weaken your constituents’ data protection rights, water down corporate accountability mechanisms, undermine trust in data uses in the public interest, empower the Secretary of State with undemocratic controls over data protection, and negatively impact the economy.

2 The drafting of the Bill was informed by a lopsided consultation process that purposefully excluded critical voices, [1] and failed to address the incoherence and weakness of the evidence that was presented to support the UK data protection reform. [2] As a result, the UK DPDI Bill would:

2.1 Harm UK businesses: the Bill introduces new legal provisions whose interpretation is partly unclear and at times apparently incompatible with the EU GDPR. Further, the powers granted to the Secretary of State would make the UK data protection regime prone to sudden, politically-motivated and potentially incoherent changes. This will increase legal uncertainty and require businesses to navigate multiple data protection regimes, thus increasing costs and creating bureaucratic headaches. Numerous businesses have spoken out about the negative impacts of the Bill’s proposals. [3] Some startups are already fleeing the UK in anticipation of this reform. [4]

2.2 Endanger data adequacy and the trade relationship with the EU: The Bill will greatly weaken people’s data protection rights and open new avenues for the UK to transfer data to countries with poor data protection, creating a scenario where the data of EU citizens could be laundered through the UK to countries that the EU does not have an agreement with. These changes are raising red flags in Europe and jeopardise the UK’s current adequacy agreement. Conservative estimates found that the loss of the adequacy agreement would cost £1 to £1.6 billion in legal fees alone. [5] This figure does not include the cost resulting from disruption of digital trade and investments.

2.3 Diminish the UK’s international reputation: The Bill frames the UK data protection regime in terms of regulatory competition, at a time when countries around the world are seeking convergence and cooperation on data protection and digital matters. By lowering safety standards, the Bill will make the UK a safe haven where unethical tech companies and data marketers can avoid legal liability and circumvent regulatory standards. Companies will move to the UK to move fast and break things, leaving UK residents with the burden of picking up the broken pieces.

3 In an increasingly digitalised and data-driven world, existing data protection laws provide much needed legal protection for the public against predatory commercial practices and the increased use of algorithmic decision-making across public services, law enforcement and employment.

4 The government has an opportunity to strengthen the UK's data protection regime post Brexit. However, it is instead setting the country on a dangerous path that undermines trust, furthers economic instability, and erodes fundamental rights.

WEAKENED DATA PROTECTION RIGHTS

 

New barriers to exercising data protection rights

5 Clause 7 would lower the threshold for charging a reasonable fee or refusing a request from an individual to exercise their data protection rights from "manifestly unfounded or excessive" to "vexatious or excessive". This will lead to more requests being refused, and it will create a barrier for many people, particularly those on lower incomes.

6 The new test reflects language used in the Freedom of Information Act 2000, where individuals making a request are required to demonstrate a "reasonable foundation" of "value to the requester". This would allow organisations to intimidate individuals by asking for their reasons for exercising their data rights. Further, organisations that receive a discomforting request may treat it as "vexatious", but a request that is inconvenient to the recipient should be no less valid.

7 The right of access is a fundamental redress against unfair and exploitative labour practices, such as in the case of gig economy drivers being victims of false accusations of fraud or unfair wage deductions. [1] However, by making the repetitive nature of a request a criterion for dismissing it as "vexatious", Clause 7 would make it easier to adopt obstructive behaviours, such as in the case of Uber providing circular and futile answers to access requests. [2]

8 Likewise, it is notoriously difficult to exercise the right to be forgotten or to correction against AI models: [3] by making the resources required to erase data a criterion for dismissing such requests, Clause 7 would disempower individuals against false accusations of sexual assault [4] or bribery [5] made by AI applications.

Lower protections around AI and automated decision-making

9 The Bill changes current rules that prevent companies and the government from making solely automated decisions about individuals that could have legal or other significant effects on their lives. Clause 11 would omit Article 22 of the UK GDPR and introduce new Articles 22A, 22B, 22C and 22D. These would:

9.1 Remove the right not to be subject to solely automated decisions that have legal or otherwise significant effects on individuals, unless the decision is based on special category data.

9.2 Empower the Secretary of State to designate automated decision-making which is exempted from any such safeguard.

10 These changes would empower the Secretary of State to remove safeguards around AI and automated decisions for arbitrary or politically-motivated reasons. Further, they would expose individuals to solely automated decisions against their will, unless sensitive data have been used to make such decisions.

Fewer rights against online manipulation or discrimination

11 Clause 79 would remove cookie consent requirements for online tracking and personalised advertising. Further, Clause 79(3) would give the Secretary of State the power to amend Regulation 6 of the Privacy and Electronic Communications Regulations (PECR), such as by allowing data marketers and other third parties to store information, or gain access to information stored, in the terminal equipment of an individual without their informed consent.

12 Individuals already feel powerless against dark patterns, used by online platforms and data marketers to deliberately force or mislead users into providing consent. [6] By making online tracking, profiling and advertising non-consensual, Clause 79 will expose individuals to more exclusion based on their gender or ethnicity, [7] as well as to predatory practices, such as in the case of online gambling apps using personal data to target problem gamblers. [8]

New obstacles to lodging a complaint

13 Clauses 39 and 40 would give the Information Commissioner’s Office (ICO) more discretion to dismiss complaints, unless individuals have already complained to the offending organisation or company first. Further, Clause 8 would introduce a new loophole that will allow companies and organisations to reset the one-month time limit for responding to individuals’ requests (such as access to data or erasure) by asking for further information.

14 UK residents seeking justice against an infringement of their rights will have to wait longer for a rights request to be processed, and undergo a privatised complaint procedure with the customer service of the offending organisation, before being able to lodge a complaint with the ICO. The combination of these changes means that complaints could routinely take 20 months or longer to resolve.

15 The Commissioner rarely takes enforcement action against organisations even when widespread or systemic issues are highlighted. By widening regulatory discretion and forcing individuals to complain to offenders first, these changes will leave individuals feeling intimidated and even more powerless.

LESS ACCOUNTABILITY AND MORE OFF-SHORE DATA LAUNDERING

 

Less accountability

16 Clauses 14, 15, 17 and 18 will remove requirements to keep Records of Processing Activities, carry out Data Protection Impact Assessments, and appoint Data Protection Officers, and replace them with less robust requirements that only need be fulfilled in limited and highly discretionary circumstances. These changes would also remove the requirement to consult with people affected by high-risk data processing, thus making these assessments less reliable and objective.

17 As digital public programmes during the Covid-19 pandemic have shown, carrying out accountability requirements is a fundamental due diligence process to support public trust and prevent wide-scale data harms. By removing or weakening these requirements, the Bill will expose individuals to more data breaches and abuses, such as in the case of confidential contact tracing data being leaked on social media channels by Test and Trace personnel, [9] being abused to harass women, [10] or being lost due to its storage in an Excel spreadsheet. [11]

18 Further, Data Protection Impact Assessments have already proven to be key audit tools to hold big tech companies to account, and to negotiate material improvements to the safety and privacy of the products they offer. [12]

More off-shore data laundering

19 Schedule 5 amends Article 46 of the UK GDPR, allowing organisations to transfer personal data to overseas countries by acting "reasonably and proportionately": these criteria do not consider the actual existence of enforceable rights and legal remedies, but only the (alleged) due diligence of the organisation operating the transfer. Further, Clause 1 will weaken the definition of personal data, allowing organisations to misclassify personal data as anonymous data.

20 These changes will:

20.1 make it easier to launder personal data in off-shore countries and circumvent UK law and individuals’ rights;

20.2 threaten the UK adequacy decision, whose loss would cost £1.9 billion in legal fees alone and disrupt UK digital trade with the UK’s largest trading partner. [13]

Fewer rights against overseas organisations

21 Clause 13 removes the requirement for overseas organisations to nominate a UK representative. By removing this requirement, the Bill will make it more difficult for UK residents to exercise their rights once their data is transferred abroad.

UNDERMINING TRUST IN THE USE OF DATA FOR THE PUBLIC GOOD AND SCIENTIFIC RESEARCH

 

Commercial exploitation under the guise of scientific research

22 Clause 2 would amend the definition of scientific research to include data uses "carried out as a commercial or non-commercial activity", thus allowing personal data to be used for commercial purposes under the guise of "research purposes".

23 Further, Clause 9 would reduce transparency over personal data uses for research purposes, such as by exempting researchers from providing information to large numbers of individuals or when personal data were collected long ago.

24 The public has a clear expectation that data uses for innovation and scientific research should "be ethical, responsible and focused on public benefit". [14] By overextending the definition of scientific research, Clause 2 would breach such expectations and undermine trust in legitimate research activities.

25 The National Data Guardian has already stressed that "a lack of transparency has caused important projects that use health data for public benefits to fail or stall due to lack of public trust". [15] By devaluing the meaning of research, the Bill hinders data sharing and uses that save lives.

UNDEMOCRATIC EXPANSION OF GOVERNMENT POWERS AND LEGAL UNCERTAINTY

 

Politicising the ICO

26 Clause 28 would insert new sections 120E and 120F into Part 5 of the 2018 Act, empowering the Secretary of State to introduce a Statement of Strategic Priorities to which the Commissioner must have regard, and to require the regulator to respond in writing as to how it will address those priorities.

27 Further, Clause 31 would introduce new Section 120H, which would require the ICO to submit Codes of Practice for the prior approval of the Secretary of State before they can be laid before Parliament.

28 The ICO plays a key role in the oversight of the government’s handling of data, so it is vital that it is completely independent from government. However, the Bill will give the Secretary of State new powers to issue instructions to the ICO and to interfere with how it functions. Additionally, the ICO will have to seek the approval of the government before exercising important regulatory functions.

Arbitrary transfers of personal data to insecure countries

29 Schedule 5 would introduce new Article 45B to the UK GDPR, and empower the Secretary of State to approve international transfers to countries which lack enforceable rights and effective remedies for UK residents. In particular, the new "data protection test" for international transfers:

29.1 Does not require consideration of the impact that foreign legal frameworks concerning defence, national security, criminal law and the access of public authorities to personal data will have on the protection of UK personal data;

29.2 Does not require an independent and effective supervisory authority in the country where data is being transferred, or the availability of judicial redress;

29.3 Gives arbitrary discretion to the UK government to consider, as a justification for authorising international data transfers, "any matter which the Secretary of State considers relevant".

30 The requirement to consider "public security, defence, national security and criminal law and the access of public authorities to personal data", as well as the existence of an independent supervisory authority and of effective judicial redress, are cornerstones of the Schrems I and Schrems II judgments, which are at the basis of the UK adequacy decision. [16] By removing such requirements, Schedule 5 puts the UK adequacy decision in jeopardy.

Expanding government control over data

31 The Secretary of State will be given additional powers to introduce (without meaningful democratic scrutiny) new grounds for processing personal data under Clause 5, and new exemptions that would legitimise data uses beyond the purpose they were collected for or the legitimate expectations of the individuals under Clause 6. The list of exemptions is overly broad and vague. For instance, it includes "crime detection", "national security" or "disclosures to public authorities". The government is given broad powers to amend this list at any time and without meaningful limits to their discretion.

32 The Secretary of State will have the power to rule by decree and override fundamental rights at the whim of the government of the day.

33 Existing GDPR rules are a coherent set of principle-based rules, whose application in specific contexts has developed gradually thanks to independent and impartial bodies such as the ICO and the courts. The Secretary of State will have the power to introduce sudden, incoherent changes driven by partisan considerations and the politics of the day, leading to legal uncertainty.

A NEW BILL FOR DATA RIGHTS IS NEEDED

 

34 The Data Protection and Digital Information Bill is a misguided attempt to deregulate data protection and a missed opportunity to build on the UK’s proud tradition of pragmatism and human rights.

35 In an ever-digitalised and data-driven world, existing data protection laws provide much needed legal protection for the public against predatory commercial practices and the increased use of algorithmic decision-making across public services, law enforcement and employment.

36 Parliament needs to take back control, and reject attempts to transfer regulatory powers to the government and its departments while undermining the independence of the Information Commissioner. Likewise, Parliament should resist proposals to water down or remove data protection rights and accountability measures.

37 Instead, the UK has an opportunity to fine-tune and exceed existing data protection standards, and establish itself as a world-leading regulatory power. Parliament could build on the findings of the Joint Committee on Human Rights report on "The Right to Privacy (Article 8) and the Digital Revolution", and the many ignored inputs to the "Data: a new direction" consultation. A new deal for data rights is needed, one that would:

37.1 Scrap proposals that would lower the threshold for refusing individuals’ requests to exercise their data protection rights, water down accountability requirements, remove safeguards around international data transfers, threaten the independence of the ICO, and empower the Secretary of State to override primary legislation with Henry VIII clauses;

37.2 Introduce rigorous mechanisms to test whether companies and organisations are using legitimate interest appropriately;

37.3 Extend the right not to be subject to solely automated decision-making to decisions which are "significantly automated";

37.4 Introduce legal requirements to publish accountability documents, such as Data Protection Impact Assessments, Records of Processing Activities, Legitimate Interest Assessments and International Data Transfers Assessments, to strengthen public scrutiny, accountability, and prevent data misuses.

37.5 Strengthen legal duties to consult with individuals, workers and communities who are affected by personal data uses, to ensure that impact assessments are conducted objectively and expose issues before they materialise into harms.

37.6 Strengthen the ICO’s mandate to prioritise data protection enforcement over external or political considerations, and transfer the prerogative to appoint the Commissioner to Parliament to safeguard its independence.

May 2023


[1] https://www.openrightsgroup.org/publications/open-letter-to-the-dcms-10-june-2022/

[2] https://www.openrightsgroup.org/app/uploads/2023/03/DPDI-Bill-UK-civil-society-letter.pdf

[3] See, for instance, 15 CEOs of SaaS Companies open letter to Michelle Donelan, at: https://www.linkedin.com/posts/adhale_data-protection-letter-to-secretary-of-state-activity-6992876772790784000-ztEB/

[4] See Back to the EU at: https://adambird.com/posts/back-to-eu/

[5] See The cost of data inadequacy at: https://neweconomics.org/2020/11/the-cost-of-data-inadequacy

[1] https://www.workerinfoexchange.org/post/historic-digital-rights-win-for-wie-and-the-adcu-over-uber-and-ola-at-amsterdam-court-of-appeal

[2] https://www.workerinfoexchange.org/wie-report-managed-by-bots

[3] https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3018186

[4] https://nypost.com/2023/04/07/chatgpt-falsely-accuses-law-professor-of-sexual-assault/

[5] https://www.reuters.com/technology/australian-mayor-readies-worlds-first-defamation-lawsuit-over-chatgpt-content-2023-04-05/

[6] https://edri.org/our-work/ncc-report-on-dark-patterns/

[7] https://www.theatlantic.com/business/archive/2017/03/facebook-ad-discrimination/518718/

[8] https://www.nytimes.com/2021/03/24/technology/gambling-apps-tracking-sky-bet.html

[9] The Times "Coronavirus contact tracers sharing patients’ data on WhatsApp and Facebook." Source: https://www.thetimes.co.uk/edition/news/coronavirus-contact-tracers-sharing-patients-data-on-whatsapp-and-facebook-rg3zqn5l6

[10] The Telegraph "Test and trace is being used to harass women – already." Source: https://www.telegraph.co.uk/women/life/test-trace-used-harass-women-already/

[11] https://www.bbc.co.uk/news/technology-54423988

[12] https://www.nytimes.com/2023/01/18/technology/dutch-school-privacy-google-microsoft-zoom.html

[13] https://neweconomics.org/2020/11/the-cost-of-data-inadequacy

[14] https://www.adalovelaceinstitute.org/evidence-review/public-attitudes-data-regulation/

[15] https://www.gov.uk/government/publications/national-data-guardian-feedback-on-data-a-new-direction-proposed-government-reforms-to-the-uk-data-protection-regime

[16] https://ec.europa.eu/commission/presscorner/detail/en/ip_21_3183

 

Prepared 10th May 2023