Data Protection Bill [HL]

Written evidence submitted by Nikita Aggarwal (DPB58)

Submission to House of Commons Public Bill Committee on the Data Protection Bill

1. I, Nikita Aggarwal, am a doctoral researcher at the University of Oxford, working on the regulation of autonomous and intelligent systems with a particular focus on machine learning. I am a qualified solicitor of England and Wales, with expertise in EU law, financial regulation and international law, and with prior experience advising on legal and regulatory reform in various countries. This submission is made in my personal capacity.

2. I would like to highlight the legal ambiguity in section 98 of the draft Data Protection Bill, which, if left unaddressed, would create a disproportionate regulatory burden for data controllers, without commensurate gain for the protection of data subjects, and risk fragmentation between Member States in the implementation of the GDPR.

3. Specifically, I am concerned that section 98 would create a right to information for all decision-making, not solely where it is automated. It should be noted that the inclusion of the words "at least in those cases", in Article 15(1)(h) of the GDPR, qualifies the sub-phrase "including profiling, referred to in Article 22(1) and (4)". This gives Member States the discretion to extend the right to information beyond decision-making based solely on automated processing (such as profiling) to include all automated decision-making. However, it does not envisage gold-plating the right to information beyond automated decision-making, to cover all forms of decision-making (automated or otherwise) involving personal data.

4. Such a broad duty would extend to most firms, as controllers and processors of personal data, potentially creating a disproportionate and excessively costly regulatory burden. Differences such as these in the implementation of the GDPR between Member States would also undermine the goal of harmonizing data protection laws in the EU. The objective of increasing accountability for automated decision-making, and preventing abusive and discriminatory behaviour, can be achieved by a more carefully circumscribed right to information, as proposed below. Such a right would redress the informational asymmetry between consumers and producers where potentially opaque algorithmic methods are used for data processing, whilst avoiding an unfeasible compliance burden for businesses that utilize personal data.

5. There is further ambiguity in section 98 as to the scope of the right to information, stemming from the duality between the terms "decision-making" (in the title of the section) and "processing" (in its body). "Decision-making" is ostensibly broader than "processing", implying an understanding of how the various operations interact to generate an output (i.e. the decision). [1] This difference has a potentially material impact on the scope of the right and duty of information under this section. A right to information about the reasoning underlying processing could be implemented ex ante, and at the level of the firm, through disclosure of a firm’s policies on the use of a customer’s personal data in automated systems, including the reasons why such data is used. In the context of credit scoring, for example, such disclosure might state the categories of data that the lender uses (e.g. social media, health data etc.) and the reasons why it uses that data to assess creditworthiness.

6. This interpretation would be consistent with the GDPR, Article 15(1)(h), under which the right to receive "meaningful information about the logic involved" (in automated decision-making) is a forward-looking one. This can be inferred from the subsequent phrase "envisaged consequences of such processing", which is necessarily triggered before a decision (a consequence of the processing) is reached. On the other hand, a right to information pertaining to decision-making – or the results of processing – implies a broader right to an explanation ex post about the reasons for a decision, at the level of the individual data subject. In the credit scoring context, this would oblige lenders to explain, if requested, the reasons for every denial of credit using automated credit-scoring models (as well as approvals of credit, if a customer happens to be especially curious).

7. Whilst there is a societal interest in ensuring that complex, algorithmic systems are not so opaque that decisions based on them cannot be scrutinized, challenged and held accountable, this does not justify the creation of a general right to an explanation, and its corollary, the duty to give reasons, for the outcome of any processing of personal data, whether automated or not – as suggested by section 98. As such, in order to minimize legal ambiguity, to ensure a proportionate regulatory response that balances the needs of consumers (data subjects) and businesses (data controllers and processors), to enhance accountability and transparency in automated and algorithmic decision-making processes, and to facilitate harmonization of data protection laws across the EU, it is recommended that section 98 be amended to narrow the scope of the right and duty to give information, as follows:

1. Specify, in the section title, that section 98 applies to automated decision-making.

2. Introduce a threshold of significance for the right to receive information where a decision is not based solely on automated processing of personal data, in a new sub-clause (1)(c):

(1) Where –

(b)…, and

(c) the decision is either based solely on automated processing of personal data relating to the data subject, or the decision involves the use of automated processing and significantly affects the data subject.

3. Clarify in sub-paragraph (2) whether the right to receive information applies to processing only, or decision-making more generally.

March 2018

[1] Based on the definition of "decision making" in sections 96 and 97 of the Bill. The use of the additional words "results produced by [the processing]", in subsection (1)(b) of section 98, further supports the distinction between processing (without results) and decision-making (with results).


Prepared 23rd March 2018