Digital Economy Bill

Written evidence submitted by Dr Edgar A. Whitley, Associate Professor (Reader) in Information Systems, London School of Economics and Political Science. Co-chair, Cabinet Office Privacy and Consumer Advisory Group (PCAG) (DEB 69)


Follow up written evidence on Part 5 of the Digital Economy Bill


Introduction: Delivering on an offer to comment on the Draft Codes of Practice when they are published


In my oral evidence to the Public Bill Committee on 11 October 2016 I made repeated reference to the Codes of Practice that were intended to guide the Data Sharing proposals but which had not been published at that time.

Mr Hancock, Minister of State for Digital and Culture, wrote to the Committee on 19 October 2016 proposing further amendments and noting that he was publishing the draft codes of practice which "set out more detail on how effective and safe data-sharing will be conducted" [1].

In response to a question from Louise Haigh about how I would advise the Government to achieve that code of practice, I responded that if the Government "gave us the detail" I would give them detailed comments to improve the process. Having had the opportunity to read the draft codes, I am able to offer some initial thoughts on them.

As noted in my oral evidence, I am particularly concerned with how the codes will, "as a matter of principle", address privacy concerns by ensuring that the data sharing process will use the "minimum amount of information required".

Four draft Codes of Practice have been presented:

· A code of practice on Public Service Delivery, Fraud and Debt (Chapters 1, 3 and 4) [CoPPSD]

· A code of practice for civil registration officials (Chapter 2) [CoPCR]

· A code of practice and accreditation criteria for access to data for research purposes (Chapter 5) [CoPR]

· A statement of principles and procedures and code of practice for changes to data systems (Chapter 7) [CoPS]

Some of the detail provided in the Codes of Practice is good. For example, the checklist in CoPPSD §73 addresses questions of detail, although I would still prefer more guidance for organisations, for example on "What methods or technology can be used to minimise the amount of information shared and risk of data loss e.g. using aggregate data, derived data or the use of a look-up process, in preference to bulk data sharing" or on how to implement "issues, disputes and resolution procedures".

I have written further about the need to be specific about technological issues in my submission to the House of Lords Constitution Committee Inquiry into "Legislative Process: Stage 1: Preparing legislation for introduction in Parliament".



A simplification of data sharing?

A clear outcome from the Open Policy Making process about data sharing was the need to simplify the regulatory environment around data sharing, and this is the intention behind Part 5 of the Digital Economy Bill. However, §9 notes that organisations will still need to satisfy themselves that they are complying with the Data Protection Act, and §11 notes that the contents of the Code of Practice are not legally binding, that the Code does not itself impose additional legal obligations and that it is not an authoritative statement of law. It talks about data sharing partners being "expected" to "agree to adhere" to the code and states that failure to have regard to the code "may" have significant consequences. This is hardly "a simple and agile route for agreeing and establishing data shares" (§35).

Personal Information or Personal Data?

CoPPSD follows the language of the Bill by referring to Personal Information (PI). As noted in §3, this is in contrast to the DPA definition, which uses the term Personal Data (PD). In particular, PI is taken as identifying because of how someone’s privacy can be affected if they can be identified from the information taken together with "any other available information" (emphasis added). This is much broader than the DPA definition of PD, which talks of "those data and other information which is in the possession of, or is likely to come into the possession of, the data controller". Whilst there may be legitimate drafting reasons for not using the DPA definition (i.e. additional categories of data to be used), there appears to be a higher test (risk of reidentification when PI is combined with *any* other available information [2]) than exists under the DPA and therefore two, inconsistent, requirements.

Policy compliance by definitions

§4 appears to introduce a novel form of governance, namely policy compliance ("careful data handling") by means of linguistic definition. Presumably the "specific safeguards introduced in the Bill to ensure personal information is handled appropriately" are additional to the safeguards found in the Data Protection Act and the powers of the ICO.

Data sharing and personal information

§3 provides a definition of data sharing that is taken from the ICO’s Data Sharing Code of Practice. The ICO notes in that code, however, that "the code isn’t really about ‘sharing’ in the plain English sense. It’s more about different types of disclosure, often involving many organisations and very complex information chains; chains that grow ever longer, crossing organisational and even national boundaries" [3].

§6 talks about the use of Yes/No type attribute checks. These are generally considered to be privacy friendly. Indeed, CoPPSD notes "the method is generally safer (large amounts of data are not being transferred either physically or electronically and, in instances when eligibility is being checked, the need to share underlying data is removed completely) and less intrusive (binary checks can be run against very specific relevant data fields)". However, despite acknowledging that "the need to share underlying data is removed completely", CoPPSD §6 states that "information is still being shared and as such those sharing data in these ways require the legal powers to do so".

Although bits of data are being transferred, it is unclear why a simple Yes/No eligibility flag is considered Personal Information (and within the scope of the Bill), even under the "*any* other information" condition described above.

The Central Review Group for Data Sharing Pilots

The Central Review Group is a new initiative that I have not seen previously. Unfortunately, the level of detail provided is rather limited. For example, does it include "public representative bodies" (§93) and/or "privacy interest groups" (§88)? How will it ensure that it has the expertise (and authority) to "investigate any breaches" (§115)?

Given that the group will meet monthly, it is surprising that the cost of the group’s work is not provided in the Bill’s Impact Assessment.



The heading "Is there a legal gateway?" is puzzling, as providing such a gateway is what I understood to be the purpose of the provisions of this part of the Bill.

The conceptual confusion about data sharing noted above is further complicated in §32, which talks of data sharing being used to enable the recipient to exercise one or more of their functions, yet the examples provided in the consultation are not about enabling but rather about simplifying already enabled functions. In contrast, the data sharing provisions around fuel poverty enable interventions that would not have been possible without the data sharing.

I am very alarmed that §40 suggests that "Formal data sharing agreements will not always be necessary".



§52 contains a large number of (potentially) contentious issues hidden away in the midst of the proposals.

§54 expresses an ideal. Experience from the Administrative Data Research Network (ADRN) suggests that this is rarely achieved in practice. A likely explanation is the often poor quality of the metadata currently associated with the kinds of data sources likely to be requested.



This CoP refers to the work of the UK Statistics Authority and is prepared by them. I am therefore unable to comment on it.

Presentational issues



I am puzzled by the reference to the Troubled Families programme in §16. This was discussed in detail in the data sharing consultation but does not appear to be part of the current Bill.

The Code of Practice (§19) provides greater clarity about the intention behind the fuel poverty assistance, implying it will only apply "if they are a member of a household living on a lower income [AND] in a home which cannot be kept warm at reasonable cost" (logical connector added).

§25-32 seem to be in the wrong place. Most users of the CoP will want to know what to do and how to do it properly. Details of how to go about requesting additional powers would make more sense coming later.

§64 talks about "key considerations" and "key indicators". It is unclear whether these are intended to refer to the same thing.

The reference in §67 to sending notices to individuals has not been mentioned previously. I suspect that this might be a response to the question of consent, but this formulation is unlikely to comply with the EU General Data Protection Regulation, which enters into application on 25 May 2018.

§70. The box refers to "A customer who is in a position to pay their debt – some of whom may need additional support". Presumably this is about "Customers who are in a position to pay their debt – some of whom may need additional support" (emphasis added).

I do not think it is helpful to combine the guidance about requesting pilots of data sharing for fraud purposes in the same CoP as the general public service delivery guidance, as it may cause undue confusion.

Annex C, Step 3 seems inconsistent between "the" and "your" central review group.


§50 To whom is the formal application made?


§1 notes that researchers will only receive de-identified data, so are we to assume from §53 that the requirement that "potentially disclosive information will be stored confidentially" means that the Government recognises the limits of de-identification / anonymisation techniques?

It is unclear in §53 what issues of consent arise, given that the researcher is receiving anonymised data and consent is not needed to process it.

Again in §53, if the research plan has ethics committee approval, what does the need for additional human oversight involve?

Again in §53, how should the "views of the public" be considered in light of the data used?

October 2016


[2] See, for example, O’Hara, K., Whitley, E. A., and Whittall, P. (2011). Avoiding the Jigsaw Effect: Experiences With Ministry of Justice Reoffending Data (available at ); and Elliot, M., Mackey, E., O’Hara, K., and Tudor, C. (2016). The Anonymisation Decision-Making Framework, UKAN, Manchester (available at ).



Prepared 26th October 2016