57.Big data, as we have discussed, has enormous potential to improve the services we receive and the way businesses operate. When big data deals with personal or commercially sensitive data, however, those benefits have to be weighed against a potential loss of privacy and the risks of our data being lost or misused. The Information Commissioner highlighted that “every week there seems to be a horror story at the top of my pile, and my investigation team and my enforcement team, which is being expanded, has more and more work to do.” Chris Combemale from the Direct Marketing Association saw “a constant battle between data security and hackers”.
58.Big Brother Watch found in a 2015 poll that 79% of adults in the UK were “concerned” about their privacy online, and 46% believed that they were “being harmed by the collection of their data by large companies”. The Direct Marketing Association found that 60% of people were “happy with the amount of personal information that they shared with companies”, and 47% considered that “the exchange of personal data is essential for the smooth running of modern society”. Such surveys might appear inconsistent, or else point to internal conflicts in people’s attitudes towards big data. As Baroness Shields, Minister for Internet Safety and Security, pointed out:
There is a chasm in terms of what people feel about trusting data. If you talk to teenagers, they do not care; they have given up privacy and decided that they are happy to share absolutely everything in their lives and have it catalogued. … We have some responsibility to look out for their interests, especially in terms of their rights online.
59.A further 2015 survey from the Digital Catapult on people’s attitudes towards sharing data found that the Government was the most trusted user of personal data: 44% of respondents named it as the most trusted sector, with financial services in second place with 29%. However, this trust comes with strings attached: 32% considered it the Government’s responsibility to educate people about protecting or controlling personal data, while 30% thought it the responsibility of the individual.
60.It is important to note that personal data is only a small proportion of big data—there is huge value still to be realised from novel uses of non-personal datasets such as transport and weather data. Nevertheless, given the scale and pace of data gathering and sharing, distrust arising from concerns about privacy and security is often well founded and must be resolved by industry and Government if the full value of big data is to be realised. We recommend below the establishment of a Council of Data Ethics to help address these issues (paragraph 102).
61.Controls on the storage and processing of personal data are covered by the Data Protection Act 1998, which transposed the 1995 EU Data Protection Directive. The 1998 Act set out a number of ‘data protection principles’:
(1) Personal data shall be processed fairly and lawfully, with the subject’s consent, by necessity or for the data controller’s legitimate interests.
(2) Personal data shall be obtained only for one or more specified and lawful purposes, and shall not be further processed in any manner incompatible with that purpose or those purposes.
(3) Personal data shall be adequate, relevant and not excessive in relation to the purpose or purposes for which they are processed.
(4) Personal data shall be accurate and, where necessary, kept up to date.
(5) Personal data processed for any purpose or purposes shall not be kept for longer than is necessary for that purpose or those purposes.
(6) Personal data shall be processed in accordance with the rights of data subjects under this Act.
(7) Appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data.
(8) Personal data shall not be transferred to a country or territory outside the European Economic Area unless that country or territory ensures an adequate level of protection for the rights and freedoms of data subjects in relation to the processing of personal data.
The Data Protection Act also makes special provisions for ‘sensitive personal data’, such as patients’ health data.
62.The Information Commissioner’s Office has emphasised that big data is subject to existing data protection legislation:
We do not accept the argument that data protection principles are not fit for purpose in the context of big data. Big data is not a game that is played by different rules.
63.Consent is one of the conditions which allow an organisation to process personal data—the first ‘data protection principle’ under the 1998 Act. Of course, as many witnesses pointed out, there are circumstances which make it more challenging to secure consent in big data. The Information Commissioner acknowledged this:
If an organisation is collecting personal data to use in big data analytics, and it is relying on consent to legitimise this, then it has to make people aware of all the intended uses of the data, including, for example, whether it is going to share the data with other organisations. Similarly, if an organisation is acquiring data from elsewhere, it has to satisfy itself that the original consent covers that further use of the data. Given the complex and sometimes unforeseen uses of data in big data analytics, this can of course be problematic.
However, he was clear that these challenges could be overcome in most circumstances and emphasised that “the consent must be freely given, specific and informed … Furthermore, ‘freely given’ means that people can also withdraw their consent”.
64.Along with the Direct Marketing Association, the Digital Catapult, Big Brother Watch and others, the Information Commissioner was clear that informed consent started with a comprehensive Privacy Impact Assessment to identify and mitigate privacy risks, followed by innovative solutions for delivering meaningful, transparent ‘privacy notices’.
65.Terms and conditions and privacy notices are the primary mechanism for obtaining consent, but many are so dense and opaque that they actively prevent rather than enable informed consent. In offering advice to a company seeking to obtain genuine consent, the Information Commissioner had some straightforward suggestions for how privacy notices should be drafted.
66.Businesses and governments that communicate most effectively with the public will gain a huge commercial and societal advantage. By using simple, layered privacy notices, they can give citizens greater control over their data transactions and empower consumers to decide exactly how far they are willing to trust each data-holder they engage with. Although the length of a privacy notice will be dictated by the service or data application involved, it should be best practice to draft them as simply as possible. Furthermore, if informed, freely given consent is to be the bedrock of a trusting relationship between a consumer and a data-holder, then it must always be part of that deal that consent freely given can also be freely withdrawn.
67.Nevertheless, the Information Commissioner, Christopher Graham, raised concerns with us about big data techniques which ‘re-identify’ individuals when previously anonymised data are combined with other datasets. The Wellcome Trust worried that “as datasets become more sophisticated … the technical possibility of undertaking ‘jigsaw’ re-identification of individuals increases, even from data that has been through a process of anonymisation”. Professor Jonathan Montgomery, chair of the Nuffield Council on Bioethics, noted that “in the context of health data, the difficulty is magnified by the fact that in order to be useful the data has to be quite rich about your health: The richer it is, the more possible it becomes to use those techniques to correlate.” He also told us that work on data breaches by the Nuffield Council on Bioethics showed that “most of the breaches are human as opposed to technological. That suggests that … some of the techniques we have used previously to safeguard privacy and confidentiality remain important, [such as] personal integrity and the quality of staff.”
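The ‘jigsaw’ re-identification risk described above can be illustrated with a minimal sketch: an anonymised dataset (names removed) is linked to a public register through shared quasi-identifiers. All names, postcodes and diagnoses below are hypothetical, invented purely for illustration.

```python
# Hypothetical illustration of 'jigsaw' re-identification: combining an
# anonymised health dataset with a public register via quasi-identifiers.

anonymised_health = [
    {"postcode": "SW1A 1AA", "birth_year": 1972, "sex": "F", "diagnosis": "asthma"},
    {"postcode": "M1 2AB",   "birth_year": 1985, "sex": "M", "diagnosis": "diabetes"},
]

public_register = [
    {"name": "Alice Example", "postcode": "SW1A 1AA", "birth_year": 1972, "sex": "F"},
    {"name": "Bob Example",   "postcode": "M1 2AB",   "birth_year": 1985, "sex": "M"},
]

# Fields present in both datasets, individually innocuous but jointly revealing.
QUASI_IDENTIFIERS = ("postcode", "birth_year", "sex")

def reidentify(anonymised, register):
    """Link records whose quasi-identifiers match a unique register entry."""
    matches = []
    for record in anonymised:
        key = tuple(record[q] for q in QUASI_IDENTIFIERS)
        candidates = [p for p in register
                      if tuple(p[q] for q in QUASI_IDENTIFIERS) == key]
        if len(candidates) == 1:  # a unique match re-identifies the individual
            matches.append((candidates[0]["name"], record["diagnosis"]))
    return matches

print(reidentify(anonymised_health, public_register))
```

The richer the anonymised dataset, the more quasi-identifier combinations become unique, which is the effect Professor Montgomery describes for health data.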
68.The Information Commissioner was clear that more work needs to be done on researching anonymisation and raising industry standards. The UK Anonymisation Network (UKAN) was set up by the Information Commissioner’s Office in 2012 to establish best practice in this area. It aims to maximise the value of data, minimise the risks to privacy and preserve public confidence by collating best practice in anonymisation from a wide range of experienced practitioners. Our witnesses were clear that technical options are available to improve the effectiveness of anonymisation protocols; one method, ‘differential privacy’, was seen as more tried and tested than the others.
69.Differential privacy aims to ensure that the results derived from a dataset look much the same whether or not any one person’s data is included in it. This is achieved by adding carefully calibrated statistical noise in a way that preserves the accuracy of aggregate results. Microsoft noted in 2011 that:
Differential privacy thrives because it is natural, it is not domain-specific, and it enjoys fruitful interplay with other fields. … This flexibility gives hope for a principled approach to privacy in cases, like private data analysis, where traditional notions of cryptographic security are inappropriate or impracticable.
Despite the academic enthusiasm for differential privacy, it is rarely deployed in practice. While it may not be a silver bullet, it merits further research and exploration to establish it as a practical method for addressing privacy concerns.
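A minimal sketch of the idea is the ‘Laplace mechanism’, a standard way of achieving differential privacy for a counting query: noise drawn from a Laplace distribution, scaled to the query’s sensitivity, is added to the true answer. The dataset and privacy parameter below are purely illustrative.

```python
import math
import random

def laplace_noise(scale):
    """Sample from a Laplace(0, scale) distribution via the inverse-CDF method."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(dataset, predicate, epsilon):
    """An epsilon-differentially-private count query.

    A counting query has sensitivity 1 (adding or removing one person's
    record changes the true count by at most 1), so Laplace noise with
    scale 1/epsilon suffices.
    """
    true_count = sum(1 for row in dataset if predicate(row))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical dataset: ages of survey respondents. True count of
# respondents aged 40 or over is 4.
ages = [23, 35, 41, 58, 62, 29, 47]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
```

Each published answer is close to the true count on average, yet reveals almost nothing about whether any single individual’s record was included, which is precisely the guarantee the paragraph above describes.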
70.Furthermore, the Information Commissioner was concerned that ‘re-identification’ of individuals may be outside the scope of the current legislation:
Section 55 [of the Data Protection Act] just deals with the unauthorised obtaining or disclosure of personal information without the knowledge of the data controller. I would not like to test a de-anonymisation or re-identification case against that.
Ed Vaizey told us:
The original intention of section 55 was to address the problem of third parties obtaining personal data by deception and most prosecutions under this provision have dealt with these types of offences. It is unlikely that it was intended for the purposes of dealing with the de-anonymisation, which was not thought to be a major issue at the time.
71.The Information Commissioner described the range of methods he uses to ensure compliance with the data protection requirements covered by the regulations. These included data protection audits which, though generally voluntary, may be conducted compulsorily on government departments (including on NHS authorities since February 2015). The Information Commissioner believed that compulsory audits “ought logically to apply to local government as well”, and possibly even to some commercial sectors.
72.The Information Commissioner can also impose civil monetary penalties of up to £500,000. He was primarily “interested in a way of doing enforcement that gets a result that is of benefit to consumers, rather than just getting off on civil monetary penalties”. He was nevertheless frustrated that, in cases of serious or malicious breaches of data protection, the current penalties were insufficient and lacked the deterrent clout of criminal sanctions:
I take cases before magistrates’ courts and I weep when the fine is £250 and a £100 community sentence … It does not provide the deterrent that I want. We will continue, but it would make our investigations much easier if I could require people to attend for interview rather than asking them nicely. It would make the investigations quicker. If the offences were recordable and on the police national computer, that itself is a disincentive.
He wanted serious breaches of the Data Protection Act to become criminal offences. This, he told us, could happen “in very short order” if sections 77 and 78 of the Criminal Justice and Immigration Act 2008 were commenced.
73.Such a recommendation has been made previously by the 2012 Joint Committee on the Communications Data Bill, the Home Affairs Committee in 2012, the Justice Committee in 2013, and a number of witnesses to our current inquiry. The Wellcome Trust, for example, advocated criminal sanctions for misusing personal data, especially “unauthorised and unwarranted deliberate re-identification of individuals through big data technologies”.
74.The Association of Medical Research Charities was more cautious:
We need to bear in mind that we do not want to make a system that is already very risk-averse even more risk-averse … We must not leave out the education piece if we go down the criminal sanction route.
Minister Ed Vaizey also advised caution. When we asked him if there were any particular reasons why he might not introduce criminal penalties, he replied:
The fact that it is a criminal offence maliciously to access data with the intent, effectively, to misuse that data covers what perhaps the ordinary person in the street would regard as a criminal act. I would not want criminal legislation inadvertently to catch people who have been negligent, however much they might be condemned for their negligent behaviour in allowing your data to become available. We would have to think very hard, if we were to introduce criminal penalties, about what kind of behaviour they would catch.
Subsequently, the Minister wrote to us, saying:
The forthcoming [EU] General Data Protection Regulation will give us an opportunity to stress test the existing sanctions available in relation to the misuse of personal data to make sure they are fit for purpose for the digital age. In particular, we will review current penalties for data protection breaches and aim for sanctions that act as effective deterrents against the misuse of personal data in all contexts.
75.As citizens’ personal data is being used in ever increasing volumes and for ever changing purposes, it is vital that the Information Commissioner’s Office has the powers it needs to help ensure data protection. With a new EU data protection regulation now agreed (paragraph 83), we welcome the Government’s commitment to review current penalties for data protection breaches. The Government should nevertheless introduce as soon as possible a criminal penalty for serious data protection breaches by commencing sections 77 and 78 of the Criminal Justice and Immigration Act 2008. The Government should not regard the two-year implementation period of the recently agreed EU data protection regulation, which will provide for bigger fines (paragraph 83), as a reason for delaying this.
77.The Government should set out its anonymisation strategy for big data in its upcoming Digital Strategy, including a clear funding commitment, a plan to engage industry with the work of the UK Anonymisation Network and core anonymisation priorities.
78.The Information Commissioner has concluded that, regardless of whether citizens feel threatened by big data, “they feel they have lost control over their personal information”. It is vitally important, therefore, that individuals give their informed consent to the use and sharing of their personal data—one of the ‘data protection principles’ in the Data Protection Act 1998 (paragraph 61). Our predecessor Committee concluded in 2014 that people’s ability to provide informed consent was undermined by the “opaque, literary style” of terms and conditions documents, which “renders them unsuitable for conveying an organisation’s intent for processing personal data to users”.
79.In our current inquiry, Chris Combemale from the Direct Marketing Association believed that a business’s desire to maintain its brand reputation provided a key incentive for good practice. James Meekings from Funding Circle, in a similar vein, emphasised the importance of following best practice and transparency in establishing a trustworthy brand. The Nuffield Council on Bioethics argued that data users had a responsibility to protect personal data that addressed the ‘re-use’ of data for purposes not envisaged when consent was originally obtained—another ‘data protection principle’ (paragraph 61):
Where a person providing data about themselves cannot foresee or comprehend the possible consequences of how their data will be available for linkage or re-use, consent at the time of data collection cannot, on its own, protect all of their interests. … The changing context and potential for data re-use means that compliance with the law is not enough to ensure a data initiative is ethically appropriate. Those who manage data initiatives therefore have a continuing duty to promote and protect the legitimate rights and interests of those who have provided data about themselves irrespective of the terms of any consent given.
80.While the specific duties of the Information Commissioner are set out in legislation, including those concerned with consent, Hetan Shah of the Royal Statistical Society saw a need for an oversight body to help ensure good practice with big data more generally:
Regulation is always lagging behind new technology and developments … What we are hoping for is that the Alan Turing Institute will take a lead in thinking through the ethics around big data. In the US there is a council of ethics on big data, and I wonder whether the UK needs something similar to take forward this agenda.
81.Good practice can be encouraged by acknowledging it through the use of kitemarks, as our predecessor Committee recommended. In response to that inquiry, the Government highlighted the work of the British Standards Institution on the collection of personal information and the Information Commissioner’s ‘privacy seal’ programme. The Information Commissioner told us in our current inquiry that:
The idea of a privacy seal is that it is beyond the ISO standard; it is something people can recognise as a good housekeeping seal of approval on sites that sign up to doing things properly, and are prepared to be audited for doing that. … I think that will give consumers a way of recognising that this is a serious player that understands privacy and is committed to looking after their data. That will give those companies a competitive advantage.
82.The Information Commissioner has developed a data protection kitemark, ready for use now. The use of such kitemarks, acknowledging good behaviours, would complement the greater sanctions of criminal penalties for bad behaviours that we have recommended. The Government and Information Commissioner should work with industry to ensure that the UK’s already developed kitemark is adopted as soon as possible, and initiate a campaign to raise public awareness of it.
83.The European Parliament, Commission and Council agreed a General Data Protection Regulation in December 2015. It will require changes, within the next two years, to the UK’s Data Protection Act 1998, which transposed the existing 1995 EU Data Protection Directive. The Commission proposed the new Regulation, as part of a larger ‘Data Protection package’, in January 2012. The provisions in the EU Regulation include changes to, or restatements of, requirements in the existing Directive.
84.The Regulation has taken four years to be finalised between the three EU institutions under the trilogue ‘ordinary procedure’ negotiation process. In 2012 the House of Commons Justice Committee concluded that the proposed Regulation in its original prescriptive form would not produce a “proportionate, practicable, affordable or effective system of data protection”. The Council’s position, in putting forward its own revisions, was “broadly more pro-business than the European Parliament”.
85.The UK Government appears to have supported the Council’s position during the negotiations. The European Scrutiny Committee stated in December, before the Regulation was agreed, that it was “unconvinced” that the Government’s “unusual approach of supporting a [draft] text with ‘serious reservations’ would lead to greater negotiating influence over the text in trilogues [negotiations]”. These reservations concerned the provisions dealing with ‘the right to be forgotten’, the ‘one-stop-shop’ mechanism, and the liability and sanctions faced by data controllers.
86.Ed Vaizey MP told the European Scrutiny Committee in 2015 that a strand of the UK’s negotiating approach on the Commission’s Digital Single Market package would focus on
Encouraging innovation: We should support an approach that is light-touch and flexible enough to respond to rapid technological changes; in particular, we do not want regulation to close down innovation and the potential of fast-moving technologies such as big data and cloud computing.
87.The European Parliament had generally sought to strengthen the meaning of consent in the draft Regulation; the Council to weaken consent and to widen some grounds legitimising processing. For example, the provision on data minimisation was weakened by the Council’s draft which deleted the requirement that personal data “shall only be processed if, and as long as, the purposes could not be fulfilled by processing information that does not involve personal data”. The Council draft introduced data collection and processing exceptions for ‘statistical’, ‘scientific’ and ‘historical’ purposes—regimes “which might describe some big data operations”. The 1995 Data Protection Directive only addressed statistical and historical processing, and “only incidentally”. The European Parliament’s proposals started with a prohibition on processing unless it would satisfy a number of conditions; the Council’s version (that was subsequently included in the final Regulation) started from a position allowing personal data processing for scientific, statistical or historical purposes but subject to safeguards.
88.Ed Vaizey had told us that the sticking points in the negotiations were about “the level of burden on business. We do not want to place too many onerous reporting requirements on business. We want to make sure we get that balance absolutely right.” It appears from the recently agreed Regulation text that the Government’s and the European Council’s concerns have been met. A number of provisions allow data to be collected, retained or processed for “scientific research” purposes or more generally “in the public interest”. Article 5 stipulates, for example, that:
Personal data must be … collected for specified, explicit legitimate purposes and not further processed in a way incompatible with those purposes; [but] further processing of personal data for archiving purposes in the public interest, or scientific and historical research purposes or statistical purposes, shall … not be considered incompatible with the initial purposes …
Personal data may be stored for longer periods insofar as the data will be processed solely for archiving purposes in the public interest, or scientific and historical research purposes or statistical purposes.
89.The agreed Regulation allows processing of data if “necessary for the purposes of the legitimate interests pursued by the [data] controller”. More fundamentally, the Regulation includes a clause inserted by the Council that allows states to “maintain or introduce more specific provisions to adapt the application of the rules of this Regulation with regard to the processing of personal data” to meet national legal obligations (such as those for investigatory powers) or to perform tasks carried out in the public interest.
90.There may be a debate still to be had about whether data collection and processing that satisfies ‘the public interest’ would include private sector activities. The Regulation leaves the term undefined.
91.During the negotiations of the Regulation there had been arguments over the nature of the consent that people would have to give. The European Council’s assessment of the final agreed Regulation concluded that “the way in which consent is to be given by data subjects remains ‘unambiguous’ for all processing of personal data, with the clarification that this requires a ‘clear affirmative action’, and that consent has to be ‘explicit’ for sensitive data.” Article 7 requires that the data controller will have to be able to demonstrate that consent was given by individuals to the processing of their personal data, and that seeking consent must be presented “in an intelligible and easily accessible form, using clear and plain language”.
92.There had been particular concerns about possible restrictions on the use of medical records for big data processing. Before the EU regulation was agreed, some witnesses were concerned that consent requirements might become too onerous. TechUK, for example, warned that:
It is important that the [Regulation] discussions do not result in the introduction of narrow consent requirements that are not adaptable to citizens’ expectations nor to their online behaviour … Such a move would risk ‘consent fatigue’ or worse ‘meaningless consent’ whereby overly burdensome requirements on consumers … could undermine the willingness of consumers to navigate preferences and understand how their data is being used.
93.The medical research community, in particular, was concerned that more stringent consent requirements would be extremely restrictive in a sector where data is often re-used and re-purposed as techniques develop:
When the regulations started out, it was clear that a separate case was made for the research piece, and then it got amended. We are very worried that, if it goes ahead, medical research will be damaged and become unworkable, which will not benefit us at patient/public level; nor does it help all the investment that has been made in this particular area. It will all be for nothing. It is a matter of great concern at the moment.
The Information Commissioner seemed unconvinced that exceptions should be made for medical research. He commented that:
The Information Commissioner is sufficiently imaginative to see the power of big data in medical research and in the delivery of health services … but we want to see things done in the right way so that people’s fundamental rights and privacy are not trashed in the name of some higher obligation to efficiency and the onward march of science.
94.The Minister told the European Scrutiny Committee, before the Regulation was agreed in December, that the Council draft text:
does not prevent the processing of NHS medical records data for research purposes … The European Parliament’s text however, would appear to significantly restrict processing for research purposes. The UK has been very clear that the position under [Council’s] General Approach must be preserved …
In the event, the final agreed text retained the ‘public interest’ and ‘scientific and historical research’ exemptions, meeting the Minister’s (and, it would appear, the medical research community’s) concerns.
95.The EU Data Protection Regulation provides some safeguards where data processing profiles people according to particular characteristics. It essentially prohibits ‘profiling’ of people according to characteristics which would normally be discriminatory, except for ‘public interest’ or legal requirements. It allows someone to object to the processing of their personal data even when processed according to those conditions, and the data controller would only be able to continue to process the personal data if able “to demonstrate compelling legitimate grounds”. The agreed EU Regulation gives a “right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”, unless concerned with contracts or satisfying national laws.
96.The agreed EU Regulation includes provisions on ‘data portability’—allowing individuals to re-use their personal data. There is no equivalent in the 1995 Data Protection Directive. The Regulation stipulates that the data would have to be provided to the individual “in a structured and commonly used and machine-readable format, and [they would] have the right to transmit those data to another controller without hindrance”.
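The Regulation’s requirement for a “structured and commonly used and machine-readable format” could be sketched as follows: the same hypothetical transaction history rendered as JSON and as CSV, both of which a receiving data controller could parse without hindrance. The field names and values here are illustrative assumptions, not drawn from any real scheme.

```python
import csv
import io
import json

# Hypothetical 'data portability' export: illustrative transaction records.
transactions = [
    {"date": "2016-01-05", "merchant": "Example Energy Ltd", "amount_gbp": -48.20},
    {"date": "2016-01-12", "merchant": "Example Telecom plc", "amount_gbp": -22.50},
]

def export_json(records):
    """Serialise the records as a JSON document."""
    return json.dumps({"transactions": records}, indent=2)

def export_csv(records):
    """Serialise the records as CSV with a header row."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=["date", "merchant", "amount_gbp"])
    writer.writeheader()
    writer.writerows(records)
    return buffer.getvalue()

portable = export_json(transactions)
# The individual (or a new controller acting on their behalf) can parse the
# export and reconstruct the original records exactly.
assert json.loads(portable)["transactions"] == transactions
```

Either format satisfies the ‘structured, commonly used, machine-readable’ test; the substantive point is that the export is re-usable by a third party rather than locked to the original service.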
97.On data portability, the Government has already introduced the ‘midata’ initiative, under which a Current Account Switch Service has been established. Midata introduced a portable data format which allows consumers to use their own consumption or transaction history to compare products and services in the energy, finance and telecoms sectors. The Current Account Switch Service facilitates the automatic transfer of all credit and debit instructions associated with an account. Imran Gulamhuseinwala from EY consultants, a member of the steering committee of the Open Bank Working Group, told us that:
There is a broad feeling that midata has been a very interesting, robust first step in enabling consumers to understand that they have transaction level data; it belongs to them and they can also use it for their own benefit. Midata has a very narrow use case, which is about trying to shop around for the best current account. None the less, it feels that it is beginning to move in that direction.
98.Under the new Regulation these UK initiatives will acquire a statutory footing, with enforceable rights for individuals.
99.The new Regulation must be implemented within two years of it being formally published by the Commission—expected soon. We asked the minister, Ed Vaizey MP, whether in the meantime national data protection safeguards were sufficient. He felt that:
We can live with them as they are. … I do not think it would be sensible to have any kind of interim measures between our current regulations and the future regulation because that would be confusing for business. … You have to have a conversation and dialogue with business about the best way of implementing the regulation.
100.The Data Protection Act will have to be revised to accommodate the recently agreed EU Data Protection Regulation, which will come into force within the next two years or so. We do not share the Government’s view that current UK data protections can simply be left until then. Some areas in particular need to be addressed straightaway—introducing the Information Commissioner’s kitemark (paragraph 78), and introducing criminal penalties (paragraph 72) rather than relying only on the prospective greater fines envisaged by the new EU Regulation.
101.The new EU Regulation appears to leave it open for data to be re-used, and potentially de-anonymised, if “legitimate interests” or “public interest” considerations are invoked. This is an issue that urgently needs to be addressed as big data becomes increasingly a part of our lives. There are arguments on both sides of this issue: seeking to balance the potential benefits of processing data (some collected many years before and no longer with a clear consent trail) and people’s justified privacy concerns will not be straightforward. It is unsatisfactory, however, for the matter to be left unaddressed by Government and without a clear public-policy position set out. The Government should therefore clarify its interpretation of the EU Regulation on the re-use and de-anonymisation of personal data, and after consultation introduce changes to the Data Protection Act 1998 as soon as possible to strike a transparent and appropriate balance between the benefits of processing data and respecting people’s privacy concerns.
102.Given the UK’s leading position in big data and the Government’s stated commitment to capitalise on the potential innovation and research opportunities it promises, the Government should establish a Council of Data Ethics within the Alan Turing Institute as a means of addressing the growing legal and ethical challenges associated with balancing privacy, anonymisation, security and public benefit. Ensuring that such a Council is established, with appropriate terms of reference, offers the clarity, stability and direction which has so far been lacking from the European debate on data issues.
110 Big Brother Watch, Com Res, (2015)
111 Direct Marketing Association ()
113 Digital Catapult, (2015)
114 Information Commissioner’s Office,
116 Information Commissioner’s Office (), para 27
118 Direct Marketing Association ()
119 Digital Catapult ()
120 Big Brother Watch ()
121 Q155; Information Commissioner’s Office ()
124 As reported by Big Brother Watch ()
125 Wellcome Trust ()
128 Information Commissioner oral evidence
129 , accessed February 2016
130 Big Brother Watch ()
133 Letter from Minister Ed Vaizey MP ()
135 Information Commissioner’s Office, , News item, 2 February 2015
138 Qq149, 152, 156, 169
141 Qq161, 176-178
142 Joint Committee on the Draft Communications Data Bill, Draft Communications Data Bill, HC (2012–13) 479, para 226; Home Affairs Select Committee, Private Investigators, HC (2010–12) 100, para 47; Justice Committee, The functions, powers and resources of the Information Commissioner, HC (2012–13) 962, para 39.
143 Wellcome Trust ()
146 Letter from Minister Ed Vaizey MP ()
148 Science and Technology Committee, , Fourth Report of Session 2014–15, HC 245, para 49
151 Nuffield Council on Bioethics ()
153 Science and Technology Committee, , Fourth Report of Session 2014–15, HC 245, para 69
154 Science and Technology Committee, , Fourth Report of Session 2014–15, HC 245, paras 46-47
156 European Commission, , Press release, 15 December 2015
157 Implementation will be required within two years after the Regulation is formally published by the Commission.
159 Justice Committee, The Committee’s opinion on the European Union Data Protection framework proposals, HC (2012–13) 572, para 105, as reported also in European Scrutiny Committee, Fifth Report, 2015–16, HC 342v, para 5.2
160 Prof Lorna Woods ()
163 Prof Lorna Woods ()
164 Prof Lorna Woods ()
165 Prof Lorna Woods ()
166 Article 83; Prof Lorna Woods ()
168 Article 6(1)
170 Article 6(2a)
171 European Council, , 15 December 2015
173 Tech UK ()
177 Academy of Medical Sciences, , News item, 23 December 2015
178 Article 9
179 Article 19
180 Article 20
181 Prof Lorna Woods ()
182 Article 18
183 HM Treasury, , News item, 10 September 2013
184 BIS, , News item, 3 November 2011
185 BIS, , News item, 22 October 2015
Prepared 11 February 2016