203.The collection and use of personal data are key to the commercial success of many online platforms. BEUC, a Europe-wide consumer protection organisation, told us that “many of the mainstream consumer services provided by companies like Facebook and Google are based on ‘for free’ models, in which consumers are tracked and may be profiled as they surf the web in exchange for using the service.” Professor Ezrachi noted that “Data is the currency—the commodity—which provides us with ‘free’ access to many online services and products and an advanced Internet environment.”
204.David Alexander, Chief Executive of MyDex, said that it was possible to put a value on users’ data: “If you are trying to value a tech start-up, the value of data is calculated to a very fine precision—$720 per person, per year is Google’s estimate when they are talking to investors over time.”
205.The Information Commissioner’s Office (ICO) noted that the “collection and use of personal data is becoming more central to the business model of the online platforms, particularly to drive personalisation and tailored services, also linked to more sophisticated behavioural advertising.” Mr Cohen, from Google, confirmed this: “our annual turnover last year, 2014, was $66 billion. We derived 89 per cent of that income from advertising.”
206.The way in which personal data use is regulated will soon undergo a major overhaul: the Data Protection Directive, which regulates the processing of personal data, will be replaced by the General Data Protection Regulation in 2018. This chapter asks to what extent these changes will address consumers’ concerns about the collection and use of their personal data by online platforms.
207.The way in which online platforms use consumers’ personal data to generate revenues from advertising is complex and opaque, contributing to low consumer trust in online platforms. The Commission said: “only 22% of individuals have full trust in service providers such as search engines, social networking sites and e-mail services”. Citizens Advice told the Committee of “general unease” among consumers about how their personal data are collected and used online: “A recent survey of consumers … found that 69% describe the way companies use their data as ‘creepy’”.
208.BEUC told us that “The misuse of personal data is perhaps the main source of concerns for consumers using platforms, particularly social networks. This is confirmed by recent data showing that 70% of EU consumers are worried about how their data is being collected and processed.”
209.Skyscanner described the way in which online platforms could collect personal data without users being aware of it. An example of passive collection “would be where a user passively provides data via their web browser (e.g., their IP address) or the cookies that are placed on their device by the online platform, or through their incidental use of the online platform (i.e. what sections of the website did they access, at which point did they exit the website).” The German Monopolies Commission noted that passively collected data were needed in order to facilitate the interaction between the user and the online platform. For instance: “when a website is visited, the IP address of the Internet connection is communicated, which makes it possible to approximately localise the user, but is also needed for communication between the web server and the web browser”.
210.At the same time, the Monopolies Commission recognised that passively collected data “may also reveal information on websites visited and on users’ interests on the Internet”. Such data may be “combined to form (anonymised) user profiles.”
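The passive collection described in the two paragraphs above can be illustrated with a short, purely hypothetical sketch (the function and field names are ours, not any platform’s actual code): every HTTP request automatically carries identifiers such as the IP address, browser details and any previously set cookie, without the user actively supplying anything.

```python
# Hypothetical illustration of passive data collection: each of the fields
# below arrives with an ordinary HTTP request, supplied by the browser
# rather than typed in by the user.
def passive_profile(ip_address, headers):
    """Assemble the data a web server receives automatically from a browser."""
    return {
        "ip": ip_address,                         # allows approximate localisation
        "user_agent": headers.get("User-Agent"),  # browser and device details
        "cookie": headers.get("Cookie"),          # identifier set on an earlier visit
        "referer": headers.get("Referer"),        # the page the user came from
    }

profile = passive_profile(
    "203.0.113.7",
    {
        "User-Agent": "Mozilla/5.0",
        "Cookie": "uid=abc123",
        "Referer": "https://example.com/flights",
    },
)
```

Accumulated across many requests and pages, fields such as these are the raw material of the “(anonymised) user profiles” the Monopolies Commission describes.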
211.The use of location tracking through online platforms, without consumers’ consent, raised particular concerns. Professor Ezrachi told the Committee that an application for smartphones called Brightest Flashlight provided users with a flashlight app for free, but said that what many “did not know was that that application was tracking your location all the time, even when you were not using it, and that information was being sold to third parties as part of harvesting.” Professor Tom Rodden, Director of the Horizon Digital Economy Research Institute, said that Facebook collected personal data through location tracking, which “was recently blamed as [the] possible cause for large power drain in iPhones.”
213.Skyscanner told us that the personal data collected by online platforms are used in large part to deliver their services:
“For many online platforms, the data collected will be used primarily to deliver the specific services or transactions being requested by the user (for example, the purchase of a product) or to better tailor those services for a particular user (for example, by recognising a user’s geographic location and tailoring the language which the service is provided in as a result).”
214.Online platforms also use personal data to make advertising more targeted. Michael Ross, Chief Executive of Dynamic Action, said when a consumer visited a retail website, the retailer would be “building profiles”, based on “what you click on, how you behave and what marketing you are looking at.” These profiles helped them “work out how they can target you with better offers and build a range [of services] that is more attractive to you.” He described this as “a fantastic thing”, because it helped retail businesses remain competitive. Steve Chester, from the Internet Advertising Bureau, said: “advertisers will not be able to see who that person is or any personal details, but they are actually looking at passion points and interests and being able to sell advertising based on interest levels.”
215.However, some online platforms also sell the personal data they collect to third parties. Demos and Ipsos MORI said that in a qualitative survey of about 1,250 people, “while the majority of respondents were aware that advertising is targeted using their social media data (57% said this currently happens) … six in ten (60%) respondents felt that social media data should not be shared with third parties as happens currently under existing terms and conditions of social media sites.”
216.Mr Chester noted that the use of personal data by online platforms is hugely complex:
“There is a whole supply chain here, from the advertiser who buys the advertising to the publisher who then sells it, which could be Facebook or Google, but there could be many businesses in between that transact various forms of data to make the advertising more targeted and relevant to the audience they are selling to. There may be just one broker, if you will, or there could be many in between, with each of them offering a different level of service.”
Dr Lynskey concluded that “Individuals are not data brokers”, and could not be expected to understand “the multitude of daily transactions which take place online.”
217.Consumers’ lack of awareness of how their personal data are collected and used means that there is limited competition between online platforms on the basis of privacy standards. The CMA said: “While, in theory, consumers should be able to discipline providers over the level of privacy or the extent to which data may be used … in practice, consumers may find it difficult because of a lack of awareness that data may be used for this purpose and/or the value of the data to the platforms.” The European Data Protection Supervisor, Mr Buttarelli, agreed that consumers “do not see privacy use as a barometer of product quality.”
218.For some, this lack of competition on the basis of privacy standards reflected the market power of some online platforms. The ICO was concerned about “how free people are to offer consent to use a market-dominant search engine, for example. Nobody has to use search engines or social media services, but in reality they are [the] first port of call for many who want to access or share Internet content.” Professor Zimmer went further and suggested that, where a platform had market power, “consent may be forced consent”. Dr Lynskey observed that “In reality, most content and services offered by online platforms are offered on a ‘take it or leave it’ basis.”
219.For Mr Chisholm, this lack of competition reflected the fact that consumers valued convenience over privacy: “if you look at the behaviour of consumers online, very often when given a choice between a bit more privacy and a bit more convenience, it is convenience that is chosen.” The fact that “relatively few consumers” had explored browser settings, privacy dashboards or opt-out tools and ad-blocking suggested “that it is not a very large concern for people.” Mr Ross said that “the vast majority of consumers are very happy with their online experience. They do not really care”. He added that if “you both understand and care, there is plenty you can do about it. You can use anonymous browsers and you can hide in a cave somewhere.”
220.Nonetheless, Mr Alexander said it was time for online platforms “to stop brushing the offer under the carpet … If somebody chooses to use a service like Hotmail, Gmail or any number of other services in which they receive value back—free services, free email, et cetera—the transparency of that offer is the issue.” The ICO agreed: “Platforms must find more effective means of explaining their complex information systems to ‘ordinary’ service users. This is important as transparency opens the way to the exercise of individuals’ rights, and choice and control over their personal data.” The CMA noted that “pressure on consumers is only set to increase. Developments such as the Internet of Things—like online devices we wear or carry and devices in the home or in our cars—will mean that data is collected and shared on a regular basis without the consumer having to make a conscious decision.”
221.Consumers agree to share their personal data with online platforms in exchange for access to their services. However, the complex ways in which online platforms collect and use personal data mean that the full extent of this agreement is not sufficiently understood by consumers. As a result, trust in how online platforms collect and use consumers’ data is worryingly low and there is little incentive for online platforms to compete on privacy standards. We believe this presents a barrier to future growth of the digital economy. Online platforms must be more effective in explaining the terms of such agreements to consumers.
222.The General Data Protection Regulation (GDPR), which was agreed on 15 December 2015, will substantially change how the collection and processing of personal data is regulated in the EU. The Commission noted: “The existing Data Protection Directive (95/46/EC) was adopted in 1995 and, even if it remains sound as far as its objectives and principles are concerned, it has not kept pace with rapid technological and social developments in the digital world which have brought new challenges for the protection of personal data.” Steve Wood, from the ICO, confirmed that “there was a clear policy intention from the Commission at the outset, when it published the regulation … that it was designed to address some of the issues that have arisen because of the way online platforms collected personal data”. In particular, it addressed controversy over the Safe Harbour agreement, which meant that many US-based online platforms could transfer personal data to the US and not be subject to EU data protection rules.
223.A description of the existing Data Protection Directive (implemented in domestic law by means of the Data Protection Act 1998) and the changes to be brought forward by the GDPR is given in Box 8.
Under the Data Protection Act 1998, personal data are defined as “data which [relate] to a living individual who can be identified (a) from the data, or (b) from those data and other information which is in the possession of, or is likely to come into the possession of, the data controller.” The GDPR will extend this definition to include data which are collected through online identifiers provided by their devices, applications, tools and protocols, such as Internet Protocol addresses and cookie identifiers.
Both the Data Protection Act 1998 and the GDPR require personal data to be processed “fairly and lawfully”, which normally requires the data controller to have obtained the data subject’s “informed and freely given” consent. The GDPR will change the requirement to clear consent, which is considered to be given by a clear affirmative action which establishes a freely given, specific, informed and unambiguous indication of the data subject’s agreement to personal data relating to him or her being processed.
The GDPR will strengthen data subject rights through provisions mandating the portability of personal data and the right to erasure (‘the right to be forgotten’).
Whereas the Data Protection Directive only applied to data controllers established in the EU, the GDPR will apply to data controllers and processors established outside the EU, if their data processing activities relate to EU data subjects.
224.As outlined in Box 8, the GDPR will widen the definition of personal data to include online identifiers, device identifiers, cookie IDs and IP addresses. In this way, the Regulation encompasses aspects of the e-Privacy Directive (2002/58/EC), which was amended in 2009 to take into account data collection through cookies, traffic and location data.
226.The ICO agreed that requesting consumers to consent to the collection of such data did “raise practical issues, as data collection increases, for example through connected devices in the Internet of things, as to how often users can be expected to interact with transparency and consent mechanisms.” The ICO was “supportive of a risk based approach that would expect higher standards of transparency and more powerful choice mechanisms where the information is particularly sensitive, or where its use may be unexpected or particularly personally … objectionable”.
227.Other witnesses asked how this part of the Regulation related to the e-Privacy Directive, and how it would be applied to online platforms. Mr Buttarelli said the existing e-Privacy Directive focused “more on standard telecom providers and electronic communication services”, and was hard to apply to platforms. The ICO said the extent to which the e-Privacy and Data Protection Directives applied to online platforms had “been a contentious issue for many years and some online platform providers have argued that they are only processing ‘pseudonymous personal data’—and should be subject to light touch regulation.” However, the ICO considered “that search engines are data controllers and are processing personal data when, for example, they deliver name-based search results.” The ICO agreed that “passively-collected information can identify data subjects”.
229.Nonetheless, given the limitations of the consent-based model, and industry’s reluctance to make the mechanisms of consent more meaningful, we are concerned that the provisions that widen the definition of ‘personal data’ will be difficult to apply in practice. We recommend that the Commission investigate how the requirement for all businesses to seek consent for the collection of personal data through online identifiers, device identifiers, cookie IDs and IP addresses can be applied to online platforms in a practical and risk-based way.
230.At present, most online platforms communicate information about how they collect and use personal data through privacy notices. The evidence suggested that few consumers read or fully understand privacy notices, which are normally embedded within a company’s ‘Terms and Conditions’. Citizens Advice said “approximately only a third of consumers report that they read terms and conditions”, but that “actually people are likely to be over-claiming”—according to the evidence of “actual time spent reading terms and conditions … the figure appears closer to 1%.” The German Monopolies Commission confirmed that “the collection of personal data without users’ explicit consent is likely to be not the exception, but in fact the rule.”
231.One problem with privacy notices is their length. Steve Wood, from the ICO, described many privacy notices as being “longer than Hamlet”, while Professor Rodden said they were “as long as Othello” and Mr Alexander said they were “longer than the Declaration of Independence”.
232.They are, though, much less readable. Professor Rodden highlighted research undertaken by Research Councils UK showing that the language of privacy notices was “overly complex and difficult to read”, and that they were “written to be understood and used in [a] US court rather than by ordinary consumers.” A Eurobarometer survey found that, of those who did not fully read privacy statements, 67% found them too long, while 38% found them unclear or difficult to understand.
233.The ICO, the European Data Protection Supervisor and the CMA all said that online platforms had to improve the transparency of their privacy notices. The Minister, the Rt. Hon. Ed Vaizey MP, agreed: “You get these very complex terms and conditions. I signed up to some this morning, to an unnamed provider, on my tablet in order to update my software—I do not have a clue what I signed up to. People have to be told, partly by government and partly by consumer rights organisations”.
234.Mr Buttarelli told the Committee that the GDPR would ensure that “the quality of notices in the new framework will be verifiable by regulators”, who would be able to object to unclear notices. The ICO also said the GDPR would “open the possibility of stronger sanctions for the breach of the transparency provisions.” The GDPR provides for a maximum fine of €20 million or 4% of annual worldwide turnover, whichever is higher, in cases where an online platform fails to obtain explicit consent.
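The scale of these sanctions can be made concrete with a simple illustrative calculation. Under Article 83(5) of the GDPR, the ceiling is the higher of €20 million or 4% of total worldwide annual turnover (the function name below is ours, for illustration only):

```python
# Illustrative calculation of the GDPR's maximum fine: the higher of
# EUR 20 million or 4% of total worldwide annual turnover (Article 83(5)).
def max_gdpr_fine(annual_turnover_eur):
    return max(20_000_000, annual_turnover_eur * 4 // 100)

# For a platform with the $66 billion turnover Google reported for 2014
# (treated here, purely for illustration, as if denominated in euros):
large_platform_cap = max_gdpr_fine(66_000_000_000)  # 2.64 billion
small_firm_cap = max_gdpr_fine(100_000_000)         # the 20 million floor applies
```

For the largest platforms, in other words, the turnover-based limb of the fine dwarfs the €20 million floor by two orders of magnitude.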
235.In order to address concerns about the length and accessibility of privacy notices, Professor Rodden recommended that privacy notices should be “supported by kite-marks”, to identify online platforms meeting EU standards on the handling and processing of personal data. Kite-marks would provide a visual symbol for consumers to quickly understand the implication of any agreement they may make regarding data protection when engaging with an online platform. Kite-marks have also been recommended by the House of Commons Science and Technology Committee in its report on Responsible Use of Data. In order to create an incentive to foster competition, rather than just compliance, on the basis of privacy standards, such kite-marks should include a graded scale indicating levels of data protection, similar to the traffic light system used in labelling for food products.
236.The ICO said that the GDPR incorporated provisions for Data Protection Authorities to support privacy seal schemes or stamps of approval to demonstrate good privacy practices, “as a way of demonstrating data protection compliance”, and that the Information Commissioner was “developing a privacy seal programme that will enable data controllers to apply for a seal.” They said that this would work by allowing third party scheme operators to apply to the Information Commissioner for an endorsement that would enable them to use the seal. The Information Commissioner launched a call for applications in 2015 and expects the first scheme to be formally launched sometime in 2016.
237.The privacy notices used by online platforms are inaccessible to the average consumer. They are too long and expressed in complex language. While the General Data Protection Regulation will require more transparency in privacy notices, and introduce heftier fines for non-compliance, this alone may not be sufficient to make consumers understand the value of their data when transacting with online platforms.
238.We support provisions within the General Data Protection Regulation to allow organisations to use privacy seals, or kite-marks, to give consumers confidence that they comply with data protection rules.
239.In order to encourage competition on privacy standards, not just compliance with the law, we recommend that the Government and the Information Commissioner’s Office work with the European Commission to develop a kite-mark or privacy seal that incorporates a graded scale or traffic light system, similar to that used in food labelling, which can be used on all websites and applications that collect and process the personal data of EU citizens.
240.While there is a growing acceptance that kite-marks or the development of other standards will be necessary to incentivise competition on privacy, we suggest that equivalent action should also be taken to communicate abuse of data protection rules to users more clearly. Professor Ezrachi told us that, although Google argued that it must “maintain quality because it is such a competitive market that it will lose its position”, that was not obviously the case when it came to data protection. He continued: “there have been cases where it has had to pay fines in the US for misleading users over privacy and use of data. That did not affect its dominance. Unfortunately, we are not very sophisticated users when we use those websites; we just click something and assume that everything will be okay.”
241.Moreover, reputation is critically important to online platforms, so requiring platforms to communicate this information directly to their users through the platform itself would potentially be an effective way to deter abuse. A range of witnesses elaborated on the importance of reputation in these markets. Mr Berthet, from the French Digital Council, said: “As the information society grows, trust and reputation become a bigger part of the equation. When competition is supposedly just a click away, reputation is very important for online platforms.” Mr Freeman, from the CMA, also suggested that “commercial reputation … exercises constraint” on platforms. The Information Technology and Innovation Foundation (ITIF) suggested that platforms were effectively “regulated by market competition and public reputation”.
242.To discourage misuse of users’ personal data, we recommend that the European Commission reserve powers to require online platforms that are found to have breached EU data protection standards, or to have breached competition law by degrading privacy standards, to communicate this information clearly and directly to all of their users within the EU through notifications on their websites and mobile applications. We suggest that this power be used sparingly, for repeat offenders or particularly egregious breaches of the law.
243.The Commission said the GDPR would “equip individuals with a new set of rights fit for the digital age, such as the ‘right to be forgotten’, the right to data portability and the right to be notified when the security of personal data is breached.” These provisions respond to demands from consumers to have more control over their personal data. Citizens Advice said previous research from Demos in 2012 found that 70% of consumers “would be more willing to share data if they had the ability to withdraw it and see what data was held on them.” In its report, The commercial use of consumer data, the CMA recommended more “control over how the data is used subsequently—so that consumers can manage the data they are sharing and choose how much, if any, data to share.”
244.Mr Alexander told us that data portability was particularly important in order to drive competition and enable consumers to switch to different providers: “if individuals themselves do not have the ability to move those around to different platforms, so that they can apply and share their browsing experience or their purchasing history with other platforms—it makes it incredibly hard for them to find new service providers.” Professor Zimmer described a situation in which an acquaintance’s phone was damaged when he rescued a child from a swimming pool, and he subsequently “wanted to switch to another phone brand using different software”. He therefore wanted “to get to his backup address book and to take the addresses with him. That was denied to him, so he bought another phone from the same producer.”
245.Google said they supported “portability of data to promote user choice and switching” through their Google Takeout service which “allows users to download their data … and move them across to another service.” Jon Steinberg, of Google, said: “we know that millions of users have used this service in the past year.”
246.However, data portability raises complex questions about data ownership. Mr Chisholm said that when “getting your phone record from your service provider you would not think twice about your right to be able to ask for and get it. You would think, ‘It’s my data and I should be able to get it’”. It was more complicated dealing with an online marketplace, because “your online reputation and profile are not just your data; that data reflects a lot of other user feedback on you that has been generated solely through that marketplace and by others … the sense that that is your data to be able to take somewhere else becomes more arguable.” Mr Alexander agreed that data portability was “a minefield, particularly with things like Facebook’s download, where you are downloading posts and comments made by other people”.
247.For data portability to work in practice, the data need to be downloaded in a standardised and reusable format. In relation to Google’s Takeout service, Mr Alexander told us: “At best, the terms and conditions allow you to download data on to a hard disk. I cannot imagine that 95% of the population is even vaguely interested in downloading into a CSV file format on to a hard disk.” However, Mr Steinberg said that creating a standard for sharing data risked creating barriers for new platforms entering the market: “We … want new service providers to be able to come up with new ideas, new forms of technology, and new forms of innovation and business models that should not be hampered unnecessarily by a requirement from the beginning to be able to integrate very easily with pre-existing technology.”
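What a ‘standardised and reusable format’ might mean in practice can be sketched briefly. The following is a hypothetical illustration using CSV, the format Mr Alexander mentions; the function and record names are ours and do not reflect any platform’s actual export code:

```python
import csv
import io

# Hypothetical sketch of a portability export: a user's records written out
# in a standardised, machine-readable format (CSV) that another service
# could in principle import.
def export_user_data(records, fieldnames):
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

search_history = [
    {"date": "2016-01-05", "query": "flights to Rome"},
    {"date": "2016-01-09", "query": "hotels in Rome"},
]
csv_export = export_user_data(search_history, ["date", "query"])
```

The technical step is trivial; as the evidence above suggests, the harder questions are which records count as ‘your’ data, and whether receiving platforms can be expected to ingest such files without a common standard.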
248.Data portability could be one of the most significant changes brought in under the General Data Protection Regulation. It could promote quality-based competition and innovation by making it easier for consumers to switch platforms. This would facilitate the emergence of new market entrants.
249.However, we are concerned that the principle of data portability may unravel in practice. If applied too rigidly, it could place onerous obligations on emerging businesses; however, unless it is more clearly defined, it is unlikely that it will be implemented by many online platforms.
250.We recommend that the Commission publish guidelines explaining how data portability requirements apply to different types of online platform. These guidelines should match data portability requirements to different types of online platform, adopting a proportionate approach depending on the essentiality of the service in question.
251.Mr French, from the Digital Catapult, noted that online platforms “carry out research using personal data into the effects of their services on individuals’ behaviours and habits”. In so doing, and regardless of the impact upon consumers, “the online platform has total autonomy over the purposes and means and no obligation of transparency.”
252.Joe McNamee, from European Digital Rights (EDRi), referred to an experiment Facebook conducted for one week in 2012, which altered users’ news feeds to see how the change affected their mood. Researchers studied whether positive or negative words in messages read by users determined whether they then posted positive or negative content in their status updates. Mr McNamee told us that “Facebook did this on the basis of a phrase in its 9,000-plus-word terms of service that states the company can use the data for research purposes.” Dr Koene mentioned another Facebook experiment, during the 2012 US presidential election, “which showed that people who had been notified when their friends mentioned that they’d just voted were significantly more likely to have also voted during the election.”
253.Demos and Ipsos MORI recommended the urgent introduction of new guidelines for the use of social media data for research:
“Government could play a larger role in helping to incentivise companies and institutions to develop and adopt appropriate industry standard regulation … [by] encouraging wider membership of the Market Research Society, encouraging more binding expectations through the Information Commissioner, or establishing a task force for developing government approved best practice for social media research.”
254.The use of personal data as the basis of research, particularly on social media, goes beyond what most users would ordinarily expect or consider acceptable. We recommend that the Government and Information Commissioner’s Office publish guidelines in the next 12 months setting out best practice for research using personal data gathered through social media platforms.
255.Yahoo expressed concern about the impact of the GDPR on the competitiveness of European digital firms, telling us that “expanding the definition of personal data”, and “narrowing the permitted legal bases for lawful processing”, as well as introducing “an enhanced role for national data protection authorities … and vastly increased sanction powers”, could lead to “a significant (and rather more negative) impact on digital investment in the EU.” The Information Technology and Innovation Foundation (ITIF) said: “It is likely that the proposed General Data Protection Regulation will, if enacted and implemented, dramatically further reduce the competitiveness of the European digital economy”.
256.Regulators, on the other hand, were clear that success of the GDPR depended on industry taking more responsibility. Mr Buttarelli said the GDPR would lead to a shift from “a basic system articulated on a to-do list, where I check what I should do in terms of privacy”, to asking the data controller “to translate into practice existing principles, to allocate responsibilities, to better define roles and to document and demonstrate that I am proactive on the data protection policy”. Mr Chisholm noted that data controllers would have to engage in “an active conversation with consumers … an ongoing dialogue.”
257.Mr Buttarelli suggested that these changes had yet to be fully understood by industry: “As far as I know, a message has not been passed to designers and developers … that ‘privacy by design’ and ‘privacy by default’ are not simply recommendations but legal requirements … They still believe, as was the case in 1995, that there is a space for last-minute changes to water down the existing safeguards.”
258.In the past, online platforms established outside the EU were not subject to European data protection rules. This resulted in a weak data protection regime in which European citizens’ fundamental rights were breached, and reduced consumer trust in how online platforms collect and process personal data. We are therefore concerned that industry remains sceptical about the forthcoming General Data Protection Regulation. Online platforms must accept that the Regulation will apply to them and will be enforced, and prepare to make the necessary adaptations.