53.The ICO published its report, “Investigation into the use of data analytics in political campaigns”, on 6 November 2018, the same day that the Information Commissioner and the Deputy Information Commissioner appeared before the Committee. The report was an update on the ICO’s investigation into the use of data analytics for political purposes, which started in May 2017. It states that the ICO “had little idea of what was to come. Eighteen months later, multiple jurisdictions are struggling to retain fundamental democratic principles in the face of opaque digital technologies”, and the report went on to reveal the extent of the illegal practices that took place during this time:
We have uncovered a disturbing disregard for voters’ personal privacy. Social media platforms, political parties, data brokers and credit reference agencies have started to question their own processes—sending ripples through the big data eco-system. We have used the full range of our investigative powers and where there have been breaches of the law, we have acted. We have issued monetary penalties and enforcement notices ordering companies to comply with the law. We have instigated criminal proceedings and referred issues to other regulators and law enforcement agencies as appropriate. And, where we have found no evidence of illegality, we have shared those findings openly. Our investigation uncovered significant issues, negligence and contraventions of the law.
54.This Chapter will build on data issues explored in our Interim Report, updating on progress where there has been resolution, and making recommendations to the Government, to ensure that such malpractice is tackled effectively in the future. As Elizabeth Denham told us when she gave evidence in November 2018, “This is a time for a pause to look at codes, to look at the practices of social media companies, to take action where they have broken the law”.
55.We shall also focus on the Facebook documents dated between 2011 and 2015, which were provided to a Californian court by Facebook, under seal, as part of a US app developer’s lawsuit. The Committee ordered the provision of these documents from an individual in the UK on 19 November 2018 and we published them, in part, on 5 December 2018. We took this unusual step because we believed this information to be in the public interest, including to regulators, which it proved to be.
56.The ICO wrote in its report of November 2018 that it is in the process of referring issues about Facebook’s targeting functions and techniques used “to monitor individuals’ browsing habits, interactions and behaviour across the internet and different devices” to the Irish Data Protection Commission, as the lead supervisory authority for Facebook under the General Data Protection Regulation (GDPR).
57.On 25 October 2018, the ICO imposed the maximum penalty possible at the time—£500,000—on Facebook under the UK’s previous data protection law (prior to the introduction, in May 2018, of the GDPR), for lack of transparency and security issues relating to the harvesting of data, in contravention of the first and seventh data protection principles of the Data Protection Act 1998. Facebook has since appealed against the fine on the grounds that the ICO had not found evidence that UK users’ personal data had actually been shared. However, the Information Commissioner told us that the ICO’s fine was not about whether UK users’ data was shared. Instead:
We fined Facebook because it allowed applications and application developers to harvest the personal information of its customers who had not given their informed consent—think of friends, and friends of friends—and then Facebook failed to keep the information safe. […] It is not a case of no harm, no foul. Companies are responsible for proactively protecting personal information and that’s been the case in the UK for thirty years. […] Facebook broke data protection law, and it is disingenuous for Facebook to compare that to email forwarding, because that is not what it is about; it is about the release of users’ profile information without their knowledge and consent.
58.Elizabeth Denham told the Committee that the ICO “found their business practices and the way applications interact with data on the platform to have contravened data protection law. That is a big statement and a big finding”. In oral evidence, Elizabeth Denham said that Facebook does not view the rulings from the federal privacy commissioner in Canada or the Irish Data Protection Commission as anything more than advice. She said that, from the evidence that Richard Allan, Vice President of Policy Solutions at Facebook, had given, she thought “that unless there is a legal order compelling a change in their business model and their practice, they are not going to do it”.
59.GDPR fines, introduced on 25 May 2018, are much higher than the £500,000 maximum specified in the Data Protection Act 1998. The new regulation includes provision for administrative fines of up to 4% of annual global turnover or €20 million, whichever is the greater. In the fourth quarter of 2018, Facebook’s revenue rose 30% from a year earlier to $16.9 billion and its profits increased by 61% to $6.9 billion, showing the scope for much greater fines in the future.
60.Our Interim Report described the “complex web of relationships” within what started out as the SCL (Strategic Communications Laboratories) group of companies, of which Cambridge Analytica was a part. The SCL Group went into administration in April 2018. The ICO’s latest report, published in November 2018, commented on the state of its investigation into Cambridge Analytica at that stage.
61.On 9 January 2019, SCL Elections Ltd was fined £15,000 for failing to comply with the enforcement notice issued by the ICO in May 2018, relating to David Carroll’s Subject Access Request. The company pleaded guilty, through its administrators, to breaching Section 47(1) of the Data Protection Act 1998 (again, the fine was under the old legislation, not under the GDPR). Hendon Magistrates’ Court also ordered the company to pay £6,000 costs and a victim surcharge of £170. In reaction, the Information Commissioner, Elizabeth Denham, made the following public statement:
This prosecution, the first against Cambridge Analytica, is a warning that there are consequences for ignoring the law. Wherever you live in the world, if your data is being processed by a UK company, UK data protection laws apply. Organisations that handle personal data must respect people’s legal privacy rights. Where that does not happen and companies ignore ICO enforcement notices, we will take action.
62.We were keen to know when, and which, people working at Facebook first knew about the GSR/Cambridge Analytica breach. The ICO confirmed, in correspondence with the Committee, that three “senior managers” were involved in email exchanges concerning the GSR breach earlier in 2015, before December 2015, when it was first reported by The Guardian. At the request of the ICO, we have agreed to keep the names confidential, but it would seem that this important information was not shared with the most senior executives at Facebook, leading us to ask why this was the case.
63.The scale and importance of the GSR/Cambridge Analytica breach was such that its occurrence should have been referred immediately to Mark Zuckerberg, Facebook’s CEO. The fact that it was not is evidence that Facebook did not treat the breach with the seriousness it merited. It was a profound failure of governance within Facebook that its CEO did not know, so the company now maintains, what was going on until the issue became public to us all in 2018. The incident displays the fundamental weakness of Facebook in managing its responsibilities to the people whose data is used for its own commercial interests.
64.When Richard Allan, Vice President of Policy Solutions at Facebook, gave evidence in November 2018, he told us that “our intention is that you should not be surprised by the way your data is used […] It is not a good outcome for us if you are”. Yet, time and again, this Committee and the general public have been surprised by the porous nature of Facebook data security protocols and the extent to which users’ personal data has been shared in the past and continues to be shared today. The scale of this data sharing risks being massively increased, given the news that, by early 2020, Facebook is planning to integrate the technical infrastructure of Messenger, Instagram and WhatsApp, which, between them, have more than 2.6 billion users.
65.The Federal Trade Commission Consent Decree of 2011 is an example of the way in which Facebook’s stated security protocols and its actual practices do not always align. In November 2009, Facebook users had a ‘central privacy page’, with the Facebook text stating: “Control who can see your profile and personal information”. A user’s profile and personal information might include: name; gender; email address; birthday; profile picture; hometown; relationship information; political and religious views; likes and interests; education and work; a Friends list; photos and videos; and messages.
66.In November 2011, the US Federal Trade Commission (FTC) made a complaint against Facebook on the basis that Facebook had, from May 2007 to July 2010, allowed external app developers unrestricted access to information about Facebook users’ personal profiles and related information, despite the fact that Facebook had informed users that platform apps “will access only the profile information these applications need to operate”. The FTC complaint lists several examples of Facebook making promises to its users that were not kept.
67.Under the settlement, Facebook agreed to obtain consent from users before sharing their data with third parties. The settlement also required Facebook to establish a “comprehensive privacy program” to protect users’ data and to have independent, third-party audits every two years for the following 20 years to certify that it has a privacy programme that meets or exceeds the requirement of the FTC order.
68.When Richard Allan was asked at what point Facebook had made such changes to its own systems, to prevent developers from receiving information (which resulted in circumventing Facebook users’ own privacy settings), he replied that the change had happened in 2014:
The FTC objected to the idea that data may have been accessed from Facebook without consent and without permission. We were confident that the controls we implemented constituted consent and permission—others would contest that, but we believed we had controls in place that did that and that covered us for that period up to 2014.
Richard Allan was referring here to the change from Version 1 of Facebook’s Application Programming Interface (API) to its more restrictive Version 2.
69.In reply to a question as to whether CEO Mark Zuckerberg knew that Facebook continued to allow developers access to that information, after the agreement, Richard Allan replied that Mr Zuckerberg and “all of us” knew that the platform continued to allow access to information. As to whether that was in violation of the FTC Consent Decree (and over two years after Facebook had agreed to it), he told us that “as long as we had the correct controls in place, that was not seen as being anything that was inconsistent with the FTC consent order”.
70.Richard Allan was referring to Count 1 of the Federal Trade Commission’s complaint of 2011, which states that Facebook’s claim that the correct controls were in place was misleading:
Facebook has represented, expressly or by implication, that, through their Profile Privacy Settings, users can restrict access to their profile information to specific groups, such as “Only Friends” or “Friends of Friends.” In truth and in fact, in many instances, users could not restrict access to their profile information to specific groups, such as “Only Friends” or “Friends of Friends” through their Profile Privacy Settings. Instead, such information could be accessed by Platform Applications that their Friends used.
71.Richard Allan’s argument was that, although Facebook continued to allow the same data access—highlighted in the first count of the FTC’s complaint, and of which the CEO, Mark Zuckerberg, was also aware—this was acceptable because Facebook had supposedly put “controls” in place that constituted consent and permission.
72.Ashkan Soltani, an independent researcher and consultant, was a primary technologist at the Federal Trade Commission who worked on the Facebook investigation from 2010 to 2011, and who became the FTC’s Chief Technologist in 2014. Before our Committee, he questioned Richard Allan’s evidence:
Mr Allan corrected one of the comments from you all, specifically that apps in Version 1 of the API did not have unfiltered access to personal information. In fact, that is false. In the 2011 FTC settlement, the FTC alleged that if a user had an app installed, it had access to nearly all of the user’s profile information, even if that information was set to private. I think there is some sleight of hand with regards to V1, but this was early V1 and I believe it was only addressed after the settlement.
73.Mr Soltani clarified the timeline of events:
The timelines vary, but this—in my opinion—was V1, if they are considering the changes in 2014 as V2. In short, I found that time and time again Facebook allows developers to access personal information of users and their friends, in contrast to their privacy settings and their policy statements.
74.Richard Allan did not specify what controls had been put in place by Facebook, but they did not prevent app developers, who were not authorised by a user, from accessing data that the user had specified should not be shared (beyond a small group of friends on the privacy settings page). The FTC complaint took issue both with the fact that apps had unfettered access to users’ information, and with the fact that the privacy controls that Facebook represented as allowing users to control who saw their personal information were, in fact, inconsequential with regard to the information to which the apps had access.
75.There was public outcry in March 2018, when the Cambridge Analytica data scandal was revealed: the vast majority of Facebook users had had no idea that their data could be accessed by developers unknown to them, despite having set privacy settings specifically disallowing the practice. Richard Allan also admitted to us that people might indeed take issue with Facebook’s position: “we were confident that the controls we implemented constituted consent and permission—others would contest that”. He seemed to justify Facebook’s continued allowance of data access by app developers by stating that users had given their consent to this data access. The fact that Facebook continued to allow this access after the Consent Decree is not new information; the new information is the admission by Richard Allan that the CEO and senior management—“all of us”—knew that Facebook was continuing to allow the practice to occur, despite the public statements about its change of policy. That, people might well contest, constituted deceit, and we would agree with them. Ashkan Soltani told us that he believed that Facebook is in violation of the Consent Decree, and the FTC has publicly confirmed that it is investigating the company.
76.The Cambridge Analytica scandal was facilitated by Facebook’s policies. If Facebook had fully complied with the FTC settlement, it would not have happened. The FTC’s complaint of 2011 charged Facebook with failing to protect users’ data and with letting app developers gain as much access to user data as they liked, without restraint, and stated that Facebook had built its company in a way that made data abuses easy. When asked about Facebook’s failure to act on the FTC’s complaint, Elizabeth Denham, the Information Commissioner, told us: “I am very disappointed that Facebook, being such an innovative company, could not have put more focus, attention and resources into protecting people’s data”. We are equally disappointed.
78.The published ‘corrected memorandum of points and authorities to defendants’ special motions to strike’, filed by the complainant in the case, the US-based app developer Six4Three, describes the allegations against Facebook: that Facebook used its users’ data to persuade app developers to create platforms on its system, by promising access to users’ data, including access to the data of users’ friends. The case also alleges that developers that became successful were targeted and ordered to pay money to Facebook, and that, if apps became too successful, Facebook removed their access to data, thereby starving them of the information they needed to succeed. Six4Three lodged its original case in 2015, after Facebook removed developers’ access to friends’ data, including Six4Three’s own access.
79.The DCMS Committee took the unusual, but lawful, step of obtaining these documents, which date from between 2012 and 2014, even though they were sealed under a court order at the San Mateo Court, because we believed strongly that the documents related specifically to the serious issues of data privacy that we have been exploring for the past 18 months. The Committee received the documents after issuing an order for their delivery to Ted Kramer, the founder of Six4Three, while he was visiting London on a business trip in November 2018. Mr Kramer complied with the Committee’s order, rather than risk being found to be in contempt of Parliament. Since we published these sealed documents, another court has agreed, on 14 January 2019, to unseal 135 pages of internal Facebook memos, strategies and employee emails from between 2012 and 2014, connected with Facebook’s inappropriate profiting from business transactions with children. A New York Times investigation published in December 2018, based on internal Facebook documents, also revealed that the company had offered preferential access to users’ data to other major technology companies, including Microsoft, Amazon and Spotify.
80.We believed that our publishing the documents was in the public interest and would also be of interest to regulatory bodies, in particular the ICO and the FTC. In evidence, indeed, both the UK Information Commissioner and Ashkan Soltani, formerly of the FTC, said it would be. We published 250 pages of evidence selected from the documents on 5 December 2018 and at the same time as this Report’s publication, we shall be publishing more evidence. The documents highlight Facebook’s aggressive action against certain apps, including denying them access to data that they were originally promised. They highlight the link between friends’ data and the financial value of the developers’ relationship with Facebook. The main issues concern: ‘white lists’; the value of friends’ data; reciprocity; the sharing of data of users owning Android phones; and Facebook’s targeting of competition.
81.Facebook entered into ‘whitelisting agreements’ with certain companies, which meant that, after the platform changes in 2014/15, those companies maintained full access to friends’ data. It is not clear that there was any user consent for this, nor is it clear precisely how Facebook decided which companies should or should not be whitelisted.
82.When asked about user privacy settings and data access, Richard Allan consistently said that there were controls in place to limit data access, and that people were aware of how the data was being used. He said that Facebook was confident that the controls implemented constituted consent and permission. He did admit that “there are very valid questions about how well people understand the controls and whether they are too complex,” but said that privacy settings could not be overridden. Finally, he stated that: “Our intention is that you should not be surprised by the way your data is used. Our intention is that it is clear and that you are not surprised. It is not a good outcome for us if you are”.
83.Ashkan Soltani rejected this claim, saying that, up until 2012, platform controls did not exist, and privacy controls did not apply to apps: even if a user set their profile to private, installed apps could still access their information. After 2012, Facebook added platform controls and made privacy controls applicable to apps. However, ‘whitelisted’ apps could still access user data without permission and, according to Ashkan Soltani, had been able to access friends’ data for nearly a decade before that time. Apps were able to circumvent users’ privacy or platform settings and access friends’ information, even when the user had disabled the Platform. This was an example of Facebook’s business model driving privacy violations.
84.Expanding the whitelisting scheme resulted in a large number of companies striking special deals with Facebook. A November 2013 email discussion reveals that Facebook was managing 5,200 whitelisted apps, and the documents we received show that a number of well-known apps were among those whitelisted.
85.All whitelisted companies used a standard form agreement called a “Private Extended API Addendum,” which reads in part:
Access to the Private Extended APIs. Subject to the terms of the Agreement, FB may, in its sole discretion, make specific Private Extended APIs available to Developer for use in connection with Developer Applications. FB may terminate such access for convenience at any time. The Private Extended APIs and the Private Extended API Guidelines will be deemed to be a part of the Platform and the Platform Policies, respectively, for purposes of the Agreement…. ‘Private Extended APIs’ means a set of APIs and services provided by FB to Developer that enables Developer to retrieve data or functionality relating to Facebook that is not generally available under Platform, which may include persistent authentication, photo upload, video upload, messaging and phonebook connectivity.
86.From the documents, it is also clear that whitelisting had been under consideration for quite some time in the run-up to all these special permissions being granted. There was an internal Facebook discussion, for instance, about the whitelisting process in an email sent on 5 September 2013: “We need to build collective experience on how to review the access that’s been granted, and how to make decisions about keep/kill/contract”.
87.It is clear that increasing revenues from major app developers was one of the key drivers behind the policy changes made by Facebook. The idea of linking access to friends’ data to the financial value of the developers’ relationship with Facebook was a recurring feature of the documents.
88.The FTC had found that Facebook misrepresented its claims regarding its app oversight programme, specifically the ‘verified apps programme’, a review supposedly designed to give users additional assurances and help them identify trustworthy applications. The review was non-existent and there was no oversight of those apps. Some preinstalled apps were able to circumvent users’ privacy settings or platform settings, and to access friends’ information as well as users’ own information, such as birthdays and political affiliation, even when the user had disabled the platform. For example, Yelp and Rotten Tomatoes would automatically get access to users’ personal information.
89.Mr Soltani told the Committee:
In short, I found that time and time again Facebook allows developers to access personal information of users and their friends, in contrast to their privacy settings and their policy statements. This architecture means that if a bad actor gets a hold of these tokens […] there is very little the user can do to prevent their information from being accessed. Facebook prioritises these developers over their users.
90.As an example of the value Facebook’s customers placed on access to friends’ data, there is a long internal Facebook discussion in the documents we have published—again, dating back to 2013—about the Royal Bank of Canada’s ‘Neko’ spend and whether the bank should also be whitelisted. ‘Neko’ was Facebook’s internal name for its new mobile advertising product, Mobile App Install Ads.
91.In an email from Sachin Monga at Facebook to Jackie Chang at Facebook, on 20 August 2013, at 10.38am, the negative impact of the platform changes on Royal Bank of Canada was discussed: “Without the ability to access non-app friends, the Messages API becomes drastically less useful”.
92.In reply, minutes later, Jackie Chang wrote back:
What would be really helpful for us is if you can provide the below details first:
2/ did they sign an extended api agreement when you whitelisted them for this api?
3/ who internally gave you approval to extend them whitelist access? Can you send me email or permalink from the Platform Whitelist Approval Group.
4/ Is there budget tied specifically to this integration? How much? We need the above info foremost and we understand the context below.
93.The next email was from Sachin Monga to Jackie Chang, 10.58am, 20 August 2013:
Thanks for the quick response. Answers below:
2/ They did not sign an extended API agreement. Should they have? I didn’t know about this…
3/ Doug gave the approval…
4/ There is budget tied specifically to this app update (all mobile app install ads to existing RBC customers, via custom audiences). I believe it will be one of the biggest neko campaigns ever run in Canada.
94.The internal discussions about Royal Bank of Canada continued into the autumn, citing precedents Facebook had already used in its whitelisting extended access process. Simon Cross wrote to Jackie Chang, Sachin Monga and Bryan Hurren (Facebook) on 25 October 2013: “+ bryan who recently whitelisted Netflix for the messages API—he will have a better idea of what agreements we need to give them to access to this API”. On the same day, Bryan Hurren responded to Sachin Monga, Jackie Chang and Simon Cross: “From a PR perspective, the story is about the app, not the API, so the fact that it uses Titan isn’t a big deal. From a legal perspective, they need an ‘Extended API agreement’ (we used with Netflix) which governs use going forward and should provide us with the freedom to make the changes that Simon mentions below (without being too explicit)”. Jackie Chang then wrote to the Facebook group, on 28 October 2013, stating: “Bryan—can you take the lead on getting this agreement written up?”
95.These exchanges about just one major country customer, the Royal Bank of Canada, demonstrate the interlinkages between the value of access to friends’ data, advertising spending, and Facebook’s preferential whitelisting process, which we now consider further.
96.From the Six4Three case documents, it is clear that spending substantial sums with Facebook, as a condition of maintaining preferential access to personal data, was part and parcel of the company’s strategy of platform development as it embraced the mobile advertising world, and that this approach was driven from the highest level.
97.Included in the documents is an email from Mike Vernal, then Vice-President of Search, Local, and Developer Products at Facebook, to Mark Zuckerberg, Chris Daniels, Dan Rose and Douglas Purdy, dated 7 October 2012. It discusses the linking of data to revenue:
I’ve been thinking about platform business model a lot this weekend […] if we make it so devs can generate revenue for us in different ways, then it makes it more acceptable for us to charge them quite a bit more for using platform. The basic idea is that any other revenue you generate for us earns you a credit towards whatever fees you owe us for using platform. For most developers this would probably cover cost completely. So instead of everyone paying us directly, they’d just use our payments or ads products.
A basic model could be:
For the money that you owe, you can cover it in any of the following ways:
Or if the revenue we get from those doesn’t add up to more than the fees you owe us, then you just pay us the fee directly.
98.On 27 October 2012, Mark Zuckerberg sent an internal email to Sam Lessin, discussing linking data to revenue, highlighting the fact that users’ data was valuable and that he was sceptical about the risk of such data leaking from developer to developer, which is, of course, exactly what happened during the Cambridge Analytica scandal. The following quotation illustrates this:
There’s a big question on where we get the revenue from. Do we make it easy for devs to use our payments/ad network but not require them? Do we require them? Do we just charge a rev share directly and let devs who use them get a credit against what they owe us? It’s not at all clear to me here that we have a model that will actually make us the revenue we want at scale.
I’m getting more on board with locking down some parts of platform, including friends data and potentially email addresses for mobile apps.
I’m generally sceptical that there is as much data leak strategic risk as you think. I agree there is clear risk on the advertiser side, but I haven’t figured out how that connects to the rest of the platform. I think we leak info to developers, but I just can’t think of any instances where that data has leaked from developer to developer and caused a real issue for us. Do you have examples of this?
Without limiting distribution or access to friends who use this app, I don’t think we have any way to get developers to pay us at all besides offering payments and ad networks.
99.By the following year, Facebook’s new approach, accompanying the launch of Neko in the mobile advertising world, was clearly paying off handsomely. An email sent on 20 June 2013 from Sam Lessin to Deborah Liu, copying in Mike Vernal and Douglas Purdy, shows the rapid growth of revenues from Neko advertising: “The nekko [sic] growth is just freaking awesome. Completely exceeding my expectations re what is possible re ramping up paid products”.
100.By the autumn of 2013, at the latest, the link between substantial customer spending and preferential access to personal data was set in stone. The following internal Facebook email, from Konstantinos Papamiltiadis to Ime Archibong on 18 September 2013, discussed slides prepared for a talk to ‘DevOps’ the following day, highlighting the need for app developers to spend $250,000 a year on NEKO to maintain access to their current Facebook data: “Key points: 1/ Find out what other apps like Refresh are out that we don’t want to share data with and figure out if they spend on NEKO. Communicate in one-go to all apps that don’t spend that those permission will be revoked. Communicate to the rest that they need to spend on NEKO $250k a year to maintain access to the data”.
101.The Six4Three documents also show that Facebook considered not only hard cash, but also app developers’ property, such as trade names, as conditions of preferential access. For example, the term ‘Moments’ was already protected by Tinder. The following email, from 11 March 2015, highlights a discussion about giving Tinder whitelisted access to restricted APIs in return for Facebook using the term ‘Moments’:
I was not sure there was not a question about compensation, apologies; in my mind we have been working collaboratively with Sean and the team in good faith for the past 16 or so months. He’s a member of a trusted group of advisers for our platform (Developer Advisory Board) and based on our commitment to provide a great and safe experience for the Tinder users, we have developed two new APIs that effectively allow Tinder to maintain parity of the product in the new API world.
Another email from Konstantinos Papamiltiadis to Tinder sent the next day states: “We have been working with Sean and his team in true partnership spirit all this time, delivering value that we think is far greater than this trademark.” Facebook then launched a photo-sharing app under the name of ‘Moments’ in June 2015.
102.Under ‘Facebook’s targeting of competition’ at the end of this Chapter, we discuss more examples of Facebook’s use of its position in the social media world to enhance its dominance, and the issues this raises for the public, the industry and regulators alike.
103.‘Data reciprocity’ describes a two-way exchange of data between Facebook and apps: apps received data from Facebook, and in return had to allow their users to share data from the app back to Facebook. As Ashkan Soltani told us, Facebook’s business model is “to monetise data”, a model that evolved into Facebook, in effect, paying app developers to build apps with the personal information of Facebook’s users. To Mr Soltani, Facebook was and is still making the following invitation: “Developers, please come and spend your engineering hours and time in exchange for access to user data”.
104.Data reciprocity between Facebook and app developers was a central feature in the discussions about the re-launch of its platform. The following email exchange on 30 October 2012 highlights this issue:
Mike Vernal: On Data Reciprocity—in practice I think this will be one of those rights that we reserve. […] We’ll pay closest attention to strategic partners where we want to make sure the value exchange is reciprocal.
Greg Schechter: Seems like Data Reciprocity is going to require a new level of subjective evaluation of apps that our platform ops folks will need to step up to—evaluating whether the reciprocity UI/action importers are sufficiently reciprocal.
Mike Vernal: As many of you know, we’ve been having a series of conversations w/Mark for months about the Platform Business Model. […] We are going to require that all platform partners agree to data reciprocity. If you access a certain type of data (e.g. music listens), you must allow the user to publish back that same kind of data. Users must be able to easily turn this on both within your own app as well as from Facebook (via action importers).
105.Mark Zuckerberg wrote a long email entitled “Platform Model Thoughts”, sent on 19 November 2012 to senior executives Sheryl Sandberg, Mike Vernal, Douglas Purdy, Javier Olivan, Alex Schultz, Ed Baker, Chris Cox, Mike Schroepfer (who gave evidence to the DCMS Committee in April 2018), Dan Rose, Chris Daniels, David Ebersman, Vladimir Fedorov, Cory Ondrejka and Greg Badros. In it, he discusses the concept of reciprocity and data value, and also refers to “pulling non-app friends out of friends.get”, thereby prioritising developer access to data from users who had not granted data permission to the developer:
After thinking about platform business for a long time, I wanted to send out a note explaining where I’m leaning on this. This isn’t final and we’ll have a chance to discuss this in person before we decide this for sure, but since this is complex, I wanted to write out my thoughts. This is long, but hopefully helpful.
The quick summary is that I think we should go with full reciprocity and access to app friends for no charge. Full reciprocity means that apps are required to give any user who connects to FB a prominent option to share all of their social content within that service back […] to Facebook.
We’re trying to enable people to share everything they want, and to do it on Facebook. Sometimes the best way to enable people to share something is to have a developer build a special purpose app or network for that type of content and to make that app social by having Facebook plug into it. However, that may be good for the world but it’s not good for us unless people also share back to Facebook and that content increases the value of our network. So ultimately, I think the purpose of platform—even the read side—is to increase sharing back into Facebook.
It seems like we need some way to fast app switch to the FB app to show a dialog on our side that lets you select which of your friends you want to invite to an app. We need to make sure this experience actually is possible to build and make as good as we want, especially on iOS where we’re more constrained. We also need to figure out how we’re going to charge for it. I want to make sure this is explicitly tied to pulling non-app friends out of friends.get. (friends information)
What I’m assuming we’ll do here is have a few basic thresholds of API usage and once you pass a threshold you either need to pay us some fixed amount to get to the next threshold or you get rate limited at the lower threshold.
Overall, I feel good about this direction. The purpose of platform is to tie the universe of all the social apps together so we can enable a lot more sharing and still remain the central social hub. I think this finds the right balance between ubiquity, reciprocity and profit.

On 19 November 2012, Sheryl Sandberg replied to this email from Mark Zuckerberg, stating, “I like full reciprocity and this is the heart of why”.
106.The use of ‘reciprocity’ highlights Facebook’s outlook and business model. ‘Reciprocity’ agreements with certain apps enabled Facebook to gain as much information as possible, by requiring apps that used data from Facebook to allow their users to share their data back to Facebook (with scant regard for users’ privacy). Facebook’s business interests were, and are, based on balancing developers’ need for access to users’ data against a supposed protection of users’ privacy. By logging into an app such as Tinder, for instance, a user would not have realised that they were giving away all of their Facebook information. Facebook’s business interest is to gather as much information from users as possible, both directly and from app developers on the Platform.
107.Paul-Olivier Dehaye and Christopher Wylie described the way in which the Facebook app collects users’ data from other apps on Android phones. In fact, Facebook’s was one of millions of Android apps with potential access to users’ calls and messages in the Android operating system, dating back to 2008. The Six4Three documents reveal discussions about how Facebook could obtain such information. Facebook knew that the changes to its policies on the Android mobile phone system, which enabled the Facebook app to collect a record of calls and texts sent by the user, would be controversial. To mitigate any bad PR, Facebook planned to make it as hard as possible for users to know that this was one of the underlying features of the upgrade of their app.
108.The following email, sent on 4 February 2015 from Michael LeBeau to colleagues, highlights the change to ‘read call log’ permissions on Android and a disregard for users’ privacy:
Michael LeBeau – ‘Hi guys, as you know all the growth team is planning on shipping a permissions update on Android at the end of this month. They are going to include the ‘read call log’ permission, which will trigger the Android permissions dialog on update, requiring users to accept the update. They will then provide an in-app opt in NUX for a feature that lets you continuously upload your SMS and call log history to Facebook to be used for improving things like PYMK, coefficient calculation, feed ranking etc. This is a pretty high-risk thing to do from a PR perspective but it appears that the growth team will charge ahead and do it.’
109.On 25 March 2018, Facebook issued a statement about the logging of people’s call and text history, without their permission:
Call and text history logging is part of an opt-in feature for people using Messenger or Facebook Lite on Android. This helps you find and stay connected with the people you care about, and provides you with a better experience across Facebook. […] Contact importers are fairly common among social apps and services as a way to more easily find the people you want to connect with.
This positive spin on the logging of people’s data may have been accurate, but it failed to highlight the huge financial advantage to Facebook of collecting extensive data from its users’ daily interactions.
110.Onavo was an Israeli company that built a VPN app, which could hide users’ IP addresses so that third parties could not track the websites or apps used. Facebook bought Onavo in 2013, promoting it to customers “to keep you and your data safe when you go online by blocking potentially harmful websites and securing your personal information”. However, Facebook used Onavo to collect app usage data from its customers to assess not only how many people had downloaded apps, but how often they used them. This fact was included in the ‘Read More’ button in the App Store description of Onavo: “Onavo collects your mobile data traffic […] Because we’re part of Facebook, we also use this info to improve Facebook products and services, gain insights into the products and services people value, and build better experiences”.
111.This knowledge helped Facebook to decide which companies were performing well, and therefore gave it invaluable data on possible competitors. Facebook could then acquire those companies, or shut down those it judged to be a threat. Facebook acquired and used this app while giving the impression that it offered users greater privacy, when in fact it was using the app to spy on those very users.
112.The following slides are from a presentation, titled “Industry Update”, given on 26 April 2013, showing market analysis driven by Onavo data, comparing data about apps on users’ phones and mining that data to analyse Facebook’s competitors.
113.The following slide illustrates statistics collected from different popular apps, such as Vine, Twitter, Path and Tumblr:
“Industry Update” Facebook presentation, given on 26 April 2013
114.In August 2018, Apple found that Facebook had breached Apple’s terms and conditions and removed Onavo from the App Store, stating:
We work hard to protect user privacy and data security throughout the Apple ecosystem. With the latest update to our guidelines, we made it explicitly clear that apps should not collect information about which other apps are installed on a user’s device for the purposes of analytics or advertising/marketing and must make it clear what user data will be collected and how it will be used.
115.Since 2016, Facebook had undertaken similar practices with its ‘Facebook Research’ app, which violated Apple’s rules on the internal distribution of apps within an organisation. Facebook secretly paid users, aged between 13 and 25, up to $20 in gift cards per month to sell their phone and website activity, by installing the ‘Facebook Research’ app. Apple blocked the app in January 2019, when it realised that Facebook had violated its terms and conditions. The app continues, however, to run on Android. An Apple spokesman stated:
Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple. Any developer using their enterprise certificates to distribute apps to consumers will have their certificates revoked, which is what we did in this case to protect our users and their data.
116.Since its inception, Facebook has made multiple acquisitions, including Instagram in 2012 and WhatsApp in 2014. The Six4Three files show evidence of Facebook taking aggressive positions against certain apps, especially direct competitors, which resulted in their being denied access to data. This inevitably led to the failure of those businesses, including Six4Three. An email sent on 24 January 2013 from Justin Osofsky to Mike Vernal, Mark Zuckerberg, Kevin Systrom, Douglas Purdy and Dan Rose describes the targeting of Twitter’s Vine app, a direct competitor to Instagram, by shutting down its use of Facebook’s Friends API:
Justin Osofsky – Twitter launched Vine today which lets you shoot multiple short video segments to make one single, 6-second video. As part of their NUX, you can find friends via FB. Unless anyone raises objections, we will shut down their friends API access today. We’ve prepared reactive PR, and I will let Jana know our decision.
MZ – Yup, go for it.
117.Instagram Video, also created in 2013, enabled users to upload 15-second videos to their profile. From the email exchange above, it is clear that Mark Zuckerberg personally approved the decision to deny Vine access to data. In October 2016, Twitter announced that it would be discontinuing the Vine mobile app, in part because it could not grow its user base. On the same day that we published the Six4Three documents in December 2018, the co-founder of Vine, Rus Yusupov, tweeted “I remember that day like it was yesterday”.
118.We published a small proportion of the evidence obtained from the Six4Three court case. For over a year, on multiple occasions, Mark Zuckerberg has refused to give evidence to the DCMS Committee. Yet within four hours of the Six4Three evidence being published on the DCMS Committee website, he responded, with the following post on his Facebook page:
This week a British Parliament committee published some internal Facebook emails, which mostly include internal discussions leading up to changes we made to our developer platform to shut down abusive apps in 2014–15. Since these emails were only part of our discussions, I want to share some more context around the decisions that were made.
We launched the Facebook Platform in 2007 with the idea that more apps should be social. For example, your calendar should show your friends’ birthdays and your address book should have your friends’ photos. Many new companies and great experiences were built on this platform, but at the same time, some developers built shady apps that abused people’s data. In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the data apps could access.
This change meant that a lot of sketchy apps—like the quiz app that sold data to Cambridge Analytica—could no longer operate on our platform. Some of the developers whose sketchy apps were kicked off our platform sued us to reverse the change and give them more access to people’s data. We’re confident this change was the right thing to do and that we’ll win these lawsuits.
119.Among the “sketchy apps” to which Mr Zuckerberg referred—‘the quiz app that sold data to Cambridge Analytica’—was the ‘thisisyourdigitallife’ app, owned by GSR, which brings us back full circle to the starting point of our inquiries into the corporate methods and practices of Facebook.
120.One of the co-founders of GSR was Joseph Chancellor, who joined Facebook as a quantitative researcher on the User Experience Research team only two months after leaving GSR, and was employed there until recently. Facebook has provided us with no explanation for its recruitment of Mr Chancellor, after what Facebook now presents as a very serious breach of its terms and conditions. We believe the truth of the matter is contained in the evidence of Mr Chancellor’s co-founder, Aleksandr Kogan.
121.When Richard Allan was asked why Joseph Chancellor was employed by Facebook, he replied that “Mr. Chancellor, as I understand it, is somebody who had a track record as an academic working on relevant areas”. He acknowledged that Mr Chancellor was involved with the source of the breach to Cambridge Analytica, and that Facebook had taken no action against him.
122.When Aleksandr Kogan gave evidence to the DCMS Committee in April 2018, he was asked why Joseph Chancellor had been employed by Facebook, given the circumstances of his involvement with GSR. As to whether it seemed strange, Dr Kogan replied: “The reason I don’t think it’s odd is because, in my view, Facebook’s comments are PR crisis mode. I don’t believe they actually think these things, because I think they realise that the platform has been mined left and right by thousands of others”.
123.Dr Kogan’s interpretation of what happened seems to be supported by the Six4Three evidence. Facebook was violating user privacy because, from the beginning, its Platform had been designed in that way. Facebook fostered a tension between developer access to data and user privacy; it designed its Platform to apply privacy settings to Facebook apps only, while applying different and varying settings to data passed through the Platform’s APIs. For example, an email exchange between Mr Papamiltiadis and colleagues reveals that over 40,000 apps that had requested access to APIs were categorised according to whether they: might cause negative press; provided strategic value, by driving value to Facebook; were competitive, driving little value to Facebook; or would cause a business disruption. Mr Lessin responded that all lifestyle apps should have their access removed “because we are ultimately competitive with all of them”.
124.Another document supplied to the Committee by Six4Three shows concerns being raised by Facebook staff in 2011 about apps being removed from the platform that were not necessarily ‘spammy’ or ‘sketchy’, to use Mark Zuckerberg’s terminology. In an internal email, Mike Vernal of Facebook wrote: “It’s very, very bad when we disable a legitimate application. It erodes trust in the platform, because it makes developers think that their entire business could disappear at any second.” This is indeed the grievance that developers have tried to take up against Facebook, and it lies at the heart of Six4Three’s complaint against the company.
125.Facebook has continually hidden behind obfuscation. The sealed documents contained internal emails revealing the fact that Facebook’s profit comes before anything else. When such practices are exposed, Facebook “is always sorry, they are always on a journey”, as Charlie Angus MP (Vice-Chair of the Canadian Standing Committee on Access to Information, Privacy and Ethics, and member of the ‘International Grand Committee’) described the company. Facebook continues to choose profit over data security, taking risks in order to prioritise its aim of making money from user data.
126.Facebook has recently turned 15 years old, which makes it a relatively young company. What started as a seemingly innocuous way of sharing information with friends and family has turned into a global phenomenon, influencing political events. Theresa Hong, a member of the Trump digital election campaign, described ‘Project Alamo’, in which staff working for the then presidential candidate Donald Trump, Cambridge Analytica staff and Facebook staff all worked together with the Cambridge Analytica data sets, targeting specific states and specific voters. The project spent $85 million on Facebook adverts and Ms Hong said that “without Facebook we wouldn’t have won”.
127.Facebook has grown exponentially, buying up competitors such as WhatsApp and Instagram. As Charlie Angus said to Richard Allan: “Facebook has broken so much trust that to allow you to simply gobble up every form of competition is probably not in the public interest. […] The problem is the unprecedented economic control of every form of social discourse and communication by Facebook”.
128.In portraying itself as a free service, Facebook gives only half the story. As Ashkan Soltani, former Chief Technologist to the Federal Trade Commission of the United States of America, told us:
It is either free—there is an exchange of information that is non-monetary—or it is an exchange of personal information that is given to the platform, mined, and then resold to or reused by third-party developers to develop apps, or resold to advertisers to advertise with.
129.The documents that we received highlighted the fact that Facebook wanted to maximise revenues at all costs, and in doing so favoured those app developers who were willing to pay a lot of money for adverts, and targeted those apps that were in direct or potential future competition—and in certain notable instances acquired them.
130.Facebook’s behaviour clearly poses challenges for competition regulators. A joint HM Treasury and Department for Business, Energy and Industrial Strategy (BEIS) initiative has commissioned an expert panel, chaired by Professor Jason Furman, to consider the potential opportunities and challenges that the digital economy may pose for competition and pro-competition policy, and to make recommendations on any changes needed. The consultation period ended in early December 2018, and the panel is due to report in early 2019. We hope it will consider the evidence we have taken.
131.Since our publication of a selection of the Six4Three case documents, they have clearly been available to regulators, including the UK’s Information Commissioner and the US Federal Trade Commission to assist in their ongoing work. In March 2018, following the revelations over Cambridge Analytica, the FTC said it was launching a further investigation into Facebook’s data practices, the outcome of which—including the possibility of substantial fines—is still awaited.
132.When asked whether it was fair to think of Facebook as possibly falling foul of the US Racketeer Influenced and Corrupt Organizations Act (RICO), in its alleged conspiracy to damage others’ businesses, Richard Allan disagreed, describing the company as “a group of people who I have worked with closely over many years who want to build a successful business”. We received evidence that showed that Facebook not only targeted developers to increase revenue, but also sought to switch off apps where it considered them to be in competition with it, or to be operating in lucrative areas of its platform and vulnerable to takeover. Since 1970, the US has had this high-profile federal legislation on its statute book, and many individual states have since adopted similar laws. Originally aimed at tackling organised crime syndicates, RICO has also been used in business cases and has provisions for civil action for damages in RICO-covered offences.
133.We believe that Mark Zuckerberg’s response to the publication of the Six4Three evidence was, similarly, to use Dr Kogan’s description, “PR crisis mode”. Far from acting against “sketchy” or “abusive” apps—action for which it has produced no evidence at all—Facebook in fact worked with such apps as an intrinsic part of its business model. This explains why it recruited the people who created them, such as Joseph Chancellor. Nothing in Facebook’s actions supports the statements of Mark Zuckerberg who, we believe, lapsed into “PR crisis mode” when the company’s real business model was exposed. This is just one example of the bad faith which, we believe, justifies governments holding a business such as Facebook at arm’s length. It seems clear to us that Facebook acts only when serious breaches become public. This is what happened in 2015 and 2018.
134.Despite specific requests, Facebook has not provided us with one example of a business excluded from its platform because of serious data breaches. We believe that is because it only ever takes action when breaches become public. We consider that data transfer for value is Facebook’s business model and that Mark Zuckerberg’s statement that “we’ve never sold anyone’s data” is simply untrue.
135.The evidence that we obtained from the Six4Three court documents indicates that Facebook was willing to override its users’ privacy settings in order to transfer data to some app developers; to charge some developers high prices in advertising in exchange for that data; and to starve some developers—such as Six4Three—of that data, thereby causing them to lose their businesses. It seems clear that Facebook was, at the very least, in violation of its Federal Trade Commission settlement.
136.The Information Commissioner told the Committee that Facebook needs to significantly change its business model and its practices to maintain trust. From the documents we received from Six4Three, it is evident that Facebook intentionally and knowingly violated both data privacy and anti-competition laws. The ICO should carry out a detailed investigation into the practices of the Facebook Platform, its use of users’ and users’ friends’ data, and its use of ‘reciprocity’ in the sharing of data.
137.Ireland is the lead authority for Facebook, under GDPR, and we hope that these documents will provide useful evidence for Helen Dixon, the Irish Data Protection Commissioner, in her current investigations into the way in which Facebook targeted, monitored, and monetised its users.
138.In our Interim Report, we stated that the dominance of a handful of powerful tech companies has resulted in their behaving as if they were monopolies in their specific area, and that there are considerations around the data on which those services are based. Facebook, in particular, is unwilling to be accountable to regulators around the world. The Government should consider the impact of such monopolies on the political world and on democracy.
139.The Competition and Markets Authority (CMA) should conduct a comprehensive audit of the operation of the advertising market on social media. The Committee made this recommendation in its Interim Report, and we are pleased that it has also been supported in the independent Cairncross Report, commissioned by the Government and published in February 2019. Given the contents of the Six4Three documents that we have published, the CMA should also investigate whether Facebook specifically has been involved in any anti-competitive practices, and conduct a review of Facebook’s business practices towards other developers, to decide whether Facebook is unfairly using its dominant market position in social media to decide which businesses should succeed or fail. We hope that the Government will include these considerations when it reviews the UK’s competition powers in April 2019, as stated in the Government response to our Interim Report. Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law.
140.Our Interim Report highlighted the methods by which Arron Banks campaigned during the Referendum, which, in his own words, involved creating ‘bush fires’ and then “putting a big fan on and making the fan blow”. He described the issue of immigration as one that set “the wild fires burning”. Evidence we received indicated that data had been shared between the Leave.EU campaign—with its strapline of ‘leaving the EU out of the UK’—and Arron Banks’ insurance company, Eldon Insurance Ltd. When we asked Mr Banks whether staff from Eldon Insurance had also worked on campaigning for Leave.EU, he responded that such an allegation was “a flat lie”.
141.The allegation of the sharing of people’s data during a referendum campaign is a matter both for the Information Commissioner’s Office (as it relates to the alleged unauthorised sharing of data, in contravention of the Privacy and Electronic Communication Regulations 2003) and for the Electoral Commission (as it relates to alleged breaches of rules relating to spending limits during a referendum).
142.Since we published our Interim Report, the ICO has published the findings of its investigations into these issues. Its report states that Leave.EU and Eldon Insurance are closely linked, with the two organisations sharing at least three directors, as well as employees and projects. The ICO found evidence to show that Eldon Insurance customers’ personal data, in the form of email addresses, was accessed by staff working for Leave.EU and was used unlawfully to send political marketing messages.
143.The ICO’s report also highlighted its notices of intent to fine Leave.EU and Eldon Insurance.
144.The Information Commissioner gave evidence to us on the day of publication of her report, and described to us the “failure to keep separate the data of insurance clients of Eldon and marketing and messaging to potential supporters and voters and Leave.EU data. We have issued notices of intent under the electronic marketing regulation, but also our work on the data protection side, to look deeply into the policies or the disregard for separation of the data. That is going to be looked at through an audit”. The ICO issued a preliminary enforcement notice on Eldon Insurance, requiring immediate action to ensure that the company is compliant with data protection law.
145.On 1 February 2019, after considering the companies’ representations, the ICO issued the fines, confirming a change to one amount, with the other two remaining unchanged (the fine for Leave.EU’s marketing campaign was £15,000 less than the ICO’s original notice of intention). The Information Commissioner has also issued two assessment notices to Leave.EU and Eldon Insurance, to inform both organisations that they will be audited.
146.From the evidence we received, which has been supported by the findings of both the ICO and the Electoral Commission, it is clear that a porous relationship existed between Eldon Insurance and Leave.EU, with staff and data from one organisation augmenting the work of the other. There was no attempt to create a strict division between the two organisations, in breach of current laws. We look forward to hearing the findings of the ICO’s audits into the two organisations.
147.As set out in our Interim Report, Arron Banks and Andy Wigmore showed complete disregard and disdain for the parliamentary process when they appeared before us in June 2018. It is now evident that they gave misleading evidence to us, too, about the working relationship between Eldon Insurance and Leave.EU. They are individuals, clearly, who have less than a passing regard for the truth.
60 , ICO, 6 November 2018.
62 , ICO, November 2018, p9.
63 Same as above.
68 Article 83, Chapter VIII.
69 , The New York Times, 30 January 2019.
70 , DCMS Committee, Fifth Report of Session 2017–19, HC 363, 29 July 2018, para 124.
71 For more information about Professor David Carroll’s Subject Access Request, please see para 100 of , DCMS Committee, Fifth Report of Session 2017–19.
72 , A report to Parliament, Information Commissioner’s Office, 6 November 2018, page 8.
73 , ICO website, 9 January 2019.
74 Harry Davies had previously published the following article , in The Guardian, on 11 December 2015, which first revealed the harvesting of data from Facebook.
76 , Mike Isaac, The New York Times, 25 January 2019.
77 , DOCKET NO. C-4365, July 2012, p2.
78 As above, Para 30.
79 , FTC, 29 November 2011.
82 , DOCKET NO. C-4365, July 2012.
84 Facebook’s APIs were released as follows: V1.0 was introduced in April 2010, V2.0–2.12 was introduced in April 2014, and V3.0–3.2 was introduced in April 2018. V2 limited Facebook developers’ industrial-level access to users’ information, but the same day that Facebook launched V2, it announced its largest tracking and ad targeting initiative to date: the Facebook Audience Network, extending the company’s data profiling and ad-targeting from its own apps and services to the rest of the Internet.
86 Para 102 to 110 of , DCMS Committee, Fifth Report of Session 2017–19.
91 Facebook’s old model was taking a percentage of online payments made to Facebook apps (such as free-to-play games) that ran on desktop, but would not run on smartphones.
93 , Nathan Halverson, Reveal, 17 January 2019; , Nathan Halverson, 24 January 2019.
94 , Gabriel J. X. Dance, Michael LaForgia and Nicholas Confessore, The New York Times, 18 December 2018.
95 The specific terms will be explained below.
96 In , exhibits 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 94 and 95 include discussions on whitelisting businesses.
98 Same as above.
101 , Ashkan Soltani.
106 - not published yet
107 ‘Gks’ and ‘Sitevars’ refer to internal Facebook terms. (own emphasis added).
111 Our emphasis added.
116 , Facebook newsroom, 15 June 2015.
120 Our emphasis added.
121 Our emphasis added.
122 (Our emphasis added).
125 As of October 2017, there were 3.3 million apps,
126 Footnote needed (our emphasis added).
128 Onavo promotional information.
129 , Rachel Sandler, Business Insider, 14 February 2018 (our emphasis added)
130 , Taylor Hatmaker, TechCrunch.com, 22 August 2018.
131 , appleinsider, 22 August 2018.
132 , TechCrunch.com, 29 January 2019.
133 , Charlotte Henry, The Mac Observer, 30 January 2019.
134 A NUX (‘new user experience’) is the introductory flow shown to users when they first use an app or feature.
135 (our emphasis added).
136 , Rachel Kraus, MashableUK, 5 December 2018.
137 Same as above.
138 Mark Zuckerberg post on Facebook, around 6.30pm, accessed 7.40pm, 5 December 2018.
140 Dr Kogan’s evidence, requoted by Ian Lucas MP, in the ‘International Grand Committee’ oral evidence session,
144, 21 August 2017.
150 , accessed 29 November 2018.
151 , DCMS Committee, Fifth Report of Session 2017–19, HC 363, paras 151 to 159.
153 As above, para 155.
154 , ICO, 6 November 2018.
155 As above, p45.
156 , ICO, 6 November 2018, p47.
157 As above, p9.
159 ICO Report, p47.
160 , ICO, 1 February 2019.
Published: 18 February 2019