92.Arguably more invasive than obviously false information is the relentless targeting of hyper-partisan views, which play to the fears and the prejudices of people, in order to alter their voting plans. This targeting formed the basis of the revelations of March 2018, which brought to the general public’s attention the facts about how much of their own personal data is in the public domain, unprotected, and available for use by different players. Some of the events surrounding Cambridge Analytica and its use of Facebook data had been revealed at the end of 2015, when Harry Davies published an article in The Guardian, following investigations lasting around a year, and wrote: “a little known data company [Cambridge Analytica] […] paid researchers at Cambridge University to gather detailed psychological profiles about the US electorate using a massive pool of mainly unwitting US Facebook users built with an online survey”.
93.Based on our knowledge of this article, we had explored the general issues surrounding the manipulation of data when we questioned Facebook in February 2018. However, it was in March 2018 that facts about this case became better known across the world, including how people’s data was used to influence election campaigning, in the US and the UK, through the work of Carole Cadwalladr, at The Observer, and the whistle-blower Christopher Wylie, a former employee of SCL Group and Cambridge Analytica. Shortly after going public with his allegations, Christopher Wylie gave evidence to the Committee. This chapter will focus on the events that highlighted the extent of the misuse of data, involving various organisations including Facebook, Global Science Research (GSR), Cambridge Analytica, and Aggregate IQ (AIQ), and the alleged sharing of data in the EU Referendum. We received written and oral evidence from many of those intimately involved in these revelations. Issues relating to these companies and political campaigning are further examined in Chapter 4, as well as evidence regarding SCL’s involvement in overseas elections in Chapter 6.
94.Cambridge Analytica was founded in 2012, with backing from the US hedge fund billionaire and Donald Trump donor, Robert Mercer, who became the majority
shareholder. He was the largest donor to the super political action committee (PAC) that supported the presidential campaigns of Ted Cruz and Donald Trump in the 2016 presidential election. Christopher Wylie argued that the funding of Cambridge Analytica enabled Mr Mercer to benefit from political campaigns that he supported, without directly spending money on them, thereby bypassing electoral finance laws: “[Robert Mercer] can put in $15 million to create something and then only charge $50,000 for it. It would have been physically impossible to get the same value and level of service and data for that amount of money in any other way”.
95.Cambridge Analytica was born out of the already established SCL consultancy, which had engaged in political campaigns around the world, using specialist communications techniques previously developed by the military to combat terror organisations, disrupt enemy intelligence, and give on-the-ground support in war zones. Cambridge Analytica’s primary purpose would instead be to focus on data targeting and communications campaigns for carefully selected Republican Party candidates in the United States of America.
96.Steve Bannon served as White House chief strategist at the start of President Donald Trump’s term, having previously been chief executive of President Trump’s general election campaign. He was the executive chairman of Breitbart News, a website he described as ‘the platform of the alt-right,’ and was the former Vice President of Cambridge Analytica. A Cambridge Analytica invoice to UKIP was billed to the same address as Steve Bannon’s company, Glittering Steel. The Committee was also told that Steve Bannon introduced Cambridge Analytica to Arron Banks and to Leave.EU.
97.We heard evidence from Alexander Nix, in February 2018, before the Observer and Guardian revelations in March 2018, and before the company and its associated company had gone into administration. Alexander Nix described the micro-targeting business of Cambridge Analytica:
We are trying to make sure that voters receive messages on the issues and policies that they care most about, and we are trying to make sure that they are not bombarded with irrelevant materials. That can only be good. That can only be good for politics, it can only be good for democracy and it can be good in the wider realms of communication and advertising.
98.The use of data analytics, based on the psychological profile of the audience, was at the heart of the work of Cambridge Analytica, “presenting a fact that is underpinned by an emotion”, as described by Mr. Nix. In order to match the right type of message to voters, Cambridge Analytica needed information about voters, such as what merchandise they bought, what media they read, what cars they drove. Mr. Nix told us that “we are able to match these data with first-party research, being large, quantitative research instruments, not dissimilar to a poll. We can go out and ask audiences about their preferences […] indeed we can also start to probe questions about personality and other drivers that might be relevant to understanding their behaviour and purchasing decisions”. Cambridge Analytica used ‘OCEAN psychological analysis’ to identify issues people might support and how to position the arguments to them. OCEAN categorises people based on their ‘Openness’, ‘Conscientiousness’, ‘Extraversion’, ‘Agreeableness’ and ‘Neuroticism’. As Alexander Nix explained in his talk at the 2016 Concordia Annual Summit, entitled ‘The Power of Big Data and Psychographics’, this approach helps you, for example, to decide how to persuade American voters on the importance of protection of the second amendment, which guarantees the right to keep and bear arms. In the example Mr Nix showed, you might play on the fears of someone who could be frightened into believing that they needed the right to have a gun to protect their home from intruders.
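The kind of trait-matched targeting Mr Nix described can be shown in outline. The sketch below is hypothetical: the trait scores, the selection of the dominant trait, and the message variants are all invented for illustration, but it captures the basic logic of pairing an OCEAN profile with a message framing, such as a fear-based framing for a high-neuroticism voter in the second-amendment example.

```python
# Hypothetical illustration of OCEAN-based message selection.
# Trait scores are assumed to lie in [0, 1]; all names and messages are invented.

OCEAN = ("openness", "conscientiousness", "extraversion",
         "agreeableness", "neuroticism")

# One invented advert variant per trait, e.g. a fear-based framing
# aimed at a voter scoring highly on neuroticism.
MESSAGES = {
    "openness": "New ideas for your community",
    "conscientiousness": "A plan that protects your family's future",
    "extraversion": "Join your neighbours at the rally",
    "agreeableness": "Standing together for what is right",
    "neuroticism": "Keep your home safe: protect your rights",
}

def pick_message(scores):
    """Return the advert variant matching the voter's strongest trait."""
    dominant = max(OCEAN, key=lambda trait: scores[trait])
    return dominant, MESSAGES[dominant]

# An invented voter profile whose strongest trait is neuroticism.
voter = {"openness": 0.31, "conscientiousness": 0.45,
         "extraversion": 0.22, "agreeableness": 0.40, "neuroticism": 0.78}
trait, message = pick_message(voter)
```

In practice the segmentation would be far finer than a single dominant trait, but the principle, that the profile determines which framing of the same issue a voter sees, is the one Mr Nix described.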
99.When asked where the data used by Cambridge Analytica came from, Alexander Nix told us: “We do not work with Facebook data, and we do not have Facebook data. We do use Facebook as a platform to advertise, as do all brands and most agencies, or all agencies, I should say. We use Facebook as a means to gather data. We roll out surveys on Facebook that the public can engage with if they elect to”. When asked whether Cambridge Analytica was able to use information on users’ Facebook profile when they complete surveys, Mr. Nix replied, “No, absolutely not. Absolutely not”.
100.Professor David Carroll, a US citizen, made a Subject Access Request (SAR) to Cambridge Analytica in January 2017, under the Data Protection Act 1998, because he believed that his data was being processed in the UK. He told us “there was no indication of where they obtained the data. […] We should be able to know where they got the data, how they processed it, what they used it for, who they shared it with and also whether we have a right to opt out of it and have them delete the data and stop processing it in the future”. The ICO’s investigation update of 11 July 2018 described Professor Carroll’s case as “a specific example of Cambridge Analytica/SCL’s poor practice with regard to data protection law”.
101.The ICO served an Enforcement Notice on SCL Elections Ltd on 4 May 2018, ordering it to comply with the terms of the SAR, by providing copies of all personal information that SCL held on Professor Carroll. However, the terms of the Enforcement Notice were not complied with by the deadline of 3 June 2018, and the ICO is now considering criminal action against Cambridge Analytica and SCL Elections Ltd.
102.The Facebook data breach in 2014, and the role of Cambridge Analytica in acquiring this data, have been the subject of intense scrutiny. Ultimately the data breach originated at the source of the data, at Facebook. ‘Friends permissions’ were a set of permissions available on Facebook between 2010 and 2014, which allowed developers to access data relating to users’ friends, without the knowledge or express consent of those friends. One such developer, Aleksandr Kogan, an American citizen who had been born in the former Soviet Union, was a Research Associate and University Lecturer at the University of Cambridge in the Department of Psychology. Dr Kogan began collaborating “directly” with Facebook in 2013, and he told us that they “provided me with several macro-level datasets on friendship connections and emoticon usage.”
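The scale of exposure under ‘friends permissions’ follows from simple multiplication: each person who installed an app could expose their entire friends list. In the sketch below, the figure of roughly 270,000 installers is the number Facebook later reported for the GSR App; the average friend count and the overlap between friends lists are assumptions chosen for illustration, to show how a total of the order of the reported 87 million affected profiles arises.

```python
# Back-of-the-envelope sketch of the 'friends permissions' multiplier.
# 'installers' is the figure Facebook reported for the GSR App; the average
# friend count and the overlap between friends lists are assumptions.

installers = 270_000     # users who installed the app themselves
avg_friends = 340        # assumed average number of friends per installer
overlap_pct = 5          # assumed % of friend records duplicated across installers

raw_records = installers * avg_friends
unique_profiles = raw_records * (100 - overlap_pct) // 100

print(f"{raw_records:,} friend records, ~{unique_profiles:,} unique profiles")
```

The point of the arithmetic is that the consent of a few hundred thousand people was leveraged into data on tens of millions who consented to nothing.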
103.Professor Kogan set up his own business, Global Science Research (GSR), in the spring of 2014, and developed an App, called the GSR App, which collected data from users, at an individual level. It was at around this time as well that Dr Kogan was in discussions about working on some projects with SCL Elections and Cambridge Analytica, to see whether his data collection and analysis methods could help the audience targeting of digital campaigns. Professor Kogan explained that he did not sell the GSR App itself as “it is not technically challenging in any way. Facebook explains how to do it, so there is great documentation on this”. What was valuable was the data. The aim was to recruit around 200,000 people who could earn money by completing an online survey. Recruits had to download the App before they could collect payment. The App would download some information about the user and their friends. Each person was paid $3 or $4, which totalled $600,000 to $800,000 across all participants. In this case SCL paid that amount, and then returned to get predictions about people’s personalities, for which they paid GSR £230,000. In the latter part of 2014, after the GSR App data collection was complete, Professor Kogan revised the application to become an interactive personality “quiz” and renamed the App “thisisyourdigitallife.”
104.The exact nature of Dr Kogan’s work on this project is set out in the contract he signed with SCL, on 4 June 2014, along with his business partner, Joseph Chancellor, who was later hired to work for Facebook. Alexander Nix also signed this contract on behalf of SCL Elections. In the ‘Project and Specification’ schedule of the contract it states that ‘After data is collected, models are built using psychometric techniques (e.g. factor analysis, dimensional scaling etc) which uses Facebook likes to predict people’s personality scores. These models are validity tested on users who were not part of the training sample. Trait predictions based on Facebook likes are at near test-retest levels and have been compared to the predictions that romantic partners, family members and friends make about their traits. In all previous cases the computer-generated scores performed the best. Thus, the computer-generated scores can be more accurate than even the knowledge of very close friends and family members.’
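The modelling and validation approach set out in the contract, building predictors of personality from Facebook likes and testing them on users outside the training sample, can be illustrated in miniature. The toy data, the invented ‘trait pages’, and the crude like-counting predictor below are assumptions for illustration only; the actual work used psychometric techniques such as factor analysis on far larger datasets.

```python
# Toy sketch: predict a single personality trait from likes, then check the
# predictor on held-out users, echoing the contract's 'validity tested on
# users who were not part of the training sample'. All data is invented.

TRAIT_PAGES = {"page_a", "page_b", "page_c"}   # pages assumed to signal the trait

def predict_trait(likes):
    """Predicted trait score: share of a user's likes on trait-linked pages."""
    if not likes:
        return 0.0
    return len(likes & TRAIT_PAGES) / len(likes)

# In the real pipeline a training sample would be used to choose TRAIT_PAGES
# (e.g. via factor analysis); here they are simply fixed.
held_out = [
    ({"page_a", "page_b", "page_x"}, 0.7),   # (likes, measured trait score)
    ({"page_x", "page_y"}, 0.1),
    ({"page_a", "page_c"}, 0.9),
]

# Validity check on users not in the training sample: mean absolute error.
errors = [abs(predict_trait(likes) - truth) for likes, truth in held_out]
mae = sum(errors) / len(errors)
```

The contract’s claim was that, at scale, such like-based predictions approached test-retest reliability and outperformed the judgements of friends and family; the sketch only shows the shape of the train-then-validate procedure, not that result.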
105.Furthermore, Dr Kogan and SCL knew that ‘scraping’ Facebook user data in this way was in breach of the company’s then recently revised terms of service. Instead, the work was carried out under the terms of an agreement GSR had with Facebook, which predated this change. The contract states: “GSR’s method relies on a pre-existing application functioning under Facebook’s old terms of service. New applications are not able to access friend networks and no other psychometric profiling applications exist under the old Facebook terms.”
106.The purpose of the project, however, was not to carry out this testing as part of an experiment into the predictive insights about an individual provided by Facebook likes. Rather, data would be scraped to order, to support political campaigns in the eleven states in the USA in which Cambridge Analytica was involved in 2014. These were Arkansas, Colorado, Florida, Iowa, Louisiana, Nevada, New Hampshire, North Carolina, Oregon, South Carolina and West Virginia. Dr Kogan and his team were required under the contract to provide SCL with data sets that matched predictive personality scores, including someone’s likely political interests, to named individuals on the electoral register in those states.
107.When Dr Kogan gave evidence to us, he stated that he believed using Facebook likes to predict someone’s personality and interests was not particularly accurate. However, judging from the contract he signed with SCL in June 2014, he certainly thought it was at the time. Furthermore, Dr Kogan’s colleague at the Cambridge Psychometrics Centre, Michal Kosinski, co-authored an academic paper called ‘Tracking the Digital Footprints of Personality’, published in December 2014, in which he re-states the case for the effectiveness of assessing personality from Facebook likes. This article claims that “Facebook likes are highly predictive of personality and number[s] of other psychodemographic traits, such as age, gender, intelligence, political and religious views, and sexual orientation”. The article goes on, rightly, to raise the ethical concerns that should exist in relation to this approach, stating that:
The results presented here may have considerable negative implications because it can easily be applied to large numbers of people without obtaining their individual consent and without them noticing. Commercial companies, governmental institutions, or even one’s Facebook friends could use software to infer personality (and other attributes, such as intelligence or sexual orientation) that an individual may not have intended to share. There is a risk that the growing awareness of such digital exposure may decrease their trust in digital technologies, or even completely deter them from them.
108.When Alexander Nix first gave evidence to us in February 2018, he denied that Dr Kogan and GSR had supplied Cambridge Analytica with any data or information, and claimed that his datasets were not based on information received from GSR. We received evidence from both Dr Kogan and Mr Wylie that conflicted with Mr Nix’s evidence; indeed, Mr Wylie described the data obtained from Dr Kogan’s GSR App as the foundation dataset of the company. The App collected data on up to 87 million users, over 1 million of whom were based in the UK. We believe that Dr Kogan also knew perfectly well what he was doing, and that he was in breach of Facebook’s own codes of conduct (which he told us he did not consider to be operative in practice, as they were never enforced).
109.During his second appearance, Mr Nix admitted that “I accept that some of my answers could have been clearer, but I assure you that I did not intend to mislead you”. He went on to explain that Cambridge Analytica had not at that time been in possession of data from GSR, due to the fact that they had “deleted all such data licensed in good faith from GSR under that research contract”. This suggests that Mr. Nix, who by his own admission to the Committee tells lies, was not telling the whole truth when he gave us his previous version of events, in February 2018.
110.In August 2014 Dr Kogan worked with SCL to provide data on individual voters to support US candidates being promoted by the John Bolton Super Pac in the mid-term elections in November of that year. Psychographic profiling was used to micro-target adverts at voters across five distinct personality groups. After the campaign, according to an SCL presentation document seen by the Committee, the company claimed that there was a 39% increase in awareness of the issues featured in the campaign’s advertising amongst those who had received targeted messages. In September 2014, SCL also signed a contract to work with the American Conservative advocacy organisation, ‘For America’. Again, they used behavioural micro-targeting to support their campaign messages ahead of the mid-term elections that year. SCL would later claim that the 1.5million advertising impressions they generated through their campaign led to a 30% uplift in voter turnout, against the predicted turnout, for the targeted groups.
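The claimed results above are uplift figures, i.e. the percentage by which an observed rate exceeds a predicted baseline. The function below is a generic sketch of that calculation; the example numbers are invented and are not SCL’s underlying data, which the Committee has not seen.

```python
def uplift(observed, predicted):
    """Percentage uplift of an observed rate over a predicted baseline."""
    return (observed - predicted) / predicted * 100

# Illustrative only: a 52% observed turnout against a 40% predicted baseline
# is a 30% uplift, the kind of figure SCL claimed for its targeted groups.
print(round(uplift(0.52, 0.40)))
```

Note that such claims depend entirely on the quality of the predicted baseline, which the SCL presentation documents do not explain.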
111.Sandy Parakilas worked for Facebook for 16 months in 2011 and 2012, and told us that “once the data passed from Facebook servers to the developer, Facebook lost insight into what was being done with the data and lost control over the data”. There was no proper audit trail of where the data went, and during Mr Parakilas’ 16 months of working there, he did not remember one audit of a developer’s storage. This is a fundamental flaw in Facebook’s model of holding data; Facebook cannot assure its users that their data is not being used by third parties, nor explain the purposes for which that data may be being used.
112.Once the data had left Facebook, that data, or its derivatives, could be copied multiple times. Chris Vickery, Director of Cyber Risk Research at UpGuard, described to us the ‘sticky’ nature of data: “In this type of industry, data does not just go away. It does not just disappear. It is sticky. It gathers up. The good stuff stays. Even the bad stuff stays, but it is not used. It is held in an archive somewhere. Nothing disappears”.
113.Furthermore, that data was specific and personal to each person with a Facebook account, including their names, their email addresses, and could even include private messages. Tristan Harris, from the Center for Humane Technology, told us that the entire premise of Facebook’s App platform was exactly this—to enable third-party developers to have access to people’s friends’ data: “The premise of the app platform was to enable as many developers as possible to use that data in creative ways, to build creative new social applications on behalf of Facebook”.
114.Facebook claimed that Dr Kogan had violated his agreement to use the data solely for academic purposes. On Friday 16 March 2018, Facebook suspended Dr Kogan from the platform, issued a statement saying that he “lied” to the company, and characterised his activities as “a scam—and a fraud”. Facebook also suspended Christopher Wylie at the same time. On Wednesday 21 March 2018, Mark Zuckerberg called Dr Kogan’s actions a “breach of trust”. However, when Facebook gave evidence to us in February 2018, it failed to disclose the existence of this “breach of trust” and its implications.
115.In line with its commitment to update our Committee on its ongoing investigation, the ICO decided to publish a Notice of Intent to issue a monetary penalty to Facebook of £500,000, “for lack of transparency and security issues relating to the harvesting of data constituting breaches of the first and seventh data protection principles”, under the Data Protection Act 1998. It should be noted that, had the new Data Protection Act 2018 been in place when the ICO started its investigation into Facebook, the ICO could have imposed a fine of up to 4% of Facebook’s annual turnover of $7.87 billion, which would have totalled around £315 million.
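The gap between the two regimes can be checked directly: the quoted £315 million corresponds roughly to 4% of the $7.87 billion turnover, with currency conversion left aside. The sketch below assumes the 2018 regime’s maximum of 4% of global annual turnover (or €20 million, if higher) against the fixed £500,000 cap under the 1998 Act.

```python
# Sketch of the maximum penalties under the two regimes. Currency conversion
# is left aside: the 4% figure is computed directly on the $7.87bn turnover.

turnover = 7.87e9        # Facebook annual turnover cited by the ICO
dpa_1998_cap = 500_000   # fixed maximum penalty under the Data Protection Act 1998
gdpr_cap = 0.04 * turnover   # 4%-of-turnover ceiling under the 2018 regime

print(f"1998 Act cap: 500,000; 2018 regime cap: {gdpr_cap:,.0f}")
print(f"ratio: {gdpr_cap / dpa_1998_cap:.0f}x")
```

The roughly 600-fold difference is why the timing of the investigation, relative to the new Act, mattered so much.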
116.As recently as 20 July 2018, Facebook suspended another company that it believes harvested data from its site. Crimson Hexagon is based in Boston, US. Facebook is investigating whether this analytics firm’s contracts with the US government and a Russian not-for-profit organisation with ties to the Kremlin violated Facebook’s policies. For Crimson Hexagon to share such data with government agencies would be incredibly useful to those agencies, as it would show how large groups of people were feeling at a particular time, and could be used during political campaigns. Again, the same opportunities that Facebook unwittingly gave for its users’ data to be shared with Cambridge Analytica, via GSR, were being given, up until a few days ago, to Crimson Hexagon, despite Facebook’s reassurances that it was tightening its policies.
117.Jeff Silvester is one of the owners of the Canadian digital advertising, web and software development company Aggregate IQ (AIQ), which was incorporated in 2013. Mr Silvester gave evidence to us in May 2018 and explained that AIQ’s first work for SCL was to “create a political customer relationship management software tool” for election campaigning in Trinidad and Tobago in 2014. From that work, AIQ then started developing the Ripon tool, software that was commissioned and would be owned by SCL:
The Ripon tool has been described in a lot of different ways. The part that we have done was a political customer relationship management tool focused on the US market. This was software that would help with people going door to door. There was a tool in there that you could do phone banking so you could call people and get their opinions on things and keep track of all that sort of information.
118.Christopher Wylie gave us a different version of Ripon: “A lot of the papers that eventually became the foundation of the methods that were used on the Ripon project came out of research that was being done at the University of Cambridge, some of which was funded in part by DARPA, which is the US military’s research agency”. Mr Wylie went on to explain that Ripon was the software that utilised the algorithms from the Facebook data.
119.In its interim report published in July 2018, the ICO confirmed that AIQ had access to the personal data of UK voters, given by the Vote Leave campaign. The ICO is in the process of establishing from where AIQ accessed the personal data, and whether AIQ still holds that data. Furthermore, “we have however established, following a separate report, that [AIQ] hold UK data which they should not continue to hold”. In this regard, the ICO is working with the federal Office of the Privacy Commissioner and the Office of the Information and Privacy Commissioner, British Columbia.
120.In the files presented to the Committee by Chris Vickery, we have also found evidence that AIQ used tools that could scrape user profile data from LinkedIn. The App mimics human online behaviour, searching LinkedIn user profiles and scraping their contacts, along with accompanying information such as users’ place of work, location and job title.
121.Data privacy concerns have also been raised about another campaign tool used, but not developed, by AIQ. A company called uCampaign has a mobile App that applies gamification strategies to political campaigns. Users can win points for campaign activity, such as sending text messages and emails to their contacts and friends. The App was used in Donald Trump’s presidential campaign, and by Vote Leave during the Brexit Referendum.
122.The developer of the uCampaign app, Vladyslav Seryakov, is an Eastern Ukrainian military veteran who trained in computer programming at two elite Soviet universities in the late 1980s. The main investor in uCampaign is the American hedge fund magnate Sean Fieler, who is a close associate of the billionaire backer of SCL and Cambridge Analytica, Robert Mercer. An article published by Business Insider on 7 November 2016 states:
If users download the App and agree to share their address books, including phone numbers and emails, the App then shoots the data [to] a third-party vendor, which looks for matches to existing voter file information that could give clues as to what may motivate that specific voter. Thomas Peters, whose company uCampaign created Trump’s app, said the App is “going absolutely granular”, and will—with permission—send different A/B tested messages to users’ contacts based on existing information.
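Matching uploaded address books against an existing voter file, as described above, is typically done by comparing normalised or hashed identifiers. The sketch below is a generic, hypothetical illustration using hashed email addresses; it is not uCampaign’s or its vendor’s actual implementation, which has not been published, and the voter-file fields are invented.

```python
import hashlib

def email_key(email):
    """Normalise then hash an email so files can be matched without raw values."""
    normalised = email.strip().lower()
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# Hypothetical voter file keyed by hashed email, with profile hints attached.
voter_file = {
    email_key("jane@example.com"): {"district": "OH-12", "segment": "undecided"},
    email_key("sam@example.com"): {"district": "FL-27", "segment": "base"},
}

def match_contacts(address_book):
    """Return voter-file records matching a user's uploaded contacts."""
    return [voter_file[k] for k in map(email_key, address_book) if k in voter_file]

# One contact matches despite differing case and whitespace; the other does not.
matches = match_contacts(["Jane@Example.com ", "nobody@nowhere.org"])
```

Once a contact is matched, the attached segment information is what allows the “granular”, A/B-tested messaging described in the article.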
123.AIQ’s first substantial work was for SCL, before it went on later to work for Vote Leave in the UK’s EU Referendum in 2016. According to evidence we have received, Alexander Nix and SCL also pitched for work in the Referendum to Leave.EU, but were not successful. Throughout our inquiry, we have been concerned about the links within this seemingly small community involved in political micro-targeting and about the potential for data misuse. These concerns have been heightened by Mr Nix and SCL’s own links with organisations involved in the military, defence, intelligence and security realms.
124.Much effort has been expended in trying to untangle the complex web of relationships within what started out as the SCL (Strategic Communications Laboratories) group of companies, in which the founder Nigel Oakes and Alexander Nix have been involved, along with a myriad of changing shareholders. Confusion can perhaps be traced to the use of the SCL name within both sets of businesses: the defence consultancy (SCL Group Limited), run by Mr Oakes, and the political consultancy (SCL Elections Limited), incorporated by Mr Nix in 2012. In evidence, however, Mr Nix certainly did not help, as he was evasive about the changes in beneficial ownership during the period when Cambridge Analytica operated.
125.In February 2018, in response to a question about the distinction between SCL and Cambridge Analytica, Alexander Nix told us that “SCL is a very different company to Cambridge Analytica. It is a different company that has different employees who sit in a different office. It has a different board and a different board of advisers. It has different datasets, and it has different clients.”
126.Christopher Wylie told us, in March 2018, that everyone who worked for Cambridge Analytica was “effectively” employed by SCL: “When I started in June 2013, Cambridge Analytica did not exist yet. It is important for people to understand that Cambridge Analytica is more of a concept or a brand than anything else because it does not have employees. It is all SCL, it is just the front-facing company for the United States”. No distinction was made by Mr Nix between SCL Elections Ltd and SCL Group Limited (to which he was apparently referring). In June 2018, Mr Nix gave us graphics showing the changes to the group’s employment structure between 2005 and 2018, but these were not a map of the ownership changes.
127.Corporate filings, however, show that after a period of independence under Mr Nix from 2012, SCL Elections formally rejoined the orbit of the wider SCL group in November 2015. Notwithstanding this, the Committee has seen internal documents dated 27 May 2015 which show political and election projects being discussed under the banner of ‘SCL Group’. (We refer to one of these projects, relating to Argentina, in Chapter 6.) By the time the whole group went into administration in April 2018, the US employing body, SCL USA Inc, was providing staff both to Cambridge Analytica LLC and to the defence consultancy SCL Group.
128.Throughout, only 19% of Cambridge Analytica was owned within the group, with the other shareholders unclear, despite our questioning over two evidence sessions with Mr Nix. Wendy Siegelman and Ann Marlowe created a chart in May 2017, covering 30 companies, with their shareholders, interlinked within SCL Group Ltd. That structure was again soon to change, however.
129.Brittany Kaiser told us, in April 2018, that when she joined the SCL Group, “it was a parent company of a few different divisions. One would be SCL Commercial, SCL Election, SCL Defence, SCL Social, and then Cambridge Analytica, which I understood was the US-acting subsidiary.” She further explained that, once Cambridge Analytica had become a popular brand, it “subsumed most of those companies and divisions and the SCL Group became just our defence company, SCL Group or SCL.gov, based in Arlington”.
130.In August 2017, a new ultimate holding company, Emerdata Limited, was incorporated at the same address, in Canary Wharf in London, as SCL Group. Alexander Nix was appointed a director of Emerdata Ltd in January 2018. Its other directors included the former SCL Group Chairman, Julian Wheatland (who also became the new acting CEO of Cambridge Analytica on 11 April 2018) and the former Chief Data Officer of Cambridge Analytica, Alexander Tayler (who took over as acting CEO of Cambridge Analytica on 20 March 2018, when Mr Nix was suspended, before resigning on 11 April 2018).
131.On 18 March 2018 (the day after The Guardian first published articles relating to Cambridge Analytica), Rebekah and Jennifer Mercer, daughters of Robert Mercer, were also appointed directors. Another director of Emerdata is Johnson Chun Shun Ko, Deputy Chairman and Executive Director of Frontier Services Group, which is a private security firm that operates mostly in Africa. Emerdata is chaired by the US businessman Erik Prince, who founded the private military group Blackwater USA. All of Emerdata’s subsidiaries went into administration in April 2018, following the Cambridge Analytica scandal, and it is uncertain what activities have continued since.
132.Companies House published the ‘Notice of administrators’ proposals’, in respect of SCL Elections Ltd, in July 2018. It sets out the circumstances surrounding SCL Elections Ltd’s administration. Emerdata is the ultimate holding company and first called in the insolvency practitioner, Vincent Green, who then became an administrator. The notice highlights the fact that laptops from the SCL offices were not returned, and that some laptops returned by the ICO were subsequently stolen. There is also a list of SCL Elections Ltd’s creditors. Emerdata is listed as a creditor/claimant, with the amount of debt totalling £6,381,778.05. The administrators propose that the company go into compulsory liquidation. We are concerned about what data remained on the stolen laptops, why Emerdata, the parent company, is the major creditor, owed over £6.3 million, and why SCL USA Inc, a US affiliate, is owed over £1 million.
133.Over the past month, Facebook has been investing in adverts globally, proclaiming that “Fake accounts are not our friends.” Yet the serious failings in the company’s operations that resulted in data manipulation, and in misinformation and disinformation, have occurred again. Over four months after Facebook suspended Cambridge Analytica for its alleged data harvesting, Facebook suspended another company, Crimson Hexagon, which has direct contracts with the US government and Kremlin-connected Russian organisations, for allegedly carrying out the same offence.
134.We are concerned about the administrators’ proposals in connection with SCL Elections Ltd, as listed in Companies House, and the fact that Emerdata Ltd is listed as the ultimate parent company of SCL Elections Ltd, and is the major creditor and owed over £6.3 million. The proposals also describe laptops from the SCL Elections Ltd offices being stolen, and laptops returned by the ICO, following its investigations, also being stolen. We recommend that the National Crime Agency, if it is not already, should investigate the connections between the company SCL Elections Ltd and Emerdata Ltd.
135.The allegations of data harvesting revealed the extent of data misuse: made possible by Facebook, carried out by Cambridge University’s Dr Kogan through GSR, and exploited for micro-targeting by Cambridge Analytica and its associated companies, including AIQ. The SCL Group and associated companies have gone into administration, but other companies are carrying out very similar work. Many of the individuals involved in SCL and Cambridge Analytica appear to have moved on to new corporate vehicles. Cambridge Analytica is currently being investigated by the Information Commissioner’s Office (ICO) (and, as a leading academic institution, Cambridge University also has questions to answer about the activities of Dr Kogan in this affair).
136.We invited Alexander Nix to give evidence twice; both times he was evasive in his answers, and the standard of his answers fell well below that expected of the CEO of an organisation. His initial evidence concerning GSR was not the whole truth. There is a public interest in getting to the heart of what happened, and Alexander Nix must take responsibility for failing to provide the full picture of events, for whatever reason. With respect to GSR, he misled us. We will give a final verdict on Mr Nix’s evidence when we complete the inquiry.
Published: 29 July 2018