1. In this inquiry, we have studied the spread of false, misleading, and persuasive content, and the ways in which malign players, whether automated or human, or both together, distort what is true in order to exert influence, to intimidate, to make money, or to sway political elections.
2. People are increasingly finding out about what is happening in this country, in their local communities, and across the wider world through social media, rather than through more traditional forms of communication such as television, print media, or radio.1 Social media has become hugely influential in our lives.2 Research by the Reuters Institute for the Study of Journalism has shown that huge numbers of people worldwide access news and information not only through Facebook but also through social messaging software such as WhatsApp. When such media are used to spread rumours and ‘fake news’, the consequences can be devastating.3
3. Tristan Harris, Co-founder and Executive Director of the Center for Humane Technology—an organisation seeking to realign technology with the best interests of its users—told us about the many billions of people who interact with social media: “There are more than 2 billion people who use Facebook, which is about the number of conventional followers of Christianity. There are about 1.8 billion users of YouTube, which is about the number of conventional followers of Islam. People check their phones about 150 times a day in the developed world.”4 This equates to once every 6.4 minutes in a 16-hour day. This is a profound change in the way in which we access information and news, one which has occurred without conscious appreciation by most of us.
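As a rough arithmetic check of the figure quoted above (assuming, as the text does, a 16-hour waking day):

\[ \frac{16 \times 60\ \text{minutes}}{150\ \text{checks}} = \frac{960}{150} = 6.4\ \text{minutes between checks} \]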
4. This kind of evidence led us to explore the use of data analytics and psychological profiling to target people on social media with political content, as its political impact has been profound, but largely unappreciated. The inquiry was launched in January 2017 in the previous Parliament, and then relaunched in the autumn, following the June 2017 election. The inquiry’s Terms of Reference5 were as follows:
5. We will address the wider areas of our Terms of Reference, including the role of advertising, in our further Report this autumn. In recent months, however, our inquiry delved increasingly into the political use of social media, raising concerns that we wish to address immediately. We had asked representatives from Facebook, in February 2018, about Facebook developers and data harvesting.6 Then, in March 2018, Carole Cadwalladr of The Observer,7 together with Channel 4 News and The New York Times, published allegations about Cambridge Analytica (and associated companies), its work with Global Science Research (GSR), and the misuse of Facebook data.8 Those allegations called into question the use of data during the EU Referendum in 2016, and the extent of foreign interference in UK politics. Our oral evidence sessions subsequently focussed on those specific revelations, and we invited several people involved to give evidence. The allegations highlighted both the amount of data that private companies and organisations hold on individuals, and the ability of technology to manipulate people.
6. This transatlantic media coverage brought our Committee into close contact with other parliaments around the world. The US Senate Select Committee on Intelligence, the US House of Representatives Permanent Select Committee on Intelligence, the European Parliament, and the Canadian Standing Committee on Access to Information, Privacy and Ethics all carried out independent investigations. We shared information, sometimes live, during the hearings. Representatives from other countries, including Spain, France, Estonia, Latvia, Lithuania, Australia, Singapore, Canada, and Uzbekistan, have visited London, and we have shared our evidence and thoughts. We were also told about the work of SCL Elections—and other SCL associates, including Cambridge Analytica—set up by the businessman Alexander Nix; their role in manipulating foreign elections; and the financial benefits they gained through those activities. What became clear is that, without the knowledge of most politicians and election regulators across the world, not to mention the wider public, a small group of individuals and businesses had been influencing elections across different jurisdictions in recent years.
7. We invited many witnesses to give evidence. Some came to the Committee willingly, others less so. We were forced to summon two witnesses: Alexander Nix, former CEO of Cambridge Analytica; and Dominic Cummings, Campaign Director of Vote Leave, the designated Leave campaign group in the EU Referendum. While Mr Nix subsequently agreed to appear before the Committee, Dominic Cummings still refused. We were then compelled to ask the House to support a motion ordering Mr Cummings to appear before the Committee.9 At the time of writing he has still not complied with this Order, and the matter has been referred by the House to the Committee of Privileges. Mr Cummings’ contemptuous behaviour is unprecedented in the history of this Committee’s inquiries and underlines concerns about the difficulties of enforcing co-operation with Parliamentary scrutiny in the modern age. We will return to this issue in our Report in the autumn, and believe it to be an urgent matter for consideration by the Privileges Committee and by Parliament as a whole.
8. In total, we held twenty oral evidence sessions, including two informal background sessions, and heard from 61 witnesses, asking over 3,500 questions at these hearings. We received over 150 written submissions and numerous pieces of background evidence, and undertook substantial exchanges of correspondence with organisations and individuals. We held one oral evidence session in Washington D.C. (the first time a Select Committee has held a public, live-broadcast oral evidence session abroad) and also heard from experts in the tech field, journalists, and politicians in private meetings in Washington and New York. Most of our witnesses took the Select Committee process seriously, and gave considered, thoughtful evidence, specific to the context of our inquiry. We thank the witnesses, experts, politicians, and individuals (including whistle-blowers) whom we met in public and in private, in this country and abroad, and who have been generous with their expertise, knowledge, help and ideas.10 We also thank Dr Lara Brown and her team at the Graduate School of Political Management at George Washington University for hosting the Select Committee’s oral evidence session in the US.
9. As noted above, this is our first Report on misinformation and disinformation. Another Report will be published in the autumn of 2018, which will include more substantive recommendations, as well as detailed analysis of data obtained from the insecure AggregateIQ website, harvested and submitted to us by Chris Vickery, Director of Cyber Risk Research at UpGuard.11 AggregateIQ is one of the businesses most closely involved in influencing elections.
10. Since we commenced this inquiry, the Electoral Commission has reported on serious breaches by Vote Leave and other campaign groups during the 2016 EU Referendum; the Information Commissioner’s Office has found serious data breaches by Facebook and Cambridge Analytica, amongst others; the Department for Digital, Culture, Media and Sport (DDCMS) has launched the Cairncross Review into press sustainability in the digital age; and, following a Green Paper in May 2018, the Government has announced its intention to publish a White Paper later this year on making the internet and social media safer. This interim Report, therefore, focuses at this stage on seven of the areas covered in our inquiry:
11. There is no agreed definition of the term ‘fake news’, which became widely used in 2016 (although it first appeared in the US in the latter part of the 19th century).12 Claire Wardle, from First Draft, told us in our oral evidence session in Washington D.C. that “when we are talking about this huge spectrum, we cannot start thinking about regulation, and we cannot start talking about interventions, if we are not clear about what we mean”.13 The term has been used by some, notably the current US President Donald Trump, to describe content published by established news providers that they dislike or disagree with, but it is more widely applied to various types of false information, including:
12. In addition to the above, there is the relentless prevalence of ‘micro-targeted messaging’, which may distort people’s views and opinions.15 The distortion of images is a related problem. Evidence from MoneySavingExpert.com cited celebrities whose images have been used to endorse scam money-making businesses, including Martin Lewis, whose face has appeared in adverts across Facebook and the wider internet for scams promoting binary trading and energy products.16 There are also ‘deepfakes’: audio and video clips that look and sound like a real person saying something that person has never said.17 Such examples will only become more complex and harder to spot as the software behind them grows more sophisticated.
13. No single regulatory body oversees social media platforms and online written content, including online versions of printed news, as a whole. In the UK, however, under the Communications Act 2003, Ofcom sets and enforces content standards for television and radio broadcasters, including rules relating to accuracy and impartiality.18 On 13 July 2018, Ofcom’s Chief Executive, Sharon White, called for greater regulation of social media, and announced plans to publish an outline of how such regulation could work in the autumn of this year.19 We shall assess these plans in our further Report.
14. The term ‘fake news’ is bandied around with no clear idea of what it means and no agreed definition. It has taken on a variety of meanings, including a description of any statement that the reader does not like or agree with. We recommend that the Government rejects the term ‘fake news’, and instead puts forward an agreed definition of the words ‘misinformation’ and ‘disinformation’. With such a shared definition, and clear guidelines for companies, organisations, and the Government to follow, there will be a shared consistency of meaning across the platforms, which can be used as the basis of regulation and enforcement.
15. We recommend that the Government uses the rules given to Ofcom under the Communications Act 2003 to set and enforce content standards for television and radio broadcasters, including rules relating to accuracy and impartiality, as a basis for setting standards for online content. We look forward to hearing Ofcom’s plans for greater regulation of social media this autumn, and we plan to comment on these in our further Report.
16. Standards surrounding fact-checking exist through the International Fact-Checking Network’s Code of Principles, signed by the majority of major fact-checkers.20 A recent report of the independent High-Level Expert Group on Fake News and Online Disinformation highlighted that, while a Code of Principles exists, fact-checkers themselves must continually improve their own transparency.21
17. Algorithms are being used to help address the challenges of misinformation. We heard evidence from Professor Kalina Bontcheva, who conceived and led the Pheme research project, which aims to create a system to automatically verify online rumours and thereby allow journalists, governments, and others to check the veracity of stories on social media.22 Algorithms are also being developed to help identify fake news. The fact-checking organisation Full Fact received funding from Google to develop an automated fact-checking tool for journalists.23 Facebook and Google have also altered their algorithms so that content identified as misinformation ranks lower.24 Many organisations are exploring ways in which content on the internet can be verified, kite-marked, and graded according to agreed definitions.25
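To make the approach concrete, the sketch below shows, in simplified form, how signals about a piece of content might be combined into a credibility score that is then used to rank it lower in a feed. It is an illustration only: the signals, weights, and example posts are invented, and it does not reproduce the actual methods used by Pheme, Full Fact, Facebook, or Google.

```python
# Illustrative sketch only: the signals and weights below are invented for
# explanation and do not reflect any platform's or project's real system.
from dataclasses import dataclass
from typing import List

@dataclass
class Post:
    text: str
    author_is_verified: bool   # hypothetical trust signal
    account_age_days: int      # hypothetical trust signal
    fact_checker_flags: int    # times flagged by fact-checkers

def credibility_score(post: Post) -> float:
    """Combine simple signals into a 0-1 credibility score (toy weights)."""
    score = 0.5
    if post.author_is_verified:
        score += 0.2
    if post.account_age_days > 365:
        score += 0.1
    score -= 0.25 * post.fact_checker_flags
    return max(0.0, min(1.0, score))

def rank_feed(posts: List[Post]) -> List[Post]:
    """Sort a feed so that content identified as likely misinformation sinks."""
    return sorted(posts, key=credibility_score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("Miracle cure THEY don't want you to see", False, 12, 3),
        Post("Committee publishes interim report", True, 2000, 0),
    ]
    for post in rank_feed(feed):
        print(f"{credibility_score(post):.2f}  {post.text}")
```

Real systems combine far richer signals (language models over the text itself, propagation patterns, source history), but the design choice is the same: score, then downrank, rather than delete outright.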
18. The Government should support research into the methods by which misinformation and disinformation are created and spread across the internet: a core part of this is fact-checking. We recommend that the Government initiate a working group of experts to create a credible annotation of standards, so that people can see, at a glance, the level of verification of a site. This would help people to decide on the level of importance that they put on those sites.
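As an illustration of what such an annotation might look like in machine-readable form, the sketch below shows a hypothetical ‘kite-mark’ record for a site. Every field name and grade value is an assumption invented for this example; no such schema has been agreed by any working group.

```python
# Hypothetical "kite-mark" annotation for a site; all fields are invented
# for illustration and do not represent any agreed or official schema.
import json

site_annotation = {
    "site": "example-news.co.uk",      # hypothetical site
    "verification_grade": "B",         # e.g. A (fully verified) down to E
    "ifcn_signatory": True,            # signed the IFCN Code of Principles?
    "ownership_disclosed": True,
    "corrections_policy_published": False,
    "last_reviewed": "2018-07-29",
}

# Serialised this way, the record could be fetched by a browser extension
# and rendered as an at-a-glance badge next to links to the site.
print(json.dumps(site_annotation, indent=2))
```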
19. During the course of this inquiry we have wrestled with complex, global issues, which cannot easily be tackled by blunt, reactive, and outmoded legislative instruments. In this Report, we suggest principle-based recommendations which are sufficiently adaptive to deal with fast-moving technological developments. We look forward to hearing the Government’s response to these recommendations.
20. We also welcome submissions to the Committee from readers of this interim Report, based on these recommendations, and on specific areas where the recommendations can incorporate work already undertaken by others. This inquiry has grown through collaboration with other countries, organisations, parliamentarians, and individuals, in this country and abroad, and we want this co-operation to continue.
1 News consumption in the UK: 2016, Ofcom, 29 June 2017.
3 The seventh annual Digital News Report, by the Reuters Institute for the Study of Journalism, University of Oxford, was based on a YouGov online survey of 74,000 people in 37 countries.
5 Terms of reference, Fake News inquiry, DCMS Committee, 15 September 2017.
6 Monika Bickert, Q389
7 In June 2018, Carole Cadwalladr won the Orwell Prize for Journalism for her investigative work into Cambridge Analytica, which culminated in a series of articles from March 2018.
8 Harry Davies had previously published the article Ted Cruz using firm that harvested data on millions of unwitting Facebook users in The Guardian on 11 December 2015, which first revealed the harvesting of data from Facebook.
9 Despite the motion being passed, Dominic Cummings did not appear before the Committee. The matter was then referred to the Privileges Committee on 28 June 2018.
10 Our expert adviser for the inquiry was Dr Charles Kriel, Associate Fellow at the King’s Centre for Strategic Communications (KCSC), King’s College London. His declared interests are: Director, Kriel.Agency, a digital media and social data consulting agency; Countering Violent Extremism Programme Director, Corsham Institute, a civil society charity; and Co-founder and shareholder, Lightful, a social media tool for charities.
11 In the early autumn, we hope to invite Ofcom and the Advertising Standards Authority to give evidence, and to re-invite witnesses from the Information Commissioner’s Office and the Electoral Commission, and this oral evidence will also inform our substantive Report.
12 Fake News: A Roadmap, NATO Strategic Communications Centre of Excellence, Riga, and King’s Centre for Strategic Communications (KCSC), January 2018.
14 Online information and fake news, Parliamentary Office of Science and Technology, July 2017, box 4. See also First Draft News, Fake news. It’s complicated, February 2017; Ben Nimmo (FNW0125); Full Fact (FNW0097).
15 Micro-targeting of messages will be explored in greater detail in Chapter 4.
19 ‘It’s time to regulate social media sites that publish news’, The Times, 13 July 2018.
20 The International Fact-Checking Network website, accessed 21 June 2018.
21 A multi-dimensional approach to disinformation, Report of the independent High Level Expert Group on Fake News and Online Disinformation, March 2018.
22 Pheme website, www.pheme.eu, accessed 21 June 2018.
23 Full Fact website, fullfact.org, accessed 21 June 2018.
24 Adam Mosseri, “Working to stop misinformation and false news”, Facebook, 6 April 2017.