Disinformation and ‘fake news’: Interim Report

2 The definition, role and legal responsibilities of tech companies

21.At the centre of the argument about misinformation and disinformation is the role of tech companies, on whose platforms content is disseminated.26 Throughout the chapter, we shall use the term ‘tech companies’ to indicate the different types of social media and internet service providers, such as Facebook, Twitter, and Google. It is important to note that a series of mergers and acquisitions means that a handful of tech companies own the major platforms. For example, Facebook owns Instagram and WhatsApp; Alphabet owns both Google and YouTube.

22.The word ‘platform’ suggests that these companies act in a passive way, posting information they receive, and not themselves influencing what we see, or what we do not see. However, this is a misleading term; tech companies do control what we see, by their very business model. They want to engage us from the moment we log onto their sites and into their apps, in order to generate revenue from the adverts that we see. In this chapter, we will explore: the definitions surrounding tech companies; the companies’ power in choosing and disseminating content to users; and the role of the Government and the tech companies themselves in ensuring that those companies carry out their business in a transparent, accountable way.

An unregulated sphere

23.Tristan Harris of the Center for Humane Technology27 provided a persuasive narrative of the development and role of social media platforms, telling us that engagement of the user is an integral part both of tech companies’ business model and of their growth strategy:

They set the dial; they don’t want to admit that they set the dial, and instead they keep claiming, “We’re a neutral platform,” or, “We’re a neutral tool,” but in fact every choice they make is a choice architecture. They are designing how compelling the thing that shows up next on the news feed is, and their admission that they can already change the news feeds so that people spend less time [on it] shows that they do have control of that.28

24.Mr Harris told us that, while we think that we are in control of what we look at when we check our phones (on average, around 150 times a day), our mind is being hijacked, as if we were playing a slot machine:

Every time you scroll, you might as well be playing a slot machine, because you are not sure what is going to come up next on the page. A slot machine is a very simple, powerful technique that causes people to want to check in all the time. Facebook and Twitter, by being social products—by using your social network—have an infinite supply of new information that they could show you. There are literally thousands of things that they could populate that news feed with, which turns it into that random-reward slot machine.29

25.Coupled with this is the relentless feed of information that we receive on our phones, which is driven by tech engineers “who know a lot more about how your mind works than you do. They play all these different tricks every single day and update those tricks to keep people hooked”.30
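Mr Harris’s slot machine analogy describes a ‘variable reward’ design pattern. The sketch below is a minimal illustration of that pattern only; the item names and probability are our illustrative assumptions, not any company’s actual feed logic.

```python
import random

# Each refresh is an unpredictable draw: sometimes a rewarding item,
# usually filler. The uncertainty itself encourages re-checking,
# like pulling a slot machine lever.

HIGH_VALUE = ["friend's photo", "message mentioning you", "viral video"]
FILLER = ["acquaintance's update", "page post", "sponsored story"]

def next_feed_item(reward_probability: float = 0.3) -> str:
    """Return the next item to show; the user cannot predict whether
    it will be rewarding."""
    if random.random() < reward_probability:
        return random.choice(HIGH_VALUE)
    return random.choice(FILLER)

for _ in range(5):   # five "scrolls", five unpredictable pulls
    print(next_feed_item())
```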

Regulatory architecture

The Information Commissioner’s Office

26.The Information Commissioner is a non-departmental public body, with statutory responsibility “for regulating the processing of personal data” in the United Kingdom,31 including the enforcement of the new Data Protection Act 2018 and the General Data Protection Regulation (GDPR).32 The ICO’s written evidence describes the Commissioner’s role as “one of the sheriffs of the internet”.33

27.The Commissioner, Elizabeth Denham, highlighted the “behind the scenes algorithms, analysis, data matching and profiling” which mean that people’s data is being used in new ways to target them with information.34 She sees her role as showing the public how personal data is collected, used and shared through advertising and through the micro-targeting of messages delivered through social media.35 She has a range of powers to ensure that personal data is processed within the legislative framework, including the serving of an information notice, requiring specified information to be provided within a defined timeframe.

28.The 2018 Act extends the Commissioner’s powers to conduct a full audit where she suspects that data protection legislation has been, or is being, contravened, and to order a company to stop processing data. Elizabeth Denham told us that these would be “powerful” measures.36 The recent legislative changes also increased the maximum fine that the Commissioner can levy, from £500,000 to £17 million or 4% of global turnover, whichever is greater, and set out her responsibilities for international co-operation on the enforcement of data protection legislation.37
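The ‘whichever is greater’ formula can be illustrated with a short calculation. The sketch below is a minimal illustration using the figures quoted above; the turnover figure is hypothetical.

```python
def max_data_protection_fine(global_turnover_gbp: float) -> float:
    """Maximum fine under the figures quoted above: £17 million
    or 4% of global turnover, whichever is greater."""
    return max(17_000_000, 0.04 * global_turnover_gbp)

# Hypothetical company with £2 billion global turnover:
# 4% of £2bn is £80m, which exceeds £17m, so £80m is the ceiling.
print(max_data_protection_fine(2_000_000_000))  # 80000000.0
```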

29.The Data Protection Act 2018 created a new definition, called “Applied GDPR”, to describe an amended version of the GDPR, when European Union law does not apply (when the UK leaves the EU). Data controllers would still need to assess whether they are subject to EU law, in order to decide whether to follow the GDPR or the Applied GDPR. Apart from the exceptions laid down in the GDPR, all personal data processed in the United Kingdom comes within the scope of European Union law, until EU law no longer applies to the United Kingdom. However, when the United Kingdom leaves the EU, social media companies could “process personal data of people in the UK from bases in the US without any coverage of data protection law. Organisations that emulate Cambridge Analytica could set up in offshore locations and profile individuals in the UK without being subject to any rules on processing personal data”, according to Robert Madge, CEO of the Swiss data management company Xifrat Daten.38

30.The Data Protection Act 2018 gives greater protection to people’s data than did its predecessor, the 1998 Data Protection Act, and follows the law set out in the GDPR. However, when the UK leaves the EU, social media companies will be able to process personal data of people in the UK from bases in the US, without any coverage of data protection law. We urge the Government to address this loophole in a White Paper this autumn.

Investigation into the use of data analytics for political purposes

31.In May 2017, the ICO announced a formal investigation into the use of data analytics for political purposes. The investigation has two strands: explaining how personal data is used in the context of political messaging; and taking enforcement action where breaches of data protection legislation are found.39 The investigation has involved 30 organisations, including Facebook and Cambridge Analytica. Elizabeth Denham said of the investigation:

For the public, we need to be able to understand why an individual sees a certain ad. Why does an individual see a message in their newsfeed that somebody else does not see? We are really the data cops here. We are doing a data audit to be able to understand and to pull back the curtain on the advertising model around political campaigning and elections.40

32.In response to our request for the ICO to provide an update on the investigation into data analytics in political campaigning, the Commissioner duly published this update on 11 July 2018.41 We are grateful to the Commissioner for providing such a useful, detailed update on her investigations, and we look forward to receiving her final report in due course.

33.The ICO has been given extra responsibilities, but with those responsibilities should come extra resources. Christopher Wylie, a whistle-blower and ex-SCL employee, has had regular contact with the ICO, and he explained that the organisation has limited resources to deal with its responsibilities: “A lot of the investigators do not have a robust technical background. […] They are in charge of regulating data, which means that they should have a lot of people who understand how databases work”.42

34.Paul-Olivier Dehaye, founder of PersonalDataIO, told us that he had sent a letter to the ICO in August 2016, asking them if they were investigating Cambridge Analytica, because of the information about the company that was publicly available at that time. He told us that “if the right of access was made much more efficient, because of increased staffing at the ICO, this right would be adopted by [...] educators, journalists, activists, academics, as a tool to connect civil society with the commercial world and to help document what is happening”.43 Data scientists at the ICO need to be able to cope with new technologies that are not even in existence at the moment; the ICO therefore needs to be at least as technically expert as the experts in private tech companies, if not more so.

35.The Commissioner told us that the Government had given the ICO pay flexibility to retain and recruit more expert staff: “We need forensic investigators, we need senior counsel and lawyers, we need access to the best, and maybe outside counsel, to be able to help us with some of these really big files”.44 We are unconvinced that pay flexibility will be enough to retain and recruit technical experts.

36.We welcome the increased powers that the Information Commissioner has been given as a result of the Data Protection Act 2018, and the ability to look behind the curtain of tech companies and to examine the data for themselves. However, to be a sheriff in the wild west of the internet, which is how the Information Commissioner has described her office, the ICO needs at least the same technical expertise as the organisations under scrutiny, if not more. The ICO needs to attract and employ more technically-skilled engineers who can not only analyse current technologies, but also anticipate future ones. We acknowledge the fact that the Government has given the ICO pay flexibility to retain and recruit more expert staff, but it is uncertain whether pay flexibility will be enough to retain and attract the expertise that the ICO needs. We recommend that the White Paper explores the possibility of major investment in the ICO and the way in which that money should be raised. One possible route could be a levy on tech companies operating in the UK, to help pay for the expanded work of the ICO, in a similar vein to the way in which the banking sector pays for the upkeep of the Financial Conduct Authority.

The Electoral Commission

37.The Electoral Commission is responsible for regulating and enforcing the rules that govern political campaign finance in the UK. Its priority is to ensure appropriate transparency and voter confidence in the system.45 However, concerns have been expressed about the relevance of that legislation in an age of social media and online campaigning. Claire Bassett, the Electoral Commission’s Chief Executive, told us that, “It is no great secret that our electoral law is old and fragmented. It has developed over the years, and we struggle with the complexity created by that, right across the work that we do.”46

38.The use of social media in political campaigning has had major consequences for the Electoral Commission’s work.47 As a financial regulator, the Electoral Commission regulates “by looking at how campaigners and parties receive income, and how they spend that.”48 While that is primarily achieved through the spending returns submitted by registered campaigners, the Commission also conducts real-time monitoring of campaign activities, including on social media, so that it can compare the facts with what it is being told.49 Where the Electoral Commission suspects or identifies that rules have been breached, it has the power to conduct investigations, refer matters to the police, and issue sanctions, including fines.

39.At present, campaign spending is declared under broad categories such as ‘advertising’ and ‘unsolicited material to electors’, with no specific category for digital campaigning, not to mention the many subcategories covered by paid and organic campaigning, and combinations thereof. Bob Posner, the Electoral Commission’s Director of Political Finance and Regulation and Legal Counsel, told us that “A more detailed category of spending would be helpful to understand what is spent on services, advertising, leaflets, posters or whatever it might be, so anyone can interrogate and question it.”50 The Electoral Commission has since recommended that legislation be amended so that spending returns clearly detail digital campaigning.51

40.Spending on election or referendum campaigns by foreign organisations or individuals is not allowed. We shall be exploring issues surrounding the donation to Leave.EU by Arron Banks in Chapter 4, but another example involving Cambridge Analytica was brought to our attention by Arron Banks himself. A document from Cambridge Analytica’s presentation pitch to Leave.EU stated that “We will co-ordinate a programme of targeted solicitation, using digital advertising and other media as appropriate to raise funds for Leave.EU in the UK, the USA, and in other countries.”52 In response to a question asking whether he had taken legal advice on this proposal, Alexander Nix, the then CEO of Cambridge Analytica, replied, “We took a considerable amount of legal advice and, at the time, it was suggested by our counsel that we could target British nationals living abroad for donations. I believe […] that there is still some lack of clarity about whether this is true or not”.53

41.When giving evidence, the Electoral Commission repeated a recommendation first made in 2003 that online campaign material should legally be required to carry a digital imprint, identifying the source. While the Electoral Commission’s remit does not cover the content of campaign material, and it is “not in a position to monitor the truthfulness of campaign claims, online or otherwise”, it holds that digital imprints “will help voters to assess the credibility of campaign messages.”54 A recent paper from Upturn, Leveling the platform: real transparency for paid messages on Facebook, highlighted the fact that “ads can be shared widely, and live beyond indication that their distribution was once subsidized. And they can be targeted with remarkable precision”.55 For this reason, we believe digital imprints should be clear and make it easy for users to identify what is in adverts and who the advertiser is.
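To make the idea concrete, the sketch below shows one hypothetical shape such an imprint record could take. The field names are our illustrative assumptions, not a statutory or Electoral Commission schema.

```python
from dataclasses import dataclass

@dataclass
class DigitalImprint:
    """A hypothetical digital imprint attached to a paid political advert."""
    promoter_name: str           # who is legally responsible for the material
    promoter_address: str
    paid_for_by: str             # who funded the advert
    registered_campaigner: bool  # registered with the Electoral Commission?

# Illustrative example only; all values are invented.
ad_imprint = DigitalImprint(
    promoter_name="Example Campaign Ltd",
    promoter_address="1 Example Street, London",
    paid_for_by="Example Campaign Ltd",
    registered_campaigner=True,
)
```

If such a record travelled with an advert wherever it was shared, the source would remain visible even after the paid distribution had ended, addressing the problem Upturn identifies.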

42.The Electoral Commission published a report on 26 June 2018, calling for the law around digital advertising and campaigning to be strengthened, including through digital imprint requirements for online campaign material, more detailed reporting of digital campaign spending, and stronger sanctioning powers.56

43.Claire Bassett told us that the current maximum fine that the Electoral Commission can impose for wrongdoing in political campaigning is £20,000, which she said is described as “the cost of doing business” for some individuals and organisations. Ms Bassett said that this amount was too low and should be increased, in line with other regulators that can impose more significant fines.57 She also told us that she would like a change to the regulated periods, particularly in relation to referenda:

One of the challenges we have as regulator is that we are a financial regulator, and we regulate the parties and campaigners, usually during just that regulated period or the extended period that is set out. That does create challenges in a referendum setting. We think there is value in looking at those campaign regulated periods and thinking about how they work.58

We are aware that the Independent Commission on Referendums made similar recommendations in its report of July 2018.59

44.The globalised nature of social media creates challenges for regulators. In evidence, Facebook did not accept responsibility for identifying or preventing illegal election campaign activity from overseas jurisdictions. In the context of outside interference in elections, this position is unsustainable and Facebook, and other platforms, must begin to take responsibility for the way in which their platforms are used.

45.Electoral law in this country is not fit for purpose for the digital age, and needs to be amended to reflect new technologies. We support the Electoral Commission’s suggestion that all electronic campaigning should have easily accessible digital imprint requirements, including information on the publishing organisation and who is legally responsible for the spending, so that it is obvious at a glance who has sponsored that campaigning material, thereby bringing all online advertisements and messages into line with physically published leaflets, circulars and advertisements. We note that a similar recommendation was made by the Committee on Standards in Public Life, and urge the Government to study the practicalities of giving the Electoral Commission this power in its White Paper.

46.As well as having digital imprints, the Government should consider the feasibility of clear, persistent banners on all paid-for political adverts and videos, indicating the source and making it easy for users to identify what is in the adverts, and who the advertiser is.

47.The Electoral Commission’s current maximum fine limit of £20,000 should be changed to a larger fine based on a fixed percentage of turnover, such as has been granted recently to the Information Commissioner’s Office in the Data Protection Act 2018. Furthermore, the Electoral Commission should have the ability to refer matters to the Crown Prosecution Service, before their investigations have been completed.

48.Electoral law needs to be updated to reflect changes in campaigning techniques, and the move from physical leaflets and billboards to online, micro-targeted political campaigning, as well as the many digital subcategories covered by paid and organic campaigning. The Government must carry out a comprehensive review of the current rules and regulations surrounding political work during elections and referenda, including: increasing the length of the regulated period; definitions of what constitutes political campaigning; absolute transparency of online political campaigning; a category introduced for digital spending on campaigns; reducing the time for spending returns to be sent to the Electoral Commission (the current time for large political organisations is six months); and increasing the fine for not complying with the electoral law.

49.The Government should consider giving the Electoral Commission the power to compel organisations that it does not specifically regulate, including tech companies and individuals, to provide information relevant to their inquiries, subject to due process.

50.The Electoral Commission should also establish a code for advertising through social media during election periods, giving consideration to whether such activity should be restricted during the regulated period, to political organisations or campaigns that have registered with the Commission. Both the Electoral Commission and the ICO should consider the ethics of Facebook or other relevant social media companies selling lookalike political audiences to advertisers during the regulated period, where they are using the data they hold on their customers to guess whether their political interests are similar to those profiles held in target audiences already collected by a political campaign. In particular, we would ask them to consider whether users of Facebook or other relevant social media companies should have the right to opt out from being included in such lookalike audiences.

Platform or publisher?

51.How should tech companies be defined—as a platform, a publisher, or something in between? The definition of ‘publisher’ gives the impression that tech companies have the potential to limit freedom of speech, by choosing what to publish and what not to publish. Monika Bickert, Head of Global Policy Management, Facebook, told us that “our community would not want us, a private company, to be the arbiter of truth”.60 The definition of ‘platform’ gives the impression that these companies do not create or control the content themselves, but are merely the channel through which content is made available. Yet Facebook is continually altering what we see, as is shown by its decision to prioritise content from friends and family, a choice which then feeds through its algorithm into users’ news feeds.61

52.Frank Sesno, Director of the School of Media and Public Affairs, George Washington University, told us in Washington D.C. that “they have this very strange, powerful, hybrid identity as media companies that do not create any of the content but should and must—to their own inadequate levels—accept some responsibility for promulgating it. What they fear most is regulation—a requirement to turn over their data”.62

53.At both our evidence session and at a separate speech in March 2018, the then Secretary of State for DCMS, Rt Hon Matt Hancock MP, noted the complexity of making any legislative changes to tech companies’ liabilities, putting his weight behind “a new definition” that was “more subtle” than the binary choice between platform and publisher.63 He told us that the Government has launched the Cairncross Review to look (within the broader context of the future of the press in the UK) at the role of the digital advertising supply chain, at how fair and transparent it is, and whether it “incentivises the proliferation of inaccurate and/or misleading news.” The review is also examining the role and impact of digital search engines and social media companies including an assessment of regulation “or further collaboration between the platforms and publishers.” The consultation closes in September 2018.64

54.In Germany, tech companies were asked to remove hate speech within 24 hours. This self-regulation did not work, so the German Government passed the Network Enforcement Act, commonly known as NetzDG, which became law in January 2018. This legislation forces tech companies to remove hate speech from their sites within 24 hours, and fines them €20 million if it is not removed.65

55.Some say that the NetzDG regulation is a blunt instrument that could be seen to tamper with free speech, and that it is specific to one country, while the content it covers spans many countries. Monika Bickert, from Facebook, told us that “sometimes regulations can take us to a place—you have probably seen some of the commentary about the NetzDG law in Germany—where there will be broader societal concerns about content that we are removing and whether that line is in the right place”.66 The then Secretary of State was also wary of the German legislation because “when a regulator gets to the position where they are policing the publication of politicians then you are into tricky territory”.67 However, as a result of this law, one in six of Facebook’s moderators now works in Germany, which is practical evidence that legislation can work.68

56.Within social media, there is little or no regulation. Hugely important and influential subjects that affect us—political opinions, mental health, advertising, data privacy—are being raised, directly or indirectly, in these tech spaces. People’s behaviour is being modified and changed as a result of the actions of social media companies. There is currently no sign of this stopping.

57.Social media companies cannot hide behind the claim of being merely a ‘platform’, claiming that they are tech companies and have no role themselves in regulating the content of their sites. That is not the case; they continually change what is and is not seen on their sites, based on algorithms and human intervention. However, they are also significantly different from the traditional model of a ‘publisher’, which commissions, pays for, edits and takes responsibility for the content it disseminates.

58.We recommend that a new category of tech company be formulated, which tightens tech companies’ liabilities, and which is not necessarily either a ‘platform’ or a ‘publisher’. We anticipate that the Government will put forward these proposals in its White Paper later this year and hope that sufficient time will be built in for our Committee to comment on new policies and possible legislation.

59.We support the launch of the Government’s Cairncross Review, which has been charged with studying the role of the digital advertising supply chain, and whether its model incentivises the proliferation of inaccurate or misleading news. We propose that this Report be taken into account as a submission to the Cairncross Review. We recommend that the possibility of the Advertising Standards Authority regulating digital advertising be considered as part of the Review. We ourselves plan to take evidence on this question this autumn, from the ASA itself, and as part of wider discussions with DCMS and Ofcom.

60.It is our recommendation that this process should establish clear legal liability for the tech companies to act against harmful and illegal content on their platforms. This should include both content that has been referred to them for takedown by their users, and other content that should have been easy for the tech companies to identify for themselves. In these cases, failure to act on the part of the tech companies could leave them open to legal proceedings launched either by a public regulator, and/or by individuals or organisations who have suffered as a result of this content being freely disseminated on a social media platform.

Transparency

61.Time and again during the course of our inquiry, we found that Facebook and other tech companies failed to provide us with the information that we sought. We undertook fifteen exchanges of correspondence with Facebook, and two oral evidence sessions, in an attempt to elicit some of the information that they held, including information regarding users’ data, foreign interference and details of the so-called ‘dark ads’ that had reached Facebook users.69 Facebook consistently responded to questions by giving the minimal amount of information possible, and routinely failed to offer information relevant to the inquiry, unless it had been expressly asked for. It provided witnesses who were unwilling or unable to give full answers to the Committee’s questions. This is the reason why the Committee has continued to press for Mark Zuckerberg to appear as a witness as, by his own admission, he is the person who decides what happens at Facebook.

62.We ask, once more, for Mr Zuckerberg to come to the Committee to answer the many outstanding questions to which Facebook has not responded adequately to date. Edward Lucas, a writer and security policy expert, rightly told us that Facebook should not be in a position of marking its own homework: “They have a duty as a platform to have transparent rules that can be discussed with the outside world and we should be able to check stuff. […] We cannot just trust Facebook to go after the event and say, ‘Nothing to see here, move along’. We should be able to see in real time who is advertising”.70

63.As the then Secretary of State, Rt Hon Matt Hancock MP, pointed out when he gave evidence to us, the Defamation Act 2013 contains provisions stating that, if a user is defamed on social media, and the offending individual cannot be identified, the liability rests with the platform.71

64.Tech companies are not passive platforms on which users input content; they reward what is most engaging, because engagement is part of their business model and their growth strategy. They have profited greatly by using this model. This manipulation of the sites by tech companies must be made more transparent. Facebook has all of the information. Those outside of the company have none of it, unless Facebook chooses to release it. Facebook was reluctant to share information with the Committee, which does not bode well for future transparency. We ask, once more, for Mr Zuckerberg to come to the Committee to answer the many outstanding questions to which Facebook has not responded adequately, to date.

65.Facebook and other social media companies should not be in a position of ‘marking their own homework’. As part of its White Paper this Autumn, the Government needs to carry out proactive work to find practical solutions to issues surrounding transparency that will work for users, the Government, and the tech companies.

66.Facebook and other social media companies have a duty to publish and to follow transparent rules. The Defamation Act 2013 contains provisions stating that, if a user is defamed on social media, and the offending individual cannot be identified, the liability rests with the platform. We urge the Government to examine the effectiveness of these provisions, and to monitor tech companies to ensure they are complying with court orders in the UK and to provide details of the source of disputed content—including advertisements—to ensure that they are operating in accordance with the law, or any future industry Codes of Ethics or Conduct. Tech companies also have a responsibility to ensure full disclosure of the source of any political advertising they carry.

Bots

67.Bots are algorithmically-driven computer programmes designed to carry out specific tasks online, such as analysing and scraping data. Some are created for political purposes, such as automatically posting content, increasing follower numbers, supporting political campaigns, or spreading misinformation and disinformation. Samantha Bradshaw, from the Oxford Internet Institute, University of Oxford, described the different types of bots: some are completely automated, while others, described as ‘cyborgs’, combine automation with real people who engage through the accounts: “Those accounts are a lot harder to detect for researchers, because they feel a lot more genuine. Instead of just automating a bunch of tweets, so that something retweets different accounts 100 times a day, bots might actually post comments and talk with other users—real people—on the accounts”.72
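Researchers often look first at simple behavioural signals when hunting for automated accounts. The sketch below is a minimal frequency-based heuristic of that kind; the thresholds and signal names are illustrative assumptions, and, as Ms Bradshaw notes, ‘cyborg’ accounts that mix automation with genuine conversation defeat simple rules like this.

```python
def looks_automated(posts_per_day: float, reply_ratio: float) -> bool:
    """Flag accounts that post at very high volume but rarely converse.

    posts_per_day: average number of posts or retweets per day
    reply_ratio:   fraction of activity that is conversational replies
    """
    HIGH_VOLUME = 100     # e.g. retweeting other accounts 100 times a day
    LOW_INTERACTION = 0.05
    return posts_per_day >= HIGH_VOLUME and reply_ratio <= LOW_INTERACTION

print(looks_automated(posts_per_day=150, reply_ratio=0.01))  # True: bot-like
print(looks_automated(posts_per_day=150, reply_ratio=0.40))  # False: cyborg-like mix
```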

68.When she gave evidence in February 2018, Monika Bickert, Head of Global Policy Management at Facebook, would not confirm whether there were around 50,000 bot accounts during the US presidential election of 2016.73 However, when Mike Schroepfer, CTO of Facebook, gave evidence in April 2018, after the Cambridge Analytica events had unfolded, he was more forthcoming about the problem of bots:

The key thing here is people trying to create inauthentic identities on Facebook, claiming they are someone other than who they are. To give you a sense of the scale of that problem and the means, while we are in this testimony today it is likely we will be blocking hundreds of thousands of attempts by people around the world to create fake accounts through automated systems. This is literally a day-to-day fight to make sure that people who are trying to abuse the platform are kept off it and to make sure that people use Facebook for what we want it for, which is to share it with our friends.74

69.Mike Schroepfer also said that removal of such bots can be difficult, and was evasive about how many fake accounts had been removed, telling us: “We are purging fake accounts all the time and dealing with fraudulent ads and we do not tend to report each specific instance. I know we report aggregate statistics on a regular basis, but it is not something we are reporting here or there, so I don’t know”.75 The problem with removing such bots without a systematic appraisal of their composition is that valuable information is lost. Such information would prove invaluable to researchers in making connections, in order to prevent future attacks by malign actors.

Algorithms

70.Both social media companies and search engines use algorithms, or sequences of instructions, to personalise news and other content for users. The algorithms select content based on factors such as a user’s past online activity, social connections, and location. Samantha Bradshaw, from the Oxford Internet Institute, told us about the power of Facebook to manipulate people’s emotions by showing different types of stories to them: “If you showed them more negative stories, they would feel more negatively. If you showed them positive stories, they would feel more positive”.76 The tech companies’ business models rely on revenue coming from the sale of adverts and, because the bottom line is profit, negative emotions (which appear more quickly than positive emotions) will always be prioritised. This makes it possible for negative stories to spread.
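The mechanism can be illustrated with a toy ranking function. The sketch below is a simplified illustration of engagement-driven ranking; the weights and signal names are our illustrative assumptions, not any company’s actual news feed algorithm.

```python
# Toy engagement-driven ranker: each post is scored on signals like
# social closeness and predicted engagement, then the feed is sorted
# by score. If emotive content reliably draws reactions, an engagement
# objective ranks it higher by construction.

def engagement_score(post: dict, user: dict) -> float:
    score = 0.0
    if post["author"] in user["close_connections"]:
        score += 2.0                               # friends-and-family boost
    score += 1.5 * post["predicted_click_rate"]    # predicted engagement
    score += 1.0 * post["predicted_reaction_rate"]
    return score

posts = [
    {"author": "page",   "predicted_click_rate": 0.2, "predicted_reaction_rate": 0.8},
    {"author": "friend", "predicted_click_rate": 0.1, "predicted_reaction_rate": 0.1},
]
user = {"close_connections": {"friend"}}

feed = sorted(posts, key=lambda p: engagement_score(p, user), reverse=True)
print([p["author"] for p in feed])  # ['friend', 'page'] under these weights
```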

71.Information about algorithms that dictate what users see on their News Feed is not publicly available. But just as information about the tech companies themselves needs to be more transparent, so does information about the algorithms. These can carry inherent biases, such as those involving gender and race, as a result of the way in which engineers develop them; these biases are then replicated, spread, and reinforced. Monika Bickert, from Facebook, admitted that Facebook was concerned about “any type of bias, whether gender bias, racial bias or other forms of bias that could affect the way that work is done at our company. That includes working on algorithms”. She went on to describe ways in which they were attempting to address this problem, including “initiatives ongoing, right now, to try to develop talent in under-represented communities”.77 In our opinion, Facebook should be taking a more active and urgent role in tackling such inherent biases in algorithm development, to prevent them being replicated and reinforced. Claire Wardle, Executive Director of First Draft News, told us of the importance of getting behind the ‘black box’ of the working of algorithms, in order to understand the rules and motivations of the tech companies:

What are the questions to ask a platform about why it was created? What are the metrics for that particular algorithm? How can we have more insight into that algorithm? How can we think about frameworks of algorithms? Irrespective of the platform, how can we set up that framework so that platforms have to be not just transparent but transparent across particular aspects and elements?78

72.Just as the finances of companies are audited and scrutinised, the same type of auditing and scrutinising should be carried out on the non-financial aspects of technology companies, including their security mechanisms and algorithms, to ensure they are operating responsibly. The Government should provide the appropriate body with the power to audit these companies, including algorithmic auditing, and we reiterate the point that the ICO’s powers should be substantially strengthened in these respects.
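One minimal form an algorithmic audit could take is a check of whether an algorithm’s outcomes differ across demographic groups. The sketch below computes a simple selection-rate disparity; the data, group labels and any threshold for concern are hypothetical, and a real audit would also examine training data, code and a range of fairness measures.

```python
from collections import defaultdict

def selection_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """decisions: (demographic group, whether the algorithm selected the person).
    Returns the selection rate per group."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, chosen in decisions:
        totals[group] += 1
        selected[group] += chosen
    return {g: selected[g] / totals[g] for g in totals}

# Hypothetical audit log of algorithmic decisions.
audit_log = [("group_a", True), ("group_a", True),
             ("group_b", True), ("group_b", False)]

rates = selection_rates(audit_log)
disparity = max(rates.values()) - min(rates.values())
print(rates, "disparity:", disparity)  # a large gap would warrant scrutiny
```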

73.If companies like Facebook and Twitter fail to act against fake accounts, and to account properly for the estimated total of fake accounts on their sites at any one time, this could not only damage the user experience, but potentially defraud advertisers who could be buying target audiences on the basis that the user profiles are connected to real people. We ask the Competition and Markets Authority to consider conducting an audit of the operation of the advertising market on social media.

Privacy settings and ‘terms and conditions’

74.Facebook and other tech companies make it hard for users to protect their own data. A report by the Norwegian Consumer Council, ‘Deceived by Design’, published in June 2018, highlighted the fact that Facebook, Google and Microsoft direct users away from privacy-friendly options on their services in an “unethical” way, while giving users “an illusion of control”.79 Users’ privacy rights are usually buried in tech companies’ ‘terms and conditions’ policies, which are themselves complicated; nor do the companies follow their own terms and conditions in ensuring that content is age-appropriate and that age verification takes place.80

75.Social media companies have a legal duty to inform users of their privacy rights. Companies give users the illusion of freedom over how they control their data, but in practice they make it extremely difficult for users to protect it. Complicated and lengthy terms and conditions, small buttons to protect our data and large buttons to share our data mean that, although in principle we have the ability to exercise our rights over our data—through, for example, the GDPR and the Data Protection Act—in practice it is made hard for us.

76.The UK Government should consider establishing a digital Atlantic Charter as a new mechanism to reassure users that their digital rights are guaranteed. This innovation would demonstrate the UK’s commitment to protecting and supporting users, and establish a formal basis for collaboration with the US on this issue. The Charter would be voluntary, but would be underpinned by a framework setting out clearly the respective legal obligations in signatory countries. This would help ensure alignment, if not in law, then in what users can expect in terms of liability and protections.

‘Free Basics’ and Burma

77.One of Facebook’s unofficial company mottoes was to “move fast and break things”: to take risks and not to consider the consequences. Sandy Parakilas, an ex-Facebook employee, told us that most of the goals of the company were centred around growth, in terms of the number of people using the service and the subsequent revenue.81 But with growth come unintended consequences, if that growth happens unchecked. This unchecked growth of Facebook is continuing. ‘Free Basics’ is a Facebook service that provides people in developing countries with mobile phone access to various services without data charges. This content includes news, employment, health information and local information. Free Basics is available in 63 countries around the world.82

78.Of Burma’s population of 50 million, 30 million are monthly active users on Facebook.83 While Free Basics provides internet access to the majority of people in Burma, at the same time it severely limits the information available to users, making Facebook virtually the only source of online information for the majority of people in Burma.84 The United Nations accused Facebook of playing a determining role in stirring up hatred against the Rohingya Muslim minority in Rakhine State. In March 2018, the UN Myanmar investigator Yanghee Lee said that the platform had morphed into a ‘beast’ that helped to spread vitriol against Rohingya Muslims.85

79.When Mike Schroepfer, the CTO at Facebook, gave evidence in April 2018, he described the situation in Burma as “awful” and said that “we need to and are trying to do a lot more to get hate speech and all this kind of vile content off the platform”,86 but he could not tell us when Facebook had started work on limiting hate speech, how many fake accounts had been identified and removed from Burma, or how much revenue Facebook was making from Facebook users in Burma.87

80.Mr Schroepfer promised to submit supplementary evidence to give us that information. However, Facebook’s supplementary evidence stated: “We do not break down the removal of fake accounts by country. […] Myanmar [Burma] is a small market for Facebook. We do not publish country advertising revenue figures”.88 We sent yet another letter, asking why Facebook does not break down the removal of fake accounts by country, which seems a serious lapse in demonstrating how it takes responsibility when problems with fake accounts arise.89 To date, we have not received an answer.

81.UK aid to Burma is planned at £100 million for 2018.90 The Department for International Development told the International Development Committee that “for our programme to be successful, Burma must work towards the implementation of inclusive peace agreements, a new political settlement; and the military serving, rather than ruling, Burma”.91 To date, the UK’s total support for the crisis since August 2017 is £129 million.

82.The United Nations has named Facebook as being responsible for inciting hatred against the Rohingya Muslim minority in Burma, through its ‘Free Basics’ service. It provides people with free mobile phone access without data charges, but is also responsible for the spread of disinformation and propaganda. The CTO of Facebook, Mike Schroepfer, described the situation in Burma as “awful”, yet Facebook cannot show us that it has done anything to stop the spread of disinformation against the Rohingya minority.

83.The hate speech against the Rohingya—built up on Facebook, much of which is disseminated through fake accounts—and the subsequent ethnic cleansing, has potentially greatly reduced the success of DFID’s aid programmes, based on the criteria DFID set for success. The activity of Facebook undermines international aid to Burma, including the UK Government’s work. Facebook is releasing a product that is dangerous to consumers and deeply unethical. We urge the Government to demonstrate how seriously it takes Facebook’s apparent collusion in spreading disinformation in Burma, at the earliest opportunity. This is a further example of Facebook failing to take responsibility for the misuse of its platform.

Code of Ethics and developments

84.Facebook has hampered our efforts to get information about their company throughout this inquiry. It is as if it thinks that the problem will go away if it does not share information about the problem, and reacts only when it is pressed. Time and again we heard from Facebook about mistakes being made and then (sometimes) rectified, rather than designing the product ethically from the beginning of the process. Facebook has a ‘Code of Conduct’, which highlights the principles by which Facebook staff carry out their work, and states that employees are expected to “act lawfully, honestly, ethically, and in the best interests of the company while performing duties on behalf of Facebook”.92 Facebook has fallen well below this standard in Burma.

85.The then Secretary of State, Rt Hon Matt Hancock MP, talked about the need for tech companies to move from a libertarian attitude—“the foundation of the internet”—to one of “liberal values on the internet, which is supporting and cherishing the freedom but not the freedom to harm others”.93 He warned that tech company leaders have a responsibility, otherwise responsibility will be imposed on them: “I do not, for a moment, buy this idea that just because the internet is global therefore nation states do not have a say in it. We are responsible. We collectively, Parliament is responsible, for the statutory rules where our society lives”.94

Monopolies and the business models of tech companies

86.The dominance of a handful of powerful tech companies, such as Facebook, Twitter and Google, has resulted in their behaving as if they were monopolies in their specific area. Traditionally, the basis of competition policy with regard to monopolies has been the issue of consumer detriment, such as the risk of overcharging. However, in the tech world, consumer detriment is harder to quantify. In the digital sphere, many of these services have marginal running costs, are free to the consumer at the point of use, and may even benefit the consumer precisely because they are monopolistic—the sharing of information is the point of these companies, and scale improves the accuracy of services such as Google Maps. As the then Secretary of State told us, “The whole question of the concept of how we run competition policy in an era where many goods and many other new innovations have zero marginal costs and are free is intellectually difficult.”95

87.With free access to services must come the means of funding the businesses: tech companies’ business models rely on the data of users, which advertisers utilise in order to maximise revenue. Facebook and Google have 60% of US digital ad spend and 20% of total global spend, as of February 2018.96 Therefore, consumer protection in the modern world is not only about goods, it is about the protection of data. Tech companies have extolled the fact that their innovations are free to use, but in doing so the users become the product of the companies, and this is where issues of mistrust and misuse arise. The new measures in the GDPR allow users to see what data the companies hold about them, and users can request their data to be removed or transferred to other tech companies, but in order for this to be effective, users must know about and exercise these rights.97

88.Professor Bakir, from Bangor University, talked of how technology continually changes, with people adapting to that technology in unpredictable ways.98 She suggested the establishment of a working group, to monitor what is being developed in the area of misinformation and disinformation because “what is around the corner may be much more worrying than what we have experienced to date”.99 As technology develops so quickly, regulation needs to be based not on specifics, but on principles, and adaptive enough to withstand technological developments.

89.A professional global Code of Ethics should be developed by tech companies, in collaboration with this and other governments, academics, and interested parties, including the World Summit on Information Society, to set down in writing what is and is not acceptable on social media, with possible liabilities for companies and for individuals working for those companies, including those technical engineers involved in creating the software for the companies. New products should be tested to ensure that they are fit for purpose and do not constitute dangers to the users, or to society.

90.The Code of Ethics should be the backbone of tech companies’ work, and should be continually referred to when developing new technologies and algorithms. If companies fail to adhere to their own Code of Ethics, the UK Government should introduce regulation to make such ethical rules compulsory.

91.The dominance of a handful of powerful tech companies, such as Facebook, Twitter and Google, has resulted in their behaving as if they were monopolies in their specific area. While this portrayal of tech companies does not appreciate the benefits of a shared service, where people can communicate freely, there are considerations around the data on which those services are based, and how these companies are using the vast amount of data they hold on users. In its White Paper, the Government must set out why the issue of monopolies is different in the tech world, and the measures needed to protect users’ data.


26 As of February 2018, 79% of the UK population had Facebook accounts, 79% used YouTube, and 47% used Twitter, https://weareflint.co.uk/press-release-social-media-demographics-2018

27 The Center for Humane Technology website, accessed 27 June 2018

28 Tristan Harris, Q3149

30 Tristan Harris, Q3147

31 Elizabeth Denham, Information Commissioner (FKN0051)

32 The General Data Protection Regulation (GDPR) came into force on 25 May 2018 and is a regulation under EU law on data protection and privacy for all individuals within the European Union (EU) and the European Economic Area (EEA). It forms part of the data protection regime in the UK, together with the new Data Protection Act 2018 (DPA 2018).

33 Elizabeth Denham, Information Commissioner (FKN0051)

34 Elizabeth Denham, Information Commissioner (FKN0051)

35 Elizabeth Denham, Information Commissioner (FKN0051)

37 Guide to the GDPR, ICO website, accessed 21 July 2018

38 Brexit risk to UK personal data, Robert Madge, Medium, 22 June 2018

45 Electoral Commission (FNW0048)

47 The Electoral Commission’s remit covers the UK only and it has no power to intervene or to stop someone acting if they are outside the UK (Claire Bassett, Q2655)

51 Digital campaigning: increasing transparency for voters, Electoral Commission, 25 June 2018, p12

52 Arron Banks (FKN0056)

53 Q3331

54 Digital campaigning: increasing transparency for voters, Electoral Commission, 25 June 2018, p9

56 Digital campaigning: increasing transparency for voters, Electoral Commission, 25 June 2018

58 Claire Bassett, Q2617

59 Report of the Independent Commission on Referendums, UCL Constitution Unit, July 2018

61 Monika Bickert, Q434

62 Frank Sesno, Q583

68 Professor Lewandowsky, Q233

69 Oral evidence session, 8 February 2018; oral evidence session, 26 April 2018; exchanges of correspondence between the Chair of the DCMS Committee and Facebook representatives, between 24 October and 8 June 2018, can be found on the DCMS Committee’s inquiry page.

71 Section 5, ‘Operators’, Defamation Act 2013

79 Deceived by design, Norwegian Consumer Council, 26 June 2018.

80 The then Secretary of State, Rt Hon Matt Hancock MP, Q968

82 internet.org by Facebook, accessed 21 July 2018.

87 Qq 2493 to 2496

90 Bangladesh, Burma and the Rohingya crisis, International Development Committee, fourth report of session 2017–19, HC 1054, para 11

91 Ibid.

92 Code of Conduct, Facebook, 31 May 2018

96 Ian Lucas MP, Q619

97 Guide to the GDPR, ICO website




Published: 29 July 2018