Disinformation and 'fake news': Final Report

Conclusions and recommendations

Regulation and the role, definition and legal liability of tech companies

1. Social media companies cannot hide behind the claim of being merely a 'platform' and maintain that they have no responsibility themselves for regulating the content of their sites. We repeat the recommendation from our Interim Report that a new category of tech company be formulated, which tightens tech companies' liabilities, and which is not necessarily either a 'platform' or a 'publisher'. This approach would see the tech companies assume legal liability for content identified as harmful after it has been posted by users. We ask the Government to consider this new category of tech company in its forthcoming White Paper. (Paragraph 14)

2. By choosing not to appear before the Committee and by choosing not to respond personally to any of our invitations, Mark Zuckerberg has shown contempt towards both the UK Parliament and the 'International Grand Committee', involving members from nine legislatures from around the world. (Paragraph 29)

3. Our Interim Report recommended that clear legal liabilities should be established for tech companies to act against harmful or illegal content on their sites. There is now an urgent need to establish independent regulation. We believe that a compulsory Code of Ethics should be established, overseen by an independent regulator, setting out what constitutes harmful content. The independent regulator would have statutory powers to monitor relevant tech companies; this would create a regulatory system for online content that is as effective as that for offline content industries. (Paragraph 37)

4. As we said in our Interim Report, such a Code of Ethics should be similar to the Broadcasting Code issued by Ofcom, which is based on the guidelines established in section 319 of the Communications Act 2003. The Code of Ethics should be developed by technical experts and overseen by the independent regulator, in order to set down in writing what is and is not acceptable on social media. This should include harmful and illegal content that has been referred to the companies for removal by their users, or that should have been easy for the tech companies themselves to identify. (Paragraph 38)

5. The process should establish clear legal liability for tech companies to act against agreed harmful and illegal content on their platforms, and such companies should have relevant systems in place to highlight and remove 'types of harm' and to ensure that cyber security structures are in place. If tech companies (including the technical engineers involved in creating their software) are found to have failed to meet their obligations under such a Code, and not to have acted against the distribution of harmful and illegal content, the independent regulator should have the ability to launch legal proceedings against them, with the prospect of large fines as the penalty for non-compliance with the Code. (Paragraph 39)

6. This same public body should have statutory powers to obtain any information from social media companies that is relevant to its inquiries. This could include the capability to check what data is being held on an individual user, if that user requests such information. This body should also have access to tech companies' security mechanisms and algorithms, to ensure they are operating responsibly. This public body should be accessible to the public and able to take up complaints from members of the public about social media companies. We ask the Government to put forward these proposals in its forthcoming White Paper. (Paragraph 40)

7. We support the ICO's recommendation that inferred data should be protected under the law as fully as personal information. The protections of privacy law should be extended beyond personal information to include the models used to make inferences about an individual. We recommend that the Government studies ways in which the protections of privacy law can be expanded to include models used to make inferences about individuals, in particular during political campaigning. This would ensure that inferences about individuals are afforded the same protection as individuals' personal information. (Paragraph 48)

8. In our Interim Report, we recommended that a levy should be placed on tech companies operating in the UK to support the enhanced work of the ICO. We reiterate that recommendation. The Chancellor's decision, in his 2018 Budget, to impose a new 2% digital services tax on the UK revenues of big technology companies from April 2020 shows that the Government is open to the idea of a levy on tech companies. The Government's response to our Interim Report implied that it would not be financially supporting the ICO any further, contrary to our recommendation. We urge the Government to reassess this position. (Paragraph 51)

9. The new system of independent regulation that we recommend must be adequately funded. We recommend that a levy be placed on tech companies operating in the UK to fund its work. (Paragraph 52)

Data use and data targeting

10. The Cambridge Analytica scandal was facilitated by Facebook's policies; had Facebook fully complied with its FTC settlement, the scandal would not have happened. The US Federal Trade Commission (FTC) complaint of 2011 charged Facebook with failing to protect users' data and with letting app developers gain as much access to user data as they liked, without restraint, and stated that Facebook had built its company in a way that made data abuses easy. When asked about Facebook's failure to act on the FTC's complaint, Elizabeth Denham, the Information Commissioner, told us: "I am very disappointed that Facebook, being such an innovative company, could not have put more focus, attention and resources into protecting people's data". We are equally disappointed. (Paragraph 76)

11. The evidence that we obtained from the Six4Three court documents indicates that Facebook was willing to override its users' privacy settings in order to transfer data to some app developers, to charge some developers high prices in advertising in exchange for that data, and to starve other developers—such as Six4Three—of that data, thereby causing them to lose their business. It seems clear that Facebook was, at the very least, in violation of its Federal Trade Commission settlement. (Paragraph 135)

12. The Information Commissioner told the Committee that Facebook needs to significantly change its business model and its practices to maintain trust. From the documents we received from Six4Three, it is evident that Facebook intentionally and knowingly violated both data privacy and anti-competition laws. The ICO should carry out a detailed investigation into the practices of the Facebook Platform, its use of users' and users' friends' data, and its use of 'reciprocity' in the sharing of data. (Paragraph 136)

13. Ireland is the lead authority for Facebook under GDPR, and we hope that these documents will provide useful evidence for Helen Dixon, the Irish Data Protection Commissioner, in her current investigations into the way in which Facebook targeted, monitored, and monetised its users. (Paragraph 137)

14. In our Interim Report, we stated that the dominance of a handful of powerful tech companies has resulted in their behaving as if they were monopolies in their specific areas, and that this raises questions about the data on which those services are based. Facebook, in particular, is unwilling to be accountable to regulators around the world. The Government should consider the impact of such monopolies on the political world and on democracy. (Paragraph 138)

15. The Competition and Markets Authority (CMA) should conduct a comprehensive audit of the operation of the advertising market on social media. The Committee made this recommendation in its Interim Report, and we are pleased that it has also been supported in the independent Cairncross Report, commissioned by the Government and published in February 2019. Given the contents of the Six4Three documents that we have published, the CMA should also investigate whether Facebook specifically has been involved in any anti-competitive practices, and conduct a review of Facebook's business practices towards other developers, to decide whether Facebook is unfairly using its dominant market position in social media to decide which businesses should succeed or fail. We hope that the Government will include these considerations when it reviews the UK's competition powers in April 2019, as stated in its response to our Interim Report. Companies like Facebook should not be allowed to behave like 'digital gangsters' in the online world, considering themselves to be ahead of and beyond the law. (Paragraph 139)

16. From the evidence we received, which has been supported by the findings of both the ICO and the Electoral Commission, it is clear that a porous relationship existed between Eldon Insurance and Leave.EU, with staff and data from one organisation augmenting the work of the other. There was no attempt to create a strict division between the two organisations, in breach of current laws. We look forward to hearing the findings of the ICO's audits into the two organisations. (Paragraph 146)

17. As set out in our Interim Report, Arron Banks and Andy Wigmore showed complete disregard and disdain for the parliamentary process when they appeared before us in June 2018. It is now evident that they gave misleading evidence to us, too, about the working relationship between Eldon Insurance and Leave.EU. They are individuals, clearly, who have less than a passing regard for the truth. (Paragraph 147)

Aggregate IQ

18. There is clear evidence that there was a close working relationship between Cambridge Analytica, SCL and AIQ. There was certainly a contractual relationship, but we believe that the information revealed from the repository implies something closer, with data exchanged between AIQ and SCL, and between AIQ and Cambridge Analytica. (Paragraph 166)

19. AIQ worked both on the US Presidential primaries and for Brexit-related organisations, including the designated Vote Leave group, during the EU Referendum. The work of AIQ highlights the fact that data has been and is still being used extensively by private companies to target people, often in a political context, in order to influence their decisions. Such targeting is far more common than people think. The next chapter highlights its widespread nature. (Paragraph 192)

Advertising and political campaigning

20. We repeat the recommendation from our Interim Report that the Government should look at the ways in which UK law should define digital campaigning, including having agreed definitions of what constitutes online political advertising, such as agreed types of words that continually arise in adverts that are not sponsored by a specific political party. There also needs to be an acknowledgement of the role and power of unpaid campaigns and Facebook Groups that influence elections and referendums (both inside and outside the designated period). (Paragraph 210)

21. Electoral law is not fit for purpose and needs to be changed to reflect changes in campaigning techniques, and the move from physical leaflets and billboards to online, microtargeted political campaigning. There needs to be: absolute transparency of online political campaigning, including clear, persistent banners on all paid-for political adverts and videos, indicating the source and the advertiser; a category introduced for digital spending on campaigns; and explicit rules surrounding designated campaigners' roles and responsibilities. (Paragraph 211)

22. We would expect that the Cabinet Office's consultation will result in the Government concluding that paid-for political advertising should be publicly accessible, clear and easily recognisable. Recipients should be able to identify the source, who uploaded it, who sponsored it, and its country of origin. (Paragraph 212)

23. The Government should carry out a comprehensive review of the current rules and regulations surrounding political work during elections and referenda, including: increasing the length of the regulated period; defining what constitutes political campaigning; and reducing the time for spending returns to be sent to the Electoral Commission. (Paragraph 213)

24. The Government should explore ways in which the Electoral Commission can be given more powers to carry out its work comprehensively, including the following measures:

25. Political advertising items should be publicly accessible in a searchable repository—showing who is paying for the ads, which organisations are sponsoring them, and who is being targeted by them—so that members of the public can understand the behaviour of individual advertisers. It should be run independently of the advertising industry and of political parties. This recommendation builds on paragraph 144 of our Interim Report. (Paragraph 215)

26. We agree with the ICO's proposal that a Code of Practice, highlighting the use of personal information in political campaigning and applying to all data controllers who process personal data for the purpose of political campaigning, should be underpinned by primary legislation. We urge the Government to act on the ICO's recommendation and bring forward primary legislation to place this Code of Practice on a statutory footing. (Paragraph 216)

27. We support the ICO's recommendation that all political parties should work with the ICO, the Cabinet Office and the Electoral Commission to identify and implement a cross-party solution to improve transparency over the use of commonly-held data. This would be a practical solution to ensure that the use of data during elections and referenda is treated lawfully. We hope that the Government will work towards making this collaboration happen, and that it will address all of these issues when it responds to its consultation, "Protecting the Debate: Intimidation, Influence, and Information", and to the Electoral Commission's report, "Digital Campaigning: increasing transparency for voters". A crucial aspect of political advertising and influence is foreign interference in elections, which we hope the Government will also address robustly. (Paragraph 217)

28. Mainstream Network is yet another, more recent, example of an online organisation seeking to influence political debate using methods similar to those that caused concern during the EU Referendum, and there is no good case for Mainstream Network to hide behind anonymity. We look forward to receiving information from Facebook about the origins of Mainstream Network, which—to date—we have not received, despite promises from Richard Allan that he would provide it. We consider Facebook's response generally to be disingenuous and another example of Facebook's bad faith. The Information Commissioner has confirmed that her office is currently investigating this website's activities, and Facebook will, in any event, have to co-operate with the ICO. (Paragraph 222)

29. Tech companies must address the issue of shell companies and other professional attempts to hide identity in advert purchasing, especially around political advertising—both within and outside campaigning periods. There should be full disclosure of the targeting used as part of advertising transparency. The Government should explore ways of regulating the use of external targeting on social media platforms, such as Facebook's Custom Audiences. (Paragraph 223)

30. Donations made to political parties in Northern Ireland before July 2017 are protected from disclosure under Section 71E of the Political Parties, Elections and Referendums Act 2000, which prevents the Electoral Commission from disclosing any information relating to such donations. We concur with the Electoral Commission that it is "deeply regrettable" that it is unable, by law, to tell Members of Parliament and the public about the source of the £435,000 donation given by the Constitutional Research Council (CRC) to the DUP, or the due diligence that was followed. Because of the law as it currently stands, this Committee and the wider public have no way of investigating the source of the £435,000 donation made to the DUP on behalf of the CRC, and cannot even know whether it came from an organisation whose membership had sanctioned the donation, or from a wealthy individual. (Paragraph 232)

31. There is an absence of transparency surrounding the relationship between the Constitutional Research Council, the DUP and Vote Leave. We believe that, in order to avoid having to disclose the source of this £435,000 donation, the CRC deliberately and knowingly exploited a loophole in the electoral law to funnel money to the Democratic Unionist Party in Northern Ireland. That money was used to fund pro-Brexit newspaper advertising outside Northern Ireland and to pay the Canadian-based data analytics company, Aggregate IQ. (Paragraph 233)

32. We support the Electoral Commission in its request that the Government extend the transparency rules to cover donations made to political parties in Northern Ireland from 2014 onwards. This period of time would cover two UK general elections, two Northern Ireland Assembly elections, the Scottish independence referendum, the EU referendum, and EU and local government elections. We urge the Government to make this change in the law as soon as is practicable to ensure full transparency over these elections and referendums. (Paragraph 234)

33. We welcome Dame Frances Cairncross's report on safeguarding the future of journalism, and the establishment of a code of conduct to rebalance the relationship between news providers and social media platforms. In particular, we welcome the recommendation that online digital newspapers and magazines should be zero-rated for VAT, as is the case for printed versions. This would remove a perverse incentive that discourages news companies from developing more paid-for digital services. We also support the recommendation, which chimes with our own, that the Competition and Markets Authority should investigate online advertising, focussing in particular on the major search and social media companies. (Paragraph 236)

Foreign influence in political campaigns

34. In common with other countries, the UK is clearly vulnerable to covert digital influence campaigns, and the Government should be conducting analysis to understand the extent of the targeting of voters by foreign players during past elections. We ask the Government whether current legislation to protect the electoral process from malign influence is sufficient. Legislation should be in line with the latest technological developments, and should be explicit on the illegal influencing of the democratic process by foreign players. We urge the Government to look into this issue and to respond in its White Paper. (Paragraph 249)

35. We are pleased that our recommendation set out in the Interim Report in July 2018, concerning Arron Banks and his donation, has been acted on both by the Electoral Commission—which has concerns that Banks is not the 'true source' of the donation—and by the National Crime Agency, which is currently investigating the source of the donation. (Paragraph 266)

36. There is a general principle that, subject to certain spending limits, funding from abroad is not allowed in UK elections. However, as the Electoral Commission has made clear, the current rules do not explicitly ban overseas spending. We recommend that, at the earliest opportunity, the Government reviews the current rules on overseas involvement in UK elections to ensure that foreign interference in UK elections, in the form of donations, cannot happen. We also need to be clear that Facebook, and all platforms, have a responsibility to comply with the law and not to facilitate illegal activity. (Paragraph 267)

37. Information operations are part of a complex, interrelated group of actions that promote confusion and unrest through information systems such as social media platforms. These firms, in particular Facebook, need to take action against the non-transparent administration of groups that are being used for political campaigns. They also need to impose much more stringent punishments on users who abuse the system. Merely having a fake disinformation account shut down, while being able to open another one the next moment, is hardly a deterrent. (Paragraph 271)

38. The Government should put pressure on social media companies to publicise any instances of disinformation. The Government needs to ensure that social media companies share information they have about foreign interference on their sites—including who has paid for political adverts, who has seen the adverts, and who has clicked on the adverts—with the threat of financial liability if such information is not forthcoming. Security certificates, authenticating social media accounts, would ensure that a real person was behind the views expressed on the account. (Paragraph 272)

39. We repeat our call to the Government to make a statement about how many investigations are currently being carried out into Russian interference in UK politics. We further recommend that the Government launches an independent investigation into past elections—including the UK election of 2017, the UK Referendum of 2016, and the Scottish Referendum of 2014—to explore what actually happened with regard to foreign influence, disinformation, funding, voter manipulation, and the sharing of data, so that appropriate changes to the law can be made and lessons can be learnt for future elections and referenda. (Paragraph 273)

SCL influence in foreign elections

40. We stated in our Interim Report that "SCL Group and associated companies have gone into administration, but other companies are carrying out similar work. Senior individuals involved in SCL and Cambridge Analytica appear to have moved onto new corporate vehicles." We recommended that "the National Crime Agency, if it is not already, should investigate the connections between the company SCL Elections Ltd and Emerdata Ltd". We repeat those recommendations in this Report. (Paragraph 296)

41. We recommend that the Government looks into ways in which PR and strategic communications companies can be audited, possibly by an independent body, to ensure that their campaigns do not conflict with the UK national interest and security concerns and do not obstruct the imposition of legitimate sanctions, as is currently the case with the legal selling of passports. Barriers need to be put in place to ensure that such companies cannot work both on sensitive UK Government projects and with clients whose intention might be to undermine those interests. (Paragraph 298)

42. The transformation of Cambridge Analytica into Emerdata illustrates how easy it is for discredited companies to reinvent themselves and potentially use the same data and the same tactics to undermine governments, including in the UK. The industry needs cleaning up. As the SCL/Cambridge Analytica scandal shows, the sort of bad practices indulged in abroad or for foreign clients risks making its way into UK politics. Currently, the strategic communications industry is largely self-regulated. The UK Government should consider new regulations that curb bad behaviour in this industry. (Paragraph 299)

43. There needs to be transparency in these strategic communications companies, with a public record of all campaigns that they work on, both at home and abroad. They need to be held accountable for breaking laws during campaigns anywhere in the world, or for working for financially non-transparent campaigns. We recommend that the Government addresses this issue when it responds to its consultation, 'Protecting the Debate: Intimidation, Influence, and Information'. (Paragraph 300)

44. We recommend that the Government revisits the UK Bribery Act, to gauge whether the legislation is enough of a regulatory brake on bad behaviour abroad. We also look to the Government to explore the feasibility of adopting a UK version of the US Foreign Agents Registration Act (FARA), which requires "persons acting as agents of foreign principals in a political or quasi-political capacity to make periodic public disclosure of their relationships with the foreign principal, as well as activities, receipts and disbursements in support of those activities". (Paragraph 301)

Digital literacy

45. On the one hand, Facebook gives the impression of working towards transparency, with regard to the auditing of its news content; but on the other, there is considerable obfuscation concerning the auditing of its adverts, which provide Facebook with its ever-increasing revenue. To make informed judgments about the adverts presented to them on Facebook, users need to see the source and purpose behind the content. (Paragraph 306)

46. As we wrote in our Interim Report, digital literacy should be a fourth pillar of education, alongside reading, writing and maths. In its response, the Government did not comment on our recommendation of a social media company levy, to be used, in part, to finance a comprehensive online educational framework developed by charities, NGOs, and the regulators themselves. Such a framework would inform people of the implications of sharing their data willingly, their rights over their data, and ways in which they can constructively engage and interact with social media sites. People need to be resilient in their relationship with such sites, particularly around what they read and what they write. We reiterate this recommendation to the Government, and look forward to its response. (Paragraph 312)

47. The public need to know more about their ability to report digital campaigning that they think is misleading and/or unlawful. Ofcom, the ASA, the ICO and the Electoral Commission need to raise their profiles so that people know about their services and roles. The Government should take a leading role in co-ordinating this crucial service for the public, and must provide clarity for members of the public about their rights with regard to social media companies. (Paragraph 313)

48. Social media users need online tools to help them distinguish between quality journalism and stories coming from organisations that have been linked to disinformation or are regarded as unreliable sources. The social media companies should be required either to develop such tools themselves or to work with existing providers, such as NewsGuard, to make these services available to their users. The requirement for social media companies to introduce these measures could form part of a new system of content regulation, based on a statutory code and overseen by an independent regulator, as we have discussed earlier in this Report. (Paragraph 314)

49. Social media companies need to be more transparent about their own sites, and how they work. Rather than hiding behind complex agreements, they should be informing users of how their sites work, including curation functions and the way in which algorithms are used to prioritise certain stories, news and videos, depending on each user's profile. The more people know about how the sites work, and how the sites use individuals' data, the more informed we shall all be, which in turn will make it easier to make choices about the use and privacy of these sites. (Paragraph 315)

50. Ofcom, the ICO, the Electoral Commission and the Advertising Standards Authority have all written separately about their role in promoting digital literacy. We recommend that the Government ensures that these four main regulators produce a more united strategy in relation to digital literacy. Included in this united approach should be a public discussion on how we, as individuals, are happy for our data to be used and shared. People need to know how their data is being used (building on recommendations we set out in Chapter Two of this Final Report), and users need to know how to set the boundaries that they want with regard to their personal data, and how those boundaries should be set. Included in this debate should be arguments around whether users want an agreed basic expectation of privacy, in a similar vein to a basic level of hygiene. Users could have the ability to opt out of such minimum thresholds, if they chose. (Paragraph 316)

51. We recommend that participating in social media should allow more pause for thought. More obstacles or 'friction' should be incorporated both into social media platforms and into users' own activities, to give people time to consider what they are writing and sharing. Techniques for slowing down interaction online should be taught, so that people question both what they write and what they read, and pause and think further before they make a judgement online. (Paragraph 317)

Published: 18 February 2019