1. The relevant experts in the ASA, the Electoral Commission, Ofcom and the UK Statistics Authority should co-operate through a regulatory committee on political advertising. Political parties should work with these regulators to develop a code of practice for political advertising, along with appropriate sanctions, that restricts fundamentally inaccurate advertising during a parliamentary or mayoral election, or referendum. This regulatory committee should adjudicate breaches of this code. (Paragraph 36)
2. The Online Harms Bill should make clear that misinformation and disinformation are within its scope. (Paragraph 40)
3. Ofcom should produce a code of practice on misinformation. This code should include a requirement that if a piece or pattern of content is identified as misinformation by an accredited fact checker, then it should be flagged as misinformation on all platforms. The content should then no longer be recommended to new audiences. Ofcom should work with platforms to experiment and determine how this should be presented to users and whether audiences that had previously engaged with the content should be shown the fact check. (Paragraph 54)
4. Ofcom should work with online platforms to agree a common means of accreditation (initially based on the International Fact Checking Network), a system of funding that keeps fact checkers independent both from Government and from platforms, and the development of an open database of what content has been fact checked across platforms and providers. (Paragraph 55)
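Such an open database implies a shared record format. The sketch below is a minimal editorial illustration, loosely modelled on the schema.org ClaimReview markup already used by many fact checkers; the field names are assumptions, not a format proposed in this Report.

```python
# Hypothetical record for the proposed cross-platform fact-check database.
# Field names are illustrative, loosely following schema.org ClaimReview.
from dataclasses import dataclass, field

@dataclass
class FactCheckRecord:
    claim_text: str    # the claim that was reviewed
    rating: str        # e.g. "False" or "Misleading"
    fact_checker: str  # accredited organisation (e.g. an IFCN signatory)
    review_url: str    # public write-up of the fact check
    content_urls: list[str] = field(default_factory=list)  # known instances across platforms
```

A platform could then match content it hosts against checked claims and apply the flagging and de-recommendation that the code of practice would require.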
5. The House of Lords Communications and Digital Select Committee should consider conducting an inquiry to examine the communication of official statistics in an online world. (Paragraph 61)
6. Parliament should establish formal partnerships with broadcasters during election periods to make optimal use of its research expertise to help better inform election coverage. (Paragraph 65)
7. A new settlement is needed to protect the role of local and public interest news. The Government should work urgently to implement those recommendations of the Cairncross Review which it accepts, as well as providing support for news organisations in dealing with the impact of COVID-19. (Paragraph 72)
8. The Competition and Markets Authority should conduct a full market investigation into online platforms’ control over digital advertising. (Paragraph 73)
9. The Government should introduce Online Harms legislation within a year of this Report’s publication. (Paragraph 84)
10. The Online Harms work should make clear that platforms’ duty of care extends to actions which undermine democracy. This means that the duty of care extends to preventing generic harm to our democracy as well as specific harm to an individual. (Paragraph 89)
11. For harmful but legal content, Ofcom’s codes of practice should focus on the principle that platforms should be liable for the content they rank, recommend or target to users. (Paragraph 108)
12. The Government should include as a provision in the Online Harms Bill that Ofcom will hold platforms accountable for content that they recommend to large audiences. Platforms should be held responsible for content that they recommend once it has reached a specific level of virality or is produced by users with large audiences. (Paragraph 109)
13. The Government should empower Ofcom to sanction platforms that fail to comply with their duty of care in the Online Harms Bill. These sanctions should include fines of up to four per cent of global turnover and powers to enforce ISP blocking of serially non-compliant platforms. (Paragraph 110)
14. The Government should establish an independent ombudsman for content moderation decisions to whom the public can appeal should they feel they have been let down by a platform’s decisions. This ombudsman’s decisions should be binding on the platform and in turn create clear standards to be expected for future decisions for UK users. These standards should be adjudicated by Ofcom, with platforms able to make representations on how they are applied within their moderation processes. The ombudsman should not prevent platforms removing content which they have due cause to remove. (Paragraph 127)
15. Parliament should set up a joint committee of both Houses to oversee Ofcom’s Online Harms work and that of the proposed ombudsman. This committee should be constituted so that there can be no Government majority amongst its Members. The committee should ensure an adequate budget for this portion of Ofcom’s work. Ofcom should be obliged to submit all codes of practice to the Committee for scrutiny. (Paragraph 134)
16. The joint committee should set the budget for the content moderation ombudsman. The committee should hold an appointment hearing with the ombudsman’s proposed chief executive and hold the power of veto over their appointment. (Paragraph 135)
17. The Government should introduce legislation to enact the ICO’s proposal for a committee of regulators that would allow for joint investigations between regulators on the model of the Regulatory Reform (Collaboration etc between Ombudsmen) Order 2007. This committee should also act as a forum to encourage the sharing of best practice between regulators and to support horizon-scanning activity. (Paragraph 147)
18. The CDEI should conduct a review of regulatory digital capacity across the CMA, ICO, Electoral Commission, ASA and Ofcom to determine their levels of digital expertise. This review should be completed with urgency, to inform the Online Harms Bill before it becomes law. The CDEI should work with Ofcom to help determine its role in online regulation. The review should consider: (Paragraph 148)
(a) What relative levels of digital expertise exist within regulators, and where skills gaps are becoming evident; (Paragraph 148)
(b) How these regulators currently draw on external expertise, and what shared system might be devised for seeking advice and support; (Paragraph 148)
(c) What changes in legislation governing regulators would be needed to allow for a shared pool of digital expertise and staffing resource that could work between and across regulators; (Paragraph 148)
(d) How this joint pool of staffing resource could be prioritised and funded between regulators. (Paragraph 148)
19. Ofcom should be given the power to compel companies to facilitate research on topics that are in the public interest. The ICO should, in consultation with Ofcom, prepare statutory guidance under Section 128 of the Data Protection Act 2018 on data sharing between researchers and the technology platforms. Once this guidance is completed, Ofcom should require platforms to: (Paragraph 187)
(a) Provide at least equivalent access for researchers to APIs as that provided to commercial partners; (Paragraph 187)
(b) Establish direct partnerships with researchers to undertake user surveys and experiments, with users’ informed consent, on matters of substantial public interest; (Paragraph 187)
(c) Develop, for sensitive personal information, physical or virtual ‘clean rooms’ where researchers can analyse data; a sketch of one safeguard such rooms might enforce follows this list. (Paragraph 187)
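Clean rooms typically allow researchers to run analyses without ever exporting raw records. The sketch below illustrates one such safeguard, small-cell suppression, under purely editorial assumptions: the threshold of 10 and the example categories are illustrative, not drawn from this Report.

```python
# Minimal sketch of a clean-room release rule: only aggregate counts
# leave the room, and any group smaller than a threshold is suppressed
# so individuals cannot be singled out. The threshold is illustrative.
from collections import Counter
from typing import Iterable

MIN_CELL_SIZE = 10  # illustrative minimum group size for release

def release_counts(records: Iterable[str]) -> dict[str, int]:
    """Return category counts, suppressing any cell below MIN_CELL_SIZE."""
    counts = Counter(records)
    return {category: n for category, n in counts.items() if n >= MIN_CELL_SIZE}

print(release_counts(["shared"] * 120 + ["reported"] * 3))
# -> {'shared': 120}; the three 'reported' records are suppressed
```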
20. Ofcom should issue a code of practice on algorithmic recommending. This should require platforms to conduct audits on all substantial changes to their algorithmic recommending facilities for their effects on users with characteristics protected under the Equality Act 2010. Ofcom should work with platforms to establish audits on other relevant and appropriate characteristics. Platforms should be required to share the results of these audits with Ofcom and the Equality and Human Rights Commission if requested. (Paragraph 200)
21. Ofcom should be given the powers and be properly resourced to undertake periodic audits of the algorithmic recommending systems used by technology platforms, including accessing the training data used to train the systems and comprehensive information from the platforms on what content is being recommended. (Paragraph 201)
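The audits envisaged in the two recommendations above imply some concrete disparity test. The sketch below shows one possible form: comparing how often a recommender surfaces the audited content to different user groups. The group labels, the exposure metric and the 0.8 threshold (borrowed from the ‘four-fifths’ rule of thumb used in employment testing) are editorial assumptions, not criteria set by the Committee.

```python
# Illustrative disparity check for a recommender audit: flag any group
# whose exposure to the audited content falls below 80% of the most
# exposed group's rate. All numbers here are invented for illustration.

def exposure_rates(impressions: dict[str, int], users: dict[str, int]) -> dict[str, float]:
    """Impressions of the audited content per user, by group."""
    return {group: impressions[group] / users[group] for group in users}

def flag_disparities(rates: dict[str, float], threshold: float = 0.8) -> list[str]:
    """Groups whose exposure is below threshold x the highest group's rate."""
    top = max(rates.values())
    return [group for group, rate in rates.items() if rate < threshold * top]

rates = exposure_rates({"group_a": 420, "group_b": 150}, {"group_a": 1000, "group_b": 1000})
print(flag_disparities(rates))  # -> ['group_b']
```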
22. There is a common thread between the need for transparency of algorithmic processes and researchers’ access to platforms. Platforms must be entirely open to the regulators to ensure proper oversight. Ofcom can only ensure that platforms are meeting their duty of care if it has access to all data from these platforms and the ability to use additional research expertise to better understand what that data means. The exact details of what data Ofcom will need will change as technology develops; these powers must therefore be suitably broad. (Paragraph 202)
23. Ofcom should have the power to request any data relevant to ensuring that platforms are acting in accordance with their duty of care. (Paragraph 203)
24. Ofcom should issue a code of practice on content moderation. This should require companies to clearly state what they do not allow on their platforms and give useful examples of how this applies in practice. These policies should also make clear how individual decisions can be appealed. Platforms should be obligated to ensure that their content moderation decisions are consistent with their published terms and conditions, community standards and privacy rules. (Paragraph 216)
25. The code of practice on content moderation should also include the requirement that all technology platforms publish an anonymised database of archetypes of content moderation decisions on impersonation, misinformation, hate speech and abuse. Where decisions differ from existing published examples, the platform should be obliged to explain the decision to the individuals affected and to create a new anonymised decision. Failure to ensure consistency between content moderation practices and published examples should be seen as a failure in the duty of care and result in sanctions against the platforms. (Paragraph 217)
26. Local authorities should be required to publish open, machine-readable information on elections, including what elections are taking place, who the candidates are and where polling stations are located. (Paragraph 232)
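"Machine-readable" here simply means structured data that software can consume directly. The example below is a hypothetical illustration of what such a publication might look like in JSON; the identifier scheme and field names are editorial assumptions rather than a format proposed in this Report.

```python
# Hypothetical machine-readable election notice. The JSON shape and the
# identifier scheme are illustrative; in practice a common schema would
# be agreed so that every local authority publishes in the same form.
import json

upcoming_election = {
    "election_id": "local.exampleshire.2025-05-01",  # invented identifier
    "election_date": "2025-05-01",
    "ward": "Example Ward",
    "candidates": [
        {"name": "A. Candidate", "party": "Example Party"},
        {"name": "B. Candidate", "party": "Independent"},
    ],
    "polling_stations": [
        {"address": "Example Village Hall, High Street", "postcode": "EX1 1AA"},
    ],
}

print(json.dumps(upcoming_election, indent=2))
```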
27. Any information about democratic processes published by government at any level should be available in accessible language. (Paragraph 233)
28. Technology can play an important role in engaging people with democratic processes. Parliament and government, at all levels, should not seek to use technology simply to reduce costs, and must ensure that appropriate technology is used to enhance and enrich democratic engagement. (Paragraph 242)
29. The Government should establish an independent democratic information hub. This would be both a public-facing hub that provides information about democracy, starting with basic information about democratic procedures, and a means of sharing best practice in digital democracy between policymakers and civil society organisations. (Paragraph 258)
30. Parliament and national, devolved and local government must acquire and develop greater digital capacity and skills to facilitate digital democratic engagement. This should be a mix of in-house development and the funding of specialist external organisations as appropriate. (Paragraph 267)
31. The Government should bring forward a Bill based on the proposals set out by the Law Commission that comprehensively modernises electoral law. This should be completed in all its stages before the next General Election. (Paragraph 284)
32. The Government should legislate immediately to introduce imprints on online political material. This could be done through secondary legislation. (Paragraph 294)
33. The reform of electoral law should grant the Electoral Commission the power to acquire information from external parties such as social networks about campaigners’ activities outside of a formal investigation. (Paragraph 299)
34. The reform of electoral law should support the Electoral Commission in creating statutory guidance on the level of detail campaigners set out in receipts concerning digital spending and in their spending returns to the Commission, to provide the public with a greater understanding of the breadth and nature of online campaigns. (Paragraph 305)
35. As part of the reform of electoral law, the maximum fine the Electoral Commission can levy should be raised to £500,000 or four per cent of a campaign’s total spend, whichever is greater. (Paragraph 311)
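The proposed maximum is straightforward to compute; a short worked example follows, with the spend figures invented purely for illustration.

```python
# Worked example of the proposed cap: the greater of £500,000 and
# four per cent of a campaign's total spend.

def max_fine(campaign_spend: float) -> float:
    """Greater of £500,000 and 4% of total campaign spend."""
    return max(500_000, 0.04 * campaign_spend)

print(max_fine(2_000_000))   # 4% is £80,000, so the £500,000 floor applies
print(max_fine(20_000_000))  # 4% is £800,000, which exceeds the floor
```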
36. The Electoral Commission should be given oversight of local candidate spending as well as national spending and should review what types of spending are included in each category. (Paragraph 318)
37. The Electoral Commission should explore whether it would be feasible to create a secondary registration scheme for campaigners who would otherwise fall below current spending limits. These campaigners would only be required to register the identity of their trustees or legally responsible persons and the identity of their five largest funders. They would not be required to disclose spending. This information could then be used to improve the transparency of online imprints. (Paragraph 324)
38. The Government should then consider whether this secondary registration scheme should form part of the reform of electoral law. (Paragraph 325)
39. Ofcom should issue a code of practice for online advertising setting out that, in order for platforms to meet their obligations under the ‘duty of care’, they must provide a comprehensive, real-time and publicly accessible database of all adverts on their platform. This code of practice should make use of existing work on best practice. (Paragraph 337)
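What a single entry in such a database might contain is sketched below; the fields are editorial assumptions informed by existing platform ad libraries, not a schema set out in this Report.

```python
# Hypothetical entry in a public, real-time advert database. Every field
# name here is illustrative; banded figures mirror the practice of
# existing ad archives where exact spend is not disclosed.
from dataclasses import dataclass

@dataclass
class AdvertRecord:
    advert_id: str          # platform identifier for the advert
    advertiser: str         # who paid for the advert
    creative_text: str      # the advert content as shown to users
    first_shown: str        # ISO 8601 date first served
    spend_band: str         # e.g. "£0-£99"
    impressions_band: str   # audience reached, in bands
    targeting_summary: str  # plain-language description of the targeting used
```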
40. The Government should legislate to put the ICO’s draft code on political campaigners’ use of personal data onto a statutory footing. (Paragraph 353)
41. Ofsted, in partnership with the Department for Education, Ofcom, the ICO and subject associations, should commission a large-scale programme of evaluation of digital media literacy initiatives. This should: (Paragraph 386)
(a) Review the international evidence of what has worked best in digital media literacy initiatives; (Paragraph 386)
(b) Map existing digital media initiatives across the UK, inside and outside of schools, aimed at all age groups; (Paragraph 386)
(c) Commission research to evaluate those UK initiatives that appear to be most successful; (Paragraph 386)
(d) Report in time for the lessons learned to be implemented at scale in the 2021–22 academic year. (Paragraph 386)
42. The Department for Education should review the school curriculum to ensure that pupils are equipped with all the skills needed in a modern digital world. Critical digital media literacy should be embedded across the wider curriculum, based on the lessons learned from the review of initiatives recommended above. All teachers will need support through continuing professional development (CPD) to achieve this. (Paragraph 395)
43. We recommend that Ofcom should require large platforms to user test all major design changes to ensure that they increase rather than decrease informed user choices. Ofcom should help devise the criteria for this testing and review the results. There should be genuine and easily understandable options for people to choose how their data is used. (Paragraph 401)
44. The CDEI should conduct a review of the implications of platform design for users, focusing on determining best practice in explaining how individual pieces of content have been targeted at a user. Ofcom should use this to form the basis of a code of practice on design transparency. This should feed into the Department for Education’s review of the curriculum so that citizens are taught what to expect from user transparency on platforms. (Paragraph 405)
45. Ofcom should work with platforms and the Government’s Verify service, or its replacement, to enable platforms to allow users to verify their identities in a way that protects their privacy. Ofcom should encourage platforms to empower users with tools to remove unverified users from their conversations and more easily identify genuine users. (Paragraph 419)