Below is a list of all of the Committee’s conclusions and recommendations (recommendations appear in italics).
1.Moderating social media platforms is a very difficult task. Huge volumes of content are posted every day. Algorithms cannot understand context, nuance or irony; these also pose challenges for human moderators. However, platforms’ moderation decisions are often unreasonably inconsistent and opaque, and sometimes seem to be influenced by commercial and political considerations. (Paragraph 78)
2.The largest platforms have monopolised the digital public square. The private companies which run them have unprecedented control over what citizens can say online and the power to censor views with which the companies disagree. We agree with the principle underlying clause 13 of the draft Online Safety Bill that category 1 platforms should have duties to be impartial in their moderation of political content. (Paragraph 79)
3.However, the definition of ‘content of democratic importance’ in the draft Bill is too narrow. It should be expanded to ensure that contributions to all political debates are covered, not only debates which are about, or initiated by, politicians and political parties, or which concern policy rather than social change. The protections should also be extended to cover the content of platforms’ terms and conditions, in addition to the “systems and processes” with which they apply them. (Paragraph 80)
4.We are concerned that platforms’ approaches to misinformation have stifled legitimate debate, including between experts. Platforms should not seek to be arbiters of truth. Posts should only be removed in exceptional circumstances. (Paragraph 81)
5.The White Paper preceding the draft Online Safety Bill proposed that a reduction in online harms could best be achieved through the introduction of a statutory ‘duty of care’. This would require social media companies to design and operate safer systems, developed in a proportionate and risk-based way, in partnership with the regulator. This focus on companies taking reasonable steps to prevent reasonably foreseeable harms that occur in the operation of their services would remove the need to regulate individual pieces of content. However, there are legitimate concerns about how a statutory duty of care would interact with freedom of expression on the internet. (Paragraph 95)
6.The duty of care approach should inform a flexible framework for digital regulation, guided by underlying principles, including freedom of expression, which is able to adapt to the rapidly developing digital world while setting clear expectations for platforms. We discuss how this can be achieved—including the necessary parliamentary scrutiny and co-operation between regulators—in our subsequent conclusions and recommendations. (Paragraph 96)
7.We support the Law Commission’s aim of reforming communications offences. Although we heard compelling concerns about the appropriateness for social media of criminalising the sending of a communication which the defendant knew or should have known was likely to harm a likely audience, the Law Commission has now revised its proposal to require intent to harm, providing a clearer standard. However, we are concerned about the ability of social media platforms—in complying with their duties under the draft Online Safety Bill—to identify and remove content covered by the offence without also removing legal content. (Paragraph 111)
8.Police face many challenges, including from the scale of online content, anonymity, and the dark web. We are concerned that they do not have the resources they need to enforce the law online. It is essential that the police can bring criminals to justice. The draft Online Safety Bill cannot be the only answer to the problem of illegal content. The Government should ensure that existing laws are properly enforced and explore mechanisms for platforms to fund this. (Paragraph 123)
9.Retrieving deleted evidence places particular strain on police resources. The Government should require category 1 platforms to preserve deleted posts for a fixed period. Ofcom should also have the right to impose this requirement on category 2 platforms where appropriate. (Paragraph 124)
10.We support the principle in the draft Online Safety Bill that platforms should be required to remove illegal content. In implementing clause 9(3)(d), Ofcom should set strict timeframes within which platforms must remove content which is clearly illegal. (Paragraph 132)
11.We support the Government’s proposal that Ofcom should judge platforms’ compliance on a systemic basis, rather than adjudicating all content removal decisions. The Bill should make clear that a platform would not be compliant if its systems either systematically fail to remove illegal content or systematically remove legal content. This would ensure that platforms do not have an incentive to remove legal content. (Paragraph 133)
12.In our report Growing up with the Internet, we supported the then Government’s inclusion in the Digital Economy Act 2017 of a requirement on websites to verify that their users are over 18 years old if more than one-third of their content is pornographic. More than four years after the Act received Royal Assent, this provision has not been commenced and the Government now plans to repeal it. The Government’s inaction has severely impacted children. The Government should ensure that all pornographic websites are in scope of the online safety regime and held to the highest standards. (Paragraph 149)
13.A supposed inability to enforce age verification is no excuse, though we recognise that there were legitimate concerns about privacy and the effectiveness of the measures themselves. Other protections for children exist, such as on licensing of alcohol or restrictions on gambling, despite enforcement issues. Since the Digital Economy Act, age recognition technology has advanced. Websites are now able to use biometric age estimation technology, databases, mobile phone records and algorithmic profiling, alongside the ID-based verification tools envisaged at the time of the Digital Economy Act. Given these technological advances, it is a missed opportunity that the Government has not made clear on the face of the draft Bill that websites hosting pornographic content will be blocked for children. Children deserve to enjoy the full benefits of being online and still be safe. (Paragraph 150)
14.We do not support the Government’s proposed duties on platforms in clause 11 of the draft Online Safety Bill relating to content which is legal but may be harmful to adults. We are not convinced that they are workable or could be implemented without unjustifiable and unprecedented interference in freedom of expression. If a type of content is seriously harmful, it should be defined and criminalised through primary legislation. It would be more effective—and more consistent with the value which has historically been attached to freedom of expression in the UK—to address content which is legal but some may find distressing through strong regulation of the design of platforms, digital citizenship education, and competition regulation. We discuss these in Chapters 3 and 4. (Paragraph 182)
15.If the Government does not accept our recommendation on an alternative approach to clause 11, it should improve the draft Bill in the following ways:
16.In Regulating in a Digital World, we recommended that a joint committee of both Houses of Parliament should be established to consider regulation of the digital environment. This committee would be responsible for scrutinising the adequacy of powers and resources in digital regulation. This would bring consistency and urgency to regulation. In addition, we found in this inquiry that there is a lack of scrutiny of delegated powers given to the Secretary of State and Ofcom. In relation to the latter, this raises serious concerns about democratic accountability. (Paragraph 184)
17.We reiterate our recommendation that a joint committee of Parliament should be established to scrutinise the work of digital regulators. This joint committee should also scrutinise the independence of Ofcom and statutory instruments relating to digital regulation. (Paragraph 185)
18.We support the Government’s approach to journalistic content in the draft Online Safety Bill. As news publishers’ websites are out of scope, it would not be appropriate for the sharing of their content on social media to be subject to the online safety regime. They are already legally liable for their content. We support the principle that there should be protections for citizen journalism. However, we are concerned by the vagueness of the draft Bill about what constitutes citizen journalism. The Government should clearly define citizen journalism in the draft Bill. (Paragraph 196)
19.Freedom of expression online is a threat to authoritarian regimes. Afraid of what their citizens might say if allowed to speak for themselves, these regimes are seeking to censor the internet. The online safety regime should require category 1 platforms to report annually on content they have been obliged to remove in other jurisdictions, whether by law or political authorities. This would allow UK users to know whether they are giving their custom to a platform which is complicit in censorship by authoritarian regimes. Users may wish to boycott platforms which value their profits more highly than human rights. The interventions we recommend in Chapter 4 to increase competition would make it easier for users to do so. (Paragraph 210)
20.The design of platforms shapes how users behave online. Too often, users are rewarded with ‘likes’ and other forms of engagement when they behave unkindly towards others. Although the worst forms of content repel users and advertisers, it can be in platforms’ interests to encourage heated and uncivilised exchanges to hold users’ attention. We welcome design changes which encourage users to behave in a more responsible way, such as Twitter prompting users to read articles before retweeting them. (Paragraph 240)
21.Platforms must be held responsible for the effects of their design choices. The Government should replace the duties on category 1 platforms in relation to ‘legal but harmful’ content in clause 11 of the draft Online Safety Bill with a new design duty. Platforms would be obliged to demonstrate that they have taken proportionate steps to ensure that their design choices, such as reward mechanisms, choice architecture, and content curation algorithms, mitigate the risk of encouraging and amplifying uncivil content. This should apply both to new and existing services. The duty should include a requirement for platforms to share information about their design with accredited researchers and to put in place systems to share best practice with competitors. (Paragraph 241)
22.Giving users more control over the content they are shown is crucial. This is more consistent with freedom of expression and the wide range of vulnerabilities and preferences users have than focusing on removing legal content. As part of the design duty we propose, the Online Safety Bill should require category 1 platforms to give users a comprehensive toolkit of settings, overseen by Ofcom, allowing users to decide what types of content they see and from whom. Platforms should be required to make these tools easy to find and use. The safest settings should always be the default. The toolkit should include fair and non-discriminatory access to third-party content curation tools. (Paragraph 242)
23.Ofcom should allow category 2 platforms to opt in to the design duty on category 1 platforms, with a kitemark scheme to show users which meet this higher standard. (Paragraph 243)
24.Anonymity can allow individuals, including those in vulnerable positions, to express themselves more freely and to challenge orthodoxies. This is crucial. However, it can also embolden people to abuse others. Under the draft Online Safety Bill, as part of the user toolkit we propose, category 1 platforms should be required to allow users to opt out of seeing content from users who have not verified their identity. (Paragraph 255)
25.The right to privacy can enable users to express themselves freely. Privacy can also affect competition, which itself affects freedom of expression. (Paragraph 264)
26.Strong privacy standards should form part of the design duty we propose be added to the draft Online Safety Bill. (Paragraph 265)
27.It is essential that regulators, including the Information Commissioner’s Office, Ofcom, and the Competition and Markets Authority, co-operate to protect users’ rights. (Paragraph 266)
28.Digital citizenship should be a central part of the Government’s media literacy strategy, with proper funding. Digital citizenship education in schools should cover both digital literacy and online conduct, promoting civility and inclusion and how they can be practised online. This should feature across subjects such as Computing, PSHE and Citizenship Education. The latter is particularly crucial as it emphasises why good online behaviour is important for our society, our democracy and for freedom of expression. (Paragraph 293)
29.It is not enough to focus efforts to improve digital citizenship on young people. The Government should commission Ofcom to research the motivations and consequences of online trolling and use this to inform a public information campaign highlighting the distress online abuse causes and encouraging users to be good bystanders. (Paragraph 294)
30.Digital citizenship education suffers from a lack of co-ordination. We recommend that the strengthened duties in the draft Online Safety Bill on promotion of media literacy should include a duty on Ofcom to assist in co-ordinating digital citizenship education between civil society organisations and industry. (Paragraph 295)
31.Effective digital citizenship education requires contributions from both the Government and platforms. Social media companies should increase the provision and prominence of digital education campaigns on their platforms. (Paragraph 296)
32.Increasing competition is crucial to promoting freedom of expression online. In a more competitive market, platforms would have to be more responsive to users’ concerns about freedom of expression and other rights. (Paragraph 318)
33.The Government should introduce legislation to give statutory powers to the Digital Markets Unit during the current parliamentary session. This is, if anything, more important than the Online Safety Bill. Given the impact of competition on freedom of expression and privacy standards, the Digital Markets Unit should include human rights in its assessments of consumer welfare alongside economic harm. (Paragraph 319)
34.Social media services offer citizens unparalleled opportunities to share their opinions with others. However, this market is dominated by a small number of very powerful companies. Rather than allowing these platforms to monopolise the digital public square, there should be a range of interlinked services between which users can freely choose and move. The Digital Markets Unit should make structural interventions to increase competition, including mandating interoperability. Where necessary, it should work with international partners—including to block mergers and acquisitions which would undermine competition. (Paragraph 333)
35.Search engines play a key role in facilitating freedom of expression, both through disseminating individuals’ and publishers’ content and providing access to information from which opinions can be formed. The lack of competition in this market is unacceptable. (Paragraph 345)
36.The Digital Markets Unit should make structural interventions to increase competition in the market, where necessary working with international partners. This should include forcing Google to share click-and-query data with rivals and preventing the company from paying to be the default search engine on mobile phones. (Paragraph 346)
37.The online safety regime must not entrench the market power of the largest platforms by increasing barriers to entry for competitors. Ultimately, this would harm consumers. The Government should require Ofcom to give due consideration to—and report on the impact on—competition in its implementation of the regime. Ofcom should work closely with the Digital Markets Unit in this area. (Paragraph 358)
38.Including all platforms which are accessible from the UK in scope of the online safety regime risks reducing user choice. It is likely that small platforms based abroad with very few UK users will block access to their services from the UK rather than take on the burden of compliance. The draft Online Safety Bill only distinguishes between the largest, category 1, platforms and a second category containing all others. (Paragraph 359)
39.To avoid UK users losing access to these websites, the Government should introduce a third category for platforms which are based abroad and have very few UK users. This would include websites such as local newspapers and message boards for people with niche interests. We expect that Ofcom would set a threshold for the number of UK visitors per year to allow platforms to know whether they are in category 2 or 3. Although category 3 platforms would be held to the same safety standards as category 2 platforms, to reduce the regulatory burden on them, category 3 platforms would have no duties proactively to prove compliance unless Ofcom notified the company—after completing its own risk assessment, or receiving complaints from users or a third party—of further steps it should take in relation to illegal content or content which may be harmful to children. (Paragraph 360)
40.The open-display advertising market is opaque and unfair. Google’s dominance throughout the intermediation chain, on both the demand and supply side, would not be permitted in any other market. The broken market has made it more difficult for news publishers to survive, let alone thrive. Having a wide range of viable news publishers is essential for freedom of expression. (Paragraph 369)
41.The Digital Markets Unit should make structural interventions to increase competition, including through separation remedies. (Paragraph 370)
42.We reiterate our recommendation for a mandatory bargaining code to ensure fair negotiations between platforms and publishers. Google’s and Facebook’s voluntary initiatives to pay some publishers for some of the use of their content are welcome. However, such agreements reflect the fundamental imbalance of power between the two sides. As the Competition and Markets Authority has noted, publishers have little choice but to accept the terms they are offered. Only a mandatory bargaining code, with the possibility of independent arbitration, can ensure that publishers—particularly smaller and local publishers—get a fair deal. The code should also cover how platforms use and curate publishers’ content. (Paragraph 389)