Online Safety Bill

Written evidence submitted by Full Fact (OSB28)

Written evidence to the Online Safety Bill Public Bill Committee

Full Fact and our interest in the Online Safety Bill

1. Full Fact fights bad information. We’re a team of independent fact checkers, technologists, researchers, and policy specialists who find, expose and counter the harm it does.

2. Bad information ruins lives. It promotes hate, damages people’s health, and hurts democracy. So we tackle it in four ways: we check claims made by politicians and public institutions, and claims circulating in the media and online; we ask people to correct the record where possible, to reduce the spread of specific claims; we campaign for systems changes to help make bad information rarer and less harmful; and we advocate for higher standards.

3. Full Fact is a registered charity. We’re funded by individual donations, charitable trusts, and other funders. We receive funding from both Facebook and Google. Details of our funding can be found on our website.

4. Full Fact’s expertise covers online misinformation and public debate. Some areas of the Online Safety Bill, including specific harms, fall beyond our areas of expertise (e.g. illegal harms around protecting children and relating to terrorism) and our written evidence reflects this.

Summary of written evidence

5. As this written evidence outlines, we are very concerned about the Bill’s inadequate approach to harmful misinformation and disinformation. The reasons for this are, in summary, as follows:

The shift towards a narrow focus on harm to individuals will fail to address wider harms to our society and our democracy. The legislation should be revisited so that the Bill tackles the harms to our society and democracy, as well as the harms to individuals.

The approach to ‘legal but harmful’ content is flawed and relies too heavily on the promise of secondary legislation at a later date. The Government should set out in the Bill now the types of priority content that it considers to be harmful, rather than waiting for secondary legislation. At a minimum this should include harmful health-related misinformation and disinformation.

The Bill should set out the need for proportionate responses to those risks more clearly so as to both mitigate harm and protect freedom of expression. Restricting or removing content to tackle harmful misinformation should rarely be necessary.

In its current guise, the Clause establishing the Advisory Committee on misinformation and disinformation serves little practical purpose and is a potential distraction from the wider failures of the Bill’s approach in this area. It should be strengthened alongside wider improvements to the legislation’s approach, or removed from the Bill.

Dropping the focus on media literacy from the Bill sends completely the wrong signal. A new, stronger media literacy duty should be reinstated in the legislation, and Ofcom required to produce a strategy for delivering it.

The Bill is too focussed on the regulation of the day-to-day online environment. It must give Ofcom, as the online safety regulator, a clearer role in responding to information incidents and crises.

The provisions on content of democratic importance and journalistic content are too vague and risk being manipulated to spread harmful disinformation. Unless they can be sufficiently tightened the clauses should be removed, and all users given equal protections under the protections for freedom of expression and privacy.

Attempts to disrupt elections take place in an online landscape. Democratic harms should be brought within scope, and a Canadian-style Critical Election Incident Public Protocol established to alert the public to threats to the integrity of elections.

Government attempts to influence internet companies about content on their platforms (including pushing for content removal) should be made transparent under new provisions in the Bill.

Harmful misinformation and disinformation

6. False and misleading information has circulated online for decades, causing real harm, including to public health, public debate and public trust. We have described this in detail in various reports [1], including our report on the first year of the pandemic, which made harmful misinformation apparent to all [2]. Our latest report, ‘Tackling online misinformation in an open society’, focuses on how the Online Safety Bill should help address the issue [3].

7. Online misinformation affects everyone, whether or not they use social media platforms. It has damaged many people’s lives and livelihoods, and far too many lives have been lost at least in part because of it.

8. Full Fact has long called for the government to take action in this area, and the online misinformation that has accompanied the pandemic has made the need even more evident. We cannot go on relying on internet companies to make decisions without independent scrutiny and transparency. We therefore welcomed the process of pre-legislative scrutiny of the draft Online Safety Bill and the subsequent introduction of the Bill to Parliament: good legislation and regulation could make a significant difference in tackling dangerous online misinformation.

9. The extensive material submitted during pre-legislative scrutiny, and the growing debate and backdrop of revelations around online misinformation and disinformation, have shown just how important this legislation is, but also how critical it is that the UK Government and our lawmakers improve the Bill during its passage through Parliament.

10. However, as this submission sets out, we are very concerned that the Government has not taken sufficient heed of the concerns of Full Fact or others, including the Joint Committee, and strengthened the Bill’s approach to harmful misinformation and disinformation. There is still an opportunity to change this. The Bill must be amended to have a clearer focus on proportionately but effectively addressing harm from misinformation and disinformation. That will require:

a robust and transparent regulatory regime, that expressly recognises both the harms caused by the dissemination of misinformation and disinformation and the importance of protecting freedom of expression;

better promotion of good quality, accurate information and other alternatives to content restriction or ‘take down’;

a more proactive role for an independent Ofcom, as both a strategic and a day-to-day regulator with responsibility for identifying and addressing harmful misinformation and disinformation issues.

11. Without this we risk continued harms to individuals and communities, the undermining of public health, and unintentional but long-term damage to public debate.

The scope of the Bill and application of the safety duties

12. The scope of the draft Bill was narrowed from earlier proposals, with the result that the problems of misinformation and disinformation are not directly addressed. That narrowing was carried over into the revised Bill introduced to Parliament in March.

13. This is despite the Government’s own RESIST counter-disinformation strategy stating that: "When the information environment is deliberately confused this can: threaten public safety; fracture community cohesion; reduce trust in institutions and the media; undermine public acceptance of science’s role in informing policy development and implementation; damage our economic prosperity and our global influence; and undermine the integrity of government, the constitution and our democratic processes." [4] The shift towards a narrow focus on harm to individuals will fail to address all of these harms to our society and our democracy.

14. The legislation should be revisited so that the Bill tackles the harms to our society and democracy – as well as the harms to individuals – as set out in the government’s counter-disinformation strategy.

15. More generally, there is currently only one provision in the Bill that directly addresses misinformation and disinformation, and that is in the shape of an Ofcom advisory committee with no identifiable powers or active role in tackling harmful misinformation.

16. In its materials accompanying the introduction of the Bill, the Government claims that the Bill will address ‘illegal disinformation’ [5], but Full Fact believes this is misleading. It is not clear what types of information the Government has in mind here, but such content would need to be addressed on the basis that it amounts to a criminal offence, rather than because it contains harmful false information. There are no ‘disinformation’ offences designated as priority illegal content. More pertinently, the majority of harmful misinformation that Full Fact sees is unlikely to be clearly identifiable as criminal in nature, and addressing it through the approaches for illegal content (which focus heavily on take down) would in many cases be inappropriate and risk disproportionately interfering with the freedom of expression of users (see further on this at paras 22 to 27 below).

17. There have been some suggestions that the solution may sit with the new false communication offence (Clause 151). That new offence is essentially a modernisation of the existing false communication offence in section 127(2) of the Communications Act 2003. Although it may be useful for prosecuting authorities taking action against individuals who send knowingly false communications intended to cause psychological or physical harm, we do not think it is likely to play a significant role in tackling harmful misinformation and disinformation online as part of the new regulatory regime overseen by Ofcom under the Bill. There are a number of reasons for this:

(a) The false communications offence is not designated as priority illegal content, so the key ‘proactive’ duties to prevent users encountering such content, and to minimise its presence, will not apply (and we would be concerned if that were to be the case – please see (b) below).

(b) The nature of the offence, and the need for providers to assess knowledge of falsity and intent to cause harm, mean that it is likely to be difficult for providers to appropriately regulate it at internet scale. Indeed, attempts to do so could risk the over-moderation of lawful content, raising freedom of expression concerns.

(c) A lot of harmful misinformation is not obviously criminal in nature, and would therefore not fall within the scope of the offence.

18. The Government also points to the child safety duties as providing protections for underage users from harmful disinformation. This is of course welcome, but those duties will be limited in scope (to content likely to cause children significant physical or psychological harm) and will do nothing to address harmful misinformation being disseminated on services predominantly accessed by adults. As with the adult safety duties, the extent of the protection is partially dependent on the designation of types of priority content, which is still not set out in the Bill.

19. For adults, the only potential protection under the Bill as it stands is the hope that the Government may decide to list some types of misinformation or disinformation as ‘priority content’ in secondary legislation. In our view this is inadequate for a number of reasons:

There is no guarantee it will happen. After years of consideration and scrutiny we have still not seen any sort of list of the content that the Government considers to be harmful to adults. This leaves Parliament, and others, unable to scrutinise a vital part of the regime.

The scope for the Secretary of State to designate misinformation under these secondary legislation powers is limited. Although it seems likely that health misinformation could be designated, it is likely to leave a range of other harmful misinformation and disinformation out of scope.

Even for misinformation that is designated as priority content, the protections for adults are very limited. There is no obligation on service providers to put in place systems and processes to mitigate and manage the risks of harm presented by that content, even though, in that scenario, the Government would have determined that the misinformation presented a material risk of significant harm. It is left to providers to decide, and then set out, how they intend to deal with that harmful content.

The adult safety duties in Clauses 12 and 13 only apply to the potentially small number of services within Category 1. Consequently, the vast majority of services will not even have to do a risk assessment for misinformation or disinformation that is harmful to adults, even where it is designated as priority content presenting a material risk of significant harm to adults.

20. We would like to see the Government set out the types of priority content that it considers to be harmful to adults and children in the Bill now rather than waiting for secondary legislation. This would give Parliament, and stakeholders, the opportunity to properly scrutinise what sort of content should be brought into scope, and what the proportionate response for addressing it should be.

21. At the very minimum, the legislation should be amended to ensure that harmful health-related misinformation and disinformation is designated as priority content that is harmful to both children and adults.

Protecting freedom of expression while addressing harmful misinformation

22. An open society should aim to inform people's decisions, not control them. Proportionate action is needed from internet platforms to address clearly identified harms from bad information. But approaches to specific pieces of content should consider freedom of expression as a starting point, as should policies addressing harmful misinformation and disinformation. Freedom of expression includes the freedom to be wrong. And nothing should be removed from the internet just because it is wrong.

23. We want to see harmful misinformation and disinformation brought within scope of the safety duties in the Bill, but this needs to be accompanied by clearer provision on how that information should be dealt with.

24. Better protection for freedom of expression requires some oversight of the content moderation choices made by internet companies. At present the Bill risks letting in-scope companies ‘mark their own homework’, including when it comes to decisions around freedom of expression. This is particularly the case with the adult safety duties governing so-called ‘legal but harmful’ content.

25. As well as ensuring that risks of harm from misinformation and disinformation are mitigated and managed, the Bill should set out more clearly the need for proportionate responses to those risks. A growing number of resources and methods mean that restricting or removing content to tackle misinformation should rarely be necessary. For example:

Ensuring that reliable information from authoritative sources is available on platforms.

Proactive provision of such information (such as the Covid-19 information centres Facebook and others established).

Friction-inducing initiatives (for example, ‘read-before-you-share’ prompts).

Labelling and fact checking to more clearly surface false information.

Better user control over the curation of information, and better human moderation.

Increasing the resilience of a platform’s users by taking steps to improve their media literacy.

26. We would like to see Ofcom required to issue a mandatory code of practice on proportionately reducing harm from misinformation and disinformation. Such a code should include the use of fact checking in proportion to reach and risk, along with other forms of mitigation that can help to protect people’s freedom of expression, including user control over curation and better human moderation.

27. Alternatively this could be set out explicitly in the Bill, by requiring platforms to address harmful misinformation and disinformation proportionately, and setting out a non-exhaustive list of approaches for doing so.

The role of the Advisory Committee on Disinformation and Misinformation

28. In its current guise, the Clause on the Advisory Committee serves little practical purpose and is a potential distraction from the wider failures of the Bill’s approach to addressing harmful misinformation and disinformation. Moreover, it is unclear how the Committee is intended to fit with Ofcom’s functions under the wider regulatory regime being introduced through the Bill. Unless it is strengthened, we consider that it should be removed from the Bill.

29. If the Government and Ofcom proceed with creating the Committee, we would like to see its role clarified and strengthened so that Ofcom receives the advice and input it needs to properly address issues of harmful misinformation and disinformation. In particular, the Committee’s remit should be widened to expressly include the following:

Advising on and overseeing Ofcom research on the harms caused by disinformation and misinformation.

Reporting on the emerging patterns of behaviour driving misinformation and disinformation, how people interact with content, the causes of harmful information, and the proportionate responses to those issues.

30. Ofcom should also be required to consult the Committee when drawing up relevant aspects of its draft codes of practice.

31. Although we recognise and emphasise the importance of collaborative responses to misinformation including working with the internet companies, we question whether it is appropriate for internet company representatives to sit on this Committee when part of its role is to advise Ofcom on what providers of regulated services should do.

Media literacy

32. Clause 103 of the draft Bill that was presented for pre-legislative scrutiny contained a proposed new media literacy duty for Ofcom (replacing the existing duty in section 11 of the Communications Act 2003). As well as updating the duty for the modern online era, the proposals would have required Ofcom to carry out, commission or encourage educational initiatives designed to improve the media literacy of members of the public, and to prepare guidance on evaluating media literacy initiatives.

33. The Government has now scrapped that new duty, dropping it from the version of the Bill introduced to Parliament. Media literacy initiatives are now mentioned only in the context of the risk assessments [6], and there is no active requirement for internet companies to improve media literacy.

34. We were disappointed to see this development: the media literacy provision needed to be strengthened, not cut. The UK has a vast media literacy skills and knowledge gap that leaves the population, and society, at risk of harm in the digital era. Ofcom’s own research shows that a third of internet users were unaware of the potential for inaccurate or biased information online, and that 30% of internet users do not know, or do not think about, whether the information they find is truthful [7]. The same research also showed that, although seven in ten (69%) adult internet users said they were confident in judging whether online content was true or false, most were actually unable to correctly evaluate the signs that indicate whether a social media post is genuine.

35. Media and information literacy can strengthen the public’s defences against the harms of online misinformation and disinformation: it can make the difference between decisions based on sound evidence, and decisions based on poorly informed opinions that can harm personal health and wellbeing, social cohesion, and democracy.

36. Full Fact believes that a new, stronger media literacy duty should be reinstated in the Bill, with Ofcom required under the legislation to produce a statutory strategy for delivering on it. We would also like to see Ofcom report on the progress it is making towards increasing the media literacy of the public in accordance with that duty.

Information incidents

37. Events such as terror attacks or pandemics can corrupt the information environment by making accurate information more complex, creating confusion or revealing information gaps. All of this can increase the volume of harmful misinformation and the speed at which it spreads, and create opportunities for malicious actors to spread disinformation. We describe these moments of heightened vulnerability as ‘information incidents’. They are often characterised by a proliferation of inaccurate or misleading claims or narratives which relate to, or affect perceptions of or behaviour towards, a particular event or issue, online or offline.

38. Since 2020, Full Fact has been working with internet companies, civil society and governments to create a new shared model to fight crises of misinformation (the Framework for Information Incidents [8]), to help decision-makers understand, respond to and mitigate information crises in proportionate and effective ways.

39. This sort of thinking now needs embedding into the new regulatory regime. Unfortunately, we do not think that harmful misinformation and disinformation that arises during periods of uncertainty - either acutely, such as during a terror attack, or over a longer period, as with a pandemic - is effectively dealt with in the Online Safety Bill.

40. As it stands, the Bill is focussed on the regulation of the day-to-day online environment. Although Clause 146 gives the Secretary of State powers of direction during certain ‘special circumstances’, those provisions simply allow the Government to mandate Ofcom to prioritise its media literacy function, or to make internet companies report on what they are doing in response to a crisis. The provisions do little to meaningfully empower Ofcom itself, and risk undermining the regulator’s independence.

41. As an independent regulator, Ofcom can play a credible convening role, and help to ensure that service providers are ready to mitigate the risks of future information incidents. We would like to see the Bill provide for Ofcom to introduce a system whereby emerging incidents can be publicly reported, and different actors such as fact checkers, news organisations, community representation groups and service providers can request that Ofcom convene a response group to discuss severity and response.

Risks arising from the concepts of content of ‘democratic importance’ and ‘journalistic content’

42. Under the Bill, Category 1 services have duties to protect journalistic content and content of democratic importance. Although these provisions seem intended to provide additional protections against overly restrictive content moderation decisions by platforms, the definitions are vague and we are concerned that they open up the risk of the provisions being used as avenues to spread harmful misinformation and disinformation under the guise of journalism or democratically important speech.

43. Firstly, the definition of "content of democratic importance" is not sufficiently clear about what "is or appears to be specifically intended to contribute to democratic political debate" and this has the potential for serious and unintended consequences.

44. Secondly, the definition of "recognised news publisher" [9] (which is relevant both to the general exemption for news publisher content and to the protections for journalistic content in Clause 16) is broad, and arguably relatively easy to meet. A recognised news publisher includes any publisher which publishes news, has a UK office or business address, and has a standards code and complaints process.

45. As these requirements are not further defined, we are concerned that it may be too easy for those wishing to spread harmful disinformation to intentionally establish themselves in a way designed to benefit from the exemptions and protections intended for legitimate news outlets.

46. More generally, if the Bill requires special protections to make journalism possible under its rules, then its restrictions on ordinary internet users go too far. Journalists should not need or have privileged freedom of expression compared to their audiences (as distinct from privileged access to information or protection from interference with newsgathering, which they do sometimes need and the law provides for).

47. Many such concerns were expressed during pre-legislative scrutiny, but these aspects of the Bill have returned to Parliament almost untouched. Unless these provisions can be sufficiently tightened to prevent the risk of abuse, we think Clauses 15 and 16 should be removed, and all users given equal protections under Clauses 19 and 29 ("Duties about freedom of expression and privacy"). Full Fact believes freedom of expression is better protected by specific, concrete duties. For example, we have called for a rule that internet companies may only use measures that restrict what people can see and share as a last resort. Better promotion of good quality, accurate information, and other alternatives to content restriction or ‘take down’, should be preferred by law.

Elections

48. Attempts to disrupt elections and democratic choices take place in an online landscape in which harmful misinformation and disinformation can have a wide reach. Unfortunately, the Bill sidesteps the challenge of dealing with this directly, and the types of content that can be brought within scope of the safety duties by the Secretary of State will not extend to misinformation and disinformation that might affect electoral integrity (because of the need for a link to physical or psychological harm to individuals). The Elections Act also fails to address this gap.

49. This Bill should therefore be amended so that democratic harms are brought within scope. Larger platforms in particular should be required to risk assess their services to prevent widespread dissemination of information harmful to elections.

50. The Government should also publish a Critical Election Incident Public Protocol (similar to that operated in Canada) to alert the public to incidents that threaten the UK’s ability to have free and fair elections, and allow the public to take steps to protect themselves. The Online Safety Bill could be amended to allow such a protocol to be enacted.

Making government interventions in content moderation transparent

51. We know that the government regularly seeks to lobby internet companies about content on their platforms, including pushing for content removal. This approach was, for example, a marked feature of the government’s response to the Covid-19 pandemic.

52. Initiatives such as the Counter Disinformation Unit in the Department for Digital, Culture, Media and Sport undertake valuable work on disinformation and misinformation. But operating these sorts of initiatives, with little-to-no parliamentary or legal scrutiny, is a threat to freedom of expression. Continued silence will lock in a form of "censorship-by-proxy" as the new normal.

53. Unnecessary secrecy around government attempts to counter false information must therefore stop and be replaced with transparency. Just as internet companies should not be left to make decisions on issues as fundamental as freedom of expression without proper scrutiny and oversight (a core tenet of the Online Safety Bill), neither should the government of the day. Full Fact recognises that some information may need protecting, but appropriate mechanisms of oversight could be identified.

54. The purpose of this Bill is to introduce a new regulatory regime for internet services, overseen by a dedicated independent regulator. If government action in this area is to continue after Royal Assent, the Bill should include a transparent framework for these types of activities, under which the Government must publish details of its efforts to influence internet company decisions about specific items of content, user accounts, or terms of service (accepting that allowances may need to be made for matters of national security, etc.).

Full Fact

23 May 2022


[1] Full Fact, ‘Tackling Misinformation in an Open Society’, 2018, https://fullfact.org/media/uploads/full_fact_tackling_misinformation_in_an_open_society.pdf; and Full Fact, ‘The Full Fact Report 2020: Fighting the Causes and Consequences of Bad Information’, April 2020, https://fullfact.org/media/uploads/fullfactreport2020.pdf

[2] Full Fact, ‘The Full Fact Report 2021: Fighting a pandemic needs good information’, January 2021, https://fullfact.org/media/uploads/full-fact-report-2021.pdf

[3] Full Fact, ‘The Full Fact Report 2022: Tackling online misinformation in an open society – what law and regulation should do’, https://fullfact.org/about/policy/reports/full-fact-report-2022

[4] Government Communication Service, ‘RESIST Counter Disinformation Toolkit’, 2019, https://gcs.civilservice.gov.uk/publications/resist-counter-disinformation-toolkit/

[5] ‘Online Safety Bill: factsheet’, https://www.gov.uk/government/publications/online-safety-bill-supporting-documents/online-safety-bill-factsheet#how-the-new-laws-tackle-misinformation-and-disinformation

[6] As an example of the design and operation of the service that may reduce or increase the risks identified in the risk assessment (see, for example, Clause 8(5)(g)).

[7] Ofcom, ‘Adults’ Media Use and Attitudes report 2022’, 31 March 2022, pp. 12-13, https://www.ofcom.org.uk/__data/assets/pdf_file/0020/234362/adults-media-use-and-attitudes-report-2022.pdf

[8] Full Fact, ‘Framework for Information Incidents’, https://fullfact.org/about/policy/incidentframework/

[9] We declare an interest in that Full Fact is likely to be a recognised news publisher within the meaning of the Bill.

 
