Online Safety Bill

WRITTEN EVIDENCE SUBMITTED ON BEHALF OF THE TOGETHER ASSOCIATION by Francis Hoar (OSB29)

THE HOUSE OF COMMONS COMMITTEE 

ON THE ONLINE SAFETY BILL

The Together Association was formed in August 2021, bringing together business, health, media, sport and religious leaders, campaigners and citizens. Free speech is one of the key pillars that Together has been mandated to uphold. Together was instrumental in leading a public campaign to end vaccine passports and mandates, working with MPs and citizens across Britain to do so. The Together Association is enormously alarmed at the proposed content of the Online Safety Bill and the damage it would do to Britain should it proceed in its current form. While we understand the concerns with regard to children, we cannot treat the whole of society as though they were children to be protected. It is with this in mind, and in the context of generations that fought and died for the rights to free expression that we enjoy today, that we are submitting, on behalf of our 270,000 signatories, this response, written by Francis Hoar.

Introduction and Executive Summary

1 I am a barrister in private practice [1] specialising in constitutional and public law with a particular interest in the protection of freedom of expression. [2] In June 2022 I published a report ‘In Protection of Freedom of Speech’ which included an analysis and criticism of the first draft of the Online Safety Bill (‘the Bill’) and which included a Foreword by retired Supreme Court justice Lord Sumption. [3] These submissions are based on that report but have been updated to take account of the changes to the Bill. [4]

2 Not only does the Online Safety Bill fail to protect freedom of expression online, it imposes at least two layers of censorship. First, social media companies will be required to impose systems that monitor speech that might be ‘harmful’ – including under a new criminal offence requiring only that an adult is caused ‘serious distress’ – and to do so even in the absence of positive reports. It requires little imagination to think of expression that might cause some distress but which in fact represents either the free expression of political opinion or high art. Secondly, compliance with the Bill will be monitored by Ofcom with the power to impose considerable penalties, encouraging a more restrictive rather than a less restrictive approach.

3 The scale of the change imposed by this form of regulation of speech is considerable.

4 First, the Bill introduces a new and unprecedented concept of a ‘duty of care’ for social media companies that is based not on harm being directed at an individual but on the company’s assessment of risks to adults from material of which it is not the publisher. This undermines the sensible common law principle that a company hosting a ‘noticeboard’ or un-moderated platform is not responsible for a statement (as a publisher or otherwise) unless it is brought to its attention.

5 Secondly, the Bill’s creation of the new ‘harmful communication’ offence (now at Cl 150) and the ‘false communication’ offence (now at Cl 151) is a serious threat to freedom of speech. The former can and will criminalise the expression of any opinion if there is a ‘real risk’ that it will cause ‘psychological harm amounting to at least serious distress’; and the latter would require a court to determine what information is true and what is false and requires evidence only of ‘non-trivial’ harm. [5]

6 Both offences would significantly broaden the scope of the criminal law, which previously criminalised communications only where they were at least ‘grossly offensive’. While there is a public interest defence, it is not determinative and it requires a court to decide what is in the public interest and to what extent that can justify the expression of opinion.

7 The right to freedom of expression has long been upheld as a fundamental constitutional principle, with origins in the right of petition to the Crown entrenched in the Bill of Rights 1689. As Lord Justice Hoffmann has said:

‘… a freedom which is restricted to what judges think to be responsible or in the public interest is no freedom. Freedom means the right to publish things which government and judges, however well motivated, think should not be published. It means the right to say things which ‘right-thinking people’ regard as dangerous or irresponsible. This freedom is subject only to clearly defined exceptions laid down by common law or statute.’ [6]

8 Lord Justice Sedley has added that:

‘Free speech includes not only the inoffensive but the irritating, the contentious, the eccentric, the heretical, the unwelcome and the provocative … Freedom only to speak inoffensively is not worth having ...’ [7]

9 Yet the Bill that the Committee is considering will criminalise offensive speech if it might foreseeably cause a person ‘serious distress’ – something that can be expected wherever controversial, unpopular or uncomfortable ideas are expressed and that can have no reasonably clear or foreseeable meaning.

10 The government appears to have responded to criticism that the original Bill required censorship of speech that was offensive but not criminal by ensuring that such ‘offensive’ speech is not merely to be censored but criminalised. Those amendments have thereby increased, not decreased, its threat to free expression.

11 And the objection remains. Censorship by private companies would not merely be permitted but required, with no objective or foreseeable definition of what it is that may be censored.

12 In ‘On Liberty’, [8] John Stuart Mill identified that the liberty of discussion is inextricably linked to the liberty of conscience: ‘the usefulness of an opinion is itself a matter of opinion: as disputable, as open to discussion and requiring discussion as much, as the opinion itself.’ Moreover, a free and democratic society is one that recognises that there can be no infallible arbiter of truth; and that no individual or body has the right to appoint one. As Lord Justice Hoffmann put it, it is particularly dangerous to give that power to the government or judges. The fundamental problem with the Bill is that it does precisely that – by requiring that the arbiter of truth, and of the utility of an opinion measured against its potential for harm, be first a company, then a state regulator (Ofcom) and ultimately (on a judicial review of Ofcom’s decisions or through a prosecution) the courts.

The current criminal law and offences added by the Bill

13 Before legislation is enacted, the first question for a legislator is whether it is necessary. At least in respect of the criminal law, the Bill is not necessary. Moreover, insofar as it creates responsibilities for platforms that go beyond the responsibilities of a publisher, it changes the relationship between platform and user from that of the neutral facilitator of communication to that of moderator.

14 It is important to be aware of the breadth of the criminal law’s treatment of expression online and elsewhere. The criminal law already covers speech that is at least grossly offensive or which has the effect of stirring up hatred, but (outside the context of public order offences) it does not go beyond that threshold. These offences include:

(1) Section 1 of the Malicious Communications Act 1988 (‘MCA 1988’): sending, with intent to cause distress or anxiety, an electronic or other communication conveying an indecent or grossly offensive message, a threat or information which is false.

(2) Section 127 of the Communications Act 2003 (‘CA 2003’): sending through a public communications network, a message or other matter that is grossly offensive or of an indecent, obscene or menacing character (s 127(1) CA 2003); or persistently using a public electronic communications network to cause annoyance, inconvenience or needless anxiety (s 127(2) CA 2003).

(3) Threatening or abusive conduct likely to cause harassment, alarm or distress contrary to section 5 of the Public Order Act 1986 (‘the 1986 Act’), as amended by section 57 of the Crime and Courts Act 2013 to remove the word ‘insulting’.

(4) Section 31 of the Crime and Disorder Act 1998 (‘CDA 1998’) created a racially or religiously aggravated form of the offences under sections 4, 4A and 5 of the 1986 Act.

(5) Offences relating to stirring up racial hatred, religious hatred or hatred on grounds of sexual orientation (sections 18-23, 29B and 29C of the 1986 Act);

(6) Sections 2, 2A, 4 and 4A of the Protection from Harassment Act 1997 contain the offences of pursuing a course of conduct which amounts to harassment, stalking, putting another in fear of violence, and stalking involving fear of violence or serious alarm or distress. Section 32 of CDA 1998 creates a racially or religiously aggravated form of these offences.

15 Some of the above offences could also apply to a person or company that is considered in law a publisher (the test for which is addressed below). Thus, there is at present a wide range of potential criminal sanctions against social media companies that fail to remove comments breaching the criminal law in sufficient time to avoid becoming a publisher.

16 By the introduction of the ‘harmful communication’ offence at Cl. 150, the Bill would require social media companies to remove content that could cause no more than ‘serious distress’ (the meaning of ‘harm’ set out in Cl. 150(4)) and would require a court to determine what that means. The judges of such an ‘impact’ will be social media companies, then (on review) the state (in the guise of Ofcom) and (ultimately) the judiciary.

17 Moreover, a company, Ofcom and (ultimately) a court would not only be put in the position of deciding whether the low hurdle of a risk of ‘serious distress’ is met, but would have to determine whether there was a ‘public interest’ in the message and whether that amounted to a reasonable excuse (Cl 150(5)). That would involve a court making an inherently political judgement about what is in the public interest and how important that public interest is – precisely what the courts should not do.

18 The ‘false communication’ offence (now at Cl. 151) is also objectionable:

(1) The test for what becomes criminal behaviour is very low – sending false information that might cause ‘non-trivial’ psychological harm; and, while this is not defined (which is itself problematic), the courts are likely to be guided by the definition of ‘psychological harm’ in Cl. 150(4): ie something that causes nothing more than ‘serious distress’; and

(2) The court will be required to be the arbiter of what is ‘false’ information; and, while this will generally be straightforward, there will be situations where the question is difficult to judge; and a court should not be required to make such a judgement.

19 The common law has long upheld a principle of the criminal law, also protected by Article 7 of the ECHR, that requires that the basis on which criminal liability is imposed must be both sufficiently clear and foreseeable. The risk of causing a person serious distress is neither.

20 In summary, both new offences would not only remove the fundamental right of a person to express ideas that are so offensive, heretical and uncomfortable that they cause ‘serious distress’ to many, they require a company and on review, a government agency and the courts to determine those questions.

Who is a ‘publisher’ online?

21 The common law has applied the old principle that distinguished between an active publisher – such as of a newspaper – and a person providing a facility for others to publish – such as the host of a noticeboard. The general rule was that a publisher that had the opportunity to review statements before distributing them was responsible for their contents, whereas the host of a noticeboard, freely available for people to post announcements, was not a publisher unless and until informed of the contents and had had a reasonable opportunity to remove them.

22 In the online era, the Court of Appeal, in Tamiz v Google Inc, [9] confirmed that an online platform could not have been a publisher until after it was notified of the content of the material, when it might be inferred that it ‘associated itself with, or to have made itself responsible for, the continued presence of that material on the blog and thereby to have become a publisher of the material’. [10]

23 The other part of the legal framework is EU law. [11] The Electronic Commerce (EC Directive) Regulations 2002 provide three tiers of protection, depending on the level of involvement of the provider: ie whether they act as a ‘mere conduit’ for, ‘cache’ or ‘host’ the offending material.

24 In summary, a social media company will be liable as a publisher for what is written by one of its users only if it has been informed about or becomes aware of it and has failed to remove it. The balance between freedom of speech and the responsibility of a host to remove potentially defamatory or criminal statements is met by the host becoming a publisher if it does not remove the material. Once notified, it has a choice. Either it can stand with the maker of the statement and share his liability or it can remove it, expeditiously, and bear none.

25 That legal principle works. Social media has opened and democratised the possibility of speech and debate to huge audiences. The choice of who has a large platform no longer lies exclusively with newspapers and broadcasters but is dependent upon a national and international audience that enables ordinary individuals to spread their opinions to an almost unlimited audience. None of this would be possible if the hosts of these platforms were responsible for the content of bloggers, ‘tweeters’ and Facebook users. Because it is impossible for them to know whether statements might be defamatory or criminal in nature, the corporations would be obliged to restrict content either by aggressive algorithms preventing almost any form of controversial speech or by scrutinising content in advance. The political content on their sites would become greatly reduced in size, neutered as a forum for discussion or both.

The Online Safety Bill

26 The regulatory structure of the Bill imposes on regulated websites duties to:

(1) Undertake ‘risk assessments’ taking into account the level of risk of illegal content (Cl 8);

(2) Take or use ‘proportionate measures to effectively mitigate and manage the risks of harm to individuals, as identified in the most recent illegal content risk assessment of the service’ (and more) (Cl. 9);

(3) Assess the risk to adults from ‘priority content’ that is harmful to adults (Cl. 12), which must take into account Ofcom’s code of practice; and

(4) Be subject to regulation by Ofcom which, pursuant to Cl 37, must prepare a code of practice for the purpose of compliance with the duties set out in Cl. 9 or 24.

27 ‘Priority content’ is defined in Cl. 54 as ‘content of a description designated in regulations made by the Secretary of State as priority content that is harmful to adults’. Those regulations may only be made if the Secretary of State is satisfied that ‘there is a material risk of significant harm to an appreciable number of adults presented by content of that description that is regulated user-generated content’ (Cl. 55(3)) and that content must be illegal. Parliament would thus give the government the power to define what this is by secondary legislation over which Parliament will have minimal scrutiny and which it may not amend.

28 At Sch 4 para 4, the Bill outlines ‘online safety objectives’. These are the objectives that Ofcom must ensure a service meets. They include not only that ‘the service provides a higher standard of protection for children than for adults’ (para 4(a)(vi)) but also that there are ‘adequate controls over access to the service by adults’ (para 4(a)(viii)). That is to say, the state imposes on private businesses a duty to undertake censorship in accordance with its directions (made through Codes of Conduct that can be taken into account if Ofcom takes enforcement action against the websites). The Schedule also has regard to the algorithms used by the service (para 4(b)(i)) – meaning that a website will be rewarded for designing algorithms that identify content that might cause ‘serious distress’ to adults.

29 Like the original draft Bill, the current iteration includes a ‘duty’ to protect freedom of expression (at Cl. 19 and 29). No doubt in response to criticism, the Secretary of State has amended the Bill to include the protection of democratic and journalistic content (Cl. 15 and 16). As with any attempt to mitigate the consequences of the regulation of speech, however, these measures create their own problems:

(1) It is only because of the Bill’s restriction of speech that might cause ‘serious distress’ that such measures are needed;

(2) While a website is enjoined to enforce its own restrictions in such a way as encourages a ‘wide diversity of political opinion’ (Cl. 15(3)), the Bill recognises that there will be censorship of political content, the enforcement of which will be by a website and overseen by Ofcom;

(3) A website will be required to decide what is or appears to be specifically intended to contribute to ‘democratic political debate in the United Kingdom or a part or area of the United Kingdom’, which could itself be a political judgement and which will be difficult to implement;

(4) The duty to operate ‘proportionate systems and processes’ designed to ensure that ‘the importance of the free expression of journalistic content is taken into account’ allows a website to make what are inevitably political decisions about the extent to which freedom of expression is measured against the risk of ‘serious distress’; and

(5) This is all in the context of a Code of Conduct requiring the identification of ‘priority content’ – that will not be immune from removal – that is as yet unknown (to be determined by the Secretary of State in regulations).

In summary, these are solutions grafted onto the Bill only because the Bill has created its own problem: namely the criminalisation of ordinary – and political, artistic, religious – expression on the grounds that it is likely to cause ‘serious distress’.

30 Finally, the impact of the regulations imposed by the Bill must be seen in the light of the potential severity of the enforcement measures that can be taken against companies – up to and including the restriction of their services (Cl. 123-127).

Conclusion

31 There are three fundamental objections to the Bill:

(1) It criminalises and regulates speech that might cause ‘serious distress’ – a description of behaviour that, through its lack of precision, potentially encompasses ordinary, political, artistic and religious ideas;

(2) By imposing duties of care and risk assessments, it requires social media companies to exercise their judgement over the content of material at the risk of severe sanctions up to and including the sort of disruption of online services until now seen only in autocratic states; and

(3) It gives a state agency, Ofcom, oversight over decisions by companies that will have the effect of censoring speech; and, ultimately, would require the courts to resolve political questions, including the degree to which speech is of democratic importance and the extent to which its public importance should allow it immunity (under Cl. 150) from criminalisation for causing a person serious distress.

32 The Bill is a concerning escalation of the power of the state. The more the state considers it has a duty to ‘protect’, the more it will – of necessity – increase its power.

33 The Bill is not merely the wrong answer: it is an attempt to solve the wrong problem. The true threat posed by large social media companies is not that they have become the ‘wild west’ of the internet. It is that they have allowed the accumulation of power in a handful of companies and a small number of individuals, all imbued with the power to police speech. This Bill will not only enhance that power but put it on a statutory footing and ensure that it is regulated, in turn, by the state.

Francis Hoar

On behalf of the Together Association

23rd May, 2022


[1] https://fieldcourt.co.uk/barrister/francis-hoar/

[2] Including in White v GMC [2021] EWHC 3286, in which I established that the GMC had breached a doctor’s right to free expression by preventing him from contributing to the debate on public health measures.

[3] https://static1.squarespace.com/static/6192438b8e7bbb6ce14e1375/t/61b227d2bbe2e879dc85c8ab/1639065555406/In-Protection-of-Freedom-of-Speech_1429_1607.pdf

[4] All opinions expressed in these submissions are my own and those of the Together Association and not those of Field Court Chambers. Any errors are mine only.

[5] In passing, the author and the Together Association have no objection to the proposed new ‘threatening communication offence’, now Cl. 152, which criminalises threats of death or serious harm. This meets the test for what it is appropriate to criminalise: the conduct is sufficiently serious to merit criminalisation and the act it penalises is sufficiently clearly defined by the statute.

[6] Later Lord Hoffmann, in R v Central Independent Television plc [1994] Fam 192, 202-203, emphasis added.

[7] Redmond-Bate v Director of Public Prosecutions (1999) 7 BHRC 375, [20], emphasis added.

[8] Published by John W. Parker and Son, 1859.

[9] [2013] EWCA Civ 68.

[10] Ibid, para 34.

[11] Which continues to apply unless and until amended under the (first) EU Withdrawal Act 2020.


Prepared 26th May 2022