Draft Online Safety Bill

4 Safety duties relating to adults

Illegal content and activity

113.The Bill places duties on companies to tackle “illegal content” online. To do this, it has to define what is meant by “illegal content”. The criminal law does not specify “this piece of content is illegal”. Rather, it says that someone commits an offence if they (for example) publish an “obscene article”.243 The draft Bill therefore defines “illegal content” as content where “the use of the words, images, speech or sounds” that make up the content can comprise a “relevant offence”, either on their own or in the context of the rest of the site. It can also refer to content where the dissemination itself is the offence. For example, under the Obscene Publications Act, cited above, it is publication that is the offence.244

114.Criminal guilt is determined by law enforcement and the courts. Platforms will not be able to rely on these findings when they draw up and apply their terms and conditions. The draft Bill therefore defines “illegal content” as content where the “service provider has reasonable grounds to believe” that use or dissemination amounts to a “relevant offence”.245 “Reasonable grounds” is a term recognised in law that requires objectivity. For example, a police officer must have reasonable grounds for suspecting someone is committing an offence or about to commit an offence before they use certain powers of arrest.246

115.The draft Bill is not concerned with all content whose creation or dissemination might be an offence. It is only concerned with “relevant offences”. A “relevant offence” must relate to terrorism content,247 CSEA content,248 an existing offence specified by the Secretary of State in regulations249 or an offence where the victim is an individual or individuals.250 Designation under the draft Bill cannot be used to create new offences. Equally, an offence that can be committed online is not considered “illegal content” under the draft Bill unless it falls into one of the categories above.

116.The draft Bill also has a category of “priority illegal content”, defined by the Secretary of State in regulations under criteria that include the risk of harm, the severity of that harm and the prevalence of behaviour that could amount to a relevant offence. Defining “priority illegal content” distinguishes those forms of illegal content whose presence on a platform providers are required to proactively seek out and “minimise” from those where they are only required to mitigate the harm arising from the content or to take it down on report. It is unclear whether the Secretary of State could make regulations covering non-priority illegal content. The different categories are set out below.

Table 1: Summary of definitions of “Illegal Content” in the draft Bill

Category | Defined by | Platforms required to
CSEA and terrorism content | The face of the draft Bill | Mitigate and effectively manage risks to individuals. Ensure it is not persistent or prevalent. Proactively minimise its presence and dissemination, as well as swiftly take down when alerted to it.
Priority illegal content | Regulations from the Secretary of State | Mitigate and effectively manage risks to individuals. Proactively minimise its presence and dissemination, as well as swiftly take down when alerted to it.
Other illegal content | Clause 41(4)(d): another offence of which the intended victim is an individual | Mitigate and effectively manage risks to individuals. Swiftly take down such content when alerted to it.
Explicitly excluded | Offences concerning: (a) the infringement of intellectual property rights; (b) the safety and quality of goods; (c) the performance of a service by someone not qualified to perform it | Not covered by the draft Bill.
Implicitly excluded | Offences not covered above | Do not appear to be considered illegal content by the draft Bill (but may be considered content harmful to adults or children).

Source: Clauses 3 and 41

Focus of the draft Bill

117.For many witnesses, including those who were generally critical of the draft Bill, tackling illegal content online was their priority. Professor Richard Wilson, Gladstein Distinguished Chair of Human Rights at the University of Connecticut, suggested that the Bill should “go after the low hanging fruit” and “suppress that which is already illegal.”251 Silkie Carlo, Director of Big Brother Watch, agreed, saying “the first priority has to be getting a grip on the sheer amount of illegal content online, and criminal communications and criminal abuse …”.252 It was clear from the evidence to us, as outlined in Chapter 2, that social media companies are failing to remove content that could amount to a criminal offence. To take just one specific example, Ms Jankowicz told us “technology companies are not doing their due diligence. We receive rape threats; we are told that these do not constitute a violation of terms of service.”253

118.We have received a large amount of evidence in our inquiry but very little of it takes issue with the regulation of illegal content. This seems to us to point to a self-evident truth: that regulation of illegal content online is relatively uncontroversial and should be the starting point of the Bill.

Scope of “illegal content”

119.We heard that what constitutes “illegal content” is not clear on the face of the draft Bill. As noted above, “illegal content” can be specified by the Secretary of State in regulations or is content that amounts to an offence against an individual. Barbora Bukovská, Senior Director of Law and Policy at Article 19, noted that the Bill does not exhaustively list illegal content, meaning that the decision on what is covered is partly delegated to providers.254 The British and Irish Law, Education and Technology Association made similar criticisms and noted the potential impact on fundamental rights of inconsistent decision making.255

120.The Crown Prosecution Service (CPS) lists at least fourteen criminal offences that can be committed or facilitated by online communication.256 They include harassment or stalking,257 making threats to kill,258 disclosing sexual images without consent259 and blackmail.260

121.We heard from the Department that the list of offences that would be designated priority illegal content under the Secretary of State’s powers was already being developed:

“Although the final list of offences is yet to be confirmed, priority categories of criminal offences are likely to include hate crime, revenge pornography, promoting or facilitating illegal immigration and the sale of illegal drugs and weapons.”261

122.Examples of offences that are prevalent online and might not be covered under “illegal content” as it stands include some relating to extreme pornography, some elections offences (including the proposed offences relating to exercising undue influence through disinformation about election administration and failure to include information about origins of election material in the Elections Bill262) and some hate crime depending on how the regulations are drafted, as we discuss below.

123.The Minister’s letter says that the Government intends to include hate crime as priority illegal content. The criminal law in England and Wales recognises two forms of hate crime. The first recognises hostility263 to the victim as an aggravating factor where it is based on five characteristics or perceived characteristics: race, religion,264 sexual orientation, disability, and transgender identity.265 Where hostility is found, the court can increase the sentence above the recommended guideline for the substantive offence. Similar provisions on hate crime apply in Scotland following the passing of the Hate Crime and Public Order (Scotland) Act 2021.

124.The second type of hate crime concerns specific offences such as Incitement to Racial Hatred266 and Stirring Up Racial Hatred,267 or Stirring Up Hatred on the Grounds of Religious Belief or Sexual Orientation.268 Stirring up racial hatred may be particularly relevant to the online environment as it explicitly includes written material that is “threatening, abusive or insulting”,269 and does not necessarily require the offender to demonstrate intent.270 It remains to be seen which forms of hate crime the Government intends to address.

125.We heard particular support for including offences that disproportionately affect women in the definition of illegal content. Bumble, a dating site, wanted to see the inclusion of offences “relating to image-based abuse, sexual harassment that takes place or is facilitated online, misogynistic content, gendered hate crime, and stalking” adding “it should not be left to the Secretary of State’s discretion for services to be obliged to mitigate and tackle harms that disproportionately affect women.”271 Other witnesses agreed.272 At the same time, not all of these issues are currently the subject of the criminal law. The Law Commission has recently published its review of the law on Hate Crime and is currently reviewing the law on intimate image abuse. We consider these issues below.

126.We believe the scope of the Bill on illegal content is too dependent on the discretion of the Secretary of State. This downplays the fact that some content that creates a risk of harm online potentially amounts to criminal activity, which the Government has said it is one of the key objectives of the Bill to remove from the online world.

127.We recommend that criminal offences which can be committed online appear on the face of the Bill as illegal content. This should include (but not be limited to) hate crime offences (including the offences of “stirring up” hatred), the offence of assisting or encouraging suicide, the new communications offences recommended by the Law Commission, offences relating to illegal, extreme pornography and, if agreed by Parliament, election material that is disinformation about election administration, has been funded by a foreign organisation targeting voters in the UK or fails to comply with the requirement to include information about the promoter of that material in the Elections Bill.

Reform of the Criminal Law

128.Whilst there is broad agreement that the Bill should regulate illegal content, the criminal law in relation to the online world has long been recognised as in need of reform. In 2019 the House of Commons Petitions Committee described the law relating to online abuse as “not fit for purpose”.273

129.Communications offences have long existed in domestic law and have evolved as methods of communication have changed. The current law is based primarily on two offences: sending a communication with the intent to cause distress or anxiety, contrary to Section 1 of the Malicious Communications Act 1988,274 and the improper use of a public communications network under Section 127 of the Communications Act 2003.275 Section 1 of the Malicious Communications Act 1988 requires that the communication be sent to another person, meaning public posts on social media may not be covered. Both offences rely on the communication being “grossly offensive” or otherwise “indecent”.276

130.Concerns over the utility of the current communications offences led the Government to ask the Law Commission to examine the law and make recommendations for reform. The Law Commission’s report Modernising Communications Offences was published in July 2021.277 The Commission concluded the current law potentially both over- and under-criminalised social media users. Under-criminalisation occurred because “some abusive, stalking and bullying behaviours, despite causing substantial harm, simply fall through the cracks.” Over-criminalisation happened because the communications offences focus on the content of the message, not the harm it causes or is intended to cause. This, combined with the subjective nature of “grossly offensive” or “indecent”, means:

“ … the law criminalises without regard to the potential for harm in a given context. Two consenting adults exchanging sexual text messages are committing a criminal offence, as would be the person saving sexual photographs of themselves to a ‘cloud’ drive.”278

131.The Law Commission also concluded that the current criminal law does not adequately police behaviour such as the promotion of self-harm, cyber-flashing or the sending of flashing images to people with epilepsy. We heard compelling evidence on these issues, as outlined in Chapter 2. The Commission concluded that the following new offences (which will apply to users rather than service providers or their senior managers) should be introduced:

a new “harm-based” communications offence to replace the offences within Section 127(1) of the Communications Act 2003 and the Malicious Communications Act 1988, which would require intent to cause harm;

a new offence of encouraging or assisting serious self-harm;

a new offence of cyberflashing which would require either intent to cause harm or recklessness as to whether harm was caused and would be defined as a sexual offence;

a new offence of intentionally sending flashing images to a person with epilepsy with the intention to cause that person to have a seizure; and

new offences of sending knowingly false, persistent or threatening communications, to replace section 127(2) of the Communications Act 2003.279

132.The Law Commission’s report was published after the publication of the draft Bill and the creation of our Committee. On 4 November, following press reports, the Secretary of State confirmed to us that she intends to adopt the Law Commission’s recommendations on harm-based and false, persistent and threatening communications and anticipated that the other recommended offences would be taken forward by the relevant departments.280

133.The Law Commission’s report paid particular attention to the need to protect freedom of expression online and the Commission made significant changes to its proposed offences in response to concerns about this raised during the consultation. These were noted by the House of Lords Communications and Digital Committee in their report.281

134.Days before we finalised our report, the Law Commission produced its report on reforming hate crime legislation. It recommended the creation of an offence of “stirring up” hatred or hostility on the grounds of sex, disability and transgender, or gender diverse, identity.282 We had already considered how these forms of hate should be considered in online safety regulation and recommended that they should be covered, as set out below.

135.Implementation of the Law Commission’s recommendations on reforming the Communications Offences and Hate Crime will allow the behaviour covered by the new offences to be deemed illegal content. We believe this is a significant enhancement of the protections in the Bill, both for users online and for freedom of expression, by introducing greater certainty as to the content that online users should be deterred from sharing. We discuss how to address concerns about ambiguity and the context-dependent nature of the proposed harm-based offence through a statutory public interest requirement in Chapter 7.

136.We endorse the Law Commission’s recommendations for new criminal offences in its reports, Modernising Communications Offences and Hate Crime Laws. The reports recommend the creation of new offences in relation to cyberflashing, the encouragement of serious self-harm, sending flashing images to people with photo-sensitive epilepsy with intent to induce a seizure, sending knowingly false communications which intentionally cause non-trivial emotional, psychological, or physical harm, communications which contain threats of serious harm and stirring up hatred on the grounds of sex or gender, and disability. We welcome the Secretary of State’s intention to accept the Law Commission’s recommendations on the Communications Offences. The creation of these new offences is absolutely essential to the effective system of online safety regulation which we propose in this report. We recommend that the Government bring in the Law Commission’s proposed Communications and Hate Crime offences with the Online Safety Bill, if no faster legislative vehicle can be found. Specific concerns about the drafting of the offences can be addressed by Parliament during their passage.

137.New offences need enforcement resources if they are to be tackled effectively. We heard from T/Commander Clinton Blackburn about the challenges the police face in resourcing their response to economic crime.283 In Brussels we heard about the overwhelming amount of CSEA content that Interpol assists national enforcement services with. Adding new offences without increasing resources to enforce them will not help victims. When the Law Commission’s new offences are brought into law, the police will need greater enforcement resources to ensure that perpetrators are brought to justice.

138.The Government must commit to providing the police and courts with adequate resources to tackle existing illegal content and any new offences which are introduced as a result of the Law Commission’s recommendations.

Identifying “illegal content”

139.The criminal law is designed to establish whether or not an individual is guilty of an offence to a high standard of proof following an extensive, and adversarial, legal process. Since an individual’s liberty and good name may be at stake, the criminal law requires that all elements of an offence be proved so that jurors are “satisfied so that you are sure” or convinced “beyond reasonable doubt” before an offender is found guilty.

140.As set out in paragraph 114, the test for illegal content as defined by the draft Bill284 is substantially lower than the test applied in the criminal courts.285

141.The draft Bill requires the provider to operate “proportionate” systems and processes to mitigate risks of harm, minimise the presence of such content or take it down once reported. We heard that the application of the “reasonable grounds to believe” test by providers would be a challenging task given the complexity of much of the criminal law. We heard concerns from Mr Millar that:

“ … applying the statutory wording of most modern criminal offences to the facts is a difficult and technical exercise. It is one which police, CPS and courts often get wrong. This is both because of the flexibility of the language that is used and because of the detailed nature of the drafting in most of our contemporary criminal offences. Criminal offences now, especially terrorism and CSEA offences, are much more complex than they were 30 or 40 years ago.”286

142.We heard concerns from some witnesses that this might lead to over-censorship by platforms looking to avoid being subject to penalties under the draft Bill.287

143.The draft Bill addresses the problem of how some illegal content can be identified in practice by requiring Ofcom to publish a Code of Practice on terrorism content and CSEA content. It does not require such a Code of Practice for the wider duties around illegal content.

144.We recommend that Ofcom be required to issue a binding Code of Practice to assist providers in identifying, reporting on and acting on illegal content, in addition to those on terrorism and child sexual exploitation and abuse content. As a public body, Ofcom will need to ensure its Code of Practice complies with human rights legislation (currently being reviewed by the Government), and this will provide an additional safeguard for freedom of expression in how providers fulfil this requirement. With this additional safeguard, and others we discuss elsewhere in this report, we consider that the test for illegal content in the Bill is compatible with an individual’s right to free speech, given providers are required to apply the test in a proportionate manner that is set out in clear and accessible terms to users of the service.

145.We recommend that the highest risk service providers be required to archive and securely store all evidence of content removed from online publication for a set period of time, unless to do so would in itself be unlawful. In the latter case, they should store records of having removed the content, its nature and any referrals made to law enforcement or the appropriate body.

Power to designate priority illegal content

146.Given our recommendation that more offences should be listed on the face of the Bill, the question might arise as to whether the power of the Secretary of State to designate priority illegal content is still required.

147.Some of our witnesses expressed concern about the powers to designate priority content in the draft Bill. Ms Carlo described them as a “blank cheque”288, whilst Prof Wilson said:

“The question we should always ask of legislation is, ‘Would I like this in the hands of my political opponents?’ because one day they will come to power.”289

148.We recommend that the Secretary of State’s power to designate content relating to an offence as priority illegal content should be constrained. Given that illegal content will in most cases already be defined by statute, this power should be restricted to exceptional circumstances, and only after consultation with the Joint Committee of Parliament that we recommend in Chapter 9, and implemented through the affirmative procedure. The Regulator should also be able to publish recommendations on the creation of new offences. We would expect the Government, in bringing forward future criminal offences, to consult with Ofcom and the Joint Committee as to whether they should be designated as priority illegal offences in the legislation that creates them.

Duties to protect adults’ online safety

149.For some of our witnesses, the scope of the Bill should be confined to regulating content that is likely to be illegal. The campaign coalition, “Legal to Say, Legal to Type” argued:

“If something is legal offline, it should be legal online. If the government believes that particular content should be criminalised online, they should address this through parliament and the courts, not big tech.”290

150.We also heard that this would leave large areas of content and activity that cause risks of harm online unregulated. This would include content and activity that is legislated for offline in the criminal and civil law, and could give scope for service providers to refuse to act even against content that has been, or may yet be, legislated for online.291

What lies outside “illegal content”

Characteristics

151.The criminal law in England and Wales covers hate crime arising from hostility to race, religion, disability, sexual orientation and transgender identity. This is significantly different from the protected characteristics under the Equality Act 2010, which prohibits discrimination on the grounds of age,292 disability,293 gender reassignment,294 marriage and civil partnership,295 race,296 religion or belief,297 sex,298 and sexual orientation.299 The Equality Act applies in workplaces and to the provision of public and private services.300 However, the protections it provides against, for example, abuse or harassment do not apply in other offline settings, to private companies online, or to the users of social media platforms.

Misogyny

152.Much of the harmful online behaviour we heard about from witnesses would not be covered even under the expanded scope of illegal content we recommend. For example, hostility on the grounds of sex does not currently constitute a hate crime, even though we heard women face significant abuse online and offline. Edleen John, Director of International Relations and Corporate Affairs and Co-Partner for Equality, Diversity and Inclusion at the FA, told us that misogynistic abuse was experienced by women players “from the top flight game—England players—down to the grass roots, so including young women and players in our impairment specific pathways.”301 Ms John told us the volume of recent misogynistic and racist abuse was so high that certain players had been blocked from making further reports to social media companies, apparently because the companies had assumed that so many complaints from one person must be malicious.302

153.The Centenary Action Group, a coalition of groups campaigning to remove barriers to women’s political representation, highlighted research that showed women are 27 times more likely than men to be harassed online.303 Incels, who claim there is a conspiracy preventing some men from having sexual relationships with women, post “violent chatter including celebrating the murder of women and calling for the rights of women to be curtailed.”304

154.Misogynistic abuse taking place in a workplace, on public transport or by the provider of a service could lead to action under the Equality Act, a law intended to prevent people being disadvantaged by hostility to their personal characteristics. Many witnesses told us that women’s experience of gendered abuse online leads to a “chilling effect”305 on their freedom of expression and professional careers, the fear of attracting abuse inducing self-censorship.306 Prof McGlynn told us her research showed that women can “experience a more general sense of threat of sexual harassment, violence and abuse from having been abused online which impacts their daily lives and decisions”.307 Other witnesses agreed.308

155.Several witnesses told us that gender-based abuse online deterred women from participating in public life. Ms Jankowicz noted as an example that the abuse received by Vice-President Kamala Harris was often sexualised. Ms Jankowicz highlighted the damaging impact on young women’s participation in the democratic process.309 Ms Wick agreed that misogynistic abuse to individuals led to societal harm: “We know that teenage girls are much less likely to speak up on social media for fear of being criticised. That has a very real effect on their ambitions to go into public life.”310 Mr Perrin highlighted the impact of online intimidation on women’s participation in political life in Northern Ireland.311

Other characteristics

156.We have not heard specific evidence on ageist abuse, abuse against non-religious belief or on the basis of maternity or marital status. At the same time, none of these would be covered by a Bill that focused only on illegal content. As another example, the laws covering the “stirring up of hatred” only apply to race, religion and sexual orientation, and would not apply in the case of other characteristics protected by the law on hate crime or the Equality Act.312 The Law Commission has, however, recently recommended that the offences should be extended to cover sex, disability and transgender, or gender diverse, identity.313

Threshold

157.We discussed above the threshold of proof required to secure a criminal conviction and the test that the draft Bill applies. Although the threshold of proof is lower in the draft Bill, it remains the case that providers would need to have systems and processes in place to take a view as to whether all the elements of an offence might reasonably be made out. This presents particular problems with offences that require proof of “state of mind” (such as intent or malice) on the part of the guilty party, which would include the Law Commission’s new harm-based communications offence.314 For example, the offence of harassment requires that the person in question either knows or “ought to know” that their behaviour constitutes harassment.315

158.Much of the behaviour that we heard creates risks of harm may not therefore fit easily into a regulatory regime solely focused on illegal content. For example, the Law Commission’s new offence of cyberflashing requires that the sender either intended to cause distress or sent the image for their personal sexual gratification and was reckless as to whether distress was caused.316 We heard that the sending of unsolicited penis images was a particular problem for young women and girls, a concern borne out by the findings of Ofsted in its report on sexual abuse in schools.317 Research suggests such images are frequently not sent with intent to distress or for sexual gratification but that a “large amount of it is a kind of male bonding among their peers. That is why they share unsolicited nude images as well; they want to share among their peer group, ‘Oh, we’ve sent them’.”318

159.Hope not Hate, among others, had particular concerns about the large volume of far-right or other extremist propaganda “that does not reach the legal threshold for prosecution.”319 The British Horseracing Authority and Professional Jockeys Association told us that people who had lost money betting on horseraces post huge volumes of vitriolic abuse of their members online. This included material that would fall short of a direct, criminal threat to kill but nonetheless is threatening: “A truly dodgy b*****d. Karma punish you, wish you break your neck and never ride again. A***hole. Idiot.”320

Content that is harmful to adults

160.To address these cases, the Government introduced a third safety duty in Clause 11, which places a duty on Category 1 providers to protect adults’ online safety, covering content that is “harmful to adults”.321 Like the other safety duties, it requires a specific risk assessment and for service providers to state in their terms of service how they will deal with this type of content, where it is designated as priority content or identified in the provider’s risk assessment. It does not mandate specific outcomes, such as removing or minimising the presence of this content. As written, the draft Bill leaves such decisions to the service providers, although Ofcom has the power to issue a code of practice on compliance.322

Defining content that is harmful to adults - Clause 11

161.One of the problems this legislation must grapple with is defining what creates a risk of harm to adults. Clause 11 attempts this in a broad way, and we have heard throughout our inquiry that this will make it difficult to apply, as well as open to legal challenge.323

162.As with other categories of content, the Government aims to identify specific types of harmful content by designating them as “priority content that is harmful to adults”. These are not listed on the face of the Bill but DCMS suggested they may include the “most prevalent forms of online abuse, together with other harmful material which might disproportionately impact vulnerable users, such as self-harm or suicide content.”324 The draft Bill requires service providers to set out in their terms and conditions how priority content that is harmful to adults will be “dealt with” by the service.325 It does not specify what is meant by “dealt with”.

163.Beyond those designated specifically as “priority content that is harmful to adults”, content is considered to be harmful to adults if: “the service has reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibilities.”326 This uses the same terminology as the definition for content that is harmful to children. The service provider only has to specify in its terms of service how non-priority content that is harmful to adults will be “dealt with” if it is identified in the provider’s risk assessment.327

Power to designate priority harm

164.The difference between priority and non-priority content that is harmful to adults is whether a service provider has to include it in their terms and conditions regardless of whether it is identified in their risk assessment. The Government’s justification for the power to designate priority content that is harmful to adults was to allow it to respond to emerging risks of harm.328 However, we heard widespread concern about the breadth of this power. Unlike the powers granted in relation to illegal content and content harmful to children, this power is not bound by any definition or legislation, nor is it restricted to a particular group. The Secretary of State is required to consult Ofcom, but not to follow its recommendations.329 Whilst the initial use of the power requires an affirmative vote in both Houses, subsequent amendment is exercisable by negative statutory instrument, meaning there is no guarantee of parliamentary scrutiny.330

Delegation of decision making

165.Another aspect of Clause 11 that concerned witnesses was that it effectively delegates to service providers responsibility for deciding what is ‘harmful’ and gives them the authority of the state in doing so. Prof Wilson said that the broad definition of harm “may contribute to the misapplication of the regulatory powers of the Bill”331. Journalist Matthew d’Ancona, Editor at Tortoise Media, added that it would involve “handing over the definition of harm” to tech companies, and that the draft Bill “allows huge latitude around what constitutes harm.”332 Ms Bukovská said vagueness would contribute to over-removal and the suppression of minority voices.333 The delegation to service providers of deciding what may be harmful was one of the most frequent concerns we heard. The British and Irish Law, Education and Technology Association said:

“In being asked to make determinations of legal speech, commercial platforms are being trusted with decisions on what is–or is not–permitted speech. The model proposed therefore rests on trust, placing the operators of platforms in a position where they are directly controlling the speech of an individual.”334

166.On the other side of the argument, there were concerns that Clause 11 as drafted was simply ineffective. Dr Francesca Sobande, Cardiff University, said:

“Before the Bill is finalised it should include a more detailed explanation of online abuse and harms to appropriately contextualise how online safety is understood, and to ensure that a broad range of forms of online abuse are acknowledged (e.g. including, but not limited to, ableism, ageism, racism, sexism, misogyny, xenophobia, Islamophobia, homophobia, and transphobia).”335

167.Mr Ahmed agreed: “You are asking the companies to mark their own homework, but you are also, in one respect, asking them to set their own rules and set the test itself.”336

168.We were told by several witnesses, including representatives of Facebook and Twitter, as well as the Football Association, that it should be for Parliament to decide what should and should not be covered by regulation.337 The House of Lords Communications and Digital Committee came to a similar conclusion:

“If a type of content is seriously harmful, it should be defined and criminalised through primary legislation. It would be more effective—and more consistent with the value which has historically been attached to freedom of expression in the UK—to address content which is legal but some may find distressing through strong regulation of the design of platforms, digital citizenship education, and competition regulation.”338

169.As we discussed above, the Law Commission’s completed work on reform of the communications offences proposes a new harm-based offence for online communication.339 This may allow the provisions on content that is harmful to adults to be refined further, as some of this content will in future be caught by the duties to act on regulated activity if the Government accepts our recommendations. As we discussed in the previous section, however, there are challenges in setting thresholds for service providers to use, which may not be the same as those used by law enforcement. Some content that is harmful to adults will remain outside the scope of the legislation.

Replacing Clause 11

170.We heard from those primarily concerned with freedom of expression that the definition of content that is harmful to adults is unsuitably broad, but also from many who welcome it, or who think that it may mean leaving significant causes of harm unaddressed or in the hands of the service providers. While narrowly defining every type of content that could be harmful is appealing in some ways, we have also heard that the Bill needs to be agile and flexible.340 The legislation will need to be able to adapt to an ever-changing societal and technological landscape, but also not be subject to undue political influence.

171.While a tighter definition of content that is harmful to adults may make the statutory requirements easier to fulfil and reduce some of the risk around overzealous moderation, a definitive list also carries risks: that it will become out of date (if it is difficult to update) and that it is open to undue political influence (if it is too easy to add to). There is, however, an existing body of law that, taken together, may provide a reasonable estimation of what types of activity society may agree are potentially harmful, even if the relevant offences are not always directly applicable online. For example, while hate crime legislation only protects a limited number of groups, the characteristics named by the Equality Act and hate crime legislation together reflect those considered to warrant protection in civil society. It may therefore be possible to refine the definition of content that is harmful by reference to existing law, even if not linking it to specific offences. Mr d’Ancona agreed that there may already be a basis that could be used:

“Except for free speech absolutists, there are plenty of perfectly legitimate, legislated, in precedent or in common law, restrictions on speech that now really need to be put into action. The problem is a legislative structure that matches the technological revolution rather than identifying speech that is harmful.”341

172.Sanjay Bhandari, Chair of Kick It Out, said:

“Sometimes people think that the legal part feels like a big grey area, and how do you legislate for that? Actually, we have some jurisprudence from elsewhere. There is a civil law cause of action in conspiracy, and conspiracy has two limbs: if lawful means conspiracy, or unlawful means conspiracy. You can conspire by lawful means and be held to be civilly responsible for that. That goes back to the 1940s and was clarified in the Lonrho v Fayed litigation in the late 1980s/early 1990s, and there has been a rich history of that economic tort.

There are two key defining characteristics. Was harm experienced in this case? Yes, tick, harm was experienced. Was it intended? Was it aimed? If you send a monkey emoji to a footballer, that is pretty clearly intended to cause harm.

We have precedents, we have jurisprudence. We just need to look at that jurisprudence from elsewhere and bring that under harmful content, because I think it is achievable.”342

173.As well as the body of criminal law that may not have been designed to be applied online but that represents well understood causes of harm, this may also draw on the Equality Act, electoral law and the legitimate reasons for interference in freedom of expression defined by the ECHR, as well as its protections for free and fair elections. The BBFC, which regulates film and video in the UK, described in evidence to us the risks of harm it considers in detail when classifying video content, and how those standards are not applied online.343 Similarly, the Communications Act 2003 references the characteristics described in the Charter of Fundamental Rights of the European Union when defining harmful content for video sharing platforms, albeit with the higher threshold of incitement. Using this approach extends beyond hate crime legislation and covers characteristics more similar to those in the Equality Act.

174.Clause 11 of the draft Bill has been widely criticised for its breadth and for delegating the authority of the state to service providers over the definition of content that is harmful and what they should do about it. We understand its aims and that the Government intended it primarily as a transparency measure over something companies are already doing. As drafted, however, it has profound implications for freedom of speech, is likely to be subject to legal challenge and yet may also allow companies to continue as they have been in failing to tackle online harm.

175.We agree that the criminal law should be the starting point for regulation of potentially harmful online activity, and that safety by design is critical to reduce its prevalence and reach. At the same time, some of the key risks of harm identified in our evidence are legislated for in parts of the offline world, but not online, where the criminal law is recognised as needing reform, or where drafting that makes sense in the context of determining individual guilt would allow companies to challenge attempts to make them act. A law aimed at online safety that does not require companies to act on misogynistic abuse or stirring up hatred against disabled people, to give two examples, would not be credible. Leaving such abuse unregulated would itself be deeply damaging to freedom of speech online.

176.We recommend that Clause 11 of the draft Bill is removed. We recommend that it is replaced by a statutory requirement on providers to have in place proportionate systems and processes to identify and mitigate reasonably foreseeable risks of harm arising from regulated activities defined under the Bill. These definitions should reference specific areas of law that are recognised in the offline world, or are specifically recognised as legitimate grounds for interference in freedom of expression. For example, we envisage it would include:

177.As with the other safety duties, we recommend that Ofcom be required to issue a mandatory code of practice to service providers on how they should comply with this duty. In doing so, Ofcom must identify features and processes that facilitate the sharing and spread of material in these named areas and set out clear expectations of the mitigation and management strategies that will form part of providers’ risk assessments, moderation processes and transparency requirements. While the code may be informed by particular events and content, it should be focused on the systems and processes of the regulated service that facilitate or promote such activity rather than on any individual piece of content. We envisage that this code would include (but not be limited to):

178.Accepting these recommendations would create a narrower, but stronger, regulatory requirement for service providers to identify and mitigate risks of harm in the online world that may not necessarily meet the criminal thresholds, but which are based on the same criteria as those thresholds, indicating that society has recognised they are legitimate reasons to interfere with freedom of speech rights. It would place these areas on the face of the Bill and remove the broad delegation of decisions on what is harmful from service providers.

179.We recognise that the broad power to define new types of content that is harmful to adults in secondary legislation was a key concern with Clause 11. We recognise that there will need to be the ability to amend what is covered by this proposal to ensure that the Bill is futureproofed. At the same time, it needs to be tightly circumscribed and subject to active parliamentary scrutiny and review.

180.We recommend that additions to the list of content that is harmful should be by statutory instrument from the Secretary of State. The statutory instrument should be subject to approval by both Houses, following a report from the Joint Committee we propose in Chapter 9. Ofcom, when making recommendations, will be required by its existing legal obligations to consider proportionality and freedom of speech rights. The Joint Committee should be specifically asked to report on whether the proposed addition is a justified interference with freedom of speech rights.

Accessibility and consistency of terms and conditions

181.We have heard throughout this inquiry that, when it comes to the types of content and activity presenting a risk of harm that we have been discussing, many service providers already have terms and conditions, or terms of service, that prohibit them, but these are poorly understood and inconsistently applied and enforced. Mr Ahmed told us that service providers already have policies banning vaccine disinformation but do not enforce them.345 Mr Russell described a “veneer of useability about most platforms”. He continued:

“As soon as you get beneath that veneer to the … pages of the average terms and conditions, or whatever it is, it is a mystery to most people. They are a great example of how the user experience needs to be simplified for all, so that it is better understood and more readily understood by those who need to understand it.”346

Others agreed that the size of most service providers’ terms and conditions meant they were unlikely to be read. Mr Harrison told us accessibility would be the “key point for those with learning disabilities”.347 Ms Pelham suggested a one-page synopsis, accessible to the average reader, should be required.348 As well as their complexity, there have also been cases of companies deliberately applying their terms and conditions inconsistently. For instance, Facebook has been reported to “whitelist” high profile accounts, such as those of celebrities and politicians, essentially exempting them from its terms and conditions altogether.349 We discuss this further in the later chapter on transparency (Chapter 9).

182.Dame Melanie Dawes DCB, Chief Executive of Ofcom, talked about the important role clear terms and conditions could play:

“This is about terms and conditions that make sense to the user and are not just about your assent; they are about you being given information that helps you to manage your life online and manage risks to you. They are a commitment from the company to you as to what you can expect.”350

183.The original Clause 11 in the draft Bill, in common with the other safety duties, required providers to produce clear and accessible terms of service and enforce them consistently in relation to content harmful to adults. While we have recommended a narrower but stronger regulatory requirement for service providers to identify and mitigate risks of harm, the requirements for transparency, clarity and consistency are vital to ensuring users are well informed about how platforms promote content to them and what protections they can expect. Clear, concise and fully accessible terms will allow users to make informed choices.

184.We recommend that the Bill mandates service providers to produce and publish an Online Safety Policy, which is referenced in their terms and conditions, made accessible for existing users and made prominent in the registration process for new users. This Online Safety Policy should: explain how content is promoted and recommended to users, remind users of the types of activity and content that can be illegal online and provide advice on what to do if targeted by content that may be criminal and/or in breach of the service providers’ terms and conditions and other related guidelines.

185.The Online Safety Policy should be produced in an accessible way and should be sent to all users at the point of sign up and, as good practice suggests, at relevant future points. “Accessible” should include accessible to children (in line with the Children’s Code), where service providers allow child users, and accessible to people with additional needs, including physical and learning disabilities. Ofcom should produce a Code of Practice for service providers about producing accessible and compliant online safety policies and on how they should make them available to users to read at appropriate intervals in line with best practice (for example, when the user is about to undertake an activity for the first time or change a safety-relevant setting).

Online fraud

186.Some of the most prevalent illegal content risking harm to adults that we heard about was fraud, which was reported to be the single biggest crime in the UK last year351—an estimated £2.3bn was lost by victims to fraud over the past year alone.352 As well as financial detriment, victims suffer psychological harms. For example, 28 per cent of adults reported feeling depressed after being scammed.353 According to Action Fraud, 85 per cent of scams rely on the internet in some way.354

187.We received evidence on a number of methods of online fraud which were of varying sophistication. For example, we heard from insurance companies that fraudsters can make copies of legitimate insurance providers’ websites and pay for them to appear at the top of search results. Customers can inadvertently enter the copy website and disclose their details to criminals.355

188.Martin Lewis, founder of Money Saving Expert and the Money and Mental Health Policy Institute, highlighted investment scams, which promote fake investment opportunities promising high returns. He raised the case of a man who had lost £19,000 to such a scheme,356 and of a grandmother who put money her grandchild had inherited from a deceased parent into such a scam.357 We also heard about romance scams, where fraudsters create fake accounts on dating sites and develop relationships with victims. Once victims are emotionally invested, fraudsters pretend to be in urgent need of money and request assistance. We heard that a total of £21.2 million was lost to romance scams in 2020 (an increase of 17 per cent from 2019), affecting nearly 9,000 reported victims.358

189.The Office of the City Remembrancer, City of London Corporation, told us that compared to 2019, there has been a significant increase in reports of fraud facilitated through online channels: online shopping and auction fraud (43 per cent increase), romance scams (15 per cent increase), and investment fraud (16 per cent increase).359

190.In the White Paper the Government had initially said fraud would not be covered by online safety regulation. However, on publication of the draft Bill the Government announced it would be included360 and the Prime Minister confirmed in his July 2021 appearance before the Commons Liaison Committee that “one of the key objectives of the Online Safety Bill is to tackle online fraud”.361

191.Nonetheless, concerns remain over whether the provisions in the draft Bill will tackle fraud effectively. Notably, the draft Bill treats fraud as “illegal content” rather than as “priority illegal content”, and does not explicitly mention it in the same vein as CSEA and terrorism content. This means that providers will have a duty to remove the content, but only on being notified of it by users.362 We heard that this reactive rather than proactive approach is likely to be ineffective at dealing with fraud, given that people may only become aware and make a report after a crime has taken place. Designating fraud as “priority illegal content” would place a duty on providers to minimise the risk that the content would appear on their service in the first place—several witnesses argued that this would be a more effective provision in the fight against online fraud.363

192.The CMA was particularly concerned to see fraud designated as “priority illegal content” to ensure that the Bill does not undermine existing consumer legislation. The CMA outlined its interpretation of the Consumer Protection from Unfair Trading Regulations 2008 as requiring platform operators to take proactive steps to minimise economically harmful content on their platforms, rather than simply responding to it when it is reported.364 The CMA therefore expressed concern that, were fraud not designated “priority illegal content”, “people will see that slightly narrower duty [reactive rather than proactive] and think that it supersedes the existing law, supplants it and therefore weakens it.”365 Guy Parker, Chief Executive of the Advertising Standards Authority, also told us: “I think there are good arguments for extending the scope of the Online Safety Bill to cover financial scams.”366

193.UK Finance argued that fraud could be explicitly mentioned in the Bill in the same way CSEA and terrorism offences are and suggested amendments to achieve this.367 This would mean that the Bill itself, rather than secondary legislation, would ensure that platforms were required to proactively prevent fraudulent content from appearing.

194.We welcome the inclusion of fraud and scams within the draft Bill. Prevention must be prioritised and this requires platform operators to be proactive in stopping fraudulent material from appearing in the first instance, not simply removing it when reported. We recommend that clause 41(4) is amended to add “a fraud offence” under terrorism and child sexual exploitation and abuse offences and that related clauses are similarly introduced or amended so that companies are required to proactively address it. The Government should consult with the regulatory authorities on the appropriate offences to designate under this section. The Government should ensure that this does not compromise existing consumer protection regulation.

195.The Bill must make clear that ultimate responsibility for taking action against criminal content remains with the relevant regulators and enforcement bodies, with Ofcom reporting systemic issues relating to platform design and operation—including in response to “super complaints” from other regulators. The Bill should contain provisions requiring information-sharing and regulatory cooperation to facilitate this.


243 Obscene Publications Act 1964

244 Draft Online Safety Bill, CP 405, May 2021, Clause 41(3)

245 Draft Online Safety Bill, CP 405, May 2021, Clause 41(3)

246 Police and Criminal Evidence Act 1984, sections 24(1)(c) and (d)

247 Draft Online Safety Bill, CP 405, May 2021, Clause 41(4)(a); Clause 42 and Schedule 2

248 Draft Online Safety Bill, CP 405, May 2021, Clause 41(4)(b); Clause 43 and Schedule 3

249 Draft Online Safety Bill, CP 405, May 2021, Clause 41(4)(c) also see clause 44

250 Draft Online Safety Bill, CP 405, May 2021, Clause 41(4)(d)

255 Written evidence from British & Irish Law, Education & Technology Association (OSB0073)

256 Crown Prosecution Service, ‘Social Media - Guidelines on prosecuting cases involving communications sent via social media’: https://www.cps.gov.uk/legal-guidance/social-media-guidelines-prosecuting-cases-involving-communications-sent-social-media [accessed 22 November 2021]

257 Protection from Harassment Act 1997, sections 2, 2A, 4 or 4A

258 Offences Against the Person Act 1861, section 16

259 Criminal Justice and Courts Act 2015, section 33

260 Theft Act 1968, section 21

261 Letter from Secretary of State 26 November 2021

262 Elections Bill as amended in Committee, Clause 7(1)(6)(f) and Clauses 38 to 43

263 Hostility is not defined in law and so the courts consider it as the ordinary English meaning of the word which the CPS website describes as “ill-will, ill-feeling, spite, prejudice, unfriendliness, antagonism, resentment, and dislike.” CPS, ‘Hate Crime’: https://www.cps.gov.uk/crime-info/hate-crime [accessed 30 November 2021]

264 Crime and Disorder Act 1998, sections 28-32 and Sentencing Act 2020, section 66; the criminal law interprets these characteristics broadly: race includes “race, colour, nationality (including citizenship) or ethnic or national origins” while religion includes a lack of religious belief. The law also covers erroneous assumptions as to a victim’s race or religion by the offender: liability for abuse cannot be avoided simply because the offender picked an abusive term that did not actually apply to the victim. Case law has found that Gypsies, Irish Travellers, religious converts and apostates are all protected by the legislation.

265 Sentencing Act 2020, section 66; CPS, ‘Hate Crime’: https://www.cps.gov.uk/crime-info/hate-crime [accessed 30 November 2021]

266 Public Order Act 1986, part III, sections 18-23

267 Public Order Act 1986, sections 29B-29C

268 Public Order Act 1986, section 29B(1)

269 Public Order Act 1986, section 18(1)

270 Public Order Act 1986, section 18(1)(b)

271 Written evidence from Bumble (OSB0055).

272 For example, Carnegie UK (OSB0095), Refuge (OSB0084), Centenary Action Group (OSB0047).

273 House of Commons Petitions Committee, Online Abuse and the Experience of Disabled People (First Report, Session 2017–19, HC 759), para 19

274 Malicious Communications Act 1988, section 1

275 Communications Act 2003, section 127

276 Malicious Communications Act 1988, section 1(1)(a)(i); Communications Act 2003, section 127(1)(a)

277 The Law Commission, ‘Modernising Communications Offences: A Final Report’: Law Com No 399, HC 547 [accessed 22 November 2021]

278 The Law Commission, ‘Modernising Communications Offences: A Final Report’, para 1.6: Law Com No 399, HC 547 [accessed 22 November 2021]

279 The Law Commission, ‘Modernising Communications Offences: A Final Report’, para 1.30, para 1.31: Law Com No 399, HC 547 [accessed 22 November 2021]

281 Communications and Digital Committee, Free for all? Freedom of expression in the digital age (1st Report, Session 2021–22, HL Paper 54)

282 Law Commission, ‘Hate Crime Laws: Final Report’, Law Com No 402, HC 942 [accessed 9 December 2021]

284 Draft Online Safety Bill, CP 405, May 2021, Clause 41(9)

285 Written evidence from Gavin Millar QC (OSB0221); ‘Satisfied so that you are sure’ in England and Wales; ‘Beyond reasonable doubt’ in Scotland and Northern Ireland

286 Written evidence from Gavin Millar QC (OSB0221)

287 For example, written evidence from: Big Brother Watch (OSB0136); Global Partners Digital (OSB0194)

290 Written evidence from Legal to Say, Legal to Type (OSB0049)

291 For example, Q 69; Written evidence from: Sara Khan (Former Lead Commissioner at Commission for Countering Extremism); Sir Mark Rowley (Former Assistant Commissioner (2014–2018) at Metropolitan Police Service) (OSB0034); Centenary Action Group, Glitch, Antisemitism Policy Trust, Stonewall, Women’s Aid, Compassion in Politics, End Violence Against Women Coalition, Imkaan, Inclusion London, The Traveller Movement (OSB0047)

292 Equality Act 2010, section 5

293 Equality Act 2010, section 6

294 Equality Act 2010, section 7

295 Equality Act 2010, section 8

296 Equality Act 2010, section 9

297 Equality Act 2010, section 10

298 Equality Act 2010, section 11

299 Equality Act 2010, section 12

300 Equality Act 2010, part 5 and section 29

303 Written evidence from Centenary Action Group, Glitch, Antisemitism Policy Trust, Stonewall, Women’s Aid, Compassion in Politics, End Violence Against Women Coalition, Imkaan, Inclusion London, The Traveller Movement (OSB0047)

304 Written evidence from Centenary Action Group, Glitch, Antisemitism Policy Trust, Stonewall, Women’s Aid, Compassion in Politics, End Violence Against Women Coalition, Imkaan, Inclusion London, The Traveller Movement (OSB0047)

305 Written evidence from Advisory Committee For Scotland (OSB0067)

306 Among others Q 55; Written evidence from: Mumsnet (OSB0031); Bumble Inc. (OSB0055); Centenary Action Group, Glitch, Antisemitism Policy Trust, Stonewall, Women’s Aid, Compassion in Politics, End Violence Against Women Coalition, Imkaan, Inclusion London, The Traveller Movement (OSB0047); Advisory Committee For Scotland (OSB0067); HOPE not hate (OSB0048); Dr Kim Barker (Senior Lecturer in Law at Open University); Dr Olga Jurasz (Senior Lecturer in Law at Open University) (OSB0071)

307 Written evidence from Professor Clare McGlynn (Professor of Law at Durham University) (OSB0014)

308 Written evidence from HOPE not hate (OSB0048); Centenary Action Group, Glitch, Antisemitism Policy Trust, Stonewall, Women’s Aid, Compassion in Politics, End Violence Against Women Coalition, Imkaan, Inclusion London, The Traveller Movement (OSB0047)

312 Public Order Act 1986 ss.

313 Law Commission, ‘Hate Crime Laws: Final Report’: Law Com No 402, HC 942 [accessed 9 December 2021]

314 The Law Commission, ‘Modernising Communications Offences: A Final Report’: Law Com No 399, HC 547 [accessed 22 November 2021]

315 Ofsted, ‘Research and Analysis: Review of sexual abuse in schools and colleges’: www.gov.uk/government/publications/review-of-sexual-abuse-in-schools-and-colleges/review-of-sexual-abuse-in-schools-and-colleges [accessed 10 December 2021]

316 The Law Commission, ‘Modernising Communications Offences: A Final Report’, para 6.133: Law Com No 399, HC 547 [accessed 22 November 2021]

317 Q 73; Ofsted, ‘Research and Analysis: Review of sexual abuse in schools and colleges’: www.gov.uk/government/publications/review-of-sexual-abuse-in-schools-and-colleges/review-of-sexual-abuse-in-schools-and-colleges [accessed 10 December 2021]

318 Q 73; for an alternative approach to the offence see written evidence from Professor Clare McGlynn (Professor of Law at Durham University)

319 Written evidence from: HOPE not hate (OSB0048); Sara Khan (Former Lead Commissioner at Commission for Countering Extremism); Sir Mark Rowley (Former Assistant Commissioner (2014–2018) at Metropolitan Police Service) (OSB0034)

320 Written evidence from British Horseracing Authority (OSB0061)

321 Draft Online Safety Bill, CP 405, May 2021, Clause 11

322 Draft Online Safety Bill, CP 405, May 2021, Clause 29(3)

324 Written evidence from Department for Digital, Culture, Media and Sport and Home Office (OSB0011)

325 Draft Online Safety Bill, CP 405, May 2021, Clause 11(2)(a)

326 Draft Online Safety Bill, CP 405, May 2021, Clause 46(3)

327 Draft Online Safety Bill, CP 405, May 2021, Clause 11(2)(b)

328 Department for Digital, Culture, Media and Sport, and the Home Office, ‘Memorandum to the Delegated Powers and Regulatory Reform Committee’, para 158: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/985030/Delegated_Powers_Memorandum_Web_Accessible.pdf [accessed 9 December 2021]

329 Draft Online Safety Bill, CP 405, May 2021, Clause 42(6)

330 Department for Digital, Culture, Media and Sport, and the Home Office, ‘Memorandum to the Delegated Powers and Regulatory Reform Committee’, para 161–162: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/985030/Delegated_Powers_Memorandum_Web_Accessible.pdf [accessed 9 December 2021]

334 Written evidence from British & Irish Law, Education & Technology Association (OSB0073)

335 Written evidence from Dr Francesca Sobande (Lecturer in Digital Media Studies at Cardiff University) (OSB0144)

338 Communications and Digital Committee, Free for all? Freedom of expression in the digital age (1st Report, Session 2021–22, HL Paper 54), para 182

339 The Law Commission, ‘Modernising Communications Offences: A Final Report’: Law Com No 399, HC 547 [accessed 22 November 2021]

340 Written evidence from Department for Digital, Culture, Media and Sport and Home Office (OSB0011)

343 Written evidence from BBFC (OSB0006)

344 Age, disability, gender reassignment, marriage and civil partnership, pregnancy and maternity, race, religion or belief, sex, sexual orientation and transgender status.

349 ‘Facebook Says Its Rules Apply to All. Company Documents Reveal a Secret Elite That’s Exempt’, The Wall Street Journal (13 September 2021): https://www.wsj.com/articles/facebook-files-xcheck-zuckerberg-elite-rules-11631541353 [accessed 30 November 2021]

351 Oral evidence taken before the Work and Pensions Committee, 6 January 2021 (Session 2019–2021), Q 223 (Graeme Biggar)

352 Written evidence from Which? (OSB0115)

353 Written evidence from Money and Mental Health Policy Institute (OSB0036)

354 Written evidence from Paul Davis (Director of Fraud at TSB Bank Plc) (OSB0164)

355 See written evidence from: Keoghs LLP (OSB0003), Somerset Bridge Group Ltd (OSB0004), Quilter (OSB0024), the Association of British Insurers (OSB0079), and M&G PLC (OSB0176)

356 Q 110 (Martin Lewis)

357 Q 112 (Martin Lewis)

358 Written evidence from UK Finance (OSB0088)

359 Written evidence from Office of the City Remembrancer, City of London Corporation (OSB0148)

360 HC Deb, 12 May 2021, UIN HCWS12

361 Oral evidence taken before the Liaison Committee, 7 July 2021 (Session 2019–2021), Q 79 (The Prime Minister)

362 Draft Online Safety Bill, CP 405, May 2021, Clause 9(3)

363 See written evidence from: the Financial Conduct Authority (OSB0044), Office of the City Remembrancer, City of London Corporation (OSB0148), Association of British Insurers (OSB0079), Barclays Bank (OSB0106), Which? (OSB0115) and Money and Mental Health Policy Institute (OSB0036). TSB Bank do not call for fraud to be made “priority illegal content”, but do stress the importance of measures preventing fraud from appearing on platforms at all (OSB0164)

364 See written evidence from: the Competition and Markets Authority (OSB0160) and Q 119. The CMA interprets ‘due diligence’ in the Consumer Protection from Unfair Trading Regulations 2008 as placing a proactive duty on providers, but makes clear that this is its own interpretation and is subject to challenge.

367 Written evidence from UK Finance (OSB0088)




© Parliamentary copyright 2021