Draft Online Safety Bill

Chapter 5: Protection of Children

Definition of content harmful to children

196.As set out in Chapter 2, one of the key objectives of the draft legislation is to ensure a higher level of protection for children than for adults. The draft Bill’s definition of content that is harmful to children has three elements, set out in Table 2 below. The broad category of “undesignated” content harmful to children is set out in Clause 45. It covers content that the provider has reasonable grounds to believe may have, or indirectly have, “a significant adverse physical or psychological impact on a child of ordinary sensibilities.”368 The remainder of the clause requires providers to consider factors such as children’s characteristics, the means of dissemination of the content, the impact on children of different age groups, and so forth.369

Table 2: Categories of content harmful to children in the draft Bill

Category: Primary priority content
Defined by: Regulations made by the Secretary of State
Duty on provider: Use proportionate systems and processes to prevent children of any age from encountering it.

Category: Priority content
Defined by: Regulations made by the Secretary of State
Duty on provider: Use proportionate systems and processes to protect children in age groups judged to be at risk of harm from encountering it.

Category: Undesignated content
Defined by: Clause 45 (see above)
Duty on provider: Use proportionate systems and processes to protect children in age groups judged to be at risk of harm from encountering it, if such a risk of harm has been identified in the most recent children’s risk assessment.

197.The provisions relating to content that is harmful to children have been criticised by service providers and rights groups as requiring providers to make fine judgements about individual pieces of content. Some described the provisions as “overly broad”, expressing concern about a possible impact on educational material and children’s right to access information.370 There was also some criticism of the lack of clarity about what will be covered by the “primary priority” and “priority” content duties.371

198.As we set out above, we have heard arguments that it is important for the Bill to include specifics about the types of harm and content it will cover. At the same time, as set out in Chapters 2 and 3, we heard compelling evidence about the breadth of content and activity creating a risk of harm to which children are exposed, and concerns that too specific a definition will not keep pace with the changing online world. The draft Bill requires providers to have in place proportionate systems and processes to protect children from encountering such material, where the provider’s risk assessment has identified a risk of harm from their doing so. That could include effective content moderation; it could also include design features such as content warnings, safe search features, algorithmic tweaks, or age assurance or verification.372 There is already a direct precedent in regulation for a similar requirement: the Video Sharing Platform Regulations, which require providers to prevent children accessing material “that might impair the physical, mental or moral development of under 18s.”373

199.We heard that the definition of harm to children would benefit from being tightened. In particular, the Government has decided not to use the established formula of a “reasonable person”, instead adopting the relatively novel formula of a “child of ordinary sensibilities”. In devising their proposed reforms to the Communications Offences, the Law Commission rejected “universal standards” such as a reasonable person test for establishing harm to an individual. They took the view that there are too many characteristics that may be relevant to whether a communication is harmful for such a test to apply. As Mr Millar put it: “In reality there is no such person. Some people are more robust and resilient than others.”374

200.The draft Bill attempts to recognise this in Clause 10(4), which requires providers to assume that a person encountering content has characteristics, or is a member of a group, that might reasonably be expected to mean they are particularly affected by it. The intent here appears to be to cover, for example, targeted abuse on the basis of someone’s appearance. However, it could also be read as requiring providers to consider all possible audiences and act on behalf of the most vulnerable, even where that group might be very small, such as those with an uncommon phobia.

201.The test the Law Commission arrived at for their harm-based offence was “likely to cause harm to a likely audience”. We believe this is a better way of ensuring that service providers consider those who may be harmed or affected by content or activity on a platform than the “person of ordinary sensibilities” test in the draft Bill. Having a single test for a key category of illegal content and for regulated content and activity harmful to children reduces regulatory burden and improves consistency. Online providers generally have a good understanding of their audience. Where a platform allows users to target content at particular people, this test would require service providers to consider how the design of their systems might be used to create or mitigate harm.

202.Recognising the key objective of offering a higher level of protection for children than adults, we support the inclusion of a broad definition of content that is harmful to children. At the same time, we believe the definition should be tightened. We recommend that Clauses 10(3) to (8) be revised. Content and activity should be within this section if it is specified on the face of the Bill or in regulations, or if there is a reasonably foreseeable risk that it would be likely to cause significant physical or psychological distress to children who are likely to encounter it on the platform.

203.As with other duties, we recommend that key, known risks of harm to children are set out on the face of the Bill. We would expect these to include (but not be limited to): access to or promotion of age-inappropriate material such as pornography, gambling and violence; material that is instructive in or promotes self-harm, eating disorders or suicide; and features such as functionality that allows adults to make unsupervised contact with children who do not know them, endless scroll, visible popularity metrics, live location, and being added to groups without user permission.

204.We recognise the concerns that, without proper guidance, service providers might put disproportionate age assurance measures in place, impacting the rights of both children and adults. We recommend that Ofcom be required to develop a mandatory Code of Practice for complying with the safety duties in respect of children. In drawing up that Code, Ofcom should be required to have regard to the UN Convention on the Rights of the Child (in particular, General Comment No. 25 on children’s rights in relation to the digital environment), the Information Commissioner’s Office’s Age Appropriate Design Code, and children’s right to receive information under the ECHR.

Alignment with the Age Appropriate Design Code

205.The Age Appropriate Design Code (AADC) came into force in September 2021. It sets out 15 standards that online services need to follow to meet their obligations to protect children’s data online. The AADC covers all Information Society Services that collect personal data and are likely to be accessed by children.375

206.There are synergies and overlaps between the AADC and the draft Bill. Both cover social media platforms and search engines. Both aim to protect children online. Both include a test of whether a service is “likely to be accessed by children”. Given the importance of data to the design-driven risks of harm we identify in Chapter 2, the protections required for children’s data by the AADC are a key part of online safety. Our witnesses did not consider it a coincidence that Google and other service providers introduced a raft of new safety features aimed at children, including a widely welcomed decision to turn off auto-play by default on YouTube Kids, just a few weeks before the AADC came into force.376 TikTok and Snap Inc. both told the US Senate that they welcomed the AADC.377

207.The Information Commissioner’s Office (ICO) says that it applies the test in section 123 of the Data Protection Act 2018 on the balance of probabilities: is it more likely than not that children will access an Information Society Service?378 The test in the draft Bill is more complex. It must be possible for children to access all or part of a service, and “the child user condition” must be met in respect of that service or part of it. The “child user condition” is whether the service (or part of it) attracts, or is likely to attract, an undefined “significant number of child users”.379

208.Common Sense argued in their written evidence that the scope of the Online Safety Bill in respect of children should not be more restricted than that of the AADC, and that the “likely” test should be the same in both. They drew on the experience of the US Children’s Online Privacy Protection Act to argue that the provision in the draft Bill is too weak. They felt it gives too much scope for providers to argue that there is insufficient evidence of a “significant number” of child users to fall within the provisions relating to children, an argument Microsoft used in their written evidence.380 They also argued that the distinction would be problematic for businesses seeking to comply with two different standards.381

209.Greater alignment between the two sets of regulation may also be welcomed by providers and aid compliance. techUK, for example, were concerned about the burdens for smaller businesses of navigating two different sets of regulation. They called on the regulators to issue a joint statement to address any inconsistencies.382 Google and Microsoft, in their written evidence, urged us to consider how the two regimes could best be aligned.383

210.Ofcom told us that the scope of the Bill “did not need to be identical” for them to cooperate with the ICO but agreed with the principle of alignment: “What Elizabeth Denham [the Information Commissioner] and I would both like to do over the next few years is create one set of requirements that may be operated by two different regulatory systems and sets of legal powers, but that as far as possible are asking the same things.”384

211.We recommend that the “likely to be accessed by children” test in the draft Online Safety Bill should be the same as the test underpinning the Age Appropriate Design Code. This regulatory alignment would simplify compliance for businesses, whilst giving greater clarity to people who use these services and greater protection to children. We agree that the Information Commissioner’s Office and Ofcom should issue a Joint Statement on how the two regulatory systems will interact once the Online Safety Bill has been introduced. They should be given powers to cooperate on shared investigations, with appropriate oversight.

Beyond user-to-user and search

212.The draft Bill covers user-to-user services and search services.385 It includes messaging apps (such as Facebook Messenger, WhatsApp and Telegram) but excludes “one to one aural communication”, SMS messages and email, as well as other types of content, discussed later.386

213.One of the major themes in the evidence we received was the potentially limiting nature of the focus on user-to-user and search. For example, Schillings LLP raised the role of file transfer services, webhosts and “cyber lockers”.387 On the other hand, Ofcom and the Secretary of State, as well as some industry bodies whose members fall on both sides of the draft Bill’s scope, all made the point that the Bill’s scope could not be unlimited and still be effective.388

214.At the same time, children’s rights groups raised concerns about services likely to be accessed by children that fall outwith the scope of the draft Bill but nonetheless contain content and activity that creates a risk of harm to children. As John Carr OBE, Secretary at the Children’s Charities’ Coalition on Internet Safety, put it: “What counts is not the nature of the platform or the environment but the nature of the likely harm irrespective of how or where it appears on a child’s screen or how or by whom it was put there.”389 App stores that misadvertise age restrictions and allow easy access to age-restricted apps were one example raised by 5Rights.390 Many organisations repeatedly raised the issue of commercial pornography.391

Pornography

215.Professor Sonia Livingstone, Department of Media and Communications at LSE, described pornography on sites that do not host user-to-user content, and are therefore not covered by the draft Bill, as the “number one concern of children, and indeed many adults”.392 The evidence we received would appear to support this. Submissions from media safety groups, children’s rights groups, campaigners on violence against women and girls, and others strongly supported either bringing such material within the scope of the Bill or implementing the never-commenced Part 3 of the Digital Economy Act 2017, which the draft Bill would repeal.393

216.The day after launching the draft Bill, the then-Secretary of State told the DCMS Select Committee: “On the issue of commercial pornography, the biggest risk is kids stumbling across it but there is a greater risk from social media and user-generated content … I believe that the preponderance of commercial pornography sites have user-generated content on them, so most of them will be in scope.”394

217.Most of those who submitted evidence accepted that the draft Bill would capture the largest sites hosting free pornography, as most host user-to-user content as defined in the draft Bill. However, there was widespread concern in the evidence cited above that such sites could evade the draft Bill by removing that functionality, as Pornhub largely did in December 2020 following criticism of its hosting of illegal content, or that children would simply move to sites not covered by the draft Bill.395 As the Centre to End All Sexual Exploitation told us, the pornography site XVideos received a boost in traffic after Pornhub introduced safeguarding reforms. They and the British Board of Film Classification stressed that standards had to be universal and strictly enforced to be effective; otherwise, they would simply divert traffic away from the sites that do introduce age assurance or verification.396

218.We set out briefly in Chapter 2 some of the compelling and disturbing evidence we received concerning children’s access to pornography and the impact it has on them. It is worth recounting here the evidence of Professor McGlynn and Dr Elly Hanson, Clinical Psychologist, that the largest pornographic sites are immediately accessible and have no age verification at all, and that their “landing pages” auto-play videos with themes around violence against women, lack of consent (including rape of sleeping women) and sex between step-relatives. Even when bondage, domination, submission and masochism (BDSM) content was excluded, one in eight videos on the homepages of XVideos, Pornhub and XHamster depicted sexual violence or non-consensual conduct, including unconscious women and girls being raped and footage from “spy cams”. All three are among the UK’s top 25 most visited websites.397

219.Many of the submissions we received called for specific provisions in the Bill for pornographic sites or for the revival of the provisions of the Digital Economy Act 2017. 5Rights in their evidence called for a broader approach, aligning the Bill with the AADC by including “services likely to be accessed by children” as a third category of regulated service under Clause 3(2). They identified this as their priority change to the draft Bill, seeing it as placing the onus on pornography sites, app stores and other services without user-to-user content either to ensure they are not “likely to be accessed by children” or to comply with the child safety duties of the draft Bill.398 Either way, pornography sites would have a legal obligation to prevent children from accessing their content.

220.We asked Ofcom whether extending the scope of the Bill to match that of the AADC would avoid the risk that the Bill might be brought into disrepute by pornographic sites escaping regulation by removing user-to-user content. Dame Melanie agreed that it would achieve the aim and acknowledged the risk, whilst stressing the concerns about scope outlined above.399 The Secretary of State disagreed with such a change, seeing pornography as a separate issue and telling us: “We need to keep the scope of the Bill very tight in order to keep it watertight and effective, so that it works. This is not the Bill to fix all online problems and harms. It is important to say that. This Bill is not to fix the internet. This Bill is solely aimed at platforms that we know do harm to children.”400

221.Easy, often unwitting or unintended, access by children to pornography was one of the largest online concerns raised with us during our scrutiny of the draft Bill. It is evident to us that the credibility of the Bill will be undermined if the largest online pornography providers simply remove user-to-user elements from their sites and continue showing extreme content and content that creates a risk of harm to children.

222.Whilst there is a case for specific provisions in the Bill relating to pornography, we feel there is more to be gained by further aligning the Bill with the Age Appropriate Design Code. Whilst we understand the concerns over scope and regulatory burden, this provision would only bring within the scope of the Bill services already covered by the Age Appropriate Design Code. Both regulatory systems are risk-based and require the regulator to act proportionately. This step would address the specific concern around pornography, requiring all such sites to demonstrate that they have taken appropriate steps to prevent children from accessing their content. It would also bring other sites or services that create a risk of harm into scope, whilst bringing us closer to the goal of aligned online regulation across data protection and online safety. We believe that our proposal on expanding the role of risk profiles, discussed later in this report, will be key to ensuring that the Bill’s provisions impact the riskiest services and do not fall disproportionately on those at lower risk.

223.All statutory requirements on user-to-user services, for both adults and children, should also apply to Information Society Services likely to be accessed by children, as defined by the Age Appropriate Design Code. This would have many advantages. In particular, it would ensure that all pornographic websites have to prevent children from accessing their content. Many such online services present a threat to children both by allowing them access and by hosting illegal videos of extreme content.

Age assurance and verification

224.We discuss age assurance above as one of the possible ways that companies can mitigate the risks to children of accessing unsuitable services. This is a fast-growing area, with new technological methods being identified, including some that use AI, for example in facial analysis.

225.The impact of some of the information, behaviours and pressures that exist for children online is well-documented.401 Jim Steyer, CEO of Common Sense Media, told us: “the psychological impacts on young people—on children and teens—are discernibly more important and significant because their brains are still developing.” He went on to describe research conducted by Common Sense Media about the negative impacts of some social media services on their “self-esteem, their sense of anxiety and depression, their body image, and their sense that their body images and their overall existence, if you will, does not measure up to the idealised images they can see from influencers on platforms like Instagram.”402 Professor Jonathan Haidt, Ethical Leadership at New York University Stern School of Business, linked this to a “sudden trend” in the US between 2015 and 2021 when the “pre-teen suicide rate in the USA is more than double in those couple of years … Something terrible and huge is happening.”403

226.Children’s online experiences are not limited to products and services that are intended for them. Ms Wick said: “The minimum age of use for most social media platforms is 13, but we know that a huge number of under-13s use these platforms. If the companies recognised this, their user base would drop quite significantly.”404 The Office of the Children’s Commissioner noted: “Overall, we received little clarity from the companies on how many children they estimated to be using their services, and how many underage users were being identified, although there were notable exceptions to this.”405

Role of the draft Bill: minimum standards

227.At the moment there is no single regulatory code in the UK that sets out rules for age assurance. There is the Age Appropriate Design Code, which sets out rules for data protection. There are the Video Sharing Platform Regulations, which require service providers to protect “under-18s from videos containing pornography, extreme content and other material which might impair their physical, mental or moral development.”406 Finally, there are the provisions of Part 3 of the Digital Economy Act 2017, which would have required mandatory age verification for commercial pornography sites but which have never been brought into force. The draft Bill repeals both these provisions and the Video Sharing Platform Regulations.

228.Ms Wick told us that these measures had pushed forward the development of age assurance. In her view, the role of the draft Bill was “… to set the expectations for companies and establish rules of the road. We need Ofcom to produce minimum standards on very basic things such as … age assurance. There should be a requirement in the Bill for Ofcom to do that, which will then establish the floor of protection.”407 Summarising the outcome of our roundtable, Professor Lee Edwards, Strategic Communications and Public Engagement at LSE, noted:

“There was broad consensus among participants about the risks of platforms being granted too much leeway regarding risk assessments and age verification. Without clear guidelines on the powers of Ofcom, there is a danger of companies setting their own standards. Likewise, a strict separation is necessary between verification providers, their clients, and advertisers.” 408

Privacy

229.On the other hand, numerous witnesses expressed concern about the impact of age assurance on privacy. Big Brother Watch feared that “ … mandating age verification … would be hugely damaging to privacy rights online.” 409 Demos wrote specifically of children’s right to privacy and their concern that:

“Although there are many third-party identity providers, it is likely that this market would be instantly captured by the large tech companies who already facilitate identity provision across platforms, such as Facebook and Google. This would further consolidate their market power and their control of and ability to use and monetise people’s personal data.” 410

230.Internet Matters highlighted concerns of inadvertent exclusion by age assurance processes: “Some vulnerable groups are at risk of being unable to access content they are entitled to, for example, some care experienced young people will not have easy access to an acceptable form of official ID required for age verification. Some young people will also be unable to comply with age assurance mechanisms for physical or cognitive reasons.” 411

231.Others recognised concerns relating to privacy but did not feel they were wholly justified. 5Rights explained: “people are rightly concerned about privacy implications, as are we when it comes to children’s privacy, but age assurance is not the same thing as identification, and you can establish a user’s age without knowing anything else about them. The technology exists. What is missing is the governance around it.” 412

Independent age assurance sector

232.In the report of our roundtable discussion held on 27 October 2021, Professor Lee Edwards noted:

“A range of standards and technology options exist for age verification. These include hard verification via identity documents, facial recognition systems, or age estimation. While all technologies come with trade-offs, the significance of finding privacy-preserving solutions was highlighted.” 413

233.We heard that the developing age assurance sector was willing to follow minimum standards set by government.414 For example, the Age Verification Providers Association recommended: “an independent privacy-protecting standards based, open competitive and interoperable age verification sector as a foundation for a safer internet for children.” In their written evidence, Yoti recommended: “standards based and independently accredited approaches to age verification and identity verification for social media registration.” 415 Yet some element of compulsion or regulation is likely to be required. The Office of the Children’s Commissioner said:

“Many of the tech companies recognised the need to implement or improve their age assurance systems in order to more effectively enforce their minimum ages. However, some indicated that they did not feel that the issue was a problem for their service, or that the design of the service meant implementing age assurance was difficult, perhaps impossible.”416

Age Assurance (Minimum Standards) Bill

234.The Age Assurance (Minimum Standards) Bill, a Private Member’s Bill, was introduced in the House of Lords on 27 May 2021 and had its second reading on 19 November 2021.417 If enacted, the Bill would require age assurance systems for online or digital services or products used by consumers or operated in the UK to meet certain minimum standards. It would require Ofcom to publish those minimum standards within six months of the Bill’s passing. During the Bill’s second reading, the Parliamentary Under-Secretary of State at DCMS explained that, while sharing its aims, the Government’s view was that the draft Online Safety Bill was the better route by which they could be met, with the regulator including “steps on age assurance in its regulatory codes, as part of which Ofcom can include specific standards and name them.”418

235.There is currently no single regulatory or statutory code in the UK that sets out rules for age assurance. We believe that existing codes, and the duties outlined in the draft Bill, cannot be implemented properly without a statutory system of regulation for age assurance that is trusted, effective and privacy-preserving. We believe that an independent, privacy-protecting age assurance sector, operating to minimum standards appropriate to different methods of age assurance in different circumstances, is key to any system that aims to protect children from harm online. Such a system:

a)should apply to independent commercial providers as well as to systems built by the service providers themselves;

b)should impose standards appropriate to the content and the age of the user, and be compatible with existing law, including international treaties such as the UN Convention on the Rights of the Child, to provide the necessary protections for privacy and data protection; and

c)should provide a route of redress for users to challenge specific conclusions reached on age.

A binding Code of Practice would give service providers whose risk assessments identify their content as likely to be accessed by children a clear basis for putting in place mitigations in the form of a rigorous system of age assurance.

236.We recommend that the Bill require Ofcom to establish minimum standards for age assurance technology and governance, linked to risk profiles, to ensure that third-party and provider-designed age assurance technologies are privacy-enhancing and rights-protecting, and that providers commissioning such services are restricted in the data they can request. Ofcom should also require service providers to demonstrate how they monitor the effectiveness of these systems to ensure that they meet the minimum standards required.

237.The Government should ask Ofcom to prioritise the development of a mandatory age assurance technology and governance code ahead of the Bill becoming law and, in doing so, to set out risk profiles so that the use of such systems is clearly proportionate to the risk. The code must bear in mind that children have rights to freedom of association, participation and information, as well as the right to protection. We expect this code to be in place within three to six months of the Bill receiving Royal Assent.


368 Draft Online Safety Bill, CP 405, May 2021, Clause 10(3)

369 Draft Online Safety Bill, CP 405, May 2021, Clause 10(5–8)

370 Written evidence from: Google (OSB0175); TikTok (OSB0181); Global Partners Digital (OSB0194); Microsoft (OSB0076); Wikimedia UK (OSB0169)

371 Written evidence from Care (OSB0085)

372 Age assurance refers to any system of age checking and estimation. The Age Verification Providers Association (AVPA) makes the distinction between “age assurance” and “age verification”: age assurance is a broad term for different methods of discerning the age or age-range of an online user; age verification is a subset of that with more stringent methods and a higher level of accuracy and confidence in the age or age-range of that user.

373 Ofcom, Video Sharing Platform Guidance, (6 October 2021): https://www.ofcom.org.uk/__data/assets/pdf_file/0015/226302/vsp-harms-guidance.pdf [accessed 30 November 2021]

374 Written evidence from Gavin Millar QC (OSB0221)

375 Information Commissioner’s Office (ICO), ‘Introduction to the Age Appropriate Design Code’: https://ico.org.uk/for-organisations/guide-to-data-protection/ico-codes-of-practice/age-appropriate-design-code/. [accessed 18 November 2021]. ISSs are defined as “any service normally provided for remuneration, at a distance, by electronic means and at the individual request of a recipient of services.”

376 Q 53 (Ian Russell); Q 62 (Izzy Wick); Google, ‘Giving kids and teens a safer experience online’: https://blog.google/intl/en-in/company-news/technology/giving-kids-and-teens-safer-experience-online/ [accessed 18 November 2021];

377 U.S. Senate Committee on Commerce, Science and Transportation: Hearings, ‘Protecting Kids Online: Snapchat, TikTok and YouTube’: https://www.commerce.senate.gov/2021/10/protecting-kids-online-snapchat-tiktok-and-youtube [accessed 9 December 2021]

379 Draft Online Safety Bill, CP 405, May 2021, Clause 26

380 Written evidence from Microsoft (OSB0076)

381 Written evidence from Common Sense (OSB0018)

382 Written evidence from techUK (OSB0098)

383 Written evidence from: Google (OSB0175); Microsoft (OSB0076)

384 Q 259 (Dame Melanie Dawes)

385 Draft Online Safety Bill, CP 405, May 2021, Clause 2

386 Draft Online Safety Bill, CP 405, May 2021, Clause 39(2)

387 Written evidence from Schillings International LLP (OSB0183)

388 Q 284 (Rt Hon Nadine Dorries MP); Q 259 (Dame Melanie Dawes); see, for example, written evidence from UK Interactive Entertainment (OSB0080)

389 Written evidence from Mr John Carr (Secretary at Children’s Charities’ Coalition on Internet Safety) (OSB0167)

390 5Rights Foundation, Systemic breaches to the Age Appropriate Design Code: https://5rightsfoundation.com/uploads/Letter_5RightsFoundation-BreachesoftheAgeAppropriateDesignCode.pdf [accessed 9 December 2021]

391 Q 62 (Izzy Wick)

393 For example, Written evidence from: BBFC (OSB0006); Professor Clare McGlynn (Professor of Law at Durham University) (OSB0014); Barnardo’s (OSB0017); Office of the Children’s Commissioner (OSB0019); The Naked Truth Project (NTP) (OSB0023); All-Party Parliamentary Group on Commercial Sexual Exploitation (OSB0037); COST Action CA16207 - European Network for Problematic Usage of the Internet (OSB0038); Advisory Committee For Scotland (OSB0067); Dr Elly Hanson (Clinical Psychologist & researcher at I am independent) (OSB0078); Care (OSB0085); CEASE (Centre to End All Sexual Exploitation) (OSB0104); The Age Verification Providers Association (OSB0122); Parent Zone (OSB0124); Baroness Floella Benjamin, DBE (OSB0161); BT Group (OSB0163); Independent Schools Council (OSB0187); Dame Margaret Hodge (Member of Parliament for Barking and Dagenham at House of Commons) (OSB0201); NSPCC (OSB0109)

394 Oral evidence taken before the Digital, Culture, Media and Sport Committee, 13 May 2021 (Session 2021–22), QQ 26 and 27 (Rt Hon Oliver Dowden MP)

395 For example, Written evidence from: Care (OSB0085); CEASE (Centre to End All Sexual Exploitation) (OSB0104); BBC News, ‘Pornhub removes all user-uploaded videos amid legality row’: https://www.bbc.co.uk/news/technology-55304115 [accessed 30 November 2021]; Written evidence from BBFC (OSB0006)

396 Written evidence from: CEASE (Centre to End All Sexual Exploitation) (OSB0104); BBFC (OSB0006)

397 Written evidence from: Professor Clare McGlynn (Professor of Law at Durham University) (OSB0014); Dr Elly Hanson (Clinical Psychologist & researcher) (OSB0078); CEASE (Centre to End All Sexual Exploitation) (OSB0104)

398 Q 62 (Izzy Wick)

399 Q 259 (Dame Melanie Dawes)

400 Q 284 (Rt Hon Nadine Dorries MP)

401 See para 16 of this report.

405 Written evidence from Office of the Children’s Commissioner (OSB0019)

406 Ofcom, ‘Video-sharing platform regulation’: https://www.ofcom.org.uk/online-safety/information-for-industry/vsp-regulation [accessed 15 November 2021]

408 Written evidence LSE Department of Media and Communications—Anonymity & Age Verification Roundtable (OSB0236)

409 Written evidence from Big Brother Watch (OSB0136)

410 Written evidence from Demos (OSB0159)

411 Written evidence from Internet Matters (OSB0103)

413 Written evidence from LSE Department of Media and Communications—Anonymity & Age Verification Roundtable (OSB0236)

414 Written evidence from: The Age Verification Providers Association (OSB0122); Match Group (OSB0053)

415 Written evidence from Yoti (OSB0130)

416 Written evidence from Office of the Children’s Commissioner (OSB0019)

418 HL Deb, 19 November 2021, col 538



