Draft Online Safety Bill

Conclusions and recommendations

Chapter 2: Objectives of the Online Safety Bill

1.Self-regulation of online platforms has failed. Our recommendations will strengthen the Bill so that it can pass successfully into legislation. To achieve success, the Bill must be clear from the beginning about its objectives. These objectives must reflect the nature of the harm experienced online and the values of UK society. Online services are not neutral repositories for information. Most are advertising businesses. Service providers in scope of the Bill must be held liable for failure to take reasonable steps to combat reasonably foreseeable harm resulting from the operation of their services. (Paragraph 51)

2.We recommend the Bill is restructured. It should set out its core objectives clearly at the beginning. This will ensure clarity to users and regulators about what the Bill is trying to achieve and inform the detailed duties set out later in the legislation. These objectives should be that Ofcom should aim to improve online safety for UK citizens by ensuring that service providers:

a)comply with UK law and do not endanger public health or national security;

b)provide a higher level of protection for children than for adults;

c)identify and mitigate the risk of reasonably foreseeable harm arising from the operation and design of their platforms;

d)recognise and respond to the disproportionate level of harms experienced by people on the basis of protected characteristics;

e)apply the overarching principle that systems should be safe by design whilst complying with the Bill;

f)safeguard freedom of expression and privacy; and

g)operate with transparency and accountability in respect of online safety. (Paragraph 52)

3.The draft Bill creates an entirely new regulatory structure and deals with difficult issues around rights and safety. In seeking to regulate large multinational companies with the resources to undertake legal challenges, it has to be comprehensive and robust. At the same time, a common theme in the evidence we received is that the draft Bill is too complex, and this may harm public acceptance and make it harder for those service providers who are willing to comply to do so. (Paragraph 59)

4.We recommend that the Bill be restructured to contain a clear statement of its core safety objectives—as recommended in paragraph 52. Everything flows from these: the requirement for Ofcom to meet those objectives, its power to produce mandatory codes of practice and minimum quality standards for risk assessments in order to do so, and the requirements on service providers to address and mitigate reasonably foreseeable risks, follow those codes of practice and meet those minimum standards. Together, these measures amount to a robust framework of enforceable measures that can leave no doubt that the intentions of the Bill will be secured. (Paragraph 60)

5.We believe there is a need to clarify that providers are required to comply with all mandatory Codes of Practice as well as the requirement to include reasonably foreseeable risks in their risk assessments. Combined with the requirements for system design we discuss in the next chapter, these measures will ensure that regulated services continue to comply with the overall objectives of the Bill—and that the Regulator is afforded maximum flexibility to respond to a rapidly changing online world. (Paragraph 61)

Chapter 3: Societal harm and the role of platform design

6.We recommend that references to harmful “content” in the Bill should be amended to “regulated content and activity”. This would better reflect the range of online risks people face and cover new forms of interaction that may emerge as technology advances. It also better reflects the fact that online safety is not just about moderating content. It is also about the design of platforms and the ways people interact with content and features on services and with one another online. (Paragraph 68)

7.We heard throughout our inquiry that there are design features specific to online services that create and exacerbate risks of harm. Those risks are always present, regardless of the content involved, but only materialise when the content concerned is harmful. For example, the same system that allows a joke to go viral in a matter of minutes also does the same for disinformation about drinking bleach as a cure for COVID-19. An algorithm that constantly recommends pictures of cats to a cat-lover is the same algorithm that might constantly recommend pictures of self-harm to a vulnerable teenager. Tackling these design risks is more effective than just trying to take down individual pieces of content (though that is necessary in the worst cases). Online services should be identifying these design risks and putting in place systems and processes to mitigate them before people are harmed. The Bill should recognise this. Where online services are not tackling these design risks, the regulator should be able to take that into account in enforcement action. (Paragraph 81)

8.We recommend that the Bill includes a specific responsibility on service providers to have in place systems and processes to identify reasonably foreseeable risks of harm arising from the design of their platforms and take proportionate steps to mitigate those risks of harm. The Bill should set out a non-exhaustive list of design features and risks associated with them to provide clarity to service providers and the regulator which could be amended by Parliament in response to the development of new technologies. Ofcom should be required to produce a mandatory Safety by Design Code of Practice, setting out the steps providers will need to take to properly consider and mitigate these risks. We envisage that the risks, features and mitigations might include (but not be limited to):

a)Risks created by algorithms that create “rabbit holes”, with possible mitigations including transparent information about the nature of recommendation algorithms and user control over the priorities they set, measures to introduce diversity of content and approach into recommendations, and options to allow people to deactivate recommendations from users they have not chosen to engage with;

b)Risks created by auto-playing content, mitigated through limits on auto-play and auto-recommendation;

c)Risks created by frictionless cross-platform activity, with mitigations including warnings before following a link to another platform and ensuring consistent minimum standards for age assurance;

d)Risks created through data collection and the microtargeting of adverts, mitigated through minimum requirements for transparency around the placement and content of such adverts;

e)Risks created by virality and the frictionless sharing of content at scale, mitigated by measures to create friction, slow down sharing whilst viral content is moderated, require active moderation in groups over a certain size, limit the number of times content can be shared on a “one click” basis (especially on encrypted platforms), and have in place special arrangements during periods of heightened risk (such as elections, major sporting events or terrorist attacks), as illustrated in the sketch that follows this list; and

f)Risks created by default settings on geolocation, photo identification/sharing and other functionality leading to victims of domestic violence or VAWG being locatable by their abusers, mitigated through default strong privacy settings and accessible guidance to victims of abuse on how to secure their devices and online services. (Paragraph 82)
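The sketch below is a purely illustrative example, in Python, of how a service might apply the “one click” re-share limit and heightened-risk arrangements envisaged in item (e). The thresholds, field names and return values are assumptions made for illustration; neither the draft Bill nor this report prescribes any particular implementation.

```python
# Illustrative sketch only: one way a service might implement the "one click"
# re-share limit described in item (e). All thresholds and names are
# hypothetical; the Bill does not prescribe an implementation.

from dataclasses import dataclass

@dataclass
class ShareRequest:
    forward_depth: int        # how many times this content has already been re-shared
    group_size: int           # size of the destination group, 0 for direct messages
    heightened_risk: bool     # e.g. during an election or after a terrorist attack

MAX_ONE_CLICK_FORWARDS = 5    # assumed limit before extra friction applies
LARGE_GROUP_THRESHOLD = 150   # assumed size above which active moderation is expected

def sharing_decision(req: ShareRequest) -> str:
    """Return how the platform should handle a one-click share attempt."""
    limit = MAX_ONE_CLICK_FORWARDS // 2 if req.heightened_risk else MAX_ONE_CLICK_FORWARDS
    if req.forward_depth >= limit:
        # Friction: the user must copy or comment rather than re-share in one click,
        # slowing viral spread while moderation catches up.
        return "require_manual_share"
    if req.group_size > LARGE_GROUP_THRESHOLD:
        # Large groups are expected to have active moderation before content lands.
        return "queue_for_group_moderation"
    return "allow"

if __name__ == "__main__":
    print(sharing_decision(ShareRequest(forward_depth=6, group_size=0, heightened_risk=False)))
    print(sharing_decision(ShareRequest(forward_depth=1, group_size=400, heightened_risk=True)))
```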

9.We recommend that the Bill include a requirement for service providers to co-operate to address cross-platform risks and on the regulator to facilitate such co-operation. (Paragraph 83)

10.Anonymous abuse online is a serious area of concern that the Bill needs to do more to address. The core safety objectives apply to anonymous accounts as much as identifiable ones. At the same time, anonymity and pseudonymity are crucial to online safety for marginalised groups, for whistleblowers, and for victims of domestic abuse and other forms of offline violence. Anonymity and pseudonymity themselves are not the problem and ending them would not be a proportionate response. The problems are a lack of traceability by law enforcement, the frictionless creation and disposal of accounts at scale, a lack of user control over the types of accounts they engage with and a failure of online platforms to deal comprehensively with abuse on their platforms. (Paragraph 91)

11.We recommend that platforms that allow anonymous and pseudonymous accounts should be required to include the resulting risks as a specific category in the risk assessment on safety by design. In particular, we would expect them to cover, where appropriate: the risk of regulated activity taking place on their platform without law enforcement being able to tie it to a perpetrator, the risk of ‘disposable’ accounts being created for the purpose of undertaking illegal or harmful activity, and the risk of increased online abuse due to the disinhibition effect. (Paragraph 92)

12.We recommend that Ofcom be required to include proportionate steps to mitigate these risks as part of the mandatory Code of Practice required to support the safety by design requirement we recommended in paragraph 82. It would be for them to decide what steps would be suitable for each of the risk profiles for online services. Options they could consider might include (but would not be limited to):

a)Design measures to rapidly identify patterns such as large quantities of identical content being posted from anonymous accounts, or large numbers of posts from anonymous accounts being directed at a single account (see the illustrative sketch after this list);

b)A clear governance process to ensure such patterns are quickly escalated to a human moderator and for swiftly resolving properly authorised requests from UK law enforcement for identifying information relating to suspected illegal activity conducted through the platform, within timescales agreed with the regulator;

c)A requirement for the largest and highest risk platforms to offer the choice of verified or unverified status and user options on how they interact with accounts in either category;

d)Measures to prevent individuals who have been previously banned or suspended for breaches of terms and conditions from creating new accounts; and

e)Measures to limit the speed with which new accounts can be created and achieve full functionality on the platform. (Paragraph 93)
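As an illustration of option (a), the following Python sketch shows one simple way a platform might detect bursts of identical content posted from anonymous accounts and flag them for escalation to a human moderator, as envisaged in option (b). The window size and threshold are assumed values, not standards set by Ofcom or this report.

```python
# Illustrative sketch only: flagging bursts of identical anonymous posts for
# human review. Window size and threshold are assumptions.

import hashlib
import time
from collections import defaultdict, deque
from typing import Deque, Dict, Optional

WINDOW_SECONDS = 600        # assumed 10-minute detection window
DUPLICATE_THRESHOLD = 50    # assumed count of identical anonymous posts that triggers escalation

class AnonymousPostMonitor:
    """Tracks identical content posted from anonymous accounts within a rolling window."""

    def __init__(self) -> None:
        self._timestamps: Dict[str, Deque[float]] = defaultdict(deque)

    def record_post(self, content: str, is_anonymous: bool, now: Optional[float] = None) -> bool:
        """Record a post and return True if the pattern should be escalated to a human moderator."""
        if not is_anonymous:
            return False
        now = time.time() if now is None else now
        digest = hashlib.sha256(content.encode("utf-8")).hexdigest()
        window = self._timestamps[digest]
        window.append(now)
        # Discard posts that have aged out of the detection window.
        while window and now - window[0] > WINDOW_SECONDS:
            window.popleft()
        return len(window) >= DUPLICATE_THRESHOLD

if __name__ == "__main__":
    monitor = AnonymousPostMonitor()
    escalate = False
    for i in range(60):
        escalate = monitor.record_post("identical spam message", is_anonymous=True, now=float(i))
    print("escalate to human moderator:", escalate)
```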

13.We recommend that the Code of Practice also sets out clear minimum standards to ensure identification processes used for verification protect people’s privacy—including from repressive regimes or those that outlaw homosexuality. These should be developed in conjunction with the Information Commissioner’s Office and following consultation with groups including representatives of the LGBTQ+ community, victims of domestic abuse, journalists, and freedom of expression organisations. Enforcement of people’s data privacy and data rights would remain with the Information Commissioner’s Office, with clarity on information sharing and responsibilities. (Paragraph 94)

14.We recognise the difficulties with legislating for societal harms in the abstract. At the same time, the draft Bill’s focus on individuals potentially means some content and activity that is illegal may not be regulated. We discuss this further in Chapter 4. (Paragraph 106)

15.The viral spread of misinformation and disinformation poses a serious threat to societies around the world. Media literacy is not a standalone solution. We have heard how small numbers of people are able to leverage online services’ functionality to spread disinformation virally and use recommendation tools to attract people to ever more extreme behaviour. This has resulted in large-scale harm, including deaths from COVID-19, from fake medical cures, and from violence. We recommend content-neutral safety by design requirements, set out as minimum standards in mandatory codes of practice. These will be a vital part of tackling regulated content and activity that creates a risk of societal harm, especially the spread of disinformation. For example, we heard that a simple change, introducing more friction into sharing on Facebook, would have the same effect on the spread of mis- and disinformation as the entire third-party fact-checking system. (Paragraph 107)

16.Later in this report we also recommend far greater transparency around system design, and particularly automated content recommendation. This will ensure that the regulator and researchers can see what the platforms are doing and assess its impact, and that users can make informed decisions about how they use platforms. Requiring online services to publish data on the most viral pieces of content on their platform would be a powerful transparency tool, as it would rapidly highlight platforms where misinformation and disinformation are drowning out other content. (Paragraph 108)
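To illustrate the kind of transparency data envisaged here, the sketch below (Python, for illustration only) aggregates share events into a ranked list of the most viral items. The input format and the use of a simple share count as the measure of virality are assumptions; the actual reporting requirements would be set by Ofcom.

```python
# Illustrative sketch only: producing "most viral content" transparency data.
# The virality measure (raw share count) and input format are assumptions.

from collections import Counter
from typing import Iterable, List, Tuple

def top_viral_content(share_events: Iterable[str], n: int = 10) -> List[Tuple[str, int]]:
    """Given a stream of content IDs (one per share event), return the n most-shared items."""
    counts = Counter(share_events)
    return counts.most_common(n)

if __name__ == "__main__":
    events = ["post-17", "post-3", "post-17", "post-42", "post-17", "post-3"]
    for content_id, shares in top_viral_content(events, n=3):
        print(f"{content_id}: {shares} shares")
```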

17.Many online services have terms and conditions about disinformation, though they are often inconsistently applied. We recommend later a statutory requirement on service providers to apply their terms and conditions consistently, and to produce a clear and concise online safety policy. Later, we identify two areas of disinformation—public health and election administration—which are or will soon be covered in the criminal law and that we believe should be tackled directly by the Bill. (Paragraph 109)

18.As a result of recommendations made in this report, regulation by Ofcom should reduce misinformation and disinformation by:

The Joint Committee that we recommend later in this report should take forward work to define and make recommendations on how to address other areas of disinformation and emerging threats. (Paragraph 110)

19.Disinformation and misinformation surrounding elections are a risk to democracy. Disinformation which aims to disrupt elections must be addressed by legislation. If the Government decides that the Online Safety Bill is not the appropriate place to do so, then it should use the Elections Bill which is currently making its way through Parliament. (Paragraph 111)

20.The Information Commissioner, Elizabeth Denham, has stated that the use of inferred data relating to users’ special characteristics as defined in data protection legislation, including data relating to sexual orientation and religious and political beliefs, would not be compliant with the law. This would include, for example, where a social media company has decided to allow users to be targeted with content based on data relating to their special characteristics without their knowledge or consent. Data profiling plays an important part in building audiences for disinformation, but also has legitimate and valuable uses. Ofcom should consult with the Information Commissioner’s Office to determine the best course of action to be taken to investigate this and make recommendations on its legality. (Paragraph 112)

Chapter 4: Safety duties relating to adults

21.We have received a large amount of evidence in our inquiry but very little of it takes issue with the regulation of illegal content. This seems to us to point to a self-evident truth, that regulation of illegal content online is relatively uncontroversial and should be the starting point of the Bill. (Paragraph 118)

22.We believe the scope of the Bill on illegal content is too dependent on the discretion of the Secretary of State. This downplays the fact that some content that creates a risk of harm online potentially amounts to criminal activity. The Government has said it is one of the key objectives of the Bill to remove this from the online world. (Paragraph 126)

23.We recommend that criminal offences which can be committed online appear on the face of the Bill as illegal content. This should include (but not be limited to) hate crime offences (including the offences of “stirring up” hatred), the offence of assisting or encouraging suicide, the new communications offences recommended by the Law Commission, offences relating to illegal, extreme pornography and, if agreed by Parliament, election material that is disinformation about election administration, has been funded by a foreign organisation targeting voters in the UK or fails to comply with the requirement to include information about the promoter of that material in the Elections Bill. (Paragraph 127)

24.Implementation of the Law Commission’s recommendations on reforming the Communications Offences and Hate Crime will allow the behaviour covered by the new offences to be deemed illegal content. We believe this is a significant enhancement of the protections in the Bill, both for users online and for freedom of expression, by introducing greater certainty as to the content that online users should be deterred from sharing. We discuss how to address concerns about ambiguity and the context-dependent nature of the proposed harm-based offence through a statutory public interest requirement in Chapter 7. (Paragraph 135)

25.We endorse the Law Commission’s recommendations for new criminal offences in its reports, Modernising Communications Offences and Hate Crime Laws. The reports recommend the creation of new offences in relation to cyberflashing, the encouragement of serious self-harm, sending flashing images to people with photo-sensitive epilepsy with intent to induce a seizure, sending knowingly false communications which intentionally cause non-trivial emotional, psychological, or physical harm, communications which contain threats of serious harm and stirring up hatred on the grounds of sex or gender, and disability. We welcome the Secretary of State’s intention to accept the Law Commission’s recommendations on the Communications Offences. The creation of these new offences is absolutely essential to the effective system of online safety regulation which we propose in this report. We recommend that the Government bring in the Law Commission’s proposed Communications and Hate Crime offences with the Online Safety Bill, if no faster legislative vehicle can be found. Specific concerns about the drafting of the offences can be addressed by Parliament during their passage. (Paragraph 136)

26.The Government must commit to providing the police and courts with adequate resources to tackle existing illegal content and any new offences which are introduced as a result of the Law Commission’s recommendations. (Paragraph 138)

27.We recommend that Ofcom be required to issue a binding Code of Practice to assist providers in identifying, reporting on and acting on illegal content, in addition to those on terrorism and child sexual exploitation and abuse content. As a public body, Ofcom’s Code of Practice will need to comply with human rights legislation (currently being reviewed by the Government) and this will provide an additional safeguard for freedom of expression in how providers fulfil this requirement. With this additional safeguard, and others we discuss elsewhere in this report, we consider that the test for illegal content in the Bill is compatible with an individual’s right to free speech, given providers are required to apply the test in a proportionate manner that is set out in clear and accessible terms to users of the service. (Paragraph 144)

28.We recommend that the highest risk service providers are required to archive and securely store all evidence of content removed from online publication for a set period of time, unless to do so would in itself be unlawful. In the latter case, they should store records of having removed the content, its nature and any referrals made to law enforcement or the appropriate body. (Paragraph 145)

29.We recommend that the Secretary of State’s power to designate content relating to an offence as priority illegal content should be constrained. Given that illegal content will in most cases already be defined by statute, this power should be restricted to exceptional circumstances, and only after consultation with the Joint Committee of Parliament that we recommend in Chapter 9, and implemented through the affirmative procedure. The Regulator should also be able to publish recommendations on the creation of new offences. We would expect the Government, in bringing forward future criminal offences, to consult with Ofcom and the Joint Committee as to whether they should be designated as priority illegal offences in the legislation that creates them. (Paragraph 148)

30.Clause 11 of the draft Bill has been widely criticised for its breadth and for delegating the authority of the state to service providers over the definition of content that is harmful and what they should do about it. We understand its aims and that the Government intended it primarily as a transparency measure over something companies are already doing. As drafted, however, it has profound implications for freedom of speech, is likely to be subject to legal challenge and yet may also allow companies to continue as they have been in failing to tackle online harm. (Paragraph 174)

31.We agree that the criminal law should be the starting point for regulation of potentially harmful online activity, and that safety by design is critical to reduce its prevalence and reach. At the same time, some of the key risks of harm identified in our evidence are legislated for in parts of the offline world, but not online, where the criminal law is recognised as needing reform, or where drafting that makes sense in the context of determining individual guilt would allow companies to challenge attempts to make them act. A law aimed at online safety that does not require companies to act on misogynistic abuse or stirring up hatred against disabled people, to give two examples, would not be credible. Leaving such abuse unregulated would itself be deeply damaging to freedom of speech online. (Paragraph 175)

32.We recommend that Clause 11 of the draft Bill is removed. We recommend that it is replaced by a statutory requirement on providers to have in place proportionate systems and processes to identify and mitigate reasonably foreseeable risks of harm arising from regulated activities defined under the Bill. These definitions should reference specific areas of law that are recognised in the offline world, or are specifically recognised as legitimate grounds for interference in freedom of expression. For example, we envisage it would include:

33.As with the other safety duties, we recommend that Ofcom be required to issue a mandatory code of practice to service providers on how they should comply with this duty. In doing so they must identify features and processes that facilitate sharing and spread of material in these named areas and set out clear expectations of mitigation and management strategies that will form part of their risk assessment, moderation processes and transparency requirements. While the code may be informed by particular events and content, it should be focused on the systems and processes of the regulated service that facilitates or promotes such activity rather than any individual piece of content. We envisage that this code would include (but not be limited to):

34.Accepting these recommendations would create a narrower, but stronger, regulatory requirement for service providers to identify and mitigate risks of harm in the online world that may not necessarily meet the criminal thresholds, but which are based on the same criteria as those thresholds, indicating that society has recognised they are legitimate reasons to interfere with freedom of speech rights. It would place these areas on the face of the Bill and remove the broad delegation of decisions on what is harmful from service providers. (Paragraph 178)

35.We recognise that the broad power to define new types of content that is harmful to adults in secondary legislation was a key concern with Clause 11. We accept that there will need to be the ability to amend what is covered by this proposal to ensure that the Bill is futureproofed. At the same time, it needs to be tightly circumscribed and subject to active parliamentary scrutiny and review. (Paragraph 179)

36.We recommend that additions to the list of content that is harmful should be by statutory instrument from the Secretary of State. The statutory instrument should be subject to approval by both Houses, following a report from the Joint Committee we propose in Chapter 9. Ofcom, when making recommendations, will be required by its existing legal obligations to consider proportionality and freedom of speech rights. The Joint Committee should be specifically asked to report on whether the proposed addition is a justified interference with freedom of speech rights. (Paragraph 180)

37.The original Clause 11 in the draft Bill, in common with the other safety duties, required providers to produce clear and accessible terms of service and enforce them consistently in relation to content harmful to adults. While we have recommended a narrower but stronger regulatory requirement for service providers to identify and mitigate risks of harm, the requirements for transparency, clarity and consistency are vital to ensuring users are well informed about how platforms promote content to them and what protections they can expect. Clear, concise and fully accessible terms will allow users to make informed choices. (Paragraph 183)

38.We recommend that the Bill mandates service providers to produce and publish an Online Safety Policy, which is referenced in their terms and conditions, made accessible for existing users and made prominent in the registration process for new users. This Online Safety Policy should: explain how content is promoted and recommended to users, remind users of the types of activity and content that can be illegal online and provide advice on what to do if targeted by content that may be criminal and/or in breach of the service providers’ terms and conditions and other related guidelines. (Paragraph 184)

39.The Online Safety Policy should be produced in an accessible way and should be sent to all users at the point of sign up and, as good practice suggests, at relevant future points. “Accessible” should include accessible to children (in line with the Children’s Code), where service providers allow child users, and accessible to people with additional needs, including physical and learning disabilities. Ofcom should produce a Code of Practice for service providers about producing accessible and compliant online safety policies and on how they should make them available for users to read at appropriate intervals in line with best practice (for example, when the user is about to undertake an activity for the first time or change a safety-relevant setting). (Paragraph 185)

40.We welcome the inclusion of fraud and scams within the draft Bill. Prevention must be prioritised and this requires platform operators to be proactive in stopping fraudulent material from appearing in the first instance, not simply removing it when reported. We recommend that clause 41(4) is amended to add “a fraud offence” under terrorism and child sexual exploitation and abuse offences and that related clauses are similarly introduced or amended so that companies are required to proactively address it. The Government should consult with the regulatory authorities on the appropriate offences to designate under this section. The Government should ensure that this does not compromise existing consumer protection regulation. (Paragraph 194)

41.The Bill must make clear that ultimate responsibility for taking action against criminal content remains with the relevant regulators and enforcement bodies, with Ofcom reporting systemic issues relating to platform design and operation—including in response to “super complaints” from other regulators. The Bill should contain provisions requiring information-sharing and regulatory cooperation to facilitate this. (Paragraph 195)

Chapter 5: Protection of Children

42.The test the Law Commission arrived at for their harm-based offence was “likely to cause harm to a likely audience”. We believe this is a better way of ensuring that service providers consider those who may be harmed or impacted by content or activity on a platform than the “person of ordinary sensibilities” test in the draft Bill. Having a single test for a key category of illegal content and for regulated content and activity harmful to children reduces regulatory burden and improves consistency. Online providers generally have a good understanding of their audience. Where their platform allows users to target content at particular people it would require service providers to consider how the design of their systems might be used to create or mitigate harm. (Paragraph 201)

43.Recognising the key objective of offering a higher level of protection for children than adults, we support the inclusion of a broad definition of content that is harmful to children. At the same time, we believe the definition should be tightened. We recommend that Clauses 10(3) to (8) are revised. Content and activity should be within this section if it is specified on the face of the Bill, in regulations, or if there is a reasonably foreseeable risk that it would be likely to cause significant physical or psychological distress to children who are likely to encounter it on the platform. (Paragraph 202)

44.As with other duties, we recommend that key, known risks of harm to children are set out on the face of the Bill. We would expect these to include (but not be limited to): access to or promotion of age-inappropriate material such as pornography, gambling and violence; material that is instructive in or promotes self-harm, eating disorders or suicide; and features such as functionality that allows adults to make unsupervised contact with children who do not know them, endless scroll, visible popularity metrics, live location, and being added to groups without user permission. (Paragraph 203)

45.We recognise the concerns that, without proper guidance, service providers might seek to place disproportionate age assurance measures in place, impacting the rights of both children and adults. We recommend that Ofcom be required to develop a mandatory Code of Practice for complying with the safety duties in respect of children. Ofcom should be required to have regard to the UN Convention on the Rights of the Child (in particular, General Comment No. 25 on children’s rights in relation to the digital environment), the Information Commissioner’s Office’s Age Appropriate Design Code, and children’s right to receive information under the ECHR when drawing up that Code. (Paragraph 204)

46.We recommend that the “likely to be accessed by children” test in the draft Online Safety Bill should be the same as the test underpinning the Age Appropriate Design Code. This regulatory alignment would simplify compliance for businesses, whilst giving greater clarity to people who use the service, and greater protection to children. We agree that the Information Commissioner’s Office and Ofcom should issue a Joint Statement on how the two regulatory systems will interact once the Online Safety Bill has been introduced. They should be given powers to cooperate on shared investigations, with appropriate oversight. (Paragraph 211)

47.Easy, often unwitting or unintended, access by children to pornography was one of the largest online concerns raised with us during our scrutiny of the draft Bill. It is evident to us that the credibility of the Bill will be undermined if the largest online pornography providers simply remove user-to-user elements from their sites and continue showing extreme content and content that creates a risk of harm to children. (Paragraph 221)

48.Whilst there is a case for specific provisions in the Bill relating to pornography, we feel there is more to be gained by further aligning the Bill with the Age Appropriate Design Code. Whilst we understand the concerns over scope and regulatory burden, this provision would only bring within the scope of the Bill services already covered by the scope of the Age Appropriate Design Code. Both regulatory systems are risk-based and require the regulator to act proportionately. This step would address the specific concern around pornography, requiring all such sites to demonstrate that they have taken appropriate steps to prevent children from accessing their content. It would also bring other sites or services that create a risk of harm into scope whilst bringing us closer to the goal of aligned online regulation across data protection and online safety. We believe that our proposal on expanding the role of risk profiles, discussed later in this report, will be key to ensuring that the Bill’s provisions impact the riskiest services and do not place a disproportionate burden on those at lower risk. (Paragraph 222)

49.All statutory requirements on user-to-user services, for both adults and children, should also apply to Information Society Services likely to be accessed by children, as defined by the Age Appropriate Design Code. This would have many advantages. In particular, it would ensure all pornographic websites would have to prevent children from accessing their content. Many such online services present a threat to children both by allowing them access and by hosting illegal videos of extreme content. (Paragraph 223)

50.There is currently no single regulatory or statutory code in the UK that sets out rules for age assurance. We believe that existing codes, and the duties outlined in the draft Bill, cannot be implemented properly without a statutory system of regulation of age assurance that is trusted, effective and privacy-preserving. We believe that an independent, privacy-protecting age assurance sector operating to a set of minimum standards appropriate for different methods of age assurance in different circumstances is key to any system that aims to protect children from harm online. Such a system:

a)should be for independent commercial providers as well as those built by the service providers themselves;

b)should impose standards appropriate to the content and age of the user and be compatible with existing law, including international treaties such as the UN Convention on the Rights of the Child, to provide necessary protections for privacy and data protection; and

c)should provide a route of redress for users to challenge specific conclusions reached on age.

A binding Code of Practice would provide a clear basis for service providers whose risk assessment identifies their content as likely to be accessed by children to put in place mitigations in the form of a rigorous system of age assurance. (Paragraph 235)

51.We recommend that the Bill require Ofcom to establish minimum standards for age assurance technology and governance, linked to risk profiles, to ensure that third-party and provider-designed assurance technologies are privacy-enhancing and rights-protecting, and that providers commissioning such services are restricted in the data they can ask for. Ofcom should also require that service providers demonstrate to them how they monitor the effectiveness of these systems to ensure that they meet the minimum standards required. (Paragraph 236)

52.The Government should ask Ofcom to prioritise the development of a mandatory age assurance technology and governance code ahead of the Bill becoming law and, in doing so, set out risk profiles so that the use of such systems is clearly proportionate to the risk. The code must bear in mind that children have rights to freedom of association, participation, and information, as well as the right to protection. We expect this to be in place within three to six months of the Bill receiving Royal Assent. (Paragraph 237)
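The sketch below illustrates, in Python, the data-minimising interface a privacy-preserving age assurance provider might expose under minimum standards of this kind: the service learns only whether the user meets the relevant age threshold, not their identity or date of birth. All names, fields and the stand-in provider are hypothetical assumptions for illustration only.

```python
# Illustrative sketch only: a data-minimising age assurance interface.
# The provider returns only the age-band result, never identity data.

from dataclasses import dataclass

@dataclass(frozen=True)
class AgeAssuranceResult:
    meets_threshold: bool   # e.g. "is the user 18 or over?" — the only attribute shared
    confidence: float       # provider's stated confidence in the result (0.0 to 1.0)
    method: str             # e.g. "document_check" — recorded for audit, not identity

def request_age_check(provider, user_token: str, age_threshold: int) -> AgeAssuranceResult:
    """Ask a third-party provider whether the user meets the age threshold.

    The service never receives the user's date of birth or identity documents,
    only the yes/no result against the threshold it is permitted to ask about.
    """
    return provider.check(user_token, age_threshold)

class FakeProvider:
    """Stand-in provider so the sketch runs; a real provider would be external."""
    def check(self, user_token: str, age_threshold: int) -> AgeAssuranceResult:
        return AgeAssuranceResult(meets_threshold=True, confidence=0.97, method="document_check")

if __name__ == "__main__":
    result = request_age_check(FakeProvider(), user_token="opaque-token-123", age_threshold=18)
    print(result)
```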

Chapter 6: Scope of the draft Bill

53.We recommend that the categorisation of services in the draft Bill be overhauled. It should adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model. The draft Bill already has a mechanism to do this: the risk profiles that Ofcom is required to draw up. We make recommendations in Chapter 8 about how the role of the risk profiles could be enhanced. We recommend that the risk profiles replace the “categories” in the Bill as the main way to determine the statutory requirements that will fall on different online services. This will ensure that small, but high risk, services are appropriately regulated; whilst guaranteeing that low risk services, large or small, are not subject to unnecessary regulatory requirements. (Paragraph 246)

54.We recognise that search engines operate differently from social media and that the systems and processes required to meet the separate duties that the draft Bill places on them are different. The codes of practice drawn up by Ofcom will need to recognise the specific circumstances of search engines to meet Ofcom’s duties on proportionality. Search engines are more than passive indexes. They rely on algorithmic ranking and often include automatic design features like autocomplete and voice-activated searches that can steer people in the direction of content that puts them or others at risk of harm. Most search engines already have systems and processes in place to address these and comply with other legislation. It is reasonable to expect them to come under the Bill’s requirements and, in particular, for them to conduct risk assessments of their system design to ensure it mitigates rather than exacerbates risks of harm. We anticipate that they will have their own risk profiles. (Paragraph 251)

55.The Government needs to provide more clarity on how providers with encrypted services should comply with the safety duties ahead of the Bill being introduced into Parliament. (Paragraph 257)

56.We recommend that end-to-end encryption should be identified as a specific risk factor in risk profiles and risk assessments. Providers should be required to identify and address risks arising from the encrypted nature of their services under the Safety by Design requirements. (Paragraph 258)

57.The exclusion of paid-for advertising from the scope of the Online Safety Bill would obstruct the Government’s stated aim of tackling online fraud and activity that creates a risk of harm more generally. Excluding paid-for advertising will leave service providers with little incentive to remove harmful adverts, and risks encouraging further proliferation of such content. (Paragraph 268)

58.We therefore recommend that clause 39(2) is amended to remove “(d) paid-for advertisements” to bring such adverts into scope. Clause 39(7) and clause 134(5) would therefore also have to be removed. (Paragraph 269)

59.Ofcom should be responsible for acting against service providers who consistently allow paid-for advertisements that create a risk of harm to be placed on their platform. However, we agree that regulating advertisers themselves (except insofar as they come under other provisions of the Bill), individual cases of advertising that are illegal, and pursuing the criminals behind illegal adverts should remain matters for the existing regulatory bodies and the police. (Paragraph 270)

60.We recommend that the Bill make clear Ofcom’s role will be to enforce the safety duties on providers covered by the online safety regulation, not regulate the day-to-day content of adverts or the actions of advertisers. That is the role of the Advertising Standards Authority. The Bill should set out this division of regulatory responsibility. (Paragraph 271)

61.We recognise that economic harms other than fraud, such as those impacting consumers, and infringement of intellectual property rights, are an online problem that must be tackled. However, the Online Safety Bill is not the best piece of legislation to achieve this. Economic harms should be addressed in the upcoming Digital Competition Bill. We urge the Government to ensure this legislation is brought forward as soon as possible. (Paragraph 275)

Chapter 7: Freedom of speech requirements, journalism, and content of democratic importance

62.We propose a series of recommendations throughout this report to strengthen protection for freedom of expression. These include greater independence for Ofcom, routes for individual redress beyond service providers, tighter definitions around content that creates a risk of harm, a greater emphasis on safety by design, a broader requirement to be consistent in the application of terms of service, stronger minimum standards and mandatory codes of practice set by Ofcom (which is required to comply with human rights law), and stronger protections for news publisher content. We believe these will be more effective than adjustments to the wording of Clause 12. (Paragraph 284)

63.We recommend that Ofcom be required to produce an annual report on the impact of regulated services on media plurality. (Paragraph 291)

64.We recommend that the news publisher content exemption is strengthened to include a requirement that news publisher content should not be moderated, restricted or removed unless it is content the publication of which clearly constitutes a criminal offence, or which has been found to be unlawful by order of a court within the appropriate jurisdiction. We recommend that the Government look at how bad actors can be excluded from the concept of news publisher. We suggest that they may wish to exclude those that have been repeatedly found to be in breach of The Ofcom Broadcasting Code, or are publications owned by foreign Governments. Ofcom should also examine the use of new or existing registers of publishers. We are concerned that some consumer and business magazines, and academic journals, may not be covered by the Clause 40 exemptions. We recommend that the Department consult with the relevant industry bodies to see how the exemption might be amended to cover this off, without creating loopholes in the legislation. (Paragraph 304)

65.The draft Bill already makes a distinction between “news publisher content” and citizen journalism, in recognition that the former is subject to editorial control and there are existing mechanisms for accountability. There is also a clear difference between the categories, as one is based on “who” is sharing the content, and the other focuses on the purpose of the content, rather than the identity of those behind it. For both citizen journalism and content of democratic importance, the justification for special consideration appears to be that they are in the public interest to be shared. This should therefore be key to any final definition and providers will require guidance as to how to balance the risk of harm with the public interest. It is not, nor is it intended to be, a blanket exemption in the same way as that for news publisher content, but a counterbalance to prevent overzealous moderation, particularly in borderline cases. (Paragraph 305)

66.Our recommendations to narrowly define content that is harmful to adults by way of reference to existing law should provide some of the extra clarity service providers need to help protect freedom of expression. At the same time, journalism and content of democratic importance have long been recognised as vital in a democratic society and should be given specific consideration and protection by providers, who have significant influence over the information we see. We have heard concerns around the definitions used however, and about the ability of the providers to interpret and apply them consistently. We feel that “democratic importance” may be both too broad—creating a loophole to be exploited by bad actors—and too narrow—excluding large parts of civil society. Similarly, we are concerned that any definition of journalistic content that is designed to capture citizen journalism would be so broad it would render the consistent application of the requirement almost impossible, and see the expedited complaints route overwhelmed by people claiming without merit to be journalists in order to have their content reinstated. “Public interest” might be more useful in ensuring that content and activity is judged on its merit, rather than its author. (Paragraph 306)

67.We recommend that the existing protections around journalistic content and content of democratic importance should be replaced by a single statutory requirement to have proportionate systems and processes to protect ‘content where there are reasonable grounds to believe it will be in the public interest’. Examples of content that would be likely to be in the public interest would be journalistic content, contributions to political or societal debate and whistleblowing. Ofcom should produce a binding Code of Practice on steps to be taken to protect such content and guidance on what is likely to be in the public interest, based on their existing experience and case law. This should include guidance on how appeals can be swiftly and fairly considered. Ofcom should provide guidance to companies in cases of systemic, unjustified take-down of content that is likely to be in the public interest. This would amount to a failure to safeguard freedom of expression as required by the objectives of the legislation. (Paragraph 307)

Chapter 8: Role of the regulator

68.Robust regulatory oversight is critical to ensuring the ambition of the Online Safety Bill is fully met. Tech companies must not be allowed to snub the Regulator, to act with impunity, to continue to rely on self-regulation, or to abdicate responsibility for the harms which occur through the operation of their services or because of their governance structures. In turn, Ofcom must be able to move at pace to hold providers to account authoritatively, to issue substantial fines, and to assist the appropriate authorities with criminal prosecutions. The Bill extends substantial powers to the Regulator, but there are improvements to be made if the Government is to ensure the Bill is enforced effectively. (Paragraph 312)

69.Ofcom should have the power on the face of the Bill to share information and to co-operate with international regulators at its discretion. (Paragraph 315)

70.To help differentiate between the risk assessment undertaken by the regulator and that undertaken by the service providers, Ofcom’s risk assessment should be renamed the “Ofcom register of risks of regulated services” (henceforth, register of risks). Ofcom should begin working on this immediately so that it is ready to be actioned when the Bill becomes law. (Paragraph 317)

71.The Bill’s provision that Ofcom should develop risk profiles based on the characteristics of services should be strengthened. Ofcom should begin drawing up risk profiles immediately so that they are ready to be actioned when the Bill becomes law. Risk profiles should reflect differences in the characteristics of the service. These could include (but are not limited to) risks created by algorithms; risks created by a reliance on artificial intelligence moderation; risks created by unlimited ‘one-click’ sharing; risks caused by “engagement” maximising design features; risk of unsupervised contact between adults and children which may give rise to grooming; risks caused by surveillance advertising; and such other risks as Ofcom identifies in its overall risk assessment, as well as platform design, risk level, end-to-end encryption, algorithmic design, safety by design measures, and the service’s business model and overall corporate aim. Ofcom should also be able to take into account whether a company has been the subject of a super complaint, other legal proceedings or publicly documented evidence of poor performance e.g. independent research, a poor monitoring report in the EU’s Code of Conduct for Illegal Hate, or whistleblowers’ evidence. (Paragraph 323)

72.The Bill should be amended to clarify that Ofcom is able to take enforcement action if it identifies a breach of the safety duties, without requiring a provider to redo a risk assessment. (Paragraph 325)

73.It should not be possible for a service provider to underestimate the level of risk on their service without fear of sanction. If Ofcom suspects such a breach, it should have the power to investigate, and, if necessary, to take swift action. We are not convinced that the draft Bill as it currently stands achieves this. (Paragraph 332)

74.Ofcom should be required to set binding minimum standards for the accuracy and completeness of risk assessments. Ofcom must be able to require a provider who returns a poor or incomplete risk assessment to redo that risk assessment. Risk assessments should be carried out by service providers in response to the Online Safety Act, before new products and services are rolled out and during the design process of new features, and kept up to date as they are implemented. (Paragraph 333)

75.The required content of service providers’ risk assessments should follow the risk profiles developed by Ofcom, which in turn should be based on the differences in the characteristics of the service, platform design, risk level, and the service’s business model and overall corporate aim. For example, a provider that does not have an engagement-based service would not need to address irrelevant risks associated with virality, whilst a site containing adult content would have to address the higher level of risks associated with children accessing the site. (Paragraph 334)
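A minimal sketch, in Python, of how the required content of a risk assessment might follow a service’s risk profile as described above: services only complete the sections relevant to their characteristics, on top of a common baseline. The characteristic and section names are illustrative assumptions; the actual risk profiles would be drawn up by Ofcom.

```python
# Illustrative sketch only: mapping service characteristics to required
# risk assessment sections. All names are assumed for illustration.

from typing import List, Set

SECTION_RULES = {
    "engagement_based_recommendation": ["virality_and_amplification", "rabbit_hole_effects"],
    "adult_content": ["children_access_risk", "age_assurance"],
    "anonymous_accounts": ["traceability", "disposable_accounts"],
    "end_to_end_encryption": ["illegal_content_detection_limits"],
    "live_location_sharing": ["vawg_and_domestic_abuse_risk"],
}

BASELINE_SECTIONS = ["illegal_content", "governance_and_reporting"]

def required_sections(characteristics: Set[str]) -> List[str]:
    """Return the risk assessment sections a service must complete for its profile."""
    sections = list(BASELINE_SECTIONS)
    for characteristic in sorted(characteristics):
        sections.extend(SECTION_RULES.get(characteristic, []))
    return sections

if __name__ == "__main__":
    # A service without engagement-based recommendation skips virality sections,
    # while an adult-content site must cover children's access and age assurance.
    print(required_sections({"adult_content", "anonymous_accounts"}))
```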

76.The Bill should be amended to clarify that risk assessments should be directed to “reasonably foreseeable” risks, to allow Ofcom greater leeway to take enforcement action against a company that conducts an inadequate risk assessment. (Paragraph 335)

77.Ofcom should look to the Data Protection Impact Assessment as they come to form their own guidance for minimum standards for risk assessments for regulated services. (Paragraph 336)

78.In bringing forward the final Bill, we recommend the Government publish an assessment of the audit powers given to Ofcom and a comparison to those held by the Information Commissioner’s Office and the Financial Conduct Authority. Parliament should be reassured that the Bill will give Ofcom a suite of powers to match those of similar regulators. Within six months of the Act becoming law, Ofcom should report to Parliament on how it has used those powers. (Paragraph 339)

79.We recommend that the largest and highest-risk providers should be placed under a statutory responsibility to commission annual, independent third-party audits of the effects of their algorithms, and of their risk assessments and transparency reports. Ofcom should be given the explicit power to review these and undertake its own audit of these or any other regulated service when it feels it is required. Ofcom should develop a framework for the effective regulation of algorithms based on the requirement for, and auditing of, risk assessments. (Paragraph 340)

80.In taking on its responsibilities under the Bill, Ofcom will be working with a network of other regulators and third parties already working in the digital world. We recommend that the Bill provide a framework for how these bodies will work together including when and how they will share powers, take joint action, and conduct joint investigations. (Paragraph 346)

81.We reiterate the recommendations by the House of Lords Communications and Digital Committee in their Digital Regulation report: that regulators in the Digital Regulation Cooperation Forum should be under a statutory requirement to cooperate and consult with one another, such that they must respect one another’s objectives, share information, share powers, take joint action, and conduct joint investigations; and that to further support coordination and cooperation between digital regulators including Ofcom, the Digital Regulation Cooperation Forum should be placed on a statutory footing with the power to resolve conflicts by directing its members. (Paragraph 347)

82.The draft Bill does not give Ofcom co-designatory powers. Ofcom is confident that it will be able to co-designate through other means. The Government must ensure that Ofcom has the power to co-designate efficiently and effectively, and if it does not, this power should be established on the face of the Bill. (Paragraph 348)

83.During the course of its duties, Ofcom will be required to investigate companies for a range of breaches, some of which will relate to suspected or known child sexual exploitation and abuse material. As child sexual exploitation and abuse investigations lie so far outside Ofcom’s normal duties, we expect Ofcom to work closely with experts like the Internet Watch Foundation to develop and update the child sexual exploitation and abuse Code of Practice, to monitor providers to ensure compliance with that code, and during investigations relating to child sexual exploitation and abuse content. (Paragraph 352)

84.Ofcom may receive unsolicited child sexual exploitation and abuse material which would constitute an offence under Section 1 of the Protection of Children Act 1978. The Bill should be amended to provide Ofcom with a specific defence in law to allow it to perform its duties in this area without inadvertently committing an offence. (Paragraph 353)

85.The Bill should be amended to make clear that Codes of Practice should be binding on providers. Any flexibility should be entirely in the hands of and at the discretion of the Regulator, which should have the power to set minimum standards expected of providers. They should be subject to affirmative procedure in all cases. (Paragraph 358)

86.Ofcom should start working on Codes of Practice immediately, so they are ready for enforcement as soon as the Bill becomes law. A provisional list of Codes of Practice, including, but not necessarily limited to, those listed in Box 2 above should be included on the face of the Bill. Some of the Codes should be delegated to co-designated bodies with relevant expertise, which would allow work on multiple Codes to happen simultaneously and thus the entire endeavour to be completed more quickly. Once the Codes of Practice are completed, they should be published. (Paragraph 359)

87.The Bill should require that companies’ risk assessments be reported at Board level, to ensure that senior management know and can be held accountable for the risks present on the service, and the actions being taken to mitigate those risks. (Paragraph 367)

88.We recommend that a senior manager at board level or reporting to the board should be designated the “Safety Controller” and made liable for a new offence: the failure to comply with their obligations as regulated service providers when there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users. We believe that this would be a proportionate last resort for the Regulator. Like any offence, it should only be initiated and provable at the end of an exhaustive legal process. (Paragraph 368)

89.The Committee welcomes the Secretary of State’s commitment to introduce criminal liability within three to six months of Royal Assent and strongly recommends that criminal sanctions for failures to comply with information notices are introduced within three months of Royal Assent. (Paragraph 369)

90.The power for the Secretary of State to exempt services from regulation should be clarified to ensure that it does not apply to individual services. (Paragraph 376)

91.The powers for the Secretary of State to a) modify Codes of Practice to reflect Government policy and b) give guidance to Ofcom give too much power to interfere in Ofcom’s independence and should be removed. (Paragraph 377)

92.Exercise of the Secretary of State’s powers in respect of national security and public safety in respect of terrorism and child sexual exploitation and abuse content should be subject to review by the Joint Committee we propose later in this report. (Paragraph 378)

93.If the Government wishes to improve the UK’s media literacy to reduce online harms, there must be provisions in the Bill to ensure media literacy initiatives are of a high standard. The Bill should empower Ofcom to set minimum standards for media literacy initiatives that both guide providers and ensure the information they are disseminating aligns with the goal of reducing online harm. (Paragraph 381)

94.We recommend that Ofcom is made responsible for setting minimum standards for media literacy initiatives. Clause 103 (4) should be amended to include “(d) about minimum standards that media literacy initiatives must meet.” (Paragraph 382)

95.We recommend that the Bill reflects that media literacy should be subject to a “whole of Government” approach, involving current and future initiatives of the Department for Education in relation to the school curriculum as well as Ofcom and service providers. We have heard throughout this inquiry about the real dangers that some online content and activity poses to children. Ofsted already assesses how schools manage online safety as part of their safeguarding policies. We recommend that Ofsted, in conjunction with Ofcom, update the school inspection framework to extend the safeguarding duties of schools to include making reasonable efforts to educate children to be safe online. (Paragraph 385)

96.Ofcom should require that media literacy is built into risk assessments as a mitigation measure and require service providers to provide evidence of taking this mitigation measure where relevant. (Paragraph 386)

97.We recommend that clause 103(11) be amended to state that Ofcom’s media literacy duties relate to “the public” rather than “members of the public”, and that the definition of media literacy be updated to incorporate learning about being a good digital citizen and about platform design, data collection and the business models and operation of digital services more broadly. (Paragraph 388)

98.The highest risk services, as assessed by Ofcom, should have to report quarterly data to Ofcom on the results of the tools, rules, and systems they have deployed to prevent and remove child sexual exploitation and abuse content (e.g. number and rates of illegal images blocked at upload stage, number and rates of abusive livestreams terminated, number and rates of first- and second-generation images and videos detected and removed). (Paragraph 394)

99.Ofcom should have the power to request research and independent evaluation into services where it believes the risk factors for child sexual exploitation and abuse are high. (Paragraph 395)

100.Ofcom should move towards a risk factors approach to the regulation of child sexual exploitation and abuse material. It should be able to issue a Use of Technology notice if it believes that there is a serious risk of harm from child sexual exploitation and abuse or terrorism content and that not enough is being done by a service to mitigate those risks. The Bill should be amended to clarify that Ofcom is able to consider a wider range of risk factors when deciding whether to issue a Use of Technology notice or take enforcement action. Risk factors should include:

a)The prevalence or the persistent prevalence of child sexual exploitation and abuse material on a service, or distributed by a service;

b)A service’s failure to provide and maintain adequate tools, rules, and systems to proactively prevent the spread of child sexual exploitation and abuse content, and to provide information on those tools, rules, and systems to Ofcom when requested;

c)A service’s failure to provide adequate data to Ofcom on the results of those tools, rules, and systems (e.g., number and rates of illegal images blocked at upload stage, number and rates of abusive livestreams terminated, number and rates of first- and second-generation images and videos detected and removed);

d)The nature of a service and its functionalities;

e)The user base of a service;

f)The risk of harm to UK individuals (and the severity of that harm) if the relevant technology is not used by the service;

g)The degree of interference posed by the use of the relevant technology with users’ rights to freedom of expression and privacy; and

h)The safety by design mechanisms that have been implemented. (Paragraph 396)

Chapter 9: Transparency and oversight

101.We recommend that Ofcom specify that transparency reports produced by service providers should be published in full in a publicly accessible place. Transparency reports should be written clearly and accessibly so that users and prospective users of the service can understand them, including children (where they are allowed to use the service) and disabled people. (Paragraph 410)

102.We recommend that the Bill require transparency reporting on a regular, proportionate basis, with the aim of working towards standardised reporting as the regulatory regime matures. The Bill should require minimum standards of accuracy and transparency about how the report was arrived at and the methodology used in research. For providers of the highest risk services, the outcome of the annual audits recommended in paragraph 340 should be required to be included in the transparency report. (Paragraph 411)

103.We agree with the list of information that Ofcom can require as part of its transparency reporting powers and recommend that it should have the clear power to request any other information. We recommend that transparency reporting should aim to create a competitive marketplace in respect of safety, where people can reasonably compare, using robust and comparable information, the performance of services as they operate for UK users. We suggest that Ofcom should also be able to require that information be published in transparency reports, including (but not limited to):

a)Safety by design features;

b)Most viewed/engaged with content by month;

c)Most recommended content by month by age group and other demographic information (where that information is collected);

d)Their terms and conditions;

e)Proportion of users who are children;

f)Proportion of anonymous users;

g)Proportion of content breaching terms and conditions;

h)Proportion of content breaching terms and conditions removed;

i)Proportion of appeals against removal upheld;

j)Proportion of appeals against removal made on public interest grounds, by recognised news publishers and other users, that were upheld; and

k)Time taken to deal with reports. (Paragraph 412)

104.In addition to transparency reporting, Ofcom should be empowered to conduct its own independent research with the aim of informing the UK public about the comparative performance of services in respect of online safety. (Paragraph 413)

105.Independent researchers currently have limited access to the information needed to conduct research. This hinders progress in understanding online activity that creates a risk of harm, the way that services’ systems work, and how services’ systems could be improved to mitigate the risk of harm. It also limits the ability to scrutinise service providers and hold them accountable. This issue must be addressed urgently. (Paragraph 424)

106.The transparency powers in the Bill are an important opportunity to encourage service providers to share relevant data with external researchers studying online safety and allied subjects. (Paragraph 425)

107.The draft Bill requires that Ofcom produce a report on access to data for independent researchers. We recommend work on this report starts as soon as possible. We recommend that Ofcom be given the powers in the Bill to put into practice recommendations from that report. (Paragraph 426)

108.Ofcom should have the power i) to audit, or appoint a third party to audit, how services commission, surface, collate and use their research; and ii) to request from services a) specific internal research and b) research on topics of interest to the Regulator. (Paragraph 427)

109.Ofcom should commission an independent annual assessment, conducted by skilled persons, of what information should be provided by each of the highest risk services to advance academic research. (Paragraph 428)

110.We recommend that the Bill should require service providers to conduct risk assessments of opening up data on online safety to independent researchers, with some pre-defined issues to comment on, including a) privacy; b) risk of harm to users; c) reputational risks (for the service provider); and d) financial cost. (Paragraph 429)

111.We recommend that Ofcom should require service providers to conduct an annual formal review of the use of privacy-protecting technologies that would enable them to share sensitive datasets. (Paragraph 430)

112.We agree with other Committees that it is imperative that digital regulation be subject to dedicated parliamentary oversight. To achieve this, we recommend a Joint Committee of both Houses to oversee digital regulation with five primary functions: scrutinising digital regulators and overseeing the regulatory landscape, including the Digital Regulation Cooperation Forum; scrutinising the Secretary of State’s work on digital regulation; reviewing the codes of practice laid by Ofcom and any legislation relevant to digital regulation (including secondary legislation under the Online Safety Act); considering any relevant new developments such as the creation of new technologies and the publication of independent research or whistleblower testimonies; and helping to generate solutions to ongoing issues in digital regulation. (Paragraph 434)

113.We fully support the recommendation of the House of Lords Communications and Digital Committee in their report on Digital Regulation that, as soon as possible, full Digital Regulation Cooperation Forum membership should be extended to statutory regulators with significant interests and expertise in the digital sphere, and that partial membership should be extended to non-statutory regulators and advisory bodies with subject specific knowledge to participate on issues particular to their remits. (Paragraph 435)

114.We recommend that, in addition to any other reports the Committee chooses to make, the Joint Committee produces an annual report with recommendations on what could or should change, looking towards future developments. We anticipate that the Joint Committee will want to look at the definition of disinformation and what more can be done to tackle it at an early stage. (Paragraph 436)

115.We recommend that whistleblowers’ disclosure of information to Ofcom and/or the Joint Committee on Digital Regulation, where that information provides clear evidence of non-compliance with the Online Safety Bill, is protected under UK law. (Paragraph 439)

Chapter 10: Redress

116.The Bill should establish proportionate minimum standards for the highest risk providers’ reports, complaints, and redress mechanisms as set out in a mandatory code of practice prepared by Ofcom. (Paragraph 443)

117.We recommend a requirement on the face of the Bill for Ofcom to set out i) how they will assess the a) ease of use; b) accessibility; and c) transparency of a service’s complaints process for d) adults; e) children; f) vulnerable adults; and g) disabled people; ii) what steps Ofcom will be able to take if it finds any of these processes wanting; and iii) how Ofcom will ensure that requirements to operate complaint, reporting and redress mechanisms are proportionate for smaller in-scope providers. (Paragraph 444)

118.Clause 15 (3)(c) should be amended so that it reads “is easy to access, including for disabled people and those with learning difficulties”. (Paragraph 445)

119.Providers of the highest risk services should have to give quarterly statistics to Ofcom on:

i)Number of user reports;

ii)User reports broken down by the reason the report was made;

iii)Number of actionable user reports;

iv)Actionable user reports broken down by the reason the report was made;

v)How long it took the service provider to respond to a) all user reports and b) actionable user reports;

vi)What response was made to actionable user reports;

vii)Number of user complaints received;

viii)Number of actionable user complaints;

ix)How long it took the service provider to respond to a) all user complaints and b) actionable user complaints;

x)What response was made to actionable user complaints;

xi)How many pieces of user content were taken down;

xii)How many pieces of content that were taken down were later reinstated;

xiii)The grounds on which content that was reinstated was reinstated;

xiv)How long it took the service provider to reinstate a piece of content that was later reinstated. (Paragraph 446)

120.Our proposed external redress process would not replace service providers’ internal processes or run concurrently to them, nor would it address individual complaints about individual pieces of content or interactions. Rather, for a victim of sustained and significant online harm, someone who has been banned from a service or who had their posts repeatedly and systematically removed, this new redress mechanism would give them an additional body to appeal those decisions after they had come to the end of a service provider’s internal process. (Paragraph 450)

121.In order for an external redress process to work, clear direction is needed in the Bill about Ofcom’s responsibility to set quality standards for service providers’ internal complaints procedures, and in relation to complaints about failures to meet those standards. We hope that the Government will consider our recommendations in this area, and that by improving the quality of service providers’ internal complaints procedures, any system of external redress will be needed only rarely and for the most serious cases. (Paragraph 451)

122.We support the Government’s ambition to make service providers behave responsibly, and by accepting our recommendations the Government will ensure that the requirements of the Bill bring about better responses from service providers to user complaints. However, the fact remains that service providers’ user complaints processes are often obscure, undemocratic, and without external safeguards to ensure that users are treated fairly and consistently. It is only through the introduction of an external redress mechanism that service providers can truly be held to account for their decisions as they impact individuals. (Paragraph 456)

123.The role of the Online Safety Ombudsman should be created to consider complaints about actions by higher risk service providers where either moderation or failure to address risks leads to significant, demonstrable harm (including to freedom of expression) and recourse to other routes of redress has not resulted in a resolution. The right to complain to this Ombudsman should be limited to those users i) who have exhausted the internal complaints process of the service provider against which they are making their complaint and ii) who have either a) suffered serious or sustained harm on the service or b) had their content repeatedly taken down. There should be an option in the Bill to extend the remit of the Ombudsman to lower risk providers. In addition to handling these complaints, the Ombudsman would, as part of its role, i) identify issues in individual companies and make recommendations to improve their complaint handling and ii) identify systemic industry-wide issues and make recommendations on the regulatory action needed to remedy them. The Ombudsman should have a duty to gather data and information and report it to Ofcom. It should be an “eligible entity” able to make super-complaints. (Paragraph 457)

124.We believe that this Bill is an opportunity to reset the relationship between service providers and users. While we recognise the resource challenges both for individuals in accessing the courts and the courts themselves, we think the importance of issues in this Bill requires that users have a right of redress in the courts. We recommend the Government develop a bespoke route of appeal in the courts to allow users to sue providers for failure to meet their obligations under the Act. (Paragraph 460)

125.Bereaved parents who are looking for answers to the tragic deaths of their children in their digital data should not have to struggle through multiple, lengthy, bureaucratic processes to access that data. We recognise that an automatic right to a child’s data would raise privacy and child safety concerns. At the same time, we believe there is more that could be done to make the process more proportionate, straightforward and humane. We recommend that the Government undertake a consultation on how the law, and services’ terms and conditions, can be reformed to give access to data to parents when it is safe, lawful and appropriate to do so. The Government should also investigate whether the regulator could play a role in facilitating co-operation between the major online service providers to establish a single consistent process or point of application. (Paragraph 463)

126.We also recommend that Ofcom, the Information Commissioner and the Chief Coroner review the powers of coroners to ensure that they can access digital data following the death of a child. We recommend that the Government legislate, if required, to ensure that coroners are not obstructed by service providers when they require access to digital data. We recommend that guidance be issued to coroners and regulatory authorities to ensure they are aware of their powers in dealing with service providers and of the types of cases where digital data is likely to be relevant. Our expectation is that the Government will look to implement the outcomes of these consultations in the Bill during its parliamentary passage. (Paragraph 464)

Chapter 11: Conclusion

127.This Report must be understood as a whole document, comprising a cohesive set of recommendations working in tandem to produce a new vision of the Online Safety Act. The Government should not seek to isolate single recommendations without understanding how they fit into the wider manifesto laid out by the Committee. Taken as a whole, our recommendations will ensure that the Bill holds platforms to account for the risks of harm which arise on them and will achieve the Government’s ultimate aim of making the United Kingdom the safest place in the world to be online. (Paragraph 469)





© Parliamentary copyright 2021