Draft Online Safety Bill

Summary

Self-regulation of online services has failed. Whilst the online world has revolutionised our lives and created many benefits, underlying systems designed to service business models based on data harvesting and microtargeted advertising shape the way we experience it. Algorithms, invisible to the public, decide what we see, hear and experience. For some service providers this means valuing the engagement of users at all costs, regardless of what holds their attention. This can result in amplifying the false over the true, the extreme over the considered, and the harmful over the benign. The human cost can be counted in mass murder in Myanmar, in intensive care beds full of unvaccinated Covid-19 patients, in insurrection at the US Capitol, and in teenagers sent down rabbit holes of content promoting self-harm, eating disorders and suicide.

This has happened because for too long the major online service providers have been allowed to regard themselves as neutral platforms which are not responsible for the content that is created and shared by their users. Yet it is their algorithms that have allowed behaviour which would be challenged by the law in the physical world to thrive on the internet. If we do nothing these problems will only get worse. Our children will pay the heaviest price. That is why the driving force behind the Online Safety Bill is the belief that these companies must be held liable for the systems they have created to make money for themselves.

The Online Safety Bill is a key step forward for democratic societies to bring accountability and responsibility to the internet. Our recommendations strengthen two core principles of responsible internet governance: that online services should be held accountable for the design and operation of their systems; and that regulation should be governed by a democratic legislature and an independent regulator, not Silicon Valley. We want the Online Safety Bill to be easy to understand for service providers and the public alike. We want it to have clear objectives that lead to precise duties on service providers, with robust powers for the regulator to act when platforms fail to meet those legal and regulatory requirements.

The most important thing this Bill will do, if our recommendations are accepted, is hold online services responsible for the risks created by their design and operation. To give just three examples: services that aim to maximise engagement will have to mitigate the risks of that engagement. A platform that recommends content using users’ data will have to mitigate the risk of recommending dangerous content to vulnerable people. A platform that allows anonymous accounts will have to ensure that those committing criminal acts can be traced in a timely way by UK law enforcement.

The criminal law relating to online communication pre-dates the age of social media and modern search engines. It needs updating. We welcome the Law Commission’s recommendations for reform. We want to see new offences on the statute book at the first opportunity for harmful, threatening and knowingly false communications, cyber-flashing, attempting to induce seizures in people with photosensitive epilepsy, promoting self-harm, and stirring up hatred against people on the grounds of sex or gender, or of disability. Service providers will be required to mitigate the risks presented by content and activity that society has deemed unacceptable, whether through the criminal law, the Equality Act, or other established legal principles. Paid-for advertising can be used by fraudsters and other criminals. Under our recommendations, providers will be held accountable for the risks created by adverts, like any other activity online.

Protecting children is a key objective of the draft Bill and our report. Our children have grown up with the internet and it can bring them many benefits. Too often, though, services are not designed with them in mind. We want all online services likely to be accessed by children to take proportionate steps to protect them. Extreme pornography is particularly prevalent online and far too many children encounter it—often unwittingly. Privacy-protecting age assurance technologies are part of the solution but are inadequate by themselves. They need to be accompanied by robust requirements to protect children, for example from cross-platform harm, and a mandatory Code of Practice that will set out what is expected. Age assurance, which can include age verification, should be used in a proportionate way and be subject to binding minimum standards to prevent it being used to collect unnecessary data.

If service providers fail to mitigate the risk of harm, the Bill will hold them accountable. We want Ofcom to have the powers to set minimum quality standards of risk assessment, under which service providers will be required to undertake independent audits of their systems, processes and algorithms. A radical transparency regime will empower people to take informed decisions about the online services they use. The Bill will introduce significant financial penalties for service providers that fail to comply, and we want to see criminal sanctions for executives who are grossly non-compliant in how they approach online safety.

Through our recommendations, the Bill will protect freedom of speech online. Service providers will no longer be able to ignore the abuse and hatred designed to silence women and minorities. They will be required to apply their terms and conditions consistently and transparently and, for the first time, to publish an accessible Online Safety Policy. Service providers will no longer be able to selectively censor without accountability. They will be told by Parliament and the Regulator what is illegal and unacceptable online, and how they should act against it. They will be required to protect speech that is vital to a democratic society: journalism, whistleblowing, political and societal debate, academic research, and more. If they fail, our recommendations will give individuals new rights of appeal and redress, through the service providers themselves, through an independent Ombudsman and, finally, through the civil courts.

We want this Bill to reset the relationship between citizens and online services, particularly the riskiest. We should not have to rely on whistleblowers and court cases to get brief glimpses into how the online world is shaped. These recommendations offer a holistic and watertight regulatory regime that will make the sector accountable to UK citizens; they are not a pick-and-mix, but indivisible. We urge the Government to accept our recommendations and bring the Online Safety Bill to Parliament at the earliest opportunity.





© Parliamentary copyright 2021