440.As explored in Chapter 9, there is currently little transparency about decision-making or outcomes when users report issues to service providers.
441.We heard compelling evidence from Prof McGlynn and others that this Bill “provides a valuable opportunity to strengthen individual protections against online violence and abuse” but that it currently falls short of what it might achieve in this area. Online violence and abuse can take many forms, and individuals who have been abused often find that their options to gain redress from service providers or from the courts are limited. We heard from Refuge that tech abuse, which is a form of domestic or intimate partner violence, can entail “hundreds of abusive messages … from perpetrators, often across multiple platforms”, each of which has to be flagged to the service provider individually. Survivors of tech abuse can wait weeks or months even to receive acknowledgement of their report from service providers. Those who have been subjected to image-based sexual abuse can find themselves failed by “out-of-date, confusing, piecemeal” laws, which can lead police officers to use “informal resolutions” which fail to capture image-based sexual abuse as a sexual offence and therefore the available criminal justice response.
442.Under the draft Bill, services in scope must operate a complaints process that “provides for appropriate action to be taken by the provider of the service” and which is “easy to access”, “easy to use (including by children)” and “transparent” (Clause 15). Many large services already provide users with complaints procedures, alongside mechanisms to appeal takedown decisions. As such, the clause appears designed to improve service providers’ existing procedures, rather than to establish new ones. In that respect it differs, for example, from the EU Digital Services Act, which seeks to oblige digital marketplaces to set up complaint and redress mechanisms and out-of-court dispute settlement mechanisms. Yet the Bill does not currently seek to establish minimum quality standards for the operation of any of these processes. Nor does it establish how Ofcom will assess how easy to access, easy to use, or transparent a service provider’s reporting or complaints process is.
444.We recommend a requirement on the face of the Bill for Ofcom to set out: i) how they will assess the a) ease of use; b) accessibility; and c) transparency of a service’s complaints process for d) adults; e) children; f) vulnerable adults; and g) disabled people; ii) what steps Ofcom will be able to take if it finds any of these processes wanting; and iii) how Ofcom will ensure that requirements to operate complaint, reporting and redress mechanisms are proportionate for smaller in-scope providers.
447.The Bill mandates that user-to-user services must provide opportunities for individuals to make complaints. It also gives service providers the opportunity to appeal decisions made by Ofcom (Clause 104) and Ofcom notices (Clause 105), and allows eligible entities to make super-complaints (Clause 106).
448.The explanatory notes which accompanied the draft Bill explained that “a body representing the interests of UK users of regulated services, or members of the public can make a super-complaint to Ofcom about any feature of one or more regulated services, or the conduct of one or more providers of such services.” The super-complaints measure is useful for reporting multiple and widespread suspected breaches, for example the widespread bullying or abuse of a class or group. It also supports the transparency objective by enabling group complaints against powerful companies. However, a super-complaint may only be made if “the complaint is of particular importance” or if “the complaint relates to [or] impacts on a particularly large number of users of the service or members of the public” (Clause 106(2)). As such, there is no right for an individual to seek external redress under the Bill as it is currently drafted.
449.We heard four principal arguments in favour of a new external appeals mechanism for individuals once internal routes have been exhausted. Firstly, that it would provide a source of redress to victims of online abuse of the kind described above once they have exhausted a service provider’s internal complaints process. Secondly, we heard arguments for the addition of an appeals mechanism on the grounds of consumer protection. Thirdly, that it would empower users if they were consulted on mechanisms for redress, whilst addressing the “current imbalance between democratic ‘people’ power and the power of platforms”. Finally, we received submissions which argued there should be an external appeals process to prevent over-enforcement and to protect the freedom of expression of individuals who feel their content has been unfairly removed or demoted.
450.Our proposed external redress process would not replace service providers’ internal processes or run concurrently to them, nor would it address individual complaints about individual pieces of content or interactions. Rather, for a victim of sustained and significant online harm, someone who has been banned from a service or who had their posts repeatedly and systematically removed, this new redress mechanism would give them an additional body to appeal those decisions after they had come to the end of a service provider’s internal process.
451.In order for an external redress process to work, clear direction is needed in the Bill about Ofcom’s responsibility to set quality standards for service providers’ internal complaints procedures, and in relation to complaints about failures to meet those standards. We hope that the Government will consider our recommendations in this area, and that by improving the quality of service providers’ internal complaints procedures, any system of external redress will be needed only rarely and for the most serious cases.
452.Ofcom has said that if they were the body designated to take individual complaints “that could not just overwhelm us in volume but conflict a bit with the role that the Bill gives us, which is a strategic one looking at overall systems and processes” and that they would prefer individual complaints to be handled by a specially appointed Ombudsman. The ICO were similarly cautious about making Ofcom responsible for handling individual complaints: “if individual complaints could come to a different organisation, that might be a way to go, and then Ofcom could learn from the experience of those individuals.”
453.Our hope is that by improving the quality of service providers’ complaints procedures, the burden on any external redress process will be lessened, but a stronger internal process is no substitute for the rigour of independent oversight. Nevertheless, one of the primary challenges to establishing a redress mechanism for individuals is the high number of potential claimants. The ICO cautioned: “imagine the millions of complaints for take-down requests that might go to an organisation such as Ofcom.” There are over 48.5 million Facebook users in the UK, and around 28.8 million on Instagram. However, we note that many external redress systems only apply once the internal process has been exhausted. For example, Ofcom only intervenes once a complainant has exhausted the BBC’s internal process. Furthermore, we note that the advent of the General Data Protection Regulation (GDPR) did not bring about an overwhelming surge of cases for breach of data protection law. This may be for a number of reasons, including companies preferring to settle claims, potential litigants being unaware of their rights, and/or being put off by the prospect of making a claim for a relatively low level of damages. We envisage the external complaints process as a last resort for users who have suffered serious harm on services. It is an important step towards greater transparency and clarity in service providers’ moderation decisions, the benefits of which greatly outweigh the potential for misuse by users.
454.We note with interest South West Grid for Learning’s Report Harmful Content, which offers online users an opportunity to report harmful online content, as well as an impartial dispute resolution service for users who have exhausted a service’s internal complaint procedures. We suggest that the Department look to Report Harmful Content as a potential model for what such an Ombudsman could look like.
455.The Secretary of State was reluctant to commit to introducing an Ombudsman:
“I know that [Dame] Melanie at Ofcom raised a point about an ombudsman, which is a slow and onerous process. We do not want to get into that. We want to get into making platforms behave responsibly as quickly as possible, under a legal framework, and that is what we are focused on.”
In a letter to the Committee, the Government stated that:
“Although Ofcom will not investigate or arbitrate on individual complaints (owing to the likelihood of becoming overwhelmed by sheer volume), it will be possible for individuals to submit complaints to Ofcom. Ofcom will use aggregate data from user complaints to inform its horizon scanning, research supervision and enforcement activity.”
456.We support the Government’s ambition to make service providers behave responsibly, and if our recommendations are agreed, the requirements of the Bill will bring about better responses from service providers to user complaints. However, the fact remains that service providers’ user complaints processes are often obscure, undemocratic, and without external safeguards to ensure that users are treated fairly and consistently. It is only through the introduction of an external redress mechanism that service providers can truly be held to account for their decisions as they impact individuals.
457.The role of the Online Safety Ombudsman should be created to consider complaints about actions by higher risk service providers where either moderation or failure to address risks leads to significant, demonstrable harm (including to freedom of expression) and recourse to other routes of redress has not resulted in a resolution. The right to complain to this Ombudsman should be limited to those users i) who have exhausted the internal complaints process with the service provider against which they are making their complaint and ii) who have either a) suffered serious or sustained harm on the service or b) had their content repeatedly taken down. There should be an option in the Bill to extend the remit of the Ombudsman to lower risk providers. In addition to handling these complaints, the Ombudsman would, as part of its role, i) identify issues in individual companies and make recommendations to improve their complaint handling and ii) identify systemic industry-wide issues and make recommendations on regulatory action needed to remedy them. The Ombudsman should have a duty to gather data and information and report it to Ofcom. It should be an “eligible entity” to make super-complaints.
458.A duty of care in negligence law implies a right to take anyone who fails in that duty of care to court and seek compensation. As we discussed in paragraph 54 above, the duties of care in the Bill do not create individual liability between the user and the service provider which would allow users to sue for negligence. We asked our witnesses how users might be able to seek redress through the courts. Mr Perrin told us Carnegie UK Trust had rejected the idea of a statutory tort “because we felt it would not lead to a good regulatory outcome; it favours people who have resources to sue, and the courts do not work terribly quickly and are rather overloaded at the moment.” Dr Harbinja told us that the approach of the courts to negligence was not in legal terms a good fit with the duties in the Bill, and ran the risk of unintended consequences. Prof Wilson saw some benefits in an avenue of redress through the courts “because the civil courts have a history and an experience of evaluating emotional and psychological harm and awarding damages on that basis. In a sense, they are trained to do that.”
459.The UK GDPR allows an individual to take a data controller to court if they believe there has been a breach of data protection law in the handling of their personal data. The court may order compensation for emotional distress without the need for other damage, although a monetary award for other loss may be made, including loss of earnings as a result of exacerbation of a pre-existing serious mental health condition.
460.We believe that this Bill is an opportunity to reset the relationship between service providers and users. While we recognise the resource challenges both for individuals in accessing the courts and the courts themselves, we think the importance of issues in this Bill requires that users have a right of redress in the courts. We recommend the Government develop a bespoke route of appeal in the courts to allow users to sue providers for failure to meet their obligations under the Act.
461.Mr Russell, who campaigns to reduce suicide and self-harm in young people, told us of his distressing experiences in trying to access the digital data of his 14-year-old daughter Molly Russell, who tragically died in 2017. Digital data does not form part of a deceased person’s estate, and next of kin who want access have to apply to each tech company which each have their own processes for such requests. Mr Russell told us that a “typical tech company response” was:
The death certificate.
A court document that confirms you are the legal personal representative of the decedent.
Mr Russell’s distressing experiences are sadly not unique. We have also heard about the distress experienced by the bereaved parents of Frankie Thomas in trying to access their deceased daughter’s data.
462.Mr Russell also raised concerns about access to digital data for coroners and other investigatory and regulatory authorities. He asked the Committee to consider measures to ensure digital data is available to investigators and, above all, that other bereaved parents do not have to experience what his family has gone through in accessing the social media of their children.
463.Bereaved parents who are looking for answers to the tragic deaths of their children in their digital data should not have to struggle through multiple, lengthy, bureaucratic processes to access that data. We recognise that an automatic right to a child’s data would raise privacy and child safety concerns. At the same time, we believe there is more that could be done to make the process more proportionate, straightforward and humane. We recommend that the Government undertake a consultation on how the law, and services’ terms and conditions, can be reformed to give access to data to parents when it is safe, lawful and appropriate to do so. The Government should also investigate whether the regulator could play a role in facilitating co-operation between the major online service providers to establish a single consistent process or point of application.
464.We also recommend Ofcom, the Information Commissioner and the Chief Coroner review the powers of coroners to ensure that they can access digital data following the death of a child. We recommend the Government legislate, if it is required, to ensure that coroners are not obstructed by service providers when they require access to digital data. We recommend that guidance is issued to coroners and regulatory authorities to ensure they are aware of their powers in dealing with service providers and of the types of cases where digital data is likely to be relevant. Our expectation is that the Government will look to implement the outcomes of these consultations in the Bill during its parliamentary passage.
743 Written evidence from Ms. Daphne Keller (Director, Program on Platform Regulation at Stanford Cyber Policy Center) ();
744 Written evidence from Professor Clare McGlynn (Professor of Law at Durham University) (), p 7. See also written evidence from: Professional Players Federation (), p 3; Centenary Action Group, Glitch, Antisemitism Policy Trust, Stonewall, Women’s Aid, Compassion in Politics, End Violence Against Women Coalition, Imkaan, Inclusion London, The Traveller Movement () p 6; RSA (Royal Society for the Encouragement of Arts, Manufactures and Commerce) (), p 6
745 Written evidence from Refuge (), pp 8–9
746 Written evidence from Refuge (), pp 8–9
747 Written evidence from Refuge (), pp 8–9
748 Written evidence from Professor Clare McGlynn (Professor of Law at Durham University) (), p 8
749 See, for example: Facebook, ‘How do I appeal the removal of content on Facebook for copyright reasons?’: [accessed 1 December 2021]; Twitter, ‘Appeal an account suspension or locked account’: [accessed 1 December 2021]; Instagram, ‘I don’t think Instagram should have taken down my post’: [accessed 1 December 2021]
750 Written evidence from Electrical Safety First (), 6.6
751 Written evidence from 5Rights Foundation (), point 4
752 [Bill CP 405-EN]
753 Written evidence from Refuge (), p 28.
754 Written evidence from Parent Zone (), p 4. See Written evidence from Competition and Markets Authority (), point 5, which argues that the Draft Bill “risks inadvertently setting a lower standard of consumer protection on platforms for economic and financial harms than that already envisaged by current law, and established by the CMA’s enforcement work”.
755 Written evidence from LSE Department of Media and Communications ()
756 Written evidence from: RSA (Royal Society for the Encouragement of Arts, Manufactures and Commerce) (), p 6; Demos (), 48; Ms. Daphne Keller (Director, Program on Platform Regulation at Stanford Cyber Policy Center) ().
760 Statista, ‘United Kingdom: Facebook users 2021, by age group’: [accessed 1 December 2021]; Statista, ‘United Kingdom: monthly Instagram users 2018–2021’: [accessed 1 December 2021]
761 Ofcom, ‘Complain about the BBC’: [accessed 1 December 2021]
762 Written evidence from SWGfL ().
763 ‘Report Harmful Content’: [accessed 1 December 2021]
765 Written evidence from Department of Digital, Culture, Media & Sport ()
766 Draft Online Safety Bill, CP 405, May 2021, Clause 6 and Clause 18
767 Right to bring a claim for breach of the duties in the Bill
771 Data Protection Act 2018 ()
772 Data Protection Act 2018 () and Fieldfisher, ‘Article 82 UK GDPR’: [accessed 1 December 2021]
773 Grinyer v Plymouth Hospitals NHS Trust EWCA Civ 1043
774 Ian Russell, Molly Rose Foundation (); “decedent” is as in the original communication with Mr Russell.