440.As explored in Chapter 9, there is currently little transparency about decision-making or outcomes when users report issues to service providers.743
441.We heard compelling evidence from Prof McGlynn and others that this Bill “provides a valuable opportunity to strengthen individual protections against online violence and abuse” but that it currently falls short of what it might achieve in this area.744 Online violence and abuse can take many forms, and individuals who have been abused often find that their options to gain redress from service providers or from the courts are limited.745 We heard from Refuge that tech abuse, which is a form of domestic or intimate partner violence, can entail “hundreds of abusive messages … from perpetrators, often across multiple platforms”, each of which has to be flagged to the service provider individually.746 Survivors of tech abuse can wait weeks or months even to receive acknowledgement of their report from service providers.747 Those who have been subjected to image-based sexual abuse can find themselves failed by “out-of-date, confusing, piecemeal” laws, which can lead police officers to use “informal resolutions” that fail to capture image-based sexual abuse as a sexual offence and therefore fail to trigger the available criminal justice response.748
442.Under the draft Bill, services in scope must operate a complaints process that “provides for appropriate action to be taken by the provider of the service” and which is “easy to access”, “easy to use (including by children)” and “transparent” (Clause 15). Many large services already provide users with complaints procedures, alongside mechanisms to appeal takedown decisions.749 As such, it seems the clause is designed to improve service providers’ existing procedures, rather than to establish new ones. In that respect it differs, for example, from the EU Digital Services Act, which seeks to oblige digital marketplaces to set up complaint and redress mechanisms and out-of-court dispute settlement mechanisms.750 Yet the Bill does not currently seek to establish minimum quality standards for the operation of any of these processes.751 Nor does it establish how Ofcom will assess how easy to access, easy to use, or transparent a service provider’s reporting or complaints process is.
443.The Bill should establish proportionate minimum standards for the highest risk providers’ reports, complaints, and redress mechanisms as set out in a mandatory code of practice prepared by Ofcom.
444.We recommend a requirement on the face of the Bill for Ofcom to set out: i) how they will assess the a) ease of use; b) accessibility and c) transparency of a service’s complaints process for d) adults; e) children; f) vulnerable adults; and g) disabled people; ii) what steps Ofcom will be able to take if it finds any of these processes wanting; and iii) how Ofcom will ensure that requirements to operate complaint, reporting and redress mechanisms are proportionate for smaller in-scope providers.
445.Clause 15(3)(c) should be amended so that it reads “is easy to access, including for disabled people and those with learning difficulties”.
446.Providers of the highest risk services should have to give quarterly statistics to Ofcom on:
i)Number of user reports received;
ii)User reports broken down by the reason the report was made;
iii)Number of actionable user reports;
iv)Actionable user reports broken down by the reason the report was made;
v)How long it took the service provider to respond to i) all user reports; ii) actionable user reports;
vi)What response was made to actionable user reports;
vii)Number of user complaints received;
viii)Number of actionable user complaints;
ix)How long it took the service provider to respond to i) all user complaints; ii) actionable user complaints;
x)What response was made to actionable user complaints;
xi)How many pieces of user content were taken down;
xii)How many pieces of content that were taken down were later reinstated;
xiii)The grounds on which content that was reinstated was reinstated;
xiv)How long it took the service provider to reinstate a piece of content that was later reinstated.
447.The Bill mandates that user-to-user services must provide opportunities for individuals to make complaints, provides for service providers to appeal decisions made by Ofcom (Clause 104) and Ofcom notices (Clause 105), and provides for super-complaints to be made to Ofcom (Clause 106).
448.The explanatory notes which accompanied the draft Bill explained that “a body representing the interests of UK users of regulated services, or members of the public can make a super-complaint to Ofcom about any feature of one or more regulated services, or the conduct of one or more providers of such services.”752 The super-complaints measure is useful for reporting multiple and widespread suspected breaches, for example the widespread bullying or abuse of a class or group. It also supports the transparency objective by enabling group complaints against powerful companies. However, a super-complaint may only be made if “the complaint is of particular importance” or if “the complaint relates to [or] impacts on a particularly large number of users of the service or members of the public” (Clause 106(2)). As such, there is no right for an individual to seek external redress under the Bill as it is currently drafted.
449.We heard four principal arguments in favour of a new external appeals mechanism for individuals once internal routes have been exhausted. Firstly, that it would provide a source of redress to victims of online abuse of the kind described above once they have exhausted a service provider’s internal complaints process.753 Secondly, we heard arguments for the addition of an appeals mechanism on the grounds of consumer protection.754 Thirdly, that it would empower users if they were consulted on mechanisms for redress, whilst addressing the “current imbalance between democratic ‘people’ power and the power of platforms”.755 Finally, we received submissions which argued there should be an external appeals process to prevent over-enforcement and to protect the freedom of expression of individuals who feel their content has been unfairly removed or demoted.756
450.Our proposed external redress process would not replace service providers’ internal processes or run concurrently to them, nor would it address individual complaints about individual pieces of content or interactions. Rather, for a victim of sustained and significant online harm, someone who has been banned from a service or who has had their posts repeatedly and systematically removed, this new redress mechanism would provide an additional body to which they could appeal those decisions after they had come to the end of a service provider’s internal process.
451.In order for an external redress process to work, clear direction is needed in the Bill about Ofcom’s responsibility to set quality standards for service providers’ internal complaints procedures, and in relation to complaints about failures to meet those standards. We hope that the Government will consider our recommendations in this area, and that by improving the quality of service providers’ internal complaints procedures, any system of external redress will be needed only rarely and for the most serious cases.
452.Ofcom has said that if they were the body designated to take individual complaints “that could not just overwhelm us in volume but conflict a bit with the role that the Bill gives us, which is a strategic one looking at overall systems and processes” and that they would prefer individual complaints to be handled by a specially appointed Ombudsman.757 The ICO were similarly cautious about making Ofcom responsible for handling individual complaints: “if individual complaints could come to a different organisation, that might be a way to go, and then Ofcom could learn from the experience of those individuals.”758
453.Our hope is that by improving the quality of service providers’ complaints procedures, the burden on any external redress process will be lessened, but a stronger internal process is no substitute for the rigour of independent oversight. Nevertheless, one of the primary challenges to establishing a redress mechanism for individuals is the high number of potential claimants. The ICO cautioned: “imagine the millions of complaints for take-down requests that might go to an organisation such as Ofcom.”759 There are over 48.5 million Facebook users in the UK, and around 28.8 million on Instagram.760 However, we note that many external redress systems only apply once the internal process has been exhausted. For example, Ofcom only intervenes once a complainant has exhausted the BBC’s internal process.761 Furthermore, we note that the advent of the General Data Protection Regulation (GDPR) did not bring about an overwhelming surge of cases for breach of data protection law. This may be for a number of reasons, including companies preferring to settle claims, potential litigants being unaware of their rights, and/or litigants being put off by the prospect of making a claim for a relatively low level of damages. We envisage the external complaints process as a last resort for users who have suffered serious harm on services. It is an important step towards greater transparency and clarity in service providers’ moderation decisions, the value of which greatly outweighs the potential for misuse by users.
454.We note with interest South West Grid for Learning’s Report Harmful Content, which offers online users an opportunity to report harmful online content, as well as an impartial dispute resolution service for users who have exhausted a service’s internal complaint procedures.762 We suggest that the Department look to Report Harmful Content as a potential model for what such an Ombudsman could look like.763
455.The Secretary of State was reluctant to commit to introducing an Ombudsman:
“I know that [Dame] Melanie at Ofcom raised a point about an ombudsman, which is a slow and onerous process. We do not want to get into that. We want to get into making platforms behave responsibly as quickly as possible, under a legal framework, and that is what we are focused on.”764
In a letter to the Committee, the Government stated that:
“Although Ofcom will not investigate or arbitrate on individual complaints (owing to the likelihood of becoming overwhelmed by sheer volume), it will be possible for individuals to submit complaints to Ofcom. Ofcom will use aggregate data from user complaints to inform its horizon scanning, research supervision and enforcement activity.”765
456.We support the Government’s ambition to make service providers behave responsibly, and by agreeing our recommendations the requirements of the Bill will bring about better responses from service providers to user complaints. However, the fact remains that service providers’ user complaints processes are often obscure, undemocratic, and without external safeguards to ensure that users are treated fairly and consistently. It is only through the introduction of an external redress mechanism that service providers can truly be held to account for their decisions as they impact individuals.
457.The role of the Online Safety Ombudsman should be created to consider complaints about actions by higher risk service providers where either moderation or failure to address risks leads to significant, demonstrable harm (including to freedom of expression) and recourse to other routes of redress has not resulted in a resolution. The right to complain to this Ombudsman should be limited to those users i) who have exhausted the internal complaints process with the service provider against which they are making their complaint and ii) who have either a) suffered serious or sustained harm on the service or b) had their content repeatedly taken down. There should be an option in the Bill to extend the remit of the Ombudsman to lower risk providers. In addition to handling these complaints, the Ombudsman would, as part of its role, i) identify issues in individual companies and make recommendations to improve their complaint handling and ii) identify systemic industry-wide issues and make recommendations on regulatory action needed to remedy them. The Ombudsman should have a duty to gather data and information and report it to Ofcom. It should be an “eligible entity” to make super-complaints.
458.A duty of care in negligence law implies a right to take anyone who fails in that duty of care to court and seek compensation. As we discussed in paragraph 54 above, the duties of care in the Bill do not create individual liability between the user and the service provider which would allow users to sue for negligence.766 We asked our witnesses how users might be able to seek redress through the courts. Mr Perrin told us Carnegie UK Trust had rejected the idea of a statutory tort767 “because we felt it would not lead to a good regulatory outcome; it favours people who have resources to sue, and the courts do not work terribly quickly and are rather overloaded at the moment.”768 Dr Harbinja told us that the approach of the courts to negligence was not in legal terms a good fit with the duties in the Bill, and ran the risk of unintended consequences.769 Prof Wilson saw some benefits in an avenue of redress through the courts “because the civil courts have a history and an experience of evaluating emotional and psychological harm and awarding damages on that basis. In a sense, they are trained to do that.”770
459.The UK GDPR allows an individual to take a data controller to court if they believe there has been a breach of data protection law in the handling of their personal data.771 The court may order compensation for emotional distress without the need for other damage, although a monetary award for other loss may be made772 including loss of earnings as a result of exacerbation of a pre-existing serious mental health condition.773
460.We believe that this Bill is an opportunity to reset the relationship between service providers and users. While we recognise the resource challenges both for individuals in accessing the courts and for the courts themselves, we think the importance of the issues in this Bill requires that users have a right of redress in the courts. We recommend the Government develop a bespoke route of appeal in the courts to allow users to sue providers for failure to meet their obligations under the Act.
461.Mr Russell, who campaigns to reduce suicide and self-harm in young people, told us of his distressing experiences in trying to access the digital data of his 14-year-old daughter Molly Russell, who tragically died in 2017. Digital data does not form part of a deceased person’s estate, and next of kin who want access have to apply to each tech company which each have their own processes for such requests. Mr Russell told us that a “typical tech company response” was:
The death certificate.
A court document that confirms you are the legal personal representative of the decedent.774
Mr Russell’s distressing experiences are sadly not unique. We have also heard about the distress experienced by the bereaved parents of Frankie Thomas in trying to access their deceased daughter’s data.
462.Mr Russell also raised concerns about access to digital data for coroners and other investigatory and regulatory authorities. He asked the Committee to consider measures to ensure digital data is available to investigators and, above all, that other bereaved parents do not have to experience what his family has gone through in accessing the social media of their children.
463.Bereaved parents who are looking for answers to the tragic deaths of their children in their digital data should not have to struggle through multiple, lengthy, bureaucratic processes to access that data. We recognise that an automatic right to a child’s data would raise privacy and child safety concerns. At the same time, we believe there is more that could be done to make the process more proportionate, straightforward and humane. We recommend that the Government undertake a consultation on how the law, and services’ terms and conditions, can be reformed to give access to data to parents when it is safe, lawful and appropriate to do so. The Government should also investigate whether the regulator could play a role in facilitating co-operation between the major online service providers to establish a single consistent process or point of application.
464.We also recommend Ofcom, the Information Commissioner and the Chief Coroner review the powers of coroners to ensure that they can access digital data following the death of a child. We recommend the Government legislate, if it is required, to ensure that coroners are not obstructed by service providers when they require access to digital data. We recommend that guidance is issued to coroners and regulatory authorities to ensure they are aware of their powers in dealing with service providers and of the types of cases where digital data is likely to be relevant. Our expectation is that the Government will look to implement the outcomes of these consultations in the Bill during its parliamentary passage.
743 Written evidence from Ms. Daphne Keller (Director, Program on Platform Regulation at Stanford Cyber Policy Center) (OSB0057); Q 53
744 Written evidence from Professor Clare McGlynn (Professor of Law at Durham University) (OSB0014), p 7. See also written evidence from: Professional Players Federation (OSB0035), p 3; Centenary Action Group, Glitch, Antisemitism Policy Trust, Stonewall, Women’s Aid, Compassion in Politics, End Violence Against Women Coalition, Imkaan, Inclusion London, The Traveller Movement (OSB0047) p 6; RSA (Royal Society for the Encouragement of Arts, Manufactures and Commerce) (OSB0070), p 6
748 Written evidence from Professor Clare McGlynn (Professor of Law at Durham University) (OSB0014), p 8
749 See, for example: Facebook, ‘How do I appeal the removal of content on Facebook for copyright reasons?’: https://en-gb.facebook.com/help/194353905193770 [accessed 1 December 2021]; Twitter, ‘Appeal an account suspension or locked account’: https://help.twitter.com/forms/general [accessed 1 December 2021]; Instagram, ‘I don’t think Instagram should have taken down my post’: https://www.facebook.com/help/instagram/280908123309761 [accessed 1 December 2021]
752 Explanatory Notes to the draft Online Safety Bill [Bill CP 405-EN]
754 Written evidence from Parent Zone (OSB0124), p 4. See Written evidence from Competition and Markets Authority (OSB0160), point 5, which argues that the Draft Bill “risks inadvertently setting a lower standard of consumer protection on platforms for economic and financial harms than that already envisaged by current law, and established by the CMA’s enforcement work”.
756 Written evidence from: RSA (Royal Society for the Encouragement of Arts, Manufactures and Commerce) (OSB0070), p 6; Demos (OSB0159), 48; Ms. Daphne Keller (Director, Program on Platform Regulation at Stanford Cyber Policy Center) (OSB0057).
760 Statista, ‘United Kingdom: Facebook users 2021, by age group’: https://www.statista.com/statistics/1030055/facebook-users-united-kingdom [accessed 1 December 2021]; Statista, ‘United Kingdom: monthly Instagram users 2018–2021’: https://www.statista.com/statistics/1018494/instagram-users-united-kingdom [accessed 1 December 2021]
761 Ofcom, ‘Complain about the BBC’: https://ofcomforms.secure.force.com/formentry/SitesFormBBCIntroductory?complaintType=SitesFormBBCOnlineMaterial [accessed 1 December 2021]
763 ‘Report Harmful Content’: https://reportharmfulcontent.com/?lang=en [accessed 1 December 2021]
766 Draft Online Safety Bill, CP 405, May 2021, Clause 6 and Clause 18
767 Right to bring a claim for breach of the duties in the Bill
771 Data Protection Act 2018, section 167(1)
772 Data Protection Act 2018, section 168(1) and Fieldfisher, ‘Article 82 UK GDPR’: https://ukgdpr.fieldfisher.com/chapter-8/article-82-gdpr/ [accessed 1 December 2021]
773 Grinyer v Plymouth Hospitals NHS Trust [2012] EWCA Civ 1043