276.Balancing people’s right to freedom of expression with online safety was one of the most controversial subjects in our inquiry. The draft Bill attempts to tackle this in part by the inclusion of Clause 12, a duty on service providers to “have regard to the importance of protecting users’ right to freedom of expression within the law, and protecting users from unwarranted infringements of privacy, when deciding on, and implementing, safety policies and procedures.”472 This duty applies to all service providers, with additional responsibilities placed on Category 1 providers, who must carry out impact assessments and specify how they fulfil this duty in their terms of service.473 Clause 23 places a similar duty on search services.474
277.Article 10 of the ECHR, incorporated into UK law by the Human Rights Act 1998, states:
“1. Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers. This Article shall not prevent States from requiring the licensing of broadcasting, television or cinema enterprises.”475
Article 10 is not an absolute right but may be restricted by the state “in the interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.”476 In Handyside v United Kingdom, the European Court of Human Rights confirmed that Article 10 includes the right to say things that “offend, shock or disturb the State or any sector of the population”.477
278.Any restriction must be “prescribed by law”,478 “necessary in a democratic society”,479 and proportionate. We have heard concerns that the draft Bill does not fulfil these criteria, particularly in its definition of harm to adults. Mr d’Ancona told us:
“I think that with words like “harm” and “safety” there is a slippage or a kind of semantic mission creep going on in their use. We used to talk about safety, and what we really meant was physical safety. Now, when people talk about safety, they often mean convenience or comfort. It is not the task of democratic legislators to make people feel comfortable. I think that is stretching the job description.”480
279.Since the publication of the White Paper, several rights organisations have warned of the “chilling effect” the legislation will have on freedom of expression online.481 This encompasses both censorship through the moderation and removal of content, as discussed in Chapter 2, and the self-censorship that may follow when people are unsure what they can and cannot say online. We have also heard that, due to the volume of content online, the only way service providers can approach their duties is by using AI, and that this risks “overzealous removal of legitimate speech and the limiting of freedom of expression.”482 The size of the penalties, and the potential for criminal liability for managers, have also been highlighted as potential causes of excessive censorship. Open Rights Group said:
“It will create a culture of fear which results in a phenomenon known as “collateral censorship”, where service providers and companies feel they have no choice but to take down vast swathes of content which may be perfectly legal, perfectly subjective, and perfectly harmless, lest they face sanctions, penalties, and even personal arrests for getting it wrong.”483
280.At the same time, we heard that inaction and a lack of regulation on online safety are already harming people’s freedom of expression, particularly that of marginalised groups. Ms Jankowicz told us:
“The idea [of online abuse] is to quash women’s right to freedom of expression. It is to take them out of the public eye, because women, according to the purveyors of this abuse, do not deserve to be there. We need to stand up to that. We need to protect that right to work, that right to freedom of expression. That is at the core of this misogynistic abuse.”484
Hope Not Hate said:
“If done properly, the inclusion of legal but harmful content within the scope of this legislation could dramatically increase the ability for a wider range of people to exercise their free speech online by increasing the plurality of voices on platforms, especially from minority and persecuted communities.”485
281.Prof Haidt told us that “freedom of speech is not freedom of reach”, and that instead of looking at taking down individual pieces of content, service providers should focus on reducing amplification.486 Ms Carlo said she remained uneasy about this, as it would still involve suppression of legal content.487 Ms Zhang also urged caution, as it could set an “unfortunate precedent” that could be copied by authoritarian countries and used to shut down protest.488 However, Maria Ressa told us “we definitely need legislation”, adding that “doing nothing pushes the world closer to fascism.”489
282.While the intention behind Clause 12 (and Clause 23 for search services) has been welcomed, many have noted that its phrasing seems weak by comparison with the other duties placed on service providers, particularly the requirement only to “have regard to” the importance of protecting people’s rights. Glassdoor said: “There is a strong risk that process-based safety duties will win out when companies are faced with the task of determining where the greater regulatory risk lies.”490 The Adam Smith Institute also felt that the duty in Clause 12 is “overwritten” by the safety duties, which they describe as “extraordinarily broad and threatening”.491 Mr Millar noted that the draft Bill placed competing duties on service providers without providing guidance on how they could be balanced.492
283.We asked why the Government had chosen this phrasing, over something stronger such as “ensuring actions are consistent with”. DCMS Minister Chris Philp MP told us:
“‘consistent with’, and other similar formulations, might suggest that service providers owe a duty to their users under the ECHR or Human Rights Act 1998. The ECHR only imposes obligations in relation to freedom of expression on public bodies, and private actors are not required to uphold freedom of expression.”493
The ECHR does, however, apply to the UK Government and to Ofcom, so their directions to service providers must still comply with Article 10.
284.We propose a series of recommendations throughout this report to strengthen protection for freedom of expression. These include greater independence for Ofcom, routes for individual redress beyond service providers, tighter definitions around content that creates a risk of harm, a greater emphasis on safety by design, a broader requirement for consistency in the application of terms of service, stronger minimum standards and mandatory codes of practice set by Ofcom (which is required to comply with human rights law), and stronger protections for news publisher content. We believe these will be more effective than adjustments to the wording of Clause 12.
285.Journalism and political debate are fundamental aspects of freedom of expression in a democratic society and receive a higher level of protection than everyday speech under the ECHR.494 Following concerns raised in the White Paper consultation about the impact of regulation on freedom of expression, and on media freedom in particular, the draft Bill places specific duties on Category 1 providers aimed at preventing excessive moderation of journalism or content of democratic importance. While these exemptions have been broadly welcomed, we have heard concerns about their definitions and about service providers’ ability to apply them consistently.
286.The draft Bill attempts to protect journalism and media freedom in two ways. The first is by placing “news publisher content” outside of scope entirely, including on publishers’ own websites, in search results and when it is shared on user-to-user services, provided it is shared in full and without editing.495 “Recognised news publishers” are defined by Clause 40, which requires, among other things, that they produce “news-related material” subject to editorial control, operate a complaints procedure, have a registered business address in the UK, and be subject to a standards code.496 Comments on a news publisher’s site are also exempt, by means of the “limited functionality” exemption.497
287.Witnesses such as Hacked Off and the Independent Media Association disagreed with the definition used for “recognised news publishers”, feeling it could leave content that creates a risk of harm outside of scope while failing to protect independent publishers, who may not have a registered office.498 Hacked Off gave examples of websites that may qualify for the exemption despite publishing racist stories and conspiracy theories.499 IMPRESS suggested that the requirements for a standards code and complaints procedure, which distinguish the draft Bill’s definition from that of a “relevant publisher” in the Crime and Courts Act 2013, offer no extra protection, as they can be set by the publishers themselves with no minimum criteria.500 The Professional Publishers Association called for the “news-related material” requirement to be broadened to include consumer magazines and business media, which may currently fall outside the definition if they do not focus on current affairs.501
288.A concern we heard from news organisations was that it is “news publisher content” that is exempt under Clause 39, rather than news publishers’ websites in their entirety being explicitly placed outside of scope. While the limited functionality exemption covers below-the-line comments, the News Media Association told us they were concerned because that exemption can be repealed by the Secretary of State, and because the inclusion of other features such as games or online workshops may bring a news publisher’s site into scope of the regulations.502 The National Union of Journalists took a different view on the exclusion of comments sections:
“Material that doesn’t pass the editorial or legal threshold for other published material—as abuse, threats and defamatory content clearly does not—should not be publishable on the sites of media outlets in ‘below the line’ commentary dressed up as reader engagement.”503
We heard in evidence that newspapers can and have been held liable for comments on their own sites, that they already risk assess and apply moderation techniques, and that the Independent Press Standards Organisation has regulatory oversight.504
289.Ofcom noted in its recent report on the future of media plurality that “online intermediaries and their algorithms control the prominence they give to different news sources and stories” and the basis on which they “serve news via their algorithms is not sufficiently transparent.”505 These were identified as risks to media plurality in the UK that are not captured under the existing regulatory framework. We have heard throughout this inquiry about the power the biggest service providers hold in controlling the news and information people see, and DMG Media detail in their written evidence the impact search engine control has on the visibility of different news sites.506 We heard powerful evidence from Ms Ressa, Ms Zhang and Ms Haugen about the influence social media companies have both in the UK and abroad, particularly in the global South, as they monopolise the news.507 On the other hand, Dr Martin Moore, Senior Lecturer in Political Communication Education and Director of the Centre for the Study of Media, Communication and Power at King’s College London, discussed how identifying recognised news publishers may have unwanted consequences by creating a list of statutory-recognised news publishers.508
290.Work is ongoing in this area with the creation of the Digital Markets Unit (DMU) within the Competition and Markets Authority (CMA) to promote competition,509 with powers to be provided by a Digital Competition Bill. DMG Media told us that, ahead of the establishment of the DMU’s regulatory powers, it was important that the online safety legislation included a full exemption for news publishers, to prevent services such as Google using it to discriminate against “those it does not favour”.510
291.We recommend that Ofcom be required to produce an annual report on the impact of regulated services on media plurality.
292.In addition to the exemption for recognised news publishers, Clause 14 places a duty on Category 1 providers to protect journalistic content by using “systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account” when making moderation decisions.511 This will apply beyond recognised news publishers and cover citizen journalism. Service providers must also establish a dedicated and expedited complaints procedure so that journalists may appeal when content is moderated or removed, and content must be swiftly reinstated if complaints are upheld.512 Journalistic content is defined as either news publisher content, or regulated content that is “generated for the purposes of journalism”, and UK-linked.513 During our inquiry, we have heard concerns both about the definitions used and the duty itself.
293.While there are evidently some concerns about the “news publisher” definition, we heard that “journalistic content” was much harder to apply, with providers having to determine whether content had been generated for the “purposes of journalism”. The House of Lords Communications and Digital Committee thought that the inclusion of citizen journalism could overwhelm the appeals system and recommended that citizen journalism should be clearly defined. The Government said in its response that further clarification was not necessary and that it was intended to be interpreted broadly to include “content produced by individuals, freelancers and others.”514 Facebook told us the broad definition could be abused by those claiming to be citizen journalists “to ensure their content is given protections.”515 Mr Millar offered a definition: “content by which the user who generates it disseminates information and ideas to the public (or a section of the public) which they reasonably perceive to be of public interest.”516
294.The duty to protect journalistic content does not mean such content cannot be removed in any circumstance. Clause 14 states that Category 1 services must ensure “the importance of the free expression of journalistic content is taken into account” when making moderation decisions and provide an expedited complaints procedure with swift reinstatement of content when appeals are upheld. It does not mandate that all such appeals be upheld simply because the content is journalistic—providers may still remove content they judge to create a risk of harm that breaks their terms of service. In the draft Bill they will be required to make their policies clear and enforce them consistently, and it will be for Ofcom to determine if they are balancing the application of their duties correctly.
295.We have heard concern from news publishers that, while their content is placed outside of scope, and Clause 14 seemingly offers further protection, there is not a strong enough disincentive to stop providers from removing it at all. Peter Wright, Editor Emeritus at DMG Media, told us he feared that news content would continue to be caught by automated moderation because algorithms are a “blunt instrument”, and Alison Gow, President of the Society of Editors, added that by the time the issue is caught by human moderators, it is often too late.517 Ms Haugen gave us the example that “76 per cent of counterterrorism speech in an at-risk country was getting flagged as terrorism and taken down” and that “any system where the solution is AI is a system that is going to fail.”518 Mr Perrin told us that since traditional print media is already self-regulated, “introducing a new layer when one is trying to regulate in a complex new sector was always going to be counterproductive.”519 Ms Ressa warned that algorithmic design and the “incentive scheme” of the internet was already pushing people away from quality journalism “towards clickbait”, and told us “the news agenda needs to be protected.”520 We heard that the “perishable” nature of news meant that even an expedited complaints system may be too slow, and that providers should instead be required not to restrict access to news publisher content.521 We asked the Government for their views on this:
“‘Positive requirements’, which actively prevent social media companies from removing any news publisher content, regardless of whether they consider it to comply with their terms of service or to be suitable for their audience, carry significant risk. This approach would constitute a significant interference with private companies’ ability to set their own terms and conditions regarding legal content. Moreover, it could create perverse outcomes if companies were prevented from removing this type of content in all circumstances.”522
296.Alongside the protections for journalism, the draft Bill contains a duty to protect “content of democratic importance”. Clause 13 is similar to Clause 14’s protections for journalism, in that providers must ensure their systems and processes are “designed to ensure that the importance of free expression of content of democratic importance is taken into account” when making moderation decisions or taking action against users.523 It requires clear and transparent policies to be published and providers must ensure they are applied equally across a “diversity of political opinion”, but unlike the protection for journalists there is no dedicated complaints route to appeal decisions, so people would need to use the standard complaints process. The definition encompasses news publisher content, and regulated content that “is or appears to be specifically intended to contribute to democratic political debate in the United Kingdom or a part or area of the United Kingdom.”524
297.Legal to Say, Legal to Type said the exemption, alongside that for journalism, creates a two-tier system, with “free speech for journalists and politicians, and censorship for ordinary citizens.”525 They queried whether individuals could claim protection for a broad spectrum of content that creates a risk of harm if they have stood for office, a concern also reflected by Mr Stone, who gave the example of misogynistic abuse of a Women’s Equality Party candidate by another candidate.526 It should be noted that, as with the journalism exemption, Clause 13 does not mean content cannot be removed at all when judged to create a risk of harm, and illegal content would still be required to be removed. It may make service providers more cautious about removal, as it is intended to, but some worry that the broad definitions of this and the journalism exemption create loopholes that may undermine confidence in the legislation.527 Facebook said that “private companies should not be the arbiters of what constitutes journalism or what is democratically important.”528
298.The Explanatory Notes give as examples “content promoting or opposing government policy and content promoting or opposing a political party.”529 The House of Lords Communications and Digital Committee felt that:
“The definition of ‘content of democratic importance’ in the draft Bill is too narrow. It should be expanded to ensure that contributions to all political debates—not only those debates which are about, or initiated by, politicians and political parties, and about policy, rather than social change—would be covered.”530
In its response, the Government said that the definition “covers all political debates, including where these are advanced by grassroots campaigns and smaller parties” and that the measures were designed “to protect content of democratic importance, rather than to protect specific actors.”531 It was also noted in our session with ministers that currently there are no protections in place for such speech and journalistic content, with Mr Philp stating on the subject of the definitions that “while I am sure they are not perfect and we can try to improve them, currently there is nothing at all.”532
299.In their response to the House of Lords Communications and Digital Committee, the Government said that when deciding how to balance the safety duties with protecting democratic content “platforms will need to consider whether the public interest in seeing some types of content outweighs the risk of harm it creates, or vice versa.”533 We heard providers sometimes censor what can be discussed elsewhere, including content that could be “of great public importance”. Ms Carlo gave the example of discussions around the origins of COVID-19 being removed from social media, which would have been allowed in mainstream newspapers.534
300.The Government has said these protections are designed to protect content rather than specific actors. However, a great deal of the evidence we heard centred on people’s understanding of what a journalist may be, and on who is entitled to speak about matters of democratic importance. On this subject, responding to a question on whether someone’s standing as a political candidate would give their speech unwarranted protection, Mr Millar suggested that the person’s identity was of less importance than whether what they were saying was in the public interest.535 He elaborated in writing:
“In the domestic and international law of free speech it is well established that speech on matters of public interest in a democratic society is deserving of the strongest protection. Political speech is the paradigm example of this. Journalism on such matters is also particularly strongly protected.
But speech on matters of public interest in a democratic society is a flexible category of speech. It is not closed. It covers more than just political speech/speech about the activities of government and/or journalism on such matters.”536
We also heard in our roundtable event on freedom of expression on 3 November that “public interest” may be a term that is better understood than the novel definitions of “journalistic content” and “content of democratic importance” and would more easily catch discussions on topics that may not be subject to high-level political discussion.537
301.This term has been used by the Law Commission in developing their proposals for a harm-based offence, which the Government have indicated they are minded to implement.538 One test that would need to be met is the defendant lacking a “reasonable excuse”, and in deciding if this has been met “the court must have regard to whether the communication was or was meant as a contribution to a matter of public interest.”539 It is already used in relation to balancing the right of an individual to privacy with the freedom of the press, in whistle-blowing protections, and in the Freedom of Information Act, as well as the ECHR and Human Rights Act 1998, to which Ofcom is subject.540 Given the precedent for use of this term more generally, and its likely role in determining whether a new harm-based offence has been committed, we wrote to the Government to seek their views on using this instead of the novel definitions in Clauses 13 and 14. They said in response:
“ … given the complexities of defining what the ‘public interest’ is, there may be concerns about requiring private companies to define what types of content are in the public interest. Our existing approach sets out more precisely the types of content that the government believes it is particularly important to protect.”541
302.While determining the public interest carries some of the inherent difficulties of asking providers to identify journalistic or democratic content, guidance on applying public interest tests is already available because of their use elsewhere. The ICO, for example, provides extensive guidance for public bodies on applying the public interest test to Freedom of Information requests.542
303.Clauses 13 and 14 apply only to Category 1 providers, which are subject to the extra duty to address content that is harmful to adults; that duty could carry a greater risk of censorship than the duties around illegal content.
304.We recommend that the news publisher content exemption be strengthened to include a requirement that news publisher content should not be moderated, restricted or removed unless it is content the publication of which clearly constitutes a criminal offence, or which has been found to be unlawful by order of a court within the appropriate jurisdiction. We recommend that the Government look at how bad actors can be excluded from the concept of news publisher. We suggest that they may wish to exclude those that have been repeatedly found to be in breach of the Ofcom Broadcasting Code, or publications owned by foreign Governments. Ofcom should also examine the use of new or existing registers of publishers. We are concerned that some consumer and business magazines, and academic journals, may not be covered by the Clause 40 exemptions. We recommend that the Department consult with the relevant industry bodies to see how the exemption might be amended to address this, without creating loopholes in the legislation.
305.The draft Bill already makes a distinction between “news publisher content” and citizen journalism, in recognition that the former is subject to editorial control and there are existing mechanisms for accountability. There is also a clear difference between the categories, as one is based on “who” is sharing the content, and the other focuses on the purpose of the content, rather than the identity of those behind it. For both citizen journalism and content of democratic importance, the justification for special consideration appears to be that they are in the public interest to be shared. This should therefore be key to any final definition and providers will require guidance as to how to balance the risk of harm with the public interest. It is not, nor is it intended to be, a blanket exemption in the same way as that for news publisher content, but a counterbalance to prevent overzealous moderation, particularly in borderline cases.
306.Our recommendations to narrowly define content that is harmful to adults by reference to existing law should provide some of the extra clarity service providers need to help protect freedom of expression. At the same time, journalism and content of democratic importance have long been recognised as vital in a democratic society and should be given specific consideration and protection by providers, who have significant influence over the information we see. However, we have heard concerns about the definitions used and about providers’ ability to interpret and apply them consistently. We feel that “democratic importance” may be both too broad, creating a loophole to be exploited by bad actors, and too narrow, excluding large parts of civil society. Similarly, we are concerned that any definition of journalistic content designed to capture citizen journalism would be so broad it would render consistent application of the requirement almost impossible, and would see the expedited complaints route overwhelmed by people claiming without merit to be journalists in order to have their content reinstated. “Public interest” might be more useful in ensuring that content and activity is judged on its merit, rather than its author.
307.We recommend that the existing protections around journalistic content and content of democratic importance be replaced by a single statutory requirement to have proportionate systems and processes to protect ‘content where there are reasonable grounds to believe it will be in the public interest’. Examples of content likely to be in the public interest would include journalistic content, contributions to political or societal debate, and whistleblowing. Ofcom should produce a binding Code of Practice on the steps to be taken to protect such content and guidance on what is likely to be in the public interest, based on its existing experience and case law. This should include guidance on how appeals can be swiftly and fairly considered. Ofcom should provide guidance to companies in cases of systemic, unjustified take-down of content that is likely to be in the public interest; such take-down would amount to a failure to safeguard freedom of expression as required by the objectives of the legislation.
472 Draft Online Safety Bill, CP 405, May 2021, Clause 12(2)
473 Draft Online Safety Bill, CP 405, May 2021, Clause 12(3), (5)
474 Draft Online Safety Bill, CP 405, May 2021, Clause 23
475 Council of Europe: European Court of Human Rights, European Convention on Human Rights (August 2021) p 12: https://www.echr.coe.int/documents/convention_eng.pdf [accessed 22 November 2021]
476 Council of Europe: European Court of Human Rights, European Convention on Human Rights (August 2021) p 12: https://www.echr.coe.int/documents/convention_eng.pdf [accessed 22 November 2021]
477 European Court of Human Rights, Handyside v United Kingdom (December 1976): https://www.bailii.org/eu/cases/ECHR/1976/5.html [accessed 7 December 2021]
478 Foreseeability is inherent to fulfilling the ‘prescribed by law’ requirement in Article 10 cases: Sunday Times v United Kingdom (1979) 2 EHRR 245
479 “Necessary” has been strongly interpreted: it is not synonymous with “indispensable”, neither has it the flexibility of such expressions as “admissible”, “ordinary”, “useful”, “reasonable” or “desirable”: Handyside v United Kingdom (1976) 1 EHRR 737, 754, para 48; R v Shayler (2002) UKHL 11, (2003) 1 AC 247, per Lord Bingham, para 23
481 For example: Big Brother Watch, Big Brother Watch’s Response to the Online Harms White Paper Consultation (July 2019): https://bigbrotherwatch.org.uk/wp-content/uploads/2020/02/Big-Brother-Watch-consultation-response-on-The-Online-Harms-White-Paper-July-2019.pdf [accessed 22 November 2021]
494 On ‘political speech’ see R v BBC, ex p ProLife Alliance (2003) UKHL 23
495 Draft Online Safety Bill, CP 405, May 2021, Clause 39(8)
496 Draft Online Safety Bill, CP 405, May 2021, Clause 40(2)
497 Draft Online Safety Bill, CP 405, May 2021, Schedule 1, part 5
505 Ofcom, The future of media plurality in the UK (November 2021) p 1: https://www.ofcom.org.uk/__data/assets/pdf_file/0019/228124/statement-future-of-media-plurality.pdf [accessed 30 November 2021]
509 Competition and Markets Authority, ‘Digital Markets Unit’: https://www.gov.uk/government/collections/digital-markets-unit [accessed 23 November 2021]
511 Draft Online Safety Bill, CP 405, May 2021, Clause 14(2)
512 Draft Online Safety Bill, CP 405, May 2021, Clause 14(3),(4),(5)
513 Draft Online Safety Bill, CP 405, May 2021, Clause 14(8). ‘UK-linked’ means the UK is a target market, or the content is or is likely to be of interest to a significant number of United Kingdom users.
514 Department for Digital, Culture, Media and Sport, Government response to the House of Lords Communications Committee’s report Freedom of Expression in the Digital Age (October 2021): https://committees.parliament.uk/publications/7704/documents/80449/default/ [accessed 30 November 2021]
523 Draft Online Safety Bill, CP 405, May 2021, Clause 13(2)
524 Draft Online Safety Bill, CP 405, May 2021, Clause 13(5)
529 Explanatory Notes to the draft Online Safety Bill [Bill CP 405-EN], para 95
530 Communications and Digital Committee, Free for All? Freedom of Expression in the Digital Age (1st Report, Session 2021–22, HL Paper 54), para 80
531 Department for Digital, Culture, Media and Sport, Government response to the House of Lords Communications Committee’s report Freedom of Expression in the Digital Age (October 2021), para 7–8: https://committees.parliament.uk/publications/7704/documents/80449/default/ [accessed 9 December 2021]
533 Department for Digital, Culture, Media and Sport, Government response to the House of Lords Communications Committee’s report Freedom of Expression in the Digital Age (October 2021), para 9: https://committees.parliament.uk/publications/7704/documents/80449/default/ [accessed 9 December 2021]
537 Written evidence from LSE, Department of Media & Communications—Freedom of Expression Roundtable (OSB0247)
539 The Law Commission, Modernising Communications Offences: a summary of the final report (July 2021) p 8: https://s3-eu-west-2.amazonaws.com/lawcom-prod-storage-11jsxou24uy7q/uploads/2021/07/Summary-of-Modernising-Communications-Offences-2021.pdf [accessed 1 November 2021]
540 European Court of Human Rights, Guide on Article 10 of the European Convention on Human Rights: Freedom of Expression (April 2021): https://www.echr.coe.int/documents/guide_art_10_eng.pdf [accessed 23 November 2021]; Information Commissioner’s Office, The Public Interest Test: Freedom of Information Act: https://ico.org.uk/media/for-organisations/documents/1183/the_public_interest_test.pdf [accessed 1 December 2021]
542 Information Commissioner’s Office, The Public Interest Test: Freedom of Information Act: https://ico.org.uk/media/for-organisations/documents/1183/the_public_interest_test.pdf [accessed 1 December 2021]