Misinformation in the COVID-19 Infodemic

3 Public sector response

Public service broadcasters

The turn to public service broadcasting

51. In contrast to the lack and loss of trust in social media as a source of news, evidence has shown that people have turned increasingly to public service broadcasters (PSBs) during the crisis. Weekly research commissioned by Ofcom has found that, by week 12 of the UK lockdown, 60% of people felt that broadcasters were their most important source of news. 84% had turned to broadcasters for news in the previous week.180 This is supported by PSB viewing figures. In the week of 23 March, for instance, BBC TV Network News reached 44 million people, the highest number since the 2003 Iraq War.181 Channel 4’s COVID-19 documentaries reached 9.9 million, including over 10% of 16–34s and over 15% of audiences described as ‘BAME’.182 Between 23 March and 16 April, BBC One special broadcasts reached almost two-thirds of the UK population;183 over the month of March, Channel 4 News was watched by almost one-quarter.184 Viewership increases have extended to regional news, radio and BBC News Online, the latter of which attracted 84 million unique views in the week commencing 16 March, far exceeding the previous record of 52 million set during the 2019 general election.185

52. Written and oral evidence argued that the role of regulation was significant in the turn to public service broadcasting. Stacie Hoffmann argued that social media regulation should be as robust as that applied to broadcasting, and should follow similar principles: “[i]t is a completely different ecosystem […], but there definitely should be the same expectations and the same kind of levels of restrictions or expectations on the actors involved as there is in current regulations for traditional media”.186 Campaign group Hacked Off, however, argued that regulation should be more robust than the self-regulatory regime overseen by IPSO.187 Dr. Megan Emma Smith posited that:

I would say that the print media and the television media are regulated and have obligations. If some of this misinformation appeared in their pages or on their screens, steps would be taken. Why should these other platforms be any different? I completely appreciate that they do not write the lies, they do not compose the lies, but they do facilitate the distribution of them, and that is what we have to get rid of.188

53. Channel 4’s submission posits three reasons for people’s choice of PSBs in particular over social media.189 First, the UK supports a diverse PSB ecosystem with different funding models, missions and purposes. Second, these organisations are held accountable by “an independent system of regulation” with real powers to sanction and “strict rules on accuracy and due impartiality and other detailed content standards”.190 Finally, these rules include “a clear set of quotas and requirements for the provision of high quality news and current affairs”.191 In oral evidence, Channel 4 chief executive Alex Mahon asserted explicitly that there has been “an increasing consumer awareness, and one might say backlash, against disinformation and misinformation”, particularly where “misinformation and disinformation are remaining on the tech platforms and, in some cases, being prioritised by them”, and called for “public service content [to have] a prioritised, prominent position across all these platforms”.192 YouTube maintained that it does act to support quality news, “to make sure that we are promoting their content in our top news shelf, the breaking news shelf, so we are exposing users to these outlets and helping drive traffic accordingly” (though it did not comment specifically on PSB prominence).193

54. Research has shown that the public has turned away from tech companies’ platforms as a source of trusted news and towards public sector broadcasting during the COVID-19 crisis, demonstrating a lack of trust in social media. The Government must take account of this as it develops online harms legislation over the coming months. It has already committed to naming an independent regulator; it should also look to the ‘clear set of requirements’ and ‘detailed content standards’ in broadcasting as a benchmark for quantifying and measuring the range of harms in scope of legislation.

Beyond broadcasting

55. We also found that PSBs have contributed to efforts to tackle misinformation and disinformation through other initiatives, both collaboratively and internally. Last year, the BBC set up the Trusted News Initiative (TNI) with the largest tech companies, global media organisations and independent researchers,194 with the specific aims of flagging disinformation during elections, sharing learning and promoting media education.195 Through the TNI, news organisations have put in place a shared alert system “so that content can be reviewed promptly by platforms”196 (though Facebook has stressed that it interprets this as an “information sharing exercise” rather than a “technical implementation by any party into each other’s systems”).197 We are interested in whether and how this will need to adapt in response to the emergence of new platforms, such as TikTok, and new online behaviours associated with its distinct functionality and user base. The BBC also emphasises the work of its in-house BBC Monitoring disinformation team, Beyond Fake News team, User Generated Content Hub and Young Reporter project, which have separately undertaken and published research into disinformation and sought to improve media literacy for audiences in concert with the objectives of the TNI.198 Finally, both the BBC and Channel 4 contribute work alongside the fact-checking community (such as Full Fact) via BBC Reality Check, BBC Trending199 and Channel 4 News’ Coronavirus FactCheck.200 The BBC shared with us several instances of misinformation that it has tackled, such as defective directions for homemade sanitiser and scam vaccine adverts originating in Italy, claims by the South China Morning Post that raw garlic prevents infection, and an investigation into the origins of social media posts.201

56. The potential role of the BBC and Channel 4 in fact-checking is great: the former thanks to its overall brand, the latter having been an early innovator with Channel 4 FactCheck. Whilst there are now a number of other fact-checking organisations, none can claim nearly the same brand equity in the UK.

57. We asked Facebook and Google how the Trusted News Initiative has fed into their efforts. Facebook offered a somewhat lukewarm response, noting that the TNI group has met once a week and describing the channel as “a valuable additional potential signal for misinformation on which we can, where appropriate, take action”.202 Google, meanwhile, stressed that “this partnership builds on our existing efforts to ensure authoritative information, including the work of fact checkers, is surfaced on our platforms” and committed to supporting First Draft, a TNI partner, as part of its $6.5 million investment in fact-checking through the Google News Initiative.203 However, it struck us that this engagement could go further: for example, there may be scope to give TNI partners access to WhatsApp accounts or automated features, such as the information bots that have been provided to the WHO, International Fact-Checking Network and Public Health England.

58. Resources developed by public service broadcasters such as the Trusted News Initiative show huge potential as a framework in which the public and private sectors can come together to ensure verified, quality news provision. However, we are concerned that tech companies’ engagement in the initiative is limited. Facebook, for example, has chosen not to provide TNI partners with accounts on WhatsApp, which could otherwise provide an independent but robust source of Government and public health advice. The Government should support the BBC to be more assertive in deepening private sector involvement, such as by adapting the Trusted News Initiative to changes in the social media ecosystem such as the emergence of TikTok and other new platforms. The Government and online harms regulator should use the TNI to ‘join up’ approaches to public media literacy and benefit from shared learning regarding misinformation and disinformation. It should do this in a way that respects the independence from Government and expertise of the group’s members, and not impose a top-down approach.

UK Government

Counter Disinformation Unit

59. When we asked the Secretary of State about the steps being taken against misinformation, he described the Department’s principal work as “both to understand the nature of what is going on and, in the process of that, to occasionally identify false narratives and things that the social media companies will take action to take down”.204 On 9 March, the Secretary of State announced his intention to re-establish the DCMS-led Counter Disinformation Unit, bringing together existing capability and capacity across government,205 to “help provide a comprehensive picture on the potential extent, scope and impact of disinformation”.206 Later that month, the Government announced that its Rapid Response Unit, which feeds into the DCMS Counter Disinformation Unit, would be tackling “[u]p to 70 incidents a week”.207 The announcement cited several responses where false narratives were identified, including “direct rebuttal on social media, working with platforms to remove harmful content and ensuring public health campaigns are promoted through reliable sources”.208

60. Throughout our inquiry, we raised concerns as to whether the Department has used its capability in the most effective way. On 11 March, we wrote to the Secretary of State to express support, but also to ensure that the Counter Disinformation Unit was being resourced effectively.209 In response, the Secretary of State wrote that “capability is resourced full time through existing cross-government teams and there are no additional costs associated with it” as “existing structures had been monitoring for disinformation related to the disease as part of their ongoing work prior to this”.210 The letter committed to channelling outputs from the Counter Disinformation Unit to COBR through the Secretary of State and to “looking at ways to actively engage harder to reach groups”.211

61. There are lots of independent factchecking organisations already up and running. Public service broadcasters have several dedicated factchecking teams. Facebook212 and Google213 have also provided funding to independent factcheckers. Full Fact has worked for several years as part of Facebook’s Third Party Fact Checking programme, and has monitored and rebutted misleading claims that have been circulated on WhatsApp, Twitter and in the mainstream media, submitted by the public directly through an online form, or made by public figures (including parliamentarians).214 Indeed, the Government’s own webpage for its ‘Don’t Feed The Beast’ campaign against disinformation directs users to the Full Fact website alongside links to the NHS and GOV.UK sites.215 It remains unclear, however, how the Government engages with factcheckers like Full Fact, or how it disseminates its own information to frontline health services such as NHS 111 (or whether these efforts are being duplicated by the health service as well). Dr. Megan Emma Smith recommended that factchecking be demonstrably independent and speciality-specific, noting the tension created by Government factchecking:

It puts politicians in an incredibly difficult position because it is an easy and, if I can say so, slightly low blow to come back at you and say, “You are politicians, you are the Government. Of course you have a vested interest in this.” It needs to be objective and it needs to be independent, and demonstrably so.216

62. Expert evidence we received recommended areas where the Government could add value, instead of duplicating existing efforts, particularly in the absence of online harms legislation. Professor Philip Howard, who described the Government response as “strong, and it needs to be stronger”, urged the Department to help provide independent researchers with more data, to help understand the scope and scale of the problem:

The best data we have is months old, it is not quite adequate and does not cover all the features that these social media platforms provide. The misinformation initiatives that the Government have are very important because you have the authority to collect and collate and analyse information in the public interest and the firms don’t act in the public interest. Independent researchers like myself, at [the Oxford Internet Institute], or investigative journalists, don’t have access to the same levels of information—the levels of information that we need to help fight this.217

Professor Howard specifically called for more representative samples of data on the comprehensive activity of suspicious accounts or those that have been removed, particularly where this might imply foreign interference.218

63. The Government should reconsider how the various teams submitting information to the Counter Disinformation Unit can best add value to tackling the infodemic. Factchecking 70 instances of misinformation a week duplicates the work of other organisations with professional expertise in the area. Instead, the Government should focus on opening up channels with organisations that verify information through a ‘Factchecking Forum’, convened by the Counter Disinformation Unit, and should share instances flagged by these organisations across its stakeholders, including and especially public health organisations and all NHS trusts, key and/or frontline workers and essential businesses, to prepare them for what they may face as a direct result of misinformation and allow them to take appropriate precautions.

64. We recommend that the Government also empower the new online harms regulator to commission research into platforms’ actions and to ensure that companies pass on the necessary data to independent researchers and academics with rights of access to social media platform data. It should also engage with the Information Commissioner’s Office to ensure this is done in accordance with data protection and privacy laws. In the long term, the regulator should require tech companies to maintain ‘takedown libraries’, provide information on content takedown requests, and work with researchers and regulators to ensure this information is comprehensive and accessible. Proposals for oversight of takedowns, including redress mechanisms, should be revisited to ensure freedom of expression is safeguarded.

Engagement with social media companies

65. Beyond leading the Counter Disinformation Unit, the DCMS has also led on engagement with the tech companies themselves. When pressed, Ministers have been bullish about the contribution of Big Tech in tackling misinformation; on 22 April, for example, the Secretary of State paid tribute to the number of different announcements from tech companies, saying “I have been impressed with how they have stepped up to the plate as part of a national and, indeed, international effort to address misinformation at this time of crisis”.219 The Minister for Digital, similarly, told the House of Lords Select Committee on Democracy and Digital Technologies that “the platforms that we are dealing with have been excellent at addressing concerns that we raise but have also come forward with ways of raising them themselves”.220

66. Tech companies have reciprocated. TikTok told us that it welcomed steps taken by the Government’s Rapid Response Unit and digital literacy campaign and “encourage Government to continue its engagement with industry as we work collectively to tackle the important area of [disinformation] and misinformation on COVID-19 and other issues that may arise in the future”.221 Google, in its second session with the Committee, noted that “we benefit from interactions like this and from cooperation with Government in continuing to improve”.222 All companies from whom we took evidence emphasised their support for the Government’s efforts. Facebook,223 Twitter224 and TikTok225 stated in evidence that they had provided the Government with pro bono advertising credit on their platforms (though ministers did not mention this to us in evidence, and we are not party to how these credits are being used). Facebook,226 Google,227 Twitter228 and TikTok229 all also asserted that they had amplified Government messaging on their platforms through various information hubs, adjusted search results and other platform-specific features.

67. In order to demonstrate best practice regarding tech companies’ advertising libraries, the Government should create its own ad archive, independent of the archives made available by tech companies, to provide transparency, oversight and scrutiny of how these ad credits are being used and what information is being disseminated to the public.

Offline solutions

68. Written evidence we received emphasised the need for a comprehensive digital literacy, community engagement and school education programme. Our predecessor Committee’s Interim Report into Disinformation and ‘fake news’ called for digital literacy to be the ‘fourth pillar’ of education alongside reading, writing and maths.230 Protection Approaches, for instance, recommended that online strategies to tackle misinformation be matched by investment in offline interventions, arguing that “offline solutions to online harms remain startlingly absent from policy and civil society efforts”.231 Their submission urges the Government to provide immediate resources for local community groups and schools and to upskill and build capacity amongst grassroots organisations.232 Glitch, similarly, called on the Government to provide education and resources on digital citizenship and online safety, consult with women’s organisations about risks to women, and provide guidance to employers about the risks of online harassment and abuse in the workplace.233 The Government, in its interim consultation response, claimed that it would produce a media literacy strategy this summer to “ensure a co-ordinated and strategic approach to online media literacy education and awareness for children, young people and adults”, though at the time of writing we are still awaiting its publication.234

69. The Government had committed to publishing a media literacy strategy this summer. We understand the pressures caused by the crisis, but believe such a strategy would be a key step in mitigating the impact of misinformation, including in the current pandemic. We urge the Government to publish its media literacy strategy at the latest by the time it responds to this Report in September. We welcome the non-statutory guidance from the Department for Education on ‘Teaching online safety in school’ (June 2019),235 bringing together computing, citizenship, health and relationships curricula, which among other things covers disinformation and misinformation. We ask that the Government report on adoption of this material before the end of the academic year 2020/21.

Implications for online harms

70. Despite the Secretary of State and Minister for Digital’s positive assessment of companies’ efforts in tackling misinformation, Ministers have elsewhere downplayed the possibilities offered by online harms legislation. In one instance, when asked if tech companies are doing enough to tackle false information, Lords Minister Baroness Williams appeared to justify the lack of action by tech companies: “The thing about the online world is that quite often it is reactive. Unless it is illegal, it is very difficult to make it proactive.”236 This statement was immediately contradicted by the Minister for Digital, who stated that she had “seen some really good proactive work” from Facebook, Twitter and Google to tackle misinformation that “now shows that it is possible for platforms to work at great speed and with great integrity to address some of these concerns”.237

71. Moreover, the Government has stated several times that online harms legislation will aim to hold companies “to what they have promised to do and to their own terms and conditions”.238 The Minister for Digital, however, has acknowledged the limits of this approach, particularly for misinformation and disinformation, stating that, “[i]n many cases, it does not actually contradict some of the platforms’ standards or regulations”.239 Statements elsewhere implied that, in several instances, existing terms and conditions were not fit for purpose: the Secretary of State himself stated that the Department had been working to improve the robustness of companies’ terms and conditions, claiming that “[w]e are working with them to understand and beef up their systems and how they as social media companies take action in respect of misinformation”.240 Dame Melanie Dawes, chief executive of Ofcom, set out the drawbacks of such an approach in oral evidence in June:

What I would say is that, although there are some sensible steps being taken, there is no transparency about it. There is no overall standard that has been set. It is very hard for parents to know what sort of risks their children are exposed to and how they are being managed by the platforms, because we cannot police what our children are doing all day.241

72. The Government should set out a comprehensive list of harms in scope for online harms legislation, rather than allowing companies to do so themselves or to set what they deem acceptable through their terms and conditions. The regulator should instead have the power to judge where these policies are inadequate against these harms and to make recommendations accordingly.


73. Throughout our inquiry, the Government emphasised that decisions about the scope of regulation for so-called ‘harmful but legal’ content should fall to the regulator. For example, in response to a question on the balance of illegal harms and ‘harmful but legal’ in legislation, the Minister for Digital said:

Within the legislation, the only things that we are setting out are things that are illegal, so child sexual exploitation and terrorism are the two things that are mentioned on the face of the Bill, as far as I understand it at the moment. On the things that are what you describe as legal but harmful, Ofcom is the regulator here and that will be something it will lay down. We are not going to specify what those harms are.242

74. However, the Government’s position was rebuffed by Ofcom in oral evidence several weeks later, when we asked Dame Melanie Dawes about the harms in scope. When asked about the appropriate balance in legislation between illegal content and ‘harmful but legal’ content, Dame Melanie said:

If we are appointed, we will work with whatever regime Parliament decides. These are quite important questions for Ministers and Parliament to determine.243

Regarding the outcome of this inquiry, Dame Melanie stated that the “online harms regime will need to answer the question as to whether or not disinformation, in particular, is covered”.244 However, Dame Melanie did note that, despite the overall scope of legislation being within the purview of Parliament, Ofcom would require flexibility and discretion as a matter of practicality when enforcing the regime.245 Ofcom was also reluctant to describe the powers it might need to enforce the regime beyond financial penalties, saying that “[i]t would be presumptuous of me to ask for detailed power for a regime that we have not yet been asked to operate”.246 Regarding criminal sanction, which our predecessor Committee called for as a last resort, Dame Melanie noted that “criminal sanction for criminal activities is incredibly important” but commented that it would be an unusual power compared with its other remits.247

75. Dame Melanie did argue that Ofcom needed to “deepen our understanding” of some specific harms,248 but emphasised Ofcom’s “good track record of using other people’s research, as well as commissioning our own”.249 Regarding the practicalities of identifying harms, Dame Melanie described the need to work with tech companies to identify issues:

The regulator will need access to data from the operators, and we would expect to be able to publish information about what is going on and what actions are being taken. With the scale of this, we are going to have to rely on the companies themselves to do a lot of the heavy lifting, but then the regulator’s job will be to shine a light, to hold them to account and to investigate if there are issues that suggest not all is as it should be.250

Oral and written evidence also emphasised the need to engage with tech companies to test new functions. Dr. Claire Wardle of First Draft told us that what she “would like to see is the platforms do more but then allow academics to test alongside them to see what the effects are”.251 Evidence from Dr. Nejra van Zalk of Imperial College London also described how ‘road testing’ code has helped understand the impact of digital technologies and innovations on children and young people before they are released to the public.252

76. We are pleased that the Government has taken up our predecessor Committee’s recommendation to appoint an independent regulator. The regulator must be named immediately to give it enough time to take on this critical remit. Any continued delay in naming an online harms regulator will bring into question how seriously the Government is taking this crucial policy area. We note Ofcom’s track record of research and expedited work on misinformation in other areas of its remit in this time of crisis as arguments in its favour. We urge the Government to finalise the regulator in the response to this Report. Alongside this decision, the Government should also make proposals regarding the powers Ofcom would need to deliver its remit and include the power to regulate disinformation. We reiterate our predecessor Committee’s calls for criminal sanctions where there has been criminal wrongdoing. We also believe that the regulator should facilitate independent researchers ‘road testing’ new features against harms in scope, to assure the regulator that companies have designed these features ethically before they are released to the public.

77. We have also raised concerns that social media may be allowing third parties to exploit gaps in regulation. In correspondence with Dame Melanie, we raised concerns that Press TV had been using social media platforms to circumvent the revocation of its broadcasting licence in 2012253 until its UK YouTube channel was deleted by YouTube unilaterally in January 2020 for going against its policies.254 Finally, in oral evidence we raised the issue that vendors of the harmful Miracle Mineral Solution and other hoax cures have exploited gaps in food standards and medicine regulations on social media, making it difficult to compel tech companies to take action at scale against them.255 We have noted that the Competition and Markets Authority, Information Commissioner’s Office and Ofcom have recently launched a ‘Digital Regulation Cooperation Forum’ to strengthen collaboration and co-ordination between them.

78. The Government should also consider how regulators can work together to address any gaps between existing regulation and online harms. It should do this in consultation with the Digital Regulation Cooperation Forum, the creation of which we note as a proactive step by the regulatory community in addressing this. We believe that other regulatory bodies should be able to bring super-complaints to the new online harms regulator.

181 BBC (DIS0012) para 13

182 Channel 4 (DIS0016) para 3.14

183 BBC (DIS0012) para 23

184 Channel 4 (DIS0016)

185 BBC (DIS0012) para 3.11

186 Q27

187 Hacked Off (DIS0014)

188 Q125

189 Channel 4 (DIS0016) paras 3.6–7

190 Ibid, para 3.7

191 Ibid

192 Oral evidence taken before the Digital, Culture, Media and Sport Committee on 16 June 2020, HC (2019–21) 156, Qq84, 89

193 Q151 [Leslie Miller]

194 The partners within the TNI are: (from traditional media) the BBC, Agence France-Presse, Reuters, European Broadcasting Union, Financial Times, Wall Street Journal, the Hindu and CBC/Radio-Canada; (from tech) Facebook, Google/YouTube, Twitter and Microsoft; and (from the research community) First Draft and the Reuters Institute for the Study of Journalism.

196 BBC (DIS0012) para 42

197 Letter from Richard Earley, Facebook, re evidence follow-up, 14 May 2020

198 BBC (DIS0012)

199 Ibid, para 46

200 Channel 4 (DIS0016) para 3.12

201 BBC (DIS0012)

202 Letter from Richard Earley, Facebook, re evidence follow-up, 14 May 2020

203 Letter from Alina Dimofte, Google, re evidence follow-up, 11 May 2020

204 Oral evidence taken before the Digital, Culture, Media and Sport Committee on 22 April 2020, HC (2019–21) 157, Q18

205 Oral evidence taken before the Digital, Culture, Media and Sport Committee on 9 June 2020, HC (2019–21) 291, Q381

206 “Coronavirus: Unit set up to counter false claims”, BBC News, 9 March 2020

207 “Government cracks down on spread of false coronavirus information online”, Cabinet Office and Department for Digital, Culture, Media and Sport press release, 30 March 2020

208 Ibid

209 Letter from Chair to Rt. Hon. Oliver Dowden MP, Secretary of State for DCMS, re. Coronavirus disinformation, 11 March 2020

210 Letter from Rt Hon Oliver Dowden MP, Secretary of State for DCMS, re Coronavirus disinformation, 27 March 2020

211 Ibid

212 Letter from Richard Earley, Facebook, re evidence follow-up, 14 May 2020

213 Letter from Alina Dimofte, Google, re evidence follow-up, 11 May 2020

214 Full Fact (DIS0006) p 1

216 Q131

217 Q5

218 Q6

219 Oral evidence taken before the Digital, Culture, Media and Sport Committee on 22 April 2020, HC (2019–21) 157, Qq18, 27

220 Oral evidence taken before the Select Committee on Democracy and Digital Technologies on 12 May 2020, HL (2019–21) 77, Q331

221 TikTok (DIS0018) p 5

222 Q135

223 Qq94, 177

224 Letter from Katy Minshall, Twitter, re evidence follow-up, 11 May 2020

225 TikTok (DIS0018) p 5

226 Qq92, 94

227 Q135

228 Letter from Katy Minshall, Twitter, re evidence follow-up, 11 May 2020

229 TikTok (DIS0018) p 5

230 Digital, Culture, Media and Sport Committee, Disinformation and ‘fake news’: Interim Report, fifth report of the session 2017–19, 29 July 2018, HC 363, para 246

231 Protection Approaches (DIS0011)

232 Ibid

233 Glitch (CVD0296) pp.8, 11, 22, 34

234 Department for Digital, Culture, Media and Sport and Home Office, Online Harms White Paper: Initial consultation response, 12 February 2020

235 Department for Education, Teaching online safety in school, (June 2019)

236 Oral evidence taken before the Home Affairs Committee on 13 May 2020, HC (2019–21) 232, Q 515

237 Ibid, Q516

238 Oral evidence taken before the Digital, Culture, Media and Sport Committee on 22 April 2020, HC (2019–21) 157, Q20

239 Oral evidence taken before the Select Committee on Democracy and Digital Technologies on 12 May 2020, HL (2019–21) 77, Q339

240 Oral evidence taken before the Digital, Culture, Media and Sport Committee on 22 April 2020, HC (2019–21) 157, Q18

241 Oral evidence taken before the Digital, Culture, Media and Sport Committee on 23 June 2020, HC (2019–21) 439, Q8

242 Oral evidence taken before the Digital, Culture, Media and Sport Committee on 9 June 2020, HC (2019–21) 291, Q380

243 Oral evidence taken before the Digital, Culture, Media and Sport Committee on 23 June 2020, HC (2019–21) 439, Q6

244 Ibid, Q24

245 Ibid, Q10; also Qq6–7, 14–5, 18, 21, 36, 39

246 Ibid, Qq4, 25–8, 35

247 Ibid, Q37

248 Ibid, Q15

249 Ibid, Q22

250 Ibid, Q9

251 Q33

252 Dr Nejra van Zalk (DIS0020)

253 Letter from the Chair to Dame Melanie Dawes, Chief Executive, Ofcom, re Misinformation about the COVID-19 crisis, 6 April 2020

254 “Google deletes Press TV UK’s YouTube account”, Middle East Eye, 14 January 2020

255 Qq133–4

Published: 21 July 2020