Digital Technology and the Resurrection of Trust Contents

Chapter 2: Informed Citizens

20.One of our key concerns in this inquiry is the effect of inaccurate or malicious information on our democracy. Will Moy, Chief Executive of Full Fact, explained the effects of misinformation:

“Bad information can ruin lives. It damages people’s health. It promotes hate and it hurts democracy. We now see people suffering from curable diseases because they have been misled by false information about vaccines. There is false information about public health issues related, for example, to the rollout of 5G mobile communications technology. We see terrorist attacks sometimes promoted by people who have been radicalised by false information online.”13

Box 2: Definition of misinformation and disinformation

Baroness O’Neill of Bengarve neatly explained the difference between misinformation and disinformation:

“… if I make a mistake and tell you that the moon is made of blue cheese, but I honestly believe it, that is misinformation. If I know perfectly well that it is not made of blue cheese but tell you so, that is disinformation.”14

Whether or not information is purposefully false does not change whether it is harmful. In our Report we use ‘misinformation’ where it is unclear if there was purposeful intent to misinform and only label something ‘disinformation’ if that intent is clear.

21.Research from Full Fact during the 2019 General Election showed that misinformation reduces faith in democracy, trust in politicians and people’s drive to participate.15 It found that 17 per cent of the public said that they were less likely to vote because of the level of false and misleading claims in that election campaign. Focus group participants reported that political misinformation made them lose trust in the political process and even made them angry.

22.We have heard suggestions that the rise in misinformation and disinformation is a consequence of digital change. Alex Krasodomski-Jones, Director of the Centre for the Analysis of Social Media at Demos, told us that a breakdown in a common reality and in the sense of an agreed set of facts on which democracy rests is a key symptom of digital change.16 However, this may be taking too rosy a view of the pre-digital world. The Kofi Annan Commission on Elections and Democracy in the Digital Age explained:

“… just as fake news and hate speech have been around for centuries, there has never been a time when citizens in democracies all shared the same facts or agreed on what constitutes a fact. Democratic citizens often disagree on fundamental facts and certainly do not vote on the basis of shared truths. Democracy is needed precisely because citizens do not agree on fundamental facts.”17

Misinformation and the media

23.Offline media has also had an active role in circulating misinformation. In recent months there have been shocking examples of broadcast media actively promoting dangerous conspiracy theories in the context of the COVID-19 pandemic.18 London Live, a local TV station, hosted known conspiracy theorist David Icke for an 80-minute interview in which he went largely unchallenged. During this interview Mr Icke spread false information about the pandemic. In its defence to Ofcom, the station argued that, whilst Mr Icke’s views were absurd, it was playing an important role in holding power to account and acting in a responsible manner. The station’s acknowledgment of the absurd nature of Mr Icke’s views shows that it was well aware that it was not acting in a responsible manner. Whilst Ofcom has required the station to broadcast a correction, at the time of writing it has kept its broadcasting licence.

24.On ITV’s This Morning show, presenter Eamonn Holmes suggested that people were right to be concerned about links between 5G and COVID-19. He criticised individuals who correctly identified this as dangerous misinformation, suggesting that they were simply fitting in with ‘the state narrative’. These irresponsible and dangerous comments were made at a time when Ofcom identified that the most common piece of misinformation individuals encountered online about COVID-19 was content linking its origin or causes to 5G technology. There have also been a number of attacks on communications workers and infrastructure.19 Misinformation has also appeared in many newspapers. The Daily Mail, The Daily Express, The Sun and The Metro all published misinformation about mass cremations occurring in China during the early days of COVID-19.20

25.It is not clear that online misinformation is worse than offline misinformation, or that it should be viewed as separate from it. A study from the Reuters Institute found that, whilst 38 per cent of social media users reported seeing large amounts of false or misleading information about COVID-19, there was no link between using social media as a source of information and having a lack of knowledge about the virus.21 A further study looked at online misinformation about COVID-19 which had been fact checked. It found that, although claims made by prominent public figures like celebrities and politicians made up just 20 per cent of the claims in its sample, these claims accounted for 69 per cent of the total social media engagement with misinformation.22 This suggests that online misinformation often comes through platforms broadcasting powerful individuals as well as through private individuals sharing within their own networks.

26.A particular example of online disinformation coming from powerful individuals and organisations is disinformation originating from foreign governments. Investigations from the US Senate have found that foreign states such as Russia use information warfare to sow societal chaos and discord.23 In the context of COVID-19, Russian state-funded sources have promoted disinformation that washing hands does not help fight the spread of the virus and that 5G can kill you.24 Meanwhile, Chinese state representatives have tried to spread the false suggestion that the virus did not originate in China but was spread to China by the US army.25

27.The 2019 General Election provides a good case study of how traditional media can play a key role in promoting online misinformation, and of its effects on democracy. During the election campaign period the Conservative Party and the Labour Party both published analysis of the effects of their own and each other’s policies. These consisted of largely meaningless numbers that were not based upon credible assumptions.26 Despite this, both received largely uncritical coverage from sympathetic newspapers.27 The same analysis was publicised through both parties’ social media accounts.28 Similarly, online advertising by both parties repeated claims made in news coverage.29 There has also been criticism of the Liberal Democrats for circulating bar charts that overstated their chances of winning an election, both in paper leaflets and over social media.30 In the context of this misinformation being spread across platforms it is not surprising that the public became deeply cynical about politicians: 76 per cent of the public agreed that voters were being misled by false and dishonest claims, and 56 per cent stated that they tend to ignore what parties and politicians say because they know they cannot trust them.31

Political advertising

28.One of the important tools that political parties use to reach out to voters online is advertising. Political advertising is regulated differently to other advertising. Commercial advertising online is the responsibility of the Advertising Standards Authority (ASA) under a self-regulatory system. The advertising industry has a Committee of Advertising Practice (CAP) which writes the Advertising Codes that advertisers are expected to adhere to.32 The non-broadcast code explicitly excludes political advertisements from its coverage.33 The ASA acts as the regulator for the code, ensuring that advertising adheres to its standards. If advertisers persistently break the code and do not work with the ASA, they can be referred to Trading Standards, which can take enforcement action using its legal powers.34 Until 1999 political advertising was subject to some clauses of the code, such as the rules on offensiveness.35 However, the decision was made to exclude political advertising due to concerns over the ability of the ASA to act quickly enough, concerns around the Human Rights Act 1998 and a lack of consensus among the main political parties.36 These decisions were made in an era before online political advertising was a pressing concern, but they apply to both online and offline adverts.

29.We heard concerns about the regulation of political advertising. Matthew d’Ancona of Tortoise Media warned against creating a Ministry of Truth that attempted to regulate falsity.37 The Conservative Party told us that rather than regulate advertising it should be the role of the Government to ensure that there is an independent free press to facilitate robust political debate and scrutinise claims.38 The Labour Party argued that political adverts were more restricted by the disclosure requirements of platforms and the Electoral Commission than any commercial advertising is.39

30.Whilst these are legitimate concerns, the exemption of political advertising undermines confidence in the quality of public debate. Keith Weed, President of the Advertising Association, told us that in order for the public to have confidence in the quality of debate, political advertising needs to be held to the same standards of being legal, decent, honest and truthful as commercial advertising.40 The LSE Truth, Trust and Technology (T3) Commission told us that the current arrangement was not sustainable as paid advertising on social media becomes a more important element of political communication in the UK.41 It called for a mandatory code for political advertising. The Coalition for Reform of Political Advertising stated that 84 per cent of the public support a legal requirement that factual claims in political adverts must be accurate.42 It recommends regulation covering political advertising in all formats, including all types of media directly under the advertiser’s control, such as social media accounts and websites, and all types of publicity earned by the campaign, for example coverage gained through editorial influence. This is the same scope that the ASA has for commercial advertising.

31.The Coalition for Reform of Political Advertising argued that if campaigns made what seem to be objective and quantifiable claims then those claims should be accurate and stand up to independent scrutiny.43 However, Guy Parker, CEO of the ASA, cautioned that this approach would be difficult.44 He told us that the vast majority of claims that would be contested in political advertising would not fall into the category of outright lies but would instead be statements where there could be arguments for and arguments against. Mr Parker highlighted that this was of particular concern because any regulatory judgement on this area is highly likely to be challenged by way of judicial review. If that were to happen, he suggested that courts would give a large amount of discretion to those speaking.

32.Given the lack of support from the two largest political parties, it is clear that a self-regulatory model would struggle. Lord Currie, Chair of the ASA, told us that being a collective self-regulatory system was crucial to the ASA’s success.45 He told us that, as there was no buy-in from political parties, it was not clear that their system could work in this area. However, Guy Parker pointed out that if regulation were introduced to create such a system it would de facto have received political support in order to pass through Parliament, as Parliament is made up of political parties.46 Lord Currie expressed a concern that the contentious nature of regulating this area could undermine trust in the ASA more broadly.47 He also stressed that a regulator would need substantial resources in order to adjudicate on claims at the speed required by political campaigns. Guy Parker suggested that regulation could work if it focused on outright lies rather than misleading statements. Both Guy Parker and Lord Currie told us that their personal opinion was that political advertising should be regulated. Since giving evidence to us the ASA has publicly stated that political advertising should be regulated.48

33.The ASA’s reluctance to be the sole regulator of this area is part of a general reluctance from regulators to be given this role. Professor Helen Margetts, Professor of the Internet and Society at the University of Oxford and Director of the Public Policy Programme at The Alan Turing Institute, told us that the ASA, the Electoral Commission, Ofcom, and the Information Commissioner’s Office (ICO) all expressed a preference that a different regulator be responsible.49 Keith Weed, the LSE T3 Commission and the Coalition for the Reform of Political Advertising all suggested that the ASA and the Electoral Commission should work together on this area.50 Guy Parker suggested that one way to mitigate the reputational risk to any one regulator would be to create a temporary body during a campaign period which drew on expertise from a range of different regulators.51

34.The suggestion put forward by Guy Parker of a joint approach, drawing on expertise from different regulators and focused on removing outright lies from the political conversation, is a sensible way forward. However, his suggestion that such a body be temporary is untenable, as important political activity takes place outside of campaign periods and time would be needed for the relevant bodies to develop productive ways of working together. The ASA’s experience of regulating advertising could be useful in developing this scheme, as could the experience of the Office for Statistics Regulation in correcting public figures who misuse national statistics, and the Electoral Commission’s knowledge of the processes of political parties. Investigations should be expedited to ensure they reach a conclusion in time to have an effect before the relevant election. This may involve creating fast-track procedures that differ from existing ASA practice.

35.This process would unquestionably be more effective if it had the support of political parties from across the political spectrum and across the UK. It is incumbent on them to engage with this process to help restore public trust in political debate. As individuals in public life, politicians from all parties should abide by the Nolan Principles, which include being accountable, open and honest. In order to achieve this there must be regulation of political advertising.

36.The relevant experts in the ASA, the Electoral Commission, Ofcom and the UK Statistics Authority should co-operate through a regulatory committee on political advertising. Political parties should work with these regulators to develop a code of practice for political advertising, along with appropriate sanctions, that restricts fundamentally inaccurate advertising during a parliamentary or mayoral election, or referendum. This regulatory committee should adjudicate breaches of this code.

Tackling misinformation and disinformation online

37.The regulatory approach suggested above would help tackle inaccurate formal activity during political campaigns. However, it will not solve the broader problem of the prevalence of misinformation and disinformation online. The Online Harms White Paper recognises that hostile actors could use disinformation to undermine our democratic values and includes disinformation as a harm within its scope. However, there has been concern that it may not be included in the final Bill. The Government’s response to its consultation on the White Paper only mentioned disinformation in noting that some civil society groups had concerns about its inclusion and the effect this would have on freedom of expression.52 Kevin Bakhurst, Group Director of Content and Media Policy at Ofcom, suggested that it was not yet decided whether disinformation should be in Ofcom’s remit.53

38.As outlined at the beginning of this Chapter, misinformation is harmful, and the Government knows this. Caroline Dinenage MP, the Minister for Digital and Culture, told us that misinformation online about coronavirus is a mixture of already illegal content, such as calls to attack 5G infrastructure, and harmful content, such as fake nurses suggesting false cures.54 We know that misinformation online is reducing trust in vaccines, which may play a crucial role in combating the novel coronavirus.55 Government must act to reduce its spread. We believe that, on balance, this would not be too great a restriction on free expression if it focuses on preventing misinformation from being recommended rather than on whether it is hosted on the platform. We discuss this in greater detail and present our recommendations for the Online Harms White Paper in Chapter 3 on accountability. Intent is irrelevant to whether bad information is harmful, and therefore both misinformation and disinformation should be covered.

39.Chloe Smith MP, Minister for the Constitution and Devolution, told us that she did not believe that our elections were any less secure and robust because the Government had yet to implement its Online Harms work.56 She added that “there should be no need to think that we are in some way vulnerable to interference right now.” That is a worryingly complacent attitude given the evidence of our systemic vulnerability to misinformation. As the evidence stated above shows, individuals are being put off taking part in democracy due to the prevalence of misinformation. There is also reason to believe that foreign states are actively pushing disinformation to undermine faith in the democratic process. The next generation of voters spend ever greater amounts of their time online and have their political views shaped by this. In the context of the COVID-19 crisis it is not difficult to imagine future misinformation campaigns either deterring people from voting in an election or undermining the legitimacy of an election outcome. This makes misinformation and disinformation existential issues for the future of our democracy. The Government should wake up and understand this.

40.The Online Harms Bill should make clear that misinformation and disinformation are within its scope.

The role of fact checkers

41.One of the more recent tools in the fight against misinformation is the dedicated fact checking team. These take the form of teams within news organisations, such as the BBC’s Reality Check Team57 and Channel 4’s FactCheck team,58 or dedicated organisations like Full Fact and FactCheckNI. Matthew d’Ancona of Tortoise Media told us that the emergence of fact checkers was an exciting development but also an indictment of modern journalism, as fact checking used to be part of journalists’ role.59

42.Avaaz has suggested that the role of fact checkers should be built into the processes of technology platforms.60 It argues that platforms should feed potential misinformation to fact checkers, reduce the spread of misinformation once it has been identified as such by fact checkers, and show those who have seen the misinformation the fact checkers’ verdict. Some elements of this exist in Facebook’s third-party fact checking initiative. Under this initiative, Facebook presents its fact checking partners with a queue of content that could potentially be misinformation.61 Fact checkers choose what content to check (including content outside of this queue) and are paid by Facebook for each article they write. Content that has been marked as misleading or false is then down-weighted in Facebook’s recommendation algorithm. Facebook told the House of Commons Digital, Culture, Media and Sport (DCMS) Select Committee that during April 2020 it displayed warning labels on approximately 50 million pieces of content related to COVID-19, based on around 7,500 articles, although it is unclear what proportion of the total number of misleading articles this represents.62 Karim Palant, UK Public Policy Manager at Facebook, told us that this reduces the content’s spread by 80 per cent compared with what it otherwise would have been.63 This shows the power of Facebook’s algorithms in recommending content to users and the impact of choosing to no longer recommend harmful content. Full Fact have described this as the high watermark of internet companies supporting fact checkers.64

43.Other platforms have partnerships with fact checkers, but these are much less transparent, and it is difficult to determine their efficacy. Google’s programme inserts articles written by fact checkers whom Google’s algorithm determines to be reliable sources into a privileged position in search results.65 This relies on fact checking organisations flagging individual pages to Google as containing fact checks, using the page’s meta-data (parts of a webpage read by machines but not shown to visitors).66 Google’s algorithm then decides which search results to attach these fact checks to and where to place them.
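To illustrate the mechanism, Google’s fact check feature reads structured data in the schema.org ClaimReview format embedded in a fact checker’s page. The following is a minimal, hypothetical sketch of such meta-data (the organisation name, URL, claim and verdict shown are placeholders, not taken from any real fact check):

```html
<!-- Embedded in the fact checker's article page; invisible to human
     visitors but read by search engine crawlers. All values below are
     hypothetical placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ClaimReview",
  "url": "https://example-factchecker.org/moon-blue-cheese",
  "claimReviewed": "The moon is made of blue cheese",
  "author": {
    "@type": "Organization",
    "name": "Example Fact Checker"
  },
  "reviewRating": {
    "@type": "Rating",
    "ratingValue": 1,
    "bestRating": 5,
    "worstRating": 1,
    "alternateName": "False"
  }
}
</script>
```

Note that the mark-up contains both the claim being checked (`claimReviewed`) and the verdict (`reviewRating`) as separate fields; a system that surfaces the former without the latter reproduces the misinformation under the fact checker’s name.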

44.As with many of the topics identified in Chapter 4 on transparency, Google does not explain these processes to the outside world. However, we note that on occasion this approach has gone badly wrong. In late 2019, if an individual asked Google the question “Do Muslims pay Council Tax?”, Google would return an answer drawn from Full Fact’s fact check on this question.67 This answer incorrectly repeated the claim that Full Fact was checking, and attributed the claim to Full Fact rather than giving Full Fact’s verdict. As a result, Google’s top search result stated that Full Fact said that Muslims did not pay council tax. The same happened if someone asked the question of Google’s voice assistant.68 In turn, video clips of Google Assistant stating that Full Fact said that Muslims do not pay council tax spread online. This is despite the fact that the meta-data of the relevant page of Full Fact’s website was correctly formatted to be registered as a fact check by Google.69 It is not clear how often Google has been spreading dangerous misinformation in this manner. When we raised this specific issue with Google, Vint Cerf told us that machine learning systems can be brittle and break easily for unknown reasons.70 This type of intellectually remote response is deeply concerning given the real effects this type of misinformation can have. Misinformation about Muslims contributes to other false narratives that are used against groups who suffer discrimination and hate-based violence. It is unclear that platforms accept the gravity of the consequences of their decisions.

45.During the COVID-19 crisis, YouTube began adding fact checks to certain search results in the United States following a previous trial in Brazil and India.71 YouTube’s description suggests that this works in a similar way to Google’s fact checking initiative.72 As this work is in its early stages and YouTube is not transparent about its processes it is unclear to what extent the fact checks are being correctly matched with relevant search results and are not being presented in a misleading manner.

46.A significant difference between Avaaz’s ‘correct the record’ proposal and Facebook’s third-party fact checking initiative is that Facebook does not automatically show individuals who have seen misinformation a fact checker’s correction. One concern is that doing this may cause a ‘backfire effect’, whereby the correction further embeds the misinformation rather than reducing the individual’s belief in it. Avaaz studied this effect by creating a fake version of Facebook (called Fakebook) that mimicked the way that Facebook functions. It found in this test that corrections reduced belief in misinformation by 50 per cent.73 Avaaz’s review of the literature supports its suggestion that the primary effect of corrections is to reduce rather than increase belief in misinformation. To date, Facebook have not tested this at scale on their own platform. Karim Palant told us that Facebook had tested it on certain misinformation during the 2019 election but were unsure of its effect.74 In response to the COVID-19 crisis, Facebook began testing showing correct information to people who had previously posted misinformation.75 This included testing different wording, including in some cases an explicit correction of the misinformation.76 Facebook have not committed to publishing the results of these tests.

47.Will Moy of Full Fact told us that, whilst he viewed Facebook’s scheme as positive and recommended that it be extended to other platforms, he believed there was a need for greater transparency.77 Mr Moy called for an independent assessment of the programme. Karim Palant of Facebook argued that this independent assessment was already possible on the basis of the data that Facebook had recently released for academic research.78 However, this is not the case. Independent researchers have not been granted access to the queue of potential disinformation that Facebook sends to fact checkers and so cannot audit this mechanism. Facebook have stated that this queue is populated with content flagged by both human reviewers and algorithmic selection.79 However, not even the fact checkers know whether a specific piece of content was referred by a human reviewer or an algorithm.80 Without Facebook sharing this information with an external reviewer it is impossible to determine independently whether this process works properly or how it could be improved. This would be a good area for public interest research commissioned through Ofcom, in line with the recommendations in Chapter 4 on transparency.

48.Facebook have purposefully hobbled their third-party fact checking initiative by exempting all elected politicians and candidates for office from being fact checked. Allan Leonard of FactCheckNI told us that this harmed his organisation’s reputation, as it made the public believe that fact checkers did not check politicians when in reality this is the majority of their work.81 Karim Palant of Facebook indicated to us that Facebook did not want to be seen as the referee of all political disputes, marking which contributions are true or false.82 He told us that attaching a fact check to politicians’ statements on the platform would mark a sharp departure for political debate. Mr Palant argued that it would be quite extreme for an elected politician’s communication to the public to be overlaid with a judgement of whether it was true or not. He stated that this policy only applied to posts originating from politicians, and that if a politician shared content from elsewhere that contained misinformation it would be fact checked. Yet Facebook’s definition of a politician is essentially arbitrary: candidates for elected office are not fact checked, while proxies who play a similar role in public debate are.

49.Facebook has been inconsistent in its preference for staying out of political debate or allowing fact checkers to operate on politically sensitive content not posted by politicians. For example, Facebook removed a fact check from a video containing misinformation about abortion after US lawmakers complained about the existence of the fact check.83 There were no material concerns with the accuracy of the fact check and it was supported by an independent secondary review.84 This suggests that Facebook’s position is more about avoiding political pressure than any particular concern about preserving democratic debate.

50.There are legitimate questions about the way in which platforms have responded to misinformation about the COVID-19 crisis. Platforms have removed content posted by the heads of state of Brazil and Venezuela advocating hydroxychloroquine as a cure for COVID-19.85 On the other hand, they have allowed the US President to advocate for the same drug.86 Facebook removed groups that were created to organise protests against social distancing measures.87 Mark Zuckerberg stated that this was due to these groups posting misinformation suggesting that social distancing was not effective.88 Another Facebook spokesperson clarified that this was consistent with Facebook’s policy of not allowing groups to organise events that break the law, as these protests did.89 However, civil liberties groups and the UN Special Rapporteur on freedom of expression criticised this decision. It is not clear how much misinformation an item of content can contain before it is treated as misinformation and removed from the platform. Platforms have not clearly explained their thinking, nor released fact checks explaining why each individual piece of content removed was inaccurate.

51.Tackling misinformation online requires transparency in order to be effective as well as to ensure a thriving democratic debate. The International Fact Checking Network (IFCN) is a global alliance of fact checkers which seeks to promote best practice in the field.90 The IFCN has a code of principles requiring that all fact checkers who sign up publish the sources for all of the information in their fact checks and have an open and honest corrections policy.91 This ensures that fact checks can be disputed and corrected as part of a democratic debate within civil society based on facts. Paddy McGuinness, former Deputy National Security Adviser, told us that to ensure a healthy democratic debate we must rely on civil society to establish whether something is true or not rather than relying on the state or private companies.92

52.It is not clear that the fact checking eco-system that currently exists in the UK can handle the suggested role in improving online debate. Facebook’s third-party fact checking initiative only partners with signatories to the IFCN’s code of principles.93 During our evidence taking, Full Fact and FactCheckNI were the only UK-based signatories.94 Reuters were added as a third UK partner in late March 2020.95 Will Moy of Full Fact told us he was concerned about how Facebook’s initiative could be scaled up to the size required by the internet.96 FactCheckNI focuses solely on Northern Ireland, leaving only Full Fact and Reuters to monitor misleading information for the rest of the UK. The programme would cease to function in most of the UK if anything happened to Full Fact and Reuters or to their relationship with Facebook. In the Netherlands, Facebook’s only fact checking partner dropped out of the programme because it disagreed with Facebook’s policy of not fact checking politicians, effectively discontinuing the programme there.97 Jenni Sargent, Managing Director at First Draft, warned that the financial incentives in the programme could draw new entrants into the fact checking market simply to receive this compensation, and that this could cause an erosion of trust.98 Sir Julian King, former EU Security Commissioner, told us that support for independent fact checkers is one thing that the UK could do to increase societal resilience against external disinformation threats.99 Given the public concern in this area, there is a case for regulatory involvement to promote a diverse range of fact checking organisations whose incentives are clearly aligned with the public interest.

53.There are also legitimate concerns about the scale of the fact checking response compared to the vast amount of misinformation that is propagated on online platforms. Whilst fact checking has a role to play in reducing the spread of misinformation it is not currently sufficient to tackle the scale of the problem as it exists on these platforms.

54.Ofcom should produce a code of practice on misinformation. This code should include a requirement that if a piece or pattern of content is identified as misinformation by an accredited fact checker then it should be flagged as misinformation on all platforms. The content should then no longer be recommended to new audiences. Ofcom should work with platforms to experiment and determine how this should be presented to users and whether audiences that had previously engaged with the content should be shown the fact check.

55.Ofcom should work with online platforms to agree a common means of accreditation (initially based on the International Fact Checking Network), a system of funding that keeps fact checkers independent both from Government and from platforms, and the development of an open database of what content has been fact checked across platforms and providers.

Promoting good information

Communicating statistics

56.Alongside tackling bad information, it is important to promote good information in order to encourage and support citizens in an informed democratic debate. Jenni Sargent of First Draft told us that there was a risk of damaging public debate if too much focus is put on fact checkers publishing what is not true, rather than also publishing and promoting accurate reporting.100 Caroline Dinenage MP, Minister for Digital and Culture, told us that one of the things she would like to see continue after the COVID-19 crisis is platforms’ efforts to promote good information.101 Ed Humpherson, Director General for Regulation at the Office for Statistics Regulation (OSR), told us that he worries more about the failure to communicate good information in a way that is useful to the public than he does about misinformation.102 He argued that we should spend as much time thinking about what it means to inform as to misinform.103 A United Nations Educational, Scientific and Cultural Organisation (UNESCO) report on misinformation and COVID-19 stated that it was vital for governments to release public data about the spread of the disease, and that data on the Government’s response to COVID-19 was crucial to informing public debate.104 Oliver Dowden MP, the Secretary of State for Digital, Culture, Media and Sport, told the DCMS Select Committee that he viewed his primary focus as promoting good news sources in response to COVID-19, with dealing with misinformation a secondary focus.105

57.Ed Humpherson told us that too many statistics were presented and communicated in a dry and mechanical way, which meant that good information was not reaching the public where it was needed.106 Allan Leonard of FactCheckNI gave an example from Northern Ireland where data was misused because it had not been communicated in the right context; in response, the Northern Ireland Statistics and Research Agency (NISRA) created a tool called NI: IN PROFILE, which combines different data products in a way that is easy to understand and can be better communicated to the public.107 Ed Humpherson agreed that NI: IN PROFILE was a positive development. He told us that answering important policy questions requires many different statistics from different data sets, and that better integrating and communicating existing data would be a useful service.108

58.Will Moy of Full Fact praised the Office for National Statistics (ONS) as an important institution that can help democracy, but highlighted the difference between it and an organisation like the BBC.109 He noted that the BBC’s mission is to educate, entertain and inform, and that entertainment is part of its role in getting people’s attention. Mr Moy argued that good journalism gets people’s attention and does something useful with it. Statisticians are not journalists. They do not necessarily possess the skills needed to get the public’s attention, nor should we expect them to.

59.In response to the COVID-19 crisis, great efforts have been put into informing public debate. In a remarkably short time, government statisticians have ensured that high quality statistics are available to help others scrutinise government policy.110 However, there are still concerns about how to communicate these statistics to the public.111

60.The issue of how best to communicate statistics has been explored many times before, from a report by the Statistics Commission in 2007 to an inquiry by the House of Commons Public Administration Select Committee in 2013.112 These inquiries noted some of the same issues we encountered including a difficulty in communicating statistics to the broader public. The lack of understanding of exactly who the users of statistics are was also noted in a 2019 Public Administration and Constitutional Affairs Committee inquiry into the UK Statistics Authority as a whole.113 That report recommended a user engagement strategy. However, these inquiries did not focus on the ways that official statistics can be used in an online world dominated by large social media platforms. We suggest a more specific approach than previous reviews, looking at the role of official statistics in informing online debate.

61.The House of Lords Communications and Digital Select Committee should consider conducting an inquiry to examine the communication of official statistics in an online world.

Making use of parliamentary expertise

62.Alongside the Office for National Statistics (ONS), the UK has other institutions that can help inform public debate. Several witnesses praised the Libraries of both Houses of Parliament. Will Moy of Full Fact described them as building blocks that can help tackle the harms that come from bad information.114 However, he noted that they are focused on a specific audience and, like the ONS, are not journalists who are skilled at garnering public attention.115 Dr Alan Renwick from the UCL Constitution Unit told us that the Libraries produce great, impartial information but that more could be done for them to feed into wider democratic processes.116 Liz Moorse, Chief Executive of the Association for Citizenship Teaching, expressed her frustration that while the House of Commons Library normally produces high-quality information to inform the public debate, during an election it has to stop its work. She told us: “That seems crazy. We desperately need good-quality information for citizens during election periods.”117

63.Parliament is dissolved 25 days before a General Election.118 This means that Members of both Houses are not able to make requests of either Library. There are also no Select Committees. The substantial number of researchers and policy specialists working across committees and Libraries cease to inform public debate at a critical time. For scale, over 100 policy professionals are employed across both Houses, whilst there are just seven fact checkers at Full Fact and three at Channel 4.119 Currently the election period is used for ad hoc secondments for professional development, but it could be an opportunity to use Parliament’s resources to improve public debate at what, for many people, is the most critical time.120

64.Parliament cannot speak for itself whilst it is dissolved, so parliamentary resources should be used in partnership with another organisation that could communicate with the public. These partnerships would need to be with organisations that are impartial in order to maintain the neutrality of parliamentary staff. Broadcasters play a key role in informing the public during an election period and are regulated to ensure their impartiality. This could make them a key partner and a channel through which parliamentary expertise can better inform the public.

65.Parliament should establish formal partnerships with broadcasters during election periods to make optimal use of its research expertise to help better inform election coverage.

Public interest journalism

66.A further means of better informing the public is to provide greater support to public interest journalism. James Mitchinson, Editor of the Yorkshire Post, told us that the best way to inform the public was to use legislative and technological solutions to draw people’s attention to, and increase the prominence of, trusted sources of news from trained journalists.121 He argued that, to date, technology platforms have had a dramatically negative effect on local journalism. One analysis suggests that between 2009 and 2019 advertising spending by small and medium-sized businesses rose from £2.1 billion to £5.1 billion for Google and from £25 million to £1.3 billion for Facebook, whilst falling from £1.5 billion to £5.9 million for the local press.122 Mr Mitchinson stated that, due to changes in the business model necessitated by platforms over the last decade, the output generated by local newspapers and websites has fallen by half.123 He suggested that there was an urgent need for technology platforms to share their revenue more fairly with news providers, arguing that much of the wealth generated through platforms came from the work done by regional news companies.124 In his view this would not be a handout but a structural redistribution of the revenue that comes from their content.

67.This is further exacerbated by the COVID-19 crisis. News publishers are seeing declining advertising revenue and reduced subscriptions as a result of the economic downturn. They are also experiencing increasing difficulty in reporting stories due to restrictions on movement.125 Research from Enders Analysis suggests that without intervention, over £1 billion of revenue could fall out of the publishing industry in 2020.126 This includes an annual advertising decline of 50 per cent in print and 25 per cent for online publishers. UNESCO argues that there is a vital window for making an impact through timely stimulus or rescue packages for independent journalism and news outlets. It states that support for journalism is essential to ensure its sustainability as a public good whilst the pandemic takes a further toll on media institutions.127

68.The need for reform to protect the future of journalism is not a new issue and was the subject of a recent review by Dame Frances Cairncross. She told us that many of our concerns about democracy and digital technologies overlapped with the concerns addressed in her review.128 The Cairncross Review into sustainable high-quality journalism in the UK is a substantial piece of work and should form the basis of future work to ensure the future of public interest journalism.

69.While the Government has understandable concerns that it should not be seen to be interfering with the work of the free press, there are many areas where it agrees with the Cairncross Review. The Government has agreed to work on the Review’s recommendation of codes of conduct to formalise the relationship between publishers and online platforms; in working with interested parties to develop this further, it is supported by the Competition and Markets Authority’s (CMA) provisional recommendation that there is a strong case for a code of conduct.129 There is also agreement on the need for platforms to do more to prioritise reliable and trustworthy sources of information, which may form part of the Online Harms programme of work. The Government also agreed with the Review that there is a need for additional funding of public interest news, and suggests that this be done by expanding the Local Democracy Reporting Service. The BBC has indicated its intention to set up a new body to take over the running of the scheme and is seeking additional funding to expand it further. Jessica Cecil, Director of the BBC Online Project at the BBC, told us that the scheme helps the BBC reach 8 to 10 million people a week, that the BBC is keen to expand it, and that technology platforms, among others, should help fund that expansion.130

70.There is a need for a fundamental rebalancing of power away from technology platforms, which have abused their position to siphon revenue away from public interest journalism. Other democratic countries across the globe are slowly concluding that this is necessary.131 Countries must ask difficult questions about the power of online platforms and their control over the advertising market. In the UK this should mean the CMA moving from its initial review into a full market investigation.

71.The House of Lords Communications and Digital Committee is also currently conducting an inquiry into the future of journalism and this should enable further informed discussion in this area.

72.A new settlement is needed to protect the role of local and public interest news. The Government should work urgently to implement those recommendations of the Cairncross review which it accepts, as well as providing support for news organisations in dealing with the impact of COVID-19.

73.The Competition and Markets Authority should conduct a full market investigation into online platforms’ control over digital advertising.

13 Q 85 (Will Moy)

14 Q 9 (Baroness O’Neill of Bengarve)

15 Full Fact, Research into public views on truth and untruth in the 2019 General Election (December 2019) p 17: [accessed 13 May 2020]

16 Q 32 (Alex Krasodomski-Jones)

17 Kofi Annan Foundation, ‘Protecting Electoral Integrity in the Digital Age’ (January 2020) p 29: [accessed 13 May 2020]

18 Ofcom, ‘Ofcom decisions in recent programmes featuring David Icke and Eamonn Holmes’ (20 April 2020): [accessed 13 May 2020]

19 Ofcom, ‘Covid-19 news and information: consumption and attitudes’ (April 2020) p 1: [accessed 13 May 2020] and Wired, ‘The 5G coronavirus conspiracy theory just took a really dark turn’ (7 May 2020): [accessed 27 May 2020]

20 Full Fact, ‘These aren’t satellite images and they don’t show evidence of mass cremations in Wuhan’ (February 2020): [accessed 13 May 2020]

21 Reuters Institute, ‘Navigating the ‘infodemic’: How people in six countries access and rate news and information about coronavirus’ (April 2020): [accessed 13 May 2020]

22 Reuters Institute, ‘Types, sources, and claims of COVID-19 misinformation’ (April 2020): [accessed 13 May 2020]

23 Richard Burr, ‘Senate Intel Releases New Report on Intel Community Assessment of Russian Interference’ (April 2020): [accessed 13 May 2020]

24 EU vs Disinfo, ‘EEAS Special Report Update: Short assessment of narratives and disinformation around the Covid-19 pandemic’ (April 2020): and The Verge, ‘Why the 5G coronavirus conspiracy theories don’t make sense’ (April 2020): [accessed 13 May 2020]

25 Stanford Cyber Policy Center, ‘Coronavirus Conspiracy Claims: What’s Behind a Chinese Diplomat’s Covid-19 Misdirection’ (March 2020): [accessed 13 May 2020]

26 Full Fact, ‘Another flawed “cost of Corbyn” figure from the Conservatives’ (November 2019): and Full Fact, ‘Labour claims about savings under their policies are not credible’ (December 2019): [accessed 13 May 2020]

27 ‘The true ‘cost of Corbyn’: £2,400 a year for every British worker, claim Tories’, The Telegraph (12 November 2019): and ‘Labour will put £6,716 in your pocket with savings on bills and higher minimum wage’, Mirror UK (3 December 2019): [accessed 13 May 2020]

29 Full Fact, ‘The facts behind Labour and Conservative Facebook ads in this election’ (December 2019): [accessed 13 May 2020]

30 First Draft, ‘The main parties embrace a local digital strategy to win over voters’, (November 2019): and Full Fact, ‘Lowering the Bar’ (November 2019) [accessed 13 May 2020]

31 Full Fact, Research into public views on truth and untruth in the 2019 General Election (December 2019): [accessed 13 May 2020]

32 ASA, ‘About the ASA and CAP’: [accessed 13 May 2020]

33 ASA, ‘Scope of the CAP Code’: [accessed 13 May 2020]

34 ASA, ‘Self-regulation and co-regulation’: [accessed 13 May 2020]

35 Ravi Naik, Political Campaigning: The law, the gaps and the way forward (October 2019) p 37: [accessed 13 May 2020]

36 Written evidence from the ASA (DAD0029)

37 Q 102 (Matthew D’Ancona)

38 Written evidence from the Conservative Party (DAD0095)

39 Written evidence from the Labour Party (DAD0096)

40 Q 77 (Keith Weed)

41 Written evidence from the LSE T3 Commission (DAD0078)

42 Written evidence from the Coalition for Reform in Political Advertising (DAD0071)

43 Written evidence from the Coalition for Reform in Political Advertising (DAD0071)

44 Q 61 (Guy Parker)

45 Q 61 (Lord Currie of Marylebone)

46 Q 61 (Guy Parker)

47 Q 61 (Lord Currie of Marylebone)

48 ‘British political advertising must be regulated. How to do it is a harder question’, The Guardian (3 June 2020): [accessed 3 June 2020]

49 Q 54 (Professor Helen Margetts)

50 Q 78 (Keith Weed), written evidence from the LSE T3 Commission (DAD0078), written evidence from the Coalition for Reform in Political Advertising (DAD0071)

51 Q 63 (Guy Parker)

52 Department for Digital, Culture, Media & Sport (DCMS), Home Office, ‘Online Harms White Paper – Initial consultation response’ (February 2020): [accessed 13 May 2020]

53 Q 282 (Kevin Bakhurst)

54 Q 331 (Caroline Dinenage MP)

55 Centre for Infectious Disease Research and Policy, ‘Facebook studies reveal mistrust winning on vaccine messaging’, 14 May 2020: [accessed 3 June 2020]

56 Q 344 (Chloe Smith MP)

57 Written evidence from the BBC (DAD0062)

58 Written evidence from Channel 4 (DAD0055)

59 Q 105 (Matthew D’Ancona)

60 Written evidence from Avaaz (DAD0073)

61 Full Fact, Report on the Facebook Third-Party Fact Checking programme Jan-June 2019 (July 2019): [accessed 13 May 2020]

62 Correspondence between Facebook and the Digital, Culture, Media and Sport Select Committee, 14 May 2020 (Session 2019–21): [accessed 27 May 2020]

63 Q 306 (Karim Palant)

64 Full Fact, The Full Fact Report 2020: Fighting the causes and consequences of bad information (April 2020) p 97: [accessed 13 May 2020]

65 Google The Keyword blog, ‘Fact Check now available in Google Search and News around the world’ (7 April 2017): [accessed 13 May 2020]

66 Google Search Help, ‘See fact checks in search results’: [accessed 13 May 2020]

67 Full Fact, The Full Fact Report 2020: Fighting the causes and consequences of bad information (April 2020) p 95: [accessed 13 May 2020]

68 ‘Are Muslims paying Council Tax’ (25 November 2019) YouTube video, added by Police Abusing Powers: [accessed 13 May 2020]

69 Full Fact ‘New Google News “Fact Check” label’ (October 2016): [accessed 13 May 2020]

70 Q 243 (Vint Cerf)

71 YouTube Official Blog, ‘Expanding fact checks on YouTube to the United States’ (April 2020): [accessed 13 May 2020]

72 YouTube Help, ‘See fact checks in YouTube search results’: [accessed 13 May 2020]

73 Avaaz, ‘White Paper: Correcting the Record’ (April 2020): [accessed 13 May 2020]

74 Q 306 (Karim Palant)

75 Facebook, ‘An Update on Our Work to Keep People Informed and Limit Misinformation About Covid-19’ (April 2020): [accessed 13 May 2020]

77 Q 95 (Will Moy)

78 Q 306 (Karim Palant)

79 Written evidence from Facebook (DAD0081)

80 Full Fact, ‘Report on the Facebook Third-Party Fact Checking programme: Jan-Jun 2019’ (July 2019) p 8: [accessed 13 May 2020]

81 Q 95 (Allan Leonard)

82 Q 307 (Karim Palant)

83 Buzzfeed News, ‘Facebook Took Down A Fact-Check of an Anti-Abortion Video After Republicans Complained’ (11 September 2019): [accessed 13 May 2020]

84 Poynter, ‘IFCN concludes its investigation into Science Feedback complaint’ (27 September 2019): [accessed 13 May 2020]

85 The Verge, ‘Twitter removes tweets by Brazil, Venezuela presidents for violating Covid-19 content rules’ (30 March 2020): [accessed 13 May 2020]

87 David Pierce, ‘Facebook Takes Sides on Covid-19 Protests’, Protocol, (21 April 2020): [accessed 13 May 2020]

88 David Pierce, ‘Facebook Takes Sides on Covid-19 Protests’, Protocol (21 April 2020): [accessed 13 May 2020]

89 Good Morning America, ‘Mark Zuckerberg on Facebook’s plan to fight coronavirus: Full interview’ (Video), (20 April 2020): [accessed 13 May 2020]

90 Poynter, ‘The International Fact Checking Network’: [accessed 13 May 2020]

91 Poynter, ‘International Fact Checking Network: The commitments of the code of principles’: [accessed 13 May 2020]

92 Q 113 (Paddy McGuinness)

93 Poynter, ‘The International Fact Checking Network: The code and the platforms’: [accessed 13 May 2020]

94 Poynter, ‘Verified signatories of the IFCN code of principles’: [accessed 13 May 2020]

95 Reuters, ‘Reuters expands efforts to combat misinformation with extension of fact-checking partnership with Facebook in the United Kingdom’ (March 2020): [accessed 13 May 2020]

96 Q 95 (Will Moy)

97 The Verge, ‘Facebook’s only fact-checking service in the Netherlands just quit’ (26 November 2019): [accessed 13 May 2020]

98 Q 95 (Jenni Sargent)

99 Q 234 (Sir Julian King)

100 Q 95 (Jenni Sargent)

101 Q 332 (Caroline Dinenage MP)

102 Q 93 (Ed Humpherson)

103 Q 96 (Ed Humpherson)

104 UNESCO, ‘Disinfodemic: Dissecting responses to Covid-19 disinformation: Policy brief 2’ (April 2020): [accessed 13 May 2020]

105 Oral evidence taken before the House of Commons DCMS Select Committee, 22 April 2020 (Session 2019–21), Q 21 (Oliver Dowden MP)

106 Q 93 (Ed Humpherson)

107 Q 86 (Allan Leonard)

108 Q 86 (Ed Humpherson)

109 Q 86 (Will Moy)

110 UK Statistics Authority, ‘Statement from the Office for Statistics Regulation – Covid-19 Update’ (April 2020): [accessed 13 May 2020]

111 Pamela Duncan and Niamh McIntyre, ‘Why No 10’s Covid-19 death tolls slides don’t tell the whole story’, The Guardian, (5 May 2020): [accessed 13 May 2020]

112 Statistics Commission, Report No.34 Data on Demand – Access to Official Statistics, (June 2007): and Public Administration Committee, Communicating statistics: Not just true but also fair (First Report, Session 2013–14, HC 190)

113 Public Administration and Constitutional Affairs Committee, Governance of official statistics: redefining the dual role of the UK Statistics Authority; and re-evaluating the Statistics and Registration Service Act 2007 (Eighth Report, Session 2017–19, HC 1820)

114 Q 85 (Will Moy)

115 Q 86 (Will Moy)

116 Q 259 (Alan Renwick)

117 Q 151 (Liz Moorse)

119 Google, ’Using AI to Fight Misinformation’ (13 March 2020): [accessed 13 May 2020] and Written evidence from Channel 4 (DAD0055)

120 Disclosure: This Committee’s Policy Analyst was seconded to Full Fact as a fact checker for the 2019 General Election.

121 Q 102 (James Mitchinson)

123 Q 108 (James Mitchinson)

124 Q 108 (James Mitchinson)

125 Politico, ‘Coronavirus reignites feud between publishers and platforms’ (23 April 2020): [accessed 13 May 2020]

127 UNESCO, ‘Disinfodemic: Dissecting responses to Covid-19 disinformation: Policy brief 2’ (April 2020): [accessed 13 May 2020]

128 Written evidence from Dame Frances Cairncross (DAD0068)

129 DCMS, ‘Government response to the Cairncross Review: a sustainable future for journalism’ (January 2020): [accessed 13 May 2020]

130 Q 109 (Jessica Cecil)

131 Australian Competition & Consumer Commission, ‘Mandatory news media bargaining code: Concepts paper’ (19 May 2020): [accessed 28 May 2020]

© Parliamentary copyright 2018