Documents considered by the Committee on 5 June 2019

12 Online disinformation

Committee’s assessment: Politically important

Committee’s decision: Cleared from scrutiny; drawn to the attention of the Digital, Culture, Media and Sport Committee

Document details: Report from the Commission to the European Parliament, the European Council, the Council, the European Economic and Social Committee and the Committee of the Regions on the implementation of the Communication “Tackling online disinformation: a European Approach”

Legal base: —

Department: Digital, Culture, Media and Sport

Document Number: (40258), 15475/18, COM(2018) 794

Summary and Committee’s conclusions

12.1 This report provides an overview of the Commission’s developing workstream on tackling online disinformation, which was not part of the original Digital Single Market Strategy but was initiated late in the term of the current Commission.

12.2 Various developments have fed into this strand of work. In March 2015 the European Council invited the High Representative to develop an action plan to address Russia’s disinformation campaigns,92 which resulted in the establishment of the East Strategic Communication Task Force, operational since September 2015. In late 2017 the Commission set up a High-Level Expert Group to advise on the matter, which reported on 12 March 2018.93

12.3 On 3 May 2018 the European Commission published a Communication entitled ‘Tackling online disinformation: a European Approach’,94 which described the rise across the EU of disinformation, defined as verifiably misleading or false information created or shared online with the intent of causing public harm. The Communication noted that mass online disinformation campaigns were being widely used by a range of domestic and foreign actors to sow distrust and create societal tensions, and that disinformation campaigns by third countries could be “part of hybrid threats to internal security, including election processes”. The Communication observed that the drivers which amplify this type of content include algorithms and click-based digital advertising models. The Commission also stated that the measures taken to date by online platforms, particularly social media platforms, had not been sufficient to tackle the spread of disinformation online, necessitating EU action.

12.4 The Communication committed the Commission to a series of follow-up actions.

12.5 Perhaps the most significant proposal announced in the Communication was the development, through a multi-stakeholder forum, of an industry Code of Practice: a voluntary, non-binding document applying only to those industry stakeholders that choose to participate in it. The resulting Code of Practice on Disinformation, published on 26 September 2018,95 establishes a wide range of commitments that signatories undertake to implement.

12.6 The Code of Practice was accompanied by an annex of best practices adopted by platforms to tackle disinformation. The Commission indicated that it would monitor industry’s implementation of the Code and, if the results proved unsatisfactory, would propose further actions by the end of 2018, potentially including regulation. As noted in a previous Report,96 the Minister for Digital and the Creative Industries (Margot James MP) took the view that the Communication was non-legislative and did not require UK action, and that the Government was “broadly supportive of the EU’s actions in this area, and do[es] not believe that such action will prevent the UK taking action in this area”.97 The Committee asked the Minister to ensure that the progress report on the Code’s initial implementation was deposited with the Committee for scrutiny when published.98

12.7 On 6 December 2018 the Commission published an implementation report on the earlier Communication on tackling disinformation,99 which concluded that, overall, the actions outlined in the Communication had been either accomplished or launched. The Commission stated that it would closely monitor the implementation of the ongoing actions addressed in that report, in particular the Code of Practice on Disinformation, and would continue to evaluate whether further actions, including measures of a regulatory nature, were necessary. Subsequent reports on the implementation of the Code specifically100 concluded that, although the Commission was encouraged that the signatories’ reports provided further information on the policies the platforms had developed to meet their commitments, it remained concerned by the platforms’ failure to provide specific benchmarks to measure progress, by the lack of detail on the actual results of the measures already taken, and by the lack of detail showing that new policies and tools were being deployed in a timely manner and with sufficient resources across all EU Member States.

12.8 On 5 December 2018, as trailed in the previous Communication on online disinformation and in response to a request from the European Council following the Salisbury chemical attack,101 the Commission also published its Action Plan against Disinformation. The Committee previously cleared that file from scrutiny on 27 February 2019, partly on the basis that the UK would have left the European Union by the time the elections to the European Parliament took place, reducing the relevance of the Communication to the UK; however, given that the UK is now taking part in those elections, the file is reconsidered here, alongside the implementation report on the Code of Practice on Disinformation.

12.9 The Communication first provides an account of the threats posed by online disinformation. It states that the actors behind disinformation may be internal to Member States or external, including state and non-state actors, and that reports suggest more than 30 countries are using disinformation either domestically or internationally. The EU Hybrid Fusion Cell suggests that disinformation by the Russian Federation poses the greatest threat to the EU: “It is systematic, well-resourced, and on a different scale to other countries.” The Communication reports that social media, and increasingly private messaging, have become important means of spreading disinformation. It concludes that the tools and techniques involved are changing fast, and the response therefore needs to evolve just as rapidly.

12.10 The Communication adds that the Union is interested in working with partners in three “priority regions”, namely the Union’s Eastern and Southern Neighbourhood and the Western Balkans, in each of which a Strategic Communication Task Force (SCTF) has been set up under the European External Action Service.

12.11 The Action Plan presents a range of actions grouped under four pillars:

i) Under the first pillar, ‘Improving the capabilities of Union institutions to detect, analyse and expose disinformation’, the Commission primarily proposes to reinforce the Strategic Communication Task Forces and the EU Hybrid Fusion Cell in the European External Action Service (EEAS), as well as EU delegations in the neighbourhood countries. An increase of 50–55 new staff members is projected over the next two years. The EEAS’ strategic communication budget to address disinformation and raise awareness of its adverse impact is expected to more than double, from €1.9 million in 2018 to €5 million in 2019. Member States are expected to complement these measures by reinforcing their own means of dealing with disinformation.

ii) The second pillar, ‘Strengthening coordinated and joint responses to disinformation’, is derived from the insight that the first hours after disinformation is released are critical for detecting, analysing and responding to it. The Commission therefore proposes to create a dedicated Rapid Alert System among the EU institutions and Member States to facilitate the sharing of data and assessments of disinformation campaigns and to provide alerts on disinformation threats in real time, through a dedicated technological infrastructure. It is suggested that each Member State should designate a contact point, ideally positioned within strategic communications departments, which would share alerts and ensure coordination with all other relevant national authorities as well as with the Commission and the European External Action Service. The Rapid Alert System would be closely linked to existing EU capabilities and share information and best practices with the G7 and NATO.

iii) The third pillar, ‘Mobilising the private sector to tackle disinformation’, calls upon signatories of the Code of Practice to implement its terms swiftly, in particular (i) ensuring scrutiny of political advertising, (ii) closing down fake accounts and (iii) identifying automated bots. The Commission will also carry out a comprehensive assessment at the conclusion of the Code’s initial 12-month period of application and, should the impact of the Code of Practice prove unsatisfactory, may propose further actions, including regulation.

iv) The fourth pillar, ‘Raising awareness and improving societal resilience’, involves raising wider public awareness of ‘fake news’, including via national media literacy programmes and through EU-funded research to improve understanding of why people are “drawn to disinformation narratives”. Support will be provided to national multidisciplinary teams of independent fact-checkers and researchers to detect and expose disinformation campaigns across social networks.

12.12 In the Government’s Explanatory Memorandum,102 dated 15 January 2019, the Minister for Digital and the Creative Industries (Margot James MP) observed that these documents did not constitute proposals for legislation, and that the cross-border nature of the internet and of the spread of disinformation meant that a European approach and cooperation between Member States were necessary to ensure consistent and effective action. As the UK was not expected to participate in the forthcoming elections to the European Parliament when the Minister produced her Explanatory Memorandum, she concluded that “therefore the actions are not applicable to the UK”. This is a curious assertion given that, had there been a negotiated withdrawal from the EU, which is current Government policy, the actions detailed in the Communication which did not relate exclusively to the European elections would have applied to the UK during the transition and implementation period provided for in the Withdrawal Agreement.

12.13 In view of the latest extension of the Article 50 period on 11 April, all of the actions proposed in the Communication are now relevant to the UK. Officials at the Department have therefore been contacted for an update on the Government’s treatment of the Action Plan, and their response is summarised in this report’s conclusions.

12.14 Two other domestic developments are also of note. On 18 February 2019 the House of Commons Digital, Culture, Media and Sport Committee published a Report, “Disinformation and ‘Fake News’”,103 which called for a compulsory Code of Ethics for tech companies, overseen by an independent regulator with the power to launch legal action against companies breaching the code. It also recommended that the Government reform current electoral communications laws and the rules on overseas involvement in UK elections, and oblige social media companies to take down known sources of harmful content, including proven sources of disinformation. The Report further noted that, while the Government had accepted evidence of Russian activity in the Skripal poisoning case, it had been reluctant to accept evidence of interference in the 2016 UK referendum. It called on the Government to examine the extent of the targeting of voters by foreign actors during past elections, and to consider whether current legislation to protect the electoral process from malign influence was sufficient, suggesting that legislation should be explicit on the illegal influencing of the democratic process by foreign players. The Committee urged the Government to respond in its White Paper. The Committee has subsequently formed a Sub-Committee on Disinformation,104 which will hold evidence sessions in the coming months and discuss the Government’s response to its Report, which has not yet been published.

12.15 On 8 April 2019 the Government published its Online Harms White Paper,105 in which, according to a government briefing note issued ahead of the launch, it proposed that an independent regulator be tasked with drawing up and policing statutory codes of conduct setting out how companies should protect users from violent content, suicide material, disinformation and cyberbullying, as well as prevent children from accessing inappropriate material. A duty of care towards platforms’ users would be established, and the regulator would specify what platforms must do to meet that duty. As such, the White Paper proposes a more interventionist regulatory approach than the European Union has so far taken in relation to online disinformation. It is open for consultation until 1 July 2019.

12.16 We have taken note of the Government’s Joint Explanatory Memorandum covering the Commission’s Action Plan for tackling online disinformation and the Commission’s report on the implementation of an earlier Communication on the same subject, deposited in response to our request.106 The Commission has been steadily progressing this workstream through non-legislative activity over the past year, with the prospect of future legislative activity to the extent necessary.

12.17 The main points of note in the Action Plan are that:

12.18 The main points of note in relation to the report on the implementation of the Code of Practice on Disinformation are that:

12.19 The Minister did not in her Explanatory Memorandum object to the contents of these non-legislative documents, and acknowledged that the cross-border nature of the internet and the spread of disinformation meant that a European approach was necessary to ensure effective action; however, given the UK’s (then) scheduled exit from the European Union on 29 March 2019, the Minister’s initial assessment was that “the European election, running in May, will take place after the UK’s scheduled exit from the EU and therefore the actions are not applicable to the UK.”

12.20 We consider this assertion incorrect: had there been a negotiated withdrawal from the European Union in line with Government policy, all of the actions except those directed exclusively at the elections to the European Parliament would have continued to apply to the UK, as to other Member States, during the transition and implementation period provided for in the Withdrawal Agreement. If this is not the case, we ask the Government to provide us with its reasoning.

12.21 Given that the UK is, in any case, now participating in the European elections,107 we asked officials at the Department for Digital, Culture, Media and Sport to provide us with an update on whether the Government has adapted its approach to the Action Plan accordingly. They reported that, when it became apparent in mid-March that the UK was going to participate in the European elections, the Department reviewed the Action Plan to establish whether it contained any additional actions which they needed to take, but concluded that the only case where specific additional action was required was in relation to the Rapid Alert System. This system only became operational in March, since when the UK has been attending the meetings of the network and contributing to the platform.

12.22 As regards the implementation of the Code of Practice by its signatories, online platforms had not expected to have to roll out measures targeted at the European elections in the UK, owing to EU exit, and had therefore not prioritised them in the UK. However, when it became apparent in March that the UK was participating in the European elections, officials contacted the relevant platforms to ask them to ensure that the UK was covered by any measures they were taking to implement the Code. Officials have been assured that, although the Code’s signatories have not yet fully implemented every element of it, those elements which have been implemented have also been implemented for the UK. One visible example is that Google, Twitter and Facebook have published public archives of political advertising in the UK.108

12.23 Given that the key actions provided for in the Action Plan and the Code of Practice are now being implemented, that the Report is non-legislative, and that the Commission will be bringing forward further documents and potentially legislative proposals in the coming months, we now clear this document from scrutiny. Given its recent Report and ongoing work on online disinformation, we draw this report to the attention of the Digital, Culture, Media and Sport Committee.

Full details of the documents

Report from the Commission to the European Parliament, the European Council, the Council, the European Economic and Social Committee and the Committee of the Regions on the implementation of the Communication “Tackling online disinformation: a European Approach”: (40258), 15475/18, COM(2018) 794.

Previous Committee Reports

Forty-first Report of Session 2017–19, HC 301–xl, chapter one (24 October 2018).


92 European Council conclusions, 19–20 March 2015.

93 European Commission, Final report of the High Level Expert Group on Fake News and Online Disinformation (12 March 2018).

94 Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, COM(2018) 236 final.

95 Links to the Code and a range of related documents are available in an article on the Commission’s website (26 September 2018).

96 Forty-first Report of Session 2017–19, HC 301–xl, chapter one (24 October 2018).

97 Explanatory Memorandum from the Government (tackling online disinformation) (21 May 2018).

98 Forty-first Report of Session 2017–19, HC 301–xl, chapter one (24 October 2018).

99 Report from the Commission to the European Parliament, the European Council, the Council, the European Economic and Social Committee and the Committee of the Regions on the implementation of the Communication “Tackling online disinformation: a European Approach”, COM(2018) 794.

100 European Commission, Code of Practice on Disinformation: Summary of the signatories’ first reporting (29 January 2019); European Commission, Code of Practice on Disinformation: Intermediate Targeted Monitoring—January Reports (29 January 2019).

101 European Council conclusions, 22 March 2018.

102 Explanatory Memorandum from the Minister (15 January 2019).

103 Digital, Culture, Media and Sport Committee, Eighth Report of Session 2017–19, 18 February 2019, HC 1791.

104 Digital, Culture, Media and Sport Committee, Tenth Report of Session 2017–19, 26 March 2019, HC 2090.

105 HM Government, Online Harms White Paper, 8 April 2019.

106 See Forty-first Report of Session 2017–19, HC 301–xl, chapter one (24 October 2018).

107 The elections will have taken place by the time this report is published.

108 The Google UK political ad transparency archive can be found here, Twitter’s UK political ad transparency archive is here, and Facebook’s UK ad library report is available here.




Published: 11 June 2019