Forty-fourth Report of Session 2019–21

6 Preventing the dissemination of terrorist propaganda online82

The proposed Regulation is legally and politically important because:

  • it concerns an area of policy—the provision of digital services—which is “inherently cross-border in nature”;
  • once it becomes EU law, it will apply to UK tech companies offering services in the EU which could be used to disseminate terrorist material; and
  • it illustrates different EU and UK approaches to addressing terrorist use of the internet which may be mirrored in wider EU regulation on digital services and in the Government’s own plans to legislate on “online harms” later this year.

Action

  • No further action, following the agreement reached by the Council and European Parliament on a compromise text.
  • Draw to the attention of the Home Affairs Committee, the Justice Committee, the Digital, Culture, Media and Sport Committee and the Joint Committee on Human Rights, given the possibility of important synergies with the Government’s Online Safety Bill.

Overview

6.1 The use of the internet to incite or glorify acts of terrorism and to radicalise and recruit terrorists has become a major concern for law enforcement. In September 2018, the European Commission published a proposal for a Regulation to prevent the dissemination of terrorist content online. It would empower national authorities to order the swift removal of terrorist material found on the web and require companies hosting online platforms to take active steps to prevent their platforms from being used to spread terrorist propaganda. Applying a uniform set of rules would, the European Commission believes, be far less burdensome for companies operating within the EU’s Digital Single Market than the current patchwork of national rules. The Council agreed its general approach on the proposal in December 2018, with the European Parliament deciding on its negotiating position in April 2019.83

6.2 Since we last considered the proposed Regulation in November 2020:

  • the Council and the European Parliament have reached political agreement on a compromise text;84 and
  • the European Commission has published its new digital services package, including a proposed Digital Services Act.

6.3 When we wrote to the Government last November,85 we highlighted the potential for important synergies in the EU and UK approaches to the regulation of digital services, recalling that the Government had previously made clear (in relation to the EU’s proposal on online terrorist content) that it would “want to ensure alignment of UK and EU law, particularly in an area which is inherently cross-border in nature”.86 We requested an update on the outcome of negotiations on the proposed Regulation. We also underlined the need to monitor wider EU developments in the digital sphere to help inform the regulatory choices available to the UK when introducing its own legislation to tackle online harms, to understand how the EU’s regulatory framework may affect businesses, consumers and other online users in the UK, and to manage the consequences of regulatory divergence.

6.4 In her letter of 20 January 2021, the Minister for Countering Extremism (Baroness Williams) updates us on developments on the proposed Regulation on terrorist content online, the European Commission’s new digital services package, and the interaction between these EU initiatives and the Government’s own proposals to tackle online harms.

Political agreement on the proposed Regulation

6.5 The Minister describes the key features of the agreement reached by the Council and the European Parliament. We have supplemented some of the information, as the final compromise text was published after the Minister wrote to us.

6.6 To illustrate how the proposed Regulation will work in practice, the Minister provides the following example, which we have adapted to reflect the final compromise text agreed by the Council and European Parliament. The example concerns a national authority in one Member State (France) issuing a removal order to a hosting service provider (Facebook) based in a second Member State (Ireland):

  • the French authorities identify a piece of terrorist content on Facebook;
  • the French authorities issue a removal order to Facebook, giving Facebook one hour to act;88
  • Facebook must remove the offending piece of content within one hour but, as it disputes the removal order, appeals to the Irish authorities to review it;
  • the Irish authorities must review the offending piece of content within 72 hours; and
  • if the appeal is successful, the removal order ceases to have legal effect and Facebook must reinstate the content (or enable access to it) immediately.89

6.7 The Minister underlines the important role played by the German Presidency in securing agreement on a compromise text before the end of its term in December 2020. While certain elements of the European Commission’s original proposal were removed, notably the possibility of mandating the use of automated tools to restrict the uploading of terrorist content, the Minister notes that many key features have been retained and describes the outcome as “a highly positive step”. The final compromise text still has to be formally approved by the Council and the European Parliament. Once the Regulation is published in the EU’s Official Journal, Member States will have 12 months in which to prepare for its provisions taking effect. It is therefore likely to apply in the first half of 2022.

Other EU initiatives on digital services

6.8 The Minister explains that the proposed Regulation on terrorist content focusses on the removal of terrorist content once it has been made accessible to the public, whereas the EU’s proposed Digital Services Act takes a more preventative approach, seeking to ensure that platforms are designed with safety in mind and that companies have effective processes in place to prevent users being subjected to a broader range of harmful material when using their services. There is nonetheless some degree of overlap. The Minister highlights the following elements which are relevant to the regulation of terrorist content online:

6.9 The Minister notes that the French Government is keen to reach an agreement on the proposed Digital Services Act during its EU Presidency in the first half of 2022, though she anticipates that this timetable will be challenging. In any event, the Act is unlikely to take effect until 2023 at the earliest.

Implications for the UK’s Online Safety Bill

6.10 The Minister confirms that the Full Government Response (“FGR”) to the Online Harms White Paper forms the basis for the Government’s Online Safety Bill. She says that officials in the Home Office, the Department for Digital, Culture, Media and Sport, and the UK Mission to the EU have been in touch with the European Commission to discuss the Government’s plans and the EU’s recent legislative developments.

6.11 The Minister explains that the Government’s Response is broader in scope than the EU Regulation on terrorist content but bears some similarity to the EU’s proposed Digital Services Act (“DSA”), as both focus on protecting users from a range of illegal or harmful content and activity online. She continues:

The DSA and the FGR both note the importance of the largest tech companies (classed as ‘Category 1’ services in the FGR) taking additional, rigorous measures due to their significant reach—in the UK, Category 1 services must take additional measures to tackle legal-but-harmful content, and publish transparency reports; in the EU, the DSA specifies that the largest platforms must conduct independent audits of their ability to tackle abuse of their services, for example.

6.12 The Minister explains how the approaches taken by the EU and the UK to terrorist content online differ:

In terms of legislation to specifically tackle terrorist use of the internet, you will notice that, in the TCO Regulation, the EU has taken a different approach to the UK, by specifying a clear set of procedures and prescribed expectations that companies will follow to permit the take-down of terrorist content within one hour of receiving a removal order. The UK has deliberately chosen not to go down the one-hour removal route: whilst we recognise that there are strong benefits to securing the expeditious removal of terrorist content before it can be viewed and shared, we believe that by ensuring companies have adequate systems and processes in place, rather than focussing on removing specific pieces of content, our legislation will have a positive impact on behaviours and expectations, and will reduce the amount of terrorist content on platforms in the first place. This approach will allow Ofcom as the Online Safety Regulator to take a flexible approach rather than setting specific targets in legislation which could have unintended consequences, such as impacts on freedom of expression if content is removed in an overly precautionary way.

6.13 The Government has published a voluntary and non-binding Interim Code of Practice on Terrorist Content and Activity Online to help tech companies understand how to mitigate the range of risks arising from online terrorist content and protect their users and the general public from harm.90 It will be replaced by a statutory code of practice to be drawn up by Ofcom once it has been established as the independent regulator for online harms under the UK’s Online Safety legislation. The Government says that the interim code of practice takes into account how the UK’s future online harms regulatory framework is likely to work and is intended to ensure that, by taking early action to protect their users from terrorist content, companies will be better prepared when the online harms legislation comes into force.

6.14 The Minister recognises that “tech companies are experiencing several significant changes in the international regulatory environment”. Ministers and officials have intensified their engagement with these companies to help them to adapt to the changes that the UK’s new regulatory framework on online safety will require. She believes that the EU and UK approaches are “broadly complementary” and that “when considered in the round, internet users across the UK and EU will enjoy significantly increased protections against terrorist content online, and other online harms, thanks to these developments”. The Government will “continue to engage tech companies on a range of issues, including terrorist content online, going forward, to monitor the effectiveness of the legislation”.

Our assessment

6.15 We welcome the Minister’s comprehensive response, her recognition of the broader regulatory framework in which the Government is developing its own domestic legislation on online harms, and the Government’s active engagement with the EU through the UK Mission to the EU.

6.16 While the Minister views the political agreement reached on the proposed Regulation on terrorist content online as “a highly positive step”, it is striking that the Government has chosen to take a different approach. Once it becomes EU law, the Regulation will create a legally binding obligation enforceable against any internet company offering services in the EU to remove terrorist content (or disable access to it) at the behest of a national authority in any EU Member State. By contrast, the UK’s interim code of practice on terrorist content and activity online sets out guidance on best practice which companies may apply to prevent their platforms being exploited by terrorists. As it is a voluntary code, it encourages but cannot compel companies to comply with removal requests made by UK law enforcement bodies such as the Metropolitan Police’s Counter Terrorism Internet Referral Unit.

6.17 The EU and UK approaches both seek to reconcile, in different ways, the public interest in preventing terrorist use of the internet and in protecting freedom of expression and the right to receive and impart information.91 The efficacy of each approach in limiting or eliminating the use of the internet to encourage terrorism and violent extremism will depend on the wider regulatory environment of which each forms part and which is still very much in the process of being constructed.

6.18 As the Minister recognises, the immediate concern for UK or UK-based tech companies will be to understand how differences in the international regulatory environment affect their businesses. As regards EU-level regulation, these companies will first need to establish whether their activities are within the scope of the proposed Regulation on terrorist content online. This will be the case if they offer services within the EU which involve the dissemination of information to the public and they have a “substantial connection” to one or more Member States.92 These companies will need to designate a contact point and, if they are not headquartered in the EU, a legal representative there to receive and process removal orders and ensure they are complied with. They will also need to set out their policies on tackling online terrorist content in their terms and conditions, establish an effective complaints mechanism for content providers who may wish to challenge a removal order, and publish an annual transparency report if they have taken action themselves to remove terrorist content or been required to do so. These are legal obligations which, if breached, may result in fines.

6.19 While UK tech companies may well fall within the scope of the proposed Regulation if they provide services within the EU, regulatory or law enforcement authorities in the UK will not have the same powers as their EU counterparts to issue removal orders against tech companies based in the EU which offer services in the UK unless a specific provision to this effect is included in the UK’s domestic laws. Even then, the lack of a cross-border enforcement mechanism may well make the orders too difficult to enforce in practice, creating the potential for a law enforcement capability gap.

Action

6.20 For the reasons we have set out in this chapter, it is too soon to tell whether EU and UK approaches to the regulation of terrorist content and other online harms will lead to significant divergences in law and policy, or to assess what this might mean for companies operating across multiple jurisdictions and for public safety and security. Much will depend on the wider frameworks on online harms and digital services which are currently being developed by the EU and the UK. We trust that the Government will continue to engage actively with EU and other international partners, not least because the threats posed by online harms are “inherently cross-border in nature”93 and likely to be magnified by inadvertent regulatory gaps.

6.21 While the political agreement reached on the proposed Regulation on terrorist content online brings our scrutiny of the proposal to an end, we recall the important synergies between developments at EU level and the Government’s own domestic agenda to legislate on a broader range of online harms and draw this chapter to the attention of the Home Affairs Committee, Justice Committee, Digital, Culture, Media and Sport Committee, Business, Energy and Industrial Strategy Committee, and the Joint Committee on Human Rights.

82 Proposal for a Regulation on preventing the dissemination of terrorist content online; Council number 12129/18 + ADDs 1–3, COM(18) 640; Legal base—Article 114 TFEU, ordinary legislative procedure, QMV; Department—Home Office; Devolved Administrations consulted; ESC number 40069.

83 See our predecessor Committee’s Seventy-third Report HC 301–lxxi (2017–19), chapter 6 (4 September 2019) for further details on the European Parliament’s position. Also see Forty-sixth Report HC 301–xlv (2017–19), chapter 16 (28 November 2018), Fiftieth Report HC 301–xlix (2017–19), chapter 5 (9 January 2019) and Thirtieth Report HC 229–xxvi (2019–21) chapter 6 (25 November 2020).

84 The compromise text is expected to be based on the Council’s “first reading” position.

85 See our letter of 25 November 2020 to the Minister for Security (Rt Hon. James Brokenshire MP).

86 See the letter of 24 July 2019 from the then Minister for Security and Economic Crime (Rt Hon. Ben Wallace MP).

87 Directive (EU) 2017/541 on combating terrorism.

88 Facebook would have to remove the content specified in the order within one hour of receiving it. It would have 48 hours to decide whether to appeal its removal. The authorities in Ireland would also be able to call in and scrutinise the removal order, acting on their own initiative. They would have 72 hours to decide whether to carry out their own scrutiny and reach a decision. If they act on an appeal made by Facebook, the 72 hours to reach a decision on the appeal would run from when they receive details of the appeal.

89 We have changed the deadlines to reflect those included in the final compromise text.

90 It is one of two interim codes of practice that the Government is publishing ahead of its Online Safety Bill. The second is on online child sexual exploitation.

91 Freedom of expression is protected under the European Convention on Human Rights (Article 10) and the EU Charter of Fundamental Rights (Article 11). The EU Charter also recognises the freedom to conduct a business “in accordance with Union law and national laws and practices” (Article 16).

92 A company which has its head office or registered office in a Member State will have a substantial connection. So too will a company which is not based in the EU but has a significant number of users of its services in one or more Member States or which targets its activities towards one or more Member States.

93 See the letter of 24 July 2019 from the then Minister for Security and Economic Crime (Rt Hon. Ben Wallace MP) to the Chair of the European Scrutiny Committee (Sir William Cash MP).




Published: 27 April 2021