Documents considered by the Committee on 24 October 2018

6 Preventing the dissemination of terrorist propaganda online

Committee’s assessment: Legally and politically important

Committee’s decision: Not cleared from scrutiny; further information requested; drawn to the attention of the Home Affairs Committee, the Justice Committee, the Digital, Culture, Media and Sport Committee and the Joint Committee on Human Rights

Document details: Proposal for a Regulation on preventing the dissemination of terrorist content online

Legal base: Article 114 TFEU, ordinary legislative procedure, QMV

Department: Home Office

Document Number: (40069), 12129/18 + ADDs 1–3, COM(18) 640

Summary and Committee’s conclusions

6.1 The global reach of the internet and its penetration into all aspects of daily life make it a potent tool for terrorist propaganda and recruitment. One of the core tasks of the EU Internet Referral Unit (based in Europol’s European Counter-Terrorism Centre) is to strengthen cooperation between the internet industry and the law enforcement community in identifying and removing online terrorist content. The Unit itself has no enforcement powers — whilst it can “flag” and refer terrorist content to the host platform, the decision to remove the offending material is taken by the hosting company in accordance with its terms and conditions. Between becoming operational in July 2015 and the end of 2017, the Internet Referral Unit made 44,807 referrals across 170 online platforms and achieved a 92% success rate for removal.81 Despite this progress, the European Commission considers that voluntary referrals alone are insufficient to ensure that terrorist content is detected and swiftly removed before it can be disseminated across other online platforms. At their June Summit meeting, EU leaders welcomed the Commission’s plans to put forward a new law.82 In his State of the Union speech in September 2018, Commission President Jean-Claude Juncker announced that the EU would be proposing “new rules to get terrorist content off the web within one hour — the critical window in which the greatest damage is done”.83

6.2 The Commission’s proposal for a Regulation would establish a harmonised set of rules to prevent the dissemination of terrorist content on internet platforms hosted by service providers (“hosting service providers”) who offer their services in the EU, even if their headquarters are outside the EU.84 The main elements of the proposal are:

6.3 The new power to issue a removal order would operate alongside existing voluntary referral mechanisms, but with a clear obligation on hosting service providers to put in place the necessary operational and technical measures to ensure that referrals are dealt with expeditiously.89 They would also be required to preserve any terrorist content and related data removed from their platform for at least six months to allow for the completion of any complaint and review procedures and in case the information is needed for a criminal investigation or prosecution.90 The proposed Regulation also seeks to strengthen cooperation between national authorities and with Europol to avoid duplication, enhance coordination and prevent any interference with ongoing investigations. It envisages that Member States and hosting service providers would be able to use tools and platforms developed by Europol to communicate with each other, channel removal orders and referrals and provide feedback.91

6.4 The proposed Regulation includes a range of safeguards in recognition of “the fundamental importance of freedom of expression and information in an open and democratic society”.92 The content provider or hosting platform would be entitled to request a detailed statement of the reasons justifying the issuing of a removal order and to appeal against it.93 In addition, hosting service providers would be required to:

6.5 To ensure effective implementation, hosting service providers must establish a point of contact (available around the clock) to receive and process removal orders and referrals and, if not established in the EU, designate a legal representative within the EU responsible for ensuring compliance with the requirements of the proposed Regulation.98 Member States must have “the necessary capability and sufficient resources” to fulfil the following functions:

6.6 The proposed Regulation supplements a non-binding Recommendation on measures to tackle illegal online content adopted by the European Commission in March 2018. The Commission considers that legally binding EU rules dealing specifically with online terrorist content are necessary to address the scale of the threat to public security, ensure a more effective response, protect fundamental rights and ensure “a level playing field” for all hosting service providers operating within the EU’s Digital Single Market. The proposal cites an internal market legal base — Article 114 of the Treaty on the Functioning of the European Union (TFEU) — as harmonised rules will provide “clarity and greater legal certainty”, strengthen trust in the online environment and prevent the emergence of different regulatory approaches across the EU which would impede the functioning of the internal market, “creating unequal conditions for companies as well as security loopholes”.101 The Commission is keen to secure the adoption of the proposed Regulation before the next European Parliament elections in May 2019 (meaning that it would have to be approved by March 2019 at the latest). The Commission envisages that the Regulation (a directly applicable measure) would take effect six months after its formal adoption and entry into force.

6.7 In his Explanatory Memorandum of 10 October (see part one and part two), the Minister for Security and Economic Crime (Ben Wallace) welcomes the prospect of EU regulatory action to tackle online terrorist content. Whilst acknowledging the value of voluntary cooperation with service providers, he considers that tech companies “have not gone far enough or fast enough” and that the proposed Regulation will “lay the groundwork and support our own intention to legislate on illegal online content”, following the publication later this year of a White Paper covering “the full range of online harms”. He shares the Commission’s view that a fragmented framework of national rules would be burdensome for companies operating within the EU’s digital single market and that Article 114 TFEU, rather than EU Treaty provisions on justice and home affairs matters, is the appropriate legal base as the proposed Regulation does not impose any obligations on judicial or law enforcement bodies.

6.8 The Minister considers that the UK is “well-configured” to implement the proposed Regulation as the UK’s Counter-Terrorism Internet Referral Unit (CTIRU) already refers terrorist content to hosting service providers and the Terrorism Act 2006 includes provision for referrals and removal orders.104 Whilst these are valuable tools, the processes involved in detecting and alerting tech companies to terrorist content are nonetheless too slow:

Terrorist content may already have been live on platforms for many hours or days before human moderators come across them, by which time the content will likely have been further disseminated or downloaded by supporters. While we welcome the proposal’s one-hour timeframe set for the removal of content by an HSP [hosting service provider] and the focus on quick removal, it will likely have taken a competent authority more than one hour to pull this request together. Furthermore, as a legal order, judicial sign off may also be needed, adding to the time taken to issue a removal order.

6.9 Moreover, the bulk of terrorist content is detected through tech companies’ own technology or internal reviewers rather than through user reports or referrals, underscoring the need for “automated tools and proactive action” by the companies themselves to counter the speed at which terrorist content is disseminated. The Minister therefore expresses the Government’s “full support” for the provisions in the proposed Regulation requiring hosting service providers to take proactive measures to prevent the dissemination of terrorist content. For these provisions to be effective, he recognises that there is a need to develop the capacity of smaller tech companies to use automated detection tools and to share guidance on proactive measures across Member States so that they can be implemented in a consistent manner, whilst ensuring they are “proportionate” to the economic capacity of each service provider and the degree of risk that their platform may be used to host terrorist content.

6.10 The Minister welcomes the inclusion of a specific recital in the proposed Regulation clarifying that an obligation placed on hosting service providers to take specific and targeted proactive measures to detect terrorist content would not conflict with the EU’s E-Commerce Directive which prevents Member States from imposing on service providers a general obligation to “monitor the information which they transmit or store” or “actively to seek facts or circumstances indicating illegal activity”.102 Whilst the Government would have preferred to go further in fleshing out the proactive measures to be taken by hosting service providers, the Minister recognises that “the Commission has had to act within the limiting parameters of the E-Commerce Directive”. He considers that the approach taken by the Commission in seeking to balance public security and fundamental rights (notably, freedom of expression and freedom to conduct a business) establishes “a helpful precedent as we look to develop measures for our domestic legislation on online harms”.

6.11 The Minister expresses concern that the provisions on penalties leave too much discretion to Member States, creating a risk that they could choose to set a low bar to attract investment by tech companies. He anticipates that the Commission will be invited to produce guidance “to harmonise penalties across Member States”. He also expects the Commission to look again at a provision authorising the Member State issuing a removal order to impose financial penalties, even if the hosting service provider is established in another Member State.103

6.12 Whilst recognising that the proposed Regulation would “potentially interfere with a number of fundamental rights”, the Minister says it includes “robust safeguards” which ensure that any interference is “limited to what is necessary to protect the public” and that “the requirements of proportionality are interwoven” in all the provisions. He is sanguine about the possibility that the UK may be required to implement the Regulation if it takes effect during a post-exit transition/implementation period, observing:

To date, the issue of how we tackle terrorist content online has been a somewhat politically neutral issue within the Brexit landscape, as both sides agree more needs to be done, and we have worked collaboratively with our European partners, both bilaterally (particularly with France and Germany) and through the EU Internet Forum. The UK is the global lead on this issue and EU partners value our expertise and views on how to address terrorist content online. We intend to continue to work closely with our EU partners to tackle these issues whilst we remain a Member State and in the future.

6.13 The Minister notes the Commission’s intention to secure a general approach in the Council by December 2018 and to conclude trilogue negotiations with the European Parliament no later than March 2019.

Our Conclusions

6.14 We thank the Minister for his thorough and informative Explanatory Memorandum. Although the proposed Regulation concerns the regulation of online terrorist content, the Government accepts that its principal purpose is to ensure the proper functioning of the EU’s digital single market and that Article 114 TFEU (on the internal market) is the correct legal base. This means that the UK will be bound to implement the proposal if it is adopted and takes effect before the UK’s exit from the EU (on 29 March 2019) or during a post-exit transition period ending on 31 December 2020. We ask the Minister:

6.15 The Minister indicates that the Commission may be invited to issue “guidance in order to harmonise penalties across Member States”. We ask the Minister:

6.16 We also ask the Minister whether the proposed Regulation should indicate what degree of non-compliance might be considered “systematic” (attracting a fine of up to 4% of the hosting service provider’s global turnover in the previous business year) and how concerned he is that such an open-textured and ill-defined provision may induce excessive caution on the part of hosting service providers and have a chilling effect on freedom of expression.

6.17 One of the Commission’s objectives in putting forward the proposed Regulation is to establish “a clear and harmonised legal framework to prevent the misuse of hosting services for the dissemination of terrorist content online”.105 We ask the Minister whether the proposal (including its recitals) provides sufficient clarity and legal certainty regarding the obligations applicable to the providers of information society services under this and other EU laws. We have in mind the difficulty in reconciling the proactive measures permitted under the proposed Regulation (aimed at preventing the dissemination of online terrorist content) and the obligation under the E-Commerce Directive to protect freedom of expression by precluding any general monitoring obligation.

6.18 Finally, we ask the Minister to clarify the circumstances in which it would be appropriate to make a referral rather than issue a removal order under the proposed Regulation, given that for referrals there is no set time limit within which hosting service providers would be required to remove the offending content or disable access to it.

6.19 Pending further information, the proposed Regulation remains under scrutiny. We draw this chapter to the attention of the Home Affairs Committee, the Justice Committee, the Digital, Culture, Media and Sport Committee and the Joint Committee on Human Rights.

Full details of the documents:

Proposal for a Regulation on preventing the dissemination of terrorist content online: (40069), 12129/18 + ADDs 1–3, COM(18) 640.

Background

6.20 The Minister’s Explanatory Memorandum sets out the context which has informed the Commission’s approach to the regulation of online terrorist content:

In 2017, there were a total of 205 foiled, failed and completed terrorist attacks in the EU, which killed over 68 people and injured many more. The continued high level of terrorist threat in the EU is accompanied by continued concern about the role of the internet in aiding terrorist organisations to facilitate and direct such terrorist activity, but also to radicalise, inspire and recruit, particularly through the dissemination of propaganda. Such illegal terrorist content is shared online in particular through online services that allow the upload of third party content, also known (in this particular context) as hosting service providers (HSPs). This Regulation focuses on the availability and spread of terrorist content on such hosting services.

To date, governments, with the UK leading the way, have worked directly with tech companies on a collaborative, voluntary basis, to address the misuse of their platforms by terrorists to disseminate propaganda, recruit, radicalise and inspire individuals, and plan attacks. We have also engaged with the Global Internet Forum to Counter Terrorism (GIFCT), an industry-led group set up by Google, Facebook, Twitter and Microsoft, as well as through the EU Internet Forum on terrorist content online (launched in December 2015). By sharing our unique understanding of the threat and analysis of how groups such as Daesh use HSPs, the UK has been able to push some of the largest tech companies to do more to tackle this issue. This voluntary approach has yielded results. For example, Facebook announced that it had taken action on 1.9 million pieces of Daesh and al-Qaeda content in Q1 of 2018 — about twice as much as in the previous quarter. Twitter also announced in April that, between July and December 2017, 274,460 accounts were suspended for violations related to promotion of terrorism, with 74% of those accounts suspended before their first tweet.

Despite this progress by some of the largest HSPs (Facebook, Google/YouTube and Twitter), the availability of terrorist content online remains a cause for concern, both in the UK and in the EU. Europol reports that over 150 platforms were identified as hosting terrorist content. A significant number of these companies are established outside of Europe but offer their services within the EU. With terrorist content still accessible on the open Internet and as content continues to diversify and spread on to a wide range of smaller platforms, the EU sees the need to address the issue as an urgent imperative. Voluntary collaboration with some of the major HSPs has only taken us so far, and to fully tackle terrorist content online, regulatory action is necessary.

6.21 The Minister provides details of the outcome of the Commission’s consultation on tackling illegal content online. Whilst hosting service providers supported the continuation of a voluntary approach, they recognised “the potential negative effects of emerging legal fragmentation in the Union” and accepted that, if there had to be legislation, it should take the form of “a targeted intervention on specific issues of particular public value (e.g. a focus on terrorist content compared to all online harms)”. Many Member States considered that there was a need for binding EU legislation on terrorist content. Stakeholders nonetheless underlined the need to safeguard fundamental rights, particularly freedom of speech.

6.22 The UK response to the consultation:

[…] reflected a previously expressed view that prioritising the use of proactive measures by HSPs to tackle terrorist content online, specifically automated removals, is the best way to make a meaningful impact on the problem. As such, we suggested that any EU action should: not rely solely on referral or ‘notice and takedown’ measures but put a requirement on the companies to specifically monitor their platforms for terrorist content; reinforce the importance of speed of removal; and ensure that companies take action within an hour of upload or ultimately prevent the content from being uploaded in the first place.

6.23 The Minister recognises that the proposed Regulation will have the greatest impact on small, medium and micro companies that “do not already have the relevant processes in place to address dissemination of terrorist content” (for example, content moderation, filtering technologies) or to fulfil new transparency reporting requirements. There is a risk that the proposal could “reinforce the relative market power of the very large companies which have already internalised costs for the associated processes, or which have preferential arrangements between them to share specific technologies, if these technologies are not available to all market participants”. The Government is “actively lobbying the largest tech companies to share tools and expertise with smaller HSPs”.

6.24 There is a further risk that the resources needed to comply with the proposed Regulation could have a negative impact on innovation. This has to be set against the potential to “incentivise further technology development on tools for automatic content detection, filtering and moderation technologies” and the prospect that a higher level of trust in the hosting service providers may lead to “greater uptake of digital services”, greater investment and increased advertisement revenues.

6.25 The proposed Regulation would increase costs for Member States which lack the capability to detect and notify illegal online content swiftly. However, the costs for the UK are expected to be “minimal” as it already has an effective body (the Metropolitan Police’s Counter Terrorism Internet Referral Unit) and the necessary infrastructure to implement the proposal.

Previous Committee Reports

None.


82 The Conclusions agreed by EU Leaders on 28 June 2018 “welcome the intention of the Commission to present a legislative proposal to improve the detection and removal of content that incites hatred and to commit terrorist acts”.

83 For an overview of the Commission’s proposal, see the European Commission’s fact sheet published on 12 September 2018, A Europe that protects: Countering terrorist content online.

84 This includes service providers who are not established in the EU but have a substantial connection with one or more Member States because, for example, their activities are targeted towards those countries or they have a significant number of users — Article 2(3).

85 See Article 2(5) and recital (9) of the proposed Regulation and the Commission’s fact sheet on taking action to get terrorist content off the web, issued on 12 September 2018. The definition draws on the definition of terrorist offences in Directive (EU) 2017/541 on combating terrorism.

86 Article 3.

87 Article 13(4).

88 See Article 4 and Annex 9, p.141 of the Commission’s Impact Assessment accompanying the proposed Regulation (ADD 2). A Member State may designate an administrative, law enforcement or judicial body as the authority responsible for issuing removal orders — see recital (13) of the proposed Regulation.

89 Article 5.

90 Article 7. Related data can include subscriber data (to identify the content provider) and access data (such as log-in and log-off times, IP address, etc) — see recital (20) of the proposed Regulation.

91 Article 13.

92 See Articles 3(1) and 6(4).

93 Article 4(4) and (9).

94 Article 8.

95 Article 9.

96 Article 11.

97 Article 10.

98 Articles 15 and 16.

99 Article 6.

100 Articles 17 and 18.

101 See p.5 of the Commission’s explanatory memorandum accompanying the proposed Regulation and recital (1) of the proposed Regulation.

102 See Article 15 of Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce) and recital (19) of the proposed Regulation.

103 See Article 15(3) of the proposed Regulation.

104 Under section 3 of the Terrorism Act 2006, terrorist content notified to a hosting service provider must be removed within two working days — substantially longer than the one hour envisaged in the proposed Regulation.

105 See p.2 of the Commission’s explanatory memorandum accompanying the proposed Regulation.




Published: 30 October 2018