Culture, Media and Sport Committee

Written evidence submitted by the Home Office

Introduction

1. This paper sets out the Home Office’s evidence to the Culture, Media and Sport Committee inquiry into online safety.

2. Ensuring safety online, particularly for children, is a priority for Government. In a speech to the National Society for the Prevention of Cruelty to Children (NSPCC), the Prime Minister set out a range of measures to ensure that the cultural challenges of protecting children from accessing potentially harmful or age-inappropriate content, and the challenges of restricting the availability of, and access to, criminal content such as child abuse material, are met.

3. The Home Office’s evidence focuses particularly on the Committee’s concerns about illegal child abuse images online and illegal extremist material online. It should be considered alongside the evidence submitted by the Department for Culture, Media and Sport on harmful or age-inappropriate content and threatening or abusive behaviour.

Tackling Illegal Child Abuse Images Online

4. Research by the NSPCC indicates that about 5% of children in the UK suffer contact sexual abuse at some point during their childhood. Child abuse takes many forms, one of which is online and multinational activity that uses sophisticated technology and techniques to evade law enforcement attention. The internet (including the so-called “hidden internet”, which facilitates a sense of a “safe” community of offenders) is used for the proliferation of indecent images of children. Work between government, law enforcement and industry to tackle this is an important part of the Organised Crime Strategy published on 7 October 2013.

5. Images of child sexual abuse are illegal. It is illegal to take, make, circulate and distribute child abuse images (Protection of Children Act 1978)—such offences carry a maximum sentence of ten years’ imprisonment. It is illegal to possess indecent images of children (Criminal Justice Act 1988)—such offences carry a maximum sentence of five years’ imprisonment. These offences apply online just as they do offline.

6. The Government has worked with industry to develop some of the tightest controls in the world, so that child abuse images are removed from the internet where possible, blocked from being accessed from the UK where they cannot be removed, and investigated so that those involved in their production, distribution or possession are brought to justice and the victims of these crimes can be protected and safeguarded. The Child Exploitation and Online Protection Command of the National Crime Agency is central to this effort.

7. Tackling such abuse on the internet forms an important aspect of our wider approach to protecting children and vulnerable people—a key priority of this Government. The Government has established a Sexual Violence Against Children and Vulnerable People (SVACV) National Group. This is a panel of experts and policy makers brought together by the Home Office to coordinate and implement the learning from recent inquiries into historic child sexual abuse and current sexual violence prevention issues. Tackling the misuse of the internet by those seeking to sexually abuse children, or to gain personal gratification from it, is an important aspect of the National Group’s work. On 24 July 2013, the Government published a progress report and action plan on the work of the SVACV National Group. This set out activity already delivered by the Government to protect children online and an action plan to take forward further activity identified by the Group. The progress report and action plan can be found at the following link:

https://www.gov.uk/government/publications/sexual-violence-against-children-and-vulnerable-people-national-group.

Removing Illegal Child Abuse Images from the Internet and Preventing Access to Them

8. The Internet Watch Foundation (IWF) takes reports of illegal child abuse images from the public. It has the authority to hold and analyse these images through agreement with the Crown Prosecution Service and the Association of Chief Police Officers. The IWF assesses images, determines whether they are illegal and, if so, the severity of the images using Sentencing Council guidelines. Work is ongoing to expand the role of the IWF so that it can proactively search for child abuse images rather than being reliant on reports from industry and the public.

9. If the site hosting the image is located in the UK then the details will be passed to law enforcement agencies and the Internet Service Provider will be asked to take down the website using the “notice and takedown” process. In 2012, the Internet Watch Foundation found that 56% of UK child sexual abuse images were removed in 60 minutes or less from the point at which the Internet Service Provider or hosting provider was notified of the content. IWF Members remove 90% of such content within one hour, and 100% within two hours. In the past 17 years, over 400,000 webpages have been assessed—100,000 have been removed for containing criminal content. Less than 1% of child sexual abuse content is now thought to be hosted in the UK, down from 18% in 1996.

10. Where images are hosted outside the UK, the IWF passes the details to its equivalent body in the country identified as hosting them so that it can take action. The content is often taken down completely, and the average time taken to remove child sexual abuse content hosted outside the UK fell to ten days in 2011, compared with over a month in 2008. Until the content is taken down, the Internet Watch Foundation adds the URL to its “URL list” or “blocking list”, which IWF Members can use to block their customers’ access to the identified images. Such blocking arrangements currently cover about 98.6% of domestic broadband lines.

11. The IWF is working with its members to introduce “splash pages”—these are warning messages that appear if a user attempts to access a webpage that has been removed for hosting illegal child abuse images, and they deliver a hard-hitting deterrence message to users seeking to access child abuse images.
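The following is a minimal sketch, in Python, of how the URL list and splash pages described in paragraphs 10 and 11 might fit together at an access provider. The blocklist entries, splash page address and function names are hypothetical; they are illustrative only and do not represent the IWF’s or any provider’s actual implementation.

# Illustrative sketch only: a simplified check of requested URLs against a
# blocklist of the kind distributed by the IWF. All data here is hypothetical;
# real deployments match at the network level against a confidential list.
from urllib.parse import urlsplit

BLOCKLIST = {
    ("blocked-example.invalid", "/images/page1"),   # hypothetical entries
    ("another-example.invalid", "/gallery"),
}

SPLASH_PAGE = "https://splash.example.invalid/blocked"  # hypothetical warning page


def check_request(url: str) -> str:
    """Return the URL to serve: the splash page if the request is blocked."""
    parts = urlsplit(url)
    key = (parts.hostname or "", parts.path.rstrip("/") or "/")
    if key in BLOCKLIST:
        return SPLASH_PAGE  # redirect to the deterrence ("splash") page
    return url


if __name__ == "__main__":
    print(check_request("http://blocked-example.invalid/images/page1/"))  # splash page
    print(check_request("http://harmless.example.invalid/"))              # unchanged

In practice it is the splash page, rather than a connection error, that tells the user the page they sought has been removed or blocked, and it carries the deterrence message.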

12. These arrangements have been enormously successful, but the Government is committed to going further still in order to prevent offenders from being able to access child abuse images through the internet. That is why the Prime Minister, in his speech of 22 July, announced a further set of initiatives.

13. Internet search engine providers have been asked to go further in restricting access to illegal child sexual abuse images. They have been asked to do three things:

(a) To work with law enforcement to develop effective deterrence messages to target those users who try to access child abuse images through their search engines.

(b) To ensure that illegal child sexual abuse images are not returned in search results, and to use their search algorithms to guide users away from searches that could contain pathways to child abuse images.

(c) To prevent any search results from being returned when specific search terms are used that have been identified by CEOP as being unambiguously aimed at accessing illegal child sexual abuse images.
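As an illustration of point (c), the sketch below shows, in simplified Python, how a provider might return no results and a deterrence message when a query matches a blocked term. The term list, message and function names are hypothetical; they are not CEOP’s list nor any search engine’s actual code.

# Illustrative sketch: suppress results and show a deterrence message when a
# query matches a term identified as unambiguously aimed at child abuse images.
# The blocked term and message below are placeholders, not real data.
from typing import Callable, List

BLOCKED_TERMS = {"hypothetical blocked term"}

DETERRENCE_MESSAGE = (
    "Child sexual abuse images are illegal to view or possess. "
    "This search has been blocked."
)


def handle_query(query: str, results_for: Callable[[str], List[str]]) -> dict:
    """Return normal results, or an empty result set plus a warning message."""
    normalised = " ".join(query.lower().split())  # collapse case and whitespace
    if any(term in normalised for term in BLOCKED_TERMS):
        return {"results": [], "warning": DETERRENCE_MESSAGE}
    return {"results": results_for(normalised), "warning": None}


if __name__ == "__main__":
    fake_index = lambda q: ["result for: " + q]                    # stand-in backend
    print(handle_query("weather tomorrow", fake_index))            # normal results
    print(handle_query("HYPOTHETICAL  blocked term", fake_index))  # blocked, warning shown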

14. The Government continues to work with search engine providers to put these measures in place. If progress is not forthcoming then the Government will consider bringing forward legislation to require their compliance.

15. The objective for all of these actions is to make it more difficult for unsophisticated users to find a route from open searching to more sophisticated offending environments, make it more difficult for inquisitive non-offenders to access indecent images of children, and make it less likely that members of the public could inadvertently come across such images.

Investigating Those Responsible for Illegal Child Abuse Images and Protecting Victims

16. The Child Exploitation and Online Protection Command of the National Crime Agency (CEOP) is the national lead for tackling sexual exploitation and sexual abuse of children. It is an integral part of the new National Crime Agency, which was established on 7 October 2013 to lead the UK’s fight to cut serious and organised crime.

17. CEOP has done excellent work over the last seven years. It is dedicated to eradicating the sexual abuse of children, and supports both operational work to protect children and educational programmes to help children protect themselves online. It operates a reporting centre for children who want to report online sexual threats; in 2012–13 CEOP received an average of 1,600 reports of abuse per month from the public and the internet industry, a 14% increase on the previous year. These will have included direct reports from children. CEOP maintains close relationships with a wide range of stakeholders from law enforcement, industry and the educational sectors nationally and internationally. In 2012–13, CEOP safeguarded and protected 790 children, an increase of 85% on the previous year, and its work led to the arrest of 192 suspects. CEOP has now protected more than 2,250 children in its seven-year history.

18. Government funding for CEOP was higher in 2012–13 (£6.381 million) than it was in 2009–10 (£6.353 million). The CEOP budget has effectively been protected in cash terms since 2011–12, and there are now more people working in CEOP than at any time in its history.

19. Becoming a part of the NCA will bring advantages to CEOP. Its existing operating principles will be preserved, as agreed when the Plan for the NCA was published, including its innovative partnerships with the private and voluntary sectors. It will also gain access to more capacity to deal with complex cases of child sexual exploitation and abuse. It will gain greater resilience for front-line operational services and benefit from support from other NCA specialist functions such as the National Cyber Crime Unit. Its influence, as part of the national leadership the NCA will have across law enforcement, will be enhanced. The Crime and Courts Act 2013 places a statutory duty on the NCA to safeguard and promote the welfare of children in England and Wales, which means that all officers, not just those directly involved in child protection, will receive mandatory training on safeguarding children. Every one of the NCA’s more than 4,000 officers will therefore have a legal duty to safeguard and promote child welfare, and so will be able to protect even more children from harm.

20. CEOP takes the creation, distribution and accessing of illegal child sexual abuse images online extremely seriously. Not only do such images capture a child being sexually abused, but the viewing of such images can lead to an escalation in offending as well as continuing the victimisation of the child abused in the image. A CEOP report in 2012 (“A Picture of Abuse”) noted that research had identified a link between the possession of illegal images of children and the contact sexual abuse of children.

21. The Government will support CEOP and the police to tackle those involved in the creation, dissemination and accessing of child abuse images online. This includes ensuring that law enforcement agencies have the powers that they need to investigate offenders and bring them to justice, and to develop new capabilities that will enhance their effectiveness. The Prime Minister, in his speech of 22 July, announced plans for the creation of a new child abuse image database that would hold a national record of the images seized by the police. This would enable the police to identify known images more quickly on suspects’ computers, improve their ability to identify and safeguard the victims shown in the images, and could enable industry to use the unique numerical “hash values” derived from the images to search for them on their networks and remove them before offenders can access them.
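As an illustration of how such a database could be used, the following is a minimal sketch assuming the “hash values” are cryptographic digests such as SHA-256 (in practice perceptual hashing may also be used to catch re-encoded copies of an image). The directory, database contents and function names are hypothetical.

# Illustrative sketch: matching files against a database of hash values of
# known illegal images, as described above. All names and data are hypothetical.
import hashlib
from pathlib import Path
from typing import List

# Hypothetical database of hash values of known, police-seized images.
KNOWN_IMAGE_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",  # placeholder
}


def hash_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def scan_directory(root: Path) -> List[Path]:
    """Return files under root whose hashes match the known-image database."""
    return [p for p in root.rglob("*") if p.is_file() and hash_file(p) in KNOWN_IMAGE_HASHES]


if __name__ == "__main__":
    for match in scan_directory(Path("./seized_media")):  # hypothetical directory
        print("Known image found:", match)

Because identical files always produce the same digest, matching by hash value allows known images to be identified without every file having to be viewed, and allows industry to search for the same values on their networks.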

22. CEOP is also at the forefront of efforts to tackle more sophisticated offending where encryption and anonymisation are used by offenders to share illegal child abuse images through photo-sharing sites or networks away from the mainstream internet. This includes work in collaboration with the internet industry and the Internet Watch Foundation to tackle the peer-to-peer sharing of child abuse images.

Working to Tackle Child Abuse Images Globally

23. In addition to the work of the IWF considered above, CEOP works extensively with its partners overseas to tackle the global threat from illegal child abuse images. CEOP participates in, leads and assists international investigations into illegal child sexual abuse images and online child sexual exploitation. CEOP is also a founding member of the Virtual Global Taskforce, which brings together law enforcement, public and third sector partners from across the world to tackle child abuse materials.

24. The UK strongly supports the work of the Global Alliance between the EU and US, with CEOP leading the work for the UK. The Deputy Director of CEOP is the Driver of the EU EMPACT (European Multidisciplinary Platform against Criminal Threats) initiative on cyber child sexual exploitation. The first meeting took place at Europol in October 2013 and agreed the Strategic Goals and Action Plan for 2014.

25. The Prime Minister has asked Joanna Shields, the Business Ambassador for Digital Industries, to lead industry engagement for a new UK–US taskforce to bring together the technical ability of the industry in both countries to tackle the use of the internet for child sexual abuse images. The Government expects to be able to make further announcements about the work of the taskforce in due course.

Prevention—Helping Children Protect Themselves Against Becoming Victims

26. Online safety for children is of paramount importance to Government, and there are major cross-Government efforts taking place to ensure that children learn about staying safe online both in schools and through other initiatives.

27. CEOP’s work to educate children and young people focuses particularly on the implications of their online behaviour, the “digital footprint” they leave, and the range of contact, content and conduct risks they face online. CEOP has developed a specific educational programme called ThinkuKnow to tackle this issue. The programme includes educational films and cartoons, teachers’ packs and other online resources for use with children aged four to 18. In 2012–13, over 2.6 million children saw the ThinkuKnow resources. In the same year, over 800 professionals in education, child protection and law enforcement were trained by CEOP to educate children about online safety and how to respond to incidents. The Government, through CEOP, has now trained over 25,000 professionals who deliver these key messages to children in classrooms and other youth settings.

Tackling Material Intended to Promote Terrorism or Other Acts of Violence Online

28. Extremists use the internet to spread extremist messages to committed terrorists and those vulnerable to radicalisation. This material covers a spectrum from illegal Al Qaeda propaganda that incites violence, such as Inspire magazine, through to dedicated extremist forums and hate preacher videos on social media.

29. We know from the recent attacks in Boston and Woolwich that online terrorist and extremist material remains one of a range of variables that can contribute to an individual becoming radicalised, and/or tipping into acceptance of undertaking violent acts.1 Our primary focus under the Prevent Strategy (2011) has therefore been to remove online material that breaches terrorism legislation. Proposals for going further to restrict access to this material will be discussed by the Extremism Taskforce in October 2013.

Current Approach—Limiting Access to Illegal/Terrorist Material

30. A specialist police unit, the Counter Terrorism Internet Referral Unit (CTIRU), currently takes down material that breaches the Terrorism Act and is hosted in the UK, or where we have strong relationships with industry. CTIRU proactively seeks to identify illegal material, and members of the public concerned about terrorist or extremist content online are able to refer material for investigation and potential removal by the CTIRU. In this respect, the CTIRU acts like the Internet Watch Foundation (IWF) in receiving reports and ensuring content is removed where possible.

31. Since it was established in 2010, CTIRU has taken down over 6,000 pieces of material (more than 5,000 of them in the last 12 months). Even so, we would like to take down more terrorist content. However, a significant amount is hosted overseas, and domestic law, even when it has extra-territorial jurisdiction, is of limited use. It would be difficult to prosecute a person or service provider operating outside the UK: they would have to be extradited to the UK to face prosecution, and they could well argue that they had not engaged in criminal conduct in the country in which they were operating.

32. Where we cannot take down material, it is filtered from parts of the public estate (we have prioritised schools and some libraries). The filtering project has developed, and our content list (of over 1,000 URLs) is assessed by both CTIRU and the Crown Prosecution Service (CPS) to ensure that it contains only URLs that are illegal. The main drawback with filtering is that UK users are still able to choose to view this illegal terrorist content (via desktop settings). We are considering how we can further restrict access to illegal terrorist material (potentially at the network level), further aligning with the IWF’s approach.

33. There is no consistent global definition of what constitutes extremist material, and the majority of harmful, offensive and inappropriate content online does not meet the legal thresholds for action by law enforcement. For extremist material, this includes al-Awlaki’s series of online sermons and practical instructions that can be of use to terrorists.

Improving the Industry Response to Terrorist Content

34. We aim to encourage the public and civil society groups to report legal but harmful content and seek to get it removed under industry terms and conditions. We are therefore working with industry to ensure their acceptable use policies are clear about what is tolerated on their platform and provide easy mechanisms to flag concerns about extreme content.

35. We are keen to see a consistently effective response to terrorist content across industry. We have seen some improvement following bilateral negotiations on individual acceptable use policies, but inconsistencies remain. Some companies run schemes which prioritise law enforcement referrals; however, others make no explicit mention of terrorist or extremist content in their policies, nor do they provide a single point of contact for law enforcement. While engaging with industry to ensure that their own acceptable use policies are applied rigorously, we are also considering the Home Affairs Select Committee’s recommendation of establishing a code of conduct for internet companies, distinct from their own terms and conditions, to improve the response to terrorist material (eg by including “terrorism” as a category of unacceptable use).

36. We also tackle this problem from the user end, running digital awareness projects that work with practitioners, parents and young people to ensure they have the skills to recognise and challenge extreme content. For example, through Prevent funding the Youth Justice Board has trained a hundred youth offending team practitioners to ensure they have the skills to challenge and report extremism online, and we have worked with the Association of Chief Police Officers (ACPO), which has disseminated an internet safety DVD to local Prevent projects, raising awareness of the issue. We will work with DCMS to ensure we are linked into initiatives such as the Safer Internet Centre and Get Safe Online, which provide internet safety information and advice alongside a wealth of internet safety resources for schools and information for parents and children.

Making Extremist Content Less Attractive and Promoting Positive Messages

37. We also seek to divert audiences and those vulnerable to radicalisation from extremist messages. The Research, Information and Communications Unit (RICU), part of the Office for Security and Counter Terrorism in the Home Office, aims to challenge and confront terrorist ideology and provide pathways for people to explore alternatives. Activity includes working with digital communications experts to help civil society groups make better use of the internet to counter extremist ideology. We intend to scale up this work.

38. RICU actively shares best practice on communications through the Global Counter Terrorism Forum, the EU Radicalisation Awareness Network, and in bilateral work with the US and other countries.

October 2013

1 Joint Terrorism Analysis Centre (JTAC) assessment 2013
