Online safety - Culture, Media and Sport Committee


2  Child abuse images

Nature and scale

5. Both the publication and possession of child abuse images are illegal. This is one form of censorship that commands near-universal support. Not only do such images represent actual crime scenes, but they are also likely to be used by online paedophiles to encourage and legitimise their criminal activities. The law in the UK recognises this by proscribing a range of images of different types, whether photographic or not.

6. Estimating the scale of child abuse images on the internet is problematic given the secretive nature of much online activity by paedophiles. The term "child pornography", while still in common use, not least in legal texts, is now largely abjured by professionals working in child protection. In no way can images of this kind be likened to consensual activities depicted in legal adult pornography, and paedophiles ought to be denied any comfort when seeking to affirm their practices and preferences. Accordingly, we shall use the terms "child abuse images" or "child sexual abuse images" to refer to the material covered in this section of our report. Claire Lilley of the NSPCC told us:

    We need to educate people that these are not just images; that by looking at these images, they are committing an offence; that a child is being re-victimised every single time an image is looked at; and that they are potentially affecting their own sensitivity around the issue and escalating their own ability to go down that route and end up abusing in the physical sense themselves. There is quite a lot of research now about the crossover between non-contact and contact offending and CEOP would put it at about 50%. There are a couple of studies that put it at between 40% and 55%, which is a very high level of crossover. It is not just looking at an image; it is much more dangerous than that.[2]

Peter Davies of CEOP underlined this point: "anybody who possesses indecent images of children is a criminal, but is also somebody who might present an additional risk to children, as if that were not enough."[3]

7. As the NSPCC notes, child abuse images are a visual record of the sexual abuse of a child. They can include pseudo-photographs, animations, drawings, tracings, videos and films that are being streamed live. In the UK, images are graded on a 1-5 scale: Level 5 images involve sadism or bestiality, Level 4 images portray a child engaged in penetrative sexual activity, and so on down to Level 1, where the images depict erotic posing with no visible sexual activity.[4] In 2012 the NSPCC issued FOI requests to every local police force in England and Wales asking them to state how many child abuse images they had seized in arrests made in the two years ending April 2012. The five police forces (none of which had a large metropolitan base)[5] that replied had seized over 26 million such images. The Children's Charities' Coalition on Internet Safety told us that, on one calculation, this would imply that over 300 million illegal images may have been seized by all forces over the same period.[6] Many of the images seen by the Internet Watch Foundation have been recycled, though one or two new images—each representing a new victim—are seen weekly.[7]

The law

8. Section 1 of the Protection of Children Act 1978 makes it a criminal offence to take, permit to be taken or to make, distribute, show, advertise or possess for distribution any indecent photograph or pseudo-photograph of a child under the age of 18. Simple possession by individuals is proscribed by section 160 of the Criminal Justice Act 1988. The 1978 Act defines a pseudo-photograph as "an image, whether made by computer-graphics or otherwise howsoever, which appears to be a photograph." Pseudo-photographs were brought within the ambit of the Act by the Criminal Justice and Public Order Act 1994. The Criminal Justice and Immigration Act 2008 extended the definition of an indecent photograph to include a tracing or other image derived from a photograph or pseudo-photograph. Part 2, Chapter 2 of the Coroners and Justice Act 2009 extended the law to proscribe possession of non-photographic images of child sexual abuse, such as cartoons, drawings and computer-generated images.

9. Peter Davies of CEOP told us: "it is quite remarkable to me how far the criminal legislation, for example, around indecent images of children, which was, I believe, conceived and passed before the internet was around, has still pretty much stood up to the new world of child abuse online, and I do not think what we need is a basic root-and-branch piece of work."[8] He added that "the UK has just about the best suite of child protection legislation that there is, and that we are more of an example to others than we are in need of catching up."[9] At a recent discussion meeting of the Digital Policy Alliance, the Deputy Director of CEOP Command, Andy Baker, referred in general terms to child protection legislation more widely; he suggested it would be better to "tidy it up".[10] The fact that the Communications Act 2003 was drafted with no mention of the internet should also be addressed. We believe that the Government should, in due course, consolidate the law around child abuse images into a single Act of Parliament with a view to providing even greater clarity for the purposes of law enforcement and deterrence.

10. Clearly the fight against child abuse and child abuse content on the internet is an international one. Peter Davies referred to two conventions—the Budapest Convention and the Lanzarote Convention—which together aim to provide a legal framework for protecting children online. The Budapest Convention on Cybercrime[11] and the Lanzarote Convention on the Protection of Children against Sexual Exploitation and Sexual Abuse[12] are both Conventions of the Council of Europe. The United Kingdom is a signatory to both, though it has ratified only the former. In July 2013, the Parliamentary Under-Secretary of State at the Home Office, Lord Taylor of Holbeach, was unable to say when the Government expected to ratify the Lanzarote Convention; he said: "Until the work to assess the practical arrangements is complete, it will not be possible to confirm the timescales for ratification. That work remains ongoing."[13] Given the worldwide nature of online crime, we recommend that the Government press for wider international adoption of both the Budapest and Lanzarote Conventions. The Government should ratify the Lanzarote Convention as soon as practicable.

Enforcement

11. Peter Davies of CEOP referred to an assessment, published in June 2013,[14] which estimates that around 50,000 people in the UK commit child sexual abuse offences "at least to the level of possessing indecent images of children."[15] He went on to tell us that the greatest number of offenders use peer-to-peer networks rather than the open internet. "Far fewer" offenders use the hidden internet, "also known as Tor or The Onion Router or something similar."[16]

12. The proliferation of material which is illegal in all circumstances poses a particular threat and a range of challenges, not least to law enforcement. John Carr of the Children's Charities' Coalition on Internet Safety referred to the challenge posed by the scale of online child sexual abuse: "I think it is a profound shock to realise that so many people are interested in that type of material. There is an appetite for it and people go and get it, and they circulate it. They are getting away with it because we do not have the capacity to stamp it out."[17] Peter Davies of CEOP referred to a need for "a plan that goes beyond enforcement".[18] That is not to diminish the vital role that law enforcement must continue to play. The former chief executive of CEOP, Jim Gamble, emphasised the need to deter paedophiles by applying the law as it currently stands.[19] Citing figures given by Peter Davies, Jim Gamble said:

    On the 50,000 Peter talks about, those are peer-to-peer sites. Those are hard-core paedophiles who do not stumble across anything on Google. They nest in these places on the internet where they can share secretly. You have to infiltrate that. You have to infiltrate it. You have to identify who they are and follow them offline and arrest them.[20]

13. He went on to propose the recruitment, by each police force, of up to ten special constables to work online. Their job would be to seek out online paedophiles with the aim of achieving arrests in sufficient numbers to have a significant deterrent effect.[21] Mr Gamble put a cost of £1.4 million on setting up such a scheme, which would include the recruitment of 12 detective sergeants and 12 trainers and coordinators.[22] He told us: "If you did that, you would have 520 people online at any given time who could manifestly cover 10, 20, 30 or 40 chat-room environments, where you would have people out there protecting our children. Not rogue vigilantes, but properly vetted, properly trained and accredited and properly supervised officers online. To me, that is what we should be thinking about so we attack the root cause, which is people."[23]

14. CEOP (Child Exploitation and Online Protection) is now one of the four commands within the National Crime Agency, which commenced operations on 7 October 2013. The CEOP centre has existed since 2006 and was previously affiliated to the Serious Organised Crime Agency. Peter Davies told us: "Our mission, whether as a centre or as a command, as we are now, is to protect children from sexual exploitation and sexual abuse. When we were founded, there was heavy emphasis on online protection. Our remit does not limit us to online activity, but what we try to do is to become a national hub for intelligence, a national contact point and a source of expertise and specialist support on any aspect of child sexual exploitation that would benefit from that approach." Mr Davies also highlighted the contributions being made by online child protection units in UK police forces.[24]

15. In its evidence, the Home Office praised the Child Exploitation and Online Protection Command of the National Crime Agency for both its operational work and its educational programmes. In 2012/13 CEOP received an average of 1,600 reports of abuse per month from the public and the internet industry, a 14% increase on the previous year. "CEOP maintains close relationships with a wide range of stakeholders from law enforcement, industry and the educational sectors nationally and internationally."[25] In 2012/13, CEOP protected 790 children from sexual abuse, an increase of 85% on the previous year, and its work led to the arrest of 192 suspects.[26]

16. According to the Home Office, the CEOP budget "has effectively been protected in cash terms since 2011/12" and "there are now more people working in CEOP than at any time in its history".[27] The Home Office also states that becoming part of the NCA will bring advantages to CEOP (such as access to greater capacity and support from other parts of the NCA) and that all NCA officers will receive mandatory training on safeguarding children. The Minister of State for Policing, Criminal Justice and Victims, Damian Green, described becoming an arm of the National Crime Agency as a "game-changer"[28] for CEOP because of access to increased resources. Peter Davies told us that incorporation of CEOP into the National Crime Agency was "absolutely a good thing."[29] His predecessor, Jim Gamble, clearly disagreed. He described subsuming CEOP into the National Crime Agency as "a recipe for disaster".[30] Mr Gamble added:

    To take a child protection agency and put it into a national crime agency where the brand has now been diluted, where they will continue to have protection under Freedom of Information, which is fundamentally wrong, how can you have a child protection entity that is not subject to Freedom of Information? You have a child protection entity that is answerable through the NCA to one political person, the Home Secretary. Where is the credible reassurance around that? Where are the lessons of serious case review around professional independence and challenge? The fact of the matter is we had planned the three pillars to be education, social care, and criminal justice. We now have moved to one pillar, which is criminal justice.[31]

17. We recommend that the Government examine whether adequate resources are being deployed to track down online paedophiles in sufficient numbers to act as a meaningful deterrent to others. If not, additional funding should be provided to recruit and train a sufficient number of police officers for the task.

18. CEOP has been increasingly effective not least because it is not solely a criminal justice organisation: its education and social care work has also been very important in increasing public understanding of the problem of child abuse and in offering means of countering abusers. We therefore recommend that CEOP continues to publish an annual review which includes an assessment of its ongoing contribution to all three elements of its mission—education, social care and criminal justice.

Surveillance

19. Tracing paedophiles online clearly involves allowing the police to deploy a range of surveillance methods. The Regulation of Investigatory Powers Act 2000 aims to provide a legal regime for both interception of communications and access to communications data that is consistent with human rights legislation—specifically having regard to privacy and proportionality. Peter Davies argued for further changes to the relevant legislation:

    It is my firm operational view, and I have articulated it previously, that if there is one piece of legislation that would most help us tackle online child abuse, it would be the provision of a clear, consistent and properly enforced regime for retaining and accessing communications data, because we are regularly in a situation where we are unable to convert, for example, an IP address into a name and address for lack of that precise thing.[32]

20. One potential area of operational difficulty was illustrated in evidence from the End Violence Against Women Coalition, which cites CEOP as having warned that live streaming of child abuse through Skype and other platforms is emerging as a growing means by which abusers share child abuse images.[33] A government briefing on proposals on the investigation of crime in cyberspace, prepared for the Queen's Speech in 2013, noted:

    When communicating over the Internet, people are allocated an Internet Protocol (IP) address. However, these addresses are generally shared between a number of people. In order to know who has actually sent an email or made a Skype call, the police need to know who used a certain IP address at a given point in time. Without this, if a suspect used the internet to communicate instead of making a phone call, it may not be possible for the police to identify them.[34]
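
The attribution problem described in the briefing amounts to a timestamped lookup against retained allocation records: without a record of which subscriber held which address over which interval, the trail ends at the IP address. The following is a purely illustrative Python sketch, with hypothetical field names; where carrier-grade NAT shares one public address among many subscribers at once, the source port and a precise timestamp would also be required.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class Allocation:
        ip: str                # public address assigned to the subscriber
        start: datetime        # when the address was assigned
        end: datetime          # when it was released
        subscriber: str        # hypothetical internal account reference

    def who_used(records: list[Allocation], ip: str, at: datetime) -> Optional[str]:
        """Resolve an IP address to a subscriber at a given instant; returns
        None if no allocation record was retained for that address and time."""
        for r in records:
            if r.ip == ip and r.start <= at < r.end:
                return r.subscriber
        return None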

21. A draft Communications Data Bill was announced in the Queen's Speech in 2012 and published on 14 June 2012. It was scrutinised by a Joint Committee of both Houses of Parliament and was also considered by the Joint Committee on Human Rights (JCHR) and the Intelligence and Security Committee (ISC). The draft Bill would have extended powers to obtain communications data to cover messages sent on social media, webmail, voice calls over the internet and gaming, in addition to emails and phone calls. The data could have included the time, duration, originator and recipient of a communication and the location of the device from which it was made. However, it would not have included the actual content of messages. The Joint Committee on the draft Bill published its report on 11 December 2012, concluding that the Bill's scope should be significantly narrowed, while recognising that more needed to be done to provide law enforcement and other agencies with access to data they cannot currently obtain. The Bill would have sat alongside the existing Data Retention (EC Directive) Regulations 2009.[35]
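
To make the scope of "communications data" concrete, the categories described above can be pictured as a record of a communication's circumstances from which the content itself is deliberately absent. The sketch below is schematic only; the field names are illustrative, not statutory.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class CommunicationsData:
        """The who, when and where of a message, per the draft Bill's scope.
        Illustrative field names only, not statutory language."""
        originator: str        # who sent the communication
        recipient: str         # who received it
        time: datetime         # when it was sent
        duration_seconds: int  # how long it lasted (for calls)
        device_location: str   # where the sending device was
        # deliberately absent: the content of the message itself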

22. An illustration of both the capability and concerns of internet service providers was given us by Dido Harding of TalkTalk:

    I feel very strongly the weight of responsibility as an internet service provider in that our customers place their faith in the fact that it is their data, not our data, to choose what to do with and, therefore, we need a clear legal framework on what we store and what we do not store.

    [...]

    We do not keep browsing history of where our customers browse every day of the week—that is their data, not ours—unless or until there was a change in legislation that required us to.

    [...]

    It is very important that the ISPs do not feel like it is a free-for-all to use that data. Our customers do not expect that of us. We have been thinking through, if it is entirely criminal, how we get to a place where we can do that proactively with CEOP.[36]

23. The South West Grid for Learning told us that they have been working with the Internet Watch Foundation and South West police forces since 2006 to flag any attempted access to websites containing child abuse images (CAI). This pilot project has been managed by CEOP with Home Office approval. Its objective is to alert police forces specifically when anyone in a school attempts to access a website containing child abuse images. The police then undertake their normal investigative routes; a number of school staff have been identified and removed as a direct result of what South West Grid for Learning terms a "simple alerting process."[37] They added: "We believe we should use technology better to identify those accessing CAI (and extremist material) and we are involved in a project with Plymouth University and IWF to extend and refine our existing alerting capability."[38] We welcome the increasing use of alerting tools of this kind (sketched in outline below) to identify individuals who seek out child abuse and other illegal material online, provided these tools are deployed in ways that do not unduly compromise the privacy of the law-abiding majority.
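
In outline, such an alerting process matches outbound web requests against a confidential list of known child abuse URLs and raises an intelligence report rather than merely blocking the request. The sketch below is a purely illustrative rendering under hypothetical names; the real IWF list, matching rules and reporting channel are confidential and considerably more sophisticated.

    import hashlib
    from datetime import datetime, timezone

    # Hypothetical stand-in for the confidential IWF URL list; real entries
    # are never published.
    KNOWN_CAI_URLS = {"blocked.example/illegal-page"}

    def check_request(network_id: str, url: str) -> None:
        """On a match, raise an intelligence alert rather than silently block."""
        if url in KNOWN_CAI_URLS:
            alert = {
                "time": datetime.now(timezone.utc).isoformat(),
                # pseudonymise the source until police authorise identification
                "source": hashlib.sha256(network_id.encode()).hexdigest(),
                "url": url,
            }
            forward_to_police(alert)

    def forward_to_police(alert: dict) -> None:
        print("ALERT:", alert)  # placeholder for a secure reporting channel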

Internet service providers

24. The scope for further practical action against illegal content is better understood by recognising the different types of providers and how they fit within the current regulatory framework. The Internet Services Providers' Association identifies four main categories of internet companies:

·  Access providers—considered to be "mere conduits" under the e-Commerce Regulations[39] (Regulation 17).

·  Hosting providers—these include social networks; while these do not have editorial control over content uploaded by users, they may have active or passive moderating policies. Under Regulation 19 of the e-Commerce Regulations they are not liable for the content they host as long as they do not have actual knowledge of unlawful activity or information. "However, upon obtaining such knowledge, hosting providers become liable if they do not act expeditiously to remove or to disable access to the information."[40]

·  Websites where operators have editorial control—content might include a news article and user-generated content like comments on the article.

·  Search engines—considered as "caches" under Regulation 18 of the e-Commerce Regulations; search engines "act expeditiously to remove or to disable access to any information if they are made aware of the fact that this information may be illegal."[41]

Internet Watch Foundation

25. The UK internet industry was responsible for founding the Internet Watch Foundation, a membership organisation that serves as the UK hotline where the public can report child sexual abuse content, criminally obscene adult content and non-photographic images of child sexual abuse. In 2012 alone, the IWF processed 39,211 reports and assisted with the removal of 9,696 URLs containing potentially criminal child sexual abuse content. A URL can be as specific as a single image or could refer to an entire website containing potentially thousands of child sexual abuse images or videos. The majority of victims (81%) appeared to be 10 years old or younger (4% were 2 years old or under), and 53% of the images and videos depicted sexual activity between adults and children, including rape and sexual torture.[42]

26. When child sexual abuse content is found to be hosted in the UK, the IWF will inform CEOP. After confirmation from CEOP that action can be taken, the IWF will notify the hosting provider, which will remove the content from its servers, typically within 60 minutes of receiving the notification from the IWF. This process is commonly referred to as 'Notice and Takedown'. The IWF can also act against criminally obscene adult content and non-photographic child sexual abuse content hosted in the UK.[43]

27. When child sexual abuse content is found to be hosted outside the UK (accounting for 99% of known content), the IWF will inform its counterpart hotline in the hosting country through INHOPE, the international association of hotlines, or link in directly with local law enforcement. As other countries take significantly longer to remove child sexual abuse content—50% of the content about which the IWF passes on details internationally is still available after 10 days—the IWF adds the links (URLs) to the content to its URL list (or 'blocking list'). IWF members can use this list to voluntarily block access to these URLs to protect their customers from stumbling upon the images and videos. The Home Office told us that such blocking arrangements apply to about 98.6% of domestic broadband lines.[44] Susie Hargreaves of the Internet Watch Foundation told us: "The most effective way to remove content is to remove it at source. It is our view that blocking will only stop inadvertent access and will not stop the determined."[45]
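
At its simplest, member companies' use of the URL list amounts to consulting the list before completing a request, respecting the granularity of each entry. The sketch below is purely illustrative, with invented entries; the actual list is confidential.

    # Illustrative sketch of voluntary ISP-side blocking against an IWF-style
    # URL list. Entries are invented; the real list is confidential. An entry
    # may be as specific as one image or as broad as a whole site, so matching
    # respects each entry's granularity.
    IWF_URL_LIST = {
        "host.example/images/specific-image.jpg",  # a single image
        "entirely-illegal.example/",               # an entire website
    }

    def is_blocked(url: str) -> bool:
        """Block exact entries, and anything beneath a whole-site entry."""
        return any(
            url == entry or (entry.endswith("/") and url.startswith(entry))
            for entry in IWF_URL_LIST
        )

As Susie Hargreaves' evidence makes clear, a filter of this kind only prevents inadvertent access; removal at source remains the effective remedy.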

28. On 18 June 2013, the Secretary of State for Culture, Media and Sport, Maria Miller, hosted a summit on tackling child sexual abuse material on the internet as well as protecting children from harmful or inappropriate content online. Participants included internet service providers, search engines, mobile operators and social media companies. Following the summit, the Secretary of State announced that the Internet Watch Foundation would work with CEOP to actively seek out illegal images of child abuse on the internet. The Internet Watch Foundation told us that, following a donation by Google and additional funding by other members, they will be able to increase their number of analysts from 4.5 (FTE[46]) to 11.5 (FTE) and start proactively searching for child sexual abuse content as requested by the Government.[47]

29. We very much welcome the commitment by the Internet Watch Foundation to embark on proactive searching for online child abuse images. The sooner these can be found and removed, the better. However, we are concerned that seven additional staff might prove woefully insufficient to achieve substantial progress towards what must be an important intermediate goal: the eradication of child abuse images from the open internet.

Deterrence

30. The IWF told us that, in addition to "Notice and Takedown" and the URL list, they also compile a keyword list of terms that specifically refer to child sexual abuse content:

    This list is used, for instance, by search engines to prevent people from finding images and videos of child sexual abuse content. The keywords are very specific—or very specific combinations of words—that carry no meaning besides the specific reference to child sexual abuse content. This means the keywords will not prevent access to legitimate websites such as academic research papers into the area of child sexual abuse or websites aimed to help or inform people in relation to child sexual abuse.[48]
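
The mechanism the IWF describes can be pictured as screening each search query against the keyword list before results are returned. The sketch below is purely illustrative: the placeholder terms are invented, since the real list is confidential and consists of terms and combinations with no legitimate meaning, so ordinary queries (for example, academic research) pass untouched.

    # Illustrative sketch of search-query screening against an IWF-style
    # keyword list. The placeholder combination is invented.
    BLOCKED_COMBINATIONS = {("hypothetical", "blocked", "phrase")}

    def query_is_blocked(query: str) -> bool:
        """True only when a query contains every word of a listed combination."""
        words = set(query.lower().split())
        return any(all(w in words for w in combo) for combo in BLOCKED_COMBINATIONS)

A search engine would return no results (or a deterrence message) for blocked queries and handle all other queries normally.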

31. The Government has asked search engine providers to go further in restricting access to child abuse images. According to the Home Office, they are being asked to develop effective deterrence measures, to ensure child abuse images are not returned in search results, and to prevent any search results being returned "when specific search terms are used that have been identified by CEOP as being unambiguously aimed at accessing illegal child sexual abuse images."[49] If progress is not forthcoming, the Government will consider introducing legislation to ensure search engines comply.[50]

32. The IWF is also working with its members to introduce "splash pages"—these are warning messages that appear if a user attempts to access a webpage that has been removed for hosting illegal child abuse images. According to the IWF, "they deliver a hard-hitting deterrence message to users seeking to access child abuse images."[51] Greater use of splash pages and warning messages "to deter a certain class of person with a low level, opportunist or early interest in child abuse images" is one of a number of tactics put to us by the Children's Charities' Coalition on Internet Safety.[52] Jim Gamble reminded us that splash screens would not result in more hard-core paedophiles being arrested: "it is a diversion of attention and resource that does not work. We tried it."[53] The Home Office pointed out to us that the objective of further work by search engines and greater use of splash pages "is to make it more difficult for unsophisticated users to find a route from open searching to more sophisticated offending environments, make it more difficult for inquisitive non-offenders to access indecent images of children, and make it less likely that members of the public could inadvertently come across such images."[54]
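
Mechanically, a splash page is simply the response served in place of a removed or blocked page. The sketch below is purely illustrative; the wording and help address are placeholders rather than the IWF's actual deterrence message.

    # Illustrative sketch of a "splash page" served when a user requests a URL
    # removed or blocked for hosting child abuse images.
    SPLASH_HTML = (
        "<html><body><h1>Access refused</h1>"
        "<p>The page you tried to reach was removed because it contained "
        "illegal child sexual abuse material. Accessing such material is a "
        "criminal offence. Confidential help: help.example.org.</p>"
        "</body></html>"
    )

    def fetch(url: str) -> str:
        return "<html>ordinary page</html>"  # stand-in for normal retrieval

    def respond(url: str, blocked: bool) -> tuple[int, str]:
        """Return an HTTP status and body; blocked requests get the warning."""
        if blocked:
            return 403, SPLASH_HTML  # deterrence message instead of content
        return 200, fetch(url)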

33. Search engines and other internet service providers have a vital role in ensuring that access to online child abuse images is prevented and deterred. We expect the Government to monitor closely their degree of commitment and success and to consider the introduction of legislation should they fall short of reasonable expectations.

Other material

34. Much of the evidence we took on illegal content related to child abuse images. However, in the United Kingdom at least, certain categories of extreme adult pornography are illegal both to publish and to possess. Pornographic material that explicitly and realistically depicts a variety of non-consensual and injurious practices was outlawed (in England and Wales) by the Criminal Justice and Immigration Act 2008. Similar provisions also appear in the Criminal Justice and Licensing (Scotland) Act 2010, with the notable addition of obscene pornographic images which realistically depict rape or other non-consensual penetrative sexual activity. The Criminal Justice and Courts Bill, currently before Parliament, would extend the definition of the offence of possession of extreme pornographic images in England and Wales to include depictions of rape. We welcome the Government's decision to include pornographic depictions of rape in the definition of extreme pornography. It has been illegal to publish such images for many years; outlawing their possession is long overdue.

35. Evidence we received from the Home Office also considered another area within the inquiry's terms of reference: tackling material intended to promote terrorism or other acts of violence online. A specialist police unit, the Counter Terrorism Internet Referral Unit (CTIRU), proactively seeks out and takes down UK-hosted material that breaches the Terrorism Act 2006. UK law has limited application in relation to the significant amount of material hosted overseas; such material "is filtered from parts of the public estate"[55] (the Government has prioritised schools and some libraries). The filtering can currently be circumvented by users changing their desktop settings; the Government is considering how it can "further restrict access to illegal terrorist material (potentially at the network level), further aligning with the IWF's approach."[56]

36. The evidence notes inconsistencies in approach among internet companies. "Whilst engaging with industry to ensure that their own acceptable use policies are being applied rigorously, we are also considering the Home Affairs Select Committee recommendation of establishing a code of conduct for internet companies, distinct from their own terms and conditions, to improve the response to terrorist material (e.g. including 'terrorism' as a category under unacceptable use)."[57]

37. There is clearly a need for wider international consensus and cooperation in combating criminally obscene adult material and terrorist material, and we urge the Government to use all the influence it can bring to bear to achieve this within a transparent legal framework.


2   Q 29

3   Q 34

4   Ev 70

5   Q 5

6   Ev 66

7   Qq 34-35, 40

8   Q 48

9   Q 48

10   Recent Developments in Child Internet Safety, Digital Policy Alliance Discussion Meeting, 22 January 2014

11   http://conventions.coe.int/Treaty/Commun/ChercheSig.asp?NT=185&CM=8&DF=&CL=ENG

12   http://conventions.coe.int/Treaty/Commun/ChercheSig.asp?NT=201&CM=1&DF=&CL=ENG

13   HL Deb, 24 July 2013, col 197WA

14   http://ceop.police.uk/Documents/ceopdocs/CEOP_TACSEA2013_240613%20FINAL.pdf

15   Q 40

16   Q 43

17   Q 27

18   Q 59

19   Qq 114, 116

20   Q 114

21   Q 113

22   Ibid.

23   Ibid.

24   Q 45

25   Ev 105

26   CEOP Annual Review 2012-2013 & Centre Plan 2013-2014

27   Ev 105

28   Q 204

29   Q 50

30   Q 124

31   Q 119

32   Q 48

33   Ev w85-w88

34   HM Government, The Queen's Speech 2013, 8 May 2013

35   SI 2009/859

36   Qq 93, 95

37   Ev w49

38   Ev w49

39   The Electronic Commerce (EC Directive) Regulations, SI 2002/2013

40   Ev 79

41   Ev 79-80

42   Ev 78

43   Ev 77

44   Ev 104

45   Q 34

46   Full time equivalent

47   Ev 78

48   Ev 77

49   Ev 104

50   Ev 105

51   Ev 104

52   Ev 67

53   Ev 116

54   Ev 105

55   Ev 106

56   Ev 106

57   Ev 107


 


© Parliamentary copyright 2014
Prepared 19 March 2014