Culture, Media and Sport Committee

Written evidence submitted by the National Centre for Cyberstalking Research (NCCR)

Protecting children from accessing adult content requires enhanced identification and authentication schemes.

Education has a vital role to play in all strands and requires adequate funding.

Providers of social media services need to take greater responsibility and action to protect their users.

Anonymity on the Internet can lead to disinhibited behaviour and actions that victims find distressing.

Background

1. The Internet is an excellent vehicle for enhanced communication, enabling a more cohesive society and a more productive business world. However, a number of serious issues have arisen on this communications network, and further legislation and regulation are needed to ensure the safety of a number of groups. The National Centre for Cyberstalking Research (NCCR) was established to address the need for research and analysis into the motivations, means, impact and investigation of cyberstalking. This submission to the call for responses to the Culture, Media and Sport Committee inquiry into Online Safety predominantly relates to matters around online grooming and stalking. Professor Maple has also contributed to the submission by the BCS.

How Best to Protect Minors from Accessing Adult Content

2. It should be recognised that there is a fundamental difficulty in how access to current computer systems is controlled: a user usually presents an identity (a username) and an access credential (a password, biometric, token or other). Whoever possesses this information can, in general, access the authorised information. Given that adult content exists on the Internet, any minor holding the appropriate credentials could access that content, and minors who are suitably determined will be able to access material intended only for adults. Society must make it virtually impossible for minors to access adult content unintentionally, and difficult for minors to access adult content that they have actively sought.

3. There is a need for clarity about the role of parents in “parental controls”. Many parents are unaware of what is their responsibility and what is the responsibility of others. Many families that we have met are sure that, even with “filters” and “controls” in place, their children have access to material not intended for minors, and they are unaware of what they can do to prevent such access.

4. There is a problem regarding children's understanding of what constitutes adult content. Whilst games, films and even music may carry some classification of their content, no such classification exists for a great deal of the information and content on the Internet. Furthermore, some “adult” (and indeed illegal) content is actually produced by children (in the form of sexting, for example). The Internet Watch Foundation (IWF) has stated that 12,224 images, apparently created by minors, were reported in just four weeks.

5. Content (illegal, adult or otherwise) can spread around the Internet very rapidly, so removing material from one site may have minimal impact if the material has already been harvested and reposted to another site.

6. While awareness is important, education is also necessary. The technical knowledge of children is very impressive, but there is a need to educate children on the impact of technology and the management of information in the unregulated space that is the Internet. There is good guidance available through initiatives such as Safer Internet and Get Safe Online. A particular area that requires special attention in any education programme is the spread and the permanence of material put online.

7. It is not only children who require educating, but also parents. Parents use the Internet in different ways to children and therefore may not understand how best to protect them. It is unlikely that as many parents have seen Chat Roulette as children have. Children consider email outdated and use YouTube not only to access videos but as a discussion medium. It would be positive for schools to be involved in bringing parents and children together. Topics that could be discussed include the management of online relationships, appropriate communications (sending and receiving) and what to do if children are concerned by interactions online.

Filtering Out Extremist Material, including Images of Child Abuse and Material Intended to Promote Terrorism or other Acts of Violence

8. There are two main types of method that can be employed to filter material: automatic and manual. The former relies on sufficiently sophisticated algorithms to detect unwanted material; the latter relies on people in the community notifying an authority of the existence of unwanted material. Such material is then removed, and an identifier is placed on a blacklist so that the material cannot enter a site again.
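
By way of a minimal illustration of the identifier-based blacklisting described above, the Python sketch below assumes that material is identified by a cryptographic hash of its contents. The function names and the placeholder hash are illustrative only; operational systems rely on curated hash lists and more robust perceptual matching that tolerates minor alterations to an image.

```python
import hashlib

# Illustrative only: a simple content blacklist keyed on SHA-256 hashes of files.
# Exact hashing is defeated by trivial edits to an image; real filtering systems
# use curated hash lists and perceptual hashing alongside human review.

BLACKLISTED_HASHES = {
    # Hashes of previously identified unwanted material would be loaded here
    # (placeholder value shown for illustration).
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}


def sha256_of_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_blacklisted(path: str) -> bool:
    """Check an uploaded file against the blacklist before it is published."""
    return sha256_of_file(path) in BLACKLISTED_HASHES
```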

9. The difficulties in blacklisting material should be recognised. Automated tools are not yet sufficiently advanced to perform blacklisting without some human control. An Irish study considering filtering within a school setting found that 50% of schools reported that filtering occasionally blocked valid educational sites, while 20% of schools reported that it regularly did so (http://tnc2007.terena.org/programme/presentations/show4fa0.html). Hence it is difficult to rely on content-based automated tools alone. Furthermore, what should be restricted and what should pass through a filter is often a judgement call, and so some level of human intervention is likely to be required.

10. Blacklisting is an important activity but comes with a cost, both financial and non-financial, that must be met. Currently, for example, the Internet Watch Foundation is largely funded by ISPs. This model would need to continue, but it may be possible to consider other routes for funding filtering technologies in the general sense. The value of the time of the volunteers who report material to the IWF is not insignificant, and must be understood. It is vital that reporting of unwanted material is a simple and transparent process. The True Vision initiative for hate crime reporting is an excellent example of a simple and transparent reporting mechanism, but it needs to be more widely publicised.

11. Consideration could be given to requesting the Internet Watch Foundation to widen its remit—this could have the benefit of a one-stop reporting centre.

Preventing Abusive or Threatening Comments on Social Media

12. The issue of abusive or threatening comments on social media has clearly disturbed society of late and gained major press coverage. In particular, the case on Twitter of Caroline Criado-Perez received major attention and highlighted this negative side of the new communications environment we live in. In response to the case, Twitter did state that it would trial a “Report Abuse” button. There is no such button available on Twitter at this time, and this resistance by social media companies to protecting their users and providing an avenue for reporting requires addressing. Research undertaken by the NCCR has shown that people do not know where to report cyberstalking abuse, or who should be responsible.

13. Some social media providers do provide a Report Abuse button, but it is important that they also provide clear guidance and support for victims of distressing communications. These should detail methods for locating support and information on how to report the incident(s). Where possible and appropriate providers should maintain regular contact with support and criminal justice agencies.

14. The Guidelines on prosecuting cases involving communications sent via social media, published by the Director of Public Prosecutions on 20 June 2013, categorise communications as those which: (i) may constitute credible threats of violence to the person or damage to property; (ii) specifically target an individual or individuals and may constitute harassment or stalking; (iii) may amount to a breach of a court order; or (iv) may be considered grossly offensive, indecent, obscene or false.

15. The distinction between these categories is very important to ensuring that the correct legislation, if any, is used in cases of threatening or abusive communications. In particular, given the evidence of the significant impact of the first three categories (including recent evidence on the impact of cyberstalking), we must ensure that such actions are not simply considered as grossly offensive. That would, however, be the easiest route for many stakeholders to take and therefore should be guarded against.

16. Without the public having a clear understanding of the differences in these communications, the problem is unlikely to diminish. Digital natives have embraced technology but, unfortunately, without appropriate training and education they struggle to understand the social norms of Internet communication and behaviour. Education surrounding the appropriate use of these new communications media will have an important role to play in combating the problem.

17. There is a clear issue around anonymity and perceived anonymity (as well as untraceability) in social media.

18. It is important that all stakeholders consider the vulnerability of the victim in cases of abusive or threatening messages.

September 2013
