Culture, Media and Sport Committee

Written evidence submitted by BCS, The Chartered Institute for IT

The Institute promotes wider social and economic progress through the advancement of information technology science and practice. We bring together industry, academics, practitioners and government to share knowledge, promote new thinking, inform the design of new curricula, shape public policy and inform the public.

As the professional membership and accreditation body for IT, we serve over 70,000 members including practitioners, businesses, academics and students, in the UK and internationally. We deliver a range of professional development tools for practitioners and employees.

A leading IT qualification body, we offer a range of widely recognised professional and end-user qualifications.

www.bcs.org

Headline Points

The public, both collectively and individually, hold conflicting views on the balance between privacy and protection;

It is impossible to prevent knowledgeable, determined people from accessing material, however “illegal”, if that material is anywhere on the Internet;

However, it is possible to make it less likely for naïve users to stumble across material that some agency (which? and how?) has deemed to be offensive;

The question of what is offensive calls for human judgement, is often context-dependent, and that judgement may be highly controversial;

Individual cases prompt knee-jerk reactions, but “hard cases make bad law”;

Preventing someone accessing something that they regard as desirable is likely to encourage them to adopt evasion technologies, which nullify *all* filters, not just those for material thought undesirable.

Background

1. In order to access a document/film/video across the internet one has first to know where it is (discovery) and then have its data on one’s device (delivery). Delivery is the business of Internet Service Providers (ISPs), while discovery is generally the business of search engines, but also of catalogues, such as Facebook links, recommendations, word of mouth and rumour in school playgrounds.
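
A minimal sketch, using Python’s standard library, may make the distinction concrete; the host and URL are illustrative only. The DNS lookup stands in for discovery (which in practice is mostly the business of search engines and catalogues), and the fetch stands in for delivery (the part that ISPs carry).

```python
import socket
import urllib.request

host = "example.com"           # illustrative host only
url = "http://example.com/"    # illustrative resource only

# Discovery: turning a name into a network location.
address = socket.gethostbyname(host)
print("resource is hosted at", address)

# Delivery: fetching the bytes across the network, the part that ISPs carry.
with urllib.request.urlopen(url) as response:
    data = response.read()
print("received", len(data), "bytes")
```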

2. The Internet Service Providers are doing business in this country, and know who their customers are (in the sense of who is paying their bills). They are subject to various forms of UK regulation and legislation. The search engines may or may not be based in the UK. They may or may not have any office or business in the country and may not be subject to legal or moral pressure by the UK authorities. The Prime Minister’s speech was somewhat confused when he said: “the search engine shouldn’t be involved in finding out where these images are because the search engines are just the pipe that delivers the images, and that holding them responsible would be a bit like holding the Post Office responsible for sending illegal objects in anonymous packages”. In fact the ISPs are the analogue of the Post Office, not the search engines.

3. It is important to understand that all documents (such as films and videos) are delivered across the internet as if they were live broadcasts: no intermediary holds the entire document for analysis. A good analogy is that of a service reading books aloud. There are then two fundamentally different approaches to censorship (which is the main issue).
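
A minimal sketch, with a Python generator standing in for network delivery, illustrates the point: any intermediary only ever sees the part of the document that has streamed past so far, never the whole document at once.

```python
def stream(document: str, chunk_size: int = 16):
    # Stand-in for network delivery: yield the document piece by piece.
    for i in range(0, len(document), chunk_size):
        yield document[i:i + chunk_size]

seen_so_far = ""
for chunk in stream("a long film or book delivered across the internet"):
    # The reader/viewer consumes each chunk "live"; an intermediary deciding
    # whether to continue can only judge seen_so_far, not the whole document.
    seen_so_far += chunk
print(len(seen_so_far), "characters delivered in pieces")
```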

4. One is black-listing: analogous to “I’m not going to read you that book because it’s called ‘Lady Chatterley’s Lover’, and I’ve been told not to read that book” or “I’m not going to show you that film because it’s classified 18”. The major problem with this approach is that a vanishingly small proportion of the internet has been examined for banning/classification.
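
A minimal sketch of the black-listing approach (the entries are purely hypothetical) shows both how it works and its weakness: anything that has never been examined passes by default.

```python
BLACKLIST = {
    "http://example.org/banned-title",   # hypothetical: already examined and listed
    "http://example.net/classified-18",
}

def allowed(url: str) -> bool:
    # Refuse only URLs that some agency has already examined and listed.
    return url not in BLACKLIST

print(allowed("http://example.org/banned-title"))    # False: known and listed
print(allowed("http://example.org/never-examined"))  # True: unexamined, so it passes
```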

5. The other is content-based filtering: analogous to “I’m going to stop reading that book now because I’ve come across this banned word in it”, or “I’m going to stop showing this film because the last frame was more than 40% ‘pink’ (sexualised imagery)”. There are two problems with this approach. The first is that, under the Regulation of Investigatory Powers Act, it is probably illegal for ISPs to do this. The second is that of false positives: many books contain occasional “banned” words, and a frame may well be “pink” because of a sunset, or a swimming gala.
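
A minimal sketch of content-based filtering (the word list and threshold are illustrative only) shows how easily false positives arise: a largely “pink” frame triggers the filter even when it is a sunset.

```python
BANNED_WORDS = {"some-banned-word"}   # illustrative trigger list only
PINK_THRESHOLD = 0.40                 # "more than 40% 'pink'"

def text_triggers(chunk: str) -> bool:
    # Stop as soon as a listed word appears in the text being "read aloud".
    return any(word in chunk.lower() for word in BANNED_WORDS)

def frame_triggers(pixels: list[tuple[int, int, int]]) -> bool:
    # Crude "pink" test on RGB pixels: red high, green and blue moderate.
    pink = sum(1 for r, g, b in pixels if r > 150 and 60 < g < 180 and 60 < b < 180)
    return pink / len(pixels) > PINK_THRESHOLD

sunset_frame = [(230, 120, 100)] * 1000   # a frame dominated by warm, "pink" tones
print(frame_triggers(sunset_frame))       # True, yet the frame is entirely innocent
```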

6. The same objections apply to hybrid approaches, such as “I’m going to stop reading that book now because I’ve come across a mention of a gamekeeper called Mellors”.

7. These difficulties should not be minimised. It would be no defence under the Regulation of Investigatory Powers Act for an ISP to argue that the consumer (typically the parent of the child actually using the connection) has consented: both participants to the connection have to consent to the interception, and it is hard to see how an automatic web-server can consent.

8. Equally, China ran a significant research project, as part of what is generally termed “the Great Firewall of China”, to search for “inappropriate” skin tones, but that project has apparently been abandoned.

9. A further problem is that what is “abhorrent”, or even “illegal images of children”, is context-sensitive. It is easy to say “medical textbooks are a special case, and are not meant to be read by minors anyway”, but the problem is far broader than that. In 1995 there was a flurry of stories (eg http://www.independent.co.uk/news/julia-somerville-defends-innocent-family-photos-1538516.html) about photograph developers reporting parents over images of their children. The switch to digital photography has merely meant that such images do not need developing. Many of them then find their way onto social media.

“How Best to Protect Minors …”

10. The Select Committee asks about “minors”. There is no known technology which will determine if a computer, or other device, is being used by a minor. For a home broadband connection, it would be possible for the purchaser to be asked whether there was likely to be a minor using the connection, but of course it is possible for the purchaser to lie, or even be honestly mistaken, as when suddenly looking after grandchildren.

11. Greater availability of, and publicity about, “parental controls” (which in fact are not parental at all, but the parent buying into someone else’s controls), on the lines of those offered by the UK Safer Internet Centre,1 would help. However, it is worth recalling two fundamental statements from their site: “filters can be a helpful tool in reducing the chances of coming across something upsetting” and “remember that filtering is only part of the solution”.

12. A more challenging problem is provided by public WiFi technologies, which are often used without accounts, or via accounts with no verification. Public convenience would seem to demand this light-touch access. It would be technically possible to impose parental controls on such connections, although BCS does not necessarily recommend this: it would affect all users and could lead to a rapid spread of evasion technology.

13. As with parental controls on the connection, greater publicity about tools such as Google’s SafeSearch2 would help, but again it is worth noting a fundamental statement: “please note that no filter can replace a watchful eye”.

“… From Accessing Adult Content”

14. One problem which complicates the issue is that there is no international agreement about what constitutes “adult” content. Social norms vary widely and it is unlikely that there will be much consensus in the near future.

15. Content which has been formally rated “adult” in the UK is not a major problem. That content typically requires purchasing, for instance via a credit card or PayPal account, and the transaction will show up. A greater problem is pirated copies of such material, which are therefore not formally classified. The worldwide digital content industry is working hard to combat such piracy, and this should be viewed as an international problem based on the fundamentally international character of the internet. BCS therefore does not see that UK-centric action is likely to be helpful.

16. A particularly worrying development is the prevalence of truly home-produced material by apparent minors. In one four-week period, the Internet Watch Foundation3 (IWF) had 12,224 such images reported. 88% of these were on “parasite” (IWF terminology) websites, ie those that harvest such material from the website to which it was originally uploaded.

Education

17. The Byron report made a powerful analogy: “At a public swimming pool we have gates, put up signs, have lifeguards and shallow ends, but we also teach children how to swim”. To this one could well have added “and we help parents to teach children to swim, and we teach parents to be lifeguards.”

18. The sort of education necessary here for children is not technology education; it is societal education. For this reason BCS believes that it belongs in the general Personal, Learning and Thinking Skills (PLTS) category, rather than in ICT- or Computing-specific classes. There is excellent advice at the Get Safe Online website, and class templates such as https://www.isc2cares.org/safe-and-secure/ are available.

19. The IWF’s comment on the home-produced material points again in this direction. “These findings provide evidence to support the importance of the education work delivered by child protection agencies to raise awareness of the permanence of information on the internet and the risks inherent to young people in creating and distributing this type of content.”

20. A challenging question is what and how much education is appropriate for parents. Some advice and help on “parental controls”, both on configuring the connection and on tools such as Google SafeSearch4 and YouTube’s Safety Mode,5 most of which have come along since parents first encountered the Internet, is also appropriate. Similarly, parents need to become aware of the (excellent, and recommended in the Byron report) PEGI rating system for games and the advice at http://www.askaboutgames.com/, another site which has emerged since many parents learned about the internet and which recent learners will not necessarily come across. Schools should probably be encouraged to facilitate and host such classes in the PTA context.

21. Such classes will certainly need to cover technology, but should probably be wider. Parenting guidance is sometimes sought where parents would like support in how to engage with their children on social topics that children might explore on the internet. These social topics are all related to what we might class as “growing up on the internet” and have security facets to them. Topics might include: management of friendships mediated by the internet; trolling and other forms of internet-mediated abuse; balancing internet-mediated activities with offline activities; and identity and projection of self via the internet. Again, Get Safe Online has good age-specific materials, but these need to be drawn to the attention of parents, and their attention refreshed as the children grow older.

“Filtering Out Extremist Material, including Images of Child Abuse and Material Intended to Promote Terrorism or other Acts of Violence”

22. This is already done to a significant extent in the area of child abuse (technically speaking, indecent images of children, which are illegal under the Protection of Children Act 1978) by use of blacklisting technology. More support could be given to the people, largely volunteers, who do the initial reporting, and to the blacklisting process, generally under the auspices of the IWF. It is worth commending the extent to which ISPs and mobile operators already cooperate with the IWF to manage and apply these blacklists.
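
As an illustration only (an assumed two-stage scheme, not a description of any particular operator’s system), such a blacklist might be applied so that the vast majority of traffic is never inspected, and only requests to “suspect” hosts are checked against the precise URL list.

```python
from urllib.parse import urlparse

SUSPECT_HOSTS = {"bad.example.org"}                  # hypothetical coarse first stage
BLACKLISTED_URLS = {"http://bad.example.org/page1"}  # hypothetical precise second stage

def blocked(url: str) -> bool:
    host = urlparse(url).hostname
    if host not in SUSPECT_HOSTS:
        return False                 # stage 1: most traffic passes uninspected
    return url in BLACKLISTED_URLS   # stage 2: exact match against the supplied list

print(blocked("http://good.example.com/anything"))   # False: never inspected
print(blocked("http://bad.example.org/page1"))       # True: on the list
print(blocked("http://bad.example.org/other-page"))  # False: same host, not listed
```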

23. It should be noted that blacklisting is not free, and has both financial and non-financial costs. For example, one study6 showed that 50% of schools reported that filtering occasionally blocked valid educational sites, with a further 20% reporting that it regularly did so.

24. There has been much resistance to the Internet Watch Foundation’s widening its remit to the other material in the Select Committee’s question, and BCS does not believe that this is the way forward.

25. Some people say that more should be done, and imply, without saying so, that content-based filtering should be used, so that more such material could be blocked. This would require a major change in society’s attitude to censorship, as well as primary legislation to enact fundamental changes to the Regulation of Investigatory Powers Act. BCS does not believe that this is either feasible or desirable.

Preventing Abusive or Threatening Comments on Social Media

26. There has been much publicity recently around “online bullying”. Again, the problem is largely societal rather than technological. The recent publicity has forced some developers into adding “report abuse” buttons, but these are of little use unless victims have the courage to use them. Hence this really comes down to an education question (see above).

27. Even in cases where there is not a “report abuse” button, it is important that social media service providers provide clear guidance and support for victims of distressing communications. These should detail methods for locating support and information on how to report the incident(s). Where possible and appropriate, providers should maintain regular contact with support and criminal justice agencies.

28. It is vital, however, that the distinction is understood between those communications which:

(a) may constitute credible threats of violence to the person or damage to property;

(b) specifically target an individual or individuals and may constitute harassment or stalking;

(c) may amount to a breach of a court order; and

(d) do not fall into any of those categories and may be considered grossly offensive, indecent, obscene or false.

29. In particular, given the evidence of the significant impact of the first three categories (including recent evidence on the impact of cyber stalking), we must ensure that such actions are not simply treated as “grossly offensive” but are dealt with under the appropriate legislation.

30. These categories are discussed in the Director of Public Prosecutions’ Guidelines on prosecuting cases involving communications sent via social media, published on 20 June 2013.

31. Without the public having a clear understanding of the differences between these kinds of communication, the problem is unlikely to diminish. Digital natives have embraced technology, but without appropriate training and education they struggle to understand the social norms of internet communication and behaviour.

32. There is a clear issue around anonymity and perceived anonymity (as well as untraceability) in social media.

33. It is important that all stakeholders consider the vulnerability of the victim in cases of abusive or threatening messages.

Conclusion

34. As a charity whose mission is “to enable the information society”, BCS looks forward to being a significant player in society’s development of answers to these questions and welcomes continued dialogue.

September 2013

1 http://www.saferinternet.org.uk/advice-and-resources/parents-and-carers/parental-controls

2 https://support.google.com/websearch/answer/510?hl=en

3 https://www.iwf.org.uk/

4 https://support.google.com/websearch/answer/510?hl=en

5 https://support.google.com/youtube/answer/174084?hl=en-GB

6 http://tnc2007.terena.org/programme/presentations/show4fa0.html
