Culture, Media and Sport Committee
Written evidence submitted by South West Grid for Learning (SWGfL)

Summary: With over 10 years of pioneering work in online safety and as part of the UK Safer Internet Centre, we are fully aware of emerging technologies and their potential risks; of the issues children and young people, professionals and parents face and the gaps that exist in their support; and of the stakeholders and partner organisations working in this field. In the response below we outline strategies we believe would improve online safety in the UK:

For point I (how best to protect minors from accessing adult content) we recommend parental controls/filtering tools, together with wide education campaigns, better content rating mechanisms and further research to identify gaps in support and provision.

For point II (filtering out CAI material) we recommend that providers be required to be members of the IWF, and that alert tools be used more widely.

For point III (preventing abusive comments on social media) we recommend a combination of prevention and empowerment strategies, such as education and information about reporting abuse, together with clear and simple reporting mechanisms from industry.

1. Background to SWGfL:1 Our e-safety team of experts has a national and international reputation in safeguarding children and young people online. Our expertise covers e-safety policy guidance and improvement, and training in schools for staff, parents and pupils. We produce award-winning resources such as 360 degree safe,2 used by over 4,000 schools to assess their e-safety provision, pinpoint weaknesses and make appropriate improvements. We also collaborated with Ofsted to inform the inspection of e-safety standards in schools and early years settings.

2. Role within UK Safer Internet Centre: As one of the three partners comprising the UK Safer Internet Centre,3 SWGfL operates a unique helpline4 for professionals, helping them resolve online issues affecting themselves or the young people they work with. The centre also coordinates Safer Internet Day every February in the UK. An infographic illustrating our work over the last 12 months accompanies this submission.

I. How best to protect minors from accessing adult content

3. Adult content presents a range of risks to children and young people, and a number of strategies are required to best protect them. In this context we are considering legal adult content. Children and young people are subject to many influences: their age, social and economic environment, location, family situation, and parental and peer groups; hence a single strategy is ineffectual. A broad complement of strategies would include:

4. Filtering/parental controls: There has been much discussion recently in relation to filtering and parental controls, and whilst we support “Active Choice”, we do not think it is a panacea. Parental controls help prevent accidental access to adult content, but we know they will not prevent determined access, especially by “tech savvy” teenagers. Over the past 14 years we have been providing network-level filtering to schools in South-West England and continue to see loopholes that children discover to circumvent the filtering in place. As we move towards multi-device access in the home, our preference is for network-level filters over device-specific ones. We encourage parents to use parental controls and advocate age-appropriate filtering, but always point out that these should not be used in isolation but alongside other strategies; filtering and parental controls are not a “fire and forget” solution. Consoles already use content ratings such as PEGI and BBFC, but ratings need to be more sophisticated on cable and TV services, mobile devices and apps.

5. Safe Search: Linked to filtering, we encourage parents to use safe search functionality and age-appropriate settings for online searches. It would be worth considering applying “active choice” to these services: for example, safe search could be enabled as the default configuration, allowing users to opt out if they wish, as the sketch below illustrates.
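By way of illustration, the short sketch below (in Python) shows how a default-on/opt-out model could work for safe search. The setting name and function are hypothetical, not any provider’s actual interface.

```python
# Illustrative sketch only: "active choice" applied to safe search.
# The parameter and setting names are hypothetical, not a real API.

def build_search_settings(user_opted_out: bool = False) -> dict:
    """Safe search defaults to ON; only an explicit opt-out disables it."""
    return {"safe_search": not user_opted_out}

print(build_search_settings())                      # {'safe_search': True}
print(build_search_settings(user_opted_out=True))   # {'safe_search': False}
```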

6. Education for young people: Education is the most important strategy. We should provide children with age-appropriate, reflective, non-moralising and non-sensationalist educational programmes, mapped across the wider curriculum and covering the broader aspects of digital life, alongside parental support initiatives. Integrating discussions around sexting and the impact of online pornography into RSE (Relationship and Sexual Education) is in our view the most suitable approach, ensuring that children understand the context around online adult content. Our fears relate to the influence of adult content on children’s expectations and on normal, healthy sexual development. Whilst these are high-level objectives, success will require additional education and support for those raising these issues with children.

7. Education for parents: In our experience parents can often feel intimidated when discussing technology with their children, and extending this to adult online content can exacerbate the situation. Many parents will not appreciate the extent and extremes of adult content online, well beyond the sort of content they witnessed as children. Support will be required in this regard; perhaps a storyline in popular “soap operas” could help to raise the subject, allowing parents both to progress discussions and to provide some messaging to children directly.

8. Education for teachers: Our evidence suggests that online safety training for teachers is consistently the weakest part of a school’s wider online safety practice. The impact of adult content should be integrated into staff training and should extend to all staff. Pedagogical methods for integrating it into RSE and online safety sessions should be provided to teaching staff.

9. Social Workers: Social workers (and social care professionals) who work with society’s most vulnerable children should be able to access materials, resources and support in this regard. Combating CSE (Child Sexual Exploitation) is clearly a key priority in this sector, and online adult content has a significant bearing on it. Social workers should be able to recognise the impact of this content in particular cases.

10. Service Providers: Service providers who do not allow or host adult content (as defined by their terms and conditions) should be clear about how they monitor and enforce their policy, and should provide clear and transparent procedures for users to report questionable content. Where service providers do permit adult content, it should be clearly marked as such; perhaps age ratings (ie 12, 15 and 18) could have a part to play. Providers who host user-generated content should consider implementing user or community ratings.

11. Public Health: Public health campaigns should be considered to raise general awareness of online adult content, perhaps extending or integrating with existing teenage pregnancy campaigns, with which there is an obvious overlap.

12. Research: Further research is needed to better understand the experiences, perceptions and knowledge of children and young people in relation to online adult content, together with assessments and evaluations of the impact of the variety of formal and informal educational programmes and resources.

II. Filtering out extremist material including CAI (Child Abuse Images) and material intended to promote terrorism and/or other acts of violence

13. IWF membership: Membership of the IWF (Internet Watch Foundation) should be a mandatory requirement for UK ISPs. Combating CAI in the UK through the work of the IWF has proved a particularly successful model, and one that could also be considered for extremist material.

14. Using alert tools: SWGfL has been working with the IWF and South West police forces since 2006 to alert on any attempted access to websites containing CAI. This pilot project has been managed by CEOP with Home Office approval and is due for evaluation shortly. The objective of the project is to flag intelligence to police forces if anyone in a school (specifically) attempts to access a website containing CAI. This intelligence prompts the police to undertake their normal investigative routes; a number of school staff have been identified and removed as a direct result of this simple alerting process. We believe technology could be better used to identify those accessing CAI (and extremist material), and we are involved in a project with Plymouth University and the IWF to extend and refine our existing alerting capability. The principle is sketched below.
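The alerting principle is simple to outline, as in the Python sketch below: requested URLs are checked against a blocklist of hashes, and a match raises an alert for follow-up. The blocklist format and alert channel here are hypothetical stand-ins; the actual IWF list and police reporting routes are restricted and not reproduced.

```python
# Illustrative sketch of URL alerting, not the pilot's actual system.
# The blocklist file format and alert channel are hypothetical.
import hashlib

def load_blocklist(path: str) -> set:
    """Load one SHA-256 URL hash per line (hashes avoid storing raw URLs)."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def alert(source_id: str, url_digest: str) -> None:
    # Stand-in for a secure reporting channel to the relevant police force.
    print(f"ALERT: {source_id} attempted access to listed URL {url_digest}")

def check_request(url: str, blocklist: set, source_id: str) -> bool:
    """Return True (and raise an alert) if the requested URL is listed."""
    digest = hashlib.sha256(url.encode("utf-8")).hexdigest()
    if digest in blocklist:
        alert(source_id, digest)
        return True
    return False
```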

III. Preventing abusive or threatening comments on social media

15. Background to Professionals Online Safety Helpline: The Professionals Online Safety Helpline, based at SWGfL, is regularly called upon to comment on abusive online behaviours, in both print and broadcast media and at public speaking events across the UK. The Helpline is part of the Safer Internet Centre and is a dedicated service for professionals who work with children and young people, including teachers, social workers, police officers, youth workers, early years professionals and more. We have seen a large increase in calls from parents frustrated at the lack of practical help in getting abusive comments removed online.

16. How the helpline operates: The helpline is unique in its approach to resolving social media issues, with a direct contact route into all of the major sites. Our current partners include Facebook, Twitter, Ask.FM, Instagram, Moshi Monsters, Tumblr, Apple, 192.com, Omegle, Snapchat and others. These contacts allow us, first, to ensure that the advice we give callers about how to report is accurate and, secondly, where necessary, to escalate matters directly to the policy and safety personnel in each organisation and have inappropriate, abusive content removed. In some cases this also leads to the perpetrator being suspended for their behaviour.

17. More guidance on how to report and remove content: Many services already exist to provide therapeutic support, particularly for children. What has been lacking is practical guidance for users on reporting issues and getting content removed. The helpline has addressed this gap: through its unique network of direct contacts with providers, it supports professionals, and increasingly parents, in reporting abusive content appropriately in order to ensure or speed its removal.

18. Prevention strategies: Prevention is more challenging and needs to be a collaboration among everyone involved. Schools, and even pre-school and early years settings, need to educate children about online behaviour much earlier. Generally, where we see social media discussed and embedded in the curriculum, there tend to be fewer issues and children take a more responsible approach. We believe children need to build a level of resilience rather than being sheltered from experiences (of course, we do not mean being subjected to online abuse). Schools should talk about how to respond to abuse, for example around blocking and reporting, and where possible introduce confidential reporting routes.

19. Strategies to empower young people and professionals: We know that children often do not “tell” when they are being bullied or abused. Allowing young people to report anonymously, either for themselves or for their friends, can be very empowering. This can be simple: for example, a “Whisper”5 alerting system via SMS or email, or even a locked comments box. Other initiatives, such as Anti-Bullying Ambassadors, also help discourage abusive behaviour. Schools need to act quickly and decisively when there are issues offline; online abuse often follows a period of face-to-face bullying. If this is picked up earlier, better interventions can be put in place to support the child at risk.

20. Parental support: Parents need access to more up-to-date advice about specific popular sites, on how to support their child if the worst happens, and on what they can expect from the child’s school. Schools need to alert parents to any bullying incidents at school, as these can indicate that there are, or may be, issues online with which the parent can assist. Having regular, open conversations with their children about their online activities can help parents identify potential risks before they materialise, help children feel more able to discuss concerns, and also alert parents if their children are themselves engaging in bullying or abusive behaviours. Parents need to take a more proactive response when they discover their children are perpetrators of online abuse, including working with the school to ensure that they can support any victims who may not have already come forward.

21. Role of industry: Industry needs to ensure that it has sufficient safeguards in place to protect users. Our experience has been very varied; some sites are better at dealing with issues than others. The most effective systems use a combination of automated tools, such as language filters (although these are in no way foolproof) and automated moderation of specific phrases or search terms, together with human moderation. In our experience, very rapidly growing sites, including Twitter and Ask.FM, do not treat user safety as a priority from the start. We are not overly critical of this, because some flexibility around policy decisions allows for regular review and improvement; however, it can be dangerous to launch a site without some basic safeguards in place from the outset. An ideal approach would be more proactive moderation, rather than purely “report”-based moderation. We do not support censoring of the web; nonetheless, making users aware that sites are monitoring behaviour may act as a deterrent to online abuse. A sketch of such layered moderation follows.
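The layered approach described above can be sketched as follows in Python: an automated phrase filter routes suspect posts to a human review queue rather than deciding alone. The flagged phrases and queue here are invented examples, not any site’s actual moderation system.

```python
# Illustrative sketch of layered moderation: an automated phrase filter
# routes suspect posts to a human review queue rather than deciding alone.
from queue import Queue

FLAGGED_PHRASES = {"kill yourself", "nobody likes you"}  # example terms only

review_queue: Queue = Queue()

def moderate(post: str) -> str:
    """Auto-publish clean posts; hold flagged ones for a human moderator."""
    lowered = post.lower()
    if any(phrase in lowered for phrase in FLAGGED_PHRASES):
        review_queue.put(post)  # a human moderator makes the final call
        return "held for review"
    return "published"

print(moderate("Have a great day"))   # published
print(moderate("Nobody likes you"))   # held for review
```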

22. Clearer privacy settings and simple reporting routes: As well as working to prevent harassment, sites must adhere to their own community standards when reports are made to them. We would like to see much clearer advice on how to set up privacy on each site and how to report abuse, and clarification of what will happen once a report has been made, eg the timescale for resolution. Reporting should be a very simple process, and we are pleased to see improvements at both Twitter and Ask.FM, with the introduction of “in tweet” reports and a more prominent report button soon coming to Ask.FM. Matters must be resolved quickly, and with decisive sanctions, such as temporary suspension of accounts while investigations are underway. Our helpline is undertaking a national role of mediating with providers over reporting and removing abusive content, and could be considered for national recognition, as in New Zealand.

23. Accountability for anonymous services: We would also like to see more accountability from sites that allow anonymous questions, such as Little Gossip. Although most media stories would suggest otherwise, Ask.FM have been fairly proactive in dealing with cases reported to them: we have a named contact, and they have kept us informed of policy and procedures. Little Gossip, however, do not respond to any requests for contact or act on abuse reports. This site allows users to report an abusive message by posting an equally abusive message in its place, leading to significant harm to the young people named and shamed on the site. We would like these sites to make anonymity “opt in”, rather than the default setting, and to carry warning messages on their landing pages.

24. Support links and help: Sites should have better links to support services relevant to each country; for example, Tumblr has links to self-harm and suicide support services, but only in the US. Syndicated support, where different providers point to consistent sources of support (for example ChildLine, the Samaritans, etc), would be more effective.

25. Supporting organisations like the UK Safer Internet Centre: One area we would like to see improved is financial support from industry for organisations such as the UK Safer Internet Centre, to produce resources and support the operation of the Helpline, which directly tackles the issues users are facing. At present, industry contributes financially to the IWF to tackle child abuse images, but there is no co-ordinated approach to funding the services which resolve other problems, such as the removal of abusive content, or, more generally, wider education and awareness-raising initiatives.

26. Role of Government: Government has an integral part to play: statutory funding should be in place for services working to resolve online harassment, and there needs to be clear direction and consistency between departments. Too often we are given conflicting areas of focus by the Ministry of Justice, DCMS, DfE and even the Prime Minister. It would be more effective to streamline these through one distinct channel or department. Government should also be more active and consistent in engaging with social media sites when there are issues in the public interest. There should be clearer guidance for the public on what legal action can be taken, and under what circumstances.

27. Role of media: One area often overlooked is the responsibility that the media hold. While it is important to discuss online issues openly, we would like to see more responsible use of language and reporting by the media. Stories often carry attention-grabbing headlines and very inflammatory language, for example “Facebook bully probe after death” and “How many more teenage girls will commit suicide because of Ask.fm”. In nearly all the cases we have been involved in, the media have made sweeping statements and assumptions about the circumstances of cases without knowing the facts, often while a police investigation is still underway. These messages are very unhelpful and distressing for the families involved, and some even contain inaccuracies about the sites which may lead to further problems. We would like to see the media adhering to a general code of conduct on how they respond to issues involving young people, and adopting more supportive safety advice and links to relevant resources.

28. Users’ responsibility: Finally, we need to consider the users themselves. In cases involving adults particularly, we need to be clear that if they break the law they may be prosecuted: just as inappropriate offline behaviour may have legal consequences, so may behaviour online. Young people’s issues, however, are sometimes more complex, and we do not support criminalising children; there does, however, need to be a point at which young people are held accountable for their behaviour. Awareness of these laws could be raised through an advertising campaign similar to those for TV licensing or road safety.

September 2013

1 http://www.swgfl.org.uk/Staying-Safe

2 http://www.360safe.org.uk/

3 http://www.saferinternet.org.uk/

4 http://www.saferinternet.org.uk/about/helpline

5 http://boost.swgfl.org.uk/about.aspx
