Culture, Media and Sport Committee

Written evidence submitted by Mediawatch-UK

A. Executive Summary

1. We welcome recent government initiatives designed to protect minors from accessing adult content but are of the opinion that three key areas must still be addressed in order to provide the best possible protection. These are:


Education and advice for parents and carers.

Age verification.

Statutory backing.

2. We welcome discussion of how best to prevent abusive and threatening behaviour on social media. This is a behavioural issue and is unlikely to be dealt with effectively by filtering alone; it requires collaboration between industry and other stakeholders. We believe there are three key areas in tackling this problem:



Education and advice on safe and responsible social media use.

An industry code of practice overseen by an independent regulator.

Real sanctions, with statutory backing if necessary.

B. Mediawatch-UK

3. Mediawatch-UK is a voluntary organisation, established in 1965 by Mary Whitehouse CBE and formerly known as the National Viewers’ and Listeners’ Association. Mediawatch-UK has thousands of subscribing members throughout the UK. Our membership consists of people of all ages, occupations and backgrounds who are concerned about the overall influence of the media on the individual, the family and wider society.

4. Our Director and other spokespersons appear regularly in the media (press and broadcast) on behalf of our members. Further information about Mediawatch-UK can be found on our website.

C. How Best to Protect Minors from Accessing Adult Content

5. We believe that parents and businesses have a shared responsibility for children’s online safety. Parents/guardians have the ultimate responsibility for their children but businesses and government have a responsibility to provide parents with the tools they need to keep their children safe.


6. Many of our members have reported difficulty installing content filters and keeping them up to date across the plethora of internet-enabled devices in their homes. Coupled with this, many have reported confusion over where to go for reliable information and education on internet safety.

7. This is best illustrated by a family which recently contacted us for help. The family fostered children whose chaotic backgrounds placed them at particular risk, and were long-time carers for a child who had suffered a history of serious sexual abuse. They were aware of the potential risks online and had taken advice from social services and their ISP. They had set up filters on the internet-enabled devices in their home and believed their mobile devices would be protected under the Mobile Broadband Group Code of Practice. Their child was introduced to pornography on the phone of another child at school; she soon became addicted and was regularly accessing pornography on her own phone using the wi-fi at friends' homes. Because of her history this child is particularly vulnerable to sexual exploitation and misadventure.

8. At the time of the Parliamentary Inquiry into Online Child Protection in 2011 we asked our members (a group likely to be more than averagely engaged with this issue) whether their ISP had ever contacted them with details of the parental controls available as part of their package.

20% reported that their ISPs had informed them about parental controls.

80% said their ISPs had never done this.

Of those who had been told about parental controls, 61% were with TalkTalk. Approximately 60% of TalkTalk subscribers had been alerted to their provider’s new HomeSafe service.

Of members who subscribed to other services, only 11% had been told about the parental controls available.

Of some concern, a number of BT customers were under the impression that installing parental controls would cost them a further subscription.

9. There is a great need for integrated and accurate advice for parents and carers on the potential dangers of accessing inappropriate online content and how best to protect their children. There is much good work being done in this area by a variety of organisations but further co-ordinated work is required to ensure that all parents are made aware of and receive access to information and assistance.

10. We believe there should be space in sex and relationship education in schools to educate young people about the potential dangers of accessing adult content online, how to avoid doing so and strategies to enable them to understand what they may have seen.

Age verification

11. We welcome the Government’s efforts in this area but believe that no mechanism (“active choice”, “default-on” or “opt-in”) will work without robust age verification.

12. We commend the efforts made by the internet gambling industry to all but eradicate underage online gambling, and we would like to see similar measures implemented for adult material, with regulation by an appointed agency, possibly OFCOM or ATVOD.

Statutory backing

13. We also remain convinced that the best way to ensure the Government’s pledges are actually delivered is to change the law. The current “default-on” proposal is a voluntary agreement between the major ISPs and, although four currently dominate the market, the Government’s plan leaves around 10% of the market unaccounted for. If filters were legislated for, all ISPs and mobile phone operators (MPOs), regardless of size, would have to conform, which would offer a greater degree of protection.

14. Statutory backing would also remove anomalies between different providers and contracts. For example, many consumers believe that, as a result of the Mobile Broadband Group’s code of practice, child filters are activated by default for all UK mobile phone customers. This is not the case. Some providers require customers to activate the filters themselves, and provision varies depending on whether a phone is pay-as-you-go or contract, and whether it is provided by a mobile phone operator or a mobile virtual network operator.

D. Preventing Abusive or Threatening Comments on Social Media


15. There is a great need for integrated and accurate advice for users, parents and carers on the potential pitfalls of social media use and how best to protect themselves and their children. There is much good work being done in this area by a variety of organisations but further co-ordinated work is required to ensure that all parents are made aware of and receive access to information.

16. We believe there should be space in PSHE education in schools to educate young people about safe and responsible social media use including the consequences of anti-social behaviour and sources of advice and support in case of abuse.


17. The former Director of Mediawatch-UK, John Beyer, attracted much criticism during the course of his work, including an abusive Facebook page entitled “I Hate John Beyer”. When this was brought to our attention we reported it to Facebook and asked for it to be removed. We reported the page three times but heard nothing back from Facebook. Eventually, after approximately three months, the page was removed, although we were not formally informed of its removal but discovered it ourselves.

18. Although the issue was dealt with it took too long and there was not enough communication in the process. Neither were we directed to sources of support and advice. These are issues which need to be addressed in a code of practice for social media sites.

19. Social media is now a mature medium and it is our opinion that it should be regulated accordingly. Recent events (such as the suicide of Hannah Smith following bullying on Ask.fm) have demonstrated the potential effect of online abuse. We believe that abusive and threatening behaviour on social media is now an issue of health and safety and we would like to see it regulated accordingly via an industry code of practice.

20. We would like an independent regulator to ensure that sites are sufficiently moderated and that complaints are dealt with swiftly, efficiently and within an agreed timeframe. Such a regulator could ensure the development and implementation of an industry code of practice with the results promoted to parents and children. This code should include:

Clearer and simpler reporting mechanisms, especially where a service is marketed at and provided to under 18s, making it easier for users to report abuse.

Improved transparency and communication of the protocols followed when reports of abuse are made, including average response times, so that reporting users know when and whether the problem will be dealt with and the content removed.

Increased moderation of user-generated content. This moderation is especially important where a service is proactively promoted and used by children.

Prominent signposting to sources of expertise, advice, support and help for users affected by threats or abuse.

A code of practice relating to anonymous use of sites.

21. We would also like to see service providers working more closely with the organisations dealing with the consequences and aftermath of online threats and abuse taking place through their services, providing both support and funding.

Real sanctions, with statutory backing if necessary

22. Given the anonymity afforded online there can be a perception that abusive behaviour on social media is without consequence. We welcome the clarification from the CPS on this issue but we would like to see increased sanctions for abusive or threatening behaviour online which reflect those which exist offline and in other forms of media. For maximum efficacy we believe these need to have statutory backing.

September 2013

Prepared 18th March 2014