Online safety - Culture, Media and Sport Committee


Ofcom's response


Introduction: Ofcom's remit

Ofcom's written submission to the Committee set out our duties in relation to online safety and explained that our role in relation to internet services is limited. We regulate television channels and notified On Demand Programme Services delivered over the internet where they are established in the UK, but we have no powers or duties to regulate any other online content.

Section 11 of the Communications Act 2003 places a responsibility on Ofcom regarding the promotion of media literacy. We fulfil this duty through the publication of Media Literacy research into adults' and children/parents' media use and attitudes. We publish two substantial annual reports: 'Children and Parents: Media Use and Attitudes' and 'Adults' Media Use and Attitudes'. These provide detailed evidence about media use, attitudes and understanding among adults and among children and young people aged 3-15. The children's report also provides evidence about parental concerns over children's media use and the ways that parents seek to monitor and mediate that use.

Ofcom also publishes a range of other consumer research, exploring similar themes. In the context of the debate about network filters, Ofcom has been asked by the Secretary of State to report on parents' approaches to children's online safety, and specifically on the implementation of network-level filters by UK ISPs.

We share the findings of our research widely, including with the Government, industry, academia and the third sector. We also share our research data with the UK Council for Child Internet Safety (UKCCIS), on whose Executive Board we have a seat, to help inform debates about online safety, and have been involved in a number of UKCCIS initiatives (including the 'Advice for Child Internet Safety 1.0 - universal guidelines for providers'[1] and the 'Good practice guidance for providers of social networking and other inter-active services'[2]).

Ofcom's response to Committee recommendations:

We have limited our comments to those conclusions and recommendations directly relating to Ofcom's work or duties.

Recommendation 12:

We believe that, as part of its existing media literacy duties, Ofcom has an important role in monitoring internet content and advising the public on online safety. However, we are anxious to avoid suggesting a significant extension of formal content regulation of the internet. Among the unintended consequences this could have would be a stifling of the free flow of ideas that lies at the heart of internet communication. (Paragraph 55)

As noted above, Ofcom has a limited role in relation to internet services. We regulate television channels delivered over the internet and notified on-demand programme services (ODPS) where they are established in the UK (in this respect we have formally designated the Authority for Television On Demand [ATVOD] as the co-regulator for editorial content). We have no duties or powers to regulate, or monitor, any other online content.

Ofcom agrees with the Committee's caution against stifling the free flow of ideas on the internet; and freedom of expression is central to our approach to content regulation of the services for which we have such responsibilities.

As noted above and in our written submission to the Committee, Ofcom has media literacy duties which include the analysis of online media consumption. We fulfil this duty through the publication of Media Literacy research into adults' and children/parents' media use and attitudes. Our research provides evidence about parents' concerns about their children's media use and the ways that they seek to monitor and mediate that use.

We also publish the UK and Nations' Communications Market Report[3], last published in August 2013, which informs the delivery of our duties and our programme of work, and also keeps others informed about new technology developments and the impact these may have on the sectors that we regulate.

In addition to the above, the ParentPort website[4] aims to protect children by making it easier for parents to complain about material they have seen or heard across the media, communications and retail industries. In March 2013 it was updated to provide advice for parents on, amongst other things, keeping children safe online.

Any extension to our duties, related to media literacy or otherwise, would be a matter for Government and, ultimately, for Parliament.

Recommendation 13:

Providers of adult content on the internet should take all reasonable steps to prevent children under 18 from accessing inappropriate and harmful content. Such systems may include, but will not necessarily be restricted to, processes to verify the age of users. (Paragraph 62)

Ofcom welcomes any measures by industry players to promote child safety on the internet, including steps to protect under-18s from accessing inappropriate and harmful content.

Where Ofcom has regulatory duties, the approach recommended by the Committee is already applied. As explained in our evidence submission and above, Ofcom is responsible for ensuring that providers of television channels and video-on-demand services established in the UK observe relevant standards.

For linear services, the Ofcom Broadcasting Code (the Code) includes rules which help ensure the protection of minors from harmful content and from material that is unsuitable for them (covering content such as drugs, smoking, alcohol, violence and dangerous behaviour, and offensive language)[5]. This includes requirements that content unsuitable for children is appropriately scheduled and that television broadcasters must observe the watershed. The Code also includes rules that specifically relate to protecting children from sexual material, including prohibiting the broadcast of material equivalent to the British Board of Film Classification (BBFC) R18 rating and requiring 'adult-sex material'[6] to be shown only between 10pm and 5.30am and behind mandatory restricted access.[7]

For video-on-demand services we have formally designated the Authority for Television On Demand (ATVOD) as the co-regulator for editorial standards. As stated above, ATVOD's designated duties apply only to notified On Demand Programme Services (ODPS) delivered over the internet where they are established in the UK. ATVOD has no powers or duties to regulate any other online content. ATVOD's Rules and Guidance state that "if an on-demand programme service contains material which might seriously impair the physical, mental or moral development of persons under the age of eighteen, the material must be made available in a manner which secures that such persons will not normally see or hear it".[8] ATVOD interprets this to mean that R18 material or equivalent should only be made available in on-demand programme services if a robust age verification process is in place.

Where R18 material or equivalent material has been made available in an on-demand programme service without robust age verification, we have taken enforcement action as appropriate - as in the case of our recent sanctions imposed on three ATVOD notified on-demand programme services 'Playboy TV'[9], 'Demand Adult'[10] and 'Strictly Broadband'[11]; and the service 'Jessica Pressley', which was suspended because it did not have a robust age verification process in place.

Recommendation 16:

We welcome the introduction of whole home filtering solutions that prompt account holders with a choice to apply them. We encourage all internet service providers to offer their customers this valuable service. Ofcom should monitor the implementation of this filtering and report back on its level of success and adoption. (Paragraph 74)

In July 2013, the four largest UK Internet Service Providers (ISPs) - BT, Sky, Virgin Media and TalkTalk - made a commitment to Government to offer network-level filtering (also described as "family-friendly" filtering) to all new customers, starting by the end of 2013. Ofcom welcomes this development as an addition to the range of tools which parents may use to manage the online experiences of their children.

Ofcom has been asked to play a specific role in the debate about network filters, by examining parents' approaches to children's online safety, their awareness of and confidence in parental controls of all kinds, and, specifically, the implementation of network-level filters in the UK. In November 2013 the Secretary of State requested that Ofcom provide the Government with three reports during 2014.

The first report, published on 16 January 2014, covered:

·  the broader strategies parents may adopt to improve their children's online safety, including supervision, rules about online behaviour and the use of technical tools like parental control filters, and the extent to which parents use such approaches;

·  the levels of parental awareness and confidence with the safety measures that may be in place on sites regularly visited by children including, but not restricted to, content providers, search engines and social networking sites; and

·  research into the reasons parents may choose not to apply technical tools like parental controls.

A second report, later in the year, will cover the implementation of network-level filtering services by BT, Sky, Virgin Media and TalkTalk, in line with the commitments made to Government in July 2013.

A third report, to be published in December 2014, will track developments across the range of measures examined in the first report.

Recommendation 18:

We agree that the availability and performance of filtering solutions must be closely monitored, both for efficacy and the avoidance of over-blocking. It should also be easy for websites inadvertently blocked to report the fact and for corrective action to be taken. (Paragraph 79)

Recommendation 20:

Filters are clearly a useful tool to protect children online. Ofcom should continue to monitor their effectiveness and the degree to which they can be circumvented. (Paragraph 81)

As noted above, Ofcom has a specific role in reporting on parents' approaches to child online safety, and on the implementation of network-level filtering solutions. During 2014 we will provide the Government with three reports on parental awareness of, confidence in and take-up of parental controls. The final report will be published in December 2014.

The December 2014 report will include some research into parents' and children's use of parental controls, including the extent to which parents and children experience filters as blocking appropriate content; failing to restrict access to inappropriate content; or being circumvented.

Ofcom agrees that there should be clear procedures by which website operators can check their filtering status, notify ISPs if they believe they have been incorrectly classified (and blocked), and have their status reviewed. In this context, we welcome the clear framework and procedures set out by the Mobile Broadband Group, which include specific points of contact for complaints about a mobile operator's filtering service and a process of review operated by the British Board of Film Classification (BBFC). In our report on ISP implementation of network filtering (the second report, above), we propose to cover ISPs' approaches to such issues. We also note that the UK Council for Child Internet Safety has a working group on over-blocking which is considering this and related issues.

Recommendation 21:

We welcome the introduction of ParentPort but believe Ofcom should seek to promote and improve it further. For example, more use could be made of it to collect data on complaints concerning children's access to adult material. (Paragraph 83)

ParentPort was launched in 2011 following a recommendation in Reg Bailey's review of the sexualisation and commercialisation of childhood.[12] The website is jointly operated and owned by seven of the UK's media regulators - the Advertising Standards Authority (ASA), the Authority for Television on Demand (ATVOD), the BBC Trust, the British Board of Film Classification (BBFC), Ofcom, the Press Complaints Commission (PCC) and the Video Standards Council (VSC)/Pan-European Game Information (PEGI).

The site was launched to:

·  set out simply and clearly what parents and carers can do if they feel a programme, advertisement, product or service is inappropriate for children;

·  explain the rules in simple terms, and direct users to the right regulator for their area of concern so they can make a complaint quickly and easily; and

·  give parents and carers a way to provide informal feedback and comments which regulators can use as an extra gauge of parental views.

ParentPort now has a fourth function, added in March 2013: providing a wide range of tips and advice to help parents keep children safe when they are online, using mobile phones, on social networking sites, watching films, viewing advertising and playing video games.

ParentPort does not allow for the collection of data on complaints about children's access to adult material. The site only enables users to make complaints directly to the ParentPort regulators and therefore within their remits only.

All of the ParentPort regulators have well-established complaints functions which can be accessed directly on their own websites, and they report publicly on the complaints they receive. For example, Ofcom publishes data on viewers' and listeners' complaints as part of our Annual Report.[13] As a result, complaints data from ParentPort would capture only a subset of parental concerns.

All of the regulators involved in ParentPort have worked to promote the site and raise parents' awareness of it. The regulators are currently reviewing how the website could be developed to benefit as many parents and carers as possible. This will take into account other parent-focused internet safety websites and campaigns, including the upcoming Internet Matters campaign developed and funded by the major UK ISPs.

Recommendation 22:

We further recommend that Ofcom regularly reports on children's access to age restricted material, particularly adult pornography and the effectiveness of filters and age verification measures. Ofcom is well-placed to fulfil this role given the work it does on its Children and Parents: Media Use and Attitudes Report. (Paragraph 84)

As noted in our submission and above, Ofcom fulfils its media literacy duties through the publication of Media Literacy research into adults' and children/parents' media use and attitudes. This provides detailed evidence of media use, attitudes and understanding among adults and among children and young people aged 3-15. The children's report also provides evidence of parents' concerns about their children's media use and the ways that they seek to monitor and mediate that use. We share the findings of our research widely, including with the Government, industry, academia, the third sector, and through our seat on the Executive Board of UKCCIS.

Our media literacy survey does not ask children directly in detail about their access to age restricted material, particularly adult pornography, as there are a number of methodological challenges in obtaining this type of data due to the sensitivity of the subject. However, we include questions that ask more generally about whether 8-15s have come across material online that they have found worrying, nasty or offensive. We also ask 12-15s if they have seen anything of a sexual nature online or on a mobile phone. In addition, we report comScore online measurement metrics that can provide useful additional context in terms of what children are accessing online.

With regard to the effectiveness of filters and age verification measures, this year Ofcom has added new questions on internet safety measures (specifically filters and age verification systems) to the media literacy survey. These questions ask parents whether they believe such measures are useful, and whether they believe the measures block too much or too little content. The results will be reported this autumn.

Any extension to our reporting duties in this area would be a matter for Government and, ultimately, for Parliament.

Recommendation 31:

Ofcom should monitor and report on complaints it receives, perhaps via an improved ParentPort, regarding the speed and effectiveness of response to complaints by different social media providers. (Paragraph 115)

As explained above, Ofcom has limited powers in relation to internet services, and we do not have any statutory duties in relation to social media. As such, we do not investigate complaints in relation to social media.

Any broadening of our duties in this area would be a matter for Government, and ultimately for Parliament.

While we do not have any statutory duties in this area, together with the other regulators involved in ParentPort we have contributed to making available a wide range of advice for parents and carers to help them keep children safe when they are online, including when using social networking sites. This includes, for example, information on setting up an account for a child, using social networks, and social networking safety features.

However, ParentPort does not gather data on the speed and effectiveness of social media providers' responses to complaints.


1   http://media.education.gov.uk/assets/files/ukccis%20advice%20on%20child%20internet%20safety.pdf

2   http://media.education.gov.uk/assets/files/industry%20guidance%20%20%20social%20networking.pdf

3   http://stakeholders.ofcom.org.uk/market-data-research/market-data/communications-market-reports/cmr13/

4   www.parentport.org.uk. ParentPort has been jointly developed by the Advertising Standards Authority (ASA), the Authority for Television On-demand (ATVOD), the BBC Trust, the British Board of Film Classification (BBFC), Ofcom, the Press Complaints Commission (PCC) and the Video Standards Council (VSC)/Pan-European Game Information (PEGI).

5   http://stakeholders.ofcom.org.uk/broadcasting/broadcast-codes/broadcast-code/protecting-under-18s/

6   Material that contains images and/or language of a strong sexual nature which is broadcast for the primary purpose of sexual arousal or stimulation.

7   Mandatory restricted access means there is a PIN protected system (or other equivalent protection) which cannot be removed by the user, that restricts access solely to those authorised to view.

8   http://www.atvod.co.uk/uploads/files/ATVOD_Rules_and_Guidance_Ed_2.1_February_2014.pdf

9   http://stakeholders.ofcom.org.uk/binaries/enforcement/vod-services/Playboy_TV_Sanction.pdf

10   http://stakeholders.ofcom.org.uk/binaries/enforcement/vod-services/Demand_Adult.pdf

11   http://stakeholders.ofcom.org.uk/binaries/enforcement/vod-services/Strictly-Broadband.pdf

12   https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/175418/Bailey_Review.pdf

13   http://www.ofcom.org.uk/files/2013/07/Ofcom_Annual-Report_AD600_ACC-2_English.pdf. It should be noted that Ofcom does not break down its data on broadcasting complaints by category of complaint.


 


© Parliamentary copyright 2014
Prepared 3 July 2014