3 Adult content
Nature and scale
38. While child abuse images attract worldwide opprobrium,
there exists less consensus, both nationally and internationally,
on what other kinds of material adults might properly access. Tensions
can, and do, arise between the proscription of obscene material
and freedom of expression. In the United Kingdom, it is perfectly
legal for adults to possess images of explicit sexual activity
of a kind that attracts an R18 certificate issued by the British
Board of Film Classification. The BBFC told us that their
guidelines "are the result of extensive public consultation
with over 10,000 people across the UK being consulted during the
most recent Guidelines consultation in 2013. Research demonstrates
that the public agrees with the BBFC's classification decisions
most of the time."[58]
39. The BBFC's written evidence provides a clear
reminder of the qualitative difference in the nature and accessibility
of pornographic material between the off-line and online worlds:
The BBFC removes any material from pornographic
works which is potentially harmful or otherwise illegal. As well
as policing the border between legal and illegal pornography,
the BBFC polices the border between the strongest, hardcore pornography,
and the less strong, softcore pornography. The BBFC classifies
hardcore pornography as R18, which means that it may only be supplied
through licensed sex shops, as an extra precaution against underage
viewing. However, the risk of children accessing even the strongest
legal pornography is far greater online. In addition, there are
fewer effective controls on the distribution online of pornography
which the BBFC would not classify at any category.[59]
40. Judging from the evidence we received, pornography
was the category of adult content that caused most concern. This
could be an indication of its particular prevalence on the internet.
The Authority for Television on Demand (ATVOD) told us that five
of the top 50 websites most commonly accessed from the UK are
"tube" websites offering unrestricted access to hardcore
pornography videos. ATVOD also cited figures which suggest that
those five sites were (collectively) visited over 214 million
times by UK internet users during January 2013.[60]
John Carr of the Children's Charities' Coalition on Internet
Safety told us that "right now today within the UK, there
is nothing there routinely that restricts access to the most bizarre,
the most violent and the most graphic types of pornography; anybody
can get it."[61]
Pornographic material, much of it illegal, is but "two clicks
of a mouse"[62]
away.
41. The NSPCC's Childline service provides one indication
of the harm done to young people accessing pornography. According
to the NSPCC:
During 2011-12, there were 641 counselling sessions
where the young person specifically mentioned being exposed to
sexually indecent images. While these incidents will not exclusively
relate to online content, a large proportion of this sexually
explicit material will have been accessed through internet enabled
devices. Young people often told ChildLine that they felt guilty
and disgusted about what they had seen and were extremely worried
about getting into trouble for accessing these sites. ChildLine
has also seen a growing trend of young people talking about being
addicted to online pornography.[63]
42. In January, we held a meeting at the House of
Commons with eight young people, some of whom had been harmed
by exposure to adult pornography. One young man told us how he
had first encountered pornography at the age of eight; viewing
pornography had subsequently become a habit which distorted his
picture of loving relationships. Another participant told us
how, as a teenager, she had been drawn accidentally into viewing
pornography from information in a fashion magazine; just one encounter
had made her feel ashamed and had affected her relationship with
her father. Some girls told us how boyfriends sometimes expected
them to behave like "porn stars" and that the exchange
of sexually explicit material on mobile phones could lead to bullying.
The law
43. Online activity is subject to general offline
legislation such as the Obscene Publications Act 1959 and the
Human Rights Act 1998. Publication of obscene material, including
child abuse images and extreme adult pornography, is illegal under
the Obscene Publications Act 1959 (which extends to England
and Wales). An important point is that the definition of obscene
depends partly on the person who sees the material. "Legal"
adult pornography that has an R18 certificate, issued by the British
Board of Film Classification, would likely be classed as obscene
if it was published in a way in which children could readily access
it. Both the Children's Charities' Coalition on Internet Safety
and the Authority for Television on Demand (ATVOD) cited case
law (in particular, R v Perrin) in support of this assertion.
The test of obscenity in section 1 of the Act leaves little room
for doubt in our minds:
For the purposes of this Act an article shall
be deemed to be obscene if its effect or (where the article comprises
two or more distinct items) the effect of any one of its items
is, if taken as a whole, such as to tend to deprave and corrupt
persons who are likely, having regard to all relevant circumstances,
to read, see or hear the matter contained or embodied in it.[64]
44. The internet, where everyone with a connected
computer is potentially a publisher, is largely a free-for-all, though
some audiovisual content is becoming subject to broadcast-style
regulation. The British Board of Film Classification engages
extensively with the public in reaching decisions as to what standards
are acceptable. The R18 classification is given to restricted
videos depicting explicit consensual sexual activity which, by
definition, excludes so-called "rape porn" and other
illegal activities. R18 videos are only allowed to be sold in
licensed sex shops (to which only adults are admitted); they
may not be supplied by mail order. BBFC certification provides
more than a useful yardstick as to content that adults may legitimately
choose to access:
The Government has recognised the dangers of
extreme pornography and in 2008 made possession of extreme pornography
an offence under the Criminal Justice and Immigration Act. A BBFC
classification is a defence against a prosecution under this Act;
therefore purchasing a legally classified work is a protection
against inadvertently possessing extreme pornographic material.
The BBFC regularly assists Local Government trading standards
officers in ensuring that pornographic material has been classified
by the BBFC and restricted for sale to licensed sex shops. However,
these methods of enforcement are not available online.[65]
45. Transposing accepted standards into the online
context represents a challenge for national institutions, not
least because the internet is in many ways an international space.
Enforcement
46. Evidence from the Authority for Television on
Demand (ATVOD) extensively explores the potential use of the Obscene
Publications Act. ATVOD draws attention to the current Crown
Prosecution Service guidance for prosecutors on the interpretation
of the Obscene Publications Act, which states that:
where children are likely to access material
of a degree of sexual explicitness equivalent to what is available
to those aged 18 and above in a licensed sex shop, that material
may be considered to be obscene and subject to prosecution. This
applies to material which is not behind a suitable payment barrier
or other accepted means of age verification, for example, material
on the front page of pornography websites and non-commercial,
user-generated material which is likely to be accessed by children
and meets the threshold. See R v Perrin, [2002] EWCA Crim 747.[66]
47. ATVOD told us they could find no recent example
of a prosecution being launched under the above guidance. John
Carr of the Children's Charities' Coalition on Internet Safety
told us:
That law is being honoured in the breach rather
than in the observance. It is principally because most of the
publishers are based overseas and the British police have not
sought to extradite them or go after them, and that is a great
pity. In our evidence, we have made a number of practical suggestions
about how we might try to get at least companies that are based
in Britain or operate from here to try to observe that particular
law. That is to say, "If you are going to publish porn, okay,
that is your business, but please take concrete steps to make
sure kids cannot get easy access to it."[67]
48. The Parliamentary Under-Secretary of State for
Culture, Communications and Creative Industries, Edward Vaizey
said: "Obviously nothing should stop us doing the right thing,
in terms of prosecutions or clarifying the law. Nevertheless we
do have to be aware that a lot of these sites do not provide suitable
identification as to who owns them, and again ATVOD is suggesting
that we clamp down on those sites by denying them their financial
support."[68] Jim
Gamble did not see access to legal adult pornography primarily
as a law enforcement issue. Alluding to the "active choice"
filtering solution, he told us: "I will deal with the inappropriate
material first because it is the easy one. I think inappropriate
material is a parental decision for those individuals who have
duty of care of the young people to make and I think active choice
is absolutely right. If parents and others are prompted to make
a decision, I do not think you can do more than that. You are
not going to go out into their homes and look after their children
for them."[69]
49. ATVOD told us that they are working with the
UK payments industry to design a process which would enable payments
from the UK to foreign websites which allow children to view hardcore
pornography to be blocked. However, there needed to be greater
clarity over the legality of providing unrestricted access to
hardcore pornography.
50. ATVOD has an enforcement role in connection with
a limited number of television-like services. It was designated
by Ofcom in March 2010 to regulate the editorial content of UK
video on demand services. Its duties and powers derive from the
EU Audiovisual Media Services Directive, which was implemented
in the UK via amendments to section 368 of the Communications
Act 2003 ("the Act"). Under the Act, UK services which
meet the statutory definition of an on demand programme service
("ODPS") must comply with a limited number of statutory
requirements which have been incorporated by ATVOD in a set of
Rules. ATVOD's primary role in relation to protection of minors
is founded in Article 12 of the Directive, which forms ATVOD's
Rule 11: "If an on-demand programme service contains material
which might seriously impair the physical, mental or moral development
of persons under the age of eighteen, the material must be made
available in a manner which secures that such persons will not
normally see or hear it."[70]
ATVOD's evidence anticipates legislation in this area:
The DCMS strategy paper ("Connectivity,
Content and Consumers") published in July 2013 sets out Government's
intention to legislate to make clear that R18-equivalent material
on ODPS must be provided only in a manner which ensures that under
18s do not normally see or hear it. ATVOD considers that such
legislation would provide clarity for consumers and industry and
better ensure deployment of effective safeguards. Such legislation
would also remove the possibility of ATVOD's consumer protection
measures being undermined by a legal challenge.[71]
51. We believe that the existing obscenity laws
already proscribe the publication of adult material in ways that
make it readily available to children. However, we are concerned
that no prosecutions have been brought despite the proliferation
of pornography sites which make no attempt to restrict access
by children. We welcome the Government's declared intention to
legislate to clarify the law in this area. However, in the meantime,
we urge the prosecuting authorities to use the existing law to
crack down on the worst offenders in order to put pressure on
all suppliers of hardcore pornography to make greater efforts
to ensure that such material is accessible only by adults.
52. A major difficulty lies in the fact that ATVOD
and Ofcom can only regulate services based in the United Kingdom.
Furthermore, the requirements of the Audiovisual Media Services
Directive are interpreted differently in other Member States.
The Dutch regulator, for example, takes the view that hardcore
pornography does not seriously impair the development of under
18s and such services operating from the Netherlands are not required
to ensure that under 18s cannot normally see or hear them.[72]
In any case, the majority of online hardcore pornography services
available to children in the UK operate from outside the European
Union (most commonly from the USA).
53. The Government should seek agreement with
other European Union Member States to ban on demand programme
services that make pornography readily available to children.
We further urge the Government to engage with other international
partners, particularly the USA, with the aim of securing a similar
outcome more widely.
54. Evidence from Ofcom includes a reference to a
research report, 'Protecting audiences in a converged world'.[73]
This research looked at public attitudes within the context of
convergence, in order to understand the public's expectations
for protection and how content should be regulated in the future:
viewers have high expectations of content regulation on broadcast
television, and associated video on demand and catch-up services,
less so for internet content accessed through devices such as
PCs and laptops. Quite how public expectations will develop as
smart TVs and other manifestations of media convergence become
more commonplace remains to be seen.
55. An independent parliamentary inquiry into online
child protection (April 2012), chaired by Claire Perry, suggested
that the Government should consider a new regulatory structure
for online content, with one regulator given a lead role in the
oversight and monitoring of internet content and in improving
the dissemination of existing internet safety education materials.
Ofcom already has a role in relation to internet services, though
this is largely confined to promoting media literacy and performing
research. Ofcom told us: "We regulate television channels
delivered over the internet and notified ODPS when they are established
in the UK; but we have no statutory powers to regulate any other
online content."[74]
We believe that, as part of its existing media literacy duties,
Ofcom has an important role in monitoring internet content and
advising the public on online safety. However, we are anxious
to avoid suggesting a significant extension of formal content
regulation of the internet. Among the unintended consequences
this could have would be a stifling of the free flow of ideas
that lies at the heart of internet communication.
Age verification
56. Providers of adult content can prevent children
from accessing inappropriate and harmful material by putting in
place systems that require evidence of age. In this regard, the
mobile network operators have stolen a march over the internet
service providers. A majority of children have a mobile and an
increasing proportion of them go online using a mobile phone or
smart phone. The mobile operators' work in online safety is underpinned
by a Code of Practice that was first published in January 2004,
'The UK code of practice for the self-regulation of new forms
of content on mobile'. The second edition of the code was published
in 2009 and the third (and current) edition in July 2013. The
Code was the first of its kind and was used as the boilerplate
for similar codes introduced by mobile operators throughout the
EU.[75]
57. The Code covers a broad range of topics: commercial
and internet content, illegal content, malicious communications,
spam communications and customer education. A distinction is made
between commercial and internet content. The Mobile Broadband
Group told us: "The mobile operators' respective responsibilities
for commercial content (where they have contractual agreements
in place with content providers) as against general content
on the Internet are different."[76]
Any commercial content with an 18 rating (determined by the British
Board of Film Classification) is placed behind access controls
and subject to "a robust age verification process"[77]
(acceptable methods of which are set out in the Code).
58. Age verification is clearly more challenging
when accessing content does not involve a direct financial transaction
and where the users have an expectation of some degree of anonymity.
There is a stark contrast between the requirements on the online
gambling industry and those on other providers of online services
to adults. As the Remote Gambling Association highlighted:
The Gambling Act 2005 allowed for a wider range
of advertising of gambling products in Great Britain. To be able
to advertise a gambling operator has to hold an operating licence
issued by the Gambling Commission, or an equivalent licence issued
by an EU gambling regulator or an overseas regulator which issues
licences with equivalent standards to the UK regulator. These
licences require that before bets can be settled the customer
is over 18 and has had his or her identity verified.
As far as we are aware, no other adult service
providers are required by law to ensure that their customers are
over the age of 18. This puts the regulated online gambling industry
in a different position to other e-commerce sectors. Because there
are mandatory safeguards in place, but especially where children
are concerned, we believe that the principles at least should be
applied equally.[78]
59. Online gambling of necessity involves a financial
transaction which makes age verification relatively easy. There
are, however, difficulties in relation to the use of other services
or consumption of content which do not necessarily involve direct
payment. British Naturism told us: "Age verification is used,
for example, by gambling websites where the possession of a valid
credit card forms both the financial security and implicit verification
of the person's right to gamble. But in the general case of free
access to unsuitable websites, it is unclear to us what mechanism
could be devised that verifies the age of the individual who has
made initial access, but does not block the unverifiable access
by, say, another family member or friend to whom control of the
device is passed."[79]
60. The Christian social policy charity, CARE, acknowledged
that many adult websites require robust age verification to access
18+ content. "However there are many more websites that provide
such content for free without robust age verification. Moreover,
the business model of these websites can be driven by their click
through rate as it relates to advertising. This means that the
more clicks a website receives, the more money they make, disincentivising
the owners of these websites from applying age verification."[80]
The Authority for Television on Demand (ATVOD) provided us with
further details on the business models of hardcore pornography
services, most of which are operated from overseas: "The
most frequently accessed services use a variation on the YouTube
business model (and are consequently commonly referred to
as "tube sites"). Such tube sites offer significant
quantities of unrestricted free hardcore porn videos as a shop
window in order to attract large number of viewers whose visits
are monetised in a number of ways: by up-selling to a premium
version of the free service (offering a wider choice, longer videos,
better picture quality, etc); by driving traffic to other paid
(pornographic) services operated by the provider of the tube site;
by charging on a 'click through' basis to affiliates whose content
is featured on a 'try before you buy' basis on the tube site;
and by selling advertising space (eg for 'contact' services or
penis enlargement treatments)."[81]
61. Among the measures recommended by CARE is financial
transaction blocking of adult websites that do not put in place
"robust" age verification procedures. ATVOD provided
us with the following examples of suitable age verification methods:
· Confirmation of credit card ownership
or other form of payment where mandatory proof that the holder
is 18 or over is required prior to issue.
· A reputable personal digital identity
management service which uses checks on an independent and reliable
database, such as the electoral roll.
· Other comparable proof of account ownership
which effectively verifies age.[82]
62. The Mobile Broadband Group argues that providers
of age restricted services should themselves be putting in place
their own processes to protect minors. We agree. Providers
of adult content on the internet should take all reasonable steps
to prevent children under 18 from accessing inappropriate and
harmful content. Such systems may include, but will not necessarily
be restricted to, processes to verify the age of users.
63. The Children's Charities' Coalition on Internet
Safety suggested to us that more could be done in relation to
age verification. Their written evidence includes a suggestion
that refers to case law supporting the view that publishing adult
material in ways that make it accessible by children is in breach
of obscenity legislation: "Nominet should make compliance
with R v Perrin a condition of operating a .uk domain name e.g.
if a site is to publish pornography the operator must give a binding
undertaking to put an effective age verification process in place".[83]
Nominet describes itself as "the trusted guardian of the
.uk domain name space, Nominet is responsible for the stability
and security of one of the largest internet registries in the
world, with more than 10 million registered domain names."[84]
We have no reason to suppose that Nominet has either the resources
or inclination to police the internet. Age verification, while
ideal, is not the only way of preventing children from accessing
unsuitable content. However, we believe that no .uk site should
offer unimpeded access to adult pornography to children. This
should be made a condition of registration.
Site blocking
64. The site blocking approach enabled by the Internet
Watch Foundation, with the necessary cooperation of ISPs, is one
instrument, albeit a blunt one, aimed at preventing
access (by anyone) to online material. It has so far been applied
mainly to images of child abuse. Extending this approach to other
material, including some adult sites, would face challenges both
of scale and cost. BCS[85],
the Chartered Institute for IT, told us:
There has been much resistance to the Internet
Watch Foundation's widening its remit to the other material in
the Select Committee's question, and BCS does not believe that
this is the way forward.
Some people say that more should be done, and
imply, without saying so, that content-based filtering should
be used, so that more such material could be blocked. This would
require a major change in society's attitude to censorship, as
well as primary legislation to enact fundamental changes to the
Regulation of Investigatory Powers Act. BCS does not believe that
this is either feasible or desirable.[86]
65. BT, a major ISP, has also expressed concerns:
In the absence of clear primary legislation from
Parliament, or an EU-wide legislative instrument, BT does not
wish to police the internet beyond preventing access to illegal
material. To do so would set an unfortunate precedent in which
an ISP would become the arbiter of taste and decency in relation
to online content. It is not for an ISP to be placed in such a
position.
Legal opinion informs us that filtering any internet
material on home broadband or public wi-fi may be illegal under
RIPA 2000 and that this is so even if the purpose for filtering
is child protection, and even if the internet user has chosen
to set up filters. BT has raised with government the potential
conflict between network level content filtering and the Regulation
of Investigatory Powers Act (RIPA 2000). We would expect to receive
clarity that our employees and/or those of our wi-fi site partners
would not face a criminal prosecution under RIPA 2000 by offering
filtering activities to our wi-fi site partners for blocking unsuitable
content from reaching vulnerable individuals.[87]
66. The Internet Watch Foundation describes blocking
of child abuse sites as "a short-term disruption tactic which
can help protect internet users from stumbling across these images,
whilst processes to have them removed are instigated."[88]
Site blocking is highly unlikely to be a suitable approach
for adult pornography or violent material much of which is legal
(at least if it is unavailable to minors) and which is prevalent
on the internet. However, blocking should be considered as a
last resort for particularly harmful adult websites that make
no serious attempt to hinder access by children.
Filters
67. An independent parliamentary inquiry into online
child protection (April 2012), chaired by Claire Perry, noted
that "while parents should be responsible for monitoring
their children's internet safety, in practice this is not happening".[89]
The report went on to recommend that the Government "should
launch a formal consultation on the introduction of an Opt-In
content filtering system for all internet accounts in the UK"
as well as seeking "backstop legal powers to intervene should
the ISPs fail to implement an appropriate solution".[90]
Following a subsequent Department for Education consultation,
the Government stopped short of proposing a default-on or opt-in
filtering system, partly on the grounds of practicality, the
danger of blocking legitimate sites and the inability of such
systems to cut off other types of harmful material such as grooming
and bullying. The Perry report came on the back of a number of
other studies that have looked at how best to protect children,
for example, Reg Bailey's Letting Children be Children
(June 2011) and Tanya Byron's Safer children in a digital world
(March 2008). The latter report led to the setting up of the UK
Council on Child Internet Safety in September 2008. Our predecessor
Committee also published its report, Harmful content on the
Internet and in video games, in July 2008.
68. In its evidence, the Department for Culture,
Media and Sport notes that 91% of children live in households
with internet access and that a greater proportion of children
aged 12-15 own smartphones than adults. The Government "understands
that, first and foremost, responsibility for keeping children
safe online falls to parents and guardians; however, Government
is acting to ensure that parents have the tools and information
they need to be able to do so."[91]
In particular, the Government has been working through the UK
Council for Child Internet Safety (UKCCIS) "to pursue a voluntary
approach to child internet safety and has called on industry to
make the right tools available to allow parents to protect children
online."[92]
69. In their submission, Intellect rehearsed the
roles the technology industries have been playing in the development
of parental control tools such as filters. They also refer to
a "continual process to innovate and update these tools".[93]
Intellect added:
Thus it is clear that an irreversible momentum
has developed across the industrial ecosystem providing internet
access to continually develop technology tools in response to
the fast evolving internet environment. It is this application
of technology innovation which will ensure the diverse set of
tools needed to support a safe online environment, not regulation,
which in contrast could freeze innovation.[94]
70. In a speech to the NSPCC on 22 July 2013,[95]
the Prime Minister announced a range of measures to tackle the
"corroding" impact of online pornography on childhood.
Some of these would prevent children from being able to access
(legal) pornography while other measures would target child abuse
images and the activities of paedophiles.
71. On access to pornography, Mr Cameron said that
by the end of the year, family-friendly filters would automatically
be selected for all new broadband customers (unless the account
holder chose otherwise). Once installed, the filters would cover
any device connected to the customer's internet account and only
the account holder, who must be an adult, would be able to change
the filters. Internet service providers would be given until the
end of 2014 to contact existing customers and present them with
an "unavoidable decision" about whether or not to install
family friendly content filters.
72. In March 2012, TalkTalk had become the first
internet service provider to introduce a genuine "unavoidable
choice" for new customers when they signed up to TalkTalk
broadband, as per the recommendation of the Bailey Review. TalkTalk
told us that customers are asked to make a 'yes' or 'no' decision
as to whether they want to filter access to content that might
be inappropriate for under 18s on their broadband connection or
not.[96] TalkTalk then
applies this to their internet connection as soon as it is live,
and no further action is required by the customer. The customer
is also alerted by email and/or text if any of the so-called Homesafe
settings are changed; safeguards such as this aim to ensure children
are not changing settings without their parents' knowledge.
73. The Internet Service Providers' Association told
us: "The main consumer facing ISPs are moving to a system
where new and existing customers are presented with an unavoidable
choice of whether to apply filters or not. These filters cover
the whole home ... Some smaller consumer-facing providers are
considering solutions that offer family friendly filters but can
be deployed on smaller scale and at lower costs. ISPA is currently
discussing this issue with its members."[97]
Claire Perry told us that the top four ISPs are introducing "unavoidable
choice" filtering solutions. She said: "we would like
the others to commit to the same thing. They will ask you in different
ways. Some might ask you when you go to query your bill online,
some may interrupt your browsing session, which is a first for
the UK to do that. This is the commitment that Ofcom will be monitoring.
Every household will be contacted and asked whether or not they
would like to switch on the filters, and the box, "Yes, please"
will be pre-ticked."[98]
74. We welcome the introduction of whole home
filtering solutions that prompt account holders with a choice
to apply them. We encourage all internet service providers to
offer their customers this valuable service. Ofcom should monitor
the implementation of this filtering and report back on its level
of success and adoption.
75. While greater use of filters is welcome, they
should not be seen as a panacea. ATVOD told us: "The usefulness
of parental control software depends not only on its uptake but
also on its effectiveness. This is especially important lest parents
who use parental control software are lulled into a false sense
of security about the extent to which their children have been
protected when using the internet."[99]
ATVOD further cites EU Commission research which suggests that
the filters themselves when set to block "adult" content
suffer from relatively high rates of over-blocking (accidentally
blocking non-adult sites) and under-blocking (failure to block
adult sites). Although the efficacy of parental controls may have
improved since that research was conducted in 2011, ATVOD told
us it is clear that both "over-blocking" and "under-blocking"
still occur.[100]
The Internet Service Providers' Association told us that filtering
does have limitations and that over-blocking and under-blocking
of content "is inevitable".[101]
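The two failure modes ATVOD describes can be quantified straightforwardly: over-blocking is the share of non-adult sites a filter wrongly blocks, and under-blocking is the share of adult sites it fails to block. The following is a minimal illustrative sketch; the sample data is invented, and real evaluations of the kind cited by the EU Commission use large hand-labelled URL sets.

```python
# Sketch: measuring a filter's over-blocking and under-blocking rates
# against a labelled sample of sites. The sample below is invented
# purely for illustration.

def filter_error_rates(results):
    """results: list of (is_adult, was_blocked) pairs for tested sites."""
    adult = [r for r in results if r[0]]
    non_adult = [r for r in results if not r[0]]
    # Under-blocking: adult sites the filter failed to block.
    under = sum(1 for _, blocked in adult if not blocked) / len(adult)
    # Over-blocking: non-adult sites the filter wrongly blocked.
    over = sum(1 for _, blocked in non_adult if blocked) / len(non_adult)
    return over, under

sample = [
    (True, True), (True, True), (True, False),     # 3 adult sites, 1 missed
    (False, False), (False, False), (False, True), # 3 ordinary sites, 1 wrongly blocked
]
over, under = filter_error_rates(sample)
print(f"over-blocking: {over:.0%}, under-blocking: {under:.0%}")
# prints "over-blocking: 33%, under-blocking: 33%"
```

Reporting both rates matters because they trade off against each other: a stricter filter lowers under-blocking at the cost of higher over-blocking, which is precisely the tension the young people quoted below describe.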
76. When we held a meeting with eight young people
in January, we heard varying views on filtering. Some called for
stronger filtering to prevent access to harmful material online,
particularly pornography. We were told of boys circumventing the
filters in place in school to access age-inappropriate content.
However, others expressed concern that if filters were too strong
or inappropriately applied, young people could be prevented from
accessing websites offering advice on sexual health and online
safety.
77. TalkTalk described its 'Homesafe' system as a
"whole home parental controls system that allows parents
to protect every device connected to the home broadband and control
the types of websites their family is able to visit."[102]
Homesafe has three features:
· Kids Safe – parental controls that
allow the account holder to block content they don't want to be
accessed on their connection. There are nine different categories,
and customers can also choose to block other specific websites.
· Virus Alerts – an alert system that
blocks access to web pages infected with malware[103]
and phishing sites.[104]
· Homework Time – this allows parents
to block social networking and online games sites (common
sources of distraction for children from homework) during
a specified time of day.
The Association for UK Interactive Entertainment
describes as "deeply troubling" the inclusion of games
in the TalkTalk list. They told us of the "potential for
this to do significant collateral damage to the UK games industry."[105]
We value the UK games industry and the many educational and
recreational benefits it provides to children. As filtering technologies
continue to develop, as they should, we trust parents will be
empowered to provide the supervision they want of what games their
children play and when.
78. In connection with filters applied by mobile
network operators, the BBFC has a role in calibration: providing
advice to mobile operators on where to set their Internet filters.
The Mobile Broadband Group told us that processes also exist
to remedy inadvertent or unfounded over-blocking. The Group also
told us: "The BBFC framework is binary: 18 or unrestricted.
This is because 18 is the only age at which it is currently practical
to implement convenient, ubiquitous and robust on-line age verification.
Stricter filters are available in the market for parents that
may want a narrower range of content for younger users but these
fall outside the Code."[106]
At a recent seminar, Adam Kinsley, Director of Policy, BSkyB,
gave details of that ISP's more granular filtering solution, which
has additional age categories analogous to those used for cinema
exhibition.[107]
79. The Mobile Broadband Group reminded us that mobile
operators have had network filtering in place for nearly ten years.
They added: "Great pressure has also recently been put on
the domestic ISPs and public wi-fi operators to do the same – and
this is happening. However, all these efforts would be complemented
with the availability of better user controls at operating system
and device level. The UK, through UKCCIS and other channels, should
continue to examine closely what the manufacturers and operating
system providers are offering in the area of child safety and
challenge them to be as equally committed as the network providers."[108]
We agree that the availability and performance of filtering
solutions must be closely monitored, both for efficacy and the
avoidance of over-blocking. It should also be easy for websites
inadvertently blocked to report the fact and for corrective action
to be taken.
80. A recent report by Ofcom notes: "The provision
of accurate content labels or metadata by content providers would
help filtering systems to categorise content correctly. However,
only a tiny proportion of websites are labelled in a way that
allows easy categorisation for the purposes of filtering."[109]
Websites that provide adult content should signal the fact
clearly so that filters can take effect more reliably. A failure on
the part of the operators of such sites to do so should be a factor
in determining what measures should be taken against them.
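One existing convention for the self-labelling Ofcom describes is the voluntary RTA ("Restricted To Adults") label, a fixed string placed in a page's metadata that filtering software can match without needing to analyse the page's content. The sketch below shows how a filter might detect such a label; the example page is invented, and this is an illustration of the general labelling approach rather than any particular vendor's implementation.

```python
# Sketch: detecting a self-labelled adult site via page metadata.
# The RTA label is a real voluntary convention; the example HTML
# below is invented for illustration.
from html.parser import HTMLParser

RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

class LabelFinder(HTMLParser):
    """Scans meta tags for the RTA content label."""
    def __init__(self):
        super().__init__()
        self.labelled = False

    def handle_starttag(self, tag, attrs):
        # attrs arrives as a list of (name, value) pairs.
        if tag == "meta" and dict(attrs).get("content") == RTA_LABEL:
            self.labelled = True

def is_self_labelled(html: str) -> bool:
    finder = LabelFinder()
    finder.feed(html)
    return finder.labelled

page = ('<html><head>'
        '<meta name="RATING" content="RTA-5042-1996-1400-1577-RTA">'
        '</head></html>')
print(is_self_labelled(page))  # True
```

Because the label is a simple exact-match string, checking it is cheap enough to run on every page request, which is why Ofcom notes that wider labelling would materially improve filter accuracy.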
81. One of the arguments given against filtering
is the ease with which it can be circumvented. According to Ofcom
research, most parents report that they know enough to keep their
child safe online, but around half of parents continue to feel
that their child knows more about the internet than they do, including
14% of parents of children aged 3-4. Ofcom also acknowledges:
"In some cases, children will be able to bypass filters,
either by altering the filtering settings or by using tools to
conceal the sites they are visiting from the filtering software.
The main mechanisms by which filters may be bypassed are through
the use of a VPN (virtual private network), which encrypts all
internet traffic, and the use of proxy sites."[110]
Ofcom research has also established that 18% of children aged
12-15 know how to disable online filters or controls, but considerably
fewer (6%) have done this in the past year. Filters are clearly
a useful tool to protect children online. Ofcom should continue
to monitor their effectiveness and the degree to which they can
be circumvented.
Media literacy and education
82. Filtering systems will in general fail to capture
text and picture messages sent directly between individuals.
Andy Phippen, Professor of Social Responsibility in Information
Technology, Plymouth University, told us about some of the discussions
he has had with young people themselves:
Protection from access is an interesting concept.
How can we protect them from content they wish to access (which
is certainly something I would observe from talking to boys far
more than girls)? This, again, was reflected in discussions recently
with a very mature group of 14-16 year old boys in a local school – one
boy, who was discussing the recent policy discussions around "opt-in"
and filtering in the home, made a very clear statement: "You
will not prevent teenage boys from accessing pornography".
He did not state this to be rebellious or controversial, he was
stating it from his observations of his peers. They access and
share pornography and have many ways of doing so.[111]
83. Comments such as this serve only to highlight
the importance of media literacy and education. The difficulty
in entirely preventing access to age-inappropriate material emphasises
the importance of e-safety in the school curriculum and the availability
of advice to parents and carers. Such advice should include how
to report harmful material. Evidence we received from the DCMS
did highlight the role of education in online safety.[112]
From September 2014, the national curriculum will extend e-safety
teaching to pupils aged between 5 and 10 (11-16 year olds are
already covered). The DCMS also referred to several general educational
tools and programmes: Think U Know (from CEOP); the UK's Safer
Internet Centre (which has a hotline for internet safety information);
Get Safe Online (providing advice at the initiative of government,
law enforcement, businesses and the public sector); online resources,
including Know IT All, from the Childnet charity; the South West
Grid for Learning; and ParentPort. The last of these is a complaints
portal that directs individuals to the relevant media regulator,
or to sources of advice for content that has no regulator responsible.
Edward Vaizey told us: "Ofcom was behind setting up ParentPort
because it felt you needed a one-stop shop for parents to go to
get all the advice they needed."[113]
We welcome the introduction of ParentPort but believe Ofcom
should seek to promote and improve it further. For example, more
use could be made of it to collect data on complaints concerning
children's access to adult material.
84. We further recommend that Ofcom regularly
reports on children's access to age-restricted material, particularly
adult pornography, and on the effectiveness of filters and age verification
measures. Ofcom is well-placed to fulfil this role given the work
it does on its Children and Parents: Media Use and Attitudes Report.
85. Childnet International commented on the need
for further work on the education front:
There is a need for ongoing educational and awareness
work in this area ... As the UK Safer Internet Centre, Childnet
(along with the Internet Watch Foundation and South West Grid
for Learning) will be running Safer Internet Day 2014 which will
take place on 11th February. The theme of Safer Internet Day is
"Let's create a better internet together". This positive
call to action provides all stakeholders with the opportunity
to reach out and positively work to empower internet users in
the UK.
We are hoping a proposed industry-led awareness
campaign, led mainly by the 4 big ISPs, can combine with our work
and help make Safer Internet Day 2014 even bigger than last SID
2013, which reached 10% of the population, and led to 40% changing
their online behaviour as a result of the campaign.[114]
Safer Internet Day 2014 was subsequently celebrated
by 106 countries, and early indications are that it was a great
success: over 25 million people were reached by the "SID2014"
Twitter hashtag alone.[115]
86. In their evidence, the sexual health charities
FPA[116] and Brook
included the following: "Sex and Relationships Education
(SRE) guidance pre-dates recent and emerging issues on technology
and safeguarding, with no reference to addressing on-line safety,
"sexting" or pornography in SRE. Brook and FPA recommend
that the Government should update SRE guidance for the modern
era."[117] The
young people we met in January were unanimous that schools should
be required to offer sex and relationships education. As one
young person put it, teachers are a sounder source of professional
information on sex than friends or the internet. The young people
said providing them with the knowledge, tools and confidence to
navigate potential online dangers would ultimately be more beneficial
than technical measures. The NSPCC recently told us that the
Government has committed to emailing every school SRE advice developed
by Brook, the Sex Education Forum, and others.[118]
We note comments on the state of, and access to, sex and relationships
education. We are aware this is a politically contested subject
but believe the Government should take into account the views
of the young people who gave evidence to us of the value and importance
of good quality mandatory sex and relationships education as policy
develops. In the meantime, teachers have many opportunities to
use their professional judgement in advising children both on
online safety and on respect for each other. We believe there
is scope for providing teachers with clearer signposting of the
advice and educational resources that are already available.
58 Ev w14
59 Ev w15
60 Ev w135
61 Q 7
62 Q 7
63 Ev 69
64 Section 1, Obscene Publications Act 1959
65 Ev w15
66 Ev w137
67 Q 6
68 Q 222
69 Q 113
70 Ev w133
71 Ev w135-w136
72 Ev w134
73 http://stakeholders.ofcom.org.uk/binaries/research/tv-research/946687/Protecting-audiences.pdf
74 Ev 95
75 Ev 86
76 Ev 86
77 Ev 86
78 Ev w7
79 Ev w120
80 Ev w129
81 Ev w134
82 Ev w133
83 Ev 65-66
84 www.nominet.org.uk/uk-domain-names
85 Formerly known as the British Computer Society
86 Ev w35
87 Ev w57-w58
88 www.iwf.org.uk/members/member-policies/url-list
89 Independent parliamentary inquiry into online child protection: findings and recommendations, April 2012, p5
90 Ibid, p8
91 Ev 108
92 Ev 108
93 Ev w140
94 Ibid.
95 https://www.gov.uk/government/speeches/the-internet-and-pornography-prime-minister-calls-for-action
96 Ev 83
97 Ev 80
98 Q 207
99 Ev w137
100 Ev w138
101 Ev 81
102 Ev 82
103 Malicious software
104 Phishing sites aim to obtain personal data by deception
105 Ev w92
106 Ev 87
107 Westminster eForum Keynote Seminar, Childhood and the internet - safety, education and regulation, 29 January 2014
108 Ev 88
109 Ofcom Report on Internet safety measures: Strategies of parental protection for children online, 15 January 2014
110 Ibid.
111 Ev w111
112 Ev 109
113 Q 206
114 Ev w90
115 https://infogr.am/sid-2014-overview?src=web
116 Family Planning Association
117 Ev w100
118 Ev 112