CHAPTER 4: OTHER ISSUES
Children
71. Society grants leniency to children for
some behaviour which would be prosecuted as criminal if done by
an adult. In England and Wales, children below the age of 10 are
not generally held to be capable of committing a crime. Special
guidance from the Director of Public Prosecutions applies when considering
whether to prosecute a child between the ages of 10 and 18. This
is not because society considers that children between the ages
of 10 and 18 may behave with impunity; instead it considers that
it is usually more proportionate for parents and schools to take
remedial action and to educate the child as to appropriate behaviour.
72. The Code for Crown Prosecutors says that
prosecutors must have particular regard to:
"Was the suspect under the age of 18 at the time of the offence?
The best interests and welfare of the child or young person must
be considered, including whether a prosecution is likely to have
an adverse impact on his or her future prospects that is
disproportionate to the seriousness of the offending.

As a starting point, the younger the suspect, the less likely
it is that a prosecution is required.

However, there may be circumstances which mean that, notwithstanding
the fact that the suspect is under 18, a prosecution is in the
public interest. These include where the offence committed is serious,
where the suspect's past record suggests that there are no suitable
alternatives to prosecution, or where the absence of an admission
means that out-of-court disposals which might have addressed the
offending behaviour are not available."
73. This is generally thought to be proportionate
and appropriate: the criminal justice system can intervene when
it needs to do so.
74. Our inquiry is limited to consideration of
the law. It strikes us though that parents and schools have a
responsibility generally to educate children: children need to
be taught that being horrid online is just as wrong and hurtful
as being horrid face to face. Similarly, parents have an essential
responsibility to protect their children from harm on the internet
as they do when children are in any other public space. Schools
have an opportunity to alert parents when they detect
that parents might need to intervene. How most appropriately and
effectively to approach this is a matter we have not considered.
It strikes us as unlikely that simply banning access would be
effective.
Balances: law v policy interventions
75. We have limited this inquiry to an investigation
of the law, but the law is rarely the most effective tool for
changing behaviour: effective law tends to reinforce, rather than
in itself change, social attitudes.
76. At present, the law prohibits people from
sending grossly offensive messages but people send them nonetheless,
and in great number, in part due to the ease with which the internet
and social media facilitate communications. The threshold for
prosecution is rightly high. This prevents the courts from being
overwhelmed with inappropriate cases, but it does not reduce (let
alone prevent) inappropriate complaints to the police. As John
Cooper QC put it: "the police are being inundated with spurious
complaints … They cannot investigate every transgression
on the social media".[30]
The consequence is that there is every chance that offences which
deserve to be prosecuted will not be, due simply to the volume
of complaints.
77. A victim has to be confident that an offence
has been committed; the police constable to whom the offence is
reported needs to understand what offence has been committed and
whether it is proportionate initially to treat the matter as criminal
or whether some other course of action should be taken. Other
than gradual, general social education, there is no efficient
way to address this. The advertisement of the law and of rules
on websites is desirable, but not very effective. The widespread
publicity given by the traditional media to the conviction of
people prosecuted for committing offences using social media does
more to educate than any advertisement. We welcome the efforts
of the police to educate themselves about the relationship between
social media and criminal offences and hope that this will extend
to the officers with whom the public are most likely first to
come into contact.
78. In the light of the volume of offences, society
has four options: i) do nothing and accept the status quo; ii)
add resources so that more allegations can be investigated and
prosecuted; iii) change the law so that the behaviour is no longer
criminal; iv) retain the law and approach to prosecutions, but
seek to change behaviour through policy interventions.
Website operators
ATTITUDE
79. Both Facebook and Twitter presented themselves
to us less as corporations responsible as legal persons under
the law, and more as communities who operate according to their
own rules.[31]
80. Those rules can be admirable: Facebook has
a real name culture, a set of community standards (e.g. regarding
nudity), enables people to control their own privacy, and enables
the reporting of abuse;[32]
Twitter has rules against threats of violence, targeted harassment
and similar conduct. Other operators are less responsible. Irrespective
of the responsibility of the website operators, the behaviour
with which we are concerned is criminal.
MONITORING
81. The number of staff employed to consider
reports of content or conduct is inevitably inadequate to the
scale of use of the website. Globally, Facebook employ "hundreds"
of people in this area; Twitter "in excess of 100".
82. Facebook has developed technology to prevent
or quickly stop the posting of certain material, for example child
sexual exploitation.[33]
Similarly, systems urgently flag for human intervention the most
serious types of report, such as suicide or self-harm[34]
but the systems are not perfect because the traffic on the site
is varied and can spike unpredictably. We received no evidence
about the speed or proportionality with which less serious types
of report were processed.
83. These actions in our opinion have been driven
by the companies' own values and by the market, not by law. Many
website operators are significantly less responsible.
84. We encourage website operators further
to develop their ability to monitor the use made of their services.
In particular, it would be desirable for website operators to
explore developing systems capable of preventing harassment, for
example by the more effective real-time monitoring of traffic.
SELF-HELP
85. Every user of Facebook can control, through
privacy settings, the extent to which other users may interact
with them. Facebook has introduced a tool to report abuse, and
also a tool whereby user A may ask user B to remove a post (usually
a photograph) in which user A is portrayed. Facebook told us that
in 85% of cases, user B complies.[35]
86. Self-help, such as the ability to block
sight of abuse, is valuable, but its value is limited when the
abuse remains in the public domain. We encourage website operators
further to develop the effectiveness of measures to enable individuals
to protect themselves when using social media services.
87. It would be desirable for website operators
to publish statistics on monitoring and self-help.
LIABILITY AT LAW
88. A European Union directive[36]
has harmonised provision on electronic commerce, including the
liability of websites which host content originated by others.
That directive is implemented in United Kingdom law in the Electronic
Commerce (EC Directive) Regulations 2002 (SI 2002/2013). Those
regulations give immunity to websites from damages or criminal
sanctions where they act merely as a conduit, cache or host, so
long as they operate an expeditious "take down on notice"
service. This acts as an incentive to website operators to remove
illegal or actionable material. It is for the website itself to
determine whether the material which it has been asked to remove
is genuinely illegal or actionable.
89. The Defamation Act 2013 goes one step further.
Section 5 gives the operator of a website a defence to an action
for defamation if the operator can show that it was not the operator
who posted the statement on the website. The defence is defeated if
the claimant shows that it was not possible for the claimant to
identify the person who posted the statement, the claimant gave
the operator a notice of complaint in relation to the statement,
and the operator failed to respond to the notice of complaint
in accordance with regulations made by the Secretary of State.[37]
The Act thus incentivises website operators not only to operate
an expeditious and proportionate "take down on notice"
service but also to be capable of identifying people who post
statements using their websites.
90. Parliament has thus accepted the view that
the liability of website operators should be limited in respect
of content they host but which they have not originated. The 2013
Act is, however, significant in being the first statute in this
country to link immunity from liability to disclosure of the identity
of the person who made the statement. It might well prove desirable
to extend this approach to criminal offences capable of being
committed using social media, but it would be premature to decide
until society has useful experience of the Act's operation.
91. Website operators are not necessarily liable
as accessories to crimes committed using their services. The law
could be changed to clarify this.
92. Another approach might be the establishment
by law of an ombudsman, funded by website operators, to set policy
and consider complaints in this area. Although not a solution
to every problem, it is desirable to have a well-developed system
of self-policing and self-regulation.
JURISDICTION
93. It is trite but necessary to say that the
global nature of the internet raises difficult questions as to
jurisdiction. Facebook and Twitter offer their services across
the globe, as do most social media website operators. A fundamental
benefit of the internet is the way in which it has interconnected
the whole of the world. Facebook and Twitter are both publicly
listed companies incorporated in the United States of America
which operate data centres in a number of countries but not the
United Kingdom.[38] They
are by no means unusual in operating in this way. When a website
operator develops a technology automatically to prevent something
bad, it inevitably needs to do so to some common international
standard: it is not feasible that it should consider the drafting
of section 1 of the Malicious Communications Act 1988. It is though
feasible that every democratic state should expect automatic cooperation
from website operators in relation to the detection and prosecution
of crime. Similarly, there is at present inevitable uncertainty
as to the ability of our courts to try offences when the person
committing the offence, the host or publisher and the victim might
each be based in a different country. The only way as we see
it to resolve questions of jurisdiction and access to communications
data would be by international treaty.[39]
The question is though relevant to many more areas of the law
and public protection than criminal offences committed using social
media and is politically contentious in most countries. This raises
issues beyond the scope of this inquiry.
30 Q 7
31 QQ 25, 26; cf paragraph 63
32 Q 26
33 Q 26
34 QQ 27, 30
35 Q 26
36 Directive 2000/31/EC on electronic commerce, articles 12 to 15.
37 Defamation (Operators of Websites) Regulations 2013 (SI 2013/3028)
38 When we asked both Facebook and Twitter to give us specific data
about an element of their operations, they were unwilling to do
so. We found both companies obliging witnesses but, if we had
wished to press them for the data, we would have had no power
to compel its release because neither company operates formally
in the United Kingdom.
39 The Data Retention and Investigatory Powers Act 2014 has sought
to extend the extraterritorial effect of the Regulation of Investigatory
Powers Act 2000.