Examination of Witnesses (Questions 300-319)
MR KENT WALKER
1 APRIL 2008
Q300 Philip Davies: There is one
argument you made that I cannot quite follow. You said that you
do not really want to go down the line of somebody like YouTube
looking at things before they go up there to decide whether or
not they are worthy of going on and that that would be a restriction
on free expression. Once something is flagged up to you, you then review it to see whether it is acceptable or not. What is the
difference between reviewing it before it goes on to see if it
is acceptable and reviewing it after it has gone on to see if
it is acceptable? Surely you are going through exactly the same
process.
Mr Walker: In the United States we would have concerns about what we call "prior restraint"; I am not sure if the language is the same in Britain, but I suspect the concept is. We have hundreds of thousands of pieces of material being uploaded to YouTube, or millions of pieces if you start to include Blogger, MySpace, Bebo, shared documents and comments going up on videos, a huge amount of user-generated content. This is what is called
"Web 2.0" and it is where you have empowering of the
individual consumers to create their own content. If you try to
take that vast amount of content and pre-screen all of it, it
is neither efficient nor effective and would burden the process
of creation. Think of the delay that would be occasioned between the time you tried to edit a document or post a comment and the time, some days later, when it would appear online. It is a very different
model to anything the Internet has ever had.
Q301 Philip Davies: What you are
saying is it would be a pain in the backside to do that. I do
not dispute for one minute that it would be hopelessly inconvenient
and a complete pain for you, but if it would stop things like
this from going on in some of my local schools would you not agree
that it is a price worth paying?
Mr Walker: It is not the price
for us that I would be concerned about but the price for the user.
It is a somewhat different model to what we have had before. A
page like this gets worked on by 20 different people. It is on
the Internet; it is user-generated content. If I change a comma,
should that need to be reviewed before it goes online? I recognise
there is a spectrum, from something that gets viewed millions
of times to the millions of things that get viewed only a small
number of times, but all of that is user-generated content. If
you have a correction to a Wikipedia article, should it need to
be pre-reviewed before you can move the comma? This is the challenge
that we face in trying to come up with a scalable model that
works for all forms of user-generated content. The effective and generally fairly efficient approach, and the one that has safeguarded the user experience and the value of the Net, has been to respond very quickly to problems. It is the same as in the offline world. We do not have policemen on every street corner stopping things from happening. We have policemen who respond very quickly when problems occur, and consequences for violating the rules. In this case the video would be removed, you would lose your account and you would be kicked off Google services if you were guilty of violating the rules.
Q302 Paul Farrelly: Mr Walker, it
was a very simple question that my colleague Mr Davies asked you
which you did not answer. How many people does YouTube employ
to proactively review and take down inappropriate content irrespective
of whether it has been flagged or not?
Mr Walker: The answer is that we do not proactively review in a human way...
Q303 Paul Farrelly: Not a single
person?
Mr Walker: We have automated tools which review material to see if it has been previously flagged and will stop its recurrence. We are developing tools, again to the point that we are a technology company...
Q304 Paul Farrelly: Not a single
person is the answer?
Mr Walker: Devoted to prior restraint of user communication, that is correct.
Q305 Paul Farrelly: Let us take a clear-cut example. Somebody perverse uploads a piece of child pornography; how does that get removed from YouTube before anyone flags it?
Mr Walker: There are two potential ways. One, we are working very hard on various forms of automated filters that will detect...
Q306 Paul Farrelly: How does it now
get removed?
Mr Walker: In some cases I believe some of these filters are already being used to identify pornographic content. It is difficult to distinguish child pornography from general pornography, but the advantage is that both are illegal or unacceptable on YouTube, so we do not allow either.
Q307 Paul Farrelly: How does it work?
Until someone flags it...
Mr Walker: If an individual piece
of video has been uploaded before and we have ruled it a violation
of our terms of service or illegal...
Q308 Paul Farrelly: I am talking
about a new piece. You are saying somebody has got to flag it
to you before it is taken down.
Mr Walker: I am saying that that
is an important part of the way the material comes down. I think you will find that there is very little pornography, let alone child pornography, on the site because of that. Because users know that if they upload something it is going to come down very quickly, it is almost not worth the bother of putting it up in the first place. Either it is blocked because it has already been taken down before, or it is blocked because our system has identified it as pornographic, or it is blocked because the first person to look at it sees that it is clearly child pornography, flags it, and it comes down within a matter of minutes. The game is not worth the candle.
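[Illustrative note: the blocking routes Mr Walker describes amount to checking each new upload against fingerprints of previously removed material, then against an automated classifier, and finally relying on user flags and human review. The sketch below is a hypothetical simplification of that pipeline, not Google's actual system: real re-upload blocking uses perceptual video fingerprints that survive re-encoding, whereas an exact file hash is used here only for brevity, and the classifier threshold is invented.]

    import hashlib

    # Fingerprints of videos previously ruled in violation and removed.
    removed_fingerprints: set[str] = set()

    def fingerprint(video_bytes: bytes) -> str:
        # Stand-in for a perceptual fingerprint: an exact content hash.
        return hashlib.sha256(video_bytes).hexdigest()

    def record_takedown(video_bytes: bytes) -> None:
        # Remember removed material so identical re-uploads are blocked.
        removed_fingerprints.add(fingerprint(video_bytes))

    def screen_upload(video_bytes: bytes, porn_score: float) -> str:
        # Route 1: it has already been taken down before.
        if fingerprint(video_bytes) in removed_fingerprints:
            return "blocked: previously removed"
        # Route 2: an automated filter identifies it as pornographic
        # (0.9 is a hypothetical threshold).
        if porn_score > 0.9:
            return "blocked: automated filter"
        # Route 3: published, relying on user flags and human review.
        return "published: awaiting flags"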
Q309 Paul Farrelly: Clearly it would affect YouTube's business model, and therefore your profitability, if it had to employ banks of people to do this. Does your business model not abdicate any responsibility that you have?
Mr Walker: I would say the reverse is true. We make no money from unacceptable content such as child pornography or from offensive content, and have no desire to. There is a much larger risk to our brand reputation, and we put vast amounts of money, time and effort into trying to detect and remove this material from the service. We would be, from a business perspective, much better off if none of this had ever hit the site.
Q310 Paul Farrelly: Last year a teenager was murdered in random violence in Liverpool and the press published screenshots of rants from YouTube videos glorifying gang violence. I am sure this is not just a UK issue;
it is an issue in America as well. When I made some comments about
that, YouTube's response was to say, "We're not in the process
of editorializing. It's nothing to do with us, Guv," and
subsequently I have been pestered by teams of YouTube PR people
seeking to meet me and educate me. If those people were employed doing something about that sort of content, or if the law as it stands gave you more of an incentive to do that, I would feel much happier, particularly as the father of young children.
Mr Walker: Let me apologise for any pestering. I am sure it was not intended. The general point is right: we do feel a sense of responsibility, as I am sure you and parents everywhere do, to try to make sure that this sort of material does not get uploaded. Material that promotes violence violates our rules and should not be on our site.
Q311 Paul Farrelly: There was another
case recently where a young girl of 15 was prosecuted and sentenced
to detention for being an accessory to manslaughter and possibly
murder because of the filming of gratuitous violence that was
uploaded to YouTube. If the law were strengthened so that, by virtue of owning YouTube and broadcasting that material, you might also be prosecuted as an accessory unless you took the issue more seriously, that would give you more incentive to sharpen up your act, would it not?
Mr Walker: I do not think of ourselves
as a broadcaster, I think of ourselves as a communications platform.
I assure you, we take the issue extremely seriously right now.
The question is not a lack of will.
Q312 Paul Farrelly: But you do not
employ a single person.
Mr Walker: To do something that we ultimately think would be the wrong approach. We employ many people to get involved in this very complicated balancing of the chilling of free speech on the one hand against the elimination of harmful or offensive content on the other, and that is ultimately the right path. There are problematic phone conversations that go on every day in which people plan criminal offences. No one would think to impose a requirement on the phone company to monitor phone calls, which would probably be effective in reducing the use of telephones to commit criminal acts; the cost on the other side, the invasion of privacy, would be thought undue. Here again, I think trying to create a model which turns the Internet into a monitored broadcast medium, where everything you want to post to YouTube or MySpace, whether it is a comment on a blog or a blog itself or even an email which goes out to 100 people, has to run through a filter before it is made public...
Q313 Chairman: I think our approach
would be to suggest to you that your corporate slogan might not
just be "Do no evil" but perhaps "Take an active
role to prevent others doing evil". I understand about the amount of material that is posted. Could you confirm whether ten hours of video content uploaded every minute is correct?
Mr Walker: That is currently correct, and the figure goes up every day.
Q314 Chairman: That video content
is tagged. You do not need to look at every single minute of video
content. Surely you could have people who would look at the video
content which is tagged with labels which suggest it could be
inappropriate. If something is tagged "rape" then you
might decide that would be worth looking at rather than waiting
to see if somebody reports it.
Mr Walker: We look at a variety
of different signals. Key words might be something to take into
account. The challenge is that when you go down that path and someone is posting a comment on "Sex and the City", you might well get an awful lot of material that is not problematic.
Q315 Chairman: If you were to narrow your search by looking at the material which is tagged with labels suggesting it could well be inappropriate, then you would not have to be looking at the ten hours going up every minute; you could actually employ some people specifically for this purpose.
Mr Walker: At the end of the day, I think we all agree that the goal is to minimise the amount of controversial material that is on the site. What is the most effective way to block it, not the least expensive, but the way that is best for the user experience? It may be some combination of an analysis of the material being uploaded, through technological tools; an analysis of the labels that are going on; an analysis of the history of the user, if they have previously posted problematic material but not so much that their account has been suspended; and an analysis of how many people are viewing an item or have viewed other items in the past. We take a lot of different signal data into account. Certainly it is a fair suggestion and it is one we will continue to look at.
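[Illustrative note: a triage of the kind Mr Walker and the Chairman discuss could combine these signals into a single priority score for human review. The sketch below is hypothetical; the signal names, weights, keywords and threshold are invented for illustration and are not YouTube's.]

    # Hypothetical priority scorer combining the signals mentioned above:
    # automated content analysis, labels, uploader history and viewership.
    SUSPECT_LABELS = {"rape", "gang", "fight"}  # example keywords only

    def triage_score(content_score: float, labels: set[str],
                     prior_violations: int, views_per_hour: float) -> float:
        score = 0.6 * content_score                       # automated analysis
        if labels & SUSPECT_LABELS:                       # suspicious tags
            score += 0.2
        score += 0.1 * min(prior_violations, 3) / 3       # uploader history
        score += 0.1 * min(views_per_hour / 1000.0, 1.0)  # rapid viewership
        return score

    # Items scoring above a (hypothetical) threshold jump the review queue
    # rather than waiting for a user flag.
    def needs_priority_review(score: float, threshold: float = 0.5) -> bool:
        return score >= threshold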
Q316 Chairman: I only raise it because
there was a case in the UK very recently of a woman who was gang
raped and a video was then uploaded to YouTube. It was viewed
by 600 people before it was taken down.
Mr Walker: There were 600 views; we believe it was a much smaller number of individuals, but I am familiar with it. Clearly it was a mistake on our part in the course of human review. Our reviewers review a lot of material and in some cases simply make a mistake.
Q317 Chairman: You say it was a mistake
on your part, but it would have been possible to take it down
earlier, would it not?
Mr Walker: The initial flag was reviewed and the individual reviewer, who had reviewed a huge amount of material, did not take it down promptly upon reviewing it. I do not know exactly what happened, but it was a mistake. It was a tiny, tiny, infinitesimal percentage of the material that we review.
Paul Farrelly: That is incredible.
Q318 Adam Price: How could you make a mistake like that? How can you view gang rape and not see it for what it is?
Mr Walker: The challenge points to the difficulty of human review, and shows that the answer is not always to put more people on this.
Adam Price: Come on!
Paul Farrelly: Do you know how absurd
you are sounding?
Q319 Adam Price: You are defending
the indefensible now. People will find this deeply objectionable.
You cannot defend that. No reviewer could view that kind of content
and not understand it for what it is. Surely that single case
is enough for you to realise that your approach is completely
inadequate. How can you defend that?
Mr Walker: I do not mean to defend it. Certainly the rape itself and the underlying content are abominable and no one would defend them.