Examination of Witnesses (Questions 280-299)
1 APRIL 2008
Q280 Rosemary McKenna: Would you
support the Byron Review's recommendation for an independently
monitored code of practice on the moderation of user-generated
content, and will you modify your terms and guidelines accordingly?
Mr Walker: We would support the
development of a voluntary code of practice. We would support
the notion of moderation in the sense of working to make sure
that inappropriate content, once identified, is promptly removed.
We lead the industry in how quickly we remove that content. We
would certainly be open to a discussion on our terms of service
and content which have continued to evolve. I would go back to
the point that the service has only been in existence for two
or three years and those terms have evolved over the course of
its history. As I mentioned before, we are looking at the possibility
of different approaches in different circumstances around the
Q281 Rosemary McKenna: Do you tell
people that if they upload videos of others committing crimes
then they too risk criminal conviction?
Mr Walker: We prohibit anything
that is deemed to be hate speech, that is deemed to be graphic
and gratuitous violence and anything that is humiliating or bullying
or otherwise negative. I would have to go back to see if we specifically
call out the risk of criminal prosecution in an aiding and abetting
Q282 Mr Sanders: When somebody signs
up to YouTube they are confronted with the terms and conditions
and you have to scroll through thousands and thousands of legalese
words. I suspect most people scroll to the bottom, tick the box
and off they go because they want to get going. Surely you could
do more. You could have a series of questions that confront people
and say, "Are you aware that if you upload X you will be
liable to prosecution, yes or no? If you do this [...],"
and so you make the process of signing up almost like one of those
learning programmes where you go through a series of questions.
You could do that, could you not?
Mr Walker: It is a classic question
of online consumer services. There is a desire on the one hand
to cover a wide variety of potential risks and concerns and problems
and a desire on the other to make something short enough that
people will actually read it. We have this issue with our privacy
policy, for example: to describe accurately all the things that are done with users' information, to benefit them and to customize services, it becomes a very long document. The concern has been that
the longer you make it the less people will read it. We try and
balance those two considerations by making it not as legalistic
as some and I am sure we could always do better, but we do go
back regularly and try to shorten that document to make it more
user-friendly and more likely that people will read it. There
would be an infinite number of boxes that could be ticked around
a wide variety of different things. Our ultimate question is to
what degree does the user understand their role and responsibility
online and to encourage them to do the right thing and not violate
the terms. Where they do violate the terms we respond very quickly and remove the offending content and let them know in no uncertain terms that their accounts will be suspended if they do it again.
Q283 Mr Sanders: Everybody has ticked
the box to say they have read and understood the terms and conditions,
but I doubt anybody has done a study to test whether they really
have read and understood the terms and conditions. It would not
be that difficult to put in, on a rota basis, some extra questions.
They do not need to be the same questions to everybody, but just
some extra questions in there that really make it clear that there
could be a penalty for misusing the system. That would offer that
little bit of extra protection at virtually no cost to yourselves.
Why do you not do it?
Mr Walker: It is certainly a fair
concern. I do not want to get into the micromanagement of product
design because the people back in Mountain View would not be happy
if I did. The tension would be between a form that had lots of
boxes that you needed to check all the way down. There is a temptation,
if you give them 20 different boxes to check, to say, "Yes,
yes, yes, yes, yes," all the way down.
Q284 Mr Sanders: That is not what
I am saying.
Mr Walker: I understand. That
is on one side. There may be some enhanced ways of making our
terms of service more visible or highlighting particular concerns
within them. Each different constituency has a different concern.
When I speak with representatives from the privacy agencies and
data protection agencies they tell me they would like us to put
more emphasis on the privacy protection side of our materials.
The copyright industry would like us to put more emphasis on the
risks of infringing copyright. Certainly protecting children is
a paramount concern and we need to make sure that that is highlighted
appropriately. There are a number of others. We try and calibrate
and harmonize all these different concerns. We think of our terms
and our approach to our terms as a work in progress. I take the
point. It is something that I will study when I get back to the
United States to see if there are additional things we can do.
At the end of the day, the question is, how do we really reach
the user in a tangible, visible way? We have a lot of material
on-site, tutorials and other kinds of educational material that
we have developed in some of these areas and we may be able to
do more in this area as well.
Q285 Mr Evans: You have talked about
this struggle between the freedom of speech on the one hand and
protecting youngsters on the other. Let us go back to China for
a second. You have got the battle between the freedom of speech
and the protection of a government which regularly violates human
rights. Should that not be an easy one for Google to sort out?
You are on the side of freedom of speech, surely.
Mr Walker: We are on the side
of freedom of speech and the goal is to maximize freedom of speech
and access to information around the world. It is a harder question
to say how best one does that. In China and in a number of other countries around the world, whose governments are not as democratic as we would like, you have to consider whether you serve that goal better
by absenting yourself completely from the country and not making
that service available to the citizens of the country or by trying
to engage in a constructive way and push the boundaries of what
is acceptable so that again more information is available on your
service than would be on contending services. We balance the desire
to be a socially responsible company in many areas with the desire
for free speech. We are working constantly on the copyright side.
It is a very hard challenge because you have 200 plus countries
around the world each with their own sets of laws and their unique
concerns. I alluded to some of these earlier on. It is in some
measure not for us to pick and choose which laws we like and which
we do not. On the other hand, at the margins there are certainly
situations where human rights or other concerns are such that
we have a difficult time. Another example would be a recent situation involving a video on YouTube which showed someone being beaten unmercifully: shocking, horrible stuff. We took it down immediately
only to discover it was actually posted by a well-known Egyptian
human rights blogger who had raised concern in his blog about
Egyptian police brutality. Not being able to find a place on his
blog to host a video, he had posted it on YouTube and cross-linked to it.
I suspect it was evidence in support of his concern about police
brutality and abuses in violation of the laws of Egypt. We have
situations in which one country is concerned about the citizens
of another country viewing material in a way that might create
a danger to the citizens of the first country, so we have extra-territorial
or extra-jurisdictional issues that are raised. It is a phenomenally
complicated set of issues and we do our best to work both in a
public and on a private basis with governments around the world
to try and get to a position of maximum free speech everywhere.
Q286 Mr Evans: I am just wondering
how best you serve the people of China when you put in Tiananmen
Square or Falun Gong into your search engine and it throws up
nothing because the Chinese authorities do not want that to happen.
Mr Walker: We do disclose to our users that the government has effectively edited the search results, and there is available on our site a link to our dot-com results.
A user in China, if they can get through the Chinese government's
own "Great Firewall", will have access to our unfiltered
comprehensive results. I recognise the issue.
Q287 Mr Evans: How do you make these
decisions? Do you have a group of people who sit down and say,
"China is worth $100 million to us so let's buckle a little"?
Mr Walker: I can candidly say
that I have been in on a number of conversations and never have
I heard a discussion about how much money we stand to make from
a given country. It is always, and I think legitimately, a balance
in terms of trying to promote free expression or the free exchange
of ideas while maintaining our ability to operate in-country and
keep our people in country safe from a rampaging mob, which we
have seen, or situations where our country managers have been
hauled in to police departments or otherwise threatened.
Q288 Mr Evans: Do you think the fact
that you have buckled to the Chinese authorities has done any
damage to the reputation of Google?
Mr Walker: I would not describe
it as having buckled. I think it is a complicated area. There
are those who have a view of the world that would have us essentially
keep our hands clean of any involvement in any government that
they did not like around the world. It has not been our approach.
We have tried to continue to maintain our message of free speech.
We have tried to minimise the possibility that user information
would be disclosed to governments that would use it for ends that
we would not approve of. At the same time, we have a policy of
cooperating with law enforcement around the world and many government
requests, even from governments that may have human rights issues,
are in fact legitimate. There are situations involving child pornography
or bank robbery or murder in every country around the world. Balancing
those two things is another set of issues we wrestle with.
Q289 Mr Evans: Have the Chinese authorities
ever asked Google to do certain things and you have turned round
and said, "No, we're not going to do that"?
Mr Walker: The YouTube example
would be one most recently.
Q290 Mr Evans: That one is not so
clear because I think the Chinese authorities wanted footage of
the rioters shown on television to show the Chinese people that
these are hideous people who are damaging property. The Chinese
people were quite happy to see video footage of that.
Mr Walker: I am not clear on that.
They did block, for example, Reuters' and the BBC's accounts of
the rioting that was going on. We have a team of people in China
who work on exactly these issues and push back against requests
from the Chinese government to try and minimise the scope and
duration and in some cases, as with the recent situation, they
simply do not filter it.
Q291 Mr Evans: You negotiate with
them, but if the Chinese turn round and say, "No, we want
this off," then it has to come off otherwise they will just
block you completely.
Mr Walker: Within the dot-cn site, the site that is now limited to search within China, that is correct.
It is a condition of doing business in the country that we comply
with the laws of the country. That is the difficult issue that
we wrestle with. At the same time, we try and minimise it. Compared
to any other search engine out there we provide more information.
We filter less at the request of the Chinese government than anyone
else out there. We think that brings a benefit to the people of
China. If they want to, if they are interested in doing so, they
have access, to the maximum extent we can provide it, to the global
Q292 Mr Evans: If it was Zimbabwe
and Mugabe's hideous regime asked you to remove certain things
surely you would tell them to get lost, would you not?
Mr Walker: It would depend at
some level on what the request was. If it was for child pornography,
I think we would probably honour that globally. If it was politically
sensitive material, that is a much harder question. We do not
have offices in Zimbabwe. We do not have, to the best of my knowledge,
a site targeted at Zimbabwe. In fact, as you will see if you go to YouTube or some of the search sites, there is a lot of information coming out of Zimbabwe about the progress of the election, concerns about
potential election rigging and the like. It has been a wonderful
channel for that sort of information. In Venezuela within the
last year, when the Chavez administration shut down various television
stations, several of those stations turned around and started
broadcasting on YouTube as an alternative way of providing information
to the citizens of Venezuela. It is an important back and forth
that we look at continually.
Q293 Mr Evans: I suspect that the
difference is China is huge, it is very powerful and there is
a lot of money there. Maybe Zimbabwe is not so clever and clearly
very poor. Google, like a lot of companies, probably takes a different attitude to China than it would to other countries that violate human rights.
Mr Walker: It is a challenging
game to sit down and assess the various criteria one would apply
to a different country with regard to its overall level of democracy,
level of transparency, independence of its judiciary, tolerance
for civil expression and then, taking all that into account and
the question of whether or not we have people in harm's way in-country,
what is the particular nature of the request that is being made?
Is it for child pornography? Is it for something that we genuinely
recognise as criminal? Is it for defamation, which might well
be recognised as criminal or at least a civil wrong in the United
Kingdom or in Europe? What if it is defamation of a political
figure? What if it is defamation of a dead person, Kemal Ataturk
or Mahatma Gandhi, which in the United States would not be defamation
at all but certainly is defamation in those countries? We try
and bring all of those factors to bear in the analysis of these
Q294 Mr Evans: At what level in the
company are these decisions made? Is it one person or is it a
group of people that make these decisions? Are they different
people for different countries?
Mr Walker: Right now our Executive
Management Group, which is the very senior people in the company,
has reviewed several of these issues. There is a woman on my staff
who we have christened "the decider" who is involved
in reviewing a variety of these things around the world. We have
policy teams, corporate communication teams and people on the
ground involved because these are close issues. There are different
points of view even within the company as to the right approach.
In most cases we are able to reach consensus internally. In some
cases we have escalated them to the senior management of the company
for decisions as to what to do.
Q295 Philip Davies: Could I give
you a real life example of some of the things that people have
been saying earlier on today about some of the content? It is
a paper you might not have come across, but as it is my local
paper it is the most important paper in the country and that is
the Bradford Telegraph & Argus. On the front cover
you will see video nasties with pictures of Buttershaw School
fights and Rhodesway Royal Rumble bitch fight 1 and bitch fight
2 all appearing on YouTube. Do you accept that this kind of gratuitous
violence happens partly because people want to put it up on YouTube
and that without YouTube some of these things would not happen?
Mr Walker: I think you are right
that there is a risk that any new form of communication often gets adopted first by the youngest people in that community
and can be used for many good things and for some bad things.
Gratuitous violence is against our policy so we try and remove
it very quickly. It is typically taken down within a matter of minutes once it is flagged. It is a concern. It is something we work
very hard on.
Q296 Philip Davies: You say you take
it down very quickly. How long does it take for you to remove
an item that is inappropriate like that?
Mr Walker: Once flagged, more
than 50% of that material is removed within half an hour. A large
majority of it is removed within an hour. In the longest cases it is one to two days, for material where it is harder to identify or figure out whether it is a documentary or whether it is promoting or glorifying violence.
Q297 Philip Davies: Do you not think
that is slightly lax, that once flagged you say you will take
things down as quickly as possible? Do you not think that you
have a duty, given that it is your site, to have people monitoring
what is going on there and proactively taking things down like
this, which are completely inappropriate and I think everybody
would agree is totally unacceptable, rather than waiting for somebody
somewhere to flag it up when they are thinking, "I don't
need to flag it up because surely somebody at YouTube is going
to be looking out for this and taking it down"?
Mr Walker: The community does
very actively flag. We get hundreds of thousands of flags every
day. The larger question you raise is what is the model of the
Internet? Is it more like broadcast television or a newspaper?
Is it more like the telephone system, a communication platform
that is used for one-to-one or a few-to-few kinds of communications?
It clearly has aspects of both. It would be hard to have the rules
of television or the requirement of impartiality applied to YouTube.
We have this "long tail" problem where there is an awful
lot of content. Whether it is on YouTube, Blogger or MySpace,
the notion of having somebody pre-clearing your content before
you posted it to a MySpace account or to a Bebo account or pre-clearing
a blog which you were going to write before you are allowed to
publish it on the Internet because it might have offensive content
in it has not been the way the Internet has worked. Looking at the volume of material uploaded just to YouTube, it is hundreds of thousands of video clips in a day. When you add in
all these other services it is in the millions. The effective
way of doing it has been to be responsive to problems, to respond
very quickly to problems and remove them. Going back to your earlier
notion of media literacy, it is to make sure people understand
that there will be stupid things posted; there will be misuses
and abuses of the platform in the same way that there is graffiti
scrawled on school walls. You learn to disregard it, to recognise
it is inappropriate. Parents work with children to make sure that
they are viewing things that are age appropriate for them and
they know how to deal with the challenges and the risks of the
Internet as they would with the risks of a school or a park.
Q298 Philip Davies: It is one thing
for stupid things to be put on, but it is another thing if you
are a young kid at school and you have been beaten and punched
and kicked just so that somebody can get some pleasure by putting
it up on your site. I was not quite sure from your answer whether
or not you employ teams of people or use some kind of system to
flag up to yourselves anything that is inappropriate so that you
can proactively take it out?
Mr Walker: It is a mix. Primarily
it is the three-legged stool I referred to earlier. First of all,
it is community flagging. Those flags are quickly reviewed by
a human team of reviewers and there is some automation at the back end to make sure that, once removed, something does not get reposted. That automation then comes back to the front end again.
If you are posting something that has previously been pulled down,
that would be blocked initially and will never go up. We are working
on additional software tools to identify material such as pornography
and prohibit it on the site. If we can recognise that and hold
it until it can be reviewed, that is something that we continue
to look for.
Q299 Philip Davies: How many people
do you employ to monitor what is going on? You have said that
you get hundreds of thousands of things a day being put on there.
How many people do you employ to monitor what gets put up there?
Mr Walker: It is a variety of
different teams that are working on it. Our primary focus is on
the tools and the development of something that facilitates what
I think is actually the industry-leading responsiveness on the
speed of review. If we are reviewing more than 50% of the things
within half an hour, the large majority within an hour, the queues
are short. The challenge is not so much the review teams but in
making sure that we have identified the things quickly. It comes
back to the earlier point that was made about how to enhance community
flagging. How do we enhance that process and potentially empower
the community itself to be able to identify and remove material
and suspend the appearance of material pending review?