Select Committee on Culture, Media and Sport Minutes of Evidence


Examination of Witnesses (Questions 320-330)

MR KENT WALKER

1 APRIL 2008

  Q320  Adam Price: That is not the point. It is a mistake you made as a company and the system is inadequate, surely.

  Mr Walker: No system is perfect. The argument is that in the vast majority of situations we do get to the right answer and we get to the right answer very quickly. The challenge is how to make a better system that continues to honour the desire for free speech and does not interpose a government or a business between individuals who are putting up perfectly legitimate, positive, pro-social messages and the small number of people who are abusing the rules, and that is the difficulty. There is no question that everybody regrets the fact that this video was on the site even for a minute.

  Q321  Mr Evans: Can I ask what changes you have introduced in the company since making that error to ensure it does not happen again?

  Mr Walker: A number of different things. The mistake that was made had to do with the way the individual reviewer coded the video, so that additional flags that came in were not immediately escalated to the beginning of the queue. We have made that much harder to do. It is now a double trigger; it needs to be reviewed twice. I do not want to go into the details because it would allow people who are trying to game the system to avoid the technologies. Where there are other signs of content being inappropriate, that allows for a secondary review. There are a number of other things I would be happy to talk about in a private session.
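
  The double-trigger escalation Mr Walker describes might be sketched roughly as follows. This is an illustration only, not Google's actual system; the class names, queue structure and flag threshold are all hypothetical.

    from collections import deque
    from dataclasses import dataclass

    # Hypothetical sketch of a "double trigger": a video that has already been
    # reviewed once is not silently dropped when new flags arrive, but is
    # escalated to the front of the queue for a mandatory second review.

    ESCALATION_FLAGS = 2  # hypothetical threshold for forcing a second review

    @dataclass
    class Video:
        video_id: str
        flags_since_review: int = 0
        reviews_completed: int = 0

    class ReviewQueue:
        def __init__(self):
            self.normal = deque()    # first-in, first-out reviews
            self.priority = deque()  # escalated items, reviewed first
            self.videos = {}

        def flag(self, video_id):
            video = self.videos.setdefault(video_id, Video(video_id))
            video.flags_since_review += 1
            if video.reviews_completed == 0:
                # Never-reviewed content joins the ordinary queue once.
                if video not in self.normal and video not in self.priority:
                    self.normal.append(video)
            elif video.flags_since_review >= ESCALATION_FLAGS:
                # Previously reviewed but still attracting flags:
                # escalate for a second opinion rather than ignoring it.
                if video not in self.priority:
                    self.priority.append(video)

        def next_for_review(self):
            if self.priority:
                return self.priority.popleft()
            return self.normal.popleft() if self.normal else None

        def record_review(self, video):
            video.reviews_completed += 1
            video.flags_since_review = 0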

  Q322  Rosemary McKenna: When you come across these things do you pass them to the local police immediately?

  Mr Walker: In that particular case we worked very closely with local police because the Internet, while it raises these challenges, is also a very transparent medium. The perpetrators of this crime are now captured on video in a way that is very powerful evidence for law enforcement to go after them, and the same is true for some of these other instances of cyberbullying or other places where people appear to have been aiding and abetting the underlying crime.

  Q323  Paul Farrelly: Very quickly, Mr Walker, you have admitted to us that there is not a single person within YouTube (owned by Google) who proactively monitors offensive material. You have just told us that, because of the amount of material your reviewers of flagged content have to look at, there are human failings and a gang rape video got through. The next question is: how many people do you employ at YouTube to look at material that is flagged as offensive? How many people?

  Mr Walker: Again it is a combination of—

  Q324  Paul Farrelly: How many people? It is a very simple question.

  Mr Walker: I think it is impossible for me to sort out the people who are doing physical review from the people who are engineers working—

  Paul Farrelly: Have a guess.

  Q325  Chairman: Can we leave it that we would ask if you could supply the Committee with further information?

  Mr Walker: We will provide everything that is public and that which will be useful to you.[2]

  Chairman: We are going to need to move on to our next session but Alan Keen has one or two questions.

  Q326  Alan Keen: Before we come on to the IT questions, you very rightly tried to avoid areas which are not your expertise but we do not often get an expert with your knowledge here. I have got four grandchildren under five and the world moves so quickly: what have we got to fear in the next ten years? Give us some idea. This must be talked about all the time. I know you concentrate on trying to stop the problems now but what do we need to be looking out for?

  Mr Walker: It is an interesting question. I would say that the Internet generally as a daily communication platform is a different way of working and communicating than anything we are used to. When we talk about user-generated content or social networking, we need to think about a world that opens up new vistas for kids and ways to communicate across the country, make friends, all of this, but also which creates this Second Life virtual world phenomenon which has a whole new set of challenges. I would not say dangers because I think, appropriately educated, kids can work in that environment, but it is a different way of presenting yourself to the world. I talk to my children about how an email is a different way of working in the world than talking to people face-to-face and that you have to be very careful about how you present yourself, what information you provide, and how people will see you. I think the Internet as a tool generally will only become more commonplace and worldwide as a communications platform, so you have to think a little bit about how that facilitates collaboration in the way you work with people in your company, community and your school group. If there is zero cost to communication, collaboration becomes much easier and much more powerful. That is generally a very good thing but it also creates these risks of anti-social content and conduct in creating new things. I think that is something to be watched but not pre-judged until the problem emerges.

  Q327  Alan Keen: Will technology enable us to control the bad things that can happen on the Internet? Is that developing? We were pleasantly surprised when we were looking at counterfeit reproduction of films that the signals go right through the film and it is not just catching a bit at the beginning. Should we be confident that things will get better and not worse?

  Mr Walker: In general that is a useful approach to new technologies and it has been the case for the last couple of decades, and I think in specific areas, yes. We have talked a little bit about watermarking to avoid copyright infringement. Watermarking is challenging because once it is broken, it is broken for ever, whereas fingerprinting or video identification is actually more powerful, because if you try to circumvent it by tilting your video camera a little bit or tinting something orange, we can adjust on the server side to match it against this digital library of Alexandria that we are accumulating of various kinds of video content, and technology turns out to be a very good and effective tool when used in a collaborative way. The next challenge for us is to focus on a lot of these offensive materials, which is a harder technology challenge. Technology will never completely substitute for human judgment, but the beauty of the Internet and of technology generally has been the ability to programme in intelligent rules: essentially, to discern from your query for "flowers" whether you are looking for a picture of a flower, you would like to purchase flowers for your spouse, you are looking to do research on flowers, or something else, to follow all of those different threads and give you the information that you are looking for. That is not because we have a lot of people working on it; it is because we have the ability to develop technology that incorporates some aspects of human intelligence and discerns your intent.
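
  The fingerprinting approach Mr Walker contrasts with watermarking can be illustrated with a toy sketch: reduce each frame to a coarse, brightness-normalised signature so that a slight tilt or an orange tint still matches a reference entry. This is an illustration only, not YouTube's video identification system; the function names, grid size and similarity threshold are hypothetical.

    from typing import List

    # Toy "fingerprint": a coarse frame signature that survives small edits
    # such as uniform tinting or minor cropping, matched against a library.

    def frame_signature(frame: List[List[int]], grid: int = 4) -> List[int]:
        """Reduce a grayscale frame (2-D list of 0-255 values, at least
        grid x grid pixels) to a grid of average intensities, thresholded
        against the frame's own mean so uniform tints do not change it."""
        h, w = len(frame), len(frame[0])
        cells = []
        for gy in range(grid):
            for gx in range(grid):
                ys = range(gy * h // grid, (gy + 1) * h // grid)
                xs = range(gx * w // grid, (gx + 1) * w // grid)
                vals = [frame[y][x] for y in ys for x in xs]
                cells.append(sum(vals) / len(vals))
        mean = sum(cells) / len(cells)
        return [1 if c >= mean else 0 for c in cells]

    def similarity(sig_a: List[int], sig_b: List[int]) -> float:
        """Fraction of grid cells on which two signatures agree."""
        return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

    def matches_library(upload_sigs, library, threshold=0.85):
        """Return the library title whose reference signatures best match the
        uploaded clip, if the average per-frame similarity clears the
        (hypothetical) threshold; otherwise return None."""
        best_title, best_score = None, 0.0
        for title, ref_sigs in library.items():
            scores = [similarity(u, r) for u, r in zip(upload_sigs, ref_sigs)]
            score = sum(scores) / len(scores) if scores else 0.0
            if score > best_score:
                best_title, best_score = title, score
        return best_title if best_score >= threshold else None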

  Q328  Alan Keen: Internationally in the negotiations you have—and people have mentioned China and Zimbabwe—do you see in your meetings and discussions that gradually the world will be educated and these regimes will change? It is Foreign Office policy in a way that we are talking about, but from the IT point of view you have got this knowledge, so what is happening; is it getting better?

  Mr Walker: I think it is getting better. It is a big challenge because you have one Internet and you have a global platform and 200 different countries and they are not used to a world in which everybody has their own digital printing press. That is very challenging particularly to regimes that have limited the distribution of information in the past. That said, you see international standards changing. There was an incident in Pakistan not long ago where the Pakistani Government was concerned about a video that was critical of the Government and they blocked YouTube from being accessible in Pakistan, but inadvertently blocked it from being available anywhere in the world. Any request that was going to YouTube was directed to Pakistan instead. Within a couple of hours, as a result of government-to-government communications and our intervention, they reversed that and made it universally available. We see a growing sense that undue interference with the Internet and undue blocking of information in inappropriate ways is becoming internationally less acceptable.

  Q329  Alan Keen: Coming on to boring issues but vital to this inquiry, can you tell us a bit about SafeSearch and the Byron Report recommendations on that? Can you enlighten us a little bit on that?

  Mr Walker: Sure. We have, as you know, SafeSearch as a default setting on Google search generally, which avoids the display of offensive or pornographic imagery that might otherwise be out there on the net. There is also the possibility of an enhanced level of SafeSearch that filters text as well. That is somewhat more complicated, because you do not want to filter out somebody who is searching for information about breast cancer or sex education or other appropriate materials, but it is available on request. We are also looking at a variety of other tools that we can provide to parents to further facilitate that. Again, as with many of these questions, it involves a question of balance, and of privacy concerns on that side, because to block a web application essentially requires you to know in a verifiable way whose computer it is and who is behind that computer, and that is something we continue to work on.
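
  The balance Mr Walker describes, filtering explicit text without blocking searches such as "breast cancer" or "sex education", can be illustrated with a deliberately naive sketch. It is not Google's SafeSearch implementation; the word lists and function name are hypothetical and far smaller than any real system would use.

    # Toy text filter: block explicit terms unless the query also carries a
    # health or educational context, showing why pure keyword blocking
    # over-filters legitimate searches.

    EXPLICIT_TERMS = {"porn", "xxx", "breast", "sex"}
    SAFE_CONTEXTS = {"cancer", "education", "health", "screening", "anatomy"}

    def safe_search_allows(query: str, strict: bool = False) -> bool:
        """Return True if the query should pass the text filter.

        strict=False -> imagery-only filtering (query text always allowed)
        strict=True  -> also filter query text, with a context allowlist
        """
        if not strict:
            return True
        words = set(query.lower().split())
        if words & EXPLICIT_TERMS and not (words & SAFE_CONTEXTS):
            return False
        return True

    # Example behaviour of the toy filter:
    assert safe_search_allows("breast cancer screening", strict=True)
    assert safe_search_allows("sex education resources", strict=True)
    assert not safe_search_allows("xxx videos", strict=True)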

  Q330  Chairman: I think we are going to have to draw a line here since Dr Byron has been waiting very patiently. Mr Walker, can I thank you very much.

  Mr Walker: Thank you, Sir.


2   Supplied in confidence.


 

© Parliamentary copyright 2008
Prepared 31 July 2008