UNCORRECTED TRANSCRIPT OF ORAL EVIDENCE To be published as HC 353-iv

HOUSE OF COMMONS

MINUTES OF EVIDENCE

TAKEN BEFORE

CULTURE, MEDIA AND SPORT COMMITTEE

 

 

HARMFUL CONTENT ON THE INTERNET AND IN VIDEO GAMES

 

 

Tuesday 1 April 2008

MR KENT WALKER

DR TANYA BYRON

Evidence heard in Public Questions 258 - 382

 

 

USE OF THE TRANSCRIPT

1. This is an uncorrected transcript of evidence taken in public and reported to the House. The transcript has been placed on the internet on the authority of the Committee, and copies have been made available by the Vote Office for the use of Members and others.

 

2. Any public use of, or reference to, the contents should make clear that neither witnesses nor Members have had the opportunity to correct the record. The transcript is not yet an approved formal record of these proceedings.

 

3. Members who receive this for the purpose of correcting questions addressed by them to witnesses are asked to send corrections to the Committee Assistant.

 

4. Prospective witnesses may receive this in preparation for any written or oral evidence they may in due course give to the Committee.

 


Oral Evidence

Taken before the Culture, Media and Sport Committee

on Tuesday 1 April 2008

Members present

Mr John Whittingdale, in the Chair

Philip Davies

Mr Nigel Evans

Paul Farrelly

Alan Keen

Rosemary McKenna

Adam Price

Mr Adrian Sanders

Helen Southworth

________________

Memorandum submitted by Google

 

Examination of Witness

Witness: Mr Kent Walker, General Counsel, Google, gave evidence.

Q258 Chairman: Good morning, everybody. Welcome to this further session in the Committee's inquiry into harmful content online and in electronic games. Since our last meeting our deliberations have been informed by the publication of Dr Byron's report and we will be hearing from her later, but before that I would like to welcome Kent Walker, the General Counsel of Google and to thank you for calling in on us on your way back from South-East Asia. I know you would like to make a brief statement to the Committee first.

Mr Walker: Thank you very much, Chairman. We appreciate the chance to speak with you this morning. I am Kent Walker. I am the General Counsel of Google Incorporated. We are one of the world's leading search engines and also a significant player in a number of other forms of Internet content, including YouTube, Blogger, Gmail and various forms of geospatial navigation (?). My personal background includes having worked, not only for Google, but for eBay, for Netscape, for America Online and having served as a Public Prosecutor for the Department of Justice in the United States. Also, my wife and I are the parents of three children, aged 14, 11 and eight, who are active Internet users, so I see these issues from a variety of perspectives. It is clear that the Internet has been a wonderful thing for the world. It has changed the way people communicate, live and work, it has made access to information easier and, significantly, has allowed a generation to grow up as creators of content, not just consumers of content. When many secondary school students are generating their own pages online, on Bebo and MySpace, Facebook, Blogger and YouTube, they shift into being, not just consumers, but authors and artists, and really change their relationship with the media. This is an important part of the message of media literacy that is central to the inquiry today. That is not to say that the Internet is an unalloyed good. There are issues and challenges with all new technologies and all new communication platforms. We are focussed very much on the difficult question of balancing the risk of chilling free speech with the very real risk of harm to children and offensive content more generally on the Net. We think of this as a clash of valid but contending arguments on each side. We struggle every day around the world with different formulations of these questions. We see it as a shared responsibility.
Certainly parents have a role to play, as Dr Byron has pointed out in her report, in helping their children become media literate, to oversee them, to educate them, to make them better consumers and producers, authors or creators of content. Government has a role to play. I know that Ofcom has taken a leading role in media literacy here. Other government organisations have worked closely with us and with others to educate children and to work on the law enforcement side of things where appropriate. Industry has a significant responsibility to work on our own platforms and with our users to encourage safe and sophisticated use of the various tools that are now available. We do a number of things both at Google and YouTube to help balance this desire to make the world's information universally accessible and useful and to protect children and to deal with offensive content. We provide SafeSearch technology in our search engine to address the risk of inappropriate photos and text where appropriate. We have guidelines on YouTube and on our other community sites that set down strict, clear and plain English formulations as to what is acceptable and what is not. We have a variety of different tools to remove material that violates those guidelines. We work very closely with a number of different government organisations and non-profits around the world and in the UK to try to encourage media literacy and the safe use of our platforms and, where appropriate, we co-operate quite closely with law enforcement to deal with cases of abuse and improper content that is online. It is very early days in the history of the Internet. Google has existed as a company for nine years and YouTube for less than three years. We think that this balanced approach has worked quite well in fostering the growth of a new and dynamic medium; it has brought a lot of benefits, while making appropriate and increasing efforts to deal with the problems that have arisen.
We recognise it is a continuing conversation and we welcome the discussion this morning.

Q259 Helen Southworth: I would like you to focus on some of the issues around corporate responsibility but very specifically on child protection. What annual budget does Google set in the UK for online child protection?

Mr Walker: Because our platforms are global we do not break up by country. I would say we spend millions of pounds a year on a variety of different things across all of our different products and that includes specific contributions to child protection organisations and those contributions are both in cash and in kind. One of the most valuable things that Google can do is give advertisements to services that are hoping to protect children so that when somebody puts in a search term like "suicide", for example, a user or a child is directed to a suicide prevention line or a suicide hot-line. We do an awful lot of that. In addition, I think a lot of the work we are doing with regard to filtering and evaluating content to make sure that there is very little chance that an inappropriate thing slips through is in a sense a very major contribution, tens of millions of pounds, perhaps hundreds of millions of pounds, across all of our different platforms.

Q260 Helen Southworth: Quite a number of those things are general protection in society and I know we are going to be exploring those further. Would it be possible for you to let us have some information that is specifically in terms of your budget on child protection and the areas that you are specifically covering as follow-up information?

Mr Walker: Yes.

Q261 Helen Southworth: May I ask you about the reporting of inappropriate content or reporting of abuse, bullying, fears and concerns, some of which could be of a criminal nature? CEOP has given us evidence about a very successful reporting process that they have set up. Will Google be participating in that so that people can report direct to CEOP and the triage process could be carried out very rapidly to protect very vulnerable children?

Mr Walker: Yes, very much so. We work very closely with CEOP and with the National Center for Missing and Exploited Children in the United States, as well as with a variety of groups like "Don't you forget about me" and other non-governmental groups focussed on missing and exploited children. We not only work on the law enforcement side of that in responding to requests for information, we receive thousands of requests and hundreds of subpoenas each year and we provide information that is useful to law enforcement on that front, but we also work in a more collaborative fashion with these different organisations, and we look forward to continuing to do that.

Q262 Helen Southworth: Are you going to be making space available for a direct reporting process on the front page?

Mr Walker: Within Google we have a variety of different products. I think most of the concerns and requests are focused on YouTube. YouTube does have a direct online report page in addition to the community flagging, which you may be familiar with, when there is an inappropriate video. There is also an opportunity for people to communicate directly with YouTube staff about particular concerns.

Q263 Helen Southworth: We have been told that there are some cases where it is absolutely essential that law enforcement agencies are the first point of reporting and can take immediate action to protect children in the most difficult cases. Are you going to put a direct report line through to CEOP?

Mr Walker: I would be interested in talking further with CEOP about that. We have in many cases a global platform, so we need to find something that will work for law enforcement around the world. I think we have had a very positive and productive working relationship with them thus far. I think we will be able to do some things to further that collaboration. I am not in a position to commit to specific implementation at this point.

Q264 Helen Southworth: I think we would be very interested to hear something further about that.

Mr Walker: I appreciate the concern.

Q265 Chairman: You stressed the global nature of the Internet and Google is obviously active in a very large number of countries around the world. In some of those countries material may be deemed inappropriate or illegal whereas in others it may be seen to be acceptable. How do you deal with that? How do you stop somebody in country A accessing material which is illegal there from going and getting it from country B?

Mr Walker: It is an extraordinarily complicated and difficult question and one that I have spent a large amount of time on, as have my staff and teams of people around the world. Several examples come to mind. One that was made public within the last few months was criticism of the King of Thailand, who is a demigod in Thai Buddhism. The notion of criticism was offensive to the government and to many Thai people, but at the same time there were political overtones to much of the criticism. We did not want to be in the business of censoring political comment. We worked very carefully with the Thai government to respect Thai law, to make material that was illegal in Thailand unavailable within Thailand, but without going to the extreme of letting one country, any country, dictate the global content of the Internet. We have done this in a variety of different scenarios. Germany has particular concerns with regard to Nazi paraphernalia, for example, India has concerns about things that are defamatory toward Mahatma Gandhi or Turkey toward Kemal Ataturk. We work carefully with a variety of different tools to both block and, where appropriate, remove. There is certain material, for example, child pornography, which is generally thought to be abhorrent and unacceptable worldwide and in those cases we would simply remove it from the site. In other cases where there is a narrower or more focused concern we have a variety of tools which we can use to limit access.

Q266 Chairman: Let us look at China where there has been some controversy. You have gone along with the requirements of the Chinese authorities in terms of what they are able to permit their citizens to see.

Mr Walker: China is a very difficult area. I would note that in the course of the last couple of weeks there have been a number of videos uploaded on YouTube with regard to the riots and protests in Tibet. Those remained on YouTube and were not removed, leading to the Chinese government shutting down access to YouTube. I am not sure whether it has been reinstated or not at this point. We take great pride in the fact that as a search engine we are a provider of information, we facilitate access to information, which is ultimately one of the best ways of encouraging democratic change. That said, to operate in China there is a certain level of interaction with the government that is required. We take great pride in the fact that we filter less than any other search engine operating in China. On your earlier point, different countries around the world have different concerns. We respect the German government's concern over Nazi paraphernalia and we have worked with the Chinese government and the others.

Q267 Chairman: Do you detect that the concerns being expressed here are greater than elsewhere or are there specific concerns which are now being discussed in the UK which demand specific solutions here?

Mr Walker: In the UK there is a particular focus at the moment on child protection, which we recognise. We do not think of it as a unique or special problem; rather, the UK is the first government to have focused on it. We are very interested in working with the British Government on solutions that will be scaleable because it is ultimately a worldwide problem. It is not that the Germans do not care about their children online. The question for us is how to implement this in a way that is scaleable. There may be aspects that are nationally focused. We continue to look at ways of implementing our guidelines in a way that would have either a legal or cultural sensitivity built into them so that we could have different standards for different countries, but that is a work in progress.

Q268 Chairman: Given that it is a worldwide problem, do you think there is a case for a worldwide solution?

Mr Walker: Many of our solutions are in fact worldwide. We have a variety of different tools, from the global flagging to the global review teams that we have in place, to an increasing emphasis on filtering tools which are designed to block and remove inappropriate videos.

Q269 Chairman: Obviously you are a global company so you do have global policies. I am thinking more of whether or not there is a case for some kind of international body to agree what are unacceptable standards and have some ability to enforce those internationally.

Mr Walker: The history of international efforts to try and control information has been a somewhat chequered one. I think our initial response is that a self-regulatory model, informed by the concerns of governments around the world, has actually been relatively successful. That is not to say that mistakes have not been made or things have not slipped through. If you look at the growth of the platform, again just looking at Google and the billions of searches that are done or YouTube where hundreds of thousands of videos are uploaded every day and hundreds of millions of videos are viewed every day, it has been a phenomenal growth with some problems. Imposing governmental or prescriptive rules on an industry which is effectively less than three years old runs a significant risk of unintended consequences. Given the relative success of the model thus far and our desire to continue to evolve our approach and to work carefully with governments and reports like Dr Byron's, I think it has been fairly successful.

Q270 Mr Sanders: Would the risk of unintended consequences not simply impact on business models and profits whereas actually it is about protecting children and human beings?

Mr Walker: I would go back to the power of the Internet. Many of these platforms do not make money per se. They exist as platforms that are designed to facilitate communication back and forth. The concern is that if you were to pre-filter or prescreen all the content that is on the Net --- Remember the wide variety of things we are talking about here. It is really not a broadcast medium; it is what is called in Silicon Valley the "long tail" phenomenon where the majority of material is very small, for example, parents uploading a video of their child's first steps so that they can send it to the grandparents, a blogger that is blogging for a small community of people or a shared document which, rather than being edited in a traditional fashion by one person, is shared among 20 or 50 people for input and comment. The notion of setting up a regime that would edit all of those comments or prevent you from posting a comment on a blog before it had been screened by a government agency I think would fundamentally change the way people work with the Internet. Our response is to say we need clear standards, we need to enforce those standards and respond very promptly where a concern has been identified, and thus far I think we have done a pretty good job of that.

Q271 Adam Price: You have set out for us the action that you have taken and the tools that you have developed to minimize access to harmful content on the Internet. It would be fair to say your view is that the system largely of self-regulation that we have is working well. Others have told us they believe that the current legal and regulatory framework is inadequate. Would you accept that there are any areas where there is a need for formal regulation or legislative changes?

Mr Walker: I do not want to speak beyond my area of expertise. I would say that certainly we welcome additional government efforts on media literacy; that is certainly all to the good. We have been an active participant in a number of discussions of codes of conduct currently being developed by the DCSF with regard to cyber bullying and by the Home Office with regard to social networking, which are due to be released later this week. We and other Internet companies welcome that and think that it is an appropriate model. I note that the Byron Review suggested additional work along those lines. We would be very interested in participating in that. With regard to more prescriptive approaches as to what people can and cannot post online, to some degree the existing set of laws that are out there with regard to legal and illegal conduct and whether or not someone is aiding and abetting a crime have proved sufficient in most of these cases.

Q272 Adam Price: As a company you did broadly welcome the Byron Review. The Government has said it is fully committed to implementing its recommendations. Do you have any reservations about any of those recommendations?

Mr Walker: It is a long report and something of a work in progress. We were delighted to have the opportunity to work with Dr Byron and her staff and participate in some of those conversations. We think it is an excellent step in the right direction. Many of the recommendations are for further study and analysis, so it is hard to know exactly what will come out of all that, but directionally we are very supportive.

Q273 Adam Price: Are there any particular reservations that you want to place on the record?

Mr Walker: There is nothing I could pull out at this moment and say that we absolutely could not live with.

Q274 Adam Price: There is this issue raised in the report about a greater regulatory role for Ofcom in this area of harmful content on the Internet. Would you be comfortable with Ofcom developing a greater role, as posited in the report?

Mr Walker: It is probably not for me to suggest how the British Government should structure its regulatory review. In general I think we have had a positive working relationship with Ofcom and the other agencies that have been involved, the DCMS and the Home Office. There is an advantage to having a coordinating group, the Internet Council, for example, bringing to bear a lot of the different concerns, which are similar but not necessarily completely overlapping, to work through a more structured framework for future work.

Q275 Adam Price: If Ofcom was to have a policing role behind that Council would that not create any particular worries for you?

Mr Walker: I think there is a line between a self-regulatory code, informed by concerns that have been brought before the Committee, and a rule that would be effectively trying to pre-clear or censor content before anyone could post something to the Internet. I think we would have concerns about the latter. The former, in terms of enforcement and continuing to work with industry, I think we would welcome.

Q276 Rosemary McKenna: The YouTube Terms of Use and Community Guidelines set out what you will not allow as content on YouTube. Is it not the case that users have no obligation to read them - or even to link to the relevant pages or tick a box - before uploading a video?

Mr Walker: They are presented with the terms before they go through. They say in the States that you can lead a horse to water but you can't make it drink!

Q277 Rosemary McKenna: You can make them tick a box to say that they have read them.

Mr Walker: We can present it. I think we do a pretty good job of distilling it, of making it not legalistic and making it accessible so that people of all ages can review it. We were actually recognised earlier on in the history of YouTube as a leader in terms of providing detailed information about all the potential risks and issues, ranging from the risk of infringing someone's copyright to the dangers of posting offensive or inappropriate material. We communicate fairly clearly to people that, not only will their uploads be removed, but if they violate those terms on a repeated basis we will terminate their accounts on YouTube, and we follow that quite strictly.

Q278 Rosemary McKenna: If they are uploading a video which could be considered harmful they do not have to read the Terms of Use.

Mr Walker: That is true. We have no way of compelling someone to read material we put before them.

Q279 Rosemary McKenna: Is there not some way that you could make them go to that page?

Mr Walker: As part of the upload process they are prompted to review various materials. There is a lot there because there are a lot of potential concerns. The response has been that in many cases when inappropriate content is uploaded it is flagged very quickly. It is a three-legged stool of ways of responding to violations. One is that we have a community of hundreds of millions, billions of users around the world who review and very quickly flag material that they find offensive and that has been very effective in triaging and identifying problematic material. We review that quickly again with a global team. The majority of material is reviewed within half an hour. The large majority is reviewed within the hour and removed. Users can raise different kinds of flags, for example, for child pornography or copyright infringement or otherwise offensive material. We are increasingly developing technology tools - Google is if nothing else a technology company - to try and facilitate that, certainly to block the reposting of videos that have already been determined to violate our Terms of Use and, increasingly, to look at similar sorts of videos. It is in a sense easier to detect a copyright-infringing video when the copyright owner has posted the original and cooperated with us. We can use other search tools to see if a new upload is a match to that content that would infringe their copyright. It is actually more complicated in some of this material to determine exactly what is appropriate or not, to distinguish, for example, a video of somebody smashing windows from a documentary of the protests in Tibet or at the World Trade Organisation in Seattle a few years ago. There are some nuanced questions that are involved there.

Q280 Rosemary McKenna: Would you support the Byron Review's recommendation for an independently monitored code of practice on the moderation of user-generated content, and will you modify your terms and guidelines accordingly?

Mr Walker: We would support the development of a voluntary code of practice. We would support the notion of moderation in the sense of working to make sure that inappropriate content, once identified, is promptly removed. We lead the industry in how quickly we remove that content. We would certainly be open to a discussion on our terms of service and content which have continued to evolve. I would go back to the point that the service has only been in existence for two or three years and those terms have evolved over the course of its history. As I mentioned before, we are looking at the possibility of different approaches in different circumstances around the world.

Q281 Rosemary McKenna: Do you tell people that if they upload videos of others committing crimes then they too risk criminal conviction?

Mr Walker: We prohibit anything that is deemed to be hate speech, that is deemed to be graphic and gratuitous violence and anything that is humiliating or bullying or otherwise negative. I would have to go back to see if we specifically call out the risk of criminal prosecution in an aiding and abetting context.

Q282 Mr Sanders: When somebody signs up to YouTube they are confronted with the terms and conditions and you have to scroll through thousands and thousands of legalese words. I suspect most people scroll to the bottom, tick the box and off they go because they want to get going. Surely you could do more. You could have a series of questions that confront people and say, "Are you aware that if you upload X you will be liable to prosecution, yes or no? If you do this ...," and so you make the process of signing up almost like one of those learning programmes where you go through a series of questions. You could do that, could you not?

Mr Walker: It is a classic question of online consumer services. There is a desire on the one hand to cover a wide variety of potential risks and concerns and problems and a desire on the other to make something short enough that people will actually read it. We have this issue with our privacy policy, for example, to describe accurately all the things that are done with users' information, to benefit them and to customize services. It is a very long document. The concern has been that the longer you make it the less people will read it. We try and balance those two considerations by making it not as legalistic as some, and I am sure we could always do better, but we do go back regularly and try to shorten that document to make it more user-friendly and more likely that people will read it. There would be an infinite number of boxes that could be ticked around a wide variety of different things. Our ultimate question is to what degree the user understands their role and responsibility online and how to encourage them to do the right thing and not violate the terms. Where they do violate the terms we respond very quickly and remove the offending content and let them know in no uncertain terms that their accounts will be terminated if they do it again.

Q283 Mr Sanders: Everybody has ticked the box to say they have read and understood the terms and conditions, but I doubt anybody has done a study to test whether they really have read and understood the terms and conditions. It would not be that difficult to put in, on a rota basis, some extra questions. They do not need to be the same questions to everybody, but just some extra questions in there that really make it clear that there could be a penalty for misusing the system. That would offer that little bit of extra protection at relatively no loss to yourselves. Why do you not do it?

Mr Walker: It is certainly a fair concern. I do not want to get into the micromanagement of product design because the people back in Mountain View would not be happy if I did. The tension would be between a form that had lots of boxes that you needed to check all the way down --- There is a temptation, if you give them 20 different boxes to check, to say, "Yes, yes, yes, yes, yes," all the way down.

Q284 Mr Sanders: That is not what I am saying.

Mr Walker: I understand. That is on one side. There may be some enhanced ways of making our terms of service more visible or highlighting particular concerns within them. Each different constituency has a different concern. When I speak with representatives from the privacy agencies and data protection agencies they tell me they would like us to put more emphasis on the privacy protection side of our materials. The copyright industry would like us to put more emphasis on the risks of infringing copyright. Certainly protecting children is a paramount concern and we need to make sure that that is highlighted appropriately. There are a number of others. We try and calibrate and harmonize all these different concerns. We think of our terms and our approach to our terms as a work in progress. I take the point. It is something that I will study when I get back to the United States to see if there are additional things we can do. At the end of the day, the question is, how do we really reach the user in a tangible, visible way? We have a lot of material on-site, tutorials and other kinds of educational material that we have developed in some of these areas and we may be able to do more in this area as well.

Q285 Mr Evans: You have talked about this struggle between the freedom of speech on the one hand and protecting youngsters on the other. Let us go back to China for a second. You have got the battle between the freedom of speech and the protection of a government which regularly violates human rights. Should that not be an easy one for Google to sort out? You are on the side of freedom of speech, surely.

Mr Walker: We are on the side of freedom of speech and the goal is to maximize freedom of speech and access to information around the world. It is a harder question to say how best one does that. In China and in a number of other countries around the world, whose governments are not as democratic as we would like, you have to consider whether you serve that goal better by absenting yourself completely from the country and not making that service available to the citizens of the country or by trying to engage in a constructive way and push the boundaries of what is acceptable so that again more information is available on your service than would be on contending services. We balance the desire to be a socially responsible company in many areas with the desire for free speech. We are working constantly on the copyright side. It is a very hard challenge because you have 200 plus countries around the world each with their own sets of laws and their unique concerns. I alluded to some of these earlier on. It is in some measure not for us to pick and choose which laws we like and which we do not. On the other hand, at the margins there are certainly situations where the human rights or other concerns are such that we have a difficult time. Another example would be a recent situation involving a video on YouTube which involved someone being beaten unmercifully, shocking, horrible stuff. We took it down immediately only to discover it was actually posted by a well-known Egyptian human rights blogger who had raised concern in his blog about Egyptian police brutality. Not being able to find a place on his blog to host a video, he had cross-linked to it on YouTube. I suspect it was evidence in support of his concern about police brutality and abuses in violation of the laws of Egypt.
We have situations in which one country is concerned about the citizens of another country viewing material in a way that might create a danger to the citizens of the first country, so we have extraterritorial or extra jurisdictional issues that are raised. It is a phenomenally complicated set of issues and we do our best to work both in a public and on a private basis with governments around the world to try and get to a position of maximum free speech everywhere we can.

Q286 Mr Evans: I am just wondering how best you serve the people of China when you put in Tiananmen Square or Falun Gong into your search engine and it throws up nothing because the Chinese authorities do not want that to happen.

Mr Walker: We do disclose to our users that the government has effectively edited the search results and there is available on our site a link to our dot-com results. A user in China, if they can get through the Chinese government's own "Great Firewall", will have access to our unfiltered, comprehensive results. I recognise the issue.

Q287 Mr Evans: How do you make these decisions? Do you have a group of people who sit down and say, "China is worth $100 million to us so let's buckle a little bit"?

Mr Walker: I can candidly say that I have been in on a number of these conversations and never have I heard a discussion about how much money we stand to make from a given country. It is always, and I think legitimately, a balance in terms of trying to promote free expression or the free exchange of ideas while maintaining our ability to operate in-country and keep our people in-country safe from a rampaging mob, which we have seen, or situations where our country managers have been hauled in to police departments or otherwise threatened.

Q288 Mr Evans: Do you think the fact that you have buckled to the Chinese authorities has done any damage to the reputation of Google?

Mr Walker: I would not describe it as having buckled. I think it is a complicated area. There are those who have a view of the world that would have us essentially keep our hands clean of any involvement in any government that they did not like around the world. It has not been our approach. We have tried to continue to maintain our message of free speech. We have tried to minimise the possibility that user information would be disclosed to governments that would use it for ends that we would not approve of. At the same time, we have a policy of cooperating with law enforcement around the world and many government requests, even from governments that may have human rights issues, are in fact legitimate. There are situations involving child pornography or bank robbery or murder in every country around the world. Balancing those two things is another set of issues we wrestle with.

Q289 Mr Evans: Have the Chinese authorities ever asked Google to do certain things and you have turned round and said, "No, we're not going to do that"?

Mr Walker: The YouTube example would be one most recently.

Q290 Mr Evans: That one is not so clear because I think the Chinese authorities wanted footage of the rioters shown on television to show the Chinese people that these are hideous people who are damaging property. The Chinese people were quite happy to see video footage of that.

Mr Walker: I am not clear on that. They did block, for example, Reuters' and the BBC's accounts of the rioting that was going on. We have a team of people in China who work on exactly these issues and push back against requests from the Chinese government to try and minimise the scope and duration and in some cases, as with the defence situation, they simply do not filter it.

Q291 Mr Evans: You negotiate with them, but if the Chinese turn round and say, "No, we want this off," then it has to come off otherwise they will just block you completely.

Mr Walker: Within the dot-cn site, the site that is now limited to China on the search side, that is correct. It is a condition of doing business in the country that we comply with the laws of the country. That is the difficult issue that we wrestle with. At the same time, we try and minimise it. Compared to any other search engine out there we provide more information. We filter less at the request of the Chinese government than anyone else out there. We think that brings a benefit to the people of China. If they want to, if they are interested in doing so, they have access, to the maximum extent we can provide it, to the global site.

Q292 Mr Evans: If it was Zimbabwe and Mugabe's hideous regime asked you to remove certain things surely you would tell them to get lost, would you not?

Mr Walker: It would depend at some level on what the request was. If it was for child pornography, I think we would probably honour that globally. If it was politically sensitive material, that is a much harder question. We do not have offices in Zimbabwe. We do not have, to the best of my knowledge, a site targeted at Zimbabwe. In fact, as you will see if you go to YouTube or some of the search sites, there is a lot of information coming out of Zimbabwe about the progress of the election, concerns about potential election rigging and the like. It has been a wonderful channel for that sort of information. In Venezuela within the last year, when the Chavez administration shut down various television stations, several of those stations turned around and started broadcasting on YouTube as an alternative way of providing information to the citizens of Venezuela. It is an important back and forth that we look at continually.

Q293 Mr Evans: I suspect that the difference is China is huge, it is very powerful and there is a lot of money there. Maybe Zimbabwe is not so clever and clearly very poor. Google, like a lot of companies, probably takes a different attitude to China than it would to other countries that violate human rights.

Mr Walker: It is a challenging game to sit down and assess the various criteria one would apply to a different country with regard to its overall level of democracy, level of transparency, independence of its judiciary, tolerance for civil expression and then, taking all that into account and the question of whether or not we have people in harm's way in-country, what is the particular nature of the request that is being made? Is it for child pornography? Is it for something that we genuinely recognise as criminal? Is it for defamation, which might well be recognised as criminal or at least a civil wrong in the United Kingdom or in Europe? What if it is defamation of a political figure? What if it is defamation of a dead person, Kemal Atatürk or Mahatma Gandhi, which in the United States would not be defamation at all but certainly is defamation in those countries? We try and bring all of those factors to bear in the analysis of these issues.

Q294 Mr Evans: At what level in the company are these decisions made? Is it one person or is it a group of people that make these decisions? Are they different people for different countries?

Mr Walker: Right now our Executive Management Group, which comprises the very senior people in the company, has reviewed several of these issues. There is a woman on my staff whom we have christened "The decider" who is involved in reviewing a variety of these things around the world. We have policy teams, corporate communication teams and people on the ground because these are close issues. There are different points of view even within the company as to the right approach. In most cases we are able to reach consensus internally. In some cases we have escalated them to the senior management of the company for decisions as to what to do.

Q295 Philip Davies: Could I give you a real life example of some of the things that people have been saying earlier on today about some of the content? It is a paper you might not have come across, but as it is my local paper it is the most important paper in the country and that is the Bradford Telegraph & Argus. On the front cover you will see video nasties with pictures of Buttershaw School fights and Rhodesway Royal Rumble bitch fight 1 and bitch fight 2 all appearing on YouTube. Do you accept that this kind of gratuitous violence happens partly because people want to put it up on YouTube and that without YouTube some of these things would not happen?

Mr Walker: I think you are right that there is a risk that any new form of communication often gets adopted first by the youngest people in that community and can be used for many good things and for some bad things. Gratuitous violence is against our policy so we try and remove it very quickly. It is typically taken down within a matter of minutes once it is flagged. It is a concern. It is something we work very hard on.

Q296 Philip Davies: You say you take it down very quickly. How long does it take for you to remove an item that is inappropriate like that?

Mr Walker: Once flagged, more than 50 per cent of that material is removed within half an hour. A large majority of it is removed within an hour. In the longest cases it is one to two days for materials that are harder to identify or figure out whether or not it is a documentary or it is promoting or glorifying violence.

Q297 Philip Davies: Do you not think that is slightly lax, that once flagged you say you will take things down as quickly as possible? Do you not think that you have a duty, given that it is your site, to have people monitoring what is going on there and proactively taking things down like this, which are completely inappropriate and I think everybody would agree is totally unacceptable, rather than waiting for somebody somewhere to flag it up when they are thinking, "I don't need to flag it up because surely somebody at YouTube is going to be looking out for this and taking it down"?

Mr Walker: The community does very actively flag. We get hundreds of thousands of flags every day. The larger question you raise is what is the model of the Internet? Is it more like broadcast television or a newspaper? Is it more like the telephone system, a communication platform that is used for one-to-one or a few-to-few kinds of communications? It clearly has aspects of both. It would be hard to have the rules of television or the requirement of impartiality applied to YouTube. We have this "long tail" problem where there is an awful lot of content. Whether it is on YouTube, Blogger or MySpace, the notion of having somebody pre-clearing your content before you posted it to a MySpace account or to a Bebo account, or pre-clearing a blog which you were going to write before you were allowed to publish it on the Internet because it might have offensive content in it, has not been the way the Internet has worked. Looking at the volume of material we have uploaded just on YouTube, it is hundreds of thousands of video clips in a day. When you add in all these other services it is in the millions. The effective way of doing it has been to be responsive to problems, and responsive very quickly to problems, and remove them. Going back to your earlier notion of media literacy, it is to make sure people understand that there will be stupid things posted; there will be misuses and abuses of the platform in the same way that there is graffiti scrawled on school walls. You learn to disregard it, to recognise it is inappropriate. Parents work with children to make sure that they are viewing things that are age appropriate for them and they know how to deal with the challenges and the risks of the Internet as they would with the risks of a school or a park.

Q298 Philip Davies: It is one thing for stupid things to be put on, but it is another thing if you are a young kid at school and you have been beaten and punched and kicked just so that somebody can get some pleasure by putting it up on your site. I was not quite sure from your answer whether or not you employ teams of people or use some kind of system to flag up to yourselves anything that is inappropriate so that you can proactively take it out?

Mr Walker: It is a mix. Primarily it is the three-legged stool I referred to earlier. First of all, it is community flagging. Those flags are quickly reviewed by a human team of reviewers and there is some automation at the back end to make sure that once posted something does not get reposted. That automation then comes back to the front end again. If you are posting something that has previously been pulled down, that would be blocked initially and will never go up. We are working on additional software tools to identify material such as pornography and prohibit it on the site. If we can recognise that and hold it until it can be reviewed, that is something that we continue to look for.
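[The "three-legged stool" Mr Walker describes — community flagging, human review of the flags, and automation that stops previously removed material from being reposted — can be sketched in outline. The following is a minimal illustration of the re-upload blocking step only, not Google's actual system; the class name, the exact-hash approach and the sample byte strings are all invented for the example. A production system would match perceptual fingerprints rather than exact file hashes, since simple re-encoding defeats a byte-level match.]

```python
import hashlib


class ReuploadBlocker:
    """Illustrative sketch: block exact re-uploads of previously removed content.

    Content that human review has ruled a violation is hashed; any later
    upload with the same bytes is rejected before it ever goes live.
    """

    def __init__(self):
        self._blocked_hashes = set()

    @staticmethod
    def _digest(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def mark_removed(self, data: bytes) -> None:
        # Record content that a human reviewer has taken down.
        self._blocked_hashes.add(self._digest(data))

    def is_blocked(self, data: bytes) -> bool:
        # Check a new upload against previously removed content.
        return self._digest(data) in self._blocked_hashes


blocker = ReuploadBlocker()
blocker.mark_removed(b"offending-clip-bytes")
print(blocker.is_blocked(b"offending-clip-bytes"))  # True: exact re-upload is caught
print(blocker.is_blocked(b"some-other-clip"))       # False: unseen content goes to normal review
```

[Under this sketch, anything not already on the blocklist still depends on community flags and human review, which matches the division of labour described in the answer above.]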

Q299 Philip Davies: How many people do you employ to monitor what is going on? You have said that you get hundreds of thousands of things a day being put on there. How many people do you employ to monitor what gets put up there?

Mr Walker: It is a variety of different teams that are working on it. Our primary focus is on the tools and the development of something that facilitates what I think is actually the industry leading responsiveness on the speed of review. If we are reviewing more than 50 per cent of the things within half an hour, the large majority within an hour, the queues are short. The challenge is not so much the review teams but in making sure that we have identified the things quickly. It comes back to the earlier point that was made about how to enhance community flagging. How do we enhance that process and potentially empower the community itself to be able to identify and remove material and suspend the appearance of material pending review?

Q300 Philip Davies: There is one argument you made that I cannot quite follow. You said that you do not really want to go down the line of somebody like YouTube looking at things before they go up there to decide whether or not they are worthy of going on and that that would be a restriction on free expression. Once something is flagged up to you you then review it to see whether it is acceptable or not. What is the difference between reviewing it before it goes on to see if it is acceptable and reviewing it after it has gone on to see if it is acceptable? Surely you are going through exactly the same process.

Mr Walker: In the United States I am not sure if the language is the same but I suspect the concept is the same in Britain. We would have concerns about what is termed prior restraint. We have hundreds of thousands of pieces of material being uploaded on YouTube, or millions of pieces being uploaded if you start to include Blogger, MySpace, Bebo, shared documents, comments going up onto videos, a huge amount of user-generated content. This is what is called "Web 2.0" and it is where you have empowering of the individual consumers to create their own content. If you try to take that vast amount of content and pre-screen all of it, it is neither efficient nor effective and would burden the process of creation. Think of the delays that would be occasioned between the time you tried to edit a document or post a comment and some days later when it would appear online. It is a very different model to anything the Internet has ever had.

Q301 Philip Davies: What you are saying is it would be a pain in the backside to do that. I do not dispute for one minute that it would be hopelessly inconvenient and a complete pain for you, but if it would stop things like this from going on in some of my local schools would you not agree that it is a price worth paying?

Mr Walker: It is not the price for us that I would be concerned about but the price for the user. It is a somewhat different model to what we have had before. A page like this gets worked on by 20 different people. It is on the Internet; it is user-generated content. If I change a comma, should that need to be reviewed before it goes online? I recognise there is a spectrum, from something that gets viewed millions of times to the millions of things that get viewed only a small number of times, but all of that is user-generated content. If you have a correction to a Wikipedia article, should it need to be pre-reviewed before you can move the comma? This is the challenge that we face in trying to come up with a scaleable model that works for all forms of user-generated content. The effective and generally fairly efficient approach, and the approach that has safeguarded the user experience and the value of the Net, has been to respond very quickly to problems. It is as it is in the offline world. We do not have policemen on every street corner stopping things from happening. We have policemen who very quickly respond when there are problems that occur and consequences for violation of those rules. In this case you would remove a video, you would lose your account and you would be kicked off the Google services if you were guilty of violating the rules.

Q302 Paul Farrelly: Mr Walker, it was a very simple question that my colleague Mr Davies asked you which you did not answer. How many people does YouTube employ to proactively review and take down inappropriate content irrespective of whether it has been flagged or not?

Mr Walker: The answer is we do not proactively review in a human way ---

Q303 Paul Farrelly: Not a single person?

Mr Walker: We have automated tools which review material to see if it has been previously flagged and will stop its reoccurrence. We are developing tools, again to the point that we are a technology company ---

Q304 Paul Farrelly: Not a single person is the answer?

Mr Walker: Devoted to doing prior restraints of user communication, that is correct.

Q305 Paul Farrelly: Let us take a clear-cut example. Somebody perverse uploads a piece of child pornography, how does that get removed from YouTube before anyone flags it?

Mr Walker: There are two potential ways. One, we are working very hard on various forms of automated filters that will detect ---

Q306 Paul Farrelly: How does it now get removed?

Mr Walker: In some cases I believe some of these filters are already being used to identify pornographic content. It is difficult to distinguish child pornography from general pornography, but the advantage is that both are illegal or unacceptable on YouTube and so we do not allow either.

Q307 Paul Farrelly: How does it work? Until someone flags it ---

Mr Walker: If an individual piece of video has been uploaded before and we have ruled it a violation of our terms of service or illegal ---

Q308 Paul Farrelly: I am talking about a new piece. You are saying somebody has got to flag it to you before it is taken down.

Mr Walker: I am saying that that is an important part of the way the material comes down. I think you will find that there is very little pornography, let alone child pornography, on the site because of that. As users know that if they upload something they should not, it is going to come down very quickly, it is almost not worth the bother of putting it up in the first place. Either it is blocked because it has already been taken down before, it is potentially blocked because it is something that our system has identified as being pornographic, or it is blocked because the first person to look at it says it is clearly child pornography and would flag it and it comes down within a matter of minutes. The game is not worth the candle.

Q309 Paul Farrelly: Clearly it would affect YouTube's business model if it had to employ banks of people to do this and therefore your profitability. Does your business model not abdicate any responsibility that you have?

Mr Walker: I would say the reverse is true. We make no money from the unacceptable content, child pornography or the offensive content, and have no desire to. There is a much larger risk to our brand reputation, and we put vast amounts of money and time and effort in to try to detect and remove this material from the service. We would be, from a business perspective, much better off if none of this had ever hit the site.

Q310 Paul Farrelly: Last year there was a teenager murdered in random violence in Liverpool and the press published screen grabs from YouTube of videos glorifying gang violence. I am sure this is not just a UK issue, it is an issue in America as well. When I made some comments about that, YouTube's response was to say, "We're not in the process of editorializing. It's nothing to do with us, Guv," and subsequently I have been pestered by teams of YouTube PR people seeking to meet me and educate me. If those people were employed doing something about that sort of content, or if the law as it stands gave you more of an incentive to do that, I would feel much happier, particularly as the father of young children.

Mr Walker: Let me apologise for any pestering. I am sure it was not intended. The general notion is that it is right that we feel a sense of responsibility, as I am sure you do and parents everywhere, to try and make sure that this sort of stuff does not get uploaded. Material that promotes violence on our site violates our rules and it should not be there.

Q311 Paul Farrelly: There was another case recently where a young girl of 15 was prosecuted and sentenced to detention for being an accessory to manslaughter and possibly murder because of the filming of gratuitous violence that was uploaded to YouTube. If the law were to be strengthened to make it carry the risk that by virtue of owning YouTube and broadcasting that you might also, unless you took the issue more seriously, be prosecuted as an accessory, that would give you more incentive to sharpen your act up, would it not?

Mr Walker: I do not think of ourselves as a broadcaster, I think of ourselves as a communications platform. I assure you, we take the issue extremely seriously right now. The question is not a lack of will.

Q312 Paul Farrelly: But you do not employ a single person.

Mr Walker: To do something that we ultimately think would be the wrong approach. We employ many people to get involved in this very complicated balancing of the chilling of free speech on the one hand and the elimination of harmful or offensive content on the other and that is ultimately the right path. There are problematic phone conversations that go on every day with people planning criminal offences. No one would think to impose a requirement on the phone company to monitor phone calls, which would probably be effective in reducing the use of telephones to commit criminal acts. The balance on the other side, of the invasion of privacy, would be thought to be undue. Here again I think trying to create a model which turns the Internet into a monitored broadcast medium where everything you want to post to YouTube or MySpace, whether it is a comment to a blog or a blog itself or even your email which goes out to 100 people, should have to run through a filter before it is made public ---

Q313 Chairman: I think our approach would be to suggest to you that your corporate slogan might not just be "Don't be evil" but perhaps "Take an active role to prevent others doing evil". I understand about the amount of material that is posted. Could you confirm whether or not ten hours every minute of video content is correct?

Mr Walker: That is currently correct. It goes up every day.

Q314 Chairman: That video content is tagged. You do not need to look at every single minute of video content. Surely you could have people who would look at the video content which is tagged with labels which suggest it could be inappropriate. If something is tagged "rape" then you might decide that would be worth looking at rather than waiting to see if somebody reports it.

Mr Walker: We look at a variety of different signals. Key words might be something to take into account. The challenge is when you go down that path and someone is posting a comment on "Sex and the City", you might well get an awful lot of material that is not problematic.

Q315 Chairman: If you were to narrow your search by looking at the material which is tagged with labels which suggest it could well be inappropriate then you would not have to be looking at the ten hours going up every minute, you could actually employ some people specifically for this purpose.

Mr Walker: At the end of the day, I think we all agree that the goal is to minimise the amount of controversial material that is on the site. What is the most effective way to do that and to block it, not the least expensive way, but the way that is best for the user experience? It may be some combination of an analysis of the material that is being uploaded through technological tools, an analysis of the labels that are going on, an analysis of the history of the user, if they have previously posted problematic material but not so much that their account has been suspended, and an analysis of how many people are viewing an item or have viewed other items in the past. We take a lot of different signals into account. Certainly it is a fair suggestion and it is one we will continue to look at.
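[The combination of signals Mr Walker lists — technological analysis, labels, user history, viewing figures — amounts to scoring items to decide what human reviewers should look at first. The sketch below shows the general idea only; the function name, field names and every weight are invented for illustration and bear no relation to any real YouTube system.]

```python
# Illustrative triage scoring for a review queue. All weights are invented;
# a real system would be tuned, and would use many more signals.
def review_priority(flag_count, keyword_hit, prior_strikes, views_per_hour):
    """Combine several signals into a single score: more flags, a suspicious
    label, a history of violations and a fast-growing audience all push an
    item towards the front of the human review queue."""
    return (3.0 * flag_count
            + 5.0 * (1 if keyword_hit else 0)
            + 4.0 * prior_strikes
            + 0.01 * views_per_hour)

queue = [
    {"id": "a", "flag_count": 1, "keyword_hit": False, "prior_strikes": 0, "views_per_hour": 50},
    {"id": "b", "flag_count": 4, "keyword_hit": True,  "prior_strikes": 2, "views_per_hour": 900},
]
# Sort so the highest-risk item is reviewed first.
queue.sort(key=lambda v: review_priority(v["flag_count"], v["keyword_hit"],
                                         v["prior_strikes"], v["views_per_hour"]),
           reverse=True)
print([v["id"] for v in queue])  # ['b', 'a']
```

[The point of such a scheme is the one the Chairman raises: reviewers need not watch everything, only the items the combined signals rank as most likely to be inappropriate.]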

Q316 Chairman: I only raise it because there was a case in the UK very recently of a woman who was gang raped and a video was then uploaded to YouTube. It was viewed by 600 people before it was taken down.

Mr Walker: There were 600 views. We believe it was a much smaller number of individuals, but I am familiar with it. Clearly it was a mistake on our part as a result of human review. Our reviewers review a lot of material and in some cases just simply make a mistake.

Q317 Chairman: You say it was a mistake on your part, but it would have been possible to take it down earlier, would it not?

Mr Walker: The initial flag was reviewed and the individual reviewer, who had reviewed a huge number of materials, did not take it down promptly upon reviewing it. I do not know exactly what happened, but it was a mistake. It was a tiny, tiny, infinitesimal percentage of the material that we are reviewing.

Paul Farrelly: That is incredible.

Q318 Adam Price: How could you make a mistake like that? How can you view a gang rape and not see it for what it is?

Mr Walker: The challenge points out the difficulty with human review and that the answer is not always putting more people on this.

Adam Price: Come on!

Paul Farrelly: Do you know how absurd you are sounding?

Q319 Adam Price: You are defending the indefensible now. People will find this deeply objectionable. You cannot defend that. No reviewer could view that kind of content and not understand it for what it is. Surely that single case is enough for you to realise that your approach is completely inadequate. How can you defend that?

Mr Walker: I do not mean to defend it. Certainly the rape itself and the underlying content are abominable and no one would defend it.

Q320 Adam Price: That is not the point. It is a mistake you made as a company and the system is inadequate, surely.

Mr Walker: No system is perfect. The argument is that in the vast majority of situations we do get to the right answer and we get to the right answer very quickly. The challenge is how to make a better system that continues to honour this arena for free speech and does not interpose a company or a business between individuals who are putting up perfectly legitimate, positive, pro-social messages and the small number of people who are abusing the rules and that is the difficulty. There is no question that everybody regrets the fact that this video was on the site for a minute.

Q321 Mr Evans: Can I ask what changes you have introduced in the company since making that error to ensure it does not happen again?

Mr Walker: A number of different things. The mistake that was made had to do with the way the individual reviewer coded the video, so that additional flags that came in were not immediately escalated to the beginning of the queue. We have made that much harder to do. It is now a double trigger; it needs to be reviewed twice. I do not want to go into the details because it would allow people who are trying to game the system to avoid the technologies. Where there are other signs of content being inappropriate, that allows for a secondary review. There are a number of other things I would be happy to talk about in a private session.

Q322 Rosemary McKenna: When you come across these things do you pass them to the local police immediately?

Mr Walker: Absolutely. In that particular case we worked very closely with local police because the Internet, while it raises these challenges, is also a very transparent medium. The perpetrators of this crime are now captured on video in a way that is very powerful evidence for law enforcement to go after them, and the same is true for some of these other instances of cyberbullying or other places where people appear to have been aiding and abetting the underlying crime.

Q323 Paul Farrelly: Very quickly, Mr Walker, you have admitted to us that there is not a single person within YouTube (owned by Google) who proactively monitors offensive material. You have just told us that because of the amount of material that your reviewers of flagged stuff have to look at there are human failings and a gang rape got through. The next question is: how many people do you employ at YouTube to look at stuff that is flagged as offensive? How many people?

Mr Walker: Again it is a combination of ---

Q324 Paul Farrelly: How many people? It is a very simple question.

Mr Walker: I think it is impossible for me to sort out the people who are doing physical review from the people who are engineers working ---

Paul Farrelly: Have a guess.

Q325 Chairman: Can we leave it that we would ask if you could supply the Committee with further information?

Mr Walker: We will provide everything within our power and that which will be useful to you.

Chairman: We are going to need to move on to our next session but Alan Keen has one or two questions.

Q326 Alan Keen: Before we come on to the IT questions, you very rightly tried to avoid areas which are not your expertise but we do not often get an expert with your knowledge here. I have got four grandchildren under five and the world moves so quickly: what have we got to fear in the next ten years? Give us some idea. This must be talked about all the time. I know you concentrate on trying to stop the problems now but what do we need to be looking out for?

Mr Walker: It is an interesting question. I would say that the Internet generally as a daily communication platform is a different way of working and communicating than anything we are used to. When we talk about user-generated content or social networking, we need to think about a world that opens up new vistas for kids and ways to communicate across the country, make friends, all of this, but also which creates this Second Life virtual world phenomenon which has a whole new set of challenges. I would not say dangers because I think, appropriately educated, kids can work in that environment, but it is a different way of presenting yourself to the world. I talk to my children about how an email is a different way of working in the world than talking to people face-to-face and that you have to be very careful about how you present yourself, what information you provide, and how people will see you. I think the Internet as a tool generally will only become more commonplace and worldwide as a communications platform, so you have to think a little bit about how that facilitates collaboration in the way you work with people in your company, community and your school group. If there is zero cost to communication, collaboration becomes much easier and much more powerful. That is generally a very good thing but it also creates these risks of anti-social content and conduct in creating new things. I think that is something to be watched but not pre-judged until the problem emerges.

Q327 Alan Keen: Will technology enable us to control the bad things that can happen on the Internet? Is that developing? We were pleasantly surprised when we were looking at counterfeit reproduction of films that the signals go right through the film and it is not just catching a bit at the beginning. Should we be confident that things will get better and not worse?

Mr Walker: In general that is a useful approach to new technologies and it has been the case for the last couple of decades, and in specific areas, yes. We have talked a little bit about watermarking to avoid copyright infringement. Watermarking is challenging because once it is broken, it is broken for ever, whereas fingerprinting, or video identification, is actually more powerful: if you try to circumvent it by tilting your video camera a little bit or tinting something orange, we can adjust on the server side to catch it against this Digital Library of Alexandria that we are accumulating of various kinds of video content, and technology turns out to be a very good and effective tool when used in a collaborative way. The next challenge for us is to focus on a lot of these offensive materials, which is a harder technology challenge. Technology will never completely substitute for human judgment, but the beauty of the Internet and of technology generally has been the ability to programme in intelligent rules: essentially, to discern from your query for "flowers" whether you are looking for a picture of a flower, would like to purchase flowers for your spouse, are looking to do research on flowers, or something else, and to follow all of those different threads and give you the information that you are looking for. That is not because we have a lot of people working on it; it is because we have the ability to develop technology that incorporates some aspects of human intelligence and discerns your intent.
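[Editor's illustration] The robustness Mr Walker describes can be sketched with a toy perceptual fingerprint. This is a minimal illustrative example in Python, not Google's actual video-identification system: each frame is reduced to a small luminance grid, hashed by comparing neighbouring values, and matched by Hamming distance, so that a uniform tint (which shifts every value by the same amount) leaves the fingerprint unchanged, while a genuinely different frame does not match.

```python
def frame_hash(pixels):
    """Difference hash of one frame.

    `pixels` is an 8x9 grid (8 rows of 9) of 0-255 luminance values.
    Each bit records whether a value is brighter than its right-hand
    neighbour, giving a 64-bit fingerprint that depends only on the
    *ordering* of luminances, not their absolute levels.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")


def matches(h1, h2, threshold=10):
    # Tolerate a few flipped bits caused by re-encoding or small edits.
    return hamming(h1, h2) <= threshold
```

Because the hash compares each pixel only to its neighbour, adding a constant tint to every pixel flips no bits at all, which mirrors the server-side resilience described above.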

Q328 Alan Keen: Internationally in the negotiations you have - and people have mentioned China and Zimbabwe - do you see in your meetings and discussions that gradually the world will be educated and these regimes will change? It is Foreign Office policy in a way that we are talking about, but from the IT point of view you have got this knowledge, so what is happening; is it getting better?

Mr Walker: I think it is getting better. It is a big challenge because you have one Internet and you have a global platform and 200 different countries and they are not used to a world in which everybody has their own digital printing press. That is very challenging particularly to regimes that have limited the distribution of information in the past. That said, you see international standards changing. There was an incident in Pakistan not long ago where the Pakistani Government was concerned about a video that was critical of the Government and they blocked YouTube from being accessible in Pakistan, but inadvertently blocked it from being available anywhere in the world. Any request that was going to YouTube was directed to Pakistan instead. Within a couple of hours, as a result of government-to-government communications and our intervention, they reversed that and made it universally available. We see a growing sense that undue interference with the Internet and undue blocking of information in inappropriate ways is becoming internationally less acceptable.

Q329 Alan Keen: Coming on to boring issues but vital to this inquiry, can you tell us a bit about SafeSearch and the Byron Report recommendations on that? Can you enlighten us a little bit on that?

Mr Walker: Sure. We have, as you know, SafeSearch as a default setting on the Google search generally which avoids the display of offensive or pornographic imagery that might otherwise be out there on the net. There is also the possibility for an enhanced level of safe search that edits text as well. That is somewhat more complicated because you do not want to filter out somebody who is searching for information about breast cancer or sex education or other appropriate materials, but it is available as requested. We are also looking at a variety of other tools that we can provide to parents to further facilitate that. Again, as in many of these questions, it involves a question of balance, and of privacy concerns on that side because to block a web application essentially requires you to know in a verifiable way whose computer it is and who is behind that computer, and that is something we continue to work on.
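[Editor's illustration] The SafeSearch setting Mr Walker refers to is exposed to the end user as a simple request parameter: `safe=active` on a Google search URL requests strict filtering. The `safe=active` parameter is real; the helper function below is our own illustrative wrapper, not a Google API.

```python
from urllib.parse import urlencode


def build_search_url(query, safe=True):
    """Build a Google web-search URL, optionally requesting SafeSearch.

    `safe=active` is Google's own query parameter for strict filtering;
    this wrapper function is purely illustrative.
    """
    params = {"q": query}
    if safe:
        params["safe"] = "active"  # request filtering of explicit results
    return "https://www.google.com/search?" + urlencode(params)
```

For example, `build_search_url("flowers")` yields a URL containing `safe=active`, whereas passing `safe=False` omits the parameter, leaving filtering at the account or browser default.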

Q330 Chairman: I think we are going to have to draw a line here since Dr Byron has been waiting very patiently. Mr Walker, can I thank you very much.

Mr Walker: Thank you, Sir.


Witness: Dr Tanya Byron, gave evidence.

 

Chairman: We now move on to the second part of this session and I should like to welcome Dr Tanya Byron. Thank you for your patience during the previous session. Adrian Sanders is going to start.

Q331 Mr Sanders: How would you say your professional background has helped shape the review?

Dr Byron: That is an interesting question. The review was an extremely challenging six months. If I knew then what I know now I wonder whether I would have said yes, but I think I would have done, I have really enjoyed it; it has been a fascinating experience. I can answer the question by talking about the skills I think it has really drawn on. I suppose primarily I am an academic with a background in child development, so my academic background is in the understanding of models of child development and how children develop, how children think and how children behave. In addition to that, I have a clinical career of almost 20 years working mostly with children, young people and families, and specifically as a consultant I have worked with vulnerable children and young people in secure units, medium secure units or in open units with children who had to be detained under the Mental Health Act. I am a writer and broadcaster, which I think was useful in terms of the writing of the report, and also thinking about ways in which quite complex arguments could be understandable to many people and useable so that there was take-out information from the report particularly for parents and teachers. I am a mother, I have two children, my son is ten, my daughter is almost 13, and they use these technologies. When I was asked to do the review, another reason I said yes is because I realised how little I knew and felt it would be useful in my own home to know what I know now. That is probably the combination of skills that I brought to this.

Q332 Mr Sanders: Is there anything specifically that you have learned through conducting this review that has changed how you communicate with your children in the home and what boundaries you set for them?

Dr Byron: I promised my son particularly that I would not talk about him directly in this Committee, but yes it has.

Q333 Mr Sanders: He will never forgive you!

Dr Byron: No, he will never forgive me.

Q334 Chairman: My son is not speaking to me.

Dr Byron: And I remember, John, because I was there and I think I had a word with you afterwards actually but there we go; we all live and learn and we all move on! I have always had an understanding around parental filters and they have been an important part of the way I have parented my children with technology. My children do not have computers in their bedrooms and never have done and I feel that is very important. As a broadcaster and a writer on child development generally, I have a real issue about young children's bedrooms particularly becoming multimedia stations in homes, and certainly when I write about things like literacy and so on I do have concerns around that and family time. The dialogues with both my children are very different. My daughter is older and we have a different relationship. She is female and, not wishing to be sexist, in terms of the way that she thinks about the world at her age, in the way that I do not read her diary, there are times when she is on-line and I respect that but I also respect that we have a relationship where we can discuss things. With my son it is much more monitored and supervised. When he is gaming, we know where he is, there are times around when he plays, there are rules in the house around when he plays such as "do your homework first" and no-one plays games when they are eating meals, things like that. I am not saying I am a model parent, I make mistakes like everybody else does, but this is a big challenge for parents and my review was about challenging government and challenging industry, and I feel I have done that robustly as much as I could, but it was also about empowering parents, I suppose, to span this generational technological divide which is so prevalent at the moment.

Q335 Mr Sanders: Is it just a generational divide? Is there not actually a real divide between somebody who is informed and technologically aware above average and a lot of parents who are not, and how will this review reach those parents who at the moment are not adopting the sensible attitude towards the technology that you are in your own home?

Dr Byron: That is a good point that you make, and in the report I set out a grid. It was really interesting for us, particularly when we were doing the focus group research, to find that there was a variety of ways that parents chose to parent their children around their Internet use or their gaming: some parents are very controlling, some parents are very liberal, some parents use lots of filters, some parents monitor less, and so on. What was interesting was that parents who felt they could parent their children in whatever way they chose were those who had an understanding of the technology, so at least they were 'web 1.0 parents' even if their children were 'web 2.0'. I considered myself a web 1.0 parent before I did this review. I did not understand the notion of user-generated content and so on, as I do now. I am excited by it and it has not frightened me for my children but it has made me more aware. As you know, I have recommended quite substantial things to government, quite big challenges, and I have recommended investment in a huge marketing campaign; I have called it a 'social marketing' campaign. As a child, I remember the 'clunk-click' campaigns and I am thinking we need 'think-click' campaigns. We need campaigns to reach people in their homes through public broadcasting, through newspapers, through magazines, in schools, through HR departments and so on. That is what I am recommending; I hope it is robustly taken up.

Q336 Mr Sanders: What part may you play in developing and putting into practice the action plan that you have recommended for Government?

Dr Byron: Can I say I am going on holiday first! I have no plans to be actively involved in the implementation of this review. I have to say I feel very passionately about the review now and I am really pleased at how vigorously it was taken up on the day that I published. The Prime Minister asked me if I would come back and re-review, as I have recommended, in 2011, to make sure that what I have recommended has been implemented and within the time lines that I have recommended, because I do not want things to be set up in a nicely thought-out way that takes a bit of time; I want things to get going now. I am considering at the moment whether I will do that.

Q337 Rosemary McKenna: Can I just say how welcome the report was. I think it is a fantastic report and it is timeous and it has to be implemented within the time lines because we have been aware of the issue for some time. I am also talking as a user and a grandparent so I want to get these things in place as quickly as possible. We have just heard how inadequately some of the social networking sites are monitored. Just how dangerous is the Internet for children?

Dr Byron: I think the dangers of the Internet correlate strongly with the benefits of the Internet and I think that is something that we need to think about really carefully. The Internet can help you make new friends but it can also mean people who you do not want to know who you are or where you are or what you look like can get information about you. For me, if we start at the very basic parenting level, we feel empowered to say to our children "don't talk to strangers," or "don't give your details out to people who you do not know" and so on, but what really struck me is how people are not engaging in that conversation at all with their children because they do not even know that is what they are doing. A lot of parents think when their kids are going on-line that they are watching television and so the Internet is used as an electronic nanny. It is not; it is actually like opening your front door and saying, "Go on then, go and play," and we do not want our children to go and play unsupervised until we are clear that they are independent. There are very clear dangers which are defined by law. I commend CEOP and the Internet Watch Foundation, I think you have heard from both of them, and the work that they do there is extraordinary and important and must be well-resourced to continue. I have made recommendations about how the law needs to further clarify illegal content, contact and conduct on-line and I think it is important that we think across those three different domains. The big debate we are having at the moment is about suicide websites and I think dangers when it comes to an individual child are very much located within the child, to do with risk and protective factors, so protective factors exist within the home, within family relationships and within how much their parents understand technology. 
Risk factors make vulnerable children more vulnerable but that is both off-line and on-line, and I think there is a blurring of the off-line and the virtual worlds and, definitely, children do not see these worlds as different. When they are being bullied at school and it then happens on-line it is a blur across the two worlds.

Q338 Rosemary McKenna: I think that is true and what I am concerned about is that those children who are already vulnerable are likely to be the ones who are most vulnerable when they go on-line because they will not have the relationship with their parents nor will their parents have the understanding of the dangers. Is it possible to run a campaign which would alert those parents and would really let them understand exactly without scaring them off the Internet?

Dr Byron: I think the campaign has to be thought about really carefully and I think that targeted marketing of different kinds of ideas to different families and so on has to be thought about carefully, as you say. With my child protection hat on, with the background that I have in child protection, I also know that there are some extremely vulnerable children and young people who have parents who will probably not respond to any kind of messaging which is distressing and disturbing and something which we have to think about often as a society. Those children sometimes are in schools and so I have also made a number of recommendations about how we can support schools in terms of thinking about e-safety. We know that boys are particularly vulnerable if they have no permanent and strong male role model in their lives. Often these boys do make attachments to male role models in schools so it is also about looking at how schools and staff within schools can support these children. What is really interesting is on-line there are also ways in which vulnerable children can be supported and detected and, for me, it is also thinking about that and resourcing that as much as we can.

Q339 Rosemary McKenna: I also think it is quite important that we do not over-protect children and there is an issue about that anyway in the non-virtual world, if you like, with children being escorted to and from school, et cetera, all the kind of things that I think tend to not prepare children to face risk or to understand risk. Can we do that, can we manage that balance?

Dr Byron: I am really glad you have made that point because I have made it several times in the report and I make it often when I write in my column for The Times. I have real concerns generally that we are living in a risk-averse culture and we have a zero-risk approach to parenting. It is very difficult to say that without people thinking that what I am implying is that children should just be left to roam free and face all the dangers of the world. That is not what I am saying but, ironically, what we did find from the focus groups and from talking to children - and you know that most of the respondents to my report have been children; more than all industry, all adults and everybody, it was kids who were writing to me, blogging with me and so on - is that as we become more anxious about the off-line world, the real world, we are restricting our children's access to playing outside and all the things that we all did as kids. Because of the developmental imperative to take risks and to socialise - this is how you develop; you take risks and you socialise, it is what development is about - and because they are inside the home, children are going on-line and doing exactly the same things, and I would argue they are almost more at risk because they are doing exactly what they want to do, socialising and taking risks, but they are doing it in a space where the grown-ups under whose care and control they are have no real understanding or idea. For me the irony is that as we try to protect our children in the off-line world in an overprotective way we might be pushing them into risk in the on-line world without them really knowing that.

Q340 Adam Price: Looking at risk in the on-line world, I think your categorisation of content, contact and conduct is a very useful way of being a bit more specific because I think that is part of how we avoid a risk-averse culture by being clear about the nature and scale of the risk. Could you just say a little bit about the principal risks that you see to child development? We understand those associated with inappropriate contact but could you say a little bit about what the risks are related to exposure to inappropriate content to children on-line?

Dr Byron: I suppose a useful way of thinking about this is to think about the development of the frontal cortex, if I could have that conversation with you. This is quite a useful way of thinking in how we can help children manage risk anyway, whether it is on-line or off-line. We need to think about the ability of the individual child at a neural level. Neural networks in the brain are still developing throughout childhood into early adulthood. In the frontal cortex of the brain the neural networks are very underdeveloped at birth and they take time to develop. It is not really until adolescence that the neural networks at the front of the brain are linking back to other bits of the brain. This part of the brain is very involved in the kinds of skills that children, young people and adults need in order to evaluate and manage risk. Those would be things like critically evaluating what it is I am seeing, having an ability to differentiate between fantasy and reality, and being able to manage my own impulsive urge to behave in a particular way, being able to regulate my emotion in terms of how it is going to impact on my behaviour. Clearly with very young children we see that we have to have a very high level of supervision and monitoring in the off-line world, and if you do not mind me saying, we have locks on doors so kids cannot just run out of houses, when they get older they play in gardens but there are gates, and so on and so on. For me it is about thinking about the age of the child in terms of developmental profiles and thinking about what is that child competent to be able to manage. 
For particular kinds of content I am stating very clearly, not just in terms of the Internet but in terms of video games, that adult content is not appropriate for young children, and we need to be very clear about that and we need to have the courage to say no to our children, to set up filters, to have a clear video game classification system where we are clearly saying: "You do not buy that for your child because what they are going to be experiencing, they may not be able to think about in a way that is helpful for them."

Q341 Adam Price: I think in the report you went into some detail looking at the research to date in this area and some people might be surprised that the research at this stage is mixed and possibly even equivocal that there does not seem to be substantial evidence in relation to the different categories of harmful content. Were you surprised by that? Is that a fair assessment of where we are at in terms of our understanding based on the research?

Dr Byron: To be frank, initially I was quite disappointed because when I first started this review, what was really clear from the outset is the polarisation of debate and the emotion and the arguments: "Yes it does", "No it does not", freedom of speech/censorship, and the way that I suppose new media has always been met in society, and there is moral panic. There are real things to be afraid of, absolutely, and I have written about them, but also the kind of fear that comes beyond that, which is the fear of the unknown. For me as a social scientist I thought let us cut through that and go to the research, and then I found that the polarisation of debate exists in almost the same way in the research. It depends who you are talking to. There are some researchers who will say, "I can show you unequivocal evidence of harm," but then there are others who can say, "Yes, but you know what, look at your methods, they are wrong," and when you look at the video games research, for example, where you see a huge polarisation of debate, those researchers who claim violent video games make you violent, that is lab-based research that shows short-term effects which many people would say you cannot then generalise into the long term. So for me it was about stepping away from that and, rather than just arguing and table-banging and continuing this polarisation of debate while children are still merrily running along there and doing what they are doing, let us not get too hooked into looking for a simple causal model and let us look at one that talks about probability of risk; and in locating probability of risk you locate it within the child and you say let us think less about what the technology does to the child and more about what the child brings to the technology and how that is going to mediate benefit or risk for that child.

Q342 Paul Farrelly: We are all parents or grandparents and if we waited for the child psychologists and scientists to prove causality as a human race we never would have adopted a moral code in the first place, would we, so there are boundaries that we as parents set without waiting for causality because we have taken a judgment, right or wrong, that something is inappropriate or wrong. Should we not have the same approach to the Internet?

Dr Byron: I think we should and that is why I think most of my recommendations are very much about empowering the end consumer. For me it is a question of who do we want to make that decision. Do we want government to make that decision at a network level of blocking? Can I just be clear here, I am not talking about illegal material, I hope you are clear that I have very strong views about illegal material and I am very clear that that needs to be thought about very carefully and we should not be unequivocal about that at all, but harmful and inappropriate; what does that mean? One person's harm is another person's offence is another person's distress for their child is another person's, "I want my child to see this while they are with me because it is empowering for them." For example in the pro-ana debate there are parents who have said, "I find it useful to go to these websites to sit down with my daughter and say, 'Look at this, this is where it could go.'" For me it is about having the ability to think about all of those views and embrace them in a way that empowers each individual to make the right decision for their child with all the caveats, as we have said earlier, for vulnerable children. I agree with what you are saying, that you do not want people like me saying, "Here is the evidence; now do it." It is about commonsense, it is about judgment, it is about good parenting. Given this technology digital divide, it is also about helping people feel competent and confident to make decisions about their children. As a parent you would know and as a parent I would know that when your kids know more than you about something it makes for an uncomfortable parenting dynamic, so for me it is also about actually saying to parents have the courage to say, "I do not really understand this but I just do not like it and I think we need to talk about it."
Also, if I could say one further thing, because your question was very useful: in terms of the research I think we always need to have a research base and an evidence base to look back to, particularly if we are thinking about policy, and with the UK Council for Child Internet Safety I have made a very clear recommendation that research needs to be tied into the thinking of that Council. But we also need to accept that, from a methodological point of view, actually doing the kind of experiment we would need to do to truly prove what harms would happen to which children through these new digital technologies would be completely unethical and we could not do it. So I think we need to also, as you say, bite down a bit and take a commonsense approach and empower the end user to really use all the tools available: to challenge industry to step up, to sign up to codes, to self-certify against safety principles, and to be independently evaluated and monitored, which then gives parents and consumers the opportunity to make an informed decision on behalf of their children.

Q343 Mr Evans: You sat patiently through the last evidence session. It would be remiss of us not to ask you: do you think that YouTube and Google should be doing more to protect youngsters?

Dr Byron: It is remiss of you to ask me this. I sat patiently through that and I got quite terrified towards the end, I have to say, waiting for my go! I know Kent Walker and I know Patricia Mole and I worked with them and their teams very much during this review, and I have found them, as much as all the stakeholders who have engaged very positively with me during this review, to be people who have genuine and real concern and care around these issues. I agree there are commercial imperatives and so on, but I tend to take a less sceptical view because I also think otherwise we do go down this road of polarisation of debate where it becomes an argument that does not actually get us anywhere. In terms of me making evaluations of different companies or different sites, forgive me, I am not going to give you that opinion because that is outside the remit of my review and I actually think that it would muddy the recommendations I made if I start now giving you a breakdown of who I think does what better. I think a lot of them do some things well. There is a lot more that can be done and I have set a real challenge and I am really pleased to see how much this has been welcomed by industry. For me it is about now everybody doing what they are saying they are going to do.

Q344 Mr Evans: And if they do not, there should be a time - because we have done this before - for whistle-blowing and exposing the companies?

Dr Byron: But also the Government, if you would forgive me for saying that since you are they. There has been a great fanfare and a big jump up and down, and on Thursday the Prime Minister and I were on GMTV - that was an interesting moment in my life - but that was great and he was very supportive, and the two Secretaries of State were really supportive, and that is good and this is a positive thing for Government, it is a positive thing for industry, it makes everybody look good because everybody is saying, "Yes, we are going to fly the flag for child safety"; so do it. I have put time lines in and I will decide whether to come back and be the person to say yes you have or no you have not.

Q345 Mr Evans: Even at the risk of straining relationships between parents and youngsters, you would say to parents that that computer should come out of the bedroom and it should be in a communal place?

Dr Byron: I think that is a very good first start, particularly with younger children, in the same way as you would not let your younger children have any other experience without you being around, to 'shoulder surf' is the new term, to have a look and just check. With older kids it gets more complicated, and I have said from my own experience with my daughter there are times when she is on-line and she is doing what she is doing and that is because I know her and I trust her. That is how we parent our children in the off-line world. We increase their independence. We have to accept that for children to be able to manage the world they need to understand risk. If we take an over-protectionist view we have very vulnerable children who are vulnerable to exploitation in both the on-line and the off-line worlds, so for me it is a balance in terms of how we actually parent children and we have a combination of supervision, monitoring, filtering, discussion, boundaries, timers, and all the other things that I have talked about in my report.

Q346 Mr Evans: It is really complicated when you are talking about the causation of people playing violent video games and then going out and being violent. Should parents really be taking more proactive decisions and indeed learning a little bit more about the Internet and video gaming as well in order to properly supervise or indeed censor what their youngsters are playing?

Dr Byron: We do it for films, we do it for pornography that is on the top shelf. We do not allow or encourage our children to read it or buy the magazine for them. Interestingly, I was buying a video game with my children on the weekend and there were three little boys trying to buy a very well-known and popular video game in the shop and the retailer said no, and then they brought granddad along and he said, "Oh, it's a game, it's fine," and I sort of twitched and my daughter said to me, "Mum, you really aren't supposed to be doing it in shops. It is okay through your report," but I think that is the case. The word 'game' in itself often gives people a false sense. Do we want to get hung up on the 'video games turn people into psychopaths' argument again when we get into the polarisation of debate? My view is that vulnerable people who do dangerous things, it is usually a combination of factors that push them from that underlying predisposition towards violence into violence. It may be what they experience in terms of content in a variety of ways as well as what they experience in the home, how violence is validated and normalised in their early childhoods, and so on. Having said all that though, children should not be playing games that are designed for adults. The video games industry are not designing these games for children. I have recommended a much more robust classification system at the consumer-facing element side and I hope that will help people really understand about games. There are some brilliant games for kids which comprise 50 per cent of the market; I play them with my kids, and that is what parents should be spending their money on.

Q347 Chairman: Can we come on to your main recommendation for a UK Council for Child Internet Safety. It seems to be a development of the Home Secretary's Task Force. Can you tell us what the difference will be between the Council and the Task Force?

Dr Byron: In my overall evaluation of the Home Secretary's Task Force I felt very positive about it. I think it is a model of good practice and I think that is where I feel that the UK does take a responsible lead in thinking about these issues and can continue to do so. The problem with the Home Secretary's Task Force was to do with how it was set up, how it was resourced, who was part of it and also the length of time it took to produce really lengthy documents, which were guidance documents for good practice which in and of themselves are really good documents, but I could not see what they actually meant in terms of the end user, the consumer, whether that is a parent making a decision on behalf of their child or for us to effectively monitor and evaluate that certain companies are doing what they say they want to do in terms of good practice. For me it was building on the relationships, building on the goodwill, building on the fact that we had big multi-national companies based outside the UK who were sitting and thinking together about issues that were UK-centric at the time, and also bearing in mind it was set up in the Home Office because it was set up originally to look at illegal content, particularly child abuse images, and there has been some very successful work that has come out of that. I felt it needed to move on, I felt it needed to be properly resourced, I felt that there needed to be much more of a cross-governmental feel to it because what I also found was that industry were very fatigued at the many different government departments all of whom have a different role to play in this arena and who were having several sometimes contradictory conversations with them, so for me it was about having one government departmental voice if you like with all departments working together. 
I identified the Home Office as one chair because I believe there is a lot of work still to be done on illegal content, and DCSF as the second because of the issues particularly around education and children and families. I have recommended a parent and child advisory panel. Children have really helped shape this review, their voice is very important, so we need parents and children telling us what they think. We need child development experts but we also need technology experts that can challenge companies to think outside the box sometimes. I do think, as you were discussing previously, that technology is moving very much towards better ways of dealing with these issues. We need to have two very clear arms, we need one around thinking about policy and thinking about systems, but we also need one around delivery, and I recommend that we need an independent person to drive the delivery arm so that we do not spend a lot of time talking about it but not actually delivering.

Q348 Chairman: As you know, the Committee recently visited CEOP and we met one of the children who sit on the advisory panel. CEOP is doing a lot of the things which you have called for, not just to seek out and prosecute potential offenders but also in the field of education. Is there a danger that by creating this body you are going to undermine or diminish the work that CEOP is doing?

Dr Byron: It is not my intention to undermine or diminish the work that anybody is doing. There are a lot of organisations that are doing a lot of good things and for me it was about a co-ordinated and strategic approach that was driven by a national strategy. I absolutely echo and agree with your comments about CEOP, and CEOP would have a very key role to play in the Council definitely around illegal content and definitely in the conversation around enforcement. When it comes to education of children, CEOP do a great job, they are going into schools a lot and I know they have their child panel. In fact, we borrowed their child panel for our purposes as well in the review, but there are many others as well and for me it is about pulling together all the information out there in a way that is accessible. I found that about 57-60% of parents said to me, "The problem is we know there is a lot of information out there but we do not know where to go to find it," so people are almost overwhelmed by it. Also what one person finds useful somebody else might not, so it is also about writing and producing and supplying information in a way that meets the needs of everybody who is looking to have their questions answered.

Q349 Rosemary McKenna: Is it right to interpret the review as proposing a move away from self-regulation according to good practice guidelines and towards a more co-regulatory approach, with more accountability to bodies such as the Council or Ofcom?

Dr Byron: I spent an awful lot of time asking members of my team and members in various departments to help me understand the distinction between these models, and I have to say I think it becomes rather semantic after a while, if you will forgive me. For me it is about self-regulation. It is about industry setting out very clear safety codes and general good practice principles that can then be independently monitored and evaluated. Some people might interpret that as co-regulation; some people might interpret that as self-regulation. I have talked about it from a self-regulatory standpoint in this report.

Q350 Rosemary McKenna: And therefore you would not be in favour of a more naming and shaming culture for those businesses and firms who are not complying or do not match the same standards as others?

Dr Byron: I think to some degree this is sort of implicit in what I am recommending in the fact that here we have a UK Council that reports yearly to the Prime Minister. Transparency for me is the key, that is the issue, that consumers really understand what these codes are and that these are monitored and that they understand who is adhering to those codes and who is not, and for me implicit in that then would be a public acknowledgement of good practice. We have to balance this now and move away from a blame, name and shame; there is a lot of good practice out there that needs to be acknowledged and industry needs to be supported as well as challenged, and I think for me it is about finding a balanced way of doing that that is proportionate to the risk that we are looking at.

Q351 Chairman: A lot of the evidence that we have received from the industry has been firmly in support of the self-regulatory structure, and you have praised that, but you are also suggesting that the Council is going to be established to set industry standards, to monitor compliance and to publicly censure breaches. That is not self-regulation; that is co-regulation, and I think it is not just semantics, there is a difference between the two, and I think some might feel that a move towards a co-regulatory system is not necessarily going to help. How would you respond?

Dr Byron: Having spent many years working in many different areas of child protection, just to talk about my off-line world experience, I have worked with many dedicated staff teams who have worked extremely hard to protect the most vulnerable children, but a good staff team is only as good as how they are evaluated, monitored and audited. My background in health has been around audit. It is about showing good practice and thinking about one's practice in order to push on practice and push the standards on, and so for me the self-regulatory bit of what I have recommended is around the codes and about the good practice guidelines and the safety principles so that different companies can also show how they measure up to these codes in different ways. We talk about user-generated content. That is a massive area, so how do different companies measure up to codes in different ways that can be evaluated so that the end consumer can then make an informed choice about what they wish to engage with or their children to engage with?

Q352 Chairman: But under your structure the Council stands behind the industry bodies and will express views as to the adequacy of what they are doing?

Dr Byron: The industry bodies are also part of the Council, so the industry bodies are there. For me this is about collaborative, joined-up working, this is about thinking together, this is about sitting and thinking through some of these very difficult conversations in a way that moves away from the polarisation of debate and the entrenched positioning that often seems to occur when these conversations are had. To me it seems almost a nonsense to have codes if those codes are not independently monitored. That needs to be thought about alongside the development of codes.

Q353 Paul Farrelly: Unless you give bodies some bite, the danger in setting up a quango is that it succumbs to what you might call 'industry capture'. Let me give you an example. The City regulators pleaded with government that naming and shaming on pensions mis-selling would do nothing, it would not incentivise and that the threat had more leverage over the industry. That was absolute nonsense. It was only after 1997 that the new minister, Helen Liddell, who is now in Australia, said, "I am going to name and shame companies that mis-sold," that industry did take it seriously and got its act together, so I am not getting any sense of what bite your Council might have?

Dr Byron: I do not think any company would want to tarnish its brand by not signing up to good practice codes or to ensure that when they have an independent evaluation of their practice that they can show that they are measuring up to those codes. I think the brand damage, apart from anything else, would be huge, and for me that is where I think the bite is. I think also the fact it is set up by and reports to the Prime Minister and it has a very public face to it and the fact that companies work together with a number of government departments so there is a sense of joined-upness in the conversations, because what I have found - and this is what has taken me the most time to get through - is the competing needs of different departments that are putting pressure on industry in different ways and there is no strategy or clear strategic way of moving forward in terms of thinking about child safety. I do believe that brand damage would be huge and brand reputation is very much for these companies what they want and not to be seen to be actually part of this, if one was to take the more sceptical view, I think would be very damaging for these brands.

Q354 Paul Farrelly: As you have heard, we visited CEOP, and they submitted evidence to your inquiry, and at CEOP they feel very strongly that the balance is far too skewed towards self-regulation. We have just heard evidence from a major multi-national that they do not employ a single person to proactively filter out child pornography, and an as yet unspecified number of people checking flagged content were so overloaded they missed a gang rape, so you can understand CEOP's view in terms of the balance between who monitors and the industry co-operating on takedowns for inappropriate content, and you cannot just say, "They have got that view because they're coppers, aren't they?" In terms of their evidence to you, are there elements of their call for the balance to be shifted that you have ignored or have you accepted CEOP's evidence in its entirety in your findings?

Dr Byron: As far as the illegal content goes, I have absolutely no argument with anything that CEOP says. I think that what they do, they do extraordinarily well, and it is not just in terms of the cases but it is in terms of the behavioural monitoring - as you know, you have been there - and the way that they actually do profiling around predators on-line. CEOP in terms of the bigger picture are part of a number of different organisations, including industry themselves, who have a role to play. I believe that is a role that needs to be collaborative. As I made the point earlier, when it comes to content that is not illegal, that is harmful or inappropriate, which is very much a subjective rating around content, for me it is not about a single organisation or a single government making the decision on behalf of society which is accessing that content as to what they can and cannot see or should or should not see. For me the subjective value of making a decision should be supported. CEOP have an incredibly important role in the Council. They are respected by many people and their position, I think, is a very useful challenge to the Council, to the Government and to industry and I think for them to sit amongst others and to work together in the way that I propose is the most useful way forward.

Q355 Paul Farrelly: Just finally, Chairman, we do not want to over-egg the risks and we do not want to mollycoddle children, but we heard at CEOP about the difference between what publicity they are able to put out on that still-watched medium television and how Australia treats it rather differently, and that they do not have a budget really within the departmental priorities to put out factual stuff such as we see every day on keeping people to driving at 20 miles per hour rather than 30 miles per hour and the risks to children. How would your Council change that? Do you think it would be a force for shifting that sort of publicity up the departmental priorities? Again, what clout is your Council going to have in that respect?

Dr Byron: As you know, the major area of recommendation for me was what is called media literacy (although I think that is a really confusing term for many people) and an education campaign around e-safety, and all the issues to do with risk and risk management, and all the things that you have highlighted, that is targeted across society, but also in specific places targeted to certain groups or individuals based on vulnerability as is assessed and understood and also very much through schools. For me that has been something that I have said very clearly needs to be resourced and I believe that commitment was made last week by government, so in terms of empowering the end consumer, there are tools, there are filters, there are products, and we need to push the standards in the development and we need to put them in front of people and we need to make them easy to understand. Fundamentally, it does get back to this issue that one of your colleagues raised earlier about confidence and competence and self-efficacy around using the technology. For me CEOP definitely has a role in that in terms of the education that they do, but there is a bigger education question which goes beyond risk and it is about what is this technology, what does it do and how can I understand it in order to help my child understand it.

Q356 Chairman: Can I turn to the question of the criminal law and legislation. You have recommended that the Council should look to whether there should be some clarification of the law in particular areas. Can you just tell us where you think there is a case perhaps for introducing further legislation?

Dr Byron: The tragic cases that were going on in Bridgend during the course of this review obviously brought the whole issue around suicide and suicide websites very much to the forefront of everyone's minds, and so alongside my call for evidence there was a huge amount of evidence and suggestions and concern coming into me for that. Obviously as you are aware as a Committee, having sat through so many hours of discussion, there are so many issues to think about. For me it was about sticking to my remit but also when there were particular areas of real concern listening to those and thinking about those. I did decide to write specifically about that in the report. I understand that the Law Commission review in 2006 talks about assisting someone to commit suicide as being a crime both on-line and off-line. I think we need to be really clear about that and we need to identify where that might be happening and then we need to take steps to enforce the law around those areas on-line as we would also off-line.

Q357 Chairman: Can I just test you on two other areas which have been suggested to us. Jim Gamble specifically highlighted Second Life and the use of avatars to simulate child sex and he suggested to us that that should be covered by the law. Would you agree that is another area we need to look at?

Dr Byron: I think it is an area that needs to be looked at but it is an area that needs to be researched and evaluated. I think there is less clarity around that than around assisted suicide, where the Law Commission gave quite a clear steer on the issue both on and off-line. I think that needs to be thought about and that needs to be evaluated and that might be a priority for the Council in terms of the national strategy that they set out by early next spring.

Q358 Paul Farrelly: That is sitting on the fence really, is it not?

Dr Byron: I think it is sitting on the fence because I do not think we know enough about that yet. I come from a background of social science which is looking at risk, looking at concern but taking a balanced and proportionate view. What that means is understanding that there are many, many things that make people feel very anxious but having a very clear understanding and a basis for action before one acts.

Q359 Paul Farrelly: Surely it is illegal to show this in real life? I think the Government was reviewing whether it might be illegal to show it in manga comics and there is a similar issue with respect to the Internet. It should be illegal, should it not?

Dr Byron: If there is simulation around child sex and that simulation is deemed to be illegal, then absolutely it should be dealt with appropriately and expediently.

Q360 Chairman: Also the existing system regarding video game classification, would you like statutorily enforced classifications below 18 so it would be illegal to sell a 15-rated game to a 12-year-old?

Dr Byron: And down to 12. My recommendation is to take the statutory legislation down to the age of 12. Do you want to know why?

Q361 Chairman: Yes.

Dr Byron: The reason why is that in looking at the content I looked very carefully at how game content changes over the different ages and at 12 you begin to get more realistic violence and some sexual innuendo. Listening to the voice of the parents, which was very important in shaping the review, and also looking at the research done by the other organisations around this, and also looking at the Ofcom research which looked at the voice of parents generally in terms of the Internet and the on-line space, for me it was very clear that when it gets to content at that level, people are more concerned to really understand not just content but context. How is this violence being played out here? Is it part of a news-type game, is it something that has a historical basis to it, or is it just violence? For me it felt very important that in line with DVDs and films, the statutory classification should be from 12 and then 15 and 18.

Q362 Alan Keen: From the law enforcement point of view, Jim Gamble, when he came he was absolutely firm that if it is illegal in real life it should be illegal on-line in every sense of that. Would you like to expand on that and say whether you agree with that?

Dr Byron: I think that is something that needs to be discussed. I think there are very clear examples already, for example child abuse images. I think that the on-line space can change things and can make things more difficult, more risky. I think again for me you may say it is sitting on the fence; I would say it is being a social scientist, but you need to take a proportionate view of things, and I think to say that everything that is illegal off-line is therefore illegal on-line may be the case but we need to set a very clear strategy and we need to do it now. This needs to be one of the first priorities of the Council, to go through that point-by-point and to be very clear so that we then take appropriate enforcement, because in terms of resourcing as well, we want to make sure that we resource enforcement to act in the appropriate way, and so therefore we need to be very clear about what our priorities are.

Q363 Alan Keen: Where does your opinion fall, taking a very simple thing, you said there is not enough research on this really to make proper judgments, but a child of 12 who continually plays a game where people are shot and blood runs out and their arms and legs are chopped off; do you think that is harmful or would it have no effect on the vast majority of kids and it would just be the odd few who would be affected by other things in real life? Is that how you see it?

Dr Byron: The game that you have described would not be rated for a 12-year-old, it would be rated for possibly a 15-year-old and probably an 18-year-old.

Q364 Alan Keen: A 15-year-old then?

Dr Byron: The reason I have made very clear recommendations about the statutory end of the classification system at 12, 15 and 18 is because I believe that that kind of content would be inappropriate for people under the age that it is rated for.

Q365 Alan Keen: Is there not a danger of a 15-year-old playing games which are extremely violent, and the press in their headline way of reporting stuff say that is dreadful, that child is going to become so used to it that he will do it in real life because he thinks shooting people or using violence is quite acceptable. What are your views on that? I know it is over-simplistic.

Dr Byron: In terms of the risk, I spent a long time in my report going through the research evidence for video games and there are two camps. There are those who look at games in terms of laboratory-based experiments which show short-term effects of playing violent games on the people playing them, although there are questions around playing games in a laboratory as part of an experiment and whether that is equivalent to playing in real life. There are questions around the measures of aggression after the experiment as to whether they are actually adequate in terms of hypothesising that that is aggression. Certainly there are huge concerns that one cannot extrapolate from a laboratory to real life and generalise from short-term effects and make statements about long-term effects. For me it is more an active participant approach in terms of the research, which would include research that was more ethnographic rather than laboratory-based, so therefore we are looking at qualitative and quantitative research that really evaluates the child, the individual who is accessing the technology or playing the game, rather than just looking at the game itself. I knew this before I came to the review because I have worked with vulnerable children for most of my career, but there are some children who have an underlying predisposition towards violent behaviour, either directed towards themselves or others. Those children may be influenced by a variety of factors in their lives, factors that existed before the advent of video games, but video games themselves may be part of a constellation of factors that would then impact on that child's behaviour to the point that they may do something tragic either to themselves or others. 
If we continue down the road of trying to isolate single causal factors when we try and understand very complex questions such as child brutality, I think we really lose the perspective and we lose the ability to make holistic and important interventions in terms of managing risk for vulnerable children.

Q366 Alan Keen: What would you like us to put in this report about further research that the Government should make sure is carried out? What would you like us to say? Are you satisfied that research is going to be on-going?

Dr Byron: I hope it is going to be on-going. As you know, I made the recommendation as part of my UK Council. When I had two Secretaries of State and the Prime Minister saying yes, I kind of presumed they were saying yes to everything, which included the research part of it. Maybe I will come back and see! I have very clearly stated that currently, as the research stands, the research in itself is not adequate to base policy around, but it is very important for us to think about it in terms of making proportionate decisions and responses to the probability of risk. Based on your previous question, I think definitely research around vulnerable children and young people, identifying who those children and young people are and thinking about appropriate ways of managing their vulnerabilities and supporting them is very important, research around the areas of illegality, John, what you were talking about earlier in terms of the avatars and so on is very important. Research helps calm the anxieties that polarise the debate and so it needs to be prioritised around the objectives that are set out by the UK Council for Child Internet Safety at their first summit in a year and a half's time.

Q367 Adam Price: You set out in the report the different approaches being taken by the companies that host content to minimise access to harmful content, everything from software that scans content for key words, through to YouTube's approach which is to allow users to flag potential ---

Dr Byron: Notice and takedown?

Q368 Adam Price: Yes indeed, and then finally to companies like AOL who proactively employ their own moderators to identify harmful content which they then take down themselves. Do you think that all these approaches are equally valid in different contexts or is there a gold standard and should more of the major companies be adopting that kind of proactive approach to identifying harmful material that is out there on their sites?

Dr Byron: That is a good question. In an ideal world, a combination of all of those things would be a perfect solution. There are some really great social networking sites for younger children, Club Penguin is an example of that, and there moderation is really, really important. When you have very young children it is important that there are human moderators who are actually present in that on-line space with the child, for obvious reasons that I do not need to spell out. Also talking to the BBC and looking at how they moderate their CBeebies website, I think it is a brilliant standard of moderation. Certainly when you get to older kids and there is chat there need to be very clear buttons where abuse can be reported or concerns can be reported and very clear strategies around how that is managed. Certainly a parental filter will filter content wherever your child goes on-line, so if your child is going on to YouTube but they are eight and you have very clear parental filters in place, then that will filter the content that your child experiences there. For me it is about setting it out in the way that you have just set it out - and thank you for reading my report in such detail. I do not know if you know but I wrote a report for children as well so children got the more sane version of it, to be honest, we cut out all the stuff we had to wade through before we got to what we had to say. For me it is about being really clear about the options, making sure, for example, with parental filters and also with filters on gaming consoles that they are easy to set up, they are clearly understandable and so on, and putting them in front of parents. I have recommended that when people take up a broadband connection with a new ISP they are offered bundles of filters as part of the package and that for all new computers it is there on the bundle with all the other bits of bundled stuff that you get in terms of virus protector and so on when you buy your PC. 
It is a combination of all those things that I think will help children be safe and take risks appropriate for their age and stage of development.

Q369 Adam Price: On the argument against companies which host Internet content acting themselves as proactive gatekeepers, we have heard again this morning from Google that it is like asking a telephone company to screen the content of calls. Do you think that is a fair analogy against the idea of them having a team of people themselves actively reviewing the content to see if there is anything there which could potentially be harmful?

Dr Byron: I have written a bit about the e-Commerce Directive in this report. I do not know if it would be helpful to talk about it here. It took me an awful long time to understand it. I hope you understand it when you read what I have written. For me the e-Commerce Directive is a very interesting point here because I think it is something that companies may use in these discussions. In terms of understanding how it works, there are two ways of understanding it. In terms of the ISP as far as the e-Commerce Directive goes, they are literally the pipes that the content gets piped down, so the e-Commerce Directive would talk about their role as having a mere conduit nature, so they are merely the conduit for content, and as a conduit there is no expectation of them to be opening the packages of content as it goes down the pipe and making a decision about whether that is appropriate or not for delivery. I suppose in the same way as we do not expect the Royal Mail to open every letter and decide, that is the 'mere conduit' argument. As far as the content hosts go, the issue around liability is that once they step into an area where they start to make decisions about what is appropriate or inappropriate (and again I am talking about the subjective bit, I am not talking about the illegal bit where it is pretty clear, I am talking about where we make a decision that we do not like this so it is going) they then set themselves up in terms of liability: if you did this bit, why did you not do this bit and why did you not do this bit? So the notice and takedown is a way that that can be managed as well. I think that businesses take risks also as part of being businesses and sometimes businesses might want to stick their head above the parapet and say, "Yes, there is the e-Commerce Directive but, do you know what, we do not like this, so we are proactively going to take that down." That is me speaking entirely personally. I think businesses take risks as part of business. 
As far as the e-Commerce Directive stands at the moment that is how these things have come about. Notice and takedown in itself can be an effective system and certainly when it is built into a reputational management system so your community police force actually has people within it who the community understands as being reliable informants of where content breaks policies of acceptable use, that is very useful because their flags when they say they do not like something will be escalated up the system and will be looked at first, and a number of companies do that in a very reliable way. These are issues the Council really needs to think about, so if we do have a system which is notice and takedown then all the questions that you asked earlier need to be answered in terms of time, there needs to be transparency, there needs to be accountability, and it needs to be evaluated.

Q370 Paul Farrelly: In terms of the e-Commerce Directive in the area of libel there is bite because companies can be sued irrespective of whether they are a pipeline or not if they do not go through certain steps of giving notice and then take down. The question for this Committee is whether the criminal law or an almost mandatory approach adopted through your Council should give bite in terms of liability for stuff that is hosted or posted on the Internet, so what would your recommendation be? You have said the issues need to be looked at but what would your recommendation be?

Dr Byron: You are going to tell me that I am sitting on the fence again here, Mr Farrelly ---

Q371 Paul Farrelly: No I am not.

Dr Byron: Oh yes you are! My recommendation would be this: the e-Commerce Directive as it stands works but it works in a way that I think industry should be encouraged not to hide behind it. They should be able to think proactively about their own liability in relation to content. I think these are complex and difficult conversations. I am not recommending that the e-Commerce Directive be changed in itself but it might be as these conversations continue that that needs to be thought about. I will be honest with you in terms of the breadth of the remit of review, this is not something that I have thought about in any depth and I could not give you a specific recommendation at this stage.

Q372 Paul Farrelly: I would not want to discourage witnesses coming to committees like this, whether by accusing them of fence-sitting or more serious things.

Dr Byron: You are terrifying, you are!

Adam Price: Speak to his wife!

Q373 Paul Farrelly: Speak to my children! Now I have lost my whole train of thought. We have just heard a pretty hands-off liberal model from Google. I can Google and get almost any fact I want by using Google but were you surprised that the previous witness could not actually answer the simple question of how many people might be involved in Google and YouTube in taking down flagged materials? Was that a surprise to you?

Dr Byron: I am not going to make specific comments about other witnesses or what they have said to you because that is outside my remit and I am here to answer questions about the review, so I say that to you with respect, Mr Farrelly. I think that we need to be careful that we do not become so polarised in the debate again that we lose the focus. I think there are difficulties. These are very new technologies. Who knew two years ago that social networking was even going to be a phenomenon? I am not defending mistakes that are made. I am not condemning them either. I am saying I think we need to move our thinking on in a way that is collaborative and proactive and fundamentally focuses on the needs of children and young people.

Q374 Paul Farrelly: Generalising from that specific instance, so we are not naming and shaming any particular organisation, is that sort of model a bit too hands-off for you?

Dr Byron: The notice and takedown model?

Q375 Paul Farrelly: Not having anything proactive and having staff so overloaded that they miss gang rape videos, in the general case?

Dr Byron: There will be ways of looking at things proactively in terms of scans, in terms of flesh tones, as was being said earlier by the last witness, in terms of things like pornographic content. The difficulty with that, and the way that, as I understand it, the technology is advancing, is that when you are looking at flesh tones, you might be looking for flesh tones in terms of seeing sexual content but you may get 15 million pictures of people in their bikinis on the beach, so the technology is advancing in terms of being very strategic, in terms of pictures but also in terms of text. You made the point - and can I call you John because I sort of know you - you were talking about tagging and you were talking about sophisticated research around names. I think these are all things that need to be thought about, but it needs to be pushed up the agenda for companies so that these technologies are technologies that they are really investing in, because child safety is a priority because we say it should be.

Q376 Rosemary McKenna: Just very quickly on a couple of specific issues, you reject an expansion of network level blocking such as that used by major ISPs to block access to sites identified by the Internet Watch Foundation, but the Internet Watch Foundation say that it is working effectively. Why do you not want it to be developed?

Dr Byron: Maybe I need to be a bit clearer. I do not reject it in terms of illegal content, as we have been talking about earlier. If content is identified as being illegal and it is found to be illegal, then we need to think about how we tell the ISPs about that so they can take it down, but I reject the notion that ISPs should be looking for that content. We need to define, as we have with child abuse images, what is illegal and then notify them. Generally I am not recommending, in terms of harmful and inappropriate material, which again is a subjective decision, that the networks make that decision and then block it at that level.

Q377 Rosemary McKenna: You also recommend that parental control software on new computers should not be automatically on there but should be an option for the parents?

Dr Byron: I think it should be there and I think it should be in front of people and I think it should be there when you switch on. What I am saying is it should not be set at the highest level by default. The reason for that is I thought about this really, really carefully because obviously if you are very concerned about child safety, it would be an obvious recommendation to say put the filtering software on and have it set at the highest level, but in the focus groups, and very much in the call for evidence, what parents were saying is that they were finding filtering software so restrictive that if it was set at the highest level they just switched it off, partly because families have lots of children of different ages, and once it is switched off they have not actually engaged with anything other than the thought of switching it off. For me it is there, it is in front of you as you switch on or your ISP provides it to you when you change to a new connection, and you are talked through setting it up. I talked about this notion of the 'tipping point', which is what Malcolm Gladwell talked about, which is if you put it in front of people and you enable them to engage with it enough, because they have to go through the process to set it up, you have moved them from a stage of pre-contemplation, which is not even thinking about the issue, to a stage of contemplation and then to a stage of action, so you have changed people's behaviour while also providing the tools to enable them to keep their children safe.

Rosemary McKenna: Thank you very much for a very, very commonsense approach to a very complex issue and it is very helpful to the Committee to have heard you this morning as well as having read the report.

Q378 Chairman: I am afraid Mr Farrelly wishes to test your patience one last time!

Dr Byron: Make it a nice one.

Q379 Paul Farrelly: This is a very easy, nice question. I want to congratulate you on the report. I do not think I have ever seen a report that has been so broadly welcomed, apart from one rogue editorial in the Guardian that I read, and I have never seen a Government immediately come out and say, "We are going to implement all the recommendations." If they treated political manifestoes like that, we would be in a different world. Flicking through this report, I have not seen a figure in here and good intentions wither on the vine unless they are matched by money, so what figure do you have in mind that your Council would cost to be effective?

Dr Byron: Can you help me answer this question really? I absolutely would not know. I have not thought about money. I am not a person who prepares budgets. I am a psychologist; I am not an accountant. What do you think, Mr Farrelly, should be the figure that I should be suggesting here?

Q380 Paul Farrelly: I have no idea. I will ask all the people who have welcomed the report but I thought you might have a figure in mind.

Dr Byron: Lots of money. Enough money to target effectively and enough money to be creative around how that targeting is done. This should be a financial priority in terms of resourcing a campaign to enable parents, teachers, adults, grandparents who do a lot of childcare as well, and schools to feel empowered to enable children to take responsibility for their own behaviour, and that will require proper resourcing.

Q381 Paul Farrelly: So warm welcomes are fine but the Government has got to put its money where its mouth is?

Dr Byron: Absolutely right.

Q382 Chairman: Can I thank you very much and can I on behalf of the Committee wish you a very enjoyable holiday, which is thoroughly deserved!

Dr Byron: Thank you very much.