Select Committee on Culture, Media and Sport Minutes of Evidence


Examination of Witnesses (Questions 360-379)

DR TANYA BYRON

1 APRIL 2008

  Q360  Chairman: Also, on the existing system of video game classification, would you like statutorily enforced classifications below 18, so that it would be illegal to sell a 15-rated game to a 12-year-old?

  Dr Byron: And down to 12. My recommendation is to take the statutory classification down to the age of 12. Do you want to know why?

  Q361  Chairman: Yes.

  Dr Byron: The reason is that in looking at the content I looked very carefully at how game content changes across the different ages, and at 12 you begin to get more realistic violence and some sexual innuendo. Listening to the voice of parents, which was very important in shaping the review, looking at the research done by other organisations around this, and looking at the Ofcom research on the views of parents generally about the Internet and the on-line space, it was very clear to me that when it gets to content at that level, people are more concerned to understand not just content but context. How is this violence being played out here? Is it part of a news-type game, is it something that has a historical basis to it, or is it just violence? For me it felt very important that, in line with DVDs and films, the statutory classification should be from 12 and then 15 and 18.

  Q362  Alan Keen: From the law enforcement point of view, Jim Gamble, when he came, was absolutely firm that if it is illegal in real life it should be illegal on-line, in every sense of that. Would you like to expand on that and say whether you agree with it?

  Dr Byron: I think that is something that needs to be discussed. There are already very clear examples, for example child abuse images. I think that the on-line space can change things and can make things more difficult, more risky. Again, you may say it is sitting on the fence; I would say it is being a social scientist, but you need to take a proportionate view of things. To say that everything that is illegal off-line is therefore illegal on-line may be the case, but we need to set a very clear strategy and we need to do it now. This needs to be one of the first priorities of the Council: to go through that point by point and to be very clear, so that we then take appropriate enforcement action. In terms of resourcing as well, we want to make sure that we resource enforcement to act in the appropriate way, and so we need to be very clear about what our priorities are.

  Q363  Alan Keen: Where does your opinion fall? To take a very simple example (you said there is not enough research on this really to make proper judgments), what about a child of 12 who continually plays a game where people are shot and blood runs out and their arms and legs are chopped off; do you think that is harmful, or would it have no effect on the vast majority of kids, with just the odd few affected, who would be affected by other things in real life anyway? Is that how you see it?

  Dr Byron: The game that you have described would not be rated for a 12-year-old, it would be rated for possibly a 15-year-old and probably an 18-year-old.

  Q364  Alan Keen: A 15-year-old then?

  Dr Byron: The reason I have made very clear recommendations about the statutory end of the classification system at 12, 15 and 18 is that I believe that kind of content would be inappropriate for people under the age it is rated for.

  Q365  Alan Keen: Is there not a danger, with a 15-year-old playing games which are extremely violent, and which the press in their headline way of reporting say are dreadful, that the child is going to become so used to it that he will do it in real life because he thinks shooting people or using violence is quite acceptable? What are your views on that? I know it is over-simplistic.

  Dr Byron: In terms of the risk, I spent a long time in my report going through the research evidence for video games, and there are two camps. There are those who look at games in terms of laboratory-based experiments which show short-term effects of playing violent games on the people playing them, although there are questions around whether playing games in a laboratory as part of an experiment is equivalent to playing in real life. There are questions around the measures of aggression after the experiment and whether they are actually adequate grounds for hypothesising that that is aggression. Certainly there are huge concerns that one cannot extrapolate from a laboratory to real life, generalise from short-term effects and make statements about long-term effects. For me it is more about an active participant approach to the research, which would include research that is ethnographic rather than laboratory-based, so that we are looking at qualitative and quantitative research that really evaluates the child, the individual who is accessing the technology or playing the game, rather than just looking at the game itself. I knew this before I came to the review because I have worked with vulnerable children for most of my career, but there are some children who have an underlying predisposition towards violent behaviour, either directed towards themselves or others. Those children may be influenced by a variety of factors in their lives, factors that existed before the advent of video games, but video games themselves may be part of a constellation of factors that then impact on that child's behaviour to the point that they may do something tragic either to themselves or others. If we continue down the road of trying to isolate single causal factors when we try to understand very complex questions such as child brutality, I think we really lose perspective and we lose the ability to make holistic and important interventions in terms of managing risk for vulnerable children.

  Q366  Alan Keen: What would you like us to put in this report about further research that the Government should make sure is carried out? What would you like us to say? Are you satisfied that research is going to be on-going?

  Dr Byron: I hope it is going to be on-going. As you know, I made the recommendation for research as part of my proposed UK Council. When I had two Secretaries of State and the Prime Minister saying yes, I kind of presumed they were saying yes to everything, which included the research part of it. Maybe I will come back and see! I have stated very clearly that the research as it currently stands is not in itself adequate to base policy on, but it is very important for us to think about it in terms of making proportionate decisions and responses to the probability of risk. Based on your previous question, research around vulnerable children and young people, identifying who those children and young people are and thinking about appropriate ways of managing their vulnerabilities and supporting them, is definitely very important; research around the areas of illegality, John, what you were talking about earlier in terms of the avatars and so on, is very important. Research helps calm the anxieties that polarise the debate, and so it needs to be prioritised around the objectives set out by the UK Council for Child Internet Safety at their first summit in a year and a half's time.

  Q367  Adam Price: You set out in the report the different approaches being taken by the companies that host content to minimise access to harmful content, everything from software that scans content for key words, through to YouTube's approach which is to allow users to flag potential—

  Dr Byron: Notice and takedown?

  Q368  Adam Price: Yes indeed, and then finally to companies like AOL, who proactively employ their own moderators to identify harmful content, which they then take down themselves. Do you think that all these approaches are equally valid in different contexts, or is there a gold standard, and should more of the major companies be adopting that kind of proactive approach to identifying harmful material on their sites?

  Dr Byron: That is a good question. In an ideal world, a combination of all of those things would be a perfect solution. There are some really great social networking sites for younger children, Club Penguin is an example of that, and there moderation is really, really important. When you have very young children it is important that there are human moderators who are actually present in that on-line space with the child, for obvious reasons that I do not need to spell out. Also, talking to the BBC and looking at how they moderate their CBeebies website, I think it is a brilliant standard of moderation. Certainly when you get to older kids and there is chat, there need to be very clear buttons where abuse or concerns can be reported, and very clear strategies around how that is managed. Certainly a parental filter will filter content wherever your child goes on-line, so if your child is going on to YouTube but they are eight and you have very clear parental filters in place, then that will filter the content that your child experiences there. For me it is about setting it out in the way that you have just set it out, and thank you for reading my report in such detail. I do not know if you know, but I wrote a report for children as well, so children got the more sane version of it; to be honest, we cut out all the stuff we had to wade through before we got to what we had to say. For me it is about being really clear about the options, making sure, for example, with parental filters and also with filters on gaming consoles, that they are easy to set up, that they are clearly understandable and so on, and putting them in front of parents. I have recommended that when people take up a broadband connection with a new ISP they are offered bundles of filters as part of the package, and that for all new computers it is there in the bundle with all the other bits of bundled software, such as virus protection, that you get when you buy your PC. It is a combination of all those things that I think will help children be safe and take risks appropriate for their age and stage of development.

  Q369  Adam Price: On the argument against companies which host internet content acting themselves as proactive gatekeepers, we have heard again this morning from Google that it is like asking a telephone company to screen the content of calls. Do you think that is a fair analogy against the idea of them having a team of people themselves actively reviewing the content to see if there is anything there which could potentially be harmful?

  Dr Byron: I have written a bit about the e-Commerce Directive in this report. I do not know if it would be helpful to talk about it here. It took me an awful long time to understand it. I hope you understand it when you read what I have written. For me the e-Commerce Directive is a very interesting point here, because I think it is something that companies may use in these discussions. There are two ways of understanding how it works. As far as the e-Commerce Directive goes, ISPs are literally the pipes that the content gets piped down, so the Directive describes their role as having a "mere conduit" nature: they are merely the conduit for content, and as a conduit there is no expectation that they will open the packages of content as they go down the pipe and make a decision about whether each is appropriate for delivery. I suppose it is the same way that we do not expect the Royal Mail to open every letter and decide; that is the "mere conduit" argument. As far as the content hosts go, the issue is around liability: once they step into an area where they start to make decisions about what is appropriate or inappropriate (and again I am talking about the subjective bit, not the illegal bit, where it is pretty clear, but about deciding that we do not like this so it is going), they then set themselves up in terms of liability: if you did this bit, why did you not do that bit and that bit? So notice and takedown is a way that that can be managed as well. I think that businesses take risks as part of being businesses, and sometimes businesses might want to stick their head above the parapet and say, "Yes, there is the e-Commerce Directive but, do you know what, we do not like this, so we are proactively going to take that down." That is me speaking entirely personally. I think businesses take risks as part of business. As the e-Commerce Directive stands at the moment, that is how these things have come about. Notice and takedown in itself can be an effective system, certainly when it is built into a reputation management system, so that your community police force has people within it whom the community understands to be reliable informants about where content breaks acceptable use policies; that is very useful because their flags, when they say they do not like something, will be escalated up the system and looked at first, and a number of companies do that in a very reliable way. These are issues the Council really needs to think about, so if we do have a system which is notice and takedown, then all the questions that you asked earlier need to be answered in terms of time, there needs to be transparency, there needs to be accountability, and it needs to be evaluated.

  Q370  Paul Farrelly: In terms of the e-Commerce Directive, in the area of libel there is bite, because companies can be sued, irrespective of whether they are a pipeline or not, if they do not go through certain steps of giving notice and then taking down. The question for this Committee is whether the criminal law, or an almost mandatory approach adopted through your Council, should give bite in terms of liability for material that is hosted or posted on the Internet, so what would your recommendation be? You have said the issues need to be looked at, but what would your recommendation be?

  Dr Byron: You are going to tell me that I am sitting on the fence again here, Mr Farrelly—

  Q371  Paul Farrelly: No I am not.

  Dr Byron: Oh yes you are! My recommendation would be this: the e-Commerce Directive as it stands works, but it works in a way that I think industry should be encouraged not to hide behind. They should be able to think proactively about their own liability in relation to content. I think these are complex and difficult conversations. I am not recommending that the e-Commerce Directive be changed in itself, but it may be that, as these conversations continue, that needs to be thought about. I will be honest with you: in terms of the breadth of the remit of the review, this is not something that I have thought about in any depth, and I could not give you a specific recommendation at this stage.

  Q372  Paul Farrelly: I would not want to discourage witnesses coming to committees like this, whether by accusing them of fence-sitting or more serious things.

  Dr Byron: You are terrifying, you are!

  Adam Price: Speak to his wife!

  Q373  Paul Farrelly: Speak to my children! Now I have lost my whole train of thought. We have just heard a pretty hands-off, liberal model from Google. I can get almost any fact I want by using Google, but were you surprised that the previous witness could not actually answer the simple question of how many people might be involved at Google and YouTube in taking down flagged materials? Was that a surprise to you?

  Dr Byron: I am not going to make specific comments about other witnesses or what they have said to you, because that is outside my remit and I am here to answer questions about the review, so I say that to you with respect, Mr Farrelly. I think that we need to be careful that we do not become so polarised in the debate again that we lose the focus. I think there are difficulties. These are very new technologies. Who knew two years ago that social networking was even going to be a phenomenon? I am not defending mistakes that are made. I am not condemning them either. I am saying I think we need to move our thinking on in a way that is collaborative and proactive and fundamentally focuses on the needs of children and young people.

  Q374  Paul Farrelly: Generalising from that specific instance, so we are not naming and shaming any particular organisation, is that sort of model a bit too hands-off for you?

  Dr Byron: The notice and takedown model?

  Q375  Paul Farrelly: Not having anything proactive and having staff so overloaded that they miss gang rape videos, in the general case?

  Dr Byron: There will be ways of looking at things proactively in terms of scans, for example scanning for flesh tones, as was said earlier by the last witness, for things like pornographic content. The difficulty with that, and the way that, as I understand it, the technology is advancing, is that when you are looking for flesh tones in order to find sexual content you may get 15 million pictures of people in their bikinis on the beach; so the technology is advancing in terms of being very strategic about pictures but also about text. You made the point—and can I call you John because I sort of know you—you were talking about tagging and you were talking about sophisticated research around names. I think these are all things that need to be thought about, but it needs to be pushed up the agenda for companies, so that these are technologies they are really investing in, because child safety is a priority because we say it should be.

  Q376  Rosemary McKenna: Just very quickly, on a couple of specific issues: you reject an expansion of network-level blocking, such as that used by major ISPs to block access to sites identified by the Internet Watch Foundation, but the Internet Watch Foundation say that it is working effectively. Why do you not want it to be developed?

  Dr Byron: Maybe I need to be a bit clearer. I do not reject it in terms of illegal content, as we were discussing earlier. If content is identified and found to be illegal, then we need to think about how we tell the ISPs about that so they can take it down, but I reject the notion that ISPs should be looking for that content. We need to define, as we have with child abuse images, what is illegal and then notify them. Generally, in terms of harmful and inappropriate material, which again is a subjective decision, I am not recommending that the networks make that decision and then block it at that level.

  Q377  Rosemary McKenna: You also recommend that parental control software on new computers should not be automatically switched on but should be an option for parents?

  Dr Byron: I think it should be there, I think it should be in front of people, and I think it should be there when you switch on. What I am saying is that it should not be set at the highest level by default. I thought about this really, really carefully, because obviously if you are very concerned about child safety, it would be an obvious recommendation to say put the filtering software on and have it set at the highest level. But in the focus groups, and very much in the call for evidence, what parents were saying is that they were finding filtering software so restrictive that if it was set at the highest level they just switched it off, partly because families have lots of children of different ages; and once it is switched off they have not actually engaged with anything other than the thought of switching it off. For me, it is there, it is in front of you as you switch on, or your ISP provides it to you when you change to a new connection, and you are talked through setting it up. I talked about this notion of the "tipping point", which is what the writer Malcolm Gladwell talked about: if you put it in front of people and you enable them to engage with it enough, because they have to go through the process to set it up, you have moved them from a stage of precontemplation, which is not even thinking about the issue, to a state of contemplation and to a state of action, so you have changed people's behaviour while also providing the tools to enable them to keep their children safe.

  Rosemary McKenna: Thank you very much for a very, very commonsense approach to a very complex issue and it is very helpful to the Committee to have heard you this morning as well as having read the report.

  Q378  Chairman: I am afraid Mr Farrelly wishes to test your patience one last time!

  Dr Byron: Make it a nice one.

  Q379  Paul Farrelly: This is a very easy, nice question. I want to congratulate you on the report. I do not think I have ever seen a report that has been so broadly welcomed, apart from one rogue editorial in the Guardian that I read, and I have never seen a Government immediately come out and say, "We are going to implement all the recommendations." If they treated political manifestos like that, we would be in a different world. Flicking through this report, I have not seen a figure in it, and good intentions wither on the vine unless they are matched by money, so what figure do you have in mind for what your Council would cost if it is to be effective?

  Dr Byron: Can you help me answer this question really? I absolutely would not know. I have not thought about money. I am not a person who prepares budgets. I am a psychologist; I am not an accountant. What do you think, Mr Farrelly, should be the figure that I should be suggesting here?



 

© Parliamentary copyright 2008
Prepared 31 July 2008