UNCORRECTED TRANSCRIPT OF ORAL EVIDENCE
To be published as HC 353-i

HOUSE OF COMMONS
MINUTES OF EVIDENCE
TAKEN BEFORE THE CULTURE, MEDIA AND SPORT COMMITTEE
Harmful content on the Internet and in video games
Tuesday 26 February 2008

MR JOHN CARR, MR STEPHEN CARRICK-DAVIES and PROFESSOR SONIA LIVINGSTONE

MR MATT LAMBERT, MR PETER ROBBINS OBE QPM and MS HEATHER RABBATTS CBE

Evidence heard in Public: Questions 1 - 89
Oral Evidence
Taken before the Culture, Media and Sport Committee
on Tuesday 26 February 2008

Members present:
Mr John Whittingdale, in the Chair
Janet Anderson
Philip Davies
Paul Farrelly
Rosemary McKenna
Adam Price
Mr Adrian Sanders
________________

Memoranda submitted by Children's Charities Coalition on Internet Safety, Childnet International and Professor Sonia Livingstone
Examination of Witnesses
Witnesses: Mr John Carr, Executive Secretary, Children's Charities Coalition for Internet Safety, Mr Stephen Carrick-Davies, Chief Executive, Childnet International, and Professor Sonia Livingstone gave evidence.

Q1 Chairman: Good morning. This is the first session of the Committee's inquiry into harmful content on the Internet and in video games. Our inquiry is running roughly in parallel with, perhaps a little behind, that of Dr Tanya Byron; although we will be looking at similar ground, we will in due course be hearing from her as well. I would like to welcome to this, our first session, John Carr of the Children's Charities Coalition for Internet Safety; Professor Sonia Livingstone of the LSE; and Stephen Carrick-Davies of Childnet International. Perhaps as an opening question, I could ask you whether or not you think that the considerable media coverage of the dangers and negative aspects of online content and video games is being exaggerated, to the extent that we are overlooking the positive aspects that games and the Internet potentially offer young people.

Professor Livingstone: I do not think we are overlooking the benefits. I think there are many initiatives going on, many ways in which people are gaining access to the Internet, both at school and at home, precisely because they are very confident of the benefits of the Internet. But at the same time, there is a context of anxiety, partly led by media panics, partly led by some of the experiences that people are having with the Internet, which is leading perhaps to an overemphasis on the risks online by comparison with some of the other possible causes of danger or ill-effects in everyday life. So sometimes the media get blamed for all kinds of ills for which they are not solely responsible. But I think people are very clear that this is primarily a fantastically beneficial medium to which they want to give their children as free access as they possibly can.
Q2 Chairman: When you say media panics, do you think that the media are exaggerating? Is there a certain degree of hysteria creeping in?

Professor Livingstone: I think there is some hysteria, and sometimes that makes us over-focus on the risks that the media pose, rather than some of the other problems in children's lives, let us say, which are also part of the reasons for the troubles they face.

Mr Carr: We need to be clear: the media do not make it up. The cases that we hear about and read about are things that have happened in our criminal courts and have been reported on accurately, but the headlines do not always follow the line of the story. I would hate us to think that, in a sense, we have no need to worry about anything. The fact that the media occasionally overdo it with the odd headline should not blind us to the fact that there are genuine risks and perils out there.

Mr Carrick-Davies: Two comments on benefits. First, I would recognise that the vast majority of users still embrace the technology; we see that just from the point of view of take-up. The overwhelming experience of children and young people is broadly positive, creative and engaging; they can do so many things with this new technology. The educational argument is there: where technology has been properly embedded within learning environments, and that is an important caveat, we are seeing great improvements in attainment, to say nothing of the tremendous motivation that technology brings, and the opportunity for children to create their own learning environments. But clearly there are areas of risk, and there are very real areas of risk, and some of that is communicated through the case study, through the individual child that has been hurt, and that stirs up enormous emotions.
Parents can very easily be given the message that all is not well, and I think the challenge is that we have to contextualise the risks within the wider experience of children using the technology. We recognise that the vast majority of abuse takes place with children in contact with people that they know, not online. But the media, understandably, will draw out issues of risk, and as John said, they are not making it up. We know from our experience at Childnet that 22% of young people report having been cyber-bullied. There is no way you can argue with that: one in five children will say, through research, that their experience of peer abuse is very real. But I conclude by saying that most people accept that there is an element of risk, and that it is probably impossible fully to prevent children from being exposed to potential risk, just as it is in real life. So there needs to be a mature debate about how these risks are portrayed, and about what the real baseline and experience of children online is.

Q3 Chairman: Can I ask you specifically also about video games? We have focused and will continue to focus on the online threat, particularly from individuals who may use the ability to contact people through the Internet for harmful purposes, but games are a separate category. First of all, do you think there is a problem with games? Secondly, do you accept the argument from the industry that games too have benefits, educationally and developmentally?

Professor Livingstone: I think there is a question about the evidence for both of those claims. There is not terrifically good evidence that playing video games benefits children, in terms of educational or social benefits, and there is perhaps equally contested evidence that video games can harm children who play them.
So there is quite a body of evidence, particularly in the States, of the ways in which repeatedly playing violent video games may have harmful effects, particularly for some children, particularly for some boys; but there will continue to be debates, both about the possible benefits and about whether playing those games has harmful effects outside the situation of playing the game, as it were.

Mr Carr: This is a problem that we come up against repeatedly when dealing with these types of questions. The fact that there is not necessarily any concrete evidence of harm does not mean that there is not any harm. You would not get through any decent ethics committee of any academic institution a proposal to allow you, for example, to subject a group of children to a constant barrage of violent images or violent games, as against a control group of other children who were not subjected to those games, simply to see what the outcome was for both of those groups. Ethically, you simply could not do that. So you have to, to some degree or another, use some common sense when you think about these things, because you will never get absolutely cast-iron, empirically solid evidence, I do not think, to support the proposition either way. One of the aspects in relation to games, however, that is of concern, irrespective of the level of violence or pornography that the game itself might contain, is overuse. Certainly in some countries we have had cases reported of children dying of exhaustion at the console; not a very common phenomenon, thankfully, but nonetheless, it is there. This has led some governments, for example, to consider requiring or asking the games industry to try to make sure that all of the high-scoring parts of a game occur early on in the game.
So there is a disincentive: the law of diminishing returns sets in, so the longer the child stays on the game or on the console, the fewer scoring opportunities there are, or the fewer opportunities to get a new weapon, or whatever it might be that the game is about. So there are other concerns around games than simply the exposure to violent images or pornography; this issue of overuse and extended use, replacing a social life, getting out and playing football or whatever, is also part of the equation.

Mr Carrick-Davies: I would just add to that, and first of all say that I think it is excellent that the Committee is looking at this issue of gaming, because what we are seeing is this multi-user aspect: children are not just playing solitarily on their own, or with their brother or sister in a closed environment; they are often communicating with strangers, and of course there is a very obvious interest that adults and children may share in one game. So I think that is an important point to note. I echo Sonia's point that we need greater evidence of harm, and also of the educational benefits. The question was, what are the benefits, and we know that children derive enormous pleasure from playing games. It improves their confidence, their sense of social standing, their ability to multi-task, and their ability to receive conflicting pieces of information; it does strengthen the informal learning that takes place in this environment. But I think it is really important that the Committee reviews this, and that you talk to the industry about how they are standardising the advice for parents. Our experience at Childnet is that parents are very easily confused. It is not helpful to have two different ratings systems in this country at the moment; both have their strengths and weaknesses.
There needs to be greater standardisation, and there needs to be greater educational awareness about what games are actually about, so it is an important area.

Professor Livingstone: There does seem to be an issue, particularly within homes, of children playing games which are rated as much higher than their age. There seems to be a kind of casualness within the home, or a difficulty for parents in regulating this, which means that many children are playing games with a much higher age rating.

Mr Carr: Some parents seem to think that if a game is rated 18 and Little Johnny can handle an 18, that is proof that Little Johnny is Einstein. In fact, the 18 rating is about the content and nature of the game; it is not a skill level, and that could very often be made clearer.

Chairman: Perhaps on this question of harmful content, I will move to Rosemary McKenna.

Q4 Rosemary McKenna: Thank you very much. Just to comment, though, I think some parents have exactly the same attitude to film and video and DVD, so it does not only apply to the Internet. Can we move on to what could be potentially harmful content? In some cases, it is pretty straightforward to decide which content is harmful and what action should be taken, for example child abuse, but for other content it is much more difficult to decide what is appropriate. How and where should the boundaries be drawn for content where the level of harm is a subjective issue?

Mr Carrick-Davies: Could I kick off on this, colleagues? It is a very difficult question, is it not: where do you draw the line? The industry can very clearly say that if it is illegal, they will not host it or give access to it, but whether it is harmful, as you say, is a very subjective issue. There are three challenges here. First of all, children may be harmed by things on the Internet that are not necessarily perceived as traditionally harmful or illegal.
Secondly, in this Web 2.0 world, which is the phrase that describes user-generated content, there is a real blur between the consumer and the creator, and therefore it is very, very easy for children themselves to upload content or images which can be harmful. This is such a challenging issue. We often see children as passive victims, but actually, what our work shows is that this medium, although it is neutral, gives children an enormous ability to amplify inappropriate, harmful, spiteful, devastating comments. The third thing, and this is why it is so challenging to say where you draw the line, is that it is a global medium; you will appreciate that it is continually evolving, and that harm is subjective: even among children of the same age group, some may be affected by things which other children in that age group will not be. I think we need to recognise that there is this grey area; there is black and white, but there is also this grey, and how do we respond? I would argue that there are three huge players who need to be involved in this. Clearly industry needs to take responsibility, and I am sure in this debate we will talk about the opportunities of media literacy and the greater "better by design" opportunities that industry has; but there is obviously an issue of empowering, supporting and educating children, such an important part of the work of schools; and clearly there is the role and responsibility of parents: to empower them, to give them confidence to talk to their children, so that children are equipped, just as they are equipped when they walk down the High Street, to recognise harm, to respond to harm, and to overcome harm.

Mr Carr: Can I just say, very briefly, on this: if it is illegal, it is clear-cut, it should not be there. Certainly within the UK, in relation to child abuse images, for example, the industry has done an absolutely fantastic job in screening that out.
If it is not illegal, then there is no basis on which Government or police or anybody can ask the industry to intervene, because the industry are not moral arbiters; they are not priests; they have no locus, any more than you or I have, to say to a particular family or a particular child, "This is good for you or bad for you". What happens in reality is that within a school, for example, the headteacher sets the policy, or the individual subject teachers do, so they collectively decide what is acceptable in terms of Internet content that children can access or use within that school. I am afraid it is, and ought to be, the same within the home. What is acceptable for liberal lefties living in Hampstead will be very different from -- I am going to be very careful what I say next -- what is acceptable for other types of people, shall we say. There is no way that you can legislate from the centre, or that an Internet service provider can legislate from the centre, and get that right. There are software packages and programs around, filtering packages and so on, that each family can use to match their own particular social, moral, religious and cultural values, and I think, sadly, it is as complicated as that.

Professor Livingstone: I would start from a different starting point. I do not know what it is possible for you to say to the industry, or what possible regulations might be put in place, but if I think of research on parents and on children, their starting point is often the environment of television, which has been a very carefully regulated environment.
So with television, parents and children have a very clear understanding of what the norms are, what the expectations are, what can and cannot be said, at what point they might complain and to whom, and how they manage what children of different ages should have access to; it is an environment in which we can say that there is a good relationship between regulation, self-regulation and media literacy or parental management. When it comes to the Internet, the content that I think really worries people, and that children say can upset them -- and we have figures of 10-15% of children and young people saying they have been upset or distressed by something they have seen online -- is sometimes about access to extreme content of various kinds; but very often the point is that the access was unexpected, accidental, without any framework of mediation and without any warning, and the content is not, as it were, justified within the frame of the material itself. While I take John's point that in the home there are many tools available for filtering and monitoring and so forth that parents can implement, the research shows the struggles that parents have in implementing them, and we have to note the inequalities: it is very often the more middle-class parents who are more equipped, able and motivated to implement some of those facilities. The result is that many children are continuing to come up accidentally against upsetting, inappropriate and, on occasion, harmful content.

Q5 Rosemary McKenna: To what extent is user-generated content that encourages violence harmful?

Professor Livingstone: We do not have good evidence on that specific question, but what we can say is that there is evidence that material which, let us say, is outside one's expectations, outside one's norms of what you expect to see, is often shocking.
It can be part of setting norms about what is acceptable behaviour, so if you routinely see certain kinds of content, let us say cyber-bullying on YouTube, it might set a different norm of what is acceptable in a different situation, in the playground. There is something to be said, and a lesson from television, about the impact of repeated images: on television, we did not only worry about and try to keep off certain kinds of extreme violence; we have also worried about the repetitive effect of violent images seen over and over, and I think on the Internet we have not begun to think about that at all. We have only really thought about the extreme content.

Mr Carrick-Davies: If I may just add another comment on that, it is very important to reflect on the differences between the harm children can exert against each other in the offline world and that which is online, because of the 24/7 nature of this: it is hard for a child to escape from the bully; they cannot have the sanctuary of the home. There is also a lack of closure: if something has been put up on a social networking site, or has been circulated to vast numbers, it is very challenging for the child ever to believe it has been removed; it will always be up there in space. The other classic point, which we do need to watch very, very carefully, and schools tell us this, is the bystander effect: children will actually see, as Sonia alluded to, that this becomes the norm; they then get involved in it and see it as a prank, or a joke, or just a bit of fun, rather than seeing the devastating pain that it causes. The Government, through the DCSF, has done some very good work on giving advice to schools on cyber-bullying, and one of the messages is that those bystanders, if they laugh at it, are very much a part of it. So I think we cannot see the issue of harmful content without the context of the peer group.
Q6 Rosemary McKenna: On these social networking sites, to what extent does content that displays personal information lead to harm? For example, I have heard of young families who refuse to put their children's photographs on social networking sites. Is that a reasonable thing to do?

Mr Carrick-Davies: That would depend, as John said earlier, on the morals, norms and values of the family. I think there is a real danger that children do not appreciate that a photograph they put up there can be online forever. One of the messages that we give is: would you want your future employer to see that photograph that was taken at a party in a compromising position? One of the challenges --

Q7 Rosemary McKenna: I do not mean that; I mean just normal families exchanging photographs across the social networking sites. I have heard of people who are concerned about doing that, because of what could happen, the abuse of that photograph.

Professor Livingstone: I think the issue there is about the kind of privacy controls that are provided on those sites. So if we imagine your family wanting to put up family photographs, they can set them to private and control who gets access to those photographs.

Q8 Rosemary McKenna: Do you think that is robust enough?

Professor Livingstone: I think very often people are confused about how those controls work; they are not very subtle, so pretty much you either give people access or you do not. There is some evidence that people do not understand, or do not take very seriously, what those decisions are, partly because they are not aware of possible abuses of that information. So I think more can be done there, both in making people aware of the possible risks and in making those controls more tailored to the particular needs of those who use them.
Mr Carr: The point you raise goes a lot wider than, if you like, the traditional worries -- traditional is perhaps not the right word -- around paedophiles getting access to information and tracking down a child. I was horrified a few weeks ago when I read, for example, of the admissions tutor of one of our older universities acknowledging that he had been trawling some of these social networking sites, looking for information about people who had applied to his college. By the way, he was subsequently rapped over the knuckles, very publicly, I am happy to say. But there is an anxiety around this, certainly in the children's charities field, about the fact that colleges, or more pointedly, perhaps, employers, might be looking at these sorts of sites, seeing some 17-year old doing something that 17-year old boys or girls do, and that being seen by a potential employer or a potential university admissions tutor and weighed in the balance when they are deciding whether or not to offer the person a job, an interview or a place at university. I think that is shocking, I think it is outrageous; it is not illegal, but I hope it soon will be, and certainly we intend to take this up as a campaigning point. We do tell children, and advise their parents and so on, not to put things of that kind on the Internet, but they do; that is what kids do. They should not, however, suffer a potentially lifelong penalty -- not getting into their preferred university to do their preferred course, not getting the job or the job interview -- because people use that information in a different way from that for which it was intended. It is certainly something that you may be hearing from us on in the near future.

Q9 Chairman: How do you make it illegal?

Mr Carr: In the same way that we have anti-discrimination laws about what you may or may not take into account when you are offering somebody a job.
The point about these pictures and these sorts of things that you sometimes see on the websites is that you can never possibly have the full context in which the particular photograph --

Q10 Mr Sanders: How many cases are you aware of, of people who have not got university places or have not got jobs as a result of this?

Mr Carr: There was the case in the newspapers that was reported, I think, three or four weeks ago, and anecdotally we have had parents and others -- there is no hard evidence.

Q11 Mr Sanders: So there is one case in the paper.

Mr Carr: Yes, a few weeks ago.

Q12 Mr Sanders: Was it proven beyond all reasonable doubt, that they --

Mr Carr: He had acknowledged that he had made a mistake, and he promised not to do it again.

Q13 Mr Sanders: The employer?

Mr Carr: The admissions tutor. I will send you the press cutting.

Q14 Mr Sanders: So we have one case. What we are discussing here is real harm to children, not one case of somebody who might not have got into a university because of something --

Mr Carr: That would be a real harm to that child.

Q15 Mr Sanders: Maybe it would have been real harm, but what we are concerned about is the much more general issue. We seem to be going down what I think are side roads, predicting what might happen when there is no real hard evidence for it at all.

Mr Carr: Well, there was in that one case, and there are certainly anxieties and fears that it is happening on a wider scale, and it should not be, that is all. I do not want to labour the point.

Q16 Mr Sanders: But you could not enforce the rule anyway, could you?

Mr Carr: No, but what you could do, and I would expect responsible employers to say this to their personnel staff and their recruiters, and responsible universities to say to their admissions tutors, is "Do not do it, it is not allowed".

Q17 Mr Sanders: But you cannot enforce it.

Mr Carr: You can say that about almost any law, really.
Q18 Mr Sanders: I am sorry, but --

Mr Carr: It is all about getting the evidence, is it not? I agree, it is not a major issue; we ought not to be detained by it. I apologise if I have introduced a red herring.

Q19 Adam Price: Can I take us down a different side route or path? Professor Livingstone, you made the analogy with the kind of regulation and self-regulation that we have, for instance, in the broadcasting world, or even in the press, and indeed Hugo Swire has recently made a proposal for creating some kind of Internet standards body. Do you see any merit in creating some kind of umbrella organisation which would regulate or self-regulate content on the Internet?

Professor Livingstone: Well, this is a huge issue. Perhaps I can start somewhere slightly different, in that it is my understanding that all kinds of self-regulation of online content go on in various ways already. Internet service providers and content providers respond to complaints from individual consumers: they take down upsetting user-generated content from YouTube if somebody makes a complaint to that end; they are aware of content that might damage brands or contravene copyright. So I would not make the starting point that online content is not being regulated and ask whether we should have a body to do so. I would rather say that there is all kinds of activity going on that already does some kind of regulation. Is the public aware of it? Do people know that they could complain? They do not read their terms and conditions, they do not read their privacy policies; we know that. Is there some way in which that process could be made more co-ordinated, more accountable to the public, more transparent? Would people have a right of appeal if they disagreed with a decision made about content? I would think about those kinds of principles and say that more could be done than is currently being done.
Whether that is best wrapped up as one new body, or whether this is a way of extending Ofcom's remit, that is not my expertise. What I can see is that the public understands very well how all kinds of traditional content are regulated: they understand how television is regulated differently from the press; they understand that even the advertising that appears on their child's bus stop is regulated. They are simply bemused and upset that they cannot see, and do not understand, that anything like this happens online; and at the same time, to go back to the benefits, they have very clearly grasped the message that they must and should get their child online and give them broadband access, to help with their education and so forth.

Mr Carrick-Davies: Could I echo just one of those points? I think it is right that the Internet is not some moral vacuum; there are very good existing laws, from the point of view of advertising, for example, that some of the providers who work in the online and offline space need to adhere to. At Childnet, we would be very cautious about imposing stringent new regulation in this space, because of the unintended consequences: the fact that you can restrict freedom of expression, which is such a valuable plus of the Internet, giving children an enormous voice and an enormous opportunity to communicate; to say nothing of the increasing costs of paying for this monitoring, and the fact that less scrupulous providers may come into this space. It is a global medium, but that does not mean that we just say, okay, we have to cope with this laissez-faire. There are very important things we can be doing in terms of identifying what is working, drawing out best practice, and monitoring the way that best practice is actually being embedded and rolled out.
There is a real duty on governments to keep the statute book up to date and make sure that there are not gaps; we often need to play catch-up. There is such a need to develop international co-ordination of this. This is a real leadership issue, ultimately, for the G8 and other international bodies, including the Internet Governance Forum, which we should recognise is now really going mainstream and looking at issues of access, security, openness and diversity. So my wish for this Committee would be to identify what is working well, and to make sure that we really prioritise the strengthening of existing multi-stakeholder initiatives, so that we can ensure that there is a good safety net in this environment.

Q20 Mr Sanders: There is an awful lot of talk and speculation about potential risks of online content, but the actual risks do not seem to be very well researched or proven. Are people getting overly concerned about risks that are only anecdotal at this stage?

Mr Carrick-Davies: I know Sonia will have lots to say about this, because there are real challenges about getting evidence that we just need to rehearse and remind ourselves of. First of all, this is such a new medium that it is very hard to say whether a problem has grown. If you ask me whether cyber-bullying has grown: ten years ago, children were not actually texting, so there is no evidence of inappropriate texting at that stage. Fundamentally, it is a new environment, so we are in new territory and it is difficult. Secondly, there are real problems of ethics: you are working with children and young people; do you expose children to harm and then compare that with another group of children? It is very difficult to look at how harm manifests itself in a child's life, to say nothing of the fact that every child is different. The third point that is really important to recognise is that children have real difficulties in disclosing harm, for very, very good reasons.
Let us just think about the fact that we are parents here in this room. If my child says they have come across inappropriate content, are they going to be scared that Dad is going to go (indicates) round the back of the neck, not that I would ever do that? The fear of being told off, the fear of getting somebody else into trouble, the fear that this wonderful, exciting, brilliant technology that is part of their life is going to be taken away from them: these are very real challenges. So I am not wanting a cop-out, saying that just because evidence is hard we do not go down that route; we have to base our rationale and our decisions on sound evidence. It is coming together slowly, but John's point stands: the absence of evidence of harm does not necessarily prove that children are not being harmed. So with that caveat, that there are real challenges and difficulties in looking at the evidence, we need to ask: what is the evidence? Sonia's work on the UK Children Go Online report, which carried out studies with children and parents, comparing their experiences back in 2004 and 2005, and which John and Childnet were supporting, showed very real evidence that parents underestimate children's negative experiences online, for some of the reasons I have just mentioned about disclosure.

Q21 Mr Sanders: Is not one of the answers to this actually quite simple? You have a major role to play in it, which is just to keep banging home the message, time and time again, not to allow children to have their access away from other people in the household; in other words, not to have a monitor and a computer in a bedroom, but to have it somewhere accessible, where other members of the family are passing by. Would that not do more to keep children out of harm than any other measure you could take? Yet neither you nor the industry seem willing to go down that route of simple advice to parents.
Professor Livingstone: I think as a simple message to parents, it is worth reiterating clearly, but one of the things that research also shows is many parents do put the computer in a living room or in a public place, and children still find a way of using it when no one is looking over their shoulder. I think we have to be realistic about the situation, particularly for young teenagers -- I think if we are talking about 9-year olds, there is less of an issue in terms of parents knowing what their children are doing, but if we are talking about young teenagers, what parents are trying to do then is precisely to give children the kind of independence to explore and to make some of their own decisions, and learn to cope with some of the risks that they encounter, that means that they cannot as parents be always looking over their shoulder, looking at their MySpace, checking their emails and so on, and much of what goes on on that screen, which is very fast and furious from a parent's point of view, is quite hard to understand, as you look over their shoulder. So I think children are motivated necessarily to evade some of that kind of scrutiny, and parents are not particularly well equipped in many homes to understand what is going on on the screen, even if they do casually see it. I would rather say that a crucial message or a crucial thing for us to think about is the way in which parents need to encourage their children to cope with some of the material they encounter, but what we do not want is an environment in which those risks are radically extreme and completely outside anyone's ability to cope. So of course children, let us say, a 13-year old, one might expect to see certain kinds of images of nudity or pornography, but perhaps not some of the violent pornography that would be more than one could expect a 13-year old to cope with. 
The question I would say is rather how do we empower parents to encourage their children to explore so that they become resilient and able to cope with certain kinds of risks, many kinds of risks, without coming across the kinds of shocking content that some children are seeing and a minority are saying is upsetting. Q22 Mr Sanders: But would it not be a start to actually focus a major campaign saying that, because it seems to me that is not a message that is coming across from anybody, and yet it could do so much good in terms of reducing the risk. Mr Carr: You would have to take it at least one step further and remind the parents to keep the screen facing outwards, because if the screen is facing in towards the wall, as soon as the parent walks into the kitchen or the living room or whatever, one click of the mouse and the whole screen changes, whereas if the screen is facing out into the room, the parent will see what is on the screen the minute they walk in the room. Q23 Mr Sanders: Come on, most people would have it against the wall because that is where the electrical socket is. Mr Carr: No, just the screen. It is just the very practical point about the way the screen is facing, because if it is facing towards the wall -- the screen, nothing to do with where the socket is -- then it is pretty much useless, because the child will be able to minimise the screen and their homework will be on it by the time you get round. Q24 Mr Sanders: But you could minimise the screen anyway. Mr Carr: I am just saying, it is slightly easier -- Q25 Mr Sanders: Are you saying it would not help? Mr Carr: No, I agree. Q26 Mr Sanders: So can we stop the sophistry and -- Professor Livingstone: It has been said. Q27 Mr Sanders: Let us actually have a campaign, let us see you guys leading it. Mr Carr: With respect, it has been said from the very beginning. 
First of all, over 20% of children now have Internet access on computers in their bedroom anyway, and with the growth in handheld devices and mobile phones which have Internet access, the salience of that point is less than it used to be. There is a second point -- I mean, nobody disagrees with it, but it is just increasingly less relevant, because there are so many other ways in which youngsters can get access other than through a standard PC in their home, e.g. a handheld device, a mobile, a laptop. Schools are encouraging children more and more to have these handheld devices and laptops, so the physical location is not so easy to solve as it used to be. In addition, just bear in mind, if you have three or four children, and there are families with even more, and they all have to do their homework, every evening of the week, we hope, all good children are sitting down to do their homework between, say, six and eight o'clock or whatever, and you have three or four children, each wanting to get on the PC at the same time, what do you do then? Lots of families now have two or three computers for their children to use to do their homework. Do you have two or three computers in your living room? It means you do not have a living room. Q28 Mr Sanders: I do accept all these points, every one of them stands up to scrutiny, but all we are hearing is reasons why things cannot be done. What we need to hear is what can be done. One of the things that could be done is that advice; there are other things that we do not know that need to be done, and that is what we want to hear, so that we can recommend, in order that we can protect people from harm. Mr Carr: I think it is a lot to do with the age of the child; they have a right to privacy, and they have a right, as they get a bit older -- certainly for very young children, I agree with you 100%. I think it is a matter of judgment for each family to decide when the child has some right to start to explore on their own.
Mr Carrick-Davies: Just one final quick comment on this, because I know we are going to come on to a session on media literacy, and what is needed is a whole range of educational awareness, not merely for parents, but for children and young people; but I do appreciate your question, and I do think that sometimes it is the obvious question that you can ask a parent to start that dialogue. I do not think children just want a set of rules, because the whole point about children growing up is often breaking those rules and pushing the boundaries. The point John and Sonia made very eloquently is that this is all moving to a mobile platform, and I know that you will be talking to the mobile operators soon. We have a narrow window of opportunity, if you like, while children are going to be using these big grey boxes sitting on desks. In another five years' time, we will not even see that sort of stuff, we will not even see laptops that big; it will all be personal, portable and private, three Ps to always remember about this space. But my point is that parents can start thinking about the location. John, Sonia and I -- if you have heard the news reports over the last 10 years; I have been in this environment for 10 years -- have been banging on about location, especially for webcam-based computers in children's bedrooms: absolutely not appropriate, get it out of the bedroom, you have heard us say that, but it is counter-intuitive, because the very nature of the technology is a private medium. You know how hard it is to even get your children to sit down and watch television together. Let us have a reality check: this is a private medium, and therefore children will want to do it on their own. So that question that you ask is absolutely brilliant, but it needs to be the start of a process, not just simply the rule or the advice. Q29 Rosemary McKenna: Surely children must be exposed though to some risk.
I mean, I actually think there is a concern about children not being exposed to risk by being allowed to be out and play the way children did in the past, so what is an appropriate level of risk for children on the Internet? Professor Livingstone: Well, the psychologist's answer is anything that they are not yet mature enough to cope with and not yet able to learn from and so encompass. That means that we are dealing with a content environment to which people respond in very, very variable ways, partly depending on age, but also depending on vulnerability. One of the difficult issues is that even if you look at children of the same age, some will laugh off something they have seen, and others will be upset by it, and the discriminating factor probably lies in the rest of their lives, in their relationships with their family and the other difficulties that they face in their daily lives. So response to the same content is very variable, and worryingly, it is the children who are more vulnerable to online content who probably have fewer social supports in the rest of their lives, and precisely do not have the parents or other adults that they can discuss these things with. So there is a kind of compounding of vulnerabilities, if you like, for a minority. Q30 Rosemary McKenna: That illustrates the difficulty with this whole inquiry: how do you manage that, how do we come up with the report which says yes, there are ways of dealing with this whole issue? Mr Carr: I think you often hear, or I have often heard, parents say things like, "Oh, Little Billy is not out playing on the street with the bad lads any more; it is wonderful now we have got the computer with the Internet, he is up in his bedroom doing his homework all the time".
Part of this is about getting across to parents that when you bring the Internet into your home, you are bringing the rest of the world into your home, and getting these messages across to parents and helping parents understand how they can support their children and guide them. Rosemary McKenna: We also want to be able to protect children from harmful content; that is the bottom line. Q31 Philip Davies: The bit I do not understand about this is that some people will watch certain things and it will make no difference to them whatsoever, whereas some people will watch some violent thing and they might replicate it somewhere. The bit I do not understand is therefore: it is not the content that is the problem, it is the individual, or it is something within that individual that is the issue, not the content in itself. So how do you get to the stage where you regulate content when, for the vast majority of people, perhaps even 99% of people, that content is not doing them any harm whatsoever? Mr Carr: Well, this is at the heart of all of these things, everything hinges on the individual, which is why I was saying in response to the last question that a lot of what I think is the major public policy challenge is helping parents to understand what the nature of these issues is, so that they in turn can help their child deal with it, because every child is different. You cannot just say it will be like this and it is the same for everybody, because it plainly is not, and parents are going to be best placed to make judgments about these things for their own individual children. You are absolutely right, there is not a simple line that you can draw and say, "Yes, no, good, bad"; it is complicated.
Professor Livingstone: Between the content and the individual, there is, you know, something we might call cultural or community norms or expectations of how to respond to certain kinds of content, so if one has a society in which certain kinds of, let us say, violent representations are routinely laughed off and understood to be in the world of fantasy rather than reality, for most people, that is not going to be harmful. If one has a set of content to which we have, as yet, no articulated response, people do not know how to respond, it is often unexpected or without a framework for interpretation, then there are different kinds of risks, and one of the difficult things about the Internet is that all kinds of diverse content -- you know, we could not just worry about violence, we might worry about websites that encourage anorexia, for example, we might worry about race hate sites, there are all kinds of content there that children might and on occasion do encounter before they gain the kind of cultural interpretation that says, "This is funny, this is fantasy, this is problematic but here are your resources for thinking about it, this is upsetting because it is unexpected, we try to avoid you finding that". So partly it is the very diversity of what is available on the Internet that is the challenge, by comparison with, let us say, what children might find in some other media. Q32 Philip Davies: It is all subjective, is it not? What you might consider to be harmful, I might consider to be funny. If you were to ask 11 people on the Committee what they thought was appropriate and inappropriate, we would probably have 11 different answers. So the point is, which comes partly back to Adrian's point, whatever you do, you cannot substitute parental -- it is up to each parent to decide surely what they think is right for their child, what their values are that they want to bring their children up with; you cannot get away from that base, can you? 
Mr Carr: That is absolutely true, but what we know from Sonia's research and other research is that parents feel very much at sea when trying to understand how to do the right thing for their child, and how to set the right framework of Internet use or whatever in their home for their children, and it is one of the reasons why we, for example, within the Children's Charities have advocated that safety software should be preinstalled on every new computer that is sold into the domestic market, and set to a high level of security, so that there is at least something there when the parent turns on the computer for the first time that will screen out certain types of image. For many families, those basic settings that would be in place on day one will perhaps not be appropriate, and they would have the opportunity to change them. What I think is wrong about the situation we have at the moment is nothing is there, and so in a sense, when the manufacturers sell the computers into the domestic market, they are leaving it to chance, in a way, that the parent will find out that there are issues, act on it, act wisely, and be able to transmit to their children those sorts of values and issues. So I think something being there at the beginning would help a lot of parents, and actually, apart from anything else, it would be a major learning opportunity for them if there was a piece of software or some safety stuff there when they turned on the computer for the first time. Professor Livingstone: Can I just follow that up? I am trying to imagine the parent willing to implement exactly the kind of domestic regulation that you suggest, but I ask myself, how does the parent know that, for example, cyber-bullying images or beheadings or pornographic images are there on YouTube? 
Does the parent feel empowered to say to the child, "You must not use YouTube", which is in a sense to kind of ostracise him from many jokes that go around in the playground, so how do they even know it is there? What mechanisms are then available to them to prevent their child getting access to that kind of content rather than all the other benefits that are there? This is not a medium in which parents have subtle skills, or indeed in which there are subtle resources provided to say, "Yes, you can have this content; no, you cannot have that content". Then, if I put that into the context of a young adolescent who is already not telling their parents that they are trying smoking and not coming home when they are meant to at night, fearful that if they do tell their parents, the computer will be taken away, I struggle to see that in many households there will be a good basis of both knowledge and trust that will allow that kind of reliance on parents to work as we would wish. Mr Carrick-Davies: One final reflection on this, if I may. You have to ask the question: where is the voice of the parent heard? We started the discussion about the media, and often the media will be very good at illustrating parents' concern at a point of tragedy but, just picking up on what Sonia said, I do not think that parents do understand the services that their children are using, because I do not think the companies that are providing those services, which are broadly attractive for children and young people, are very good at actually explaining or illustrating or making their terms of service accessible, child-friendly, simple and understandable at the point of start-up and induction. A parent would have a choice if they knew that most of the content on YouTube would be for a certain audience, or that you have to be a certain age to actually set up a channel, or that social networking services would usually be targeted at a child of a certain age.
So I do think it is incumbent on the industry to be more proactive in making good-quality terms of reference for their services more accessible and more easily available. We have done this with the mobile industry. The mobile industry have done a great deal of work in this country in identifying a code of practice which gives parents greater choices. The problem is that parents do not know that there is internet filtering at the point of purchasing a phone, or that a phone could be set as a child's phone at the start of the contract. I think that is part of the education case. I am sorry if I am adding another layer of complexity, but I do hope you appreciate that some of us have been grappling with this issue. Oh, that it would be so simple to have a silver bullet, a single rule, but what we have to find out, and the task for this Committee, is looking at the granularity of these issues and thinking what we can do quickly that will make maximum impact. Chairman: We are getting a little short of time, so can I ask that not everybody answer every single question? Q33 Paul Farrelly: I just wanted to come back to regulation, which Adam first touched on. There appears to be a consensus, particularly with the internet with its worldwide scope, that self-regulation is the best possible approach. I just want to explore the extent to which you think it is working and whether you agree with that. Clearly, in areas of what you can say, there is a growing body of libel law developing with respect to the internet that would not allow Google and YouTube to say, for example, "Nothing to do with us, guv," which was their reaction to the gang videos that were on when the young lad was killed in Liverpool. The Internet Watch Foundation lists three areas where people have to watch out because the legal sanctions are already there under the criminal law: child abuse; criminally obscene content, however you define that; and incitement to racial hatred.
Do you think that list should be made more explicit and clearer through legislation or regulation, for instance, in respect of the whole issue of incitement to violence, be it cyber-bullying, the misnamed "happy slapping" or some things more extreme? I would just like to hear your views on that area. Mr Carr: Add to the list? I think the police have said, in relation to the specific point you have made, that they feel there are limitations on their powers to act, for example, in relation to incitement to violence or incitement to suicide and things of that kind, so to that extent -- and I do not know if you are going to be calling anybody from the police -- we would be broadly sympathetic to that point of view. As for extending the categories, journalists ring you up every other week saying, "Should we ban suicide websites? Should we ban anorexia websites?" It is a matter for Parliament in the end to decide what the laws of our country are. My point at the beginning was that, in the absence of a particular topic being illegal, it is very difficult to see how you could ask the industry to make a decision off its own bat. If Parliament thinks something should be illegal and makes a new law, the industry will respond. In the absence of a clear law, however, I think it is always going to be a bit messy. Professor Livingstone: Can I disagree with that? Just to go back to the point I made earlier, my understanding is that the industry is already making all kinds of decisions about when to permit a certain kind of website to continue or to change the content. All kinds of content regulation are going on under the banner of self-regulation.
My preference would not be to, as it were, restrict freedom of speech and say more things become illegal but to have a more clear and transparent and coherent code whereby the kind of speech is managed, not necessarily to ban it entirely or to make it illegal but to make it, say, so difficult that the casual surfer does not come upon it. That is my understanding of precisely what we do in the broadcasting code on television. We do not say things cannot be said; we say they should not be something that you come upon without warning, accidentally, without realising what you are getting into. We could go a lot further down that route, I think, before deciding certain content should be illegal. Q34 Paul Farrelly: If I can pursue that, if you then make complaints, a member of the public or a member of your organisations, under self-regulation, unless the police intervene, it is up to the company to decide whether to remove that content. You mentioned the avenue or potential avenue of appeal, which there is not now. How would you grapple with that issue in practical terms? Professor Livingstone: Could we have an independent body that oversees and reports on the effectiveness and the accountability of self-regulatory codes? Q35 Paul Farrelly: What is your answer? Professor Livingstone: I would say I think it is an idea worth pursuing. I would like to know if this has been attempted in other countries or in other areas of regulation. I am not a regulation expert. I know primarily about what children do in their house with the internet but it seems to me that there are principles of regulation which perhaps should very clearly be applied, especially when something as important as freedom of speech is at stake. 
Q36 Paul Farrelly: You mentioned that there are decisions being taken all the time by the companies as to what to put on and what not to put on, so it is not a case of "Not us, guv", but there are two ways to ratchet this up: they can employ banks of people themselves, which might affect their business models and their profitability. Professor Livingstone: Banks of people are already employed. That is my understanding. Q37 Paul Farrelly: If you want to increase the banks, or you can have banks of people like yourselves, being Mary Whitehouses of the Viewers' and Listeners' Association, making complaints regularly on a coordinated basis, but then how do you measure the response of the industry to see whether they are actually taking much note of what you are saying and what the complaints are? Mr Carrick-Davies: I really want to look at this issue in some detail because I think it is very good that the industry have said that they have banks of people looking at moderated content and that they will take things down, but anecdotally we hear otherwise, especially from schools -- and schools are places which often provide a filtered and protected environment. Here is a statistic from the South West Grid for Learning, which covers about 2,500 schools across the whole of the south-west region. It did some research and asked those schools that had complained to social networking providers about inappropriate, harmful and offensive content how many of them had had the content taken down. 41% of schools were unsatisfied with the response that they had had from social networking providers about their complaints -- complaints that were well done, put together properly through the mechanisms that were there for them. So I do believe that these companies need to be seen to be doing more to respond promptly.
I am delighted to say that the social networking providers are working under the auspices of the Home Office task force on guidance looking at best practice which will cover the issues about better reporting, better education, better design and better removal of content. Let us see what happens in that space but it is a very important point that we need to monitor and see how effective that self-regulation is. Q38 Paul Farrelly: If I can ask you what more things do you think ISPs and content providers and also mobile phone companies can do, what one, two or three things can they do better, in the opinion of each of you? Professor Livingstone: Provision of clear information to parents and children that people can understand. That is number one, the positioning of that kind of information at the point where it matters. I take John's point about what happens when you first buy your computer and you get the information but you also need it on the sites, on the social networking sites, a reminder when you are using Instant Messaging: "Are you sure you want to send that photograph?" I think there is a lot of just-in-time, at-the-right-moment information that could be and should be provided. Mr Carrick-Davies: And much greater choice. Why do we not see the market providing a phone with limited functionality for a parent who says "I want to see the benefits of a child using a phone for safety reasons but I don't want the camera phone or the internet enabled at this stage for a nine- or a ten-year-old child"? Why do we not have greater granularity of choice? You can say that the industry is finding it very difficult to advertise or market programmes for children because of the Stewart Report but we need to get beyond that. 
We need to say that parents will actually prefer choice, and if companies go upstream in terms of the service they provide, offering better filtered products, I believe parents would actually migrate to those services. At the moment there is not the choice, I would argue. Mr Carr: I think the pre-installation is a key thing. I would also say -- this is very much more prosaic -- a lot more stuff on bits of paper. Telling a parent "If you want to find out how to keep your child safe, go to the following website, click through and do all of that" is not always the best way, whereas if you have simple messages, well presented on bits of paper, that is an environment that they are much more familiar with and much more comfortable with. Q39 Paul Farrelly: I would like to ask one final question on a recent case. Recently a young girl was convicted of aiding and abetting by taking pictures on a mobile phone of an assault that led to a death. This whole area is a very tricky one because it covers what you can show as a television company and how far you are involved. It is a very tricky area, but I am sure many of the content providers will now be looking at the risks that they run and trying to check where they stand in law if, in these instances, custom and practice is that these people do this in order to upload it to sites so that everybody can see it. In this sort of instance, that practical instance, would you prefer a change in regulation or the criminal law to make the risks clearer, so that if they show those scenes when they are uploaded, they could also potentially be charged with aiding and abetting an offence? Mr Carr: I think that the argument is running now. I am very sympathetic to it.
I take what Sonia was saying earlier about the way in which informal mechanisms of content management and editorial decisions are happening all the time within the industry, but when you are talking about categories of incidents of the type you have just mentioned, I think then clarity on the part of the law is essential. Professor Livingstone: I am unclear whether your purpose is to try to find a mechanism of ensuring that crimes are resolved better or whether there is an additional risk posed by the presentation of certain kinds of material. Q40 Paul Farrelly: You are evading the question. The question is quite clear. Professor Livingstone: Then I may need to ask you to repeat it. I did not grasp it. Mr Carrick-Davies: I would like to hear the question again, please. I do not think it is unreasonable, just to be clear. Q41 Paul Farrelly: If the purpose, by custom and practice, of taking mobile phone pictures of an offence such as what has been called "happy slapping" is actually to upload them on to places like YouTube, where people can watch them, then, in that instance, if content providers like that continue to show that sort of content -- like the person taking the photograph, because of the purpose of the whole activity as demonstrated by what happens -- should the law be clarified so that, if they continue to show it, they potentially run the very clear risk of being convicted of aiding and abetting an offence? Mr Carrick-Davies: Mr Farrelly, that is a straight question. My straight answer is "yes", but it is a challenge because we need to rehearse the arguments carefully. With respect, none of us is a legal professional. The issue of media law is incredibly complex. You have to think about the rights of the vendor as somebody who provides this content. Do they have content status or vendor liability?
These are difficult things to work out but that is a very good question for this Committee to look at. We have precedent now where a court case has been held and aiding and abetting has been proven. That could be one of the very simple actions that will actually stop a lot of this content being put together. Q42 Paul Farrelly: Was that clear enough for you, Professor Livingstone? Professor Livingstone: Yes, it was. I think, with the caveat that I am not an expert in media law, I would probably say yes, except that I am just worried about the balance between, as it were, criminalising the user who is uploading rather than the person who is providing the site. If I think about the different area of music copyright, we have effectively criminalised half our young people in this country because they are using software and facilities which have been made available by the market. I do not know that that is a constructive way to go. I think we are teaching a lot of children to ignore the law. Q43 Adam Price: I think one of you said there is no silver bullet but the Australians think they may have found it in their so-called clean feed system, which will block all access to certain categories of extremely harmful material. The Howard government promised to do it seven years ago; they did not succeed. It became a big issue in the Australian election campaign and the Labour government are going to introduce the system. Should we be doing that here as well? Mr Carr: I am very happy to tell you that the Australians copied it from us. BT - bless them; they did not have to do it - developed this thing called clean feed. I think in your next session you have Peter Robbins, the Chief Executive of the Internet Watch Foundation, appearing before you. I do not want to steal his thunder but it was the IWF who pioneered this policy. They maintain a list of the illegal websites, they hand it to internet service providers. 
BT was the first one to do it, and that is why 95% of domestic internet users in the UK can no longer get access to that kind of material. Many other countries are beginning to copy it, including the Australians. It certainly has wider applicability, but it only deals with illegal content. It is absolutely limited to illegal content and, in this particular case, it is only dealing with child sex abuse images. Q44 Adam Price: In Australia it has given rise to a wider discussion, including legal challenges. I understand that the Australians -- and I am familiar with your work -- are looking at maybe widening it to other categories of material which is seen as harmful but not necessarily illegal. That is the question I am asking. Mr Carr: In Australia, I think, you are legally obliged to provide a filtering package. Every Australian internet service provider is by law required to offer free to every Australian family a filtering package. I am not sure what the rate of take-up is, but it is an interesting experiment. Q45 Janet Anderson: Could I ask you, Professor Livingstone, about media literacy? I think you say in your evidence that you are not sure how effective this is. What is the importance of the role of media literacy? Microsoft have called for a government-funded public information campaign. Would you support that? Professor Livingstone: Yes, I would support all information provided to parents and children -- indeed, to everybody -- about the possibilities and the risks posed by all media, particularly the new media that people are unfamiliar with. I think media literacy is a crucial part of the broader picture. I think people are motivated to try to understand the media that they are using and that their children are using, and they would like to understand it better and be more sophisticated users, though they have constraints in how far they can do that.
What I put in the evidence was a note of caution that - ironically, in a way - just as there are questions about the evidence for harm, so too are there questions about the evidence that media literacy works. In other words, I do not think we yet have a body of evidence showing that if you provide or increase the level of media literacy among some compared to others, those people then encounter fewer risks or are better able to address the risks that they do encounter. That is the follow-up point I really wanted to make. So people learn more about the media but very often, when people know more about something, they use it in an even more complicated way. So media literacy might be the springboard to taking yet further sophisticated risks and encountering other kinds of harm. What I saw in the UK Children Go Online project was an association: those children who did go online and gained more benefits thereby learned more about the internet and were more media-literate, but they were the ones who were getting into more of the risks, and it is often the naïve, cautious one who knows less who is also safer. This is a paradox that I think we will really struggle to address. One thing I would urge is that, if we do have a big publicly-funded campaign, we do not evaluate it in terms of asking whether people got the message of the campaign but we evaluate it more ambitiously in terms of whether people then encountered fewer risks or whether they were better prepared to deal with the risks that they did meet. That is the question that does not get asked. Mr Carr: There was some work done at the University of Central Lancashire in Preston a few years ago. They interviewed a whole range of different children and found that some of the children who were doing very risky things on the internet were absolutely 100% aware of all the dangers and all of the risks and it did not have any impact at all on their behaviour.
There are obviously some children who are disposed to risky behaviour, perhaps those who come from more vulnerable backgrounds or whatever. They are the ones we really need to find a better way of targeting. Mr Carrick-Davies: Could I make a real plea to the Committee that, as you look at the whole issue of media literacy, you understand what is happening within the curriculum taught in schools? Universal access to the internet takes place in schools; most schools have 100% connectivity. It is now finally embedded in the QCA, the Qualifications and Curriculum Authority, curriculum at key stage three. That is 12, 13, 14-year-olds and upwards. Why is the content of e-literacy, with reference to safe and responsible use, not embedded at the age of seven, eight or nine, when children are just beginning to use this technology and we have greater opportunities to influence behaviour? We would argue that we need to do that in tandem with three other things: first of all, the empowerment, training and support of teachers coming into the profession. They do not understand how the swimming pool works; they have never dived into it. This needs to be hands-on experience. We have done extensive work with the TDA, Becta, QCA and Microsoft on a resource called Know IT All for Teachers, where we have actually helped them to understand the online safety issues. That is to be welcomed but more needs to be done. Secondly, if you think about where teachers derive their continuing professional development, we need to find ways to actually empower some of the teachers who have never used these social networking services and do not know a thing about the technology, because it is so much a part of children's lives. The whole point of schools is to prepare children for life outside the school gates, so we need to ensure that schools address this.
If the Government believes that healthy eating, obesity and school meals should be on the agenda, what are we saying about the reality of children's lives online if we do not touch it? To quote somebody from Becta who said it a few years ago, it is almost as if we said, as we did in the Sixties, "We have a no-smoking policy therefore we don't talk about smoking." You may not be able to access that very rich, interactive, potentially harmful and dangerous content in schools but we need to prepare children for it. I would argue that the biggest opportunity you have is to strengthen the curriculum on media literacy and ensure that schools are playing their part. This is my last point and then I will shut up. Parents do trust schools. They do not trust, with respect, large companies that look like they are trying to shift product. They also, with respect, often do not trust big government. What they will trust is the relationship they already have with their child's school, with their social worker, with a health worker. This is the challenge, if you like. We need to combine strategic initiatives with random acts of education, where children and young people are talking to their parents and understanding that there is a culture of care. If we can engender that culture, get children and parents to talk together, and get schools to address this issue because it is so relevant to children's lives, then I believe this Committee will make a profound impact in this area. Janet Anderson: Thank you very much. Chairman: Can I thank the three of you very much. Memoranda submitted by Microsoft, Internet Watch Foundation and Media Literacy Taskforce
Examination of Witnesses Witnesses: Mr Matt Lambert, Head of Corporate Affairs, Microsoft; Mr Peter Robbins, OBE, QPM, Chief Executive, Internet Watch Foundation; and Ms Heather Rabbatts, CBE, Chair, Media Literacy Taskforce, gave evidence.
Chairman: Can I welcome our next set of witnesses, Matt Lambert from Microsoft, Peter Robbins, the Chief Executive of the Internet Watch Foundation, and Heather Rabbatts, the Chair of the Media Literacy Taskforce. Q46 Mr Sanders: This is to Matt Lambert. Your submission outlines the range of Microsoft tools that parents can use to filter out harmful content. Other evidence says that these tools are not widely used. Is this the case? Mr Lambert: Yes, it is the case that they are not very widely used. It is disappointing, and that is one of the reasons why we have said in our submission to you, and also to Byron, that we very much feel that public education, driven by government with the support of companies, appropriate NGOs and other experts, including perhaps to some extent the police and law enforcement, should actually get out to schools, to teachers and to parents and drive awareness, either with public information campaigning or, I think more appropriately, through schools and other places where you can talk to young people, to parents and to responsible adults who have responsibility for children, and tell them that these tools are available - not just from Microsoft, of course; there are generic filtering tools available in other operating systems, and of course many other competitors to Microsoft in the internet security and safety area offer tools which work very well with Microsoft Windows and other operating systems. There is a low take-up but there is not, interestingly, low awareness that tools are available. Parents are not acting on that. Q47 Mr Sanders: What is Microsoft doing to try and ensure that there is a common and consistent system of parental controls among all software and games providers for parents?
Mr Lambert: We work with industry associations, particularly in the gaming area, which I think is perhaps where your question is directed, to encourage consistent advice about games ratings through the PEGI system, the Pan-European Game Information system. We work with other associations on a cross-industry basis. For ourselves, I think the onus on us is to recognise that parents are disadvantaged. Some survey work we did showed that 90% of children reckoned they knew much more about this than their parents, and that is the case with most people I know. Certainly it is my situation, even though I have worked in the industry for 20 years. My kids out-manoeuvre me on Xbox quite frequently. It is difficult but we have to make these systems as simple as possible, so that when they open up the Xbox for the first time, or Windows Vista, one of the things you will find is parental guidance about the family safety settings. We have a unique responsibility that we recognise because typically Internet Explorer is a gateway for many people to the internet in the first place through Windows Vista, a lot of people use our online services with Windows Live and MSN and, of course, Xbox, which is increasingly a live experience online for many people as well, including young people. We try to have a commonality of simple, straightforward controls which parents and responsible adults can actually understand and set, and they are password-protected, so that they can set a rating level available for children or they can say "This is the type of content I want my children to see. Here are some specific websites I want to block. I only want them to play online for a certain amount of time and I do not want them to use Instant Messaging at all," if you want to take a very restrictive approach. All of that is available. It is actually quite simple and straightforward.
Q48 Mr Sanders: We heard earlier that some parents confuse a classification with a skill rating. Mr Lambert: Yes. Q49 Mr Sanders: You may laugh but actually it is quite important. Mr Lambert: I am not laughing. It is not amusing. Q50 Mr Sanders: It is not you. I was criticising my Chairman, which is probably not wise of me! Would it not make sense to actually have a skill rating level alongside a classification so that you could remove that ambiguity? Mr Lambert: You are aware, I am sure, that there are two types of rating. Q51 Mr Sanders: No, I am not. Mr Lambert: Let me explain then. In the United Kingdom we basically operate two systems in parallel. One is the PEGI rating, which I have already referred to, which has ratings from three-plus upwards. We also have the BBFC, the British Board of Film Classification, which takes the same approach to games that it takes to films, which I think most people are aware of: X-rated films, PG, and so forth. Very typically, that rating comes in at 15, and primarily what they are doing is deciding whether a game is actually unacceptable under British law - harmful content and so forth - in which case the game is not released in the United Kingdom. So they have that role, but the ratings also apply, so you can see if it is an 18-plus, 15-plus and so forth. So the two ratings run in parallel. To come back to your original question, the PEGI rating is also age-appropriate. There is a game I was playing the other day with my eight-year-old that is appropriate for three-plus. It does not help that I am 44 and I had not a clue what I was doing, but that is another story. It is telling you broadly that this is a game that is appropriate for this age, but it will also tell you clearly if only 15-plus is advisable.
I think the problem here, and I think it was referred to in the last session, is that you have a situation where the games ratings are not taken as seriously, typically, as the cinema film ratings, and I cannot explain that phenomenon. It is not the case that parents are not aware of these ratings. We have done a bit of research. We surveyed 4,000 people across Europe, including a thousand people, broadly speaking, in the United Kingdom - a statistically relevant survey - and we found that 96% of parents were aware of an age-related video game rating system. They understand that it is there but they do not always operate it. I think possibly that is because, typically, parents did not play games like this when they were children; a lot of older parents certainly did not have them when they were children. This is a world that they do not fully understand. Microsoft does not wish to preach to parents; it basically wants to make the tools available and publicise them. Parents must decide, I think, the appropriate rating for content and games and so forth for their children, and each case can be different up to a point. The Government has to say at a certain point that certain types of material are unacceptable, or unacceptable for certain age groups, so there is a role for government, but I do think we need to educate parents and say that there is stuff which is inappropriate, there is a good reason why it is rated and they need to take a bit more interest in this. That, I think, is actually for governments. That is why I say Microsoft does not preach to parents. That is for politicians and governments to decide but we stand willing and ready. I think you have heard mention a couple of times in the previous session that we are very active in terms of training programmes, getting out to schools and so forth, talking to them about these issues. A couple of years ago we went out to 100 schools in this country. I think we saw 30,000 children aged 14 and above.
I myself trained about 900 kids in terms of giving them advice. We need to do this as a rolling programme. That is the advice we have given to Byron. It applies to games ratings, and we stand ready and willing to work with the rest of the industry, government and anyone else who wants to get involved to get out to schools, to talk to parents, to talk to teachers, to talk to young people and to tell them about the dangers. We must put the thing in context though. The internet is a good thing. We think gaming is a good thing. It should be fun and it should be part of pleasurable activity, but there have to be limits and boundaries, and we think it is up to parents, teachers and responsible adults to decide where they lie. Q52 Chairman: One quick last question to Microsoft. You heard in the previous session the suggestion that the default setting for a new computer should be set at the maximum level of security. Is that something you would be willing to consider? Mr Lambert: No, it is not. I know John Carr very well. We have a huge amount of respect for him and we have worked with him on many things, including at the moment looking at age verification and how we could do more around that area, but I do not agree with him on this issue. It is simply this. We set our ratings for search, for example, at the moderate rating, so that blocks most sexual content on Windows Live search. You put in some obvious word, looking for sexual content, and you will get a warning that this content is inappropriate. As I have already said, we also have the tools which allow parents to take a more interventionist approach and block entirely, and clearly, you can just shut off the internet. Where does the default lie? This is the question I asked John Carr and the charities, many of whom we work with as partners.
Are you saying we stop the internet and when you buy a computer you have to switch it on, or are we saying that we set it at a very high level which blocks most content, which is pretty much close to blocking the internet altogether, in my view? How long would people accept that? First of all, they would blame us. We would have to explain to our customers, "You have to do this to turn it back on." John would say, "I know, but that is an acceptable level of pain," but the reality is that consumers will complain, they will go elsewhere, they will simply use other technology which neither you nor I can control and, in my view, the net result is that you will do more damage and there will be more open access to inappropriate content. The right way, I believe, is to work in a self-regulatory regime but in a co-operative system, such as I have proposed to Byron and have mentioned here: that we work, not just as Microsoft but as a cross-industry thing, which I know the industry, ISPA and others are very willing to take part in, to raise awareness and make these tools as simple as possible. We will take any amount of advice from MPs here and from any customers. If you do not think the thing is simple and easy to understand, and you do not find it readily and clearly available, we are listening, and we will change it in the next system and we will backdate it where we can. You can download OneCare and backdate it on previous systems if you do not have Windows Vista. We try to make it simple and straightforward, but blocking wholesale - and setting at a high level is equivalent to blocking - would mean Britain would be looked at as being in the Dark Ages if you set that kind of legal restriction, in my view. Q53 Chairman: Can I turn to Peter Robbins. The IWF produces a list of banned websites and ISPs block access to them. It was suggested to us that actually it is comparatively simple to get around these blocks. Would you accept that?
Mr Robbins: The decision that the internet service providers, mobile operators and search providers took to voluntarily take our list was based on the fact that they were trying to protect consumers from stumbling across these types of websites and these types of images. It is true to say, though, that there are ways of getting round it, and therefore criminals who want to download indecent images of children will indeed find their way round these types of systems, but the thrust of the initiative was about protecting consumers as a whole from stumbling across these types of websites. Q54 Chairman: So you would have to be pretty sophisticated to be able to overcome the block? Mr Robbins: You do have to have some knowledge about other means of getting round it, yes. Q55 Chairman: The IWF is an industry-led body, self-regulatory. Do all the ISPs accept your banned list? Mr Robbins: As John said earlier, 95% of broadband consumers currently access the internet through six, seven or eight companies, all of whom take our list. There is a tail of roughly 100 companies - the smaller ISPs - that do not currently take the list but, to be fair, there is some confusion about how the smaller companies may be protected by virtue of activities upstream by some of the bigger providers. It is not clear to the downstream providers whether or not they are covered by the upstream providers. The way in which the industry has agreed to tackle this is to invoke a self-regulatory verification system, which we are developing with the industry, whereby they will be given data by us to enable them to check their services to see whether or not you can break through to the blocked content and, if you can, then of course there is a gap in the system.
So a small ISP will be able to take some data from us and check it to see whether or not it breaks through the system; if it does, they will know they are not covered upstream and therefore have some responsibility to put a blocking mechanism in place themselves. Paul Farrelly: I am sorry, Chairman. I did not understand a word of that. Q56 Chairman: Perhaps you could clarify. At the same time, are there any small ISPs that actually take a view that because - I do not know - they believe in freedom of expression on the internet, they deliberately do not wish to block access to sites? Mr Robbins: I do not think that is a position any company would take in relation to child sexual abuse websites, but I think there are issues for them in relation to cost and effectiveness, and cost I think is an issue that they have with government over this, yes. Chairman: Perhaps you could just clarify the upstream and downstream, for me as well actually. Q57 Paul Farrelly: There are 100 companies that do not abide by your list and seven or eight that do. To what extent do the "wicked" people go to the 100 and thereby gain access to the sites? Mr Robbins: Yes, that is a good question. We do not know, of course, how many individuals access the internet through those types of ISPs that may not be taking our list or blocking. Let me just give you an example. If BT is blocking, which it is, and you buy your internet connectivity from a reseller or virtual ISP who resells connectivity within a village or an area, then if you know that you are on the BT network, you know that you are covered by virtue of their decision upstream. However, because of the way in which the internet gets sold and resold, it is not always clear to a virtual ISP two or three steps downstream of BT that it actually is on the BT network. That is not easy to determine at the moment and that is something that we are working through now with those small ISPs, to try and establish where they may be covered upstream.
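[Editorial note appended to the transcript: the "coverage check" Mr Robbins describes can be sketched as follows. This is an illustrative sketch only; the IWF's actual verification procedure is not public, and every name below (covered_upstream, simulated_fetch, BLOCKED) is hypothetical. The idea is simply that the IWF supplies a small ISP with test data - addresses known to be on the block list - the ISP attempts to reach each one, and if any is still reachable there is a gap: the ISP is not covered by an upstream provider's blocking and must block for itself.]

```python
# Hypothetical illustration of the upstream-coverage check described in
# evidence: given test URLs known to be on the block list, an ISP that can
# still reach any of them is not covered by upstream blocking.

def covered_upstream(test_urls, fetch):
    """Return True if every test URL is unreachable, i.e. blocked upstream."""
    return all(not fetch(url) for url in test_urls)

# A simulated fetcher standing in for a real network request: the pretend
# upstream network blocks anything on an example block list.
BLOCKED = {"http://example.invalid/a", "http://example.invalid/b"}

def simulated_fetch(url):
    """Return True if the content came back (reachable), False if blocked."""
    return url not in BLOCKED

print(covered_upstream(["http://example.invalid/a",
                        "http://example.invalid/b"], simulated_fetch))  # True: fully covered
print(covered_upstream(["http://example.invalid/a",
                        "http://example.invalid/c"], simulated_fetch))  # False: gap in coverage
```

In this sketch, a True result corresponds to the small ISP being "covered upstream" (for example, by BT's Cleanfeed), while a False result tells it that it must deploy its own blocking mechanism.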
Q58 Paul Farrelly: Are the police, for instance, quite relaxed that these smaller ISPs exist, so that all the rotten apples end up in the smaller barrels and they can watch it more closely? Mr Robbins: No, I do not think so. Q59 Janet Anderson: If we could just stay on the subject of self-regulation and potentially harmful content that is not illegal, do you think self-regulation works or do you think there should be more formal regulation or legislation? Mr Robbins: Of the three areas of remit that we deal with, take first criminally obscene adult content: it is very rare for us to find any of that type of content hosted in the UK. Recently there was an interpretation of the Obscene Publications Act which meant that, if there was adult pornography of an explicit nature on a landing page - the front page of a website - hosted in the UK, which a child under the age of 18 could see, it might fail the Obscene Publications Act test of tending to "deprave and corrupt", and we could issue a notice to that service provider to take that website down. That was an interpretation of the law which has only been made in the last couple of years and, indeed, since we started issuing notices in regard to that, the number of adult pornography websites with explicit content has reduced significantly and we have not had to issue any notices in the last six or seven months. So as the industry takes its notices from us, it takes the content down, the word gets around and the volume goes down. But, of course, what has happened is that the majority of porn sites are now hosted outside the UK and, in terms of pornography, we do not have any relationships with any other government or any other hotline in the world to put them on notice about these types of websites with explicit content for them to do something about them or take them down, because they would be legal.
If you go to Holland, there are plenty of porn sites there which are legal but which in the UK would not be allowed. So there are issues in relation to where the content moves to, where it is hosted, and then obviously what the UK can do about that. Incitement to racial hatred is also within our remit but, again, it is very rare for us to find any of that type of content in the UK, and if there were any, we would issue a notice and it would be taken down immediately, and the police would be able, obviously, to get on with their investigation with regard to it. Of course, the complaints that we do get we trace to other parts of the world, where they have different laws and different views about what incitement to racial hatred might be, and therefore they will not take any action to take the sites down. The question then is, how do you regulate that in the UK if people can obviously access the internet around the world? I would say that the majority of those websites are relatively static. The porn sites and the race sites, or what are interpreted as race sites, are static. They generally are within the filter systems that you can buy as commercial products, so providing you tick the box on your computer and make sure that you do not want to see porn or race or whatever, those filters are usually quite effective. The reason why we provide a list of child sexual abuse websites to ISPs - and the difference between that and other types of content - is that these sites move all the time. They are taken down by authorities around the world, there is co-operation around the world and, of course, we have networks and hotlines around the world to tackle that but, because of the pressure they come under, they keep moving all the time. Consequently, filtering systems cannot keep up with the pace at which they move in the way that they can with general porn sites and race sites.
Consequently, ISPs take the list from us and they block it at a network level, which does not give anybody a choice about what they can and cannot see. That is not the same with pornography, because you and I can see it; it is not an offence for us to view it; it is not an offence for you and me to look at race sites, wherever they are hosted in the world. The Government are currently proposing some legislation in relation to extreme pornography, where they are minded to make it an offence to possess certain types of extreme pornography. The next step in regulating - in criminalising access to this content - is that to stop people from seeing it you have to make it an offence to possess it. The Obscene Publications Act does not make it an offence to possess it unless you are going to distribute it; the same with race material, unless you are going to go on and incite people to do things. So government are having to think about how you can control the content and then make it an offence to see it. Q60 Janet Anderson: So you think there is a need for some legislation? Mr Robbins: If you could decide what was harmful - if anyone can decide what is harmful - then of course it would be helpful for people to know where the framework is and where the parameters are. Of everybody who has spoken in front of me today, none of us is sure where you would actually put the threshold levels for those sites and who would make those decisions. That is the challenge we all see. Q61 Janet Anderson: You mentioned countries around the world and the way these websites move around. Are there particular countries where this is a problem? Mr Robbins: Yes. Q62 Janet Anderson: Would you like to name them? Mr Robbins: There have been problems in Russia. There were no reciprocal arrangements with the Russians in relation to taking down those types of websites or having them investigated.
There has been more co-operation recently and, indeed, one structure in Russia which was commonly known to be posting this type of content has actually disappeared for the time being. That has meant that many of those websites have now moved. They appear now to be hosted in the US, or many of them have moved to the US. In defence of the US - or if I can just clarify - it is a massive country with millions of servers and, consequently, given the amount of internet space that they have, of course people go there. It is cheap to host your content over there. They have a serious volume problem. We understand why these people move their content temporarily on to a server there and then, when they think the police are going to catch up with them, move it on again. That is the challenge and that is why there is an issue about the US. Q63 Chairman: You talked about the areas you cover, which in a sense are the ones which will command general agreement that this is unacceptable and everybody can work together. Mr Robbins: In the UK, yes. Q64 Chairman: But the debate is now moving into other areas. I will just give you two examples - there may be many: calls for action against websites which tell people how to commit suicide, particularly young people, and secondly, the Prime Minister has flagged up concerns about sites which are acting as recruiters for extremist, potentially terrorist, organisations. Are those areas where you think potentially you might be able to operate the same system, to have a list of websites which the ISPs would then block access to? Mr Robbins: I think the model that we have can be applied to many other areas of content, provided, of course, that the thinking in relation to it is not necessarily around the criminal law. Let us just take suicide websites, for example.
I am not clear that there are any suicide websites in the UK advocating suicide which would bring them into conflict with the legislation as we currently have it. They are hosted outside the UK. There is no reason why either our organisation or another organisation could not set up a hotline for the public to report these types of websites to, for them to be assessed, and then for somebody - an organisation, a caring organisation, for example - to make representations to those websites and to the governments of the countries where they are hosted that these are affecting children, et cetera, and get them taken down in that way. There is no reason why that cannot happen. Without laws you can still do these things. I still think the model can be replicated in other ways. It is easier, of course, if there is criminal law involved, because then people take down and remove and all that sort of thing. In terms of blocking, those websites will be static so they will be in filters; they will be in the commercial types of filters, I think, because they will fit into one of the categories, so therefore they are known. If you are perhaps expecting a list to be provided to service providers, mobile operators and search providers at a default level, if you like, at the top end, so that they are the ones that are blocking it and nobody has a choice to switch it on or off, then there are problems about whether or not you and I should be prevented from looking at something that we can legally look at. We may not want our children to look at these sites, and they may entice children to go and do things but, nevertheless, where do you draw the line there? With child sexual abuse content, of course, it is an offence for any of us to look at it. It is not the same with these other types of content. I think that is an area where there is difficulty once you want to put it on a network where there is automatically a default to stop you and me from looking at it.
Q65 Chairman: Do you see the right place to tackle this problem as being at the ISP level or perhaps at the individual PC level? We had a demonstration from Microsoft of how you can actually tick boxes for different categories of content to say whether or not you wish to filter all of those out at the PC level. Mr Robbins: I cannot think of any other sensible suggestion: it does seem to me to be in the hands of parents and carers and the family as such to set the parameters under which their children should be operating. I cannot think of an alternative solution to that. Mr Lambert: Can I just add one other comment, Mr Chairman? I think there is an element here of sites where there is a clear intent to advocate, say, terrorism or suicide for others. That would almost certainly be in breach of the standard terms and conditions of most of these hosting services. Certainly if it were something we were hosting and it was clearly advocating terrorism, that would be in breach of our terms and conditions and, if we were aware of it, we would take it down. Where there is no grey area, there is an opportunity for hosting organisations to intervene. Q66 Paul Farrelly: Just on that point, Mr Lambert, I hear your answer on that, yet, with respect to your content-hosting operation YouTube, after the death of the teenager--- Mr Lambert: We do not own YouTube. Q67 Paul Farrelly: I am sorry. It is Google. When Google were challenged about these videos glorifying gang violence, acting as adverts to "join our gang", their response was to shrug their shoulders and say, "It is not our responsibility. We are not a censor, neither before nor afterwards." Clearly, that is a judgement for them to take. What is your view on that sort of attitude, which says "We should not be asked to censor this sort of stuff", when in fact censorship goes on all along with respect to some of the other things that we were talking about?
Mr Lambert: With respect to Google, you are offering me an open goal! I am not going to kick the ball straight into the middle of the net. I think that is a matter for each individual company. That is my answer to that. They each have to look at what is appropriate, what the laws are and what else they need to do. All I can say is that Microsoft takes an approach where we look at the law as it stands and we also look at what we think is a responsible duty of care, particularly where young people are concerned, which is, I think, the prime focus for us. Q68 Paul Farrelly: We heard in Seattle with respect to your Xbox that there were certain video games involving gang violence that you would not encourage people to develop on that platform or develop yourself because it was not good for your brand. Mr Lambert: We take specific choices not to do certain types of content. You are probably aware from your visit that we also give each of our developers and anyone who is involved in the marketing or any part of it, if they do not like a game, if they find it unacceptable, they do not have to work on it. There is no effect on their career; they are just moved to something more palatable to them personally. Incidentally, we take a very specific attitude on this, and I think this is common across the games industry, if I may say so, that they do not market games inappropriately. I am actually not just speaking there for Microsoft; I think I can speak from a broader industry point of view. They are not targeting inappropriate games at younger children. I really do not think that is the case. There might be a case for Byron or yourselves to look at the way that retailers operate. By and large they are responsible about this but you might want to think about whether there is an aspect there. That is clearly an area where the law could intervene, if you wanted to raise the penalties, for example, for ignoring the age ratings and selling to minors. 
Q69 Paul Farrelly: Given your responsible attitude towards games development and Google's on-the-record attitude with respect to YouTube, they are not really coming up to your standards, are they? Mr Lambert: You might say that. We have our own standards. We think they are quite high. Others can choose whether they try to meet them or not. Q70 Rosemary McKenna: Can we move on to media literacy, Heather? There seems to be a consensus developing that media literacy and education are a large part of the strategy for dealing with harmful content but some say that the effectiveness of this approach has yet to be proven. Are people jumping on the bandwagon because it is an easy answer? Ms Rabbatts: No, I do not believe that is the case. Just making reference to my colleagues' comments, I think what they amply demonstrate to this Committee, and indeed the experts giving evidence earlier today, is the sheer pace of change that we are witnessing in the world of the internet and that the biggest safeguard is to enable people to understand and be able to be responsible, as adults in terms of their responsibilities, as parents to children, and indeed, as Mr Lambert was saying, as children constantly outwit their parents with the latest sets of technical devices. I absolutely believe, and it is the view of the Taskforce, that, linked with self-regulation and intervention where appropriate, our biggest defence, to enable people to participate in what, let us be clear, are very creative opportunities - millions of people are taking advantage of their access to the world in a very different way, most of them positively - is to ensure that they understand how they can protect themselves. I think we are in the early days of ensuring that people have appropriate skills. As Professor Sonia Livingstone mentioned earlier, we are getting some of those messages through. It is about being able to act responsibly in terms of understanding those messages. 
There is work under way both with the regulator, Ofcom, and ourselves as we start to chart over the forthcoming years whether we as citizens are becoming media-literate. I do not think it is the easy answer. I think it is complex and you have to look at how we educate children, and indeed adults, to navigate themselves through what is a very complex world. Q71 Rosemary McKenna: Would a media campaign be effective? Ms Rabbatts: I certainly think that finding ways to inform the public on different levels, be they campaigning, be they the responsibilities of my colleagues, such as Microsoft, they are all important ways in which we convey those messages. I also believe, and certainly some of the work that the Taskforce has been involved in, by working with the BBC or Channel 4 or the Film Council, those bodies that are responsible in the public space for moving media content, that actually they are providing practical tools, examples of how young people can participate in this world in a safe environment. So I think it is about a multi-layered approach in terms of education, from formal in terms of what we are now discussing in terms of the curriculum, in terms of adult education and in terms of broadcasters playing their role in terms of ensuring that citizens can both participate but also be critical and protect themselves from harm. Q72 Rosemary McKenna: Would you like to comment, gentlemen? Mr Robbins: I do not have any easy answer to this either. I think it is an issue of literacy. The speed at which these devices and the technology behind them are changing is a challenge facing us all. My staff and I are challenged every single day with the types of content, where it is, how we trace it, how people get access to it, which device they have. It is complex and that is the world that we are in now. I agree with all the previous speakers that it needs to be tackled at different levels. 
How you get one message out given the various areas of understanding is the challenge that we all face. Q73 Rosemary McKenna: Of course, the question is then how would you measure the success of the campaign, or is it just continually working with people as they buy new equipment? Would putting a requirement on the providers of the equipment help? Mr Robbins: I think any information that is provided at the point of sale is important. We all know when you go and buy a device, when you unpack it, there is quite a lot of information in there. There are disks of all sorts, bits of paper of all sorts, and of course, you do get lots of messages as you fire up the machine and start to work your way through it. I have to say that, whatever product you buy these days, there is so much information around it and it is about reading, understanding and applying that, and I confess I find it difficult with all sorts of appliances that I buy, because they are so sophisticated. It takes weeks and months sometimes to work out how they all work. Q74 Rosemary McKenna: In the meantime, the kids have got it all up and running! Mr Robbins: Yes. Ms Rabbatts: They do it without even reading the manuals. Going back to the point, we are at early days. Ofcom are obviously responsible for looking at how we are promoting media literacy and trying to get an assessment of how we are changing. There is work being undertaken in terms of evaluation and a toolkit to enable all the different initiatives that are currently in play in terms of media literacy to be quantified and subjected to more empirical interrogation. I think we will continue to learn in this area and what is important is that there are ways in which we all come to understand that body of evidence that is available to us so that we can hone those messages more effectively in the future. Mr Lambert: Can I follow very briefly on the question? There are two angles to your question. 
I have commented on the onus on the companies to try and make the technology simple and straightforward, and customers always have a view on that and they should give that view very firmly to all the companies, and they do to us readily, as you know, but I do think in terms of public information, one of the problems we have here - and I say this as a parent as well as representing my company - is that it is extremely confusing. There is so much information coming out. One of the things I have said to the Byron review and I think also to you in the past is that the onus is on government to try to funnel all of that into a simple set of campaigns. There is actually something called Get Safe Online, and it has subsidiaries which apply to child safety, which most of the industry gets behind and the Government gets behind, but not even all of government knows that it actually sponsors this Get Safe Online campaign. Somebody in government has to take responsibility and say "It is up to the Government to call the industry together, figure out, for each age group, and for parents and teachers, the right messages for each of those groups", because they are all subtly different, and then get them out. We all know from our own experience of public education - "Clunk click every trip", the Green Cross Code--- Q75 Rosemary McKenna: Is that not an Ofcom responsibility rather than a government responsibility? Mr Lambert: It could be, yes. Ms Rabbatts: I think Ofcom as a regulator does have a duty to promote and it may well be that in the next Communications Act you may wish to consider broadening some of those duties. I think the need for clarity of simple messages is overwhelming. What we have to understand is that the message today will be pretty redundant by tomorrow. 
So whatever we decide to put in play, I think we all have to understand we have to work at much greater speed and flexibility than previous regimes of self-regulation, regulation and education have encompassed because of the nature of the devices we are now talking about. Q76 Rosemary McKenna: Can we move on to risk for children? Is it healthy for children not to face any risk at all? Ms Rabbatts: I think for children - and I am sure you will be talking to experts in this field - what we have to understand and what they have to begin to understand, and parents should be guiding them, is how they keep themselves safe. That is the first issue. We do not let our children just walk into the road without teaching them to stop and to use pelican crossings. The first thing to think is: "I am entering into this world; what is the safety that I, as a parent or responsible adult, owe to my three-, four- or five-year-old?" It is then for children themselves to begin to understand how they manage risk, and that is a very challenging domain. I think it has been mentioned previously that children can be aware that they are entering into risky areas but they make the decision to continue to do so. That is the changing nature of boundaries that are happening in young people's lives. I think that is a very big and complex question. I absolutely believe that children need to be protected and the fundamental way is by their parents acting responsibly and not leaving them alone in bedrooms for hours on end, whatever the filtering systems they may have. We cannot abdicate parental responsibility to some other force that is out there. I then think it is necessary, whether it be schools, whether it is about media literacy, for it to become more enshrined in our curriculum going forward about how you navigate your way into this world and understand risk. I think that is a very large and complex subject. Q77 Rosemary McKenna: How do you get that message across to parents? 
There are still adults with very little understanding of the internet. Ms Rabbatts: Yes, and we know that, with young people who are vulnerable, often one of the common traits that they are sharing is that they lack responsible adults in their lives. With young people who become at risk of exclusion from school or of offending, one of the things that I know from my work in local government in the past is that we see an absence of responsible adults. In some ways this is no exception, this world that they are now living in. So it is about adults understanding how they need to protect their children. That is why I think media literacy in terms of adult media literacy is an important component in terms of how we ensure adults, as they open those boxes, have very simple messages around how they might help to protect their children going into some of these worlds. The other important dimension that we should not lose sight of is that this is also a very creative world. There are many advantages to participating in this world. So you have to have a balance of a message between protection but also enjoyment, fulfilment and learning about how I can be part and parcel of the communities and the democratic society in which I am participating. Q78 Chairman: Would you agree with the recommendation we heard from Stephen Carrick-Davies in the last session that the key thing which could be done is to address this in the curriculum at quite an early stage? Ms Rabbatts: We have worked at the 14-plus level in terms of some of the new diplomas being introduced in education. I think in terms of young children, becoming media-literate, be that creative, be that being able to participate but also having that critical understanding is an important part of the curriculum going forward. 
Clearly, there is work under way and I think this is an area of opportunity for the Department for Children, Schools and Families and the Department for Culture, Media and Sport to collaborate in terms of understanding how media is operating in the world and linking that to a curriculum offer that can be effective. I do think that there is an opportunity for us to do much more work in that space going forward. Q79 Chairman: I have two rather separate, quick questions. First of all, Peter Robbins, you described how your organisation tackles child pornography on an international scale with hotlines between countries and co-operation. Is it correct therefore to say that, although these things do move around and pop up, we are essentially on top of the problem in that particular area of child pornography? Mr Robbins: I would say that we do have confidence now that we have identified a core of 2,500 websites which constantly move, have organised crime below them, that now need to be tackled by joined-up law enforcement work across the world. Yes - you never solve these things but nevertheless, the collaboration between governments and NGOs and the private and public sector has brought about a position, I think, where that is not under control but is being tackled. Q80 Chairman: It would be fair to say then that those people who, when we raise concerns about these other areas, say "The global internet is impossible to regulate, it is all over the place, you cannot possibly stop a server in some obscure Pacific island" or whatever, you have demonstrated that if the will is there, it can be done. Mr Robbins: If there is a will, it can be done. That is true. Q81 Chairman: So really the problem is persuading people to address some of these other areas with the same degree of concern that everybody agrees needs to be shown to child pornography. Mr Robbins: Yes. 
I think that is fair to say but, at the same time, we know how long it has taken us to get from 1996 to 2007, where we are aware there is some common agreement. There are still countries where it is not an offence to possess child sexual abuse content and therefore there is still quite a long way to go in some countries for them to catch up as well. That is with an area of content which not many people would want to argue the case against. As soon as you move into all these other areas that we are faced with, we do not see any common agreements anywhere. There are disputes and debates about this everywhere we go. As soon as this default system of blocking was introduced by ISPs, of course, their worry was about what more we would add to the list. That becomes a technical challenge to companies. One of the problems Australia is going to have is that if you keep adding hundreds of websites, the way in which the technology currently works means it does start to slow traffic down, which is a challenge for everyone, as consumers, who are then not getting the speeds which they want as well. It is not as simple as default at the network level. Do not forget, our list is only somewhere in the region of 1,200 or 1,500 to 2,000 maximum on a daily basis. It keeps moving up and down, fluctuating, but there are tens of thousands, if not millions of porn sites. If you wanted to block those out, you cannot do it at that layer which I am talking about. There is no consensus or agreement around those either. Q82 Paul Farrelly: We were going to get back to what is harmful or not in terms of content. I think we have covered that. I am interested, as a parent of young children, in boundaries and laws. With young children, if you are a good parent, you set boundaries. You do not take the risk that a certain type of behaviour leads to harm in the future as that individual develops. You just say "That is wrong." 
Should we not just set clearer boundaries without getting into the debates? For instance, I have mentioned adverts on places like YouTube that serve to say gang culture is a really good thing. Should we not just lay down the law and say this is unacceptable, it may be potentially harmful, we do not want to go there and this sort of content should not be hosted? Mr Robbins: You are looking at me as if I know the answer to that one as well. My organisation in the independent role that we have does not have an opinion about what you have just said. I understand, as an adult and a member of society, what you are trying to convey there, and I think there have been some examples recently where some companies whose adverts have in some way or another appeared on certain types of web pages that they do not want to be associated with are taking a view about that, because of the way advertisements are served up and how they are delivered across different platforms, but my organisation does not have a position on that, I am sorry to say. Q83 Paul Farrelly: Can I just pursue this? I do not know whether it is illegal. I used to be a newspaper journalist. I do not know whether it would be illegal for my old paper, the Observer, to have an advert in there from Mr Gangster saying "Come and join my motley crew because we are far better than the lot down the road" but I am sure the newspaper would not take it and the newspaper organisation would make sure that if it did, it was pretty much condemned. Is there not an inconsistency there in self-regulation? Advertising is self-regulatory. Mr Robbins: I think you are going into areas which I do not think I am qualified to comment about or to form an opinion on. 
I understand the point that you are making in terms of whether a brand wants to be associated with some other type of activity but those are decisions the people who purchase the space and sell the space need to come together on and if there is a way that that can be understood, that the Observer does not want to be associated with - I do not know - a social networking site, for example--- Q84 Paul Farrelly: My question is about consistency in self-regulation across different spheres of the media. Heather, you wanted to come in. Ms Rabbatts: I was just reflecting on the introduction to your question, which was a contextual point. Outside of the boundaries of criminal law, I think what you are beginning to put forward is really a sense of an individual's moral compass, that as a parent, as parents, hopefully we have a view, which is that you do not leave a three-year-old unattended in front of a computer for hours on end or 20 minutes without knowing they are watching something or doing an educational game and you know exactly what they are doing. I think there is absolutely a responsibility on individuals and citizens around their moral compass and how they are parenting and protecting young children. The other point that you make is a difference in terms of how self-regulation acts in the print world as opposed to an online and media world, and I think we have to appreciate that we are talking now about two very different spheres of influence. We have traditions that have grown up in print, we have evolving traditions in the online world, and part of the role of this Select Committee and indeed its investigation as it listens to evidence is understanding how we are beginning to try and understand intervening in a global community, where boundaries of regulation that used to be applicable in a practical sense, in the real world, are just not in the world that we are now living in. Mr Lambert: It is certainly possible to make those choices. 
For example, I cannot remember if it is a legal requirement or not but the Committee of Advertising Practice gives guidance on age-appropriate advertising, including certain types of advertising about types of food and so forth, and we respect that on our Instant Messaging, on our live services generally. I do think companies can make a choice and they can recognise local guidance or indeed, obviously, law, as I have already said. You have to recognise it is obviously an international, global thing, the internet, and what is acceptable for children here in Europe, or in the United Kingdom specifically, is completely different in many other parts of the world. It is tough to do it globally. Where a company is local or has local facilities, such as we do, you can take that choice. Q85 Chairman: There is one final area I just wanted to come back to, which I think is probably more for Matt and Heather: video games. I think we have heard there has already been quite a lot of agreement amongst witnesses that consumers are confused by the fact that they have two different rating systems. Would you share that view and, if you do share that view, do you have a view as to which of the two classification systems is preferable as a universal one to recommend? Mr Lambert: I have not seen evidence that shows that they are specifically confused about the two systems. It may be true but I have not seen the evidence for it. What I have seen evidence of is that parents are aware that there is a rating system and, anecdotally, I understand that parents do not necessarily enforce that or they do not take notice of it. Coming back to your question, should there be one, yes, we think that if there is going to be one, if Byron or you are going to recommend that there should be one, actually, counter-intuitively, it should be PEGI. 
The reason why I say that is because it would be nice and easy for a British institution to say it is the British Board of Film Classification and it is ours, which is certainly true, but the beauty of PEGI is that it thinks very carefully about game-appropriate, age-appropriate rating, legal level of acceptability. It takes note of what the BBFC is saying legally and as guidance in terms of films but the BBFC is set up for films and when it rates games, it takes the film approach. Games are recognisably different. There are different types of content and a different approach, and PEGI thinks very carefully; it breaks it down to a different level. So the PEGI rating actually will tell you more about whether there is bad language in this, it will give you a symbol for that as well, whether there is gambling, whether there is sexual content. It will give you a rating but it will also give you a series of symbols potentially which will tell you that the game has a whole suite of different things. You might think "This game is appropriate but it also has bad language" and then you might think again. It is a different depth and it is more applicable and more sensible. It also has a pan-European feature to it, and you have to accept that Britain is a major centre of the gaming development industry, which is why Microsoft has a major centre of development here in the United Kingdom. It employs 6,000 people, and we have several of the best development companies, a couple of them in the Microsoft house but many, many others. Britain is known for being that, and I think part of that is producing product which is sold responsibly right the way across Europe and hopefully internationally. It is a great thing for Britain. 
Q86 Chairman: The BBFC's evidence does directly contradict your view, in that they say that PEGI classifications are less reliable because they derive from inferior methodology based on self-assessment whereas every BBFC decision is based on extensive game play by independent BBFC examiners. Mr Lambert: I am not saying that that is wrong. I apologise if I have given the impression that they do not do it at all but, on the other hand, they would say that, would they not, in terms of the "We are the best" concept? I am not criticising what they say in terms of the factual but I do think PEGI is thought about in terms of the games specifically and I do think myself, notwithstanding what you have just told me, that the BBFC primarily comes from the world of film and that I think is certainly true. Q87 Chairman: Heather, do you have a view on this? Ms Rabbatts: Just to say that clearly many games are inspired by films, and indeed, I think having clarity of a system is hugely important. I also think we need to keep it simple, whether it be the BBFC or PEGI, and what we do not want is too much complexity which means that people are not exercising the judgements about what is appropriate for their children to watch. I would urge both bodies to come together so that we can actually arrive at something which we think is workable. Q88 Adam Price: Just very briefly, we have recently had the idea floated of banning internet users who illegally file-share through the internet. Do you think there is any value in extending that to people who, for instance, upload graphically violent material, et cetera, the idea that rights and responsibilities could also be extended to how people use the internet if they abuse that? If it is technically possible, they revoke their right to use it. People who spam, for instance, have their right to use an ISP removed. Mr Robbins: Actually, I do not know how you ban. That is the problem I see. 
I do not know how you police a banning order, if there were such a thing, and who would decide what type of content. It is back to this decision about what they uploaded and then who would take a view as to whether they should be banned. If it were associated with a court case where part of the conviction system was that they could be given a banning order or an anti-social behaviour order which denied them access to the internet--- Q89 Adam Price: An ASBO on the net? Mr Robbins: Yes. I think people convicted of paedophilia using the internet have supervisory conditions applied now to their probation about access to the internet, supervision and things like that. These things are doable but you come back to the practicalities, I think, of relating the content to a decision and then an authority to ban it. Ms Rabbatts: I would echo that point. I come back to the importance of education in the broadest sense of that word, that, as citizens, we have both rights and responsibilities and we need to exercise them in our day to day lives, be that when we are online or in work, et cetera. I think that is the biggest safeguard in terms of going forward. I think every time one thinks about a way of trying to filter or ban, as quickly as one has done that, there will be an invention that gets around it, and ultimately it comes back to "I am not going to upload that bullying incident that took place in the playground because a very effective anti-bullying campaign runs in the schools between teachers, parents and children". That is what you need to do to tackle that. Stopping them uploading will not help you. It is about stopping it at source. Similarly, issues around anorexia are particularly around looking at why young women have such poor self-images and poor self-esteem and are self-harming. Those are the problems, not the fact that people are going to those sites. It is about how you tackle it. Adam Price: Thank you. Chairman: We must stop there. Thank you very much.