Select Committee on Culture, Media and Sport Minutes of Evidence


Examination of Witnesses (Questions 1-19)

MR JOHN CARR, MR STEPHEN CARRICK-DAVIES AND PROFESSOR SONIA LIVINGSTONE

26 FEBRUARY 2008

  Q1 Chairman: Good morning. This is the first session of the Committee's inquiry into harmful content on the Internet and in video games. Our inquiry is running roughly in parallel, perhaps a little behind that of Dr Tanya Byron; although we will be looking at similar ground, we will in due course be hearing from her as well. I would like to welcome to this, our first session, John Carr of the Children's Charities Coalition for Internet Safety; Professor Sonia Livingstone of the LSE; and Stephen Carrick-Davies of Childnet International. Perhaps as an opening question, I could ask you whether or not you think that the considerable media coverage of the dangers and negative aspects of online content and in video games is perhaps being exaggerated to the extent that we are overlooking the positive aspects to young people from potentially games and the Internet.

  Professor Livingstone: I do not think we are overlooking the benefits, I think there are many initiatives going on, many ways in which people are gaining access to the Internet, both at schools and at home, precisely because they are very confident of the benefits of the Internet, but at the same time, there is a context of anxiety, partly led by media panics, partly led by some of the experiences that people are having with the Internet, which is leading perhaps to an overemphasis on the risks online by comparison with some of the other possible causes of dangers or ill-effects in everyday life. So sometimes the media get blamed for all kinds of ills which they are not solely responsible for. But I think people are very clear that this is primarily a fantastically beneficial medium that they want to give their children as free access to as they possibly can.

  Q2  Chairman: When you say media panics, do you think that the media is exaggerating, is there a certain degree of hysteria creeping in?

  Professor Livingstone: I think there is some hysteria, and sometimes that makes us overfocus on the risks that the media pose, rather than some of the other problems in children's lives, let us say, which are also part of the reasons for the troubles they face.

  Mr Carr: We need to be clear, the media do not make it up; the cases that we hear about and read about are things that have happened in our criminal courts, that have been reported on accurately, but the headlines do not always follow the line of the story. I would hate us to think that, in a sense, we have no need to worry about anything. The fact that the media occasionally overdo it with the odd headline should not blind us to the fact that there are genuine risks and perils out there.

  Mr Carrick-Davies: Two comments on benefits. First, I think I would recognise that the vast majority of users still embrace the technology, we see that just from the point of view of take-up. The overwhelming experience of children and young people is broadly positive, creative, engaging, they can do so many things with this new technology. The educational argument is there; we have seen where technology has been properly embedded within learning environments, and that is an important caveat, that we are seeing great improvements in attainment, to say nothing of the tremendous motivation that technology brings, and the opportunity for children to create their own learning environments. But clearly, when there are areas of risk, and there are very real areas of risk, some of that is communicated because of the case study, because of the individual child that has been hurt, and that stirs up enormous emotions. Parents can very easily be given the message that all is not well, and I think the challenge is that we have to contextualise the risks in the wider experience of children using the technology. We recognise that the vast majority of abuse takes place with children in contact with people that they know, not online. But the media, understandably, will draw out issues of risk, and as John said, they are not making it up. We know from our experience at Childnet that 22% of young people report to have been cyber-bullied. There is no way you can argue about that, one in five children will say through research that their experience of peer abuse is very real, but I conclude by saying that most people accept that there is an element of risk, and it is probably impossible to fully prevent children from being exposed to potential risk, as it is in real life. So there needs to be a mature debate about how these risks are portrayed, and what is the real baseline and experience that children have online.

  Q3  Chairman: Can I ask you specifically also about video games? We have focused and will continue to focus on the online threat, particularly from individuals who may use the ability to contact people through the Internet for harmful purposes, but games as a separate category: first of all, do you think there is a problem with games; and secondly, do you accept the argument from the industry that games too have benefits, educationally and developmentally?

  Professor Livingstone: I think there is a question about the evidence for both of those claims. There is not terrifically good evidence that playing video games benefits children, in terms of educational or social benefits, and there is perhaps equally contested evidence that video games can harm children if they play them. So there is quite a body of evidence, particularly in the States, of the ways in which particularly playing violent video games repeatedly may have harmful effects, particularly for some children, particularly for some boys, but there will continue to be debates, both about the possible benefits and about whether playing those games has harmful effects outside the situation of playing the game, as it were.

  Mr Carr: This is a problem that we come up against repeatedly when dealing with these types of questions. The fact that there is not necessarily any concrete evidence of harm does not mean that there is not any harm. You would not get through any decent ethics committee of any academic institution a proposition to allow you, for example, to subject a group of children to a constant barrage of violent images or violent games or whatever, as against a control group of other children who were not subjected to those games, simply to see what the outcome was in relation to both of those groups. Ethically, you simply could not do that. So you have to, to some degree or another, use some common sense when you think about these things, because you will never get absolutely cast-iron, empirically solid evidence, I do not think, to support the proposition either way. One of the aspects in relation to games, however, that is of concern, irrespective of the level of violence or pornography or whatever it might be that the game itself might contain, is the overuse. Certainly in some countries, we have had cases reported of children dying at the console of exhaustion; not a very common phenomenon, thankfully, but nonetheless, it is there. This has led some governments, for example, to consider requiring or asking the games industry to try to make sure that all of the high-scoring bits of a game occur early on in the game. So there is a disincentive, the law of diminishing returns sets in, so the longer the child stays on the game, stays on the console or whatever, there are fewer scoring opportunities or fewer opportunities to get a new weapon, or whatever it might be that the game is about. So there are other concerns around games than simply the exposure to violent images or pornography; this issue of overuse and extended use, replacing a social life, whatever you want to call it, getting out and playing football or whatever, is also then part of the equation too.

  Mr Carrick-Davies: I would just add to that, and first of all say I think it is excellent that the Committee is looking at this issue of gaming, because what we are seeing is this multi-user aspect, children are not just playing solitarily on their own, or with their brother or sister in a closed environment, they are actually communicating to often strangers, and of course, there is a very obvious interest that adults and children may have in one game. So I think that is an important point to note. I echo Sonia's point that we need to have greater evidence of harm, and also the educational benefits; the question was, what are the benefits, and we know that children derive enormous pleasure from playing games. It improves their confidence, their sense of social standing, their ability to multi-task, their ability to receive conflicting bits of information and education. It does strengthen the informal learning that takes place in this environment. But I think it is really important that the Committee reviews this, that you talk to the industry about how they are standardising the advice for parents. Our experience at Childnet is that parents are very easily confused. It is not helpful to see two different types of ratings system in this country at the moment, both have their strengths and weaknesses. There needs to be greater standardisation, there needs to be greater educational awareness about what games are actually about, so it is an important area.

  Professor Livingstone: There does seem to be an issue particularly within homes about children playing games which, when they have been rated, are rated as much higher than their age. There seems to be a kind of casualness within the home or a difficulty for parents to regulate this which means that many children are playing games with a much higher age rating.

  Mr Carr: Some parents seem to think that if a game is 18, and they think Little Johnny is Einstein, that he must be able to handle an 18, it is proof that Little Johnny is Einstein. In fact, the 18 rating is about the content and nature of the game, it is not a skill level, and that could be made clearer very often.

  Chairman: Perhaps on this question of harmful content, I will move to Rosemary McKenna.

  Q4  Rosemary McKenna: Thank you very much. Just to comment, though, I think some parents have exactly the same attitude to film and video, DVD, so it does not only apply to the Internet. Can we move on to what could be potentially harmful content? In some cases, it is pretty straightforward to decide which content is harmful and what action should be taken, for example, child abuse, but for other content, it is much more difficult to decide what is appropriate. How and where should the boundaries be drawn for content, where the level of harm is a subjective issue?

  Mr Carrick-Davies: Could I kick off on this, colleagues? I think it is a very difficult question, is it not, where do you draw the line? The industry can very clearly say that if it is illegal, they will not host it, or they will not access it, but whether it is harmful, as you say, is a very subjective issue. Three points of challenges here: first of all, the challenge is that children may be harmed by things on the Internet that are not necessarily perceived as traditionally harmful or illegal. Secondly, with this Web 2.0 world, which is the phrase that illustrates the user-generated content, there is a real blur between the consumer and the creator, and therefore, it is very, very easy for children themselves to upload content or images themselves which can be harmful. This is such a challenging issue. We often see children as passive victims, but actually, the work we have been doing is that this medium, although it is neutral, gives children an enormous ability to amplify inappropriate, harmful, spiteful, devastating comments. The third thing, this is why it is so challenging about where do you draw the line, is that it is a global medium, you will appreciate that it is continually evolving, that it is subjective for parents, even for parents with children of a certain age group, they may be affected by some things which other children in that same age group will not be. I think we need to recognise that there is this grey; there is black and white, but there is also this grey, and how do we respond? I would argue that there are going to be three huge players who need to be involved in this. Clearly industry need to have responsibility, and I am sure in this debate we will talk about the opportunity of media literacy and greater better by design opportunities that industry have, but there is obviously an issue of empowering, supporting and educating children, such an important part of the work of schools, and clearly the role and responsibility and the care for parents to empower them, to give them confidence to talk to children, so that children are equipped, just in the same way that they are equipped when they walk down the High Street, to recognise harm, to respond to harm, to overcome harm.

  Mr Carr: Can I just say, very briefly, on this: if it is illegal, it is clear-cut, it should not be there. Certainly within the UK, in relation to child abuse images, for example, the industry has done an absolutely fantastic job in screening that out. If it is not illegal, then there is no basis on which Government or police or anybody can ask the industry to intervene, because the industry are not moral arbiters, they are not priests, they have no locus, any more than you or I have, to say to a particular family, or a particular child, "This is good for you or bad for you". What happens in reality is within a school, for example, the headteacher sets the policy within the school, or the individual subject teachers, so they collectively decide what is acceptable in terms of internet content that children can access or use within that school. I am afraid it is and ought to be the same within the home. What is acceptable for liberal lefties living in Hampstead will be very different from—I am going to be very careful what I say next: for other types of people, shall we say. There is no way that you can legislate from the centre, or an Internet service provider can legislate from the centre and get that right. There are software packages around, there are programs around, filtering packages and so on, that each family can use to match their own particular social, moral, religious, cultural values, and I think sadly it is as complicated as that.

  Professor Livingstone: I would start from a different starting point: I do not know what it is possible for you to say to the industry, or what possible regulations might be put in place, but if I think of research on parents and on children, their starting point is often the environment of television, where there has been a very carefully regulated environment. So with television, parents and children have a very clear understanding of what the norms are, what the expectations are, what can and cannot be said, at what point they might complain, and to whom, how they manage what children of different ages would have access to, and it is an environment in which we can say that there is a good relationship between regulation, self-regulation and media literacy or parental management. When it comes to the Internet, the content that I think really worries people, and that children say can upset them, and we have figures of 10-15% of children and young people say they have been upset or distressed by something they have seen online, sometimes that is about access to extreme content of various kinds, but very often the point is that that was access which was unexpected, accidental, without any framework of mediation, without any warning, and content that for them is not, as it were, justified within the frame of the material itself, or for which there was any warning. While I take John's point that in the home, there are many tools available for filtering and monitoring and so forth that parents can implement, the struggles that parents have, and the research shows parents have to implement this, and we have to say the inequalities, it is very often more middle class parents who are more equipped and able and motivated to implement some of those facilities, the result is that many children are continuing to come up against accidentally upsetting, inappropriate and on occasion harmful content.

  Q5  Rosemary McKenna: To what extent is user-generated content that encourages violence harmful?

  Professor Livingstone: We do not have good evidence on that specific question, but what we can say is that there is evidence that material which, let us say, is outside one's expectations, outside one's norms of what you expect to see, is often shocking. It can be part of setting norms about what is acceptable behaviour, so if you see certain kinds of, let us say, cyber-bullying on YouTube, if you see that kind of content routinely, it might set a different norm of what is acceptable in a different situation in the playground. There is something, I think, to be said, and a lesson from television about the impact of repeated images, of seeing over and over certain content on television, we did not only worry and try to keep off certain kinds of extreme violence, but we have also worried about the repetitive effect of violent images over and over, and I think on the Internet we have not begun to think about that at all. We have only really thought about the extreme content.

  Mr Carrick-Davies: If I may just add another comment on that, it is very important to reflect on the differences between harm in the offline world that children could exert against each other to that which is online, because of the 24/7 nature of this, that it is hard for a child to actually escape from the bully, they cannot have the sanctuary of the home. The very fact that there is a lack of closure, that if something has been put up on a social networking site, or has been circulated to vast numbers, is very challenging for the child to ever think that has been removed, it will always be up there in space. The other classic point, we do need to watch very, very carefully, and schools tell us this, is the bystander effect, that children will actually see, as Sonia alluded to a little bit, that this becomes the norm, they then get involved in it, and see it as a prank, or as a joke, or as just a bit of fun, rather than the devastating pain that it causes. The Government, the DCSF has done some very good work on giving advice for schools on cyber-bullying, one of the messages is that those bystanders, if they laugh at it, they are very much a part of it. So I think we cannot see the issue of harmful content without the context of the peer group.

  Q6  Rosemary McKenna: On these social networking sites, to what extent does content that displays personal information lead to harm on social networking sites? For example, I have heard of young families who refuse to put their children's photographs on social networking sites. Is that a reasonable thing to do?

  Mr Carrick-Davies: That would be dependent on, as John said earlier, the morals and norms and values of the family. I think there is a real danger that children do not appreciate that a photograph that they do put up on there can be online forever. One of the messages that we say is: would you want your future employer to see that photograph that was taken at a party in a compromising position? One of the challenges—

  Q7  Rosemary McKenna: I do not mean that, I mean just normal families, exchanging photographs across the social networking sites, I have heard of people who are concerned about doing that, because of what could happen, the abuse of that photograph.

  Professor Livingstone: I think the issue there is about the kind of privacy controls that are provided for those sites. So if we imagine your family wanting to put up family photographs, they can set it to private and control who gets access to those photographs.

  Q8  Rosemary McKenna: Do you think that is robust enough?

  Professor Livingstone: I think very often people are confused about how those controls work, they are not very subtle, so pretty much you either give people access or you do not. There is some evidence that people do not understand or do not take very seriously what those decisions are, partly because they are not aware of possible abuses of that information. So I think there can be more done there, both in making people aware of the possible risks but also in making those controls more tailored to the particular needs of those who use them.

  Mr Carr: The point you raise goes a lot wider than, if you like, the traditional—traditional is perhaps not the right word, but the worries around paedophiles getting access to information and tracking down a child. I was horrified a few weeks ago when I read, for example, the admissions tutor of one of our older universities acknowledging that he had been trawling some of these social networking sites, looking for information about people who had applied to his college. By the way, he was subsequently rapped over the knuckles, very publicly, I am happy to say. But there is an anxiety around it, certainly in the Children's Charities field, about the fact that colleges, or more pointedly, perhaps, employers, might be looking at these sorts of sites, and seeing, you know, some 17-year-old kid doing something that 17-year-old boys or girls do, and that being then seen by a potential employer or a potential university admissions tutor, and weighing in the balance when they are deciding whether or not to offer the person a job or an interview or a place at university. I think that is shocking, I think it is outrageous, it is not illegal, but I hope it soon will be, and certainly we intend to take this up as a campaigning point. We do tell children and advise their parents and so on not to put things of that kind on the Internet, but they do, that is what kids do. They should not, however, suffer potentially a lifelong penalty from not getting into their preferred university to do their preferred course, not getting the job or the job interview, because people use that information in a different way from that for which it was intended. It is certainly something that you may be hearing from us on in the near future.

  Q9  Chairman: How do you make it illegal?

  Mr Carr: In the same way that we have anti-discrimination laws about what you may or may not take into account when you are offering somebody a job. The point about these pictures and these sorts of things that you sometimes see on the websites is you can never possibly have the full context in which the particular photograph—

  Q10  Mr Sanders: How many cases are you aware of, of people who have not got university places or have not got jobs as a result of this?

  Mr Carr: There was the case in the newspapers that was reported, I think it was three or four weeks ago, and anecdotally we have had parents and others—there is no hard evidence.

  Q11  Mr Sanders: So there is one case in the paper.

  Mr Carr: Yes, a few weeks ago.

  Q12  Mr Sanders: Was it proven beyond all reasonable doubt, that they—

  Mr Carr: He had acknowledged that he had made a mistake, and he promised not to do it again.

  Q13  Mr Sanders: The employer?

  Mr Carr: The admissions tutor. I will send you the press cutting.

  Q14  Mr Sanders: So we have one case. What we are discussing here is real harm to children, not one case of somebody who might not have got into a university because of something—

  Mr Carr: That would be a real harm to that child.

  Q15  Mr Sanders: Maybe it would have been real harm, but what we are concerned about is the much more general issue, we seem to be going down what I think are side roads of things of predicting what might happen when there is no real hard evidence for it at all.

  Mr Carr: Well, there was in that one case, and there are certainly anxieties and fears that it is happening on a wider scale, and it should not, that is all. I do not want to labour the point.

  Q16  Mr Sanders: But you could not enforce the rule anyway, could you?

  Mr Carr: No, but what you could do, and I would expect responsible employers to say to their personnel management, or their personnel staff, their recruiters, and responsible universities to say to their admissions tutors, "Do not do it, it is not allowed".

  Q17  Mr Sanders: But you cannot enforce it.

  Mr Carr: You can say that about almost any law really.

  Q18  Mr Sanders: I am sorry, but—

  Mr Carr: It is all about getting the evidence, is it not? I agree, it is not a major—we ought not to be detained by it. I apologise if I have introduced a red herring.

  Q19  Adam Price: Can I take us down a different side route or path? Professor Livingstone, you made the analogy between the kind of regulation and self-regulation that we have, for instance, in the broadcasting world, or even in the press, and indeed, Hugo Swire has recently made a proposal for creating some kind of, I suppose, Internet standards body. Do you see any merit in creating some kind of, I do not know, umbrella organisation which would regulate or self-regulate content on the Internet?

  Professor Livingstone: Well, this is a huge issue. Perhaps I can start somewhere slightly different, in that it is my understanding that all kinds of self-regulation of online content goes on in various ways already. Internet service providers, content providers respond to complaints from individual consumers: they take down upsetting user-generated content from YouTube, if somebody makes a complaint to that end; they are aware of content that might damage brands or contravene copyright or whatever. So I would not make the starting point that online content is not being regulated, should we have a body to do so? I would rather say there is all kinds of activity going on that does already do some kind of regulation; is the public aware of it, does it know that they could complain, do they know that there are other—they do not read their terms and conditions, they do not read their privacy policies, we know that. Is there some way in which that process could be more co-ordinated, more accountable to the public, made more transparent? Would people have a right of appeal if they disagreed with a decision made in terms of content? I would think about those kinds of principles and say that there is more that could be done than is currently being done. Whether that is best wrapped up as one new body, or whether this is a way of extending Ofcom's remit, that is not my expertise. What I can see is that the public understands very well how all kinds of traditional content are being regulated, they understand how television is differently regulated from the press, they understand that even the advertising that appears on their child's bus stop is regulated. They are simply bemused and upset that they cannot see and they do not understand that anything like this happens online, and at the same time, to go back to the benefits, they have very clearly grasped the message that they must and should get their child online and give them broadband access, to help with their education and so forth.

  Mr Carrick-Davies: Could I echo just one of those points? I think it is right that the Internet is not some moral vacuum, that there are very good existing laws for some of the providers who work in the online and offline space, from the point of view of advertising, for example, that they need to adhere to. At Childnet, we would really be very cautious about imposing new stringent regulation or the creation of any new regulations in this space because of the fact that there are unintended consequences of that. The very fact that you can restrict freedom of expression, which is such a valuable plus of the Internet, giving children an enormous voice, an enormous opportunity to communicate, to say nothing of the increasing costs of paying for this monitoring, the fact that less scrupulous providers may come into this space, it is a global medium, but that does not mean that we just say, okay, we just have to cope with this laissez faire. There are very important things we can be doing in terms of identifying what is working, drawing out best practice, monitoring the way that that best practice is actually being embedded and being rolled out. There is a real duty on governments to keep up to date with the statute book, and make sure that there are not gaps, and we need to often play catch-up. There is such a need to develop a kind of international co-ordination of this. This is a real leadership issue, ultimately, for the G8 or other international bodies, including the Internet Governance Forum, which we should be recognising is now really taking mainstream and looking at issues of access, security, openness and diversity. So I think my wish for this Committee would be to identify what is working well, and to make sure that we really prioritise the strengthening of existing multi-stakeholder initiatives, so we can actually ensure that there is a good safety net in this environment.



 
previous page contents next page

House of Commons home page Parliament home page House of Lords home page search page enquiries index

© Parliamentary copyright 2008
Prepared 31 July 2008