Select Committee on Culture, Media and Sport Minutes of Evidence


Examination of Witnesses (Questions 90-99)

MS JULIET KRAMER, MR STEVEN BARTHOLOMEW, MS TRISH CHURCH AND MR HAMISH MACLEOD

4 MARCH 2008

  Q90 Chairman: Good morning. This is the second session of the Committee's inquiry into harmful content and we are focusing particularly this morning on mobile platforms. I would like to welcome Juliet Kramer, the Head of Content Regulation at T-Mobile, Steven Bartholomew from 02, Trish Church from Orange UK and Hamish MacLeod, the Chairman of the Mobile Broadband Group. Hamish, can I start with you? You suggested in your submission to us that you felt that, actually, you already had in place most of the necessary building blocks for self-regulation. Can you tell us whether you think that there are any gaps that need to be addressed or is it just a question of utilising what is already in place?

  Mr MacLeod: It is mostly about utilising what we already have in place. When I talked about the building blocks, I was not talking just about mobile, I was talking about the wider UK set-up altogether. There is perhaps a lack of understanding as to what all the building blocks are. There is probably a lack of coordination between the building blocks and that could be improved a lot, but there is a lack of understanding and a lack of agreement amongst all the parties about how we actually go about addressing problems that pop up pretty quickly and pretty suddenly in an efficient and well-understood process.

  Q91  Chairman: Is the view shared across the operators that actually we already have in place a lot of what is required?

  Mr Bartholomew: I think it is. From O2's perspective, without wishing to sound complacent, we are very proud of the code of practice we put in place four years ago. We have always tried to predict and pre-empt problems that come along with new media technology. It is not always possible, but our approach has always been to try to predict and pre-empt and the code of practice is a very good example of that. What we are especially proud of is the fact that it was a world-first here in the UK, so the UK is leading the way. It has been transplanted from the UK into other European Union Member States and now many aspects of it have been adopted by the European Commission to form the basis of their framework on safer mobiles in the European Union. We are very pleased with the way in which things are working at the moment. There is more that can be done and we are happy to continue to work with all the interested parties.

  Q92  Chairman: It seems to me that there are two distinct areas of concern. There is access to particular websites which contain content which is potentially harmful. At the extreme end you have that which is covered by the Internet Watch Foundation where there is general agreement that these sites should be blocked, but a lot of public concern is growing about different types of websites, some helping people by advising them how to commit suicide, others encouraging them to stop eating, others making available extreme pornography or violence. Are you content with the ability to block those? At the moment it appears that it is only the very extreme end of it that is blocked, whereas the public concern relates to a much wider area?

  Mr MacLeod: May I just step back a little bit? When the European Commission, two or three years ago, suggested that we should set up regulatory frameworks to regulate pretty much anything that goes on the Internet, as part of their review of the Television Without Frontiers Directive, we in the UK were extremely vocal in rejecting the notion that we should have one regulate-everything-type body. What we have created in the UK is a series of taskforces that address very specific topics: within the Home Office we have a number of policy groups, one looking at child sexual abuse, another looking at violent crime reduction; in the Department of Health, we have got them looking at anorexia and those sorts of things. I think that is absolutely the right approach that the relevant experts are brought under one roof to discuss the problems that come up which are very, very tricky. You are absolutely right that there are lots and lots of grey areas and what should come out of those discussions and policy development is some really quite firm guidance, even to the point where there is clear legislation as to what is deemed to be genuinely harmful and should not be made available and that which falls the other side of that bracket, so that commercial companies are not put in the position that they are having to make these editorial decisions. There needs to be wider debate in Parliament and elsewhere about these very specific problems.

  Q93  Chairman: Just looking at the other area, we are going to come on to specific examples in due course, but the other area where there is a lot of concern is basically user-generated. It is the sites which provide a platform and then the users can use those sites for what are potentially harmful purposes: Second Life has been cited in the news very recently; there is YouTube; and obviously the social networking sites. As operators do you feel any responsibility to try to control how those are used?

  Mr MacLeod: We are a voice. We are not in a position to control or to mandate what they do, but, like everybody in this discussion, basically the UK has taken a collaborative self-regulatory-partnership approach to regulating things on the Internet and we are a voice and we can influence that. We would like to see a little bit more transparency around the editorial policies that they use to make decisions about whether to remove content or not, and we would like to see a little bit more transparency around the timescales in which they undertake to remove content.

  Ms Kramer: We do offer some interactive services within our own portal so the uploading and downloading is all done by mobile customers. In those situations we do fully moderate both comments and pictures, which we are able to do within our portal.

  Q94  Chairman: Does "fully moderate" mean a person is actually looking at every piece of content that is uploaded?

  Ms Kramer: Yes.

  Mr Bartholomew: Our approach on content regulation obviously begins with illegal content and you are familiar with the work that is done around clean feed. We then have the harmful categories of content and we are aware of the public debate there. O2 and some of the other companies represented here today operate their own chat rooms or user-generated services. Our acceptable use policy states that we will not tolerate the kind of harmful content that people are concerned about in those spaces. If we see it, we will take it down or we will not allow it to be posted in the first place.

  Q95  Chairman: Are you monitoring the chat rooms?

  Mr Bartholomew: Yes.

  Q96  Chairman: Somebody is watching everything.

  Mr Bartholomew: Yes, our chat rooms fall into two categories. They are either put behind age-verification controls, so you have to be over the age of 18 to get access to the chat rooms, or the bulk of them, which are not behind age-verification controls, are moderated. The comments are posted and then, after the event, somebody goes through and reads those comments. If we see things we are unhappy with, we take them down or, in incidences where we believe there has been an attempt perhaps to groom an under-18-year-old by a paedophile, we will take that evidence and give it to CEOP, effectively the police. We are both a pipe and a publisher. In our own services, we will take action. Of course we just form a very small part of the Internet. We give our customers the opportunity to put filters in place to control the rest of the Internet, if they wish to do that. That gives them a real degree of control and reassurance. At the same time though, we do recognise that we are not an arbiter as to what our customers can actually see. Provided the content is legal, we accept that they have the right to go where they like on the Internet to access that content. If people believe content is causing harm to the public, ultimately we believe it is a role for Parliament to define that content as illegal and then we can work with the appropriate agencies to take action against that content.

  Q97  Mr Evans: Specifically on the suicide sites, do people contact you and say their youngsters were looking at this site and they are aware that those sites are there? Do you get these contacts?

  Ms Kramer: Within our moderated chat rooms or blogs, if the moderator sees that someone might be considering some self-harm or suicide, it would flag up as a warning and, as a response, they would actually send them information, help information, saying contact the Samaritans or whatever the appropriate link would be and the text would not be posted up; that would all happen between the moderator and the customer.

  Q98  Mr Evans: I think I am right in saying that somebody actually did commit suicide on the web did they not, with a webcam?

  Mr Bartholomew: Yes, that is correct.

  Q99  Mr Evans: And that was following a chat room site where basically people were egging him on to do it?

  Mr Bartholomew: Yes, that is correct.

  Mr MacLeod: May I just be absolutely clear? With the mobile we do have this clear distinction between publisher and pipe and the publisher is where we have our own portal services which are in a mobile-only environment. There are a few third-party partners that are content providers and within that sphere of influence we do have a reasonable amount of control. Outside that, there is the wider Internet where we really are just an access pipe. When we talk about moderation and controlling and all that sort of thing, that is happening within our own portal.



 
previous page contents next page

House of Commons home page Parliament home page House of Lords home page search page enquiries index

© Parliamentary copyright 2008
Prepared 31 July 2008