Select Committee on Culture, Media and Sport Tenth Report


Conclusions and recommendations


1.  We agree that any approach to the protection of children from online dangers should be based on the probability of risk. We believe that incontrovertible evidence of harm is not necessarily required in order to justify a restriction of access to certain types of content in any medium. (Paragraph 53)

2.  It is sensible that parents set boundaries for their children's online activities, but a totally risk-averse culture in parenting will not equip children to face dangers which they will inevitably encounter as they grow older. (Paragraph 54)

3.  The Home Office Task Force on Child Internet Safety has, by common consent, done good work and has served its purpose well; but its loose funding and support structures have given the impression that its work is of a comparatively low priority. We agree with Dr Byron that the structure and funding of the Task Force should be formalised. We also welcome the announcement by the Government that the date for establishment of the proposed UK Council for Child Internet Safety is to be brought forward from April 2009 to September 2008. However, we are concerned at reports from some key players that there has been no contact with Government to take this forward and from others that there has been little opportunity to influence decisions as to how the Council will operate in practice. We expect the Government to address these issues urgently. (Paragraph 62)

4.  We agree that the Council, at least in its early years, should be chaired by a Minister, to ensure that Council members have direct access to public policy-making. However, we question the proposed joint chairing arrangement, which excludes DCMS Ministers. We believe that it would be unfortunate if DCMS were to appear subsidiary in Council governance, given its role in media regulation, although we recognise the practical difficulties in sharing the chairing role between many Departments: indeed, we question whether co-chairing is desirable in principle. We invite the Government to consider carefully whether to appoint a single lead minister, either from one of the Departments represented or perhaps from the Cabinet Office. There may be a case in future for the Council to be chaired by someone who sits outside Government, particularly if the role of the Council is to expand. Given that the Government has accepted Dr Byron's recommendations in full, we believe it should now move quickly to provide a budget. (Paragraph 66)

5.  While there might be an expectation that most of the Council's effort would be directed towards child protection, we believe that there is a danger of overlooking possible harm to vulnerable adults, and we recommend that the Government should give this proper consideration when deciding the Council's terms of reference. (Paragraph 67)

6.  We are much impressed by the work of the Child Exploitation and Online Protection Centre and its close co-operation with charities such as the National Society for the Prevention of Cruelty to Children. However, we are concerned that levels of funding are not keeping pace with the increasing volume of work which is referred to the Centre, and we therefore encourage the Government to look favourably on any request by CEOP for increased resources. We also welcome the financial contribution made by charities and industry, and we believe that the latter should be increased: business models for Internet-based services rely upon public confidence that networking sites are safe to use, and CEOP plays a large part in delivering that safety. (Paragraph 70)

7.  We strongly recommend that terms and conditions which guide consumers on the types of content which are acceptable on a site should be prominent. It should be made more difficult for users to avoid seeing and reading the conditions of use: as a consequence, it would become more difficult for users to claim ignorance of terms and conditions if they upload inappropriate content. The UK Council for Child Internet Safety should examine this at an early stage and produce recommendations as to how it is best achieved. (Paragraph 85)

8.  We are also concerned that user-generated video content on sites such as YouTube does not carry any age classification, nor is there a watershed before which it cannot be viewed. We welcome efforts by YouTube to identify material only suitable for adults, such as that containing foul language, and to develop potential controls to prevent children from accessing it. (Paragraph 86)

9.  We do not believe that it is in the public interest for Internet service providers or networking sites to neglect screening content because of a fear that they will become liable under the terms of the EC E-Commerce Directive for material which is illegal but which has not been identified. It would be perverse if the law were to make such sites more vulnerable for trying to offer protection to consumers. We recommend that Ofcom or the Government should set out their interpretation of when the E-Commerce Directive will place liability upon Internet service providers for content which they host or to which they enable access. Ultimately, the Government should be prepared to seek amendment to the Directive if it is preventing ISPs and websites from exercising more rigorous controls over content. (Paragraph 95)

10.  We found the arguments put forward by Google/YouTube against their staff undertaking any kind of proactive screening to be unconvincing. To plead that the volume of traffic prevents screening of content is clearly not correct: indeed, major providers such as MySpace have not been deterred from reviewing material posted on their sites. Even if review of every piece of content is not practical, that is not an argument for undertaking no review at all. We recommend that proactive review of content should be standard practice for sites hosting user-generated content, and we look to the UK Council proposed by Dr Byron to give a high priority to reconciling the conflicting claims about the practicality and effectiveness of using staff and technological tools to screen and take down material. (Paragraph 96)

11.  File titles and screening tools can help to identify files which appear to present a particular risk of exposure to inappropriate material. We encourage sites which handle user-generated content to develop, as a priority, technological tools to screen file titles and to prevent the upload of, or to quarantine, material which potentially violates terms and conditions of use until it has been reviewed by staff. We also encourage sites to share their knowledge and expertise at the UK Council for Child Internet Safety, with a view to developing codes of practice for prior screening of material. (Paragraph 97)

12.  We find it shocking that a take-down time of 24 hours for removal of child abuse content should be an industry standard. (Paragraph 98)

13.  We believe that there is a need for agreed minimum standards across industry on take-down times in order to increase consumer confidence. We recommend that the UK Council for Child Internet Safety should work with Internet-based industries to develop a consistent and transparent policy on take-down procedures, with clear maximum times within which inappropriate material will be removed. This should be subject to independent verification and publication. (Paragraph 99)

14.  We await the announcement by the Ministry of Justice on whether the law might be strengthened to help prevent the use of the Internet to encourage suicide. Even if it concludes that the offence of assisting suicide is clear enough in law to enable successful prosecutions of those responsible for websites which assist or encourage suicide, we believe that the law should not be the only means of controlling access. The characteristics of the offence should be clear enough in law to enable access to such sites to be blocked on a voluntary basis, possibly through the procedures established by the Internet Watch Foundation. The UK Council for Child Internet Safety should accord a high priority in its work programme to discussions with the Ministry of Justice on whether the law on assisted suicide is worded clearly enough to cover websites which assist or encourage suicide and to enable action to be taken to block access to them. (Paragraph 117)

15.  At a time of rapid technological change, it is difficult to judge whether blocking access to Internet content at network level by Internet service providers is likely to become ineffective in the near future. However, uncertainty about its future effectiveness is not a reason to forgo such blocking while it remains effective for the overwhelming majority of users. (Paragraph 119)

16.  We believe that there would be advantage in establishing a forum at which governments or regulators from across the world could try to find common ground on how access to content on the Internet should be treated. This may, in time, lead to a more co-ordinated effort in frustrating access to material which is widely perceived as harmful. We recommend that the Government should take a lead in establishing such a forum. (Paragraph 124)

17.  It is clear that many users of social networking sites, particularly children, do not realise that by posting information about themselves, they may be making it publicly available for all to see. We recommend that social networking sites should have a default setting which restricts access to users' personal information, and that users should be required to take a deliberate decision to make that information more widely available. We also recommend that consideration be given to alerting users through pop-up displays about the risks involved in submitting personal details without restricting access. (Paragraph 128)

18.  We commend Microsoft for providing a facility for direct reporting to the Child Exploitation and Online Protection Centre within Windows Live Messenger. We believe that high profile one-click facilities for reporting directly to law enforcement and support organisations are an essential feature of a safe networking site. We recommend that the UK Council for Child Internet Safety should impress upon providers of networking services the value of direct one-click reporting from their websites to law enforcement agencies and voluntary sector organisations with expertise in offering support to vulnerable people. We also believe that facilities for reporting abuse should be obvious to users and should be directly accessible from all relevant pages of a website, close to the entry point. We would expect providers of all Internet services based upon user participation to move towards these standards without delay. (Paragraph 132)

19.  We recommend that network operators and manufacturers of mobile devices should assess whether it is technically possible to enable images sent from mobile devices to be traced and viewed by law enforcement officers with the appropriate authority. (Paragraph 138)

20.  We commend Microsoft for their efforts to ensure that there are varied and effective parental controls built into their hardware. We believe that other console manufacturers should be encouraged at least to match these. We hope that this matter will also be considered at an early date by the UK Council for Child Internet Safety. (Paragraph 140)

21.  We expect the Government to apply continuing and, if necessary, escalating pressure on Internet service providers who are showing reluctance to block access to illegal content hosted abroad. In a lucrative market, child safety should not come second to the cost to Internet service providers of installing software to block access to child pornography sites. (Paragraph 146)

22.  We believe that leaving individual companies in the Internet services sector to regulate themselves in the protection of users from potential harm has resulted in a piecemeal approach which we find unsatisfactory. Different practices are being followed and there is a lack of consistency and transparency, leading to confusion among users. Nor is there any external mechanism for complaints about services provided by Internet-based industries to be considered by an independent body. However, we do not believe that statutory regulation should be the first resort. Instead, we propose a tighter form of self-regulation, applied across the industry and led by the industry. We therefore call on the industry to establish a self-regulatory body which would agree minimum standards based upon the recommendations of the UK Council for Child Internet Safety, monitor their effectiveness, publish performance statistics and adjudicate on complaints. (Paragraph 153)

23.  We recognise that a number of companies may choose to set higher standards for their own commercial reasons, but the public need the assurance that certain basic standards will be met. This is particularly important in the area of child protection and Internet safety. However, the new body might also take on the task of setting rules governing practice in other areas, such as online piracy, peer-to-peer file-sharing and behavioural advertising, which, although outside the scope of this inquiry, are also of public concern. Given the global nature of the industry, it is impossible to make membership compulsory for all service providers, but a widespread publicity campaign should ensure that consumers are aware that they can have confidence in the standards of protection and reputable practice which membership of the body carries with it, and that companies which choose not to join cannot guarantee the same. (Paragraph 154)

24.  Our preferred model for any new body to maintain standards among providers of Internet-based services is that of the Advertising Standards Authority, which is generally successful at securing compliance with codes for advertising standards but which, if necessary, may refer companies which persistently breach those standards to statutory regulators that can apply penalties. (Paragraph 155)

25.  We commend the Government for the action it has taken to motivate the Internet industry, the voluntary sector and others to work together to improve the level of protection from risks from the Internet, particularly for children. However, we regret that much of this work remains unknown and has therefore done little to increase public confidence. We look to the UK Council to build on the existing agreements and to ensure a much greater public awareness of what has already been achieved. (Paragraph 157)

26.  We also note that the Government originally suggested that four different Ministers should give evidence to our inquiry and it does seem that there is scope for improved co-ordination of activity between different Government departments. We recommend that a single Minister should have responsibility for co-ordinating the Government's effort in improving levels of protection from harm from the Internet, overseeing complementary initiatives led by different Government departments, and monitoring the resourcing of relevant Government-funded bodies. (Paragraph 159)

27.  We endorse the thrust of Dr Byron's recommendations on improving media literacy, and we commend her for her approach. However, we believe that the one-stop shop will be worth locating on the DirectGov website only if search tools, social networking sites, video-sharing sites and Internet service providers offer a direct link: otherwise the one-stop shop will languish in obscurity. We also recommend that all new computer equipment sold for home use should be supplied with a standard information leaflet, to be agreed with the IT hardware and software industries through the UK Council for Child Internet Safety, containing advice for parents on Internet safety tools and practices. (Paragraph 185)

28.  We agree with Ofcom that parents will need to take on greater responsibility for protecting children from harm from the Internet and from video games. In particular, they should be aware of the consequences of buying devices which allow unsupervised access to the Internet; they should have more knowledge of young children's social networking activities and be more familiar with video game content, thereby gaining a better understanding of the risks; and they should, wherever possible, discuss those risks openly with their children. We recommend that the UK Council for Child Internet Safety should investigate ways of communicating these messages to parents. (Paragraph 186)

29.  We recognise the concerns that the hybrid system for games classification proposed by Dr Byron may not command confidence in the games industry and would not provide significantly greater clarity for consumers. We believe that, ideally, a single classification system should be adopted. While either of the systems operated by the BBFC and by PEGI would be workable in principle, we believe that the widespread recognition of the BBFC's classification categories in the UK and their statutory backing offer significant advantages which the PEGI system lacks. We therefore agree that the BBFC should continue to rate games with adult content and should have responsibility for rating games with content appropriate only for players aged 12 or above, and that these ratings should appear prominently. Online distributors should be encouraged to take advantage of the BBFC.online scheme which should be promoted as offering greater confidence to parents about the nature of the game. While we hope that PEGI will work with the BBFC to develop a single system, distributors are of course free to continue to use PEGI ratings in addition, as they do at present. (Paragraph 203)


 

© Parliamentary copyright 2008
Prepared 31 July 2008