Culture, Media and Sport Committee

Written evidence submitted by the Information Commissioner’s Office

Thank you for giving the Information Commissioner’s Office (ICO) the opportunity to submit evidence to your inquiry. Your inquiry is timely, given that the abuse of children via the online services they use is becoming a matter of increasing concern.

We should explain that as the regulator for the Data Protection Act 1998 (the DPA), our statutory interest in this matter lies in the processing of personal data. This takes place where, for example, a company hosts information that identifies someone on its website or where one individual posts information about another on a social networking site. However, as we explain later, our ability to police postings made by private individuals is very limited.

In our submission we have decided to focus on a limited number of key areas, rather than to explore all the data protection issues that can arise in the context of children’s online safety. These are:

1. The DPA’s Domestic Purposes Exemption

Section 36 of the DPA says that personal data processed by an individual only for the purposes of that individual’s personal, family or household affairs (including recreational purposes) are exempt from the data protection principles and other parts of the Act. This means, for example, that where one child posts inaccurate (and quite possibly hurtful and malicious) information about another as part of a personal online spat, there is no action the ICO can take against the child who posted the information—ie we cannot use our normal powers to order the correction of the inaccurate data. However, the organisation that hosts the information cannot itself benefit from the DPA’s domestic purposes exemption—we discuss the implications of this later.

The ICO is not calling for a change to the law here. We recognise the importance of the current exemption in terms of individuals’ freedom of expression and their need to process personal data for their own reasons without the threat of regulatory intervention. We also recognise the—perhaps insurmountable—logistical challenge of the ICO being expected to police the enormous amount of personal data that private citizens post about each other through social-networking and other means. We doubt if any organisation could take on this task effectively. We also recognise that organisations running services such as social networking sites cannot be expected to check all the information that individuals post about themselves or each other.

It seems to the ICO that the best way to protect individuals who are the victims of abusive postings is for the online service industries—particularly social networking sites and search engines—to be encouraged to develop more effective ways of allowing individuals to have content about them “taken down” or rendered unsearchable. We recognise the practical problems here: we do not think it is realistic to expect service providers to vet all the content they host, and recent relevant case-law would generally seem to support this approach. (See for example the recent Court of Justice of the European Union: Advocate General’s Opinion in Case C-131/12, Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González. Note the final judgement here has not yet been handed down.) Nevertheless, we do think “the industry” should be more responsive to the wishes of private individuals to have content about them moderated where it is clearly causing them distress. (This is quite different from public figures being able to hide negative stories about themselves.) We are aware of the legal uncertainty and technological difficulty here. We know from our own experience of dealing with determined, well-resourced individuals with access to a great deal of legal and technological expertise that, once posted, it may be impossible to remove information from the internet effectively: content taken down from one site often ends up being posted on another. However, we believe that it is often possible to limit the damage caused by information posted online, if not to eliminate it completely. We urge the industry to develop more effective means of facilitating redress for aggrieved individuals.

We hope that as the information age develops, the legal landscape will become much clearer in terms of the responsibility of online service providers for information that they host, but which private individuals post. We also hope that the technologies that are so impressive in terms of making content searchable and accessible can be used to protect individuals in respect of content that should never have appeared online in the first place. ICO will continue to engage with “the industry” to further these objectives.

ICO has recently published guidance on social networking and online forums—it is available here:

2. Privacy Notices and Explaining Personal Data Issues to Children and others

“Privacy notice” has become the prevalent term in the UK and elsewhere for the information that organisations are required by law to make available to people to tell them, in short, what will happen to any information they provide to an organisation. This is clearly an important component of online safety for children and others, because a properly drafted, accessible privacy notice should allow individuals to make properly informed choices about their information. A privacy notice might allow a child to decide not to provide his or her personal information to a particular website because of the way it intends to use the information.

Privacy notices can only take us so far, though, and we suspect that they largely go unread by adults, let alone children. However, for older, more competent children they do have a part to play—provided they are truthful, accessible, explain the implications of using a service in a particular way and are written in language that the service’s intended users—or their parents or guardians—can understand.

Privacy notices can be particularly useful when combined with relevant privacy choices. This might be the case where a user cannot navigate through a website unless he or she actively agrees—or declines to agree—to a particular use or disclosure of his or her personal data. However, as we have said, this only takes us so far, and it is unrealistic to expect younger children to make meaningful choices here—although obviously parents or guardians could do this for their charges. This is also why it is extremely important that services aimed at—or available to—children are offered with privacy-friendly default settings. This would mean, for example, that service users have to make an active decision to share information about themselves, rather than the sharing taking place unless it is “turned off”. The ICO very much favours a “privacy by default” approach, particularly where children’s data is concerned.

Where services are aimed at children, perhaps more important than the privacy notice is the underlying fairness, safety and transparency of the service being offered—regarding matters such as the disclosure of personal data to third parties, or its use for marketing purposes. We certainly encourage the development of industry best-practice in the area of online services aimed at children, so that there is no confusion—or any unpleasant surprises—when a child uses one of these services. Clearly, marketing that takes place on sites aimed at children should be age-appropriate.

3. Age Verification

One approach to the protection of children online is to prohibit certain organisations from offering certain types of service to children below a certain age. (This is the approach proposed in the European Commission’s data protection reform package.) We can see the appeal of the approach in terms of its simplicity—social networking sites should not be allowed to offer their services to children under 13, for example. However, this approach is fraught with practical difficulty and does not tally well with the UK’s legal system which generally relies on a child’s competence, rather than just his or her age, in order to assess whether he or she can make a decision about something. (That said, age-restricted goods are denied to individuals purely on the basis of their age, although we know from “bricks and mortar” contexts how relatively easy it is for individuals under 18 to obtain alcohol, for example.)

However, it is clear that children of the same age have different levels of understanding, and that even quite young children might be able to make informed choices over relatively straightforward propositions. We think that too rigid an age-based approach could unfairly restrict children’s take-up of online services and could deny them the informational learning experiences that will stand them in good stead in later life.

We also recognise the perhaps insurmountable difficulty of introducing a reliable online age verification system—or a robust system for obtaining authentic parental consent. We are concerned that the introduction of such systems could result in service providers collecting large amounts of “hard” personal information about children and their parents—information they would not ordinarily collect.

In terms of age verification, we know that it is usually very easy to register as a user on a site based on a fake identity. A 12 year old can pretend to be 21 and set up a profile on most social networking sites. As a result, children have been able to find their way onto sites that were intended for adults and vice versa.

At first glance, governments, child protection agencies and parents may be reassured by the promise that a single technology will dramatically reduce their child’s risk online and protect them from inappropriate sites and unsavoury characters on the Internet. However, the reality is that age verification is very difficult because no nationally available verification infrastructure exists in the UK. If parents and policy makers focus solely on social networking sites, or on any single technological solution, they will miss the mark on internet safety.

There is no quick fix or silver-bullet solution to child safety concerns around age verification, especially in an ever-changing digital landscape. Tools are available for stakeholders to promote and encourage better, safer online collaboration and communication for children, but reliance on a single technology or approach has limitations and may be unworkable. We suggest an alternative strategy: a layered approach deployed alongside educational advice, appropriate parental responsibility and supervision, mentoring children in using the Internet safely, ISP involvement, industry protocols, website developers embedding age verification and, most importantly, using the full range of existing legislation—including data protection law—to protect children effectively. (We have not majored here on the undoubted importance of parental responsibility and supervision. However, we recognise that this is becoming more difficult to provide as we move rapidly away from the shared family PC to the personal portable device.)

The Internet offers countless benefits and learning opportunities for children. It lays the foundation for communication opportunities never before experienced in human history. Because of these benefits it is in society’s best interest to help as many children as possible get online. However, we must not focus on age verification as a potential silver bullet, but rather on a layered approach that allows children to become ethical, resilient citizens who contribute to society in a positive manner.

Chapter 2 of the ICO’s “Personal Information On-line” code of practice contains more information about the ICO’s approach to children’s personal data. It is available here:

It is worth noting that the European Commission’s proposal for a new data protection regulation has several provisions intended to protect children’s personal data. In the original text a child was defined as any person below the age of 18 years. However this definition has been removed from the latest draft.

4. What can the ICO do to protect children?

ICO’s activities fall into two basic types—enforcement and education.

We will not go into all the aspects of our enforcement activity here. However, we do have powers that can require organisations processing personal data about children to change their practices in order to comply with the requirements of data protection law. We can also serve a civil monetary penalty of up to £500,000 for a breach. We deal with relatively few enforcement cases relating to children’s data—although a growing problem seems to be “scam” text messages being sent to individuals—including children—where the recipient is invited to call a particular premium-rate number in response. Dealing with this type of problem is a major element of our enforcement activity. ICO has been very active here—we submitted evidence and appeared recently before your Committee to discuss the problem of spam messages. Please follow this link for more information about the ICO’s work in this area:

We would urge parents, teachers—and of course children themselves—to bring it to our attention when they believe that information about children is being collected, used or shared inappropriately, or where there is evidence that it is not being kept sufficiently secure, for example. We are confident that well-targeted enforcement activity could help to encourage good practice across the board and to marginalise companies that do not handle children’s data with sufficient respect.

In terms of the education of children in respect of their information rights, the ICO has been extremely active over the years and plans further activity in the future.

In January 2013 the ICO commissioned a set of education materials covering information rights that were intended to give teachers a creative and engaging set of lesson plans for use in the classroom. The objective has been to raise some key issues with children and young people, so that from a young age they can become aware of their information rights, understand the potential threats to their privacy, and know how to protect themselves on the internet and elsewhere. This is still at an early stage but we are pleased with the initial results.

After pilot lessons at several schools throughout the UK in March and May 2013, nine lesson plan outlines—five for primary schools, four for secondary—have been fully developed and are available from the ICO’s website. Since 18 August 2013, over 3,800 of these education packs have been downloaded. Pre- and post-lesson evaluation among Key Stage 4 pupils attending High Storrs School in Sheffield showed that awareness of how to protect personal information increased from 59% to 75% following the lessons.

When appointing the supplier to produce the materials, the ICO insisted that they were designed and developed by teachers still working in schools today and, crucially, were tailored to specific areas of the national curriculum.

The lesson plans explore what is meant by personal data, what information can be shared and which information children should keep safe. The activities help teachers introduce children to their rights and responsibilities relating to data protection and freedom of information, and explain where to go for help if they have a concern. The materials allow teachers flexibility in their approach and encourage active learning, with many opportunities for students to discuss and question.

There are five lesson plans for use in Primary Schools. They are:

Lesson Plan 1 (LP1)—What is personal information?

Learning objective: Children will understand what is meant by “personal information”.

LP2—Who needs to know and why?

Learning objective: Children will consider the personal information that it might be appropriate to provide to a given person.

LP3—Opting out

Learning objective: Children will become aware of how to “opt out” from particular uses, sharing etc. of their information. They will understand that information they provide could be used for a variety of purposes.

LP4—How to react to unsolicited messages

Learning objective: Children will consider how they should react to unsolicited emails and text messages that request personal information.

LP5—Rights and Regulations

Learning objective: Children will understand that laws and regulations exist to protect us and to give us the right to access appropriate information.

There are four lesson plans for use in Secondary Schools. They are:

LP1—Strictly Private—what is personal data?

Learning objective: By the end of the session, students will be able to:

explain what they mean by the term “personal data”;

discuss levels of privacy appropriate to a range of personal data; and

explain the role of the ICO.

LP2—Private versus public

Learning objective: By the end of the session students will:

understand how they may unintentionally disclose personal data;

define the kinds of personal data that should be kept secure;

list some of the personal data likely to be held about them and the organisations likely to hold it; and

describe their rights in relation to how organisations should store and use their data.

LP3—Is there something I should know? Exercising our rights

Learning objective: By the end of the session students will:

understand the main principles applying to how organisations should store and make use of personal data;

understand that they have choices over how they control their personal information;

understand how to check that data held is accurate; and

know how to have inaccurate data corrected.

LP4—No secrets—Freedom of Information

Learning objective: By the end of the session students will:

understand their right to know under the Freedom of Information Act;

understand how to make a freedom of information (FOI) request; and

appreciate the impact some past FOI requests have had on individuals and society.

We are very proud of our achievements thus far in terms of educating children about their information rights and informational safety. We are convinced that the education of children about these issues is crucial in terms of giving them the informational self-defence they will need as children and of course as adults. It is all too easy for children just to provide their details when asked to do so online, or to subscribe to a service because they want the “goodies” without thinking through the privacy implications of their actions. However, we are confident that our educational outreach work will help to foster the critical, questioning attitudes that children’s online safety depends on to such a great extent.

We thank the Committee for the opportunity to provide our views on this extremely important topic. We will of course be happy to provide any further information it requires from us. The point we would like to reiterate is that the issue of children’s safety online is a complex and multi-faceted one. There is no single solution, and the best we can do is to make sure that the various organisations—including the ICO—that have a responsibility in this area prioritise the protection of children and co-ordinate their efforts to reduce the risks and enable the benefits of the online world for young people.

September 2013

Prepared 18th March 2014