Digital Economy Bill

Written evidence submitted by the Children’s Media Foundation
(DEB 42)

Digital Economy Bill 2016

Summary

1. The Children’s Media Foundation recognises the primary objectives of the Digital Economy Bill: to protect children from pornography and from the inappropriate use of their data. However, we feel there are many other concerns around children’s easy access to the internet that also need to be considered, and that additional interventions are necessary in certain scenarios. In short, we feel that the Bill does not go far enough to create real-world safeguards to protect children online.

2. We understand that it is also important to ensure that future innovation is not inadvertently stifled. However, platforms, distributors and content makers must no longer be able to absolve themselves of specific legal responsibilities by hiding behind a soft ‘age limit’, by deferring to parental responsibility, or by stating that they are merely ‘the pipes’. We believe such age limits are often just a legal backstop: most platforms knowingly allow children to continue to use their ‘not-for-children’ services unchecked, and many operate double standards by actively courting children’s brands or celebrating under-age users despite their stated Terms of Service.

3. Furthermore, while improved digital literacy amongst parents is certainly part of the solution, the ubiquitous nature of today’s internet services means that it is now impossible for parents to oversee every digital action taken by their children. So, as a society, we need to take extra measures.

4. As a general principle, the Children’s Media Foundation believes that platform owners should have an obligation to make their services child-safe by design, and then offer adults an option to opt out once they prove their age, rather than the other way round. While many platform owners will argue this is not possible, we believe this is only because it has never been a priority in this fast-growing industry, where the objective has been to ‘on-board’ new customers in the most efficient, friction-free way possible in order to maximise revenues as fast as possible. If just a fraction of the internal research and development resources spent optimising features such as ad-targeting were used to protect minors, we would already be in a much stronger place.

5. The CMF therefore believes that the Digital Economy Bill must include:

I. Clear rules about what age verification is required for non-children’s content, with the onus on platforms to demonstrate that users are the age they claim to be.

II. An insistence that companies do not mine children’s data, nor target or manipulate children based on their online activity, particularly for marketing and advertising.

CONTEXT

6. The Children’s Media Foundation (CMF) is a not-for-profit organisation dedicated to ensuring UK children have the best possible media choices, on all platforms and at all ages. We bring together academic research institutions, the children’s media industries, regulators, politicians and concerned individuals who recognise that media is not only a powerful force in children’s lives, but a valuable one.

7. This submission has been drafted by our non-executive advisory team, who comprise industry leaders from the children’s digital sector, former BBC executives and representatives from the tech start-up community.

8. At the outset, we think it is important to emphasise that ‘children’ are not a single group and that their age and stage of development affects their relationship with digital media. In broad terms:

9. Access by very young children is MEDIATED. Essentially, parents choose a channel and the child watches. This is the main model for pre-schoolers. The developmental stage of this age group tends to mean that their use of digital technology is focused on specific characters and brands through apps and games. Social media is not widely used, although young children often use YouTube to watch videos. VOD platforms such as Amazon and Netflix are important too.

10. As a child gets older and moves into school age (6-12), traditionally their access is MANAGED: a social contract exists between children, parents and wider society to try to protect them from unsuitable content. The television 9pm watershed is an example. Digital is changing this: the prevalence of mobile devices means children become increasingly autonomous in their media consumption. While most six-year-olds will focus on apps, games and YouTube, as they get older they use social platforms such as WhatsApp and Instagram and routinely share media and content.

11. By the time a child reaches their teenage years, it is natural for them to start experimenting and exploring. Most parents will try to MONITOR their child’s use, but this is increasingly difficult as platforms evolve and children constantly search for the next thing to play with and use.

12. The CMF therefore believes that the UK needs:

I. Clear rules about what age verification is required for non-children’s content, with the onus on platforms to demonstrate that users are the age they claim to be.

II. A commitment not to mine children’s data, nor to target or manipulate children based on their online activity, particularly for marketing and advertising.

AGE VERIFICATION

13. The CMF welcomes the proposals in the Bill that will make it harder for children to access pornography. However, we do not consider that the Bill goes far enough, and would argue that verification requirements in the digital world should be more closely aligned with those in traditional media and the real world.

14. Traditionally there has been an implicit social contract between children, parents and content providers (publishers, television broadcasters, cinema distributors etc.) that recognises three categories of material: content and services that children can safely be left to consume alone (children’s channels etc.); content and services made for the general public that children might enjoy but which are not specifically tailored for them; and content and services that are specifically for adults (gambling, pornography etc.), usually recognised in law.

15. The CMF believes these categories are just as relevant in the digital world and supports provisions within the Bill to restrict children’s unauthorised access to this last category.

16. However, we feel the Bill does not go far enough to recreate the real-world safeguards around this second category: content not made specifically for children but which they may still enjoy. We have to acknowledge that the digital world is much more porous than old media (which had some built-in restrictions, such as limited spectrum). The rapid pace of recent digital developments means that many scenarios we could never previously have imagined are now possible, and we need to adapt our cultural expectations and regulatory interventions to suit. This means we have to consider taste, decency and appropriateness in a much broader context than just pornography.

17. The market dominance of a few digital platforms is shaping society’s perspective and unfortunately obscures the question of whether enough is being done to protect children.

18. All of the main social media platforms require users to be over 13, but very few actively police this. Many digital platforms have an ambivalent attitude as to whether or not they support children’s access. Spotify, for instance, promotes a ‘Family Subscription’, implicitly inviting parents to add their children and therefore building parental consent in from the start. However, after payment is taken, the only way to register a child is to ensure Spotify ‘thinks’ the child is over 13 – potentially encouraging an adult to become complicit in lying about a child’s age.

19. YouTube is now the ubiquitous video distribution platform, especially for children. While under-13s cannot create an account, the platform works with broadcasters to carry and promote huge volumes of content for children. However, search YouTube for "Lindsey Russell", a Blue Peter presenter and a great female role model, and the second clip is tagged ‘Leather Mini Skirt and Black Tights’, with denigrating comments and expletives about Lindsey and her appearance. Search for the cartoon character Shrek and not far away is a series of animated videos featuring this children’s character simulating various sex acts.

20. This content is clearly unsuitable for children, and yet it is easily available and seemingly outside the scope of the proposed Bill. True, there are filters on YouTube, and many inappropriate videos are not available in ‘Restricted’ mode. But by default this mode is switched off. Furthermore, filters do not apply if a user is not signed in. Children rarely sign in!

21. To date, the main focus of industry efforts to safeguard children has been on providing better information to parents. This is partly because of a lack of consensus about how to address the issues, but also because of lobbying from the main industry players, who argue that they merely provide the ‘pipes’ for content providers and are therefore not responsible for any transgressions.

22. In our opinion, this approach is insufficient. The CMF actively supports and collaborates in research into children’s media consumption and media literacy. We know that the platforms used by children vary according to their age and developmental stage. Factors affecting the popularity of services can include immediacy of content, social engagement, cost and novelty.

23. The CMF maintains that there needs to be a standard that ensures consistent best practice and expectations across the industry. However, there is no real motivation within the industry to tackle this problem, so the standard needs to be backed by a regulatory framework. We acknowledge that there are technical challenges to more widespread age verification. However, just because something is hard does not mean that it is not right.

24. We would also argue that any online platform that benefits from a sizeable children’s audience (a potential threshold could be 1% of the under-13 audience: approximately 91,000 junior users per annum), or whose own user base comprises over 5% under-13s, should be legally obliged to have a clearly published children’s policy stating the safety provisions it has in place.


DATA

25. We note that the Bill makes reference to data in the context of research. We also recognise that detailed legislation concerning data is contained in the Data Protection Act 1998.

26. While we welcome the intention of the new legislation in respect of adults, the framework of the regulation leaves too much uncertainty about when children become responsible for their own data, and it does not lay down clear guidance on when and how children’s data can be collected.

27. Currently a parent has the responsibility for protecting the data of their children and can share their children’s data as they wish. In an environment where few adults understand the implications of sharing their own data, let alone their children’s, this poses a number of issues.

28. Parents may be sharing data that their children do not want to be shared. As a child gets older, this could have implications for their personal and professional lives.

29. As Ofcom’s research frequently illustrates, most children are more technically aware than their parents. This means that parents often delegate the responsibility for safeguarding data back to their children, who in turn will not understand the implications.

30. Parents and carers clearly have a role to play in protecting children online. However, the CMF maintains that parents cannot be expected to do this alone. Digital platforms and content providers must assume some responsibility too.

31. The CMF would like to see a number of measures incorporated into the Bill in order to safeguard children:

I. Any site or service that has the potential to inadvertently collect children’s data should offer the Right To Be Forgotten, with an expectation that platform owners make it easy and quick to enact. Furthermore, its implementation should be clearly evidenced within a prescribed timeframe.

II. In addition, we would argue that a Children’s Policy should also address data and behavioural engineering. There needs to be a commitment not to mine children’s data, nor to target or manipulate children based on their online activity, particularly for marketing and advertising. This could include clearer demarcation of adverts in search results, tighter regulation of automated links that lead children out of their safe havens, and rules against behavioural mechanics that try to draw children into addictive behaviours or use exhortation.

GOVERNANCE

32. The CMF welcomes the proposed role for Ofcom in policing many aspects of the Digital Economy Bill. However, we do have concerns about the resources and capability of Ofcom in some areas.

33. The most common regulatory framework in the digital space is the United States’ Children’s Online Privacy Protection Act (COPPA). In the CMF’s opinion, while this is the best regulatory framework available, it was designed for American rather than British children and is not flexible enough to keep up with the changing landscape. For instance, COPPA guidance suggests that the favoured route to obtaining parental consent is by fax! It is simply no longer fit for purpose.

34. Furthermore, COPPA allows the predominantly US platforms to sidestep any societal responsibility for protecting children. The platforms claim they are merely the pipes for delivering content, with no responsibility for the content itself. They can therefore do the minimum to stay within national rules.

35. While there is a European Union directive in progress, it is primarily designed to address content plurality and to reflect indigenous culture.

36. In the United Kingdom, the Information Commissioner’s Office (ICO) is responsible for policing best practice on data protection and children. However, the ICO is essentially a reactive organisation: potentially unsafe practices are unlikely to be addressed until a problem arises.

37. The CMF considers that there are currently three issues around regulation:

I. Many major digital businesses popular with children fall outside of UK jurisdiction.

II. The regulations that we are forced to rely on to safeguard British children have not been designed with the needs of British children in mind. While we would expect some European countries, such as France, to legislate strictly, the UK’s approach is to let the market self-regulate. So far this has not been successful, and we have no reason to believe that the situation will improve in future.

III. The wheels of technology move at a much faster rate than the cogs of the legal system. Legislation needs to be flexible enough to accommodate new challenges, and the industry needs to interpret and honour the intention of guidance as well as its specifics.

38. When services are developed or launched, we would like children to be considered by default. It is much easier to create a safe environment for children and then unshackle it for adults than to react retrospectively to make an existing service child-friendly.

39. These principles should be applied to any service where children can:

i. Access inappropriate content, intentionally or inadvertently;

ii. View adverts that have not been pre-vetted as suitable for their age group;

iii. Be encouraged directly to purchase additional products, services or merchandise;

iv. Access unvetted comments and feedback from other users (that often carry derogatory or offensive remarks);

v. Make direct contact with other users not known to them personally;

vi. Be approached by other users without parental approval or supervision;

vii. Share personal information that can reveal their real identities;

viii. Surrender behavioural data that can be used for tracking, marketing or retargeting, without the explicit informed consent of their parent or guardian.

40. UK regulators need to have ‘teeth’ to ensure that regulation can be enforced.

41. However, it is also important to ensure that future innovation is not inadvertently stifled. Platforms, distributors and content makers need to take a clear and accessible position regarding the provision of services for children, including explicit information about how data is collected and used, and how targeted advertising is applied. This could mean three levels:

I. Content is appropriate for children (the default);

II. Content is not appropriate for children;

III. Content is appropriate for children with parental consent.

Conclusion

42. In conclusion, the CMF draws an analogy with a corner shop selling alcohol or sexually explicit magazines to minors. In such cases the onus is still on the shopkeeper to prove that they took all adequate measures to prevent children’s access to that material. Similarly, as a society, we put an obligation on shop owners not to sell kitchen knives to underage shoppers. The same could be true of the internet. It is not good enough simply to have a sign on the door saying you must be of legal age, or have parental consent, to enter, and then apply no further checks. Nor would we accept a sign in the window whose terms and conditions state that, by entering the store, you self-declare that you are an adult.

43. The CMF therefore calls on the Digital Economy Bill to ensure that all platform owners who regularly attract a significant proportion of minors have a legal obligation to put in place measures to protect those minors AND to demonstrate that these measures are effective, understood and adhered to by that younger audience.

October 2016

 

Prepared 18th October 2016