Regulating in a digital world

Chapter 2: Principles for regulation

A principles-based approach

27. The rapid pace of technological development requires a principles-based approach that sets out standards and expectations of service providers. Many witnesses advocated legislation that is ‘technology neutral’—that is, legislation which targets specific types of behaviour regardless of the medium.32 The Children’s Media Foundation thought that “Legislation needs to be flexible to accommodate new challenges”33 and that “the industry needs to interpret the intention of guidance as well as the specifics”. To this end, a principles-based approach to regulation could help to improve the effectiveness of self- and co-regulation and inform and shape specific rules.

28. Principles can be applied to regulation in two ways. First, legislation can require that principles, expressed in a relatively general way, must be complied with. This form of principles-based regulation is often contrasted with rules-based regulation: principles-based regulation focuses on outcomes, whereas rules-based regulation prescribes the format compliance must take. The data protection principles set out in the GDPR are an example of this form of principles-based regulation. Elizabeth Denham, the Information Commissioner, explained:

“Principles-based regulation works for an area of law that is fast changing and fast moving. [It] allows for more detail to be developed through guidelines, codes of practice and certification that flow from the principles.”34

29. Ms Denham acknowledged that there were drawbacks to this approach: many commercial entities prefer the legal certainty of a rules-based system; however, she found such an approach to be “rigid” and “not future-focused”.

30. Secondly, principles can be used to inform the development of regulation. Witnesses stressed the importance of legislation being aimed at specific ‘sectors’ of the internet35 and enforced by the different regulators with expertise in their own areas.36 A principles-based approach can help to establish a common understanding for addressing issues which cut across sectors and can provide a common framework for regulators, executive bodies, policy-makers and lawmakers to work within to develop effective regulation.

31. The Government has used principles to inform work on its Digital Charter.37 It argues that its principles are “mutually supportive”, allowing for “a free and open internet while keeping people safe online”. While we support these so far as they go, we believe that they are insufficient. In this chapter we identify 10 principles which have emerged from our evidence and which should underpin regulation of the digital world:

• parity;
• accountability;
• transparency;
• openness;
• privacy;
• ethical design;
• recognition of childhood;
• respect for human rights and equality;
• education and awareness-raising;
• democratic accountability, proportionality and evidence-based approach.

32. No form of regulation will be effective unless it is enforced. Enforcement mechanisms must have sufficient resources and be rigorously applied.

Parity

33. We define the ‘principle of parity’ to mean that regulation should seek to achieve equivalent outcomes online and offline.

34. McEvedys Solicitors & Attorneys wrote: “Good laws are technology and actor neutral and focus on behaviours and not actors, so the first question should remain what happens offline?”38 None of our witnesses disputed the principle that what is illegal offline should also be illegal online, though some felt that it had not always proved helpful in addressing policy issues.39

35. The London Internet Exchange (LINX), a membership association for network operators, warned that too often those who promote the principle exclusively want “restrictions and prohibitions” to be enforced online by private companies, with no corresponding eagerness to ensure the administration of justice which balances competing interests in the independent court system offline.40

36. Myles Jackman, Legal Director of the Open Rights Group, told us that the underlying principles of regulation should apply both online and offline, but cautioned that care was needed to understand how technology will shape their implementation: “It is equally wrong to demand that something that works offline works exactly the same online—because it will not—as it is to say that the online world should create completely new rules.”41

37. Recent developments on age verification provide an example of an attempt to transpose child protection rules into the digital environment. In the offline environment it would be illegal for a shopkeeper to supply a pornographic film to a child; this is regulated both by the classification framework operated by the British Board of Film Classification (BBFC) and by the Video Recordings Act 1984. In the online environment, where the supplier of adult film content does not have face-to-face contact with the consumer and may not be directly subject to the UK regulatory framework, children are able to access material they could not normally access offline. The Digital Economy Act 2017 requires commercial online pornography providers to check the age of users. These provisions will not be implemented until spring 2019, and gaps will persist. For example, social media companies will not immediately be within the scope of the most robust age verification standards.42 The parity principle would bring them into scope.

Accountability

38. Accountability means that there are processes in place to ensure that individuals and organisations are held to account for their actions and policies. Such processes should involve establishing clear expectations and ensuring compliance with rules. If individuals or organisations are found not to have complied, they should be subject to sanctions or required to make amends. This principle applies to all organisations, including third sector bodies, businesses, and public and regulatory bodies, as well as to users.

39. There was widespread concern among our witnesses about the lack of accountability in the online environment. Many called for an ‘enforcement approach’, pointing out that the problem online is often not a lack of law or regulation but under-enforcement. Microsoft, for example, argued that “the challenges posed by the internet typically require enforcement of existing laws and regulations” rather than new legislation.43

40. Too often internet companies have been allowed “to mark their own homework” and can fail to uphold even the standards they themselves set in codes of practice.44 Doteveryone told us that their research into public attitudes had found that people “feel disempowered in the face of technologies and have a strong appetite for greater accountability from technology companies and government”.45 This imbalance suggests that independent oversight is required.

41. The Northumbria Internet & Society Research Interest Group suggested that users should also be made responsible for following rules, but added: “Long, unfair, and opaque privacy policies and usage guidelines are not a good way to achieve this.”46

42. Given the power imbalances between users and tech companies, accountability mechanisms need to be quick, accessible and easy to use. Professor Lilian Edwards noted the value of a “low cost or free [alternative dispute resolution] system for users, of the sort companies like eBay have provided in the past”, though she also remarked on the need for public oversight or audit.47 The evidence suggests that all parties, including internet platforms, regulators and governments, are failing to ensure access to redress.

Transparency

43. Transparency is key to ensuring accountability. It also has a role in enabling policy-makers to see how the online environment is functioning to identify problems, in promoting a common understanding of rules, and in enabling users to understand how their rights are affected. Transparency is particularly important online because of the imbalance of power between platforms and their users and because of the significant role platforms play in managing communications between individuals.

44. The issue of transparency has grown in significance because of the adoption of automated decision-making systems in both the online and offline environments. For example, with a large volume of content moderation decisions now being fully or partly automated, there is a risk that decision-making takes place within what Professor Frank Pasquale calls ‘the black box’: a system whose workings are mysterious, in which only inputs and outputs can be observed, but not the process in between.48 Clare Sumner of the BBC said: “Everything around algorithms needs to be more transparent and people need to be more honest about whether they are using algorithms and what they are doing.”49

45. This issue was raised in evidence on a number of occasions. Professor Lilian Edwards noted:

“More transparency, as recently seen in the form of the publication of [Facebook’s] content moderation rules and YouTube’s take down “flags” is helpful and emerging driven by recent [public relations] scandals … But it is still unclear what action could be taken if the processes revealed seemed socially unacceptable either by governments or users, bar long and precarious challenges on human rights grounds.”50

46. Very often it is not helpful to disclose a large volume of technical information: doing so can in fact reduce transparency, as pertinent information is obscured. In such cases what is really needed is a clear explanation. Absolute transparency may also impinge on legitimate business interests. Subforum, a tech developer, noted that platforms were opaque because “transparent systems are easier to manipulate”.51 Recent scandals over data misuse, and reported concerns about the policies applied by social media and other content moderation platforms (extending even to concerns raised in evidence by McEvedys about the highly respected system for regulating child exploitation content), point to a “transparency gap”.52 It may be necessary to have different levels of transparency for different purposes. For example, the Information Commissioner’s Office suggested that “Informing the users at a non-technical level must be paired with a deeper requirement to explain and account to the regulator.”53

Openness

47. Openness has been a fundamental attribute of the internet since its inception. Professor John Naughton, Senior Research Fellow at the University of Cambridge, explained that the internet was designed with two fundamental axioms: “One was that there should be no central ownership or control of what they designed; the second was that they should design a network that was not optimised for anything they knew about at the time”.54 This has enabled creativity and “permissionless innovation”.

48. Openness could be interpreted as a “carte-blanche for ‘anything goes’”.55 Some innovation has been harmful. Jenny Afia, a partner at Schillings, told us that her biggest concern was that “children’s best interests have been ignored probably because of the utopian vision that all internet users would be treated equally”.56 Openness therefore needs to be balanced against other principles, particularly ethical design and recognition of childhood, which are discussed below.

49. Others, such as Google, argue that the internet has enabled “the free flow of information online and given consumers, citizens, institutions and businesses more choice, power and opportunity”.57 As the internet plays a greater role in private and public life, human rights, including the rights of freedom of expression and freedom of information, need to be protected online.58 One aspect of this is net neutrality: “the principle that internet service providers should enable access to all content and applications regardless of the source, and without favouring or blocking particular products or websites”.59 While net neutrality is traditionally associated with the infrastructure of the internet, analogous principles apply to certain internet services that run on top of the infrastructure level. Some witnesses expressed concern that the significant power of a small number of global companies is limiting choice and innovation, confining users within “walled gardens” and in so doing threatening the openness of the internet.60 We consider this further in chapter 4.

Privacy

50. Privacy and data protection are already the subject of a significant body of law, enforced by the Information Commissioner’s Office. However, there is still much to be achieved in bringing about meaningful control of data privacy and data protection. The Northumbria Internet & Society Research Interest Group argued that “the recent issues with Facebook and Cambridge Analytica suggest there is scope for greater regulation of the use of individuals’ personal data”.61

51. Our evidence showed that there is a gap between what the data protection framework provides and what users expect. The Information Commissioner’s Office noted that despite the strength of the GDPR and related domestic legislation, “There is growing consumer unease about how online platforms are using personal data and potentially limiting consumer choice”. It concluded: “it is fair to say that some aspects of the law have not kept pace with the rapid development of the internet”.62 As technological development increasingly results in connected homes, cars and cities, the balance between convenience and privacy will require debate and must be reflected in clear standards.

Ethical design

52. Many problems associated with the digital world originate in the way in which services are designed. Some internet technology is deliberately designed to take advantage of psychological insights to manipulate user behaviour. Laurie Laybourn-Langton, Senior Research Fellow at the Institute for Public Policy Research, told us how technology had been used to learn more about user behaviour with a view to manipulating it.63 He argued that there would have been a public backlash if the Government had undertaken similar research. This demonstrated a divergence between “the norms we have established in certain areas of society and those in this sector”.

53. Ethical standards, such as safety and privacy, should be incorporated into the design of technology and delivered by default. Such standards should also ensure that individuals are not manipulated but are free to use the internet purposefully. Users should be treated on the basis of fair, transparent and consistent rules. Technology should act in the interests of users and the public. In particular, personal data should be used fairly. We consider this principle further in the next chapter.

Recognition of childhood

54. One third of internet users are under 18. In our report Growing up with the internet, we found that children are particularly vulnerable to online harms and that, although they are often early adopters of new technology, their welfare is given little consideration by tech entrepreneurs.64 We argued that this should change to make the internet work better for children.

55. Consideration of children should not focus solely on protection. It is also necessary to consider how the internet can meet their needs and be accessible to them. Any principles-based approach to regulation must recognise children’s rights, their legal status and the concept of childhood.

Respect for human rights and equality

56. The internet has become so ingrained in how individuals live that restricting internet access or usage threatens their ability to participate in essential personal, social, business and political activities. In particular, some witnesses stressed that the internet has become integral to participating in democratic life. It is therefore essential that regulation in the digital world respects human rights and equality rights. The Government told us that it was “firmly committed” to protecting these rights online: “These are essential qualities of any functioning democracy and promoting these values is a key UK priority both at home and overseas. Any interference with these rights must be consistent with the principles of legality, necessity and proportionality.”65

57. Dr Emily Laidlaw argued that the potential of the internet to promote and facilitate democratic activities was dependent on privately-owned companies which she called ‘Internet Information Gatekeepers’ (IIGs). She explained that this referred to: “a gatekeeper which facilitates or hinders deliberation and participation in the forms of meaning making in democratic culture. Every time we use the internet we engage with IIGs. In order to find information, we use search engines. In order to sort through the clutter on the internet, we use portals. In order just to access the Internet, we need to use Internet service providers (ISP).”66 The regulation and self-regulation of these gatekeepers must therefore take into account relevant human rights and equality legislation in the interests of users.

58. The Information Law and Policy Centre, Institute for Advanced Legal Studies suggested that the application of European Convention on Human Rights case law would help to avoid disproportionate censorship online.67 Mark Stephens, a partner at Howard Kennedy, drew the committee’s attention to the UN Guiding Principles on Business and Human Rights68 (‘Ruggie Principles’), which were designed to be used for businesses carrying out activities which affect human rights and could inform further internet regulation.69 Any such regulation must observe due process, as outlined in Article 6 of the ECHR, both for gatekeepers being regulated and for users seeking redress.

59. Consideration should also be given to protected characteristics, as set out in the Equality Act 2010. The internet can empower people from all backgrounds, providing a platform for those not heard elsewhere and a means of connecting with others. However, with these benefits come risks. Several witnesses discussed online abuse and harassment directed against specific groups according to gender, sexuality, race or religion. Addressing this can be challenging. The British Computer Society noted that removing racist content can take longer than removing content such as nudity, which is easier to categorise.70 Michael Veale, a researcher at University College London, described how automated content moderation systems can discriminate against ethnic minorities through a failure to understand non-mainstream uses of language.71

60. Margot James MP, Minister for Digital and the Creative Industries, was concerned that 20% of people with a registered disability have never been online. We share the Government’s desire that the benefits of technology should “be shared across society, not for certain groups to benefit while other groups fall behind.”72 This includes the need to address the inequality of experience among those who do use the internet. The UK Safer Internet Centre and Global Partners Digital both raised the difficulties people with low digital literacy or disabilities can face, for example in accessing redress mechanisms and understanding terms and conditions.73 Which?, a consumer group, also reported that vulnerable adults can feel anxious about being ‘micro-targeted’ and about possible harms resulting from the use of sensitive data.74

Education and awareness-raising

61. In our report Growing up with the internet we recommended that “digital literacy should be the fourth pillar of a child’s education alongside reading, writing and mathematics”. Digital literacy refers to “the skills to use, create and critique digital technologies” and the knowledge “to critically understand the structures and syntax of the digital world, and to be confident in managing new social norms”.75 The Children’s Media Foundation found that digital literacy remains “poor in many audience groups—including children and parents.”76 Dr Paul Bernal of the University of East Anglia agreed that levels of understanding were low but noted that the internet would probably always be “a messy and sometimes confusing place”.77 He advocated that children should “become ‘savvy’ and encouraged to be sensible, rather than our suggesting that we can make the environment fundamentally safe”. However, the 5Rights Foundation argues that it is wrong to ask children to “be resilient to a system that does not respect or protect their safety and security”.78

62. Parents play an important role in mediating children’s use of the internet. However, many parents lack the knowledge or confidence to do so effectively. The Government could do more to rationalise guidance to make it clearer and more easily accessible. Some of the largest companies support Internet Matters, a website of resources to help keep children safe online.79 Tech companies which provide online services should take responsibility for providing educational tools and raising awareness, including awareness of how their services work and of the potential harms of using them. However, advice should not be limited to parents and children. Users of all ages can benefit from being better informed. The Northumbria Internet & Society Research Interest Group argued that: “Education and advice should become integrated as part of the online user experience.”80

63. Many tech companies argued that the response to online harms should focus on improving digital literacy. But digital literacy cannot be the only solution to problems associated with the internet.81 The most vulnerable people in society are particularly susceptible to online harms, yet they are the least likely to develop digital literacy.

Democratic accountability, proportionality and evidence-based approach

64. A report from Communications Chambers identified the risk of ‘regulation by outrage’, where, in the absence of an effective regulatory framework, “outrage, campaigning and lobbying”, intensified by media coverage, have stimulated ad hoc responses to online harms.82 It is unclear how effective these responses are, and they leave “consumers none the wiser about the true risks of online content nor what they have a right to expect from intermediaries”. A more strategic approach is therefore necessary.

65. Many witnesses warned about the risk of unintended consequences when introducing regulation, which might stifle competition, freedom of expression and freedom of information. Dr Paul Bernal advised that regulation needed to be “monitored very closely if a decision is made to regulate. Where regulation is not working or being counterproductive, it needs to be reversed.”83 Regulatory action should therefore be based on evidence. However, in some cases it can take a long time for harm to become apparent, by which stage it is too late to react. In cases of high risk it may be appropriate to act to prevent harm before the evidence is conclusive.

66. On the other hand, witnesses criticised the current model of self-regulation, which encourages platforms to police online harms. Doteveryone said that this lacks “democratic legitimacy as there is little opportunity for the public, civil society and government to have their say on what constitutes a ‘harm’, and where the damage caused by it outweighs the right to freedom of expression.” In the final chapter of this report we consider how future regulatory responses should be developed.

Conclusion

67. The 10 principles set out in this report should guide the development and implementation of regulation online and be used to set expectations of digital services. These principles will help the industry, regulators, the Government and users work towards a common goal of making the internet a better, more respectful environment which is beneficial to all. They will help ensure that rights are protected online just as they are offline. If rights are infringed, those responsible should be held accountable in a fair and transparent way. With these principles the internet would remain open to innovation and creativity while a new culture of ethical behaviour would be embedded into the design of services.


32 Written evidence from McEvedys Solicitors & Attorneys (IRN0065)

33 Written evidence from CMF (IRN0033)

35 Written evidence from Airbnb (IRN0091). Airbnb lists e-commerce, media, search engines, communications, payment systems, labour provision, operating systems, transport, advertising, distribution of cultural content and social networks.

36 See appendix 4.

37 These are: the internet should be free, open and accessible; people should understand the rules that apply to them when they are online; personal data should be respected and used appropriately; protections should be in place to help keep people safe online, especially children; the same rights that people have offline must be protected online; and the social and economic benefits brought by new technologies should be fairly shared.

38 Written evidence from McEvedys Solicitors & Attorneys (IRN0065)

39 Written evidence from Microsoft UK (IRN0085)

40 Written evidence from LINX (IRN0055)

41 Q 21. See also written evidence from British and Irish Legal Education Technology Association (BILETA) (IRN0029).

42 They may be classed as ‘ancillary service providers’, which would allow the BBFC to publicise their failure to comply with regulations but not to impose financial penalties.

43 Written evidence from Microsoft UK (IRN0085)

44 Written evidence from Sky (IRN0060)

45 Written evidence from Doteveryone (IRN0028)

46 Written evidence from NINSO (IRN0035)

47 Written evidence from Lilian Edwards, Professor of eGovernance (IRN0069)

48 Professor Frank Pasquale, The Black Box Society (Harvard University Press 2015), p 3

50 Written evidence from Lilian Edwards, Professor of eGovernance (IRN0069)

51 Written evidence from Subforum (IRN0013)

52 Written evidence from McEvedys Solicitors & Attorneys (IRN0065)

53 Written evidence from the Information Commissioner’s Office (ICO) (IRN0087)

55 Written evidence from CARE (IRN0024)

57 Written evidence from Google (IRN0088). See also written evidence from the Royal Academy of Engineering (IRN0078).

58 Written evidence from BILETA (IRN0029). These two rights are enshrined in Article 10 of the European Convention on Human Rights.

59 Written evidence from the Advertising Association (IRN0039). In the US the Federal Communications Commission is seeking to repeal net neutrality rules in respect of Internet Service Providers.

60 Written evidence from Horizon Digital Economy Research Institute (IRN0038)

61 Written evidence from NINSO (IRN0035)

62 Written evidence from the ICO (IRN0087)

64 Communications Committee, Growing up with the internet (2nd Report, Session 2016–17, HL Paper 130)

65 Written evidence from Her Majesty’s Government (IRN0109)

66 Emily Laidlaw, Internet Gatekeepers, Human Rights and Corporate Social Responsibilities, PhD thesis (London School of Economics, 2012) p 3

67 Written evidence from the Information Law and Policy Centre, Institute for Advanced Legal Studies (IRN0063)

68 United Nations, Guiding Principles on Business and Human Rights (16 June 2011): https://www.ohchr.org/documents/publications/GuidingprinciplesBusinesshr_eN.pdf [accessed 26 February 2019]

70 Written evidence from BCS, The Chartered Institute for IT (IRN0092)

71 Written evidence from Michael Veale, University College London (IRN0077)

73 Written evidence from Global Partners Digital (IRN0099) and the UK Safer Internet Centre (IRN0061)

74 Written evidence from Which? (IRN0116)

75 5Rights, ‘The right to digital literacy’: https://5rightsfoundation.com/the-5-rights/the-right-to-digital-literacy.html [accessed 15 February 2019]

76 Written evidence from CMF (IRN0033)

77 Written evidence from Dr Paul Bernal (IRN0019)

78 5Rights, ‘The right to digital literacy’

79 Q 104 (Daniel Butler)

80 Written evidence from NINSO (IRN0035)

81 Written evidence from CMF (IRN0033)

82 Mark Bunting, ‘Keeping consumers safe online: Legislating for platform accountability for online content’, Communications Chambers (July 2018): http://static1.1.sqspcdn.com/static/f/1321365/27941308/1530714958163/Sky+Platform+Accountability+FINAL+020718+2200.pdf [accessed 16 January 2019]

83 Written evidence from Dr Paul Bernal (IRN0019)




© Parliamentary copyright 2019