Regulating in a digital world Contents

Regulating in a digital world

Chapter 1: Introduction

“The changes we’ve managed to bring have created a better and more connected world. But for all the good we’ve achieved, the web has evolved into an engine of inequity and division; swayed by powerful forces who use it for their own agendas.” Sir Tim Berners-Lee, Creator of the World Wide Web1

“My position is not that there should be no regulation. I think the real question as the internet becomes more important in people’s lives is ‘What is the right regulation?’” Mark Zuckerberg, Chief Executive Officer and founder of Facebook2


1.We began our inquiry by asking whether further internet regulation was possible or desirable.3 However, the focus of this report might be better described as the ‘digital world’: an environment composed of digital services—facilitated by the internet—which plays an ever-increasing role in all aspects of life. The digital world enables people to communicate and transact business with one another on a scale previously unimaginable.

2.The internet has transformed and disrupted economies thanks to rapid innovation enabled by light-touch regulation and a corporate culture which espoused the mantra “move fast and break things”. The speed of technological change and its transnational character make the digital world hard to regulate.4 There has therefore been a widespread perception that comprehensive internet regulation was not possible or, if it were possible, not advisable.

3.More recently, however, there has been a backlash against this attitude. A series of events has highlighted a litany of concerns, such as harmful online content, abusive and threatening behaviour, cybercrime, misuse of data, and political misinformation and polarisation. According to a survey for Ofcom and the Information Commissioner’s Office (ICO), 45% of adult internet users in the UK have experienced some form of online harm.5 However, individuals are often unaware of their rights or of what they should expect from online service providers.6 There is an emerging consensus that action is needed to address these concerns.

4.The internet started more than 40 years ago as a decentralised communications network which was open to be used by anyone, although it was largely used by the military and academics who had the necessary equipment and technical ability.7 Since then a small number of companies have come to dominate the digital world. In the quotation above, Sir Tim Berners-Lee, the creator of the World Wide Web, expressed concern that this has led to a power imbalance, allowing these large companies to treat users unfairly and with little regard to negative consequences for society as a whole. Without intervention the largest tech companies are likely to gain ever more control of technologies which disseminate media content, extract data from the home and individuals or make decisions affecting people’s lives. If governments fail to regulate the internet adequately, it will evolve in ways determined by, and in the interests of, these companies. Professor Christopher Marsden of the University of Sussex explained: “Our relationship with the internet, as society and as individuals, continues to develop, so the do-nothing option is not one in which nothing happens. A great deal happens, but without legislative impulse.”8

5.Although the internet is subject to a variety of laws and regulation including copyright law, defamation law, the data protection framework, and the criminal law, a large volume of activity occurs online which would not normally be tolerated offline.

6.One example is the combined effect of personal data profiling and targeted political and commercial messaging, including so-called ‘fake news’. While some activities surrounding the Cambridge Analytica scandal have been found to be unlawful, with the ICO stating its intention to fine Facebook the maximum £500,000 for two breaches of the Data Protection Act 1998, other forms of targeted messaging exist in a grey area. The Digital, Culture, Media and Sport Committee found that “Electoral law in this country is not fit for purpose for the digital age, and needs to be amended to reflect new technologies.”9

7.This is but one recent area of concern. Jamie Bartlett, Director of the Centre for the Analysis of Social Media at Demos, told us that the digital world encourages poor behaviour at the personal level:

“Simply the way we communicate with each other online is very sharp, quick, and dramatic. We tend to overstate our enemies’ or opponents’ importance and significance, and we attribute to them all sorts of terrible motives that they probably do not have, and they do likewise to us.”10

8.Considerable media attention has focused on hateful speech in political discourse on social media, particularly abuse directed at female MPs. Amnesty International found that Diane Abbott MP received 8,121 abusive tweets in 150 days—an average of 54 per day.11 There are widespread concerns about the role of social media in spreading hate and societal dissonance, in spite of services’ community standards forbidding hate speech.12

9.Although much of the discussion about internet regulation has focused on social media, Rachel Coldicutt, Chief Executive Officer of Doteveryone, cautioned that this is just “the tip of the iceberg. There are an enormous number of other potential harms.”13

10.Action is needed to address these harms and to make the digital world work better for individuals and society.

The law on the internet

11.The internet is not an unregulated ‘Wild West’, as it has sometimes been characterised.14 Criminal and civil law generally applies to activity on the internet in the same way as elsewhere. For example, section 1 of the Malicious Communications Act 1988 prohibits the sending of messages which are threatening or grossly offensive; it applies whether the message is through the post or through any form of electronic communication. There is also legislation which specifically targets online behaviour, such as the Computer Misuse Act 1990.

12.There are three models for making and enforcing rules of law and other norms and standards online: regulation, self-regulation and co-regulation.

13.Regulation is carried out by independent bodies with powers to monitor and enforce rules for conducting specified types of activity. Several regulators have responsibilities for activities which are particularly relevant to the online environment. Notably, Ofcom has responsibility for ‘TV-like’ content and telecommunications companies, which provide material access to the internet, and the Information Commissioner’s Office regulates the use of data, which is essential to the digital economy.15 But no regulator has a remit for the internet in general and there are aspects of the digital environment, such as user-generated content, for which no specific regulator is responsible.

14.Self-regulation is where internet businesses set rules themselves on a voluntary basis. These may include best practice and corporate social responsibility. In our report Growing up with the internet,16 we found a strong preference among internet policy-makers for self-regulation online as it allowed businesses to apply rules in accordance with their own business interests.

15.Co-regulation is where a regulatory body delegates responsibility to enforce rules to an industry body. For example, the Communications Act 2003 gave Ofcom the duty to regulate broadcast advertising, but Ofcom delegated the day-to-day responsibility for this to the Advertising Standards Authority, an industry body which regulates advertising content.17 In practice, there is a sliding scale of self-regulation and co-regulation depending on the degree to which rules are formalised and the Government, or other public bodies, put pressure on industry to regulate itself.18

16.The transnational nature of the internet poses problems in enforcing regulation, including conflicts of law, confusion about which jurisdiction applies and in seeking redress against foreign actors. But individual countries are not powerless in enforcing their own laws. Professor Derek McAuley and his colleagues at the Horizon Digital Economy Research Institute, University of Nottingham, explained how the General Data Protection Regulation (GDPR) identifies jurisdiction by focusing on where the impact of processing occurs, namely the location of the data subject: “So generally, it is the case that services targeted at specific jurisdictions through localisation, whether through language or tailored local content, and generating revenue from such localisation should be required to obey the regulation within that jurisdiction.”19

17.Similarly, although it may be difficult to prevent online harms which originate outside the United Kingdom, the law can still be effective in protecting victims within this jurisdiction. For example, although salacious reports were published around the world about the private life of an anonymous celebrity, the Supreme Court granted an injunction against such reports being circulated in England and Wales where the celebrity’s child might see them in future on social media.20

18.In the long term, regulatory fragmentation threatens the cohesiveness and interoperability of the internet, which has developed as a global and borderless medium. The Internet Society has called on national policy-makers to weigh the risks and benefits of any regulatory action, to collaborate with stakeholders, and to be mindful of the unique properties of the internet, including interoperability and accessibility.21 Global action also makes domestic measures more effective. The Government told us that the UK has played a leading role in addressing problems raised by the internet and noted that: “As the UK leaves the EU, international collaboration will be more important than ever.”22 The UN is currently undertaking a high-level inquiry into digital cooperation.23

Our inquiry

19.Building on our previous inquiries on children’s use of the internet and the digital advertising market,24 we set out to explore how regulation of the digital world could be improved. In doing so, we sought to inform the Government’s ‘Digital Charter’, an ongoing programme of work aiming to make the UK “the safest place in the world to be online and the best place to start and grow a digital business”.25 We support these objectives. In our view, good regulation is not only about restricting certain types of conduct; rather, it makes the digital world work better for everyone and engenders a more respectful and trustworthy culture.

20.Several witnesses highlighted that the internet, which comprises different layers such as network infrastructure, protocols and standards, and the user services built on top of these, is too broad a concept to speak meaningfully of regulating.26 This report focuses on issues which are particularly relevant to the upper “user services” layer of the internet, in particular online platforms (see Box 1), but we believe that many of our key recommendations apply more broadly. Many witnesses argued that regulatory action should focus on the function of specific regulation (for example, data protection) rather than the technology being used,27 and that “one-size-fits-all” regulation would not work. However, we believe that regulation can be guided by common principles even where implementation differs.

21.We were concerned that there are gaps in regulation and that it appears to be fragmented and poorly enforced online. Policy discussion in this area seems to be driven by public perceptions of specific harms. The Royal Academy of Engineering called for:

“A strategic approach … alongside a more direct response to the current challenges. There is a risk that any response is tactical and piecemeal, responding to received wisdoms. Instead, a more fundamental rethink is required.”28

We sought to understand the question of internet regulation holistically to see what general approach was required for the future.

Box 1: Online platforms

The European Commission defines an online platform as “an undertaking operating in two (or multi)-sided markets, which uses the internet to enable interactions between two or more distinct but interdependent groups of users so as to generate value for at least one of the groups”. There is some uncertainty about the scope of this definition as the uses of online platforms are extremely diverse and still evolving. Examples include search engines, marketplaces, social media platforms, gaming platforms and content-sharing platforms.

Online platforms share the following features: they use communication and information technologies to facilitate interactions between users; they collect and use data about these interactions; and they tend to benefit from network effects.

Source: European Commission (2015), ‘Consultation on Regulatory environment for platforms, online intermediaries, data and cloud computing and the collaborative economy’, 24 September, p 5

22.In the next chapter we consider a principles-based approach to regulation. We then examine two overarching issues: the concentration of internet services in the hands of a small number of companies and the ethical principles of designing internet technology. Next we consider the role of online platforms in dealing with online harms; this is an area of focus as the Government develops its Internet Safety Strategy, a major strand of the Digital Charter.29 Finally, we explore how to regulate for the future.

23.We received over 100 pieces of written evidence. Between July 2018 and January 2019 we took oral evidence from many witnesses including legal and social science academics, think tanks, charities, rights groups, broadcasters, journalists, industry bodies, and representatives of some of the world’s largest tech companies, Google, Facebook, Microsoft and Amazon, as well as Twitter and Match Group. We also met representatives of criminal law enforcement, regulators and Margot James MP, Minister for Digital and the Creative Industries.

24.Our inquiry was also informed by several reports published shortly before or during the inquiry. They include the work of:

There have also been numerous reports by civil society groups and academics, including: Doteveryone, a thinktank; Communications Chambers, a consultancy; Professor Lorna Woods and William Perrin for the Carnegie UK Trust; and the LSE Truth, Trust and Technology Commission. The volume and contents of these reports reinforced our view that action is necessary.

25.The question of internet regulation has taken on a new prominence in the media since we began work. In particular, the death of 14-year-old Molly Russell and her family’s campaigning has given rise to a greater public awareness of the most extreme risks the internet can pose. There has also been a noticeable shift in the rhetoric of major platforms. In February 2019 Twitter’s CEO, Jack Dorsey, admitted that he would grade the company at a ‘C’ for ‘Tech Responsibility’ and reflected that Twitter had “put most of the burden on the victims of abuse (that’s a huge fail)”.31 We hope that our report can play a valuable part in this crucial and fast-moving debate on the future of regulation in a digital world.

26.We are grateful to all those who contributed to our inquiry. We also thank Professor Andrew Murray, Professor of Law at the London School of Economics and Political Science, who provided expert advice throughout our inquiry.

1 Sir Tim Berners-Lee, ‘One Small Step for the Web…’, Medium (29 September 2018): [accessed 29 January 2019]

2 ‘Mark Zuckerberg’s testimony to Congress: Facebook boss admits company working with Mueller’s Russia probe’ The Daily Telegraph (11 April 2018): [accessed 23 November 2018]

3 See appendix 3 for our call for evidence.

4 Written evidence from The Children’s Media Foundation (CMF) (IRN0033)

5 Ofcom and ICO, Internet users’ experience of harm online: summary of survey research (September 2018): [accessed 3 January 2019]

6 Q 161 (Caroline Normand)

7 John Naughton, ‘The evolution of the Internet: from military experiment to General Purpose Technology’, Journal of Cyber Policy, vol. 1 (12 February 2016): [accessed 26 February 2019]

9 Digital, Culture, Media and Sport Committee, Disinformation and ‘fake news’: Interim Report (Fifth Report, Session 2017–19, HC 363)

11 Amnesty International, ‘Unsocial Media: Tracking Twitter abuse against women MPs’ Medium (3 September 2017): [accessed 16 January 2019]

12 There are many reports on this such as CNN Business, ‘Big Tech made the social media mess. It has to fix it’ (29 October 2018): [accessed 16 January 2019].

14 Written evidence from Dr Paul Bernal (IRN0019)

15 See appendix 4 for a list of regulatory bodies which have such a remit.

16 Communications Committee, Growing up with the internet (2nd Report, Session 2016–17, HL Paper 130)

17 Advertising Standards Authority, ‘Self-regulation and co-regulation’: [accessed 29 November 2018]

18 Written evidence from Professor Christopher Marsden (IRN0080)

19 Written evidence from Horizon Digital Economy Research Institute, University of Nottingham (IRN0038)

20 PJS v Newsgroup Newspapers [2016] UKSC 26

21 Internet Society ‘The Internet and Extra-Territorial Effects of Laws’ (18 October 2018): [accessed 7 January 2019]

22 Written evidence from Her Majesty’s Government (IRN0109)

23 UN Secretary-General’s High Level Panel on Digital Cooperation, Digital Cooperation Press Release (12 July 2018) [accessed 26 February 2019]

24 Communications Committee, Growing up with the internet (2nd Report, Session 2016–17, HL Paper 130); Communications Committee, UK advertising in a digital age (1st Report, Session 2017–19, HL Paper 116)

25 DCMS, Digital Charter (25 January 2018): [accessed 26 November 2018]

26 Written evidence from Cloudflare (IRN0064) and Internet Society UK Chapter (IRN0076)

27 Written evidence from Horizon Digital Economy Research Institute, University of Nottingham (IRN0038)

28 Written evidence from the Royal Academy of Engineering (IRN0078)

29 DCMS, ‘Internet Safety Strategy green paper’ (11 October 2017): [accessed 11 December 2018]

30 This was also the subject of the European Union Committee’s report. Select Committee on European Union, Online platforms and the Digital Single Market (10th Report, Session 2015–16, HL Paper 129)

31 Casey Quackenbush, ‘Twitter’s CEO gives the company a “C” for “Tech Responsibility”’ Time (13 February 2019) [accessed 14 February 2019]

© Parliamentary copyright 2019