1.The 10 principles set out in this report should guide the development and implementation of regulation online and be used to set expectations of digital services. These principles will help the industry, regulators, the Government and users work towards a common goal of making the internet a better, more respectful environment which is beneficial to all. They will help ensure that rights are protected online just as they are offline. If rights are infringed, those responsible should be held accountable in a fair and transparent way. With these principles the internet would remain open to innovation and creativity while a new culture of ethical behaviour would be embedded into the design of services. (Paragraph 68)
2.As organisations, including financial and health services providers, increasingly perceive individuals as the aggregation of data gathered about them (sometimes called their ‘data selves’), it is essential that data be accurate, up-to-date and processed fairly and lawfully, especially when processed by algorithm. While the GDPR and the Data Protection Act 2018 provide valuable safeguards, including subject access rights to ensure that data are accurate and up to date and the right to opt out from purely automated processing, there are weaknesses in the regime. For example, a subject access request does not give subjects automatic access to behavioural data generated about them because it is deemed to be the property of the company that acquired it. (Paragraph 80)
3.Users of internet services should have the right to receive a processing transparency report on request. In a model similar to a subject access report under the GDPR, users should have the right to request a data transparency report from data controllers showing not only what data they hold on the data subject (as is currently the case under the GDPR) but also what data they generate on them (behavioural data) and any behavioural data obtained from third parties, including details of when and how they were obtained. (Paragraph 81)
4.Data controllers and data processors should be required to publish an annual data transparency statement detailing which forms of behavioural data they generate or purchase from third parties, how they are stored and for how long, and how they are used and transferred. (Paragraph 82)
5.Digital service providers (such as hardware manufacturers, operators of digital platforms, including social media platforms and entertainment platforms, and games developers) should keep a record of time spent using their service which may be easily accessed and reviewed by users, with periodic reminders of prolonged or extended use through pop-up notices or similar. An industry standard on reasonable use should be developed to inform an understanding of what constitutes prolonged use. This standard should guide design so that services mitigate the risk of encouraging compulsive behaviour. (Paragraph 88)
6.The Information Commissioner’s Office should set out rules for the use of algorithms based on the principles set out in chapter 2. The ICO should be empowered to conduct impact-based audits where risks associated with using algorithms are greatest and to require businesses to explain how they use personal data and what their algorithms do. Failure to comply with the rules should result in sanctions. (Paragraph 100)
7.The ICO should also publish a code of best practice informed by the work of the Centre for Data Ethics and Innovation around the use of algorithms. This code could form the basis of a gold-standard industry ‘kitemark’. (Paragraph 101)
8.Data subjects should be given the right to request a statement from a data processor explaining how, if applicable, algorithms are used to profile them, deliver content or drive their behaviour. (Paragraph 102)
9.Terms of service must be written in a form which is clearly accessible and understandable to internet users. Alongside terms of service statements, a ‘plain English’ statement should be published which sets out clearly and concisely the most relevant provisions. These may make use of infographics or video statements where appropriate. (Paragraph 109)
10.Where children are permitted to access or use a service, age-appropriate terms and conditions must be provided. These should be written in language clearly understandable to children of the minimum age allowed on the platform. (Paragraph 110)
11.Maximum privacy and safety settings should be included in services by default. The Information Commissioner’s Office should provide guidance requiring platforms to provide greater choice to users to control how their data are collected and used. (Paragraph 115)
12.Regulators must ensure that terms of service are fair and must bring enforcement action against organisations which routinely breach their terms of service. (Paragraph 116)
13.Design principles and standards are a normal part of business life across all sectors. Establishing and enforcing standards that would meet the 10 principles would help to reduce harms to users and society. We recommend that regulation should follow the precautionary principle to ensure ethical design while also recognising the importance of innovation and entrepreneurship. (Paragraph 120)
14.We recommend that the ethical approach outlined in our 10 principles should be embedded in the teaching of all levels of computer science. The Government should promote and support this. The Centre for Data Ethics and Innovation will also have a role in providing guidance which can be incorporated into teaching, as well as educating users on the ethics and risks of the internet. (Paragraph 121)
15.Mergers and acquisitions should not allow large companies to become data monopolies. We recommend that in its review of competition law in the context of digital markets the Government should consider implementing a public-interest test for data-driven mergers and acquisitions. The public-interest standard would be the management, in the public interest and through competition law, of the accumulation of data. If necessary, the Competition and Markets Authority (CMA) could therefore intervene as it currently does in cases relevant to media plurality or national security. (Paragraph 150)
16.The modern internet is characterised by the concentration of market power in a small number of companies which operate online platforms. These services have been very popular and network effects have helped them to become dominant. Yet the nature of digital markets challenges traditional competition law. The meticulous ex post analyses that competition regulators use struggle to keep pace with the digital economy. The ability of platforms to cross-subsidise their products and services across markets to deliver them free or discounted to users challenges traditional understanding of the consumer welfare standard. (Paragraph 161)
17.In reviewing the application of competition law to digital markets, the Government should recognise the inherent power of intermediaries and broaden the consumer welfare standard to ensure that it takes adequate account of long-term innovation. The Government should work with the Competition and Markets Authority (CMA) to make the process for imposing interim measures more effective. (Paragraph 162)
18.We take this opportunity to repeat the recommendation that we made in our report ‘UK advertising in a digital world’ that the CMA should undertake a market study of the digital advertising market. We would be grateful for an update from the Government and the CMA. (Paragraph 163)
19.Online communications platforms act as gatekeepers for the internet, controlling what users can access and how they behave. They can be compared to utilities in the sense that users feel they cannot do without them and so have limited choice but to accept their terms of service. Providers of these services currently have little incentive to address concerns about data misuse or online harms, including harms to society. (Paragraph 172)
20.It is appropriate to put special obligations on these companies to ensure that they act fairly towards users and other companies, and in the interests of society. These obligations should be drawn up in accordance with the 10 principles we have set out earlier in this report and enforced by a regulator. (Paragraph 173)
21.It is too early to say how effective the right to data portability will be. It has the potential to help counteract the switching costs which lock users into services by giving them more autonomy over and control of their data. This will require greater interoperability. Portability would be more effective if the right applied to social graphs and other inferred data. The Centre for Data Ethics and Innovation should play a role in developing best practice in this area. The Information Commissioner’s Office should monitor the operation and effectiveness of this right and set out the basis on which it will be enforced. (Paragraph 180)
22.Some have argued that the conditional exemption from liability should be abolished altogether. It has been suggested that using artificial intelligence to identify illegal content could allow companies to comply with strict liability. However, such technology is not capable of identifying illegal content accurately and can have a discriminatory effect. Imposing strict liability would therefore have a chilling effect on freedom of speech. These concerns would need to be addressed before the ‘safe harbour’ provisions of the e-Commerce Directive are repealed. (Paragraph 194)
23.Online platforms have developed new services which were not envisaged when the e-Commerce Directive was introduced. They now play a key role in curating content for users, going beyond the role of a simple hosting platform. As such, they can facilitate the propagation of illegal content online. ‘Notice and takedown’ is not an adequate model for content regulation. Case law has already developed on situations where the conditional exemption from liability under the e-Commerce Directive should not apply. Nevertheless, the directive may need to be revised or replaced to reflect better its original purpose. (Paragraph 195)
24.Technology companies provide venues for illegal content and other forms of online abuse, bullying and fake news. Although they acknowledge some responsibility, their responses are not commensurate with the scale of the problem. We recommend that a duty of care should be imposed on online services which host and curate content which can openly be uploaded and accessed by the public. This would aim to create a culture of risk management at all stages of the design and delivery of services. (Paragraph 207)
25.To be effective, a duty of care would have to be upheld by a regulator with a full set of enforcement powers. Given the urgency of the need to address online harms, we believe that in the first instance the remit of Ofcom should be expanded to include responsibility for enforcing the duty of care. Ofcom has experience of surveying digital literacy and consumption, and experience in assessing inappropriate content and balancing it against other rights, including freedom of expression. It may be that in time a new regulator is required. (Paragraph 208)
26.Content moderation is often ineffective in removing content which is either illegal or breaks community standards. Major platforms have failed to invest in their moderation systems, leaving moderators overstretched and inadequately trained. There is little clarity about the expected standard of behaviour and little recourse for a user to seek to reverse a moderation decision against them. In cases where a user’s content is blocked or removed, this can impinge on their right to freedom of expression. (Paragraph 224)
27.Community standards should be easily accessible to users and written in plain English. Ofcom should have power to investigate whether the standards are being upheld and to consider appeals against moderation decisions. Ofcom should be empowered to impose fines against a company if it finds that the company persistently breaches its terms of use. (Paragraph 225)
28.The sector should collaborate with Ofcom to devise a labelling scheme for social media websites and apps. A classification framework similar to that of the British Board of Film Classification would help users to identify more quickly the risks of using a platform. This would allow sites which wish to allow unfettered conversation or legal adult material to do so. Users could then more easily choose between platforms with stricter or more relaxed community standards. (Paragraph 226)
29.Community standards and classifications should be consistent with a platform’s age policy. (Paragraph 228)
30.We recommend that a new body, which we call the Digital Authority, should be established to co-ordinate regulators in the digital world. We recommend that the Digital Authority should have the following functions: to continually assess regulation in the digital world and make recommendations on where additional powers are necessary to fill gaps;
31.Policy-makers across different sectors have not responded adequately to changes in the digital world. The Digital Authority should be empowered to instruct regulators to address specific problems or areas. In cases where this is not possible because problems are not within the remit of any regulator, the Digital Authority should advise the Government and Parliament that new or strengthened legal powers are needed. (Paragraph 241)
32.The Digital Authority will co-ordinate regulators across different sectors and multiple Government departments. We therefore recommend that it should report to the Cabinet Office and be overseen at the highest level. (Paragraph 245)
33.We recommend that a joint committee of both Houses of Parliament should be established to consider matters related to the digital environment. In addition to advising the Government the Digital Authority should report to Parliament on a quarterly basis and regularly give evidence to the new joint committee to discuss the adequacy of powers and resources in regulating the digital world. The combined force of the Digital Authority and the joint committee will bring a new consistency and urgency to regulation. (Paragraph 246)