The Right to Privacy (Article 8) and the Digital Revolution

Conclusions and recommendations


1.We consider that the vast majority of individuals would find it almost impossible to know what they are consenting to when using social media platforms or other web services. Individuals are highly unlikely to read or fully understand complex and lengthy terms and conditions or privacy notices. Moreover, these notices are non-negotiable and offered on a take-it-or-leave-it basis. Facebook, Snapchat, YouTube and many other online services make joining a service conditional on agreeing wholesale to terms and conditions, which include current privacy notices and future changes to terms. In practice, this means individuals often have no choice but to agree if they want to use a service, which raises questions about whether consent has really been given. (Paragraph 25)

2.Our view, based on the evidence we heard, is that the consent model is broken. It puts too much onus on the individual to educate themselves on how the technology companies work rather than setting a high standard of protection by default. (Paragraph 26)

3.It is unreasonable to place the onus for knowing about the risks or harms associated with using web-based services on the consumer. Internet users should be able to trust that the infrastructure is secure and will protect them appropriately. Consent should no longer be used as a blanket basis for processing. (Paragraph 27)

4.Just as they do in the offline world, the Government must ensure robust regulatory standards are in place, and rigorously enforced, so internet users can be confident that any data that companies hold about them is being used in a reasonable manner. (Paragraph 28)

5.Children and vulnerable adults are likely to find it particularly difficult to give meaningful consent, given the complexity of documents they are being asked to read. In addition, peer pressure to join the same social networks as their friends may make the ‘take it or leave it’ approach to consent especially problematic for children. (Paragraph 34)

6.We do not believe that it is reasonable to expect 13-year-olds to give informed consent to their personal data being processed. (Paragraph 35)

7.We also believe there is a very strong likelihood of those under 13 regularly ‘consenting’ to their data being used, given that there is no meaningful way for a company to determine the age of the person consenting. (Paragraph 36)

8.The general rule under Article 8 of the GDPR is an age of digital consent of 16. Protections for children in the UN Convention on the Rights of the Child should apply to all children under the age of 18. While the ‘consent model’ for data processing in the GDPR remains, the Government should urgently act to protect children by raising the age of digital consent to 16, and putting in place adequate protection for all those under 18 who access services online. In any case, consent should not be used as a basis for processing the data of children under the age of 16. (Paragraph 37)

Legitimate interests

9.Article 6 of the GDPR states that there may be a legitimate interest for the controller to process the data without consent where there is a relevant and appropriate relationship between the individual and the controller. However, there is not sufficient clarity on how an organisation determines what is in its legitimate interest, or when that interest overrides the individual’s rights. (Paragraph 42)

10.Given that there is a lack of understanding among companies around the use and relevance of the legitimate interests basis, we consider that clearer guidance should be issued to companies, by either the ICO or the Government, on when and how the legitimate interests basis can be used. We also consider that there should be a rigorous process to test whether companies are using legitimate interests appropriately. (Paragraph 43)

Risk to privacy

11.Using the internet is an essential part of most people’s day-to-day lives. But use of many websites and services is contingent on consenting to personal data being shared. This puts people’s right to privacy at risk. It is likely that many people are unaware that they have agreed for their data to be shared, especially given the complexity of consent agreements. (Paragraph 49)

Sharing data without subject’s knowledge

12.The evidence we heard suggests that people’s data is routinely being shared and used without their consent, which clearly infringes on their right to privacy. (Paragraph 53)

13.It should be made much simpler for individuals to see what data has been shared about them, and with whom, and to prevent some or all of their data being shared. (Paragraph 54)

14.The Government should explore the practicality and usefulness of creating a single online registry that would allow people to see, in real time, all the companies that hold personal data on them, and what data they hold. (Paragraph 55)

Combining data from different sources

15.Even where individuals have knowingly consented to sharing some of their personal data with one company, they may not be content with that data being combined to create a profile of themselves that they have no opportunity to see or edit. (Paragraph 64)

16.It is deeply concerning that ‘data’ about an individual is being used and shared when it is based on inferences that may be untrue, and when the individual has no opportunity to correct any inaccuracies: indeed, there is no way of finding out what inferences may have been made about you. (Paragraph 65)

17.This makes the need for people to be informed about what data is being collected and shared, and with whom, even more pressing. (Paragraph 66)

18.We agree with the recommendation of the House of Lords Communications Committee that, in a model similar to a subject access request under the GDPR, users should have the right to request data that a company has generated about them, so they are aware of any inferences that may have been made. (Paragraph 67)

Risk of data breaches

19.Companies hold significant amounts of our personal data. They must take full responsibility for keeping it safe and secure. (Paragraph 70)

User choice?

20.Companies must respect people’s right to privacy, and make it easier for people to limit or stop how their data is being shared. (Paragraph 74)

Challenging or deleting data

21.Given that many people will have consented to their personal data being shared without being in a position to understand what they were agreeing to, that people’s data is being shared without their consent, and that inferences are being drawn from people’s data to create a profile of them that may be entirely incorrect, it is vital that companies make it easy for people to correct or remove data held about them. While the GDPR gives individuals the rights to have their personal data erased and rectified, the evidence we heard suggests that these are not always adequately enforced. Companies must respect people’s right to privacy, and make it easier for people to limit or stop how their data is being shared. We consider that these rights could be more effectively enforced if specific sanctions were attached to non-compliance with them, particularly when companies fail to respond promptly or adequately to individuals’ requests to rectify or delete their data. (Paragraph 78)

The risk of discrimination

22.We were shocked to hear that major companies have used the ability to target advertising in order to discriminate against certain groups of people. Those social media channels and websites on which the advertisements are being placed must accept responsibility and carry out sufficient checks on adverts to ensure that companies are not inadvertently or deliberately excluding people in a discriminatory way which disadvantages them in their access to opportunities in areas like employment, housing or finance. (Paragraph 86)

23.There are challenging questions to be asked about the balance between providing ‘personalised content’ (i.e. showing someone the advertisements, news stories, etc. that they are most likely to be interested in) and discriminating against people by deciding certain material should or should not be shown to them because of their particular demographics. This debate needs to be had, and we urge the Government to bring internet companies, regulators and users together to discuss this. These discussions should also explore how anti-discrimination laws can be better enforced in the online advertising world. (Paragraph 87)

24.Companies need to be aware of how targeting content at people based on certain hobbies, interests, etc. may indirectly discriminate against certain groups of people. They should be actively looking for, and screening out, such practices, and ensuring they have adequate tests in place to consider whether targeting certain aspects of users’ profiles could be discriminatory. (Paragraph 88)

25.Important decisions–such as whether to refuse someone access to a service–should never be made based on inferences from people’s data, and the Government should review whether the current legal framework is adequately robust in this regard. (Paragraph 89)

26.We consider that more transparency is needed in relation to how advertisements are targeted at individuals online, in order to prevent discrimination from occurring. This could potentially include introducing tools through which individuals can look up how companies are targeting adverts at them, or at others, online and which would enable regulators to effectively audit the criteria used by advertisers. (Paragraph 90)

27.We consider that mechanisms allowing for better collective redress could be particularly useful in relation to targeted advertisements online, given that an individual cannot compare what they see online with what is seen by others and would therefore be unaware that they were being discriminated against. In such situations, unlawful practices are more likely to be revealed by independent investigations, most often carried out by civil society organisations and charities; if these organisations could then pursue cases on behalf of the affected individuals, the companies undertaking these activities could more effectively be held to account. (Paragraph 92)

The UN Guiding Principles

28.The UN Guiding Principles on Business and Human Rights, if fully implemented, would address many of the concerns raised in this report by requiring companies to both make users aware of how their data is used and proactively identify and mitigate any adverse impact their activities may have on people’s human rights. (Paragraph 98)

29.The Government should consider how it could mandate internet companies to adhere to the Guiding Principles, and how it could effectively enforce such a requirement. We restate the recommendation from our 2017 report on business and human rights that reporting on due diligence in human rights should be compulsory for large businesses. (Paragraph 99)

30.The Government should also update its National Action Plan for implementing the Guiding Principles to include specific consideration of the impact of internet and social media companies on human rights. (Paragraph 100)

Stronger enforcement of regulation

31.The GDPR should offer a substantial level of protection for people’s personal data, but this does not seem to have materialised in practice. The Government should review whether there are adequate measures in place to enforce the GDPR and DPA in relation to how internet companies are using personal data, including consideration of whether the ICO has the resources necessary to act as an effective regulator. (Paragraph 105)

New regulation

32.While we welcome the publication of the Government’s Online Harms White Paper, it was disappointing that violations of people’s right to privacy and freedom from discrimination were not included in the list of harmful online activity considered to be in scope of the White Paper. We do not agree with the Government that the existing legal framework provides adequate protection against the misuse of people’s data by internet companies, and we urge it to reconsider the scope of its proposals. (Paragraph 108)

33.The Government’s proposals to create a new statutory duty of care to make companies take more responsibility for the safety of their users, enforced by an independent regulator, could provide a valuable framework for ensuring that internet companies uphold people’s human rights. We urge the Government to include in its proposed “duty of care” a requirement for companies to adhere to robust standards on how people’s data is processed. (Paragraph 109)

34.The Government should also consider how the UN’s Guiding Principles on Business and Human Rights could be incorporated into its new regulatory regime. (Paragraph 110)

Published: 3 November 2019