18.As outlined in Chapter One, there are six lawful bases for processing personal data. In the context of the provision of internet-based services, the most common bases are consent and legitimate interests.
19.Several witnesses expressed concern about the way that consent, as a legal basis for data processing, works in practice. Tamsin Allen, Partner at Bindmans LLP, emphasised what is at stake: “What we are dealing with is agreeing to the use of your data not just by one company; you are agreeing to your data being used, reused, combined, mathematically altered, and kept probably for ever in one form or another [ … ] keeping you as one digital archetype at one moment in time.”
20.A major concern was whether individuals are aware of what they are consenting to when using social media platforms or other web services. According to Liberty:
“[ … ] vast numbers of people are not fully aware of how their data is being used, and do not have a meaningful level of choice to consent to this usage. Many users of social media platforms will feel trapped in the decision to either accept terms and conditions they are not comfortable with or find themselves unable to access the service which may form an integral part of their lives.”
21.Doteveryone, a think tank that champions responsible technology, agreed. In a research project it conducted, 47% of respondents said they felt they had no choice but to sign up to terms and conditions, even if they had concerns about them.
22.Dr Orla Lynskey, Associate Professor of Law at the London School of Economics, told us:
“To be valid from a legal perspective, consent has to be freely given, specific and informed, so you can already imagine how difficult it is to fulfil those conditions when you think of the way in which you are asked to provide consent in the digital environment. If you are consenting to something on your mobile phone, for instance, that information might be disaggregated across six or seven documents that you have to click through a number of times to get a complete picture of the way in which your personal information is being used. That makes it very difficult to have informed consent.
There is [also] a widespread commercial practice of bundling consent—having very vaguely stated purposes for the use of your personal information, which militates against this idea that you should be consenting to something specific.”
23.Privacy International agreed that having unduly lengthy privacy policies made it difficult for individuals to make informed choices about handing over their personal data:
“The average Internet user would have to spend seventy-six working days each year to simply read the privacy policies they would encounter in a given year. An investigation by the BBC in June 2018 revealed that companies such as Amazon, Apple, Facebook, Google, Instagram, LinkedIn, Snapchat, Spotify, Tinder, Twitter, WhatsApp, and YouTube had privacy policies that were written at a university reading level and would be more complicated to read than Charles Dickens’ A Tale of Two Cities. Reading the privacy policies of the fifteen companies the BBC examined would take an average person almost nine hours to read.”
24.The ‘consent model’ also relies on individuals having the necessary expertise to understand the risks that may be involved in what they are consenting to. Tamsin Allen made the point that we do not expect this from individuals when it comes to assessing risk in the ‘real world’:
“If you enter a building, you do not sign away your rights to enter it safely. You do not sign a form with 14,000 pages that tells you how the building was built and that says you have to accept the risk. You rely on the fact that the architect, the engineer and the builder will be subject to regulation, and that there will be insurance and public liability requirements on the building because it is open to the public, and you will feel that you can then walk into that building safely.
The companies that build data systems describe themselves as architects and engineers, so it is unfair on an individual to expect them to take responsibility for any risks, and there are serious risks of harm associated with using web-based services.”
25.We consider that the vast majority of individuals would find it almost impossible to know what they are consenting to when using social media platforms or other web services. Individuals are highly unlikely to read or fully understand complex and lengthy terms and conditions or privacy notices. Moreover, these notices are non-negotiable and offered on a take-it-or-leave-it basis. Facebook, Snapchat, YouTube and many other online services make joining a service conditional on agreeing wholesale to terms and conditions, which include current privacy notices and future changes to terms. In practice, this means individuals often have no choice but to agree if they want to use a service, which raises questions about whether consent has really been given.
26.Our view, based on the evidence we heard, is that the consent model is broken. It places too much onus on individuals to educate themselves about how technology companies use their data, rather than setting a high standard of protection by default.
27.It is unreasonable to place the onus for knowing about the risks or harms associated with using web-based services on the consumer. Internet users should be able to trust that the infrastructure is secure and will protect them appropriately. Consent should no longer be used as a blanket basis for processing.
28.Just as in the offline world, the Government must ensure that robust regulatory standards are in place, and rigorously enforced, so that internet users can be confident that any data companies hold about them is being used in a reasonable manner.
29.There are also concerns about the way that children’s consent is obtained online. As outlined in Chapter One, in the UK a child aged 13 years or older can consent to their personal data being processed; parental consent is required to collect and process the information of children aged 12 and under.
30.UNICEF UK argued that, in situations where parental consent is required for children to share their data, there is no guarantee that parents are better positioned to make decisions that protect children’s privacy. UNICEF UK cited research by the LSE from May 2018 which looked at whether parents have the skills to translate concerns about privacy into practical action. The research found that 58% of parents said they could change their own privacy settings online, while 53% said they could decide which information they wanted to share, suggesting that nearly half would not know how to stop their children’s data being shared.
31.It is also not clear how websites and social media platforms can determine the age of the person consenting with any accuracy, suggesting that the data of many children under the age of 13 may be being collected and processed without parental consent. We note that some respondents to the ICO’s Age Appropriate Design Code consultation, which closed in May 2019, highlighted that obtaining and verifying parental consent for children under 13 was a problem. We also note with considerable disappointment that the Government’s plans to introduce age verification for access to online pornography, contained in Part 3 of the Digital Economy Act 2017, are not being commenced, although the Government is looking at other mechanisms to achieve similar aims. While not directly related to the focus of this inquiry, the debate on protecting children from adult content highlights the lack of mechanisms in place to determine the age of the user. We call on the Government to expedite finding an effective solution to this problem as part of its wider work on online harms.
32.For those children deemed old enough to give consent (those aged 13 or over), 5Rights Foundation, an organisation dedicated to making the digital environment fit for children, stressed that the barriers to giving informed consent are even greater than for adults:
“Technology companies often assert that children understand their privacy and rights online. Yet extensive independent research repeatedly finds not only that children don’t fully understand their privacy or rights online, but also that they are actively discouraged from understanding them by the way the information is presented online.
Children don’t read terms and conditions or privacy notices, and are either unable or discouraged from doing so given their length and complexity.”
33.The Government has announced its intention to publish draft legislation aimed at tackling online harm, which includes protecting children from harmful content. We believe this could also be a vehicle for protecting children (and adults) in relation to how their data is used, an issue we explore further in Chapter 6. We look forward to scrutinising the draft Bill, including to consider whether there are sufficient protections within it to protect children online, in line with the United Nations Convention on the Rights of the Child.
34.Children and vulnerable adults are likely to find it particularly difficult to give meaningful consent, given the complexity of documents they are being asked to read. In addition, peer pressure to join the same social networks as their friends may make the ‘take it or leave it’ approach to consent especially problematic for children.
36.We also believe there is a very strong likelihood of those under 13 regularly ‘consenting’ to their data being used, given that there is no meaningful way for a company to determine the age of the person consenting.
37.The general rule under Article 8 of the GDPR is an age of digital consent of 16. Protections for children in the UN Convention on the Rights of the Child should apply to all children under the age of 18. While the ‘consent model’ for data processing in the GDPR remains, the Government should urgently act to protect children by raising the age of digital consent to 16, and putting in place adequate protection for all those under 18 who access services online. In any case, consent should not be used as a basis for processing the data of children under the age of 16.
38.Article 6(1)(f) of the GDPR allows processing of data where:
“processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.”
39.Recital 47 of the GDPR broadly describes areas where legitimate interests could be relied upon, such as processing for direct marketing purposes or processing strictly necessary for the prevention of fraud.
40.The ICO’s guidance on legitimate interests states that a “wide range of interests may be legitimate interests. The GDPR specifically mentions use of client or employee data, marketing, fraud prevention, intra-group transfers, or IT security as potential legitimate interests.”
41.Richard Cumbley, from Linklaters LLP, told us that the use of legitimate interests now requires organisations to go through a legitimate interests assessment, evidencing to the regulator that appropriate steps have been taken to mitigate risks to individuals. However, Ailidh Callander, from Privacy International, expressed concern that:
“[ … ] legitimate interest is being used as a way to justify any business interest. There is no demonstrable evidence of how the rights of individuals are being considered.”
The ICO has also found that companies “are unable to demonstrate that they have properly carried out the legitimate interests tests and implemented appropriate safeguards.”
42.Recital 47 of the GDPR states that a legitimate interest could exist for the controller to process data without consent where there is a relevant and appropriate relationship between the individual and the controller. However, there is not sufficient clarity on how an organisation determines what is in its legitimate interest, or when that interest may override the individual’s rights.
43.Given the lack of understanding among companies around the use and relevance of the legitimate interests basis, we consider that clearer guidance should be issued to companies, by the ICO or the Government, on when and how the legitimate interests basis can be used. We also consider that there should be a rigorous process to test whether companies are using legitimate interests appropriately.
Published: 3 November 2019