44.The rapid development of technology, as well as the growth of business models which allow companies to offer free services online in exchange for personal data, has meant that private companies are able to process personal data on a massive scale. A new industry has emerged, made up of companies known as ‘data brokers’, which collects and sells data on millions of individuals.
45.Most individuals have to use the internet in their everyday lives: according to the Office for National Statistics, 87% of all adults used the internet daily or almost every day in 2019. As a consequence, individuals release significant amounts of data.
46.Private companies assert that they collect data for the purposes of targeted advertising or personalisation of content. Google told us that personalising advertising to users “provide[s] an experience where ads are more relevant, more likely to be useful and, for advertisers, more likely to be effective.” Acxiom used the following examples to illustrate how personalised advertising could be useful for individuals:
If a car company thinks you are in the market to buy a car, it will look to show you, rather than someone who is not, its latest offer for its latest model. If you live in a high-rise building, people who sell garden equipment do not want to show you an advert, because not only is it a waste of your time but a waste of their resources. If they waste money on advertising, their costs go up, as does the price to the consumer. It is an incredibly complex equation.
47.Some of our witnesses, however, argued that the scale and amount of data being collected are beyond what is needed to provide a service. Madhumita Murgia from the Financial Times told us that current data practices of private companies were excessive and posed significant risks to individuals’ right to privacy:
“There is a social contract between all of us and our use of the free internet. Yes, we get all these [ … ] services for free and, yes, we do not mind being advertised to, especially if that advertising is relevant and targeted. That is agreed, and people want to use the internet for free, so they are willing to give up some amount of data, but the problem here is that it is completely out of control.”
48.As discussed in detail in Chapter 3, because of the length and complexity of consent agreements, and people’s desire or need to use the services on offer, it is likely that people are agreeing to their data being shared without realising what they have agreed to or feeling like they have a choice. Research by Doteveryone, a think tank that champions responsible technology, found that 62% of the people they spoke to were unaware that social media companies made money by selling data to third parties and 45% were unaware that information they enter on websites and social media can help target advertisements.
49.Using the internet is an essential part of most people’s day-to-day lives. But use of many websites and services is contingent on consenting to personal data being shared. This puts people’s right to privacy at risk. It is likely that many people are unaware that they have agreed for their data to be shared, especially given the complexity of consent agreements.
50.The ICO told us that businesses are increasingly buying and selling data through data brokers without the data subject’s knowledge and despite the fact that the data subject only gave consent to the use of their data in return for a service from one business. In 2018, for example, the ICO fined the owners of parenting advice website Emma’s Diary £140,000 for illegally selling data belonging to more than one million people. The data included people’s names, household addresses, the number of children they had and their children’s dates of birth. Emma’s Diary sold the information, obtained through their registration forms, to Experian Marketing Services. Experian, in turn, created a database for the Labour Party in order to profile new mums in the run-up to the 2017 General Election.
51.The ICO also raised concerns about information acquired through business acquisitions. They told us that larger technology companies have been known to acquire vast amounts of data after buying smaller technology firms, even though consent for use of that data had been given by individuals at different times and to different entities.
52.Dr Reuben Binns, a data scientist from the University of Oxford, told us about other ways in which people’s data may be shared without their consent. In his research study, which looked at nearly 1 million Android apps, he found that nine out of ten apps sent data back to Google; four out of ten apps sent data back to Facebook; and, in the case of Facebook, “many of them sent data automatically without the individual having the opportunity to say no to it.” Dr Binns told us that even the developers of the apps are not always fully aware of all the different parties that might receive that information:
“You may think you have a relationship of trust between a user and an app developer, when in fact the app developer may not be fully in control of everything that is happening within the app they have developed. The same thing applies to many websites. A great deal of third-party code is included in these websites, which facilitates the harvesting of data for advertising technology purposes.”
55.The Government should explore the practicality and usefulness of creating a single online registry that would allow people to see, in real time, all the companies that hold personal data on them, and what data they hold.
56.The ICO were one of a number of witnesses to raise concerns about data aggregation, or the practice of combining different data collected from different websites and online services, which can lead to very detailed profiles of individuals without the data subject’s knowledge. The ICO state that “[a]s we enter the era of ‘the internet of things’, larger aspects of people’s lives will yield data including, for example, GPS systems in cars and on phones, online search histories, credit/debit card purchases, social media communications, and cookies on websites; combined they paint a sophisticated picture of an individual data subject.”
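The aggregation the ICO describes can be illustrated with a minimal sketch. All identifiers, field names and records below are invented: the point is only that once separately collected datasets share a common key (such as an email address or advertising ID), they can be merged into one detailed profile.

```python
# Illustrative sketch only: hypothetical datasets and field names, showing
# how records collected by unrelated services can be merged into a single
# profile once they share a common identifier.

def build_profile(identifier, *sources):
    """Merge every record matching `identifier` into one profile dict."""
    profile = {"id": identifier}
    for records in sources:
        for record in records:
            if record.get("id") == identifier:
                profile.update({k: v for k, v in record.items() if k != "id"})
    return profile

gps_data = [{"id": "user@example.com", "last_location": "51.5074,-0.1278"}]
purchases = [{"id": "user@example.com", "recent_purchase": "garden trowel"}]
searches = [{"id": "user@example.com", "search_topic": "dry eye remedies"}]

profile = build_profile("user@example.com", gps_data, purchases, searches)
# The merged profile now spans location, spending and health interests,
# even though each dataset was collected separately and for a different
# stated purpose.
```

No single dataset here is especially revealing; it is the combination, performed without the data subject’s knowledge, that produces the “sophisticated picture” the ICO warns about.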
57.These profiles can be used, for example, to target online advertising, including through the real time bidding (RTB) process (see Box 2).
Box 2: What is Real Time Bidding?
RTB is a type of online advertising that enables advertisers to compete for digital advertising space, placing billions of online adverts on webpages and apps in the UK every day by automated means. It refers to the buying and selling of advertising inventory through real-time auctions that occur in the time it takes a webpage to load.
Madhumita Murgia from the Financial Times explained how RTB can work in practice:
“There is an auction system in the way online advertising works. [ … ] If I am on one side, as a user of the internet going to visit a website, there are hundreds of companies that might want to advertise their product to a 30-year-old woman who lives in London, works in media and likes to buy clothes, for example [ … ] when you visit this website, a profile about who you are is sent out and people are asked to bid. Companies then decide if they want to place their ad in front of you and put out different bids. There is one winning bid, and that is the advert you see in front of you on your page [ … ]”
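The auction Madhumita Murgia describes can be sketched in a few lines. This is a toy model, not any real RTB system: the bidder names, profile fields and second-price rule shown here are illustrative assumptions (real exchanges use standardised bid request formats and vary in their pricing rules).

```python
# Illustrative sketch only: a toy second-price auction of the kind used in
# real-time bidding. All bidder names and profile fields are invented.

def run_auction(user_profile, bidders):
    """Each bidder returns a bid (in pence) for this profile; the highest
    bidder wins, paying just above the second-highest bid."""
    bids = sorted(
        ((bidder(user_profile), name) for name, bidder in bidders.items()),
        reverse=True,
    )
    (top_bid, winner), (second_bid, _) = bids[0], bids[1]
    return winner, second_bid + 1  # winner pays second price plus one

# A profile like the one described in evidence is broadcast to all bidders.
profile = {"age": 30, "city": "London", "interests": ["media", "clothes"]}
bidders = {
    "fashion_retailer": lambda p: 80 if "clothes" in p["interests"] else 5,
    "car_dealer": lambda p: 20,
    "travel_agent": lambda p: 35 if p["city"] == "London" else 10,
}
winner, price = run_auction(profile, bidders)
# winner == "fashion_retailer"; it pays 36 pence for this impression.
```

Note that every bidder receives the profile in order to price it, whether or not it wins the auction; this is the broadcasting of personal data that witnesses raised concerns about.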
58.The ICO’s recent report into RTB explained:
“[it] involves the creation and sharing of user profiles within an ecosystem comprising thousands of organisations. These profiles can also be ‘enriched’ by information gathered by other sources, e.g. concerning individuals’ use of multiple devices and online services, as well as other ‘data matching’ services. The creation of these very detailed profiles, which are repeatedly augmented with information about actions that individuals take on the web, is disproportionate, intrusive and unfair in the context of the processing of personal data for the purposes of delivering targeted advertising.”
They go on to state that “in many cases data subjects are unaware that this processing is taking place.”
59.Financial Times journalist Madhumita Murgia explained the sort of data that could be being combined and shared in the RTB process:
“It can be your IP address, which means your exact location; in some cases, your actual latitude and longitude are broadcast out. It can be where you live, where you work, what you like to buy, what health conditions you are interested in, where you are travelling to, everything you do in your daily life, what you have bought in the real world, political preferences or sexuality. The concern is that special category data that are protected, including race, sexuality and your health status, are being broadcast out to companies. We have no idea whom it is going to. We have no idea what they are doing with it. There is no transparency.”
60.The data broker Acxiom sought to reassure the Committee about the type of data that might be combined. They told us that they do provide a service which involves combining first-party data (i.e. that held by the client) with Acxiom’s own data and/or third-party data. However, they emphasised that they do not do this when it might result in insights that reveal special category data. Alex Hazell, the Head of Legal at Acxiom, gave an example:
“[ … ] dry eye is a medical condition, so any data processed in relation to that condition would be special category data. [ … ] if we took Acxiom’s dataset, combined it with people with dry eye and tried to spot some trends, in my view that would cross a red line, because the Acxiom data would become what, under data protection rules, is called special category data, which the legislation treats as a particularly sensitive form of data [ … ] That is one line we would never cross.”
61.Profiles that companies hold on individuals are likely to be partially based on data that the individual has submitted (either to that company or another company): perhaps as part of the sign-up process to join a social media network, or through a form filled in while buying something online. But profiles may also contain inferences that are made when that data is combined. Natasha Lomas from Tech Crunch explained:
“Inferences can be made from personal data. You give your pieces of data and you think that is all you are giving, but using AI technologies all sorts of inferences can be drawn from this information. New companies might then calculate certain things about you that you do not necessarily know they are doing.”
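The kind of inference Natasha Lomas describes can be sketched very simply. The rules below are invented for illustration; real systems use statistical models rather than hand-written rules, but the effect is the same: attributes the person never disclosed appear in their profile.

```python
# Illustrative sketch only: invented rules showing how attributes a person
# never disclosed can be inferred from data they did disclose. Real systems
# learn such rules statistically; these inferences may simply be wrong.

def infer(disclosed):
    inferred = {}
    if "pram" in disclosed.get("purchases", []):
        inferred["likely_new_parent"] = True
    if disclosed.get("night_activity_hours", 0) > 4:
        inferred["possible_shift_worker"] = True
    return inferred

disclosed = {"purchases": ["pram", "coffee"], "night_activity_hours": 5}
inferences = infer(disclosed)
# The person shared only purchases and usage times, yet the profile now
# makes claims about their family and working life that they never stated,
# cannot see, and cannot correct.
```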
62.Natasha Lomas cited one example in which eye tracking software was being used to make assumptions about people’s sexual orientation, whether they have mental illness, are drunk or have taken drugs. She said:
“You are just using a piece of technology, and it might be making all these calculations about what it thinks you are, which might be wrong. If you do not even know it is happening, how could you address that inaccuracy? They might be telling someone else and sharing this inference that you are a drug taker, and it is not true.”
She went on to explain how difficult it would be to know if any inferences had been drawn about you:
“[Facebook] has a button where you can download your data, but it will just give you the things you have literally uploaded. It will not give you all the inferences that Facebook has made from your data, everything it has learned by watching you continuously. It does not define all the surveillance and intelligence as your personal data.”
While the GDPR and the DPA give individuals the right to obtain a copy of their personal data (known as a ‘subject access request’), this does not include the inferences that companies have made about a person based on that data as this is deemed to be the property of the company that acquired it. A House of Lords Communications Committee report into digital regulation looked at this issue and recommended that users should be able to request, in a manner similar to a subject access request, any data that a company has generated about them.
63.Professor Victoria Nash, from the Oxford Internet Institute at the University of Oxford, explained that these inferences could be used for a variety of purposes:
“If their only effect is on what adverts I am served, I probably will not worry too much, but my understanding is that they may be used in many other areas, for example to affect the price of goods you are offered and the array of products that are served to you. They may be used in terms of transfers to health and insurance companies, and the technologies we use at work. The way in which inferences are drawn from this wide array of data is a trend that worries me.”
64.Even where individuals have knowingly consented to sharing some of their personal data with one company, they may not be content with that data being combined to create a profile of themselves that they have no opportunity to see or edit.
65.It is deeply concerning that ‘data’ about an individual is being used and shared when it is based on inferences that may be untrue, and when the individual has no opportunity to correct any inaccuracies: indeed, there is no way of finding out what inferences may have been made about you.
67.We agree with the recommendation of the House of Lords Communications Committee that, in a model similar to a subject access request under the GDPR, users should have the right to request data that a company has generated about them, so they are aware of any inferences that may have been made.
68.The data storage practices of some businesses are another potential risk to privacy flagged to us by the ICO. They told us: “An obvious example is in the case of a data breach, where unauthorised entities gain access to data held by a data controller. This does not require a physical breach or loss, and with increasing amounts of data held in cloud storage remote hacking is becoming increasingly frequent.”
69.In July 2019, for example, the ICO issued a notice of its intention to fine the hotel chain Marriott International £99,200,396 for infringements of the GDPR. This related to a ‘cyber incident’ which exposed personal data from 339 million guest records.
72.Research by Doteveryone, however, found that only 24% of people thought that digital services made it easy for people to change their privacy settings.
73.Several witnesses also referred to research by the Norwegian Consumer Council which found that Facebook, Google and Windows 10 all make it difficult for people to increase their privacy settings and restrict the amount of their personal data that is shared. They found that, in the case of Facebook and Google, “users who want the privacy friendly option have to go through a significantly longer process”. Their report states:
“The popups from Facebook, Google and Windows 10 have design, symbols and wording that nudge users away from the privacy friendly choices. Choices are worded to compel users to make certain choices, while key information is omitted or downplayed. None of them lets the user freely postpone decisions. Also, Facebook and Google threaten users with loss of functionality or deletion of the user account if the user does not choose the privacy intrusive option. The GDPR settings from Facebook, Google and Windows 10 provide users with granular choices regarding the collection and use of personal data. At the same time, we find that the service providers employ numerous tactics in order to nudge or push consumers toward sharing as much data as possible.”
75.Google’s Public Policy Manager, Lanah Kammourieh Donnelly, told us that the company does offer “deletion settings” and that they “recently rolled out a feature called auto-delete that also allows users to set an automatic rolling deletion of data associated with their account.” Evidence from other witnesses, however, suggests that correcting or deleting personal data being held about you can be a near impossible task.
76.Dr Melanie Smallman is a lecturer at the Department of Science and Technology Studies at UCL. Despite her knowledge and expertise, however, she found herself unable to stop her personal data being shared or to have it deleted:
“When I train in the gym and the machines gather any fitness or weight data, that data goes to an app that I never signed up to and have asked repeatedly to be unsigned from. I spent two days trying to get to the bottom of where this data goes and who it can be shared with, which is not clear [ … ] What is clear is that there are three or four steps in the chain of where my personal health data, which I have repeatedly asked not to be stored, is going [ … ] I have [now] given up.”
77.Tamsin Allen told us about a client “about whom completely false and private information was published in newspapers.” He successfully won a libel case but the stories still appeared in internet searches and so he wanted Google to remove them. She told us:
“Months and months later, after their refusing to delist the URLs, he had to instruct lawyers. Months and months after we wrote to them, at great expense on his part, they agreed to de-list just some. Finally, we had to issue proceedings here and apply to serve them out of the jurisdiction and serve them on Google in America—at enormous cost, because they were not playing ball with their own delisting service. They instructed very expensive lawyers in London, who had a row with us. Eventually they agreed: “Right, we’ll delist”. They delisted, but everything that previously did not appear on the Google listing because it was too far down the ranking suddenly popped up. We got to the conclusion of the proceedings, he spent a fortune, and suddenly there is a whole load more of the same pieces of information back on Google. So we had to write back and ask them to go through all this again. They said, “Okay. We’ll get rid of it all again”. The same thing happened [ … ] it took 18 months to get anywhere near an internet clean-up, which is exceptionally difficult and expensive to do.”
78.Given that many people will have consented to their personal data being shared without being in a position to understand what they were agreeing to, that people’s data is being shared without their consent and that inferences are being drawn from people’s data to create a profile of them that may be entirely incorrect, it is vital that companies make it easy for people to correct or remove data held about them. While the GDPR gives individuals the rights to have their personal data erased and rectified, the evidence we heard suggests that these are not always adequately enforced. Companies must respect people’s right to privacy, and make it easier for people to limit or stop how their data is being shared. We consider that these rights could be more effectively enforced if specific sanctions were associated with non-compliance with these rights by companies, particularly when companies fail to respond promptly or adequately to individuals’ requests to rectify or delete their data.
Published: 3 November 2019