79. We also heard how the increasing sharing of personal data online, and the associated data processing, could result in discrimination against certain groups or individuals in the way that their personal data was used. As outlined in chapter one, the Equality Act 2010 prohibits direct and indirect discrimination by private companies in the provision of goods and services. As such, private companies could be liable for breaching the prohibition on direct or indirect discrimination in relation to the way that they use technology – even if discrimination was not intended. The characteristics protected by the Equality Act in relation to goods and services are: age (but only if an individual is 18 or over); disability; gender reassignment; pregnancy and maternity; race; religion or belief; sex; and sexual orientation.
80. Several witnesses raised concerns that the way companies are using people’s personal data to target advertisements to them is resulting in discrimination. “Online platforms use algorithms (see Box 3) to present content to users based on (depending on the nature of the platform) what they were searching for, data collected about them (‘personalisation’) and factors such as whether an advertiser has paid for content to be prioritised.”
Box 3: Algorithms
“An algorithm is a set of rules to be used to make the necessary decisions to complete a given task. While algorithms have been used since antiquity, they have been critical to the development of computer science. In recent years, the word ‘algorithm’ is often taken to mean complex decision-making software. Algorithms are used in artificial intelligence. ‘Reinforcement learning’ allows algorithms to improve and rewrite themselves without further human input. Article 22 of the GDPR protects users from being subject to decisions made by algorithms which have “legal or significant effects”, such as when applying for loans online.”
81. Madhumita Murgia from the Financial Times provided several shocking examples of how targeted advertising had resulted in discriminatory outcomes. She told us that an investigation by fellow journalists into job advertisements on Facebook found that “lots of companies, including Amazon, Facebook itself and Goldman Sachs, were gating at what age people should see those ads, essentially discriminating by saying, ‘We only want young, hip people to work at our company, so only show this advert to people between 20 and 40’”. They also found that Facebook were accepting housing advertisements that discriminated by race, and advertisements aimed specifically at ‘Jew haters’. We invited Facebook to give evidence to us; they were unable to make anyone available on the dates requested.
82. It was equally concerning to hear that it was possible for companies to discriminate in less overt ways by using personal data to categorise people. In addition to targeting individuals based on their protected characteristics, companies can target preferences which are likely to be held by certain groups (a form of indirect discrimination). Dr Melanie Smallman explained that:
“The point is that the algorithms do not act in a discriminatory way by saying, “We’re going to exclude all women”, for example. It is much subtler than that. If you want to identify a young person, you can find somebody who likes a particular band or people who holiday in a particular place. You can advertise jobs to people who like golf. We know what these things mean.”
83. When we asked Google how they were ensuring that their platform was not being used by companies to bypass anti-discrimination laws, their Public Policy Manager, Lanah Kammourieh Donnelly, told us:
“First, we are bound by all the laws in place in this country, including legislation on equality and non-discrimination. That is simply our baseline. In addition to that, we do not allow the targeting of users based on sensitive data categories. Our policies, which we review regularly, make it clear that we do not allow discrimination; when we find a violation, we take action.”
84. Professor Victoria Nash explained to us how difficult it is to determine if discrimination is occurring, because the content that each individual sees online is personalised to them: “Without seeing the adverts that each and every one of us in the UK receives, it is impossible for me to look for trends, such as patterns of discrimination in the adverts that are displayed.”
Dr Melanie Smallman argued that the lack of diversity in the workforce of many internet companies may account for some of the discrimination taking place. She pointed out that it was important to understand that algorithms were not “simply automatic” and that “when adverts are served in a sexist or racist way, somebody is, or has been, behind that.” Dr Smallman said:
“That takes us to a much broader question about how technology companies are staffed, what workforces look like and how such decisions are even turned into algorithms in the first place [ … ]
“What are people wanting to advertise asking for? We have all heard stories like, “I’m less likely to be served an advert for a high-paid job than my male partner who has equal qualifications to mine”. What are advertisers asking for? I do not want to defend Google—it is not my job—but some responsibility has to be with those asking for such adverts. If they say, “I want to get my job advert to the right people”, questions need to be asked, such as, “How would you decide who the right people are? Are you going to advertise equally to women and men? Is there a risk of our company looking bad as a result of this?”
85. Our written evidence also highlighted specific concerns about how algorithms can draw inferences from personal data, posing risks of discrimination. Written evidence from Dr Matthew White cites research by Dr Sandra Wachter which looked at how inferences drawn from personal data can create opportunities for “discriminatory, biased and invasive decision-making.” The research suggested that major internet platforms and social media companies, such as Facebook, are able to infer protected characteristics such as race and sexual orientation, which are then used for targeted advertising, and that third parties have used such data to infer individuals’ socioeconomic status in order to determine their eligibility for loans. In their paper, A right to reasonable inferences: re-thinking data protection law in the age of Big Data and AI, Dr Sandra Wachter and Brent Mittelstadt argue that:
“Big Data analytics and artificial intelligence (AI) draw non-intuitive and unverifiable inferences and predictions about the behaviors, preferences, and private lives of individuals. These inferences draw on highly diverse and feature-rich data of unpredictable value, and create new opportunities for discriminatory [ … ] decision-making … “
“[ … ]a new data protection right, the “right to reasonable inferences,” is needed to help close the accountability gap currently posed by “high risk inferences,” [ … ] that damage privacy or reputation, or have low verifiability in the sense of being predictive or opinion based while being used in important decisions.”
86. We were shocked to hear that major companies have used the ability to target advertising in order to discriminate against certain groups of people. Those social media channels and websites on which the advertisements are being placed must accept responsibility and carry out sufficient checks on adverts to ensure that companies are not inadvertently or deliberately excluding people in a discriminatory way which disadvantages them in their access to opportunities in areas like employment, housing or finance.
87. There are challenging questions to be asked about the balance between providing ‘personalised content’ (i.e. showing someone the advertisements, news stories etc. that they are most likely to be interested in) and discriminating against people by deciding certain material should or should not be shown to them because of their particular demographics. This debate needs to be had, and we urge the Government to bring internet companies, regulators and users together to discuss this. These discussions should also explore how anti-discrimination laws can be better enforced in the online advertising world.
88. Companies need to be aware of how targeting content at people based on certain hobbies, interests and other preferences may indirectly discriminate against certain groups of people. They should be actively looking for, and screening out, such practices and ensuring they have adequate tests in place to consider whether targeting certain aspects of users’ profiles could be discriminatory.
89. Important decisions – such as whether to refuse someone access to a service – should never be made based on inferences from people’s data, and the Government should review whether the current legal framework is adequately robust in this regard.
90. We consider that more transparency is needed in relation to how advertisements are targeted at individuals online, in order to prevent discrimination from occurring. This could include introducing tools through which individuals can look up how companies are targeting adverts at them, or at others, online, and which would enable regulators to effectively audit the criteria used by advertisers.
91. We also note that there are concerns among some organisations working in this field that the DPA did not include a “collective redress” system, which would have allowed for one person or body to represent a group of individuals that have suffered the same harm.
92. We consider that mechanisms allowing for better collective redress could be particularly useful in relation to targeted advertisements online, given that an individual cannot compare what they see online with what is seen by others and would therefore be unaware that they were being discriminated against. In such situations, unlawful practices are more likely to be revealed by independent investigations, most often carried out by civil society organisations and charities; if these organisations could then pursue cases on behalf of the affected individuals, the companies undertaking these activities could more effectively be held to account.
63 House of Lords, Report of the Select Committee on Communications, 2nd Report of Session 2017–19, , HL Paper 299
64 [Madhumita Murgia]
65 [Madhumita Murgia]
66 [Dr Melanie Smallman]
67 [Lanah Kammourieh Donnelly]
68 [Professor Victoria Nash]
69 [Dr Melanie Smallman]
70 [Dr Melanie Smallman]
71 Dr Sandra Wachter and Brent Mittelstadt, , Columbia Business Law Review, Volume 2019, Issue 2 (2018)
72 Privacy International, , 1 July 2019
Published: 3 November 2019