The Right to Privacy (Article 8) and the Digital Revolution

1 The UK regulatory landscape

Overview of the regulatory framework for data protection

1. The Data Protection Act 2018 (DPA) and the General Data Protection Regulation (GDPR) provide the data protection framework in the UK. Data protection rules, under the GDPR, apply to companies and organisations that offer goods and services, whether or not they are based in the EU, whenever they process the personal data of individuals in the EU.

2. Data protection laws apply to all types of personal data, whatever format the data takes. Whether it is held online on a computer system or on paper in a structured file, whenever information directly or indirectly identifying an individual is processed, data protection rights have to be respected. The data protection regulatory landscape for the UK is governed by:

a) the general data protection regime, which applies to most UK and EU businesses and includes the General Data Protection Regulation (“GDPR”) as tailored by the Data Protection Act 2018 (“DPA”);

b) the Privacy and Electronic Communications Regulations (“PECR”), which regulate the use of electronic marketing messages (by phone, fax, email or text), cookies, and the provision of electronic communication services to the public; and

c) the electronic identification and trust services regulation (“eIDAS”), which governs the provision of trust services such as electronic signatures, electronic time stamps and website authentication certificates.

Box 1: What do we mean by ‘processing’ data?

Article 4 of the GDPR defines processing as: “any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction.”

Source: General Data Protection Regulation

3. The data protection rules, under the GDPR, describe different situations in which a company or an organisation is allowed to process personal data. There are six lawful bases:

a) Contract/explicit agreement: where you have expressly agreed to the collection and/or processing of your data in a contract. Examples include a contract to supply goods or services when you buy something online, or an employment contract.

b) Legal obligation: when processing your data is a legal requirement. This includes, for example, when your employer gives information on your monthly salary to the relevant Government departments in order to determine National Insurance contributions, eligibility for welfare benefits and so on.

c) Vital interests: when processing is necessary to protect your life. This could apply in emergency medical care, for example, when it is necessary to process personal data for medical purposes but the individual is incapable of giving consent to the processing.

d) Public interest: when processing is necessary to enable an organisation to carry out its public functions and powers, or specific tasks in the public interest, as enshrined in law. This could include processing necessary for the administration of justice; parliamentary functions; statutory functions; governmental functions; or activities that support or promote democratic engagement.3

e) Legitimate interests: this could include your bank using your personal data to check whether you might be eligible for a savings account with a higher interest rate, for example.

f) Consent: consent should be a freely given, specific, informed and unambiguous indication of the individual’s wishes. The company or organisation must keep records so that it can demonstrate that consent has been given by the relevant individual.

4. One of the key aims of the GDPR and the DPA is to empower individuals and give them control over their personal data. They preserve existing rights for individuals and add new rights, such as the right to data portability and the right to be forgotten.

5. The data protection laws also contain specific protections for children. For the purposes of the GDPR, a child is someone below the age of 16, although Member States can reduce this age to 13, as the UK has done in the DPA. Consent can therefore only be obtained from a child under 13 in relation to online services if the consent is given or authorised by the holder of parental responsibility. In the UK, children who are 13 or older are expected to give consent in the same way as adults, with all of the associated risks. Other conditions under which the GDPR allows data to be processed can also be applied to children’s data, although organisations may find the criteria for the ‘legitimate interests’ condition, in particular, difficult to meet in relation to children.

6. Specific protections for children in data protection laws include:

a) Simplicity: privacy policies must be very clear and simple if they are aimed at children.

b) Automated decisions: profiling and automated decision-making should not be applied to children.

c) The right to be forgotten: this applies very strongly to children.

7. Lastly, the data protection laws make provision for special category data. Special category data is considered more sensitive, and so needs more protection. It includes information about an individual’s race, ethnic origin, politics, religion, trade union membership, genetics, health, sex life and sexual orientation. Special category data is broadly similar to the concept of sensitive personal data under the Data Protection Act 1998. Organisations must have a lawful basis for processing special category data in exactly the same way as for any other personal data. The difference is that they will also need to satisfy a specific condition under Article 9(2) of the GDPR. There are ten conditions listed in Article 9(2). One of these is that the data subject has given “explicit consent” to the processing.4

The regulatory and governing bodies concerned with data protection

8. The primary enforcer of rules relating to data protection, including the GDPR and the DPA, is the Information Commissioner and her office (the “ICO”). The ICO is an independent body that provides information and guidance to individuals and businesses, as well as taking enforcement action when organisations fail to meet their legal obligations.

9. Since April 2010, the ICO has had the power to issue monetary penalty notices of up to £500,000 for serious breaches of the Data Protection Act 1998 (and now the DPA 2018), and since May 2011 this power has been extended to serious breaches of the PECR.5 Under the DPA 2018, there is now a higher maximum penalty for severe violations (€20 million or 4% of total annual worldwide turnover in the preceding financial year, whichever is higher), which is intended to encourage compliance.6
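To illustrate how the “whichever is higher” cap operates (the turnover figures here are hypothetical and not drawn from the report): for an organisation with an annual worldwide turnover of €1 billion in the preceding financial year, 4% of turnover is €40 million, which exceeds €20 million, so the maximum penalty would be €40 million; for an organisation with a turnover of €100 million, 4% is only €4 million, so the €20 million figure would be the applicable maximum instead.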

10. The Government has also set up the Centre for Data Ethics and Innovation (“CDEI”) to provide independent, impartial and expert advice on the ethical and innovative deployment of data and Artificial Intelligence (“AI”). It produces guidance, highlights best practice, and publishes recommendations for Government (which the Government is bound to consider and respond to publicly). Of particular relevance to this inquiry, the CDEI’s work programme for 2019–20 contains plans for key reviews of both algorithmic bias and online targeting, investigating how data is used to personalise and shape people’s online environments.7

Overview of the regulatory framework for equality and human rights

11. In addition to specific data protection laws, human rights and equality legislation can also offer protection in relation to how people’s data is used.

12. The Human Rights Act 1998 (“HRA”) is based on, and “gives further effect” to, the rights and freedoms contained in the European Convention on Human Rights (“ECHR”).8 Of particular relevance are: (i) the right to respect for private and family life (Article 8 ECHR); and (ii) the prohibition of discrimination in the enjoyment of other ECHR rights (Article 14 ECHR). The right to respect for private and family life includes protections against unnecessary surveillance or intrusion into an individual’s private life or correspondence. The prohibition of discrimination provides that no one should be discriminated against in the application of the other Convention rights because, for example, of their sex, race, disability, sexuality, religion or age.

13. In addition, Section 13 of the Equality Act 2010 prohibits direct discrimination, while Section 19 prohibits indirect discrimination (where a provision, criterion or practice puts people sharing a protected characteristic at a particular disadvantage, and this cannot be objectively justified). The characteristics that are protected by the Equality Act in relation to goods and services are: age (but only if an individual is 18 or over); disability; gender reassignment; pregnancy and maternity; race; religion or belief; sex; and sexual orientation.


3 Information Commissioner’s Office, Lawful basis for processing: Public Task

4 See Information Commissioner’s Office, Special category data: At a glance, for all ten conditions.

5 Information Commissioner’s Office, Our history

6 Data Protection Act 2018, Sections 155, 156 and 157

7 See Department for Digital, Culture, Media & Sport, The Centre for Data Ethics and Innovation (CDEI) 2019/20 Work Programme, 20 March 2019

8 Human Rights Act 1998, Preamble




Published: 3 November 2019