Select Committee on Home Affairs Fifth Report



5  Are existing safeguards strong enough?

Regulatory safeguards

127.  During our inquiry we heard evidence on the safeguards in place to protect individuals from unauthorised surveillance and from security breaches which might lead to the disclosure of their personal information. There are several key pieces of legislation which govern data-sharing and the protection of human rights with regard to personal data and privacy in the UK:

the Data Protection Act 1998 governs the collection and exchange of personal data

the Regulation of Investigatory Powers Act (RIPA) 2000 legislates for using methods of surveillance and information gathering to help the prevention of crime, including terrorism[123]

Article 8 of the European Convention on Human Rights sets out the right to privacy.

The Data Protection Act gives effect to the European Data Protection Directive. In April 2008 the Information Commissioner's Office announced that it would invite tenders to carry out a study of the strengths and weaknesses of EU Data Protection law, addressing "a growing feeling that the EU Directive on data protection is becoming increasingly outdated and is more bureaucratic and burdensome than it needs to be".[124]

128.  A number of other EU instruments affect how personal information relating to UK citizens is shared within the EU and with third countries. We considered safeguards for data in this context in connection with our inquiry into Justice and Home Affairs Issues at European Union Level.[125]

129.  The Ministry of Justice is responsible for the Government's domestic, European and international policy on data protection and data sharing.

130.  The Information Commissioner's Office (ICO) administers the Data Protection Act 1998. Under the Act, a data controller ("a person who (either alone or jointly or in common with other persons) determines the purposes for which and the manner in which any personal data are, or are to be, processed") has a duty to comply with a set of data protection principles in relation to all personal data in respect of which he is the controller.

(n)  Data Protection Principles

   1. Personal data shall be processed fairly and lawfully ...

   2. Personal data shall be obtained only for one or more specified and lawful purposes, and shall not be further processed in any manner incompatible with that purpose or those purposes.

   3. Personal data shall be adequate, relevant and not excessive in relation to the purpose or purposes for which they are processed.

   4. Personal data shall be accurate and, where necessary, kept up to date.

   5. Personal data processed for any purpose or purposes shall not be kept for longer than is necessary for that purpose or those purposes.

   6. Personal data shall be processed in accordance with the rights of data subjects under this Act.

   7. Appropriate technical and organisational measures shall be taken against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, personal data.

   8. Personal data shall not be transferred to a country or territory outside the European Economic Area unless that country or territory ensures an adequate level of protection for the rights and freedoms of data subjects in relation to the processing of personal data.

131.  The Information Commissioner has the power to conduct audits and inspections to ensure compliance with the Data Protection Act, but this power is limited by a requirement to obtain the consent of the data controller concerned. Where data protection principles are breached the Commissioner has the power to issue enforcement notices, which are remedial in effect. The Commissioner told us that these "do not impose any element of punishment for wrongdoing".[126] Offences under the Act are punishable by fine.

132.  The ICO also administers the Freedom of Information Act 2000, the Environmental Information Regulations 2004 and the Privacy and Electronic Communications (EC Directive) Regulations 2003.

133.  In conjunction with its two main aims—to ensure that public information is available to all, unless there are good reasons for non-disclosure, and to ensure that personal information is protected—the ICO works to encourage organisations to adopt good practice in relation to handling data and information, and to influence thinking on privacy and access issues.

134.  The ICO has drawn up codes of practice for the sharing of personal information and the use of CCTV and the Commissioner provides advice to public and private sector bodies on new practices and projects which involve the collection and use of personal information.[127]

Responsibility for protecting information in the public sector

135.  The Government's Chief Information Officer outlined the following "roles and accountabilities" in relation to safeguards for data held by public bodies:

Accounting officers

136.  The Accounting Officer of a public sector body is accountable for ensuring that:

personal information (or "Citizen data") is used for the purposes for which it was intended in accordance with the relevant legislation

the appropriate policies, procedures, staff and technology are deployed to maintain standards for keeping citizen data accurate and for protecting it from disclosure

defined roles and responsibilities within the organisation relate to the execution of risk identification and mitigation strategies for the use of citizen data

Chief Information Officer (CIO) Council

137.  The CIO Council's remit is to improve public service delivery by ensuring that the "strategic use of technology and computer systems are aligned" with the overall strategy set out in the Transformational Government Strategy (see above at para 26). The CIO Council is charged with creating and delivering a Government-wide CIO agenda to "build capacity and capability in IT-enabled business change".

Central Sponsor for Information Assurance

138.  The Central Sponsor for Information Assurance (CSIA), based in the Cabinet Office, is accountable for the development of strategy, policy and guidance relating to the protection of data. The CSIA is also responsible for the accreditation of departmental computer systems and networks and for ensuring that they conform to agreed minimum standards.[128]

Debate on the limitations of regulatory safeguards

139.  The Information Commissioner told us that whilst regulation formed the basis for protection of personal information and prevention of excessive surveillance, the role of the individual in seeking to check or query the collection, storage and use of information under data protection and other legislation was crucial:

it is about educating and encouraging people to use their own rights as much as about what we can do as the regulator.[129]

140.  In the Foreword to his Annual Report for 2006-07 the Information Commissioner argued that "self interest" on the part of those organisations which control the use of personal information was also a key safeguard against the risks associated with excessive surveillance:

Although many of the detailed rules are too bureaucratic, the underlying principles of data protection have successfully stood the test of time. They provide a sound framework to minimise the risks and promote acceptable and beneficial handling of personal information. But legal regulation is insufficient by itself. The consequences of getting it wrong can now be seen instantly—domestically and across the globe—causing great short-term damage to political and commercial reputations and long-term damage to society. It is ministers, permanent secretaries, chairs and chief executives who must ensure their organisations guarantee safeguards and exercise the necessary self-restraint. This is simple self-interest which must come from the top.[130]

141.  This "self-interest", however, conspicuously failed to prevent the "great short-term damage" to the reputations of HMRC and the other organisations involved in high-profile data loss incidents in recent months. In December 2007 the Information Commissioner told the House of Commons Justice Committee that after the loss of child benefit data, public and private sector bodies had approached his office almost "on a confessional basis" to bring to the Commissioner's attention problems they had encountered with security.[131]

Technological safeguards

Privacy-enhancing technologies

142.  Technologies themselves can provide powerful controls over the potential for surveillance or invasion of privacy, minimising data collection and providing intrinsic safeguards. For example, encryption of personal data as it is stored or flows across domains and other electronic boundaries can provide a degree of security, network design and software code can act to restrict surveillance, and web 'cookies' can be filtered. Such methods of limiting surveillance are known as 'Privacy-enhancing technologies' or PETs.
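The principle of encrypting stored personal data can be sketched in outline. The following is a toy illustration only, with hypothetical key and record values: it builds a keystream from SHA-256 in counter mode purely for demonstration, whereas a real system would use a vetted cipher such as AES.

```python
import hashlib


def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream built from SHA-256 in counter mode.

    Illustrative only -- a production system would use a vetted
    cipher (e.g. AES-GCM) rather than an ad hoc construction.
    """
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def xor_bytes(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))


# Hypothetical key and record; the key is held apart from the data,
# so a copy of the stored record alone reveals nothing.
key, nonce = b"key-held-apart-from-the-data", b"unique-nonce-01"
record = b"name=A. Citizen; NI number=QQ123456C"

ciphertext = xor_bytes(record, keystream(key, nonce, len(record)))
recovered = xor_bytes(ciphertext, keystream(key, nonce, len(record)))
```

The point of the sketch is the separation it achieves: whoever obtains the stored ciphertext without the key learns nothing about the record.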

  Cookies

  A "cookie" is a small piece of information sent by a web server to store on a web browser so it can later be read back from that browser. Cookies can contain a variety of information, including the name of the website that issued them, where on the site the user visited, and user names and passwords that have been supplied via forms.

  Cookies are used for a range of purposes, including: online ordering systems (and services such as Google Checkout, which allows people to buy from stores across the web and track all their orders and delivery information in one place), website personalisation, website tracking (for example, to allow a web designer to see how people navigate a particular site), and targeted marketing.

  Cookies can make using the internet and online services quicker and easier for consumers but raise privacy concerns because—whilst a great many companies have cookie policies and users can choose to stop their browsers from saving cookies—they can be stored in a user's computer without the user's knowledge or consent.[132]
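The round trip described in the box—a server issuing a cookie and later reading it back from the browser—can be illustrated with Python's standard `http.cookies` module (the cookie name, value and domain below are hypothetical):

```python
from http.cookies import SimpleCookie

# A server issues a Set-Cookie header; the browser stores the cookie
# and returns it on subsequent requests to the same site.
issued = SimpleCookie()
issued["session_id"] = "abc123"              # hypothetical value
issued["session_id"]["domain"] = "example.com"
issued["session_id"]["path"] = "/"
header = issued.output(header="Set-Cookie:")

# On a later request the browser sends the raw cookie string back,
# and the server parses it to recover the stored value.
returned = SimpleCookie("session_id=abc123")
value = returned["session_id"].value
```

Nothing in this exchange requires the user's involvement, which is precisely the privacy concern the box identifies: the storage and return of the value happen silently unless the browser is configured otherwise.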

143.  In a Communication to the European Parliament and the Council, the European Commission set out its support for PETs. It considers that PETs should be developed and more widely used, in particular where personal data is processed through ICT networks. Whilst the use of PETs would be complementary to the existing legal framework and enforcement mechanisms, the Commission considers that wider use of PETs would improve the protection of privacy as well as help fulfil data protection rules. The Commission's Communication gives the following as examples of PETs:

Automatic anonymisation of data after a certain lapse of time: this supports the principle that processed data should be kept in a form which permits identification of data subjects for no longer than necessary for the purposes for which the data were originally collected

Encryption tools: prevent hacking when information is transmitted over the Internet or other media, and support the data controller's obligation to take appropriate measures to protect personal data against unlawful processing

'Cookie-cutters': block cookies placed on the user's PC that make it perform certain instructions without the user being aware of them, enhancing compliance with the principle that data must be processed fairly and lawfully, and that the data subject must be informed about the processing going on

The 'Platform for Privacy Preferences' (P3P): allows internet users to analyse the privacy policies of websites and compare them with the users' preferences as to the information they wish to release, helping to ensure that data subjects' consent to processing of their data is an informed one.[133]
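The first of the Commission's examples—automatic anonymisation after a lapse of time—can be sketched as follows. The retention period, field names and records are hypothetical; the point is only that identifying fields are stripped once the original purpose has expired, while non-identifying fields can be kept for statistical use.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)  # hypothetical retention period


def anonymise_expired(records, now):
    """Blank identifying fields in records older than RETENTION,
    keeping only the non-identifying parts."""
    for rec in records:
        if now - rec["collected"] > RETENTION:
            rec["name"] = None
            rec["address"] = None
    return records


records = [
    {"name": "A. Citizen", "address": "1 High St", "postcode_area": "SW1",
     "collected": datetime(2008, 1, 1)},
    {"name": "B. Citizen", "address": "2 High St", "postcode_area": "N1",
     "collected": datetime(2008, 3, 1)},
]
anonymise_expired(records, now=datetime(2008, 3, 5))
```

Run against the sample data, the first record (64 days old) loses its identifying fields while the second (4 days old) is untouched.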

144.  Symantec, a software company which specialises in security software and services, told us that the market offered a number of technological tools:

suitable for different environments and different user-sophistication that can afford adequate levels of security and protection for personal sensitive information. The information security industry continues to develop innovative solutions that can ensure the security and privacy of individuals' information in the evolving threat landscape.[134]

145.  A conference supported by the European Commission and held in November 2007—a "series of independent forums for discussing privacy in relation to the development of new technology"—took as its theme "the need for privacy and security to be designed in to products and services at the earliest stage" and concluded that "Governments are critical enablers who will control the demand for and enforcement of PETs."[135]

Digital identities and identity management

146.  According to the Royal Academy of Engineering, technologies such as encryption can be used to separate authentication (the process by which a person is accepted as authorised to, or having the right to, engage in or perform some activity) from identification (the process by which a person's identity is revealed). For such technologies to provide an effective safeguard for personal information, however, the Royal Academy of Engineering argued:

privacy has to be engineered into the system at the most fundamental level, allowing anonymity or at least pseudonymity of users (the ability of users to have a different pseudonym for different services) at the level of the infrastructure.[136]
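The Royal Academy of Engineering's notion of pseudonymity—a different pseudonym for each service—can be sketched with a keyed hash. In this hypothetical arrangement a secret key held by the user yields an unlinkable pseudonym per service, so no two services see the same identifier and cannot correlate their records:

```python
import hashlib
import hmac

# Hypothetical user-held secret; in a real scheme this would be
# generated and protected by the user's own device or credential.
user_secret = b"user-held secret key"


def pseudonym(service_name: str) -> str:
    """Derive a stable, service-specific pseudonym from the secret.

    The same user presents a different identifier to each service,
    but always the same identifier to any one service.
    """
    digest = hmac.new(user_secret, service_name.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]


p_health = pseudonym("health-service")
p_tax = pseudonym("tax-service")
```

Each service can authenticate a returning user (the pseudonym is stable) without ever learning the identifier the user presents elsewhere—the separation of authentication from identification described above.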

147.  Hewlett-Packard Laboratories, one of the leaders of the PRIME project, a "four-year co-operation between 20 industrial and academic research institutions, that aims to advance the state of the art of privacy-enhancing technologies",[137] stated that:

There is a variety of technical approaches to providing the individual with the means to manage his/her digital identity information and to control its release and subsequent use. These range from approaches in which all communication and interaction between digital service provider and consumer is done on the basis of anonymous credentials (i.e., no identity information is transferred) to those in which the service provider's identity management systems are designed to follow all the consumer's requirements regarding his/her identity information (and thus act as his/her proxy) and are verified as actually doing so.[138]

Whilst it did not advocate "the total replacement of identity-based digital service delivery systems by those in which no identity information at all is required", Hewlett-Packard Laboratories emphasised the "benefits to be gained" if identity information demanded by a service provider "be just that required to deliver the service, and no more".[139]

148.  Caspar Bowden, an expert on privacy-enhancing technologies and privacy regulation, argued in written evidence that:

Sophisticated PETs can provide much more robust bulwarks against function-creep than policy controls alone, but it must be understood that the purpose of these technologies is expressly to minimize the disclosure of personal information to the absolute minimum required.[140]

Mr Bowden said that whilst industry and academia were able and willing to develop effective PETs and privacy systems, "there is a chronic lack of awareness and interest from both data controllers and most regulators".[141]

149.  According to Mr Bowden, advanced PET techniques could be used to create authentication systems which would eliminate the need for a database of all system transactions as the basis of an 'audit trail'. Mr Bowden argued that the use of these techniques was "mandated" by the Human Rights Act and Article 8 of the European Convention on Human Rights and that the legality of the "blanket collection of identifiable transaction data" was questionable.[142]

Debate on the limitations of technological safeguards

The role of the individual

150.  We heard from our witnesses that the effectiveness of privacy-enhancing technologies in protecting individuals from unauthorised monitoring of their online activities and theft of their personal information depends to a great extent on individuals' awareness of the need to take steps to secure the systems they use and on their ability to gain access to the new technologies.

151.  Pete Bramhall, of Hewlett-Packard Laboratories, said that technology such as encryption could provide effective protection but told us that "the question then becomes how do you make that usable and accessible to the ordinary person".[143] Dr Andy Phippen, of the University of Plymouth, agreed that these were key issues, arguing that the prospect of convenient access to services and other benefits was more influential than concern about security in guiding people's attitudes towards providing personal information by electronic means:

The public's view of encryption is whether the little padlock is on the browser and, if the padlock is on the browser, it is safe. I think the usability issues are extremely significant if you are looking at privacy-enhancing technologies at all and, unless your average person on the street is comfortable with them, guarantees of security will be ignored in a lot of the cases ... If you are buying something online and you are saving yourself 50 quid, it is very clear. There are some very successful public sector e-delivery mechanisms, such as the DVLA and tax returns, and school admissions systems for some reason are incredibly popular because they offer a sort of return in terms of convenience to individuals and they are not saying "I'm not using that" because you are not using the most up-to-date encryption mechanisms on it, but they are saying, "I'll use that because it will save me having to fill out the form on paper or it saves me having to phone someone up and do it all on the phone".[144]

152.  Dr Phippen told us that his research into attitudes towards responsibility for online security found that the predominant view amongst individuals was that responsibility for making sure that systems were secure rested elsewhere:

We did an awful lot of work with awareness and education, who is responsible, and it always comes back when you talk to citizens that it is the Government and it is the manufacturers that should be responsible.[145]

Whilst Pete Bramhall argued that garnering a reputation for protecting personal data could become a differentiator for private sector companies "particularly as far as the provision of digital services is concerned", he went on to tell us that the potential of privacy-enhancing technologies had not been realised:

increasingly as technology, particularly privacy-enhancing technology, begins to offer possibilities for system designers to design the systems in a way that actually requires less personal information, then I think the incentive to them to do so is not actually apparent at the moment because they are sort of stuck in this habit of gathering more information because it might come in useful some day.[146]

153.  A regular experiment to test the security of wireless networks carried out by Dr Phippen's students found that the number of unsecured networks had dropped significantly in the past two years. Dr Phippen attributed this change to the fact that vendors of computer equipment "are now providing out of the box some level of security". He said that whilst "manufacturers are trying to do more" in terms of building security measures into the products they sold, another experiment showed that over 60% of people who were sent an unsolicited message on their unsecured Bluetooth devices received the message and loaded it up. Dr Phippen concluded from this experiment that although manufacturers could "do a lot" and Government, through education, "is not doing enough" to protect privacy:

there has to be personal responsibility because ultimately it is a personal device ... people were very willing to accept that something is in their personal device, they did not know what it was, they just accepted it. Now, how could a manufacturer protect against that?[147]

PETs and a 'privacy divide'

154.  The Surveillance Studies Network insisted in its evidence that PETs could not be regarded as a panacea, asking:

Do PETs represent simply a market response to problems of surveillance and privacy? If so, their spread and relative effectiveness will replicate social and economic divisions, leading to a society of privacy haves and have-nots.[148]

The Network posited the emergence of "Personal Information Economies" in which the wealthy are able to enjoy the benefits of surveillance (for example, as consumers or users of public services) and "technologically-enhanced privacy", but the "poor, marginalised and excluded", who are unable to gain access to such technologies, are simply subjects of "mass surveillance, categorisation and control".[149]

155.  We put to our witnesses the concerns of the Surveillance Studies Network that reliance on privacy-enhancing technologies to protect personal information online could create a 'privacy divide' between those who could afford to invest in technologies to protect themselves from excessive or unwarranted surveillance and those who could not. In discussing a "privacy-enhanced approach", Pete Bramhall's view was that in terms of the price of privacy-enhancing technologies:

the issue then becomes whether the providers of digital services would wish to price perhaps discriminatorily such that the privacy-sensitive services are at a higher price than the other ones. I think then perhaps it becomes a question for society as to how much it is willing to countenance the possibility of a privacy divide, as you described it.[150]

Privacy-enhancing processes: the role of the organisation

156.  Professor Anderson told us that having been involved in developing a number of "what would now be called 'privacy-enhancing technologies'", he had become "something of a sceptic". He argued that "they can be dressed up in various fancy ways, but at heart they are pseudonyms" and that their value was limited because organisations could profit from collecting personal information:

Companies do not want to deal with pseudonymous individuals, by and large, unless there is some premium in it for them. You can get prepaid credit cards, but they are significantly more expensive and the reason for this is that the information that is collected about you is valuable and it is used for price discrimination. So there are some market niches for privacy-enhancing technologies, but they are by no means the general solution to surveillance problems.[151]

157.  Professor Anderson also argued against the use of digital identities as a way of protecting privacy on the grounds that they allowed organisations which hold personal information to shift responsibility for protecting that information on to the individual, as guardian of his or her identity:

the rhetoric of identity becomes a means of passing the buck. In the old days, if someone went to the Midland Bank, pretended to be me and borrowed £10,000, that was impersonation and it was the bank's fault. Now, it is my identity that has been stolen, so it is supposedly my fault, and I end up having a furious row with the credit reference agencies. So the construction of the concept of 'identity' as something that belongs to me, that I have to protect with the help of government is not particularly helpful in this debate.[152]

We discuss identity management in the context of the Government's plans for identity cards below at paragraph 239.

158.  Professor Anderson told us that a better approach to protecting personal information would be to begin by thinking about:

the underlying business process of people, when they go to a government office, being dealt with in a fair and reasonable way; whether banks' transactions with their customers are regulated reasonably.[153]

Like Professor Anderson, Mr Bramhall emphasised the importance of processes in protecting privacy. These processes combined the technical aspects of designing systems for managing information, and the procedures used by organisations themselves:

Those processes are as much to do with management practice as they are to do with technology and, by themselves, those processes require some technology to help them as well.[154]

I do not think the issue is fundamentally one of the technology and its capability of addressing that issue; I think it is much more about education and awareness and people following good practice and, by that, I do not just mean the individual, but system designers following good practice.[155]

This approach would involve establishing clear routes to guidance for individual users and system designers, and means of redress for individuals whose personal information has been compromised.

159.  We welcome efforts to develop technological means by which organisations and individuals can protect personal information and prevent unwarranted monitoring of individuals' online activities. We recommend that the Government track and make full use of new developments in encryption and other privacy-enhancing technologies, in particular those which limit the disclosure and collection of information which could identify individuals. We further recommend that the resources of the Information Commissioner's Office be expanded to accommodate sufficient technical expertise to work with the Chief Information Officer in providing advice on the deployment of privacy-enhancing technologies in Government.

160.  We recognise, however, that awareness of and access to privacy-enhancing technologies is not universal amongst the public. Over-reliance on the capacity of technology to secure data systems leads to neglect of the need to ensure that processes for the management of information by organisations are robust. It also raises unrealistic and potentially discriminatory expectations of individuals who are not in a position to take steps to prevent the theft of their personal information.

161.  Where individuals have little or no choice about providing personal information, such as in their interactions with Government, it is especially important that the organisation which collects and holds the information takes responsibility for safeguarding it, rather than attempting to pass on the responsibility to the individual. The organisation's responsibility should begin before collection takes place: by obtaining consent for collecting and processing data where possible and by providing an explanation where this is not possible.

162.  The Home Office should work with the Information Commissioner to raise public awareness of how the Home Office collects, stores, shares and uses personal information. The Home Office should highlight the distinction between those areas in which individuals can exercise choice by giving or withholding their consent, and those areas in which seeking informed consent is not feasible and transparency is particularly important.

163.  The principle of restricting the amount of information collected to that which is needed to provide a service should guide the design of any system which involves the collection and storage of personal information. We recommend that the Government adopt a principle of data minimisation in its policy and in the design of its systems. We further recommend that the Government acknowledge the distinction between identification and authentication as one which is valuable in its efforts to adhere to this principle.
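The data minimisation principle recommended above can be illustrated at the design stage. In this hypothetical sketch (the service names and fields are invented for illustration), each service declares the fields it genuinely needs and the collection layer discards anything else before storage:

```python
# Hypothetical per-service declarations of the fields each service
# genuinely requires; anything not declared is never stored.
SERVICE_FIELDS = {
    "school-admission": {"child_name", "date_of_birth", "home_postcode"},
    "library-card": {"name", "home_postcode"},
}


def collect(service: str, submitted: dict) -> dict:
    """Keep only the fields the named service is entitled to store."""
    allowed = SERVICE_FIELDS[service]
    return {k: v for k, v in submitted.items() if k in allowed}


stored = collect("library-card", {
    "name": "A. Citizen",
    "home_postcode": "SW1A 1AA",
    "date_of_birth": "1970-01-01",   # not needed for this service; discarded
})
```

Making the entitlement explicit in the system's design, rather than relying on operators' restraint, is the sense in which the principle should "guide the design of any system" that collects personal information.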

164.  It is not just the volume of data collected that creates a problem: the longer information is retained, the more likely it is that the information will be out of date and inaccurate. Information should be held only as long as is necessary to fulfil the purpose for which it was collected. If information is to be retained for secondary purposes rather than service delivery it should normally be anonymised and retained only for a previously specified period.

The case for new safeguards

165.  Much of the evidence we received in our inquiry argued for new or strengthened safeguards for personal information and against unwarranted surveillance, on the grounds that:

existing safeguards were not sufficient to meet the demands of new circumstances created by the growth in capacity and sophistication of private and public sector systems for collecting and storing information

trends in terms of the public's willingness to give up information and the Government's enthusiasm for sharing it across departments gave rise to serious concerns

criminal activity involving the abuse of databases and personal information stored in other ways had intensified.

Tackling abuse of databases through criminal activity or negligence

Criminal activity

166.  A great many of those who submitted evidence to our inquiry raised concerns about the abuse of databases by criminals. Symantec told us that:

Data is one of the most important assets of any organisation and a valuable target for attackers. Identity-related information is becoming a valuable asset to criminals, resulting in both public and private sector databases containing sensitive information [becoming] increasingly vulnerable to attack.[156]

Hewlett-Packard Laboratories referred to "the present level of cybercrime and likely continuation or steepening of its rate of increase".[157] The Government's Chief Information Officer's assessment was that:

The more and more that the technology becomes sophisticated, we absolutely will be able to find people who are getting access to systems and using information illegally.[158]

167.  Professor Ross Anderson also commented on recent developments in relation to criminal use of technology to steal personal information:

The most recent innovations in crime have not been principally technological, but principally psychological because, as the technology gets better, so it becomes easier to deceive individuals, so we are seeing an enormous rise in phishing, in pretexting and other things that involves deceiving people. The criminals are not going to stop deceiving machines as well and we are going to see keystroke loggers, we are going to see the rise in pharming and we are going to see technical crimes going along with crimes that involve deceiving people.[159]

  Pretexting

  Pretexting involves impersonating someone, often using stolen personal data, to contact a business or individual in an attempt to gain access to information or money or to manipulate the business or individual in another way.

  Keystroke logging

  Keystroke logging is a method of recording what is typed on a keyboard, which can be used to capture passwords and other personal information.

  Pharming

  Pharming involves stealing information by diverting a genuine website's traffic to a bogus website designed to look and appear to function in the same way as the legitimate one.

168.  The Finance & Leasing Association and Experian outlined the steps that the credit industry has taken to support victims of identity fraud, or identity theft as it is often called. On average, Experian said, 100 victims of fraud contact its dedicated team each week:

By acting on a consumer's behalf and by co-ordinating any necessary activity the Experian service significantly reduces the amount of time it would normally take an individual to restore his or her credit history.[160]

The Foundation for Information Policy Research (FIPR) took issue with the term 'identity theft' and the measures in place to tackle the crimes described by the term.[161]

Increasing capacity to investigate and penalise criminal activity

169.  FIPR argued that "action is needed to make the Information Commissioner's Office more effective and to make proper penalties available for abuse ... the ICO has always been lacking in technical capability, which has undermined its credibility".[162] Symantec called for:

an urgent review of the Information Commissioner's Office powers ... in order to remove any existing limitations on the ICO's ability to investigate possible misuse of data and increase the legal and financial penalties for offences. Consideration should also be given to the staff and resources currently allocated to the Information Commissioner to ensure the ICO's continued effectiveness.[163]

170.  The Information Commissioner himself has sought support for a strengthening of his Office's powers to tackle criminal activity in relation to personal data. He told us that since embarking on his work to expose a black market in personal information he had seen "better penalties coming through from the courts using their existing powers" but noted that in the cases his office prosecuted, criminals "often ended up with derisory penalties: a conditional discharge for one of the most serious ones, or very, very low fines".[164]

171.  On 21 November 2007, the Prime Minister announced that the ICO would be given the power to "spot-check" government departments to ensure that they complied with Data Protection legislation.[165] He also announced that he had asked the Cabinet Secretary to undertake a review of the data handling procedures of departments and agencies. An interim progress report was published on 17 December 2007. The Ministry of Justice (MoJ) summarised the report's recommendations for data security and protection:

ensuring that departments are clear about roles, responsibilities and minimum standards that they must apply

reinforcing the culture across the public service that values and protects information and people's privacy

ensuring that performance is transparent and that the right external scrutiny mechanisms are in place to promote improvements into the future.

The MoJ also outlined the report's initial recommendations in relation to the framework within which data is handled across Government:

enhanced transparency with Parliament and the public about action to safeguard information and the results of that action, through departmental annual reports and an annual report to Parliament

increased monitoring of information assurance through, for example, Accounting Officers' Statements on Internal Control

improved guidance, simplified and better tailored, for those involved in data handling, setting clear common standards and procedures for departments on data security

legislative steps to enhance the ability of the Information Commissioner to provide external scrutiny of arrangements across the entire public sector through 'spot checks'

commitment in principle to provide for new sanctions under the Data Protection Act for the most serious breaches of its principles.[166]

Providing for developments in data storage, sharing and searching

Information-sharing

172.  At the outset of our inquiry we asked the Information Commissioner about developments in relation to information collected and held by the public sector. He responded:

I think you have to start the debate by recognising that there is a lot of pressure now for more information to be shared across different parts of the public sector. Sometimes that is not particularly controversial or not particularly difficult. In relation to the information between the tax people and the social security people, most of the population expect that goes on already ... The sharing of information between the tax authorities and the police authorities, or between the health authorities and the police authorities raises far more controversial and difficult issues ... you have to take a case-by-case approach.[167]

He went on to ask:

The Government talks about public services being more citizen-centric, and that is welcome, but is anyone seeing it from the point of view of the citizen in terms of all this information being collected and shared about them?[168]

173.  The Information Commissioner has since developed a Framework Code of Practice for Sharing Personal Information. Published on 10 October 2007, the Framework explains how organisations can set up their own arrangements to ensure that where personal information is shared, good practice is adopted. It aims to help organisations decide when to share information and what information to share, highlights the consequences of sharing and deals with the issue of consent.

174.  The Code is designed to be flexible, enabling organisations to adopt it wholesale or to extract some of its content and integrate this into existing policies and systems. The Information Commissioner's Office told us that this was the first time that a code which could be adapted and used to suit the needs of those involved in a particular information-sharing operation had been developed:

It reflects the fact that the range of situations in which information-sharing can take place is so broad that trying to develop a single prescriptive code ... would be unworkable.[169]

175.  Government departments take steps to safeguard databases and establish procedures for securing access to information by authorised staff, recognising that unauthorised access and other security breaches pose a risk. For example, the Department of Health's new electronic patient record system will be overseen by a National Information Governance Board, representing:

an extremely high-level and visible statement of the accountability for information governance.[170]

Other safeguards include:

a set of technical access controls and audit facilities that, along with the professional standards of staff in the NHS, safeguard sensitive patient information from inappropriate disclosure

a comprehensive privacy statement in the form of the NHS Care Record Guarantee, articulating in plain language precisely what NHS organisations must do to meet legal and policy requirements

the application of international security standards across all systems

the operation of stringent security controls—such as vetting, use of smartcards and pass codes, proof of a "legitimate relationship" to a patient as a pre-requisite for access to records, setting standards for use of records through codes of conduct and professional responsibilities—to prevent unauthorised access to personal information and to detect potential abuse.[171]

176.  The Department of Health acknowledged, however, that breaches of health databases were an ever-present danger:

You cannot stop the wicked doing wicked things with information and patient data.[172]

"Audit trails" were used to identify abuse of databases and formal disciplinary procedures were in place to deal with individual members of staff involved in such breaches.[173]

177.  The Department for Children, Schools and Families argued that for new databases such as ContactPoint, a clearly-defined purpose for the collection and sharing of information acted as a safeguard against 'function creep':

I do draw a distinction perhaps between education and the care of and welfare of children, and when it comes to systems like ContactPoint there is very clear regulation in place ... so I see no drift from that. ContactPoint is there for a very specific purpose and that is the backstop to what that system will be used for.[174]

178.  The Department for Transport emphasised the importance of clarity in establishing the legal basis and purpose of requests for it to share data:

we certainly do review the legal basis on which we do data-sharing. Most of our data-sharing is fairly long-standing but we would certainly want to know on what basis any approach was made to us and the legal justification ... We need to look at it from both ends of the telescope ... have they got the power to seek the information but also have we got the power to give it.[175]

179.  Whilst responsibility for the privacy aspects of individual policies rests with the departments which hold information, the Ministry of Justice works with departments to ensure that they comply with data protection legislation and to assist departments which propose to share information. MoJ considers the following issues to be "critical" in any decision on information-sharing:

is there a purpose for sharing information; do the powers exist to share the information; is any intrusion on privacy proportionate to the benefits that will be gained from sharing the data; and is the data going to be adequately protected in terms of the principles underlying the Data Protection Act?[176]

  'Walport Review'

On 25 October 2007 the Prime Minister asked Dr Mark Walport, Director of the Wellcome Trust, and Richard Thomas, the Information Commissioner, to conduct a review of the framework for the use of information in the private and public sectors.

The review will:

  consider whether there should be any changes to the way the Data Protection Act 1998 operates in the UK and the options for implementing any such changes

  provide recommendations on the powers and sanctions available to the regulator and courts in the legislation governing data sharing and data protection

  provide recommendations on how data-sharing policy should be developed in a way that ensures proper transparency, scrutiny and accountability[177]

 A consultation on these issues was launched on 12 December 2007 and closed on 15 February 2008. A report and recommendations are to be submitted to the Secretary of State for Justice in the first half of 2008.[178]

Security breaches and data loss incidents: strengthening non-regulatory safeguards

180.  The Information Commissioner works with private and public sector organisations, including Government departments, to ensure that the safeguards put in place by the Data Protection Act are complied with. The Commissioner told us that he had—by means of codes of practice and other guidance—enjoyed a degree of success in reducing the number of cases of abuse of surveillance technologies by organisations in some areas. One code of practice covered "all aspects of monitoring staff in the workplace": recruitment, personnel records, monitoring email and internet use and health checks. The Commissioner told us that whilst risks had not been eliminated:

the fact that we were able to secure an agreed code of practice [with the support of the Trades Union Congress and the Confederation of British Industry]—we got agreement and we pushed this very hard around the employer community, and the trade unions have taken it seriously too—shows that in this particular context of the workplace—and data protection creeps everywhere, it is a horizontal law—the risks of excessive surveillance have been very substantially reduced because of our code of practice.[179]

181.  A significant strand of the Commissioner's recent work to strengthen safeguards against excessive surveillance has been research into and development of guidance on Privacy Impact Assessments (PIAs). Envisaged in the first instance as a voluntary step by Government, PIAs would involve:

An attempt by the organisation which is going to be collecting information in new or enlarged ways to record what they are going to do, why they are going to do it, how they are going to do it, to identify the various risks associated and to spell out publicly how they are going to mitigate those various risks. It is a discipline. It is a sort of risk management or risk assessment programme.[180]

The Commissioner emphasised that PIAs should not represent a "bureaucratic intervention".[181]

182.  PIAs are a requirement for federal agencies in the United States. Mr J Trevor Hughes of the US-based International Association of Privacy Professionals, told us that those who were engaged in completing PIAs were:

very supportive of and enthusiastic about such measures as a transparent tool not only for governmental data but for privacy professionals who use this tool to assist in the development, deployment and design of these products and services and allow citizens a way to look into the operations of their government to see how things are working.[182]

During our visit to the United States, Hugo Teufel, Chief Privacy Officer at the Department of Homeland Security, told us that PIAs were made available on the internet and that they had to be updated every two years, with a reassessment of the need for the systems and the information collected by those systems.

183.  MITRE, a not-for-profit corporation which works for the US Government and other sponsors and specialises in systems engineering and information technology, undertook research into the development and use of PIAs. Members of MITRE's privacy team acknowledged that whilst PIAs should be integrated with the design stage of a project, too often they were not carried out until the end of the process, when it was more burdensome to resolve problems. When properly implemented, we heard, PIAs provided an iterative process for assessing privacy, which would help to ensure that privacy concerns played a substantial part in the design of any system, and a 'gatekeeper' stage which would prevent a project going ahead until the PIA documentation had been approved.

184.  Dr Ian Forbes of the Royal Academy of Engineering was more sceptical of the value of PIAs as a safeguard against the risks posed by security breaches or excessive surveillance. He argued that in their current form PIAs were not thorough risk assessments but rather that:

Mostly they seem to be compliance statements or best practice statements. I do not think any of them actually say, "This is your privacy and this is how it will impact upon it for good or ill".[183]

The Royal Academy of Engineering argued in its written evidence that:

PIAs may be useful in ensuring that government policies and their implementation do not infringe excessively on people's privacy. However, it is by no means certain that they will prove effective and they may well hinder the development of ICT projects.[184]

Security breaches and data loss incidents: calls for more stringent regulation

185.  The Information Commissioner argued for an extension of his powers of audit and inspection under the Data Protection Act, in order better to tackle both illegal activity and weaknesses in the steps taken by organisations to protect the information they hold. Whilst he saw "self-interest at work" as organisations were "working very hard" to prevent breaches, the Commissioner told us that there was also "a lot of complacency". The ICO found that "people are really quite shocked to find out how easily their systems have been breached".[185]

186.  The Commissioner also called for further regulatory safeguards to prevent security breaches caused by poor information-handling practices, with penalties where there is flagrant, negligent or repeated disregard of the requirements of the law. The Commissioner said:

I do not want to prosecute left, right and centre, but I would like there to be a deterrent and, in the extreme case, where there had been unacceptable disregard of the regulations, to be able to go to court and have a system of fines to sanction that behaviour.[186]

187.  Professor Anderson argued that privacy was largely a policy matter rather than a technology matter because "privacy intrusions generally stem from the abuse of authorised access by insiders or from failures to regulate such access properly".[187] He linked security breaches not with inadequacy of technological safeguards but with a lack of incentive to protect information:

One of the things that we have learnt over the past six or seven years is that, when systems fail, they largely do so because incentives are misaligned and classically because some of the persons who guard a system are not the persons who bear the full economic costs of failure.[188]

Conclusion: curbing unnecessary surveillance and protecting privacy

188.  The Home Office says that it takes a "proactive approach to protecting information". As part of its reform programme the Home Office developed a corporate strategy for information, systems and technology which recognised information assurance as "one of the top cross-cutting themes". In August 2007 the Home Office initiated an information assurance review which it expects to be completed by March 2009, in line with its implementation programme for the Cabinet Office review of data handling procedures in Government.[189]

189.  We welcome the reviews commissioned by the Government to improve data security, particularly in relation to information-sharing. We expect the Government to make full use of the opportunity these reviews provide to reassess the adequacy of the definitions and principles set out in the Data Protection Act. Such a reassessment should be carried out not only in light of recent data loss incidents but also against the challenges presented by increases in the collection, storage and sharing capability of information systems and intensification in criminal activity associated with the misuse of personal information. The Home Office must act as a matter of urgency to tackle these challenges.

190.  Any increase in the collection and storage of information increases the risk that security will be breached and that information will be used for purposes other than those for which it was collected. In keeping with a principle of data minimisation, more rigorous risk analysis of systems already in place must be carried out before new techniques for collecting information are deployed or new databases planned. The decision to create a major new database, share information on databases, or implement proposals for increased surveillance should be based on a proven need.

191.  We commend the Information Commissioner for his work on Privacy Impact Assessments and support his drive to ensure that Government and others undertake thorough evaluation work in relation to the benefits and risks of surveillance. We also acknowledge that if published, in providing individuals and interest groups with details about surveillance activities which would not otherwise be made available, PIAs could help to raise awareness of the issues the Information Commissioner has sought to highlight.

192.  We are concerned, however, that PIAs might be regarded simply as bureaucratic exercises, and that they would be undertaken not before and during the design phase of any system but afterwards, by which time their value as a practical risk assessment tool would have been lost. For PIAs to be effective, they should be used to carry out preliminary risk analysis for a new project before the design phase begins. For Government departments and agencies this preliminary risk analysis should culminate in a summary statement, to be signed off by the Information Commissioner or otherwise subject to independent audit. The statement should set out the benefits of a new system against the risks posed by collecting, storing and using the information required by the system.

193.  Every system for collecting and storing personal information should be designed with a focus on security and privacy. The design process should involve planning not only in relation to the technical aspects of access to systems but also to the staff management protocols for access and information-handling.

194.  Every system for collecting and storing data is susceptible to unauthorised access, misuse and theft. For existing and proposed systems the Government should specify what it considers to be an acceptable level of failure and develop contingency plans to mitigate the damage caused by leaks or theft of data.

195.  The weakest aspect of a system may be the establishment and enforcement of protocols for access and use rather than any technological safeguard. Organisations which manage such systems must take full responsibility for limiting access to databases and the information they contain and for enforcing procedures for sharing and transferring data. We support the Information Commissioner's call for an extension of his inspection and audit powers to facilitate the strengthening of these procedures across Government and the private sector. Tougher penalties for negligent information-handling should be introduced in order to make clear where the burden of responsibility lies.

196.  A privacy officer or director of data security should be assigned by departments to take responsibility for risk analysis and to report to the Permanent Secretary on the privacy implications and safeguards of each project which involves the collection or sharing of personal information.

197.  The Home Office should publish a report on an audit of the data collections managed by the Department and its agencies, outlining as far as possible without compromising security the technological and procedural safeguards currently in place.

 


123   See below at paragraph 311.

124   Information Commissioner's Office press release, "ICO invites tenders to review EU Data Protection Law", 14 April 2008

125   Home Affairs Committee, Third Report of Session 2006-07, Justice and Home Affairs Issues at European Union Level, HC 76

126   Ev 199

127   Information Commissioner's Office, Framework Code of Practice for Sharing Personal Information; CCTV Code of Practice: revised edition 2008

128   Ev 253

129   Q 79 (Richard Thomas)

130   Information Commissioner's Office, Report for 2006-07, HC (2006-07) 646, p 6

131   Justice Committee, First Report of 2007-08, Protection of Private Data, HC 154, p 4

132   Source: www.cookiecentral.com

133   Communication to the European Parliament and the Council (COM (2007) 228 final, 2 May 2007)

134   Ev 157

135   European Commission, A Fine Balance: Privacy Enhancing Technologies: How to Create a Trusted Information Society - summary of conference, November 2007, pp 17-19

136   Royal Academy of Engineering, Dilemmas of Privacy and Surveillance: Challenges of Technological Change (March 2007), p 39

137   Ev 178

138   Ibid.

139   Ev 177

140   Ev 273

141   Ev 272

142   Ev 273

143   Q 205 (Pete Bramhall)

144   Q 205 (Dr Phippen)

145   Q 197 (Dr Phippen)

146   Q 210 (Pete Bramhall)

147   Q 197 (Dr Phippen)

148   Ev 161

149   Ibid.

150   Q 206 (Pete Bramhall)

151   Q 204 (Professor Anderson)

152   Q 209 (Professor Anderson)

153   Ibid.

154   Q 204 (Pete Bramhall)

155   Q 209 (Pete Bramhall)

156   Ev 156

157   Ev 178

158   Q 415 (John Suffolk)

159   Q 214 (Professor Anderson)

160   Ev 229

161   See above at paragraph 101.

162   Ev 226

163   Ev 156

164   Q 59 (Richard Thomas)

165   HC Deb, 21 November 2007, col 1179

166   Ev 270

167   Q 34 (Richard Thomas)

168   Q 40 (Richard Thomas)

169   Ev 258

170   Q 327 (Richard Jeavons)

171   Ev 220

172   Q 334 (Richard Jeavons)

173   Q 327 (Richard Jeavons)

174   Q 352 (Tim Wright)

175   Qq 357-358 (Dr Stephen Hickey)

176   Q 394 (Clare Moriarty)

177   Source: http://www.justice.gov.uk/reviews/datasharing-intro.htm

178   Ev 270-1

179   Q 62 (Richard Thomas)

180   Q 69 (Richard Thomas)

181   Ibid.

182   Q 104 (J Trevor Hughes)

183   Q 268 (Dr Forbes)

184   Ev 165

185   Q 68 (Richard Thomas)

186   Q 64 (Richard Thomas)

187   Q 195 (Professor Anderson)

188   Q 186 (Professor Anderson)

189   Ev 274


© Parliamentary copyright 2008
Prepared 8 June 2008