THE LIMITS OF TECHNOLOGICAL SOLUTIONS
338. The limits of PETs are still being explored
and debated by information specialists and lawyers. The RAE report
said that whilst it was "not possible to guard against all
conceivable ways of invading privacy … it is possible to
'design out' unnecessary compromises of privacy."[169]
339. Technological solutions, if pursued
within a wider design framework, may help to limit surveillance
and protect privacy, but they should not be seen as a stand-alone
solution. This is because the specific rules, norms and values--for
example, data minimisation, access controls, and the means of
anonymity--that may be built into technological systems must come
from outside those systems themselves. We believe it is
important to avoid assuming that a "technological fix"
or "silver bullet" can be applied to what are essentially
social and human rights issues.
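The principle of data minimisation referred to above can be made concrete in outline: a system retains only the fields declared necessary for a stated purpose, with the rule itself supplied from outside the system, by policy or law. The sketch below is illustrative only; the field names, purpose label, and `minimise` helper are hypothetical and not drawn from any system discussed in this report.

```python
# Illustrative sketch of data minimisation (all names hypothetical).
# The rule -- which fields a given purpose may retain -- comes from
# outside the system, e.g. from policy or statute.
ALLOWED_FIELDS = {
    "appointment_booking": {"name", "postcode"},
}

def minimise(record, purpose):
    """Discard every field not declared necessary for the stated purpose."""
    allowed = ALLOWED_FIELDS[purpose]
    return {field: value for field, value in record.items() if field in allowed}

record = {
    "name": "A. Citizen",
    "date_of_birth": "1970-01-01",
    "postcode": "SW1A 0AA",
    "medical_notes": "(not needed for booking)",
}
stored = minimise(record, "appointment_booking")
# Only the declared fields survive storage.
assert stored == {"name": "A. Citizen", "postcode": "SW1A 0AA"}
```

The point of the sketch is that the code merely enforces the rule; deciding which fields are "necessary" for a purpose remains a social and legal judgment, as the paragraph above notes.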
340. Professor Martyn Thomas, independent
consultant and representative of the UKCRC, told us:
"There is a fundamental weakness at the
heart of the transformational government agenda which is that
you cannot build large databases that are accessible to a wide
number of people and maintain a high degree of security …
it is very difficult to build a database that is technically secure
on top of commercially available, off-the-shelf software components,
because almost all of them were not designed to support such a
use, and to connect such a database to the internet simply creates
a honey pot that virtually guarantees that the data will be extracted
from it in a way that was not planned for or intended." (Q 407)
341. He pointed to a specific obstacle in the
way of better security protection for personal data:
"There is guidance in the Manual of Protective
Security on how to carry out impact assessments on what the likely
impact is of loss of personal data and on how such data should
be protected. That manual is classified. As a consequence, it
has not been peer-reviewed because it is only available to people
whom government departments believe have a need to inspect it …
I would expect that that peer review would lead to significant
strengthening of the protection that was required of personal
data because it would be seen to be clearly inadequate."
(Q 407)
342. In the interests of strengthening the
protection of personal data, we urge the Government to make the
Manual of Protective Security subject to regular and rigorous
peer review.
343. Going beyond the application of technological
remedies, Professor Thomas outlined what needs to be done
if privacy is to be taken seriously:
"It requires proper hazard analysis
and then an appropriate set of protections to be put in place
to address each of the hazards … It means using the appropriate
technical … [and] social means to ensure that, firstly, you
have understood the level of privacy that you are seeking, what
level of breaches of confidentiality do you regard as tolerable …
that you actually build the business processes, the social
systems, the training and the technology to deliver that level
of confidentiality in the systems … At the moment, that analysis
appears not to be being done. There is no technical barrier to
it being done, but it would lead to a lot of systems turning out
to be a lot more expensive or not practical." (Q 416)
344. The importance of improving the technological
safeguards for privacy has been underscored by the Council for
Science and Technology (CST) in their plea for further research
into PETs, including techniques for anonymising data, encryption,
and countering viruses.[170]
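One of the anonymisation techniques of the kind the CST calls for research into can be sketched briefly. Pseudonymisation, replacing a direct identifier with a keyed hash, is one common approach: the sketch below is illustrative only (the key and identifier are invented, and real deployments raise further issues such as key management and re-identification risk), using HMAC-SHA256 from the Python standard library.

```python
import hashlib
import hmac

def pseudonymise(identifier: str, key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (a pseudonym).

    Without the key the mapping cannot feasibly be reversed; with it,
    the same identifier always yields the same pseudonym, so records
    can still be linked for analysis without exposing the identifier.
    """
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical key, held separately from the pseudonymised data set.
KEY = b"example-key-held-by-the-data-controller"

p1 = pseudonymise("AB123456C", KEY)  # an invented reference number
p2 = pseudonymise("AB123456C", KEY)
assert p1 == p2                # consistent linkage across records
assert "AB123456C" not in p1   # the raw identifier does not appear
```

The design choice here is the keyed hash: a plain unkeyed hash of a low-entropy identifier can often be reversed by exhaustive guessing, whereas reversal of the keyed form requires access to the separately held key.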
345. In the light of the potential threat
to public confidence and individual privacy, we recommend that
the Government should improve the safeguards and restrictions
placed on surveillance and data handling.
346. Toby Stevens, Director of the Enterprise
Privacy Group, outlined current developments in industry regarding
privacy technology:
"The industry is focused very hard on this.
The problem that they often seem to stumble up against is the
lack of a common framework, a common language, a common understanding
of what the problems are and what the desired outcomes look like …
To date, most of the privacy-enhancing technology programmes
that we have seen over recent years have failed, either due to
lack of interoperability between those that roll them out or a
lack of perceived consumer demand. That does not mean it is not
there, but the consumers have failed to understand what it is
they are being offered." (Q 297)
347. This situation is likely to inhibit government
procurement of privacy-enhancing technology, which, in the view
of Philip Virgo, Secretary General of EURIM (the European Information
Society Group), is already compromised by the life-cycle of the
procurement process and the effect of the "churn" of
ministers and officials on project specifications. (Q 303)
Toby Stevens also thought that "government procurement does
not reflect good privacy practice in general." (Q 303)
348. As the state is the main single "customer",
public sector procurement specifications have an important influence
on system design. The CST suggested that government, as the major
procurer of information technology services and systems, should
use procurement specifications to effect improvements in security.[171]
In order for this to happen, as Professor Sasse told
us:
"It is the people who are commissioning
and paying for the system who should have to be clear about what
their security requirements are. Ultimately, the company who is
building the thing will only give the customer what they ask for.
They may raise a few points but currently we really have a problem
that the customers often do not articulate their security requirements,
they do not think about them." (Q 415)
349. We recommend that the Government review
their procurement processes so as to incorporate design solutions
that include privacy-enhancing technologies in new or planned
data gathering and processing systems.
114 Gordon Brown MP, Speech on Security and Liberty, op. cit.
115 Transformational Government-Enabled by Technology, op. cit., para 39(4); Cabinet Office, Transformational Government-Implementation Plan, March 2006, paras 53-58.
116 Transformational Government-Implementation Plan, op. cit., paras 66-67.
117 Sir James Crosby, Challenges and Opportunities in Identity Assurance, March 2008, p 3 and para 1.5.
118 Dilemmas of Privacy and Surveillance: Challenges of Technological Change, op. cit., section 7.1.2.
119 Information Sharing Vision Statement, op. cit.
120 Data Handling Procedures in Government: Final Report, op. cit.; Review of Information Security at HM Revenue and Customs, op. cit.; Report into the Loss of MOD Personal Data, op. cit.
121 A Surveillance Society?, op. cit., para 163.
122 See for example http://www.homeoffice.gov.uk/passports-and-immigration/id-cards/how-the-data-will-be-used/
123 Biometrics Assurance Group, Annual Report 2007.
124 1st Annual Report of the Ethics Group: National DNA Database, op. cit.
125 National CCTV Strategy, op. cit., Chapter 3.
126 Data Protection and Human Rights, op. cit., p 3, and para 27.
127 See Box One above.
128 Appendix 4, para 4.
129 Data Protection and Human Rights, op. cit., para 26.
130 Government Response to Data Protection and Human Rights, op. cit., pp 6-7.
131 Cabinet Office, Data Handling Procedures in Government: Interim Progress Report, December 2007, paras 7-12.
132 Data Handling Procedures in Government: Final Report, op. cit., paras 7-9 and Section 2.
133 ibid., Section 2.
134 ibid., Annex I.
135 Data Sharing Review Report, op. cit., Foreword, Chapter 7, and paras 5.21, 5.26, 5.30, 8.28.
136 Data Sharing Review Report, op. cit., paras 1.4, 5.28-5.29, 8.3-8.4.
137 Data Handling Procedures in Government: Final Report, op. cit.
138 Protecting Government Information-Independent Review of Government Information Assurance (The Coleman Report), June 2008, Chapter 3.
139 ibid. See especially p 7.
140 Data Handling Procedures in Government: Final Report, op. cit., pp 39-40.
141 Data Sharing Review Report, op. cit., para 5.5.
142 See also A Report on the Surveillance Society, op. cit., sections 45.1-45.2.
143 See http://www.ico.gov.uk/for_organisations/topic_specific_guides/pia_handbook.aspx
144 Williams V, "Privacy Impact Assessment and Public Space Surveillance", 2007.
145 Appendix 4, para 77.
146 ibid., paras 50-51.
147 Data Handling Procedures in Government: Final Report, op. cit., para 2.11.
148 ibid., p 40.
149 Data Sharing Review Report, op. cit., para 5.5.
150 ibid., para 8.43.
151 Response to the Data Sharing Review Report, op. cit., p 17.
152 A Passenger Name Record (PNR) holds many details about a passenger, including name, age, details of contact, ticketing and payment, frequent flyer details, special meals or personal assistance needs, passport details, itinerary, etc.
153 LACORS Parliamentary Briefing Document on the Draft Consolidating Orders on the Regulation of Investigatory Powers Act 2000 (RIPA), June 2008, pp 4-5.
154 Annual Report of the Chief Surveillance Commissioner, op. cit., para 9.2.
155 Covert Surveillance-Code of Practice, op. cit.; Home Office, Acquisition and Disclosure of Communications Data-Code of Practice, 2007; Home Office, Investigation of Protected Electronic Information-Code of Practice, 2007; Home Office, Covert Human Intelligence Sources-Code of Practice, 2002; Home Office, Interception of Communications-Code of Practice, 2002.
156 Report of the Interception of Communications Commissioner, op. cit., para 3.25.
157 Annual Report of the Chief Surveillance Commissioner, op. cit., para 11.1.
158 Data Protection and Human Rights, op. cit., para 21.
159 This seeks to ensure that organisations give due consideration to data protection prior to the development of new initiatives. See Enterprise Privacy Group, Privacy by Design-An Overview of Privacy Enhancing Technologies, November 2008.
160 See Lessig L, Code and Other Laws of Cyberspace, 1999; Reidenberg J, "Lex Informatica: The Formulation of Information Policy Rules Through Technology", Texas Law Review, Vol 76, No. 3 (February 1998), pp 553-93.
161 Information Commissioner's Office, Data Protection Guidance Note: Privacy Enhancing Technologies (PETs), March 2007; and Information Commissioner's Office, Privacy by Design Report Recommendations: ICO Implementation Plan, November 2008.
162 Information Commissioner's Office, Information Commissioner's Response to the Cabinet Office Consultation on 'Transformational Government: Enabled by Technology', February 2006, pp 4-5.
163 Protection of Private Data, op. cit., Oral Evidence, Q 40.
164 Review of Information Security at HM Revenue and Customs, op. cit.
165 Dilemmas of Privacy and Surveillance: Challenges of Technological Change, op. cit., Chapter 7.
166 ibid., pp 37-38.
167 Home Office, National Identity Scheme: Delivery Plan 2008, p 25.
168 Data Handling Procedures in Government: Final Report, op. cit., pp 16-18.
169 Dilemmas of Privacy and Surveillance: Challenges of Technological Change, op. cit., p 37.
170 Council for Science and Technology, Better Use of Personal Information: Opportunities and Risks, November 2005, paras 40-47.
171 ibid., para 44.