Memorandum submitted by Dr Ian Forbes
This submission is intended to complement the
evidence submitted to the Committee by the Royal Academy of Engineering.
Dr Ian Forbes was a member of the Royal Academy's Working Party
on Privacy and Surveillance, and a contributor to the Academy's
Report, Dilemmas of Privacy and Surveillance: Challenges of
Technological Change. His submission includes additional reflections,
in response to the specific concerns of the Committee's Inquiry.
The organising principle of the Report of the
Royal Academy of Engineering is that protecting privacy, achieving
greater levels of security and maximising utility will always
generate dilemmas for individuals, government and organisations.
The development and use of technologies leading to a so-called
surveillance society are associated with a wide range of dilemmas.
Nevertheless, efforts to strike satisfactory balances are essential,
and can be successful. The costs of not recognising and addressing
these dilemmas include a decline of public trust, inefficient
allocation of resources, and avoidable failures.
The design of any system that
collects and processes personal data must have the protection
of privacy as a primary focus.
IT projects that include the
collection and processing of large amounts of data must have thorough
risk assessment procedures and effective implementation mechanisms.
Accepting that failures will
occur, systems should incorporate appropriate procedures for dealing with them.
Make reciprocity a design feature.
People occupy many roles, so
it should always be possible for an individual to keep these roles
separate, and to preserve the distinction between identification
and authentication.
Data sharing should only be carried
out when there is an explicit need and reason.
Personal data should only be
used for the purposes for which consent has been given.
In general, public agencies
should not be allowed access to private databases.
Public record databases should
be under the control of autonomous agencies, not government.
Penalties for misuse and abuse
of personal data should reflect the damage and distress that the
system failure or crime causes.
CCTV, SOCIAL GOODS
New and emerging technologies which explore
and exploit the capacity to collect, store and manipulate data
about citizens and their behaviour are widely deployed by industry
and by the police and security services. Much of the debate, and
much policy, focuses on the security aspects of the way that organisations
use these technologies for profit and convenience. In relation
to crime and security, the emphasis is almost entirely on the
surveillance of public spaces. This is especially true for surveillance technologies
involving cameras. There are serious concerns about the proliferation
of this technology, and the rapidly evolving capacity to store,
interpret and transmit human images digitally. At present, these
technologies are largely restricted to users who want to prevent,
monitor and sometimes punish certain behaviours, despite the lack
of evidence that surveillance alone is effective. Apart from the
problems associated with general invasions of privacy, specific
problems of predictive profiling of some sectors of the community
arise with the increased capacity to identify individuals, and
target apparently "deviant" or "unusual" behaviour.
The design assumptions that are built into these technologies
are as important as the assumptions of the human operators of
these systems. Failures in any of these systems expose the fragility
of public trust, and can contribute to a lack of trust not just
in the systems, but in government and its agencies.
Hardly any attention has been paid to the positive
uses of this technology. Communities have long had a justifiable
interest in their public spaces, in who uses them and how. However,
local communities and citizens under surveillance have few if
any opportunities to see and learn from what the vast number of
cameras see. The uses and benefits of this technology are currently
under the control of the operators, who effectively own the images
and data of citizens without having gained their consent. Unless
and until ordinary citizens are given an active stake and a determining
say in the processes and practices of camera surveillance, new
and socially beneficial uses of these surveillance technologies
will emerge only very slowly, or not at all. A new approach is
needed, which introduces a climate of candour and a requirement
of reciprocity, so facilitating creative input from communities
and citizens. Finally, creating opportunities for citizens to
contribute to the design and use of these systems will help broaden
the basis for trust.
The right to conduct surveillance
should generate reciprocal rights for those under surveillance.
Purposes, placement, conditions
of use, operating practices and personnel should, by law, be subject
to consultation, agreement and challenge by those under surveillance.
The full range of policy tools should be employed:
initiate new legislation to
set high design standards, require best practice implementation
and make compensation for system failures routine and costly to
operators; in other words, construct an effective incentive structure;
increase the powers of the
Information Commissioner (IC), including audit powers and greater penalties;
establish a new body to oversee
the collection, retention and use of bioinformation (including
DNA profiles, fingerprints, facial images and so on);
encourage and reward best practice in industry;
government and its agencies
need to set the highest standards;
introduce reciprocal rights
for those who supply personal data in any form;
facilitate debate on privacy
and security dilemmas; and
inform and consult widely on proposed uses of personal data.
PRIVACY IMPACT ASSESSMENTS
Privacy impact assessments (PIAs) are not a proven mechanism
for producing effective change or reliable information.
They may have the unintended consequence of
diverting energies into a new bureaucratic procedure, and
a new wave of consultants, that fails to lead to productive
change. (The experience of environmental impact statements (EIS)
is instructive.) PIAs, in other
words, could work against privacy.
Many PIAs would quickly gravitate toward being
a standard, defensive document, containing:
Predominantly obvious conclusions,
with similar findings reproduced in almost all PIAs.
Disclaimers about important
aspects of privacy impact which are characterised by uncertainty.
An assessment that, by its nature, can never
identify an unintended and unforeseeable consequence.
Assumptions that all other things
remain constant. Changes of circumstance, technology, legislation
and practice could vitiate any PIA at any point after its completion.
Monitor the introduction of PIAs in Canada in
order to assess their efficiency in protecting privacy, their
bureaucratic efficacy and opportunity costs.
Personal data should never be
stored in unencrypted form.
The minimum amount of data should
be kept for the minimum amount of time.
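The retention principle above can be illustrated with a simple purge routine. This is a minimal sketch, not part of the Academy's recommendations; the function name, record layout and the retention figure passed in are illustrative assumptions:

```python
import datetime

def purge_expired(records, retention_days, now=None):
    """Keep only records collected within the retention period.

    `records` is a list of (collected_at, payload) pairs, with
    timezone-aware timestamps recorded at the point of collection.
    """
    now = now or datetime.datetime.now(datetime.timezone.utc)
    cutoff = now - datetime.timedelta(days=retention_days)
    # Anything collected before the cutoff is dropped, not archived.
    return [(ts, payload) for ts, payload in records if ts >= cutoff]
```

Run routinely, such a rule ensures that holding data beyond its stated purpose requires a positive decision rather than happening by default.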
Personal data in large databases
should be checked regularly with data subjects to ensure that
they are accurate.
If a database contains personal
data about many people, or vulnerable people, the database access
software should be developed to very high standards of security.
If data are lost, individuals
affected must be informed and compensated swiftly.
Systems should be designed to
keep an automatic audit record of when the data are accessed and by whom,
and especially of when data are changed.
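A minimal sketch of such an automatic audit trail, assuming a relational store; the table layout, operator identifiers and helper names here are hypothetical, not drawn from any deployed system:

```python
import sqlite3
import datetime

def open_db():
    """In-memory database standing in for a real records store."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT)")
    # Append-only audit trail: who touched which record, when, and how.
    db.execute("""CREATE TABLE audit_log (
        ts TEXT NOT NULL,
        operator TEXT NOT NULL,
        action TEXT NOT NULL,       -- 'read' or 'change'
        record_id INTEGER NOT NULL)""")
    return db

def log_event(db, operator, action, record_id):
    db.execute("INSERT INTO audit_log VALUES (?, ?, ?, ?)",
               (datetime.datetime.now(datetime.timezone.utc).isoformat(),
                operator, action, record_id))

def read_record(db, operator, record_id):
    # Every access is logged before the data are returned.
    log_event(db, operator, "read", record_id)
    row = db.execute("SELECT name FROM records WHERE id = ?",
                     (record_id,)).fetchone()
    return row[0] if row else None

def change_record(db, operator, record_id, new_name):
    # Changes are logged with the same mechanism, so the trail
    # distinguishes reading a record from altering it.
    log_event(db, operator, "change", record_id)
    db.execute("UPDATE records SET name = ? WHERE id = ?",
               (new_name, record_id))
```

The essential design choice is that logging happens inside the only functions that touch the data, so no access path can bypass the trail.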
Profile-based decision systems
should be open, accountable, contestable and non-discriminatory.
The national DNA database should
be used only to store the DNA profiles of those individuals involved
in criminal proceedings.