Examination of Witnesses (Questions 368-379)
Professor Angela Sasse, Professor Martyn Thomas and
Dr Ian Forbes
27 FEBRUARY 2008
Q368 Chairman: Dr Forbes, Professor
Thomas, Professor Sasse, may I welcome you to the Committee and
thank you very much for coming. We are being recorded, but not
televised, so could I please ask you to state your names and organisations
for the record?
Dr Forbes: My name is Ian Forbes and
I am a consultant with fig one Consultancy and I am representing
the Royal Academy of Engineering.
Professor Thomas: I am Martyn Thomas.
I am an independent consultant software engineer. I was on the
Royal Academy of Engineering Study Team and I also, with Professor
Sasse, submitted the evidence from UKCRC.
Professor Sasse: My name is Angela Sasse.
I am a professor at University College London and I am representing
UKCRC.
Q369 Chairman: Before we proceed
to questions, would any or all of you like to make a preliminary
opening statement?
Dr Forbes: I would not mind doing that.
It would be on the basis of the report of the Academy. The
organising principle of that report is that protecting privacy,
achieving greater levels of security and maximising utility will
always generate dilemmas for individuals, governments and organisations.
The development and use of technologies leading to a so-called
"surveillance society" are associated with a wide range
of dilemmas. Nevertheless, efforts to strike satisfactory balances
are essential and can succeed. The costs
of not recognising and addressing these dilemmas include threats
to, and a decline in, public trust in some of these areas, inefficient
allocation of resources and avoidable failures.
Q370 Chairman: Thank you very much.
Perhaps I could start by asking about the UKCRC's written evidence
which says " ... no aspect of any individual's life will
be wholly private in future, unless effective measures are introduced
to limit the use of the technology that is available now or ...
in future". Could I ask whether such limits would be most
appropriately placed on data collection, processing or the uses
to which the data are put?
Professor Sasse: It is all three, but
the emphasis has to be on collection. Once you have collected
the data, it takes resources, effort and know-how to protect it,
and those mechanisms can always fail, particularly
when there is mission creep; when there are competing demands,
those safeguards may turn out to be inadequate. Also, once
data have been collected, developments in technology may make
possible things that you did not think were possible at the time
of collection. For instance, when the DNA database
was set up, the possibility of familial screening did not exist.
Q371 Chairman: What kind of effective
measures do you envisage and would they require legislation?
Professor Thomas: If I may, I would just
like to add a couple of things to what Angela has said. One of
the problems of collecting substantial amounts of data and then
retaining it for a period is that you can carry out all sorts
of correlations between data that were never envisaged before
and that can reveal all sorts of aspects of individuals' lives
which probably were not apparent to them at the time when they
gave consent, if they ever did, for that data to be collected.
Therefore retaining data, and in particular sharing data so that
it can be correlated, undermines the principle of informed consent.
On the sort of safeguards that could be put in place, it is quite
hard. Enforcing the real letter of the Data Protection Act, requiring
that only the minimal amount of data is collected for the purpose
for which that data has been said to be collected, would
have a profound effect, and it clearly is not happening at the
moment. There are just trivial examples, like the way that the
Oyster card is collecting all sorts of data about people's travel
patterns, which are not really necessary in order to carry out
the functionality of providing a pre-payment card for travel.
You can see many, many examples where, for example, people's names
and addresses are collected when in fact all you need to do is
to accredit them to be able to carry out some activity. That creates
the fundamental privacy issue, because now you have identifiable
personal data whereas previously you had data which it would have
been a bit more laborious to turn into identifiable personal data.
Two issues there really: one is restriction on collection and
one is restriction of further processing and retention.
Q372 Lord Peston: Wearing my former
professorial hat, what you are saying seems to me to make life
very difficult for social scientists who have always had to get
by on data not usually collected for them. What you are suggesting
is actually stopping them using this data. Most of us who have
done research in these fields tend to say "Ah, but I could use
that data and correlate it with that data and then I might get
some results". For most of my research lifetime, the idea
that I would have to be involved at the beginning and get permission
and all of that, would make social science nearly impossible, it
seems to me. To take your theme of travel patterns, there are
social scientists generally interested in research into travel
patterns and to be told that there is data, but because it was
not collected for that purpose they cannot have it ... I would
fight like mad. I put it to you: do we not have responsibilities
as scientists in the social area to fight that and not accept
it?
Professor Thomas: There are some fundamental
principles that need to be addressed. One is the principle of
informed consent and the other is the issue of identification.
Q373 Lord Peston: The latter I accept,
but the informed consent ... If I am told that I cannot do research
in this area because I did not get informed consent in the first
place from the individuals whose data are in there, even though
they will not be identified, then I cannot do the research. I feel
that that is incredibly hostile to the social scientist.
Professor Thomas: But the two are linked;
if you cannot identify the individuals, then you do not need the
consent because it is not personal data.
Professor Sasse: The DPA only covers
personal data.
Q374 Lord Peston: So as long as the
data is aggregated, are you saying you would never need to ask
for informed consent?
Professor Thomas: Ideally.
Q375 Lord Peston: Supposing I was
to say on my Oyster card that I do not give consent when I buy
my Oyster card. You are not suggesting that that should be how
it works, are you?
Professor Thomas: No. What I am suggesting
is that you could try to devise a way of setting up an Oyster
card scheme that does not identify the individual. My Oyster card
is not easily identified to me because I have never registered
it and I have only ever topped it up with cash. It could be correlated
with my mobile phone records.
Lord Peston: How is your terrorist activity
going?
Q376 Baroness O'Cathain: I was very
interested in the point about the mission creep implications of
all this. Of course, whereas you are right constitutionally about
informed consent, et cetera, there seems to be now an overwhelming
and overriding consideration due to terrorism and the so-called
war on terrorism. At the base of all of this now is that the surveillance
society equals our ability to stop crime or to avoid terrorist
attacks, so how can you justify not collecting the data?
Professor Sasse: I am sorry but I have
sat in on quite a few debates where terrorism experts have responded
to that question and their response has been that, no, the surveillance
society does not prevent terrorism. Whether that has been in the
discussion surrounding the national identity register or similar
measures, it is basically a tempting but erroneous assumption that
just because you can identify people you gain all sorts of
security benefits and can prevent subversion.
Q377 Baroness O'Cathain: But the
criminals involved in the bombings on the Tube were identified
as a result of closed circuit television cameras.
Professor Sasse: Yes, they were identified
and we are not saying there should be no closed circuit television
anywhere. What we are discussing here is a proper legal framework
that governs how that information is being used, who can use it,
for what purpose, how long it is being kept and so on. Of course,
you would want to be able to look at it for many crimes but does
that mean that CCTV footage should be kept forever, and should
be used for fishing expeditions as opposed to targeted investigations
of actual crimes?
Professor Thomas: If I may just add a
quick point, it is an easy mistake to make, but a dangerous one,
to assume that because certain data is available and was
used to detect a few crimes, it was therefore necessary to
collect it, or that collecting it was proportionate, or that the
benefit of having it actually outweighed the damage done
by collecting that data overall. If those particular terrorist
bombers had not been picked up on CCTV, they would have been found
by other means. They were not setting out to conceal their identities.
Q378 Lord Morris of Aberavon: May
I come back to My Lord Chairman's question? What effective measures
are contemplated to limit the use of technology in the future?
As I understand it from Professor Thomas's answer, there is ample
material in the Data Protection Act but it is not sufficiently
enforced or not appropriately enforced. Is that the position?
Is legislation needed?
Professor Thomas: I believe that strengthening
the Information Commissioner's Office so that he has more resources
to enforce the Act would be extremely beneficial. Giving him the
ability to require that audit activity be undertaken (requiring,
for example, that a company's auditors reported on compliance
with the Data Protection Act) could be very powerful
because it would extend the ICO's reach and it would provide an
independent check on whether the DPA was being followed. The people
who are most able to protect privacy are the people who are collecting
the data and therefore shifting the burden of liability firmly
onto those people would have a profound effect. If, for example,
there were a statutory requirement to inform data subjects whose
data had been inappropriately accessed or lost of that event,
that would be quite a powerful incentive for people not to lose
data. If it had to be accompanied by a statutory flat rate of
compensation, even at the level of £20, suddenly those databases
start to have a significant financial value. A £20 individual
compensation, across the roughly 25 million people whose records
were lost, would have meant that the HMRC data, for example, was
worth half a billion pounds, and if you have a database worth
half a billion pounds on a couple of CDs in your hand, you do
not put it in the internal mail.
Q379 Lord Woolf: Is the real answer
to what you have just said not that it is worth that amount of
money but that there is a penalty of that amount of money? That
penalty and the value do not necessarily coincide, and the
penalty, in the case you are talking about, would in fact be
totally disproportionate?
Professor Thomas: No, because it was
unnecessary to ship that entire database and therefore what a
flat rate penalty would do is cause people to think about how
many individuals' data they need to put at risk in order to be
able to carry out this particular piece of processing.
Professor Sasse: The National Audit Office
did not ask for the entire database, they asked for a subset of
data and they would have been quite happy to receive them in a
format that was not so risky. However, somebody at HMRC, or several
people at HMRC, made a judgment that the amount of money the
contractor demanded for reducing the database and making it less
risky was not worth paying; that amount has not been revealed.
I do not know what the sum was, but clearly it was a completely
wrong judgment, because it was not in proportion to the risk to
all the people whose data was on that disk.