Examination of Witnesses (Questions 260-277)
TUESDAY 26 JUNE 2007
Q260 Mr Benyon: In the United States,
for example, they throw away the key. It is Alcatraz if you breach
data protection. We have a slightly different attitude in some
respects in this country.
Dr Forbes: That is about sentencing
policy and how you treat adults and children. I do not think that
is an issue about data, frankly. Teenage hackers will show you
how vulnerable your systems are, so they are very useful in fact.
To punish them for your own failures in your own systems I think
is cruel. If I could just come back to your example of the fax,
I think it is a terrific example where the specifications for
a database, for example the NHS database, will have been, "We
want a database that does this, this and this". Has anyone
gone round and asked, "I want to know from all of you medical
professionals what things have gone wrong that the system should
look at and come up with some way of dealing with?" Instead
of just a new specification, it is a problem specification. We
know that this is going to happen; we know that this is typical;
we know that that happens. Design it, please, not so that it is
going to do all these lovely things but so that it will address
some of these common problems with records that the medical service
knows about. I think that is the way that you can build in protections.
Q261 Gary Streeter: Dr Forbes, you
mention in your paper some concerns about the invasion of privacy
caused by the four million CCTV cameras we now have in this country,
although they make our constituents feel safe and they want more
of them, not fewer, I think. What are your concerns and can you
give some concrete examples of this invasion of privacy?
Dr Forbes: I think having four
million cameras is already an invasion of public privacy, which
seems not to have been a consideration by members of the public.
They have just given it away, in a way. There are examples of
the way that cameras have been used to the detriment of particular
individuals and groups of individuals, women for example. The
next problem is going to be an extension of that, if there is
no check or consideration about what the invasion that is taking
place might turn into, because now coverage by cameras is mostly
digitally stored, so it is there for ever. Like anything else,
it is just data which can be mined, explored and new technologies
and new software can look at that data again and again and pull
more things out of it. Effectively, your act of walking down the
street may become interpreted as something very different in the
future. At no point has consent been given by an individual entering
a public space. At most, they are warned that they are being watched,
if at all, if there are signs around. Basically the message is:
we are watching you, do not misbehave. It is an incredibly negative
and critical message to be sending out to any citizen, it seems
to me. The idea that at some point in the future somebody could
say, "Right, this person wants to stand for public office.
Let us Google them to see what is available in the past. Let us
run some of this software and say, `See the way this politician
walks: completely dishonest, and we know this from gait recognition
technology. Why are they over there? What is going on?' "
I can see parties that would be interested in doing that sort
of negative take on a person's past, either a party or the press
or the media.
Q262 Gary Streeter: What is the solution,
then? Is it not to take the pictures in the first place?
Dr Forbes: I do not think you
can stop taking the pictures. They are there now. The cameras
are there. If you think about health and safety legislation, more
or less everybody is asked to do a risk assessment on what is
happening in a particular situation and you get a proliferation
of warnings and signs and a lot more awareness of what your behaviour
might end up as or the harm that might come to you. I would have
thought you need more signs saying, "You have come into this
area and we are going to have your record and we are going to
do what with it". I would like to know if it is going to
be stored and where, how and who gets access to it. The other
side of it is to say: let us think about this in a positive way
instead of in a negative way. What might I want to know about
what happens in my public space that I enter into and go out of
on a daily or weekly basis? I would like to have access to see
what is happening. I also would like to know why people are watching
this. What is the use value of watching that space? What is their
justification for it? What are their reasons and what are they
looking for? Mostly they are looking for bad behaviour but the
community might want to ask: when we are surveilling this piece
of public space, let us think not about justice and crime issues,
security issues, but about care issues. Might we look at this
much in the way that the NHS does and say: we are not looking
for behaviour; we are not looking for an individual who might
be criminal, but we are looking for things that happen to the
detriment of society. We might say that there is a problem here
for this group of people. It is hard for them to get around; they
are not serviced by the way this space is configured. There are
lots of ways we can think about how we care for ourselves in our
community by looking at what we capture on our images.
Q263 Gary Streeter: This is what
you mean by new and socially beneficial uses of surveillance technologies.
Does that not mean that basically more people will be looking
at these images so that there is even more of an invasion of privacy?
Dr Forbes: Then the people might
say, "Let's not look there". We might say that we want
this camera a bit further away. We can see the benefit of watching
this area but we do not see the point of intrusive watching. We
might say, "Let's have some information of a different kind
collected. When do we need lights on or not need lights on?"
There are all sorts of things. You do not really know. The community
is being watched all the time but we do not get to say from our
perspective that something else might be done. There is no opportunity
for creativity and innovation coming from people. The technology
is there. It is a bit like text messages. The techies did not
design texting for us. People decided that it was quite handy
and they used it, and it became prolific and ubiquitous. We have
already got the surveillance which is ubiquitous but the uses
of it are not in our possession, even though it is always of us
in our public space.
Q264 Chairman: Can I be clear here
that what you are suggesting is that communities should be invited
to come up with ideas about how community-based surveillance should
take place. You are not suggesting, or are you, that every member
of the community should have the same access to the cameras and
television pictures as, for example, the people working in
the CCTV control centre would, who are sackable, dismissable,
prosecutable should they breach regulations?
Dr Forbes: Why not introduce reciprocity?
If you can see me without my consent, then I think I ought to
be able to see what you are watching.
Q265 Chairman: One reason might be
that I am happy for the images to be looked at by somebody who
has been through a reasonable recruitment process, who is properly
managed, who will be sacked if he breaches it and, as we have
seen in a tiny handful of cases, actually prosecuted, whereas
my next-door neighbour may just be a nosey parker and the last
thing I want them to do is keep an eye on who is walking the street.
Dr Forbes: They probably do that
anyway by looking out of the window! I want to shift the balance
here really. There is a dilemma of privacy and security but there
are not any other creative possibilities going on of care, concern
and interest of people saying, "Actually we do not want it".
There is no opportunity for that. I just think (a) that people
should always be consulted before cameras are set up and they
should be asked why and how and contribute to that; and (b), yes,
let them see what is going on, let them be bored, if they like,
as well and see what happens.
Q266 Gwyn Prosser: Dr Forbes, I want
to ask you about privacy impact assessments. The Information Commissioner
came before the committee and he described them as nothing much
more than a discipline and a risk management tool and he seemed
quite keen on them. You seem to conclude that risk impact assessments
might actually work against privacy, which seems counter-intuitive.
Can you give us the grounds for that view?
Dr Forbes: First, if there were
risk impact assessments, I would not have a problem with that
but they do not say that. They call them privacy impact assessments.
I have not seen one that says, "This will impact upon your
privacy in the following way". They all seem to say, "This
will not affect your privacy because we have terrific systems
which never fail and, in any case, if they do, we will fix it
almost straight away".
Q267 Gwyn Prosser: We have heard
a little of that from our other two witnesses this morning.
Dr Forbes: No, I do not think
that is the case at all.
Q268 Gwyn Prosser: This is a system
with treble locks which will not affect privacy.
Dr Forbes: Yes, and it is about
protecting that privacy, which is assumed to exist, so there is
not really a discussion about what privacy is in the first place
and is it privacy to me as an individual or a member of a family
or a group or a profession or career? None of those things are
clear and so I do not see how you can actually do a privacy impact
statement unless you are clear about what the privacy is supposed
to be. Mostly they seem to be compliance statements or best practice
statements. I do not think any of them actually say, "This
is your privacy and this is how it will impact upon it for good
or ill". If they did, that might be interesting, but they do not.
Q269 Gwyn Prosser: You have nothing
positive to say about their possible introduction at all?
Dr Forbes: No, because I think
they are mis-named and they give you the impression that they
are looking after your privacy but they do not do anything about
that at all. If I want to know how good a system is, please tell
me how good your system is for managing data.
Q270 Gwyn Prosser: Would it help
with public assurance to assure the public that the impact had
been considered, that the risks to privacy would be considered
if the system were put in place? Would that be possible?
Dr Forbes: I would like to see
a consultation on what people think is private and what needs
to be kept private. Most of them just conform to the legislation,
it seems to me. You want to introduce some legislation that says:
this is privacy, this is what it means, this is how it might be
damaged, and do a check list that way. Then it might be interesting,
but at the moment I think they are misleading.
Q271 Gwyn Prosser: Can you tell us
anything about the experiences in the States and in New Zealand
and Canada for instance where they are already in place to a degree?
Dr Forbes: They all seem to be
the same. They are about compliance. I read the Homeland Security
one yesterday and it was a joke really because it basically said,
"We have a very good system and these are the three ways
we protect our data and they trust us. If it breaks, we will fix
it pretty soon", that is, if you find out, but you cannot find
out. You cannot be compensated. If we think back to the popular
environmental impact assessments, the evidence is that 90% of
the time they do not really have an impact on outcomes. They have
got to be able to say: yes, no, or do not know. If they say "yes"
they are accepted pretty much. If they say "no", they
might have an impact but mostly they do not. That is what I worry
about with privacy impact assessments. If somebody really did
say, "Look, this is going to affect our privacy", and
I do not know who is going to do them, usually it is in-house,
then it is doubtful that anything would change.
Q272 Patrick Mercer: Turning now,
if we may, to profiling, to all of you, what particular problems
are associated in your view with predictive profiling to target
deviant or unusual behaviour?
Dr Forbes: The key problem here
is that there is a shift that is often unacknowledged but is crucial
from a person's behaviour to the identification of that person
as something. I might see your behaviour but that does not mean
I understand who you are or know who you are. Criminal activity
does not mean that person is a criminal. They are a person engaging
in criminal activity but the shift from one to the other is very
quickly made once you go for predictive profiling. A person comes
before you. They are scanned through your profiling system and
then they are labelled. They are labelled, not their behaviour.
They are labelled. That is the problem. They are then treated
as if they are equivalent to that label. It is just as lazy as
stereotyping. You need cohorts and you need to understand your
data, but it is a way of using new stereotypes.
Q273 Patrick Mercer: What can we
do about it?
Dr Forbes: I think that information
is crucial. If somebody wants to gather my data and work up a
profile of me, I need to know that. That would impact on my privacy.
That I would like to know about in a privacy impact statement.
This data is going to be used to profile me. That would impact
on my privacy because I would not really know what was going on.
I do not know the routines. If you think way back to the St George's
Medical School, it had a fantastic points system for admitting
students until somebody realised that if you had the lowest number
of points, you got in but if you were a woman you got an extra
10 points; if you were an ethnic minority person, you got an extra
10 points, just because it was in the system. So perfectly reasonable
people who were not wanting to discriminate were running this
system and producing discriminatory results. You do not always
know what is going into those assumptions that construct the profile
and you cannot really be sure what is coming out. Most of this
stuff is done by companies for their convenience and for their
maximisation. It is not really a public interest profiling that
we are talking about to which you might agree.
Q274 Patrick Mercer: Do you accept
that profiling may have a legitimate part to play in crime fighting,
counter-terrorism or to enable the police effort to be concentrated
in the most effective way?
Dr Forbes: Yes, but it is full
of dilemmas, is it not? Yes, you want them to target their efforts.
However, past experience shows that the targeting of the efforts
often turns out to be discriminatory in practice on the ground,
so that its use is complicated. It may well be that there was
more crime amongst a certain group but why is that? It may be
because that group is already targeted and more crimes were picked
up. There was a report recently that shows how much middle-class
crime there is, which is just not picked up. Why is the profiling
not targeting all these middle-class criminals?
Q275 Patrick Mercer: Could NHS patient
records, for instance of psychiatric patients, not be of assistance
to the police in allowing them to profile people who potentially
pose a threat to the public?
Dr Forbes: I think that sort of
data is so difficult to get right that I would be very concerned.
Professor Wessely: I never thought
that I would even discuss this but 20 years ago I did my PhD on
the prediction of violent behaviour in people with schizophrenia.
The problem is that it is incredibly inaccurate. It is okay for
a large group of people and so you can make predictions about
large samples in populations, but when it comes to the individual,
it is incredibly inaccurate. The risk of hazard and detriment
to that individual being deprived of their liberty for things
that they are not going to do is very high as opposed to the one
person who is going to commit a serious offence. Back when I did
the research, you would be locking up something like 30 people
who were not going to commit a serious crime (and this is
for schizophrenia) for one who was, and I do not think it
has changed that much. I am not up to date. The second point is:
I cannot see any circumstances in which the police would be allowed
access to, of all things, mental health records. Of all the things
that are sensitive personal information, speaking as a consultant
psychiatrist, that would not happen. The only way that it would
happen would be through a court order, which already we would
have to obey but it would be fought tooth and nail. It would be
so destructive to how you deal with psychiatric patients and how
you manage mental health services, it would just be quite an appalling
future. I have not heard that proposal.
Professor Dezateaux: In fact it
might be helpful if the police were to come and talk to epidemiologists,
because they do know quite a lot about associations being a fallacy
in terms of individual predictions.
Q276 Chairman: Professor, that is
one of the areas we said we might question you about, but you
are a child health expert. The Government is constructing a database
of children apparently, and one of the aims is some sort of predictive
profiling to recognise children who are seen to have a bigger
set of risk factors. Can I ask you what your view is about that?
Do you share the general concern about the inaccuracies of profiling
or, given there are so many cases where children have slipped
through the net through the failure to share information between
different professions, and so on, is there actually a value in
that database that is being created?
Professor Dezateaux: Yes, firstly,
I do believe there is, but I think you need to make the distinction
between how it allows you to deliver effective care to an individual
child and avoid tragedies such as the Climbié case
that we see repeatedly, and stepping back and saying: how does
that information at a group level, at a population level, help
you in other ways? If we take, first of all, the opportunities
to identify whether there have been concerns about a child, we
know that quite a few children do end up in contact with healthcare
before they are harmed and that it is at the moment very difficult
for anyone to get access to information that would help them know
that there had been any concern. Because people are conservative,
there are often many more concerns expressed about a child than
there would be things that would be in the public domain, even
being registered at risk. So I think this information can be useful
and it obviously needs to be accurate, and, again, it needs to
link across a unique identifier to avoid children being incorrectly
identified. I think the same point holds: just because
certain factors are associated with an increased likelihood of
a behaviour, it does not mean that an individual in whom they
are present is behaving in that way, and I think
that healthcare people need to be aware of that, but I think in
terms of Every Child Matters, child protection issues that are
terribly important, this is an advance.
Q277 Chairman: One final question,
if I may. I want to go back to the concept that you floated and
then moved on to about community assent as an alternative to individual
decision-making about this. Dr Forbes has perhaps floated one
model or one approach to be used in relation to CCTV, but could
you say briefly what you have in mind? We can say we have all
been elected by communities and, therefore, if we all say it is
all right, that is community assent, but I do not think many of
us would push that out too far with our constituents. If the focus
on individual control of data is not quite the right one, how
would you express this community assent?
Professor Dezateaux: I think there
are certain types of activity that are a class of activity where
one can actually debate the principle of that and come to a position
for an infrastructure with checks and balances that would be acceptable.
Currently, as it is, we do not actually have a process that engages
the public. So, I think that trust is very important but I think
that Onora O'Neill has shown very clearly that trust cannot rely
upon individual consent alone: whenever studies have been done,
they show that informed consent is an ideal that is very,
very hard to achieve at an individual level and that, in fact,
you may have a better process by using community assent. However,
I think it needs public engagement, accountability, communication
and transparency in the systems. I think that happens within some
of our ethics committees and related processes, but I think that
it needs to be perhaps much more explicit in our system so that
people are aware that, if they can go and visit their doctor and
talk confidentially, that their data can also visit me as a researcher
and will be treated with exactly the same respect as they would
get from their GP.
Chairman: Thank you. Can I thank all
three of you. That is an enormously helpful session. It gives
us a great deal to think about. Thank you very much indeed.