UNCORRECTED TRANSCRIPT OF ORAL EVIDENCE To be published as HC 508-iv

HOUSE OF COMMONS

MINUTES OF EVIDENCE

TAKEN BEFORE

Home Affairs Committee

 

 

A Surveillance Society?

 

 

Tuesday 26 June 2007

Professor Carol Dezateux, Dr Ian Forbes and Professor Simon Wessely

Ms Shami Chakrabarti, Mr Jago Russell, Dr Eric Metcalfe and Dr Chris Pounder

Evidence heard in Public Questions 236 - 325

 

 

USE OF THE TRANSCRIPT

1. This is an uncorrected transcript of evidence taken in public and reported to the House. The transcript has been placed on the internet on the authority of the Committee, and copies have been made available by the Vote Office for the use of Members and others.

 

2. Any public use of, or reference to, the contents should make clear that neither witnesses nor Members have had the opportunity to correct the record. The transcript is not yet an approved formal record of these proceedings.

 

3. Members who receive this for the purpose of correcting questions addressed by them to witnesses are asked to send corrections to the Committee Assistant.

 

4. Prospective witnesses may receive this in preparation for any written or oral evidence they may in due course give to the Committee.

 


Oral Evidence

Taken before the Home Affairs Committee

on Tuesday 26 June 2007

Members present

Mr John Denham, in the Chair

Mr Richard Benyon

Ms Karen Buck

Mrs Ann Cryer

Mrs Janet Dean

Patrick Mercer

Margaret Moran

Gwyn Prosser

Martin Salter

Mr Gary Streeter

Mr David Winnick

________________

Memoranda submitted by Royal Academy of Engineering and Dr Ian Forbes

 

Examination of Witnesses

Witnesses: Professor Carol Dezateux, Institute of Child Health, University College London, Dr Ian Forbes, Royal Academy of Engineering, and Professor Simon Wessely, Academy of Medical Sciences, gave evidence.

Q236 Chairman: Thank you very much for coming to give evidence to us this morning. As you will know, this is one of a number of hearings that we have been holding under the broad heading of "A Surveillance Society?" taking our cue from a report from the Information Commissioner last year. We are very grateful to you for coming to give evidence and to share your particular expertise with the committee. For the record, would each of you introduce yourselves?

Professor Dezateux: I am Carol Dezateux. I am a professor of paediatric epidemiology at the Institute of Child Health, University College London, and I am also an honorary consultant paediatrician at Great Ormond Street Hospital for Sick Children.

Professor Wessely: I am Simon Wessely. I am Professor of Psychological Medicine at the Institute of Psychiatry at King's College London, and I am here on behalf of the Academy of Medical Sciences.

Dr Forbes: I am Ian Forbes. I am a social science consultant and an Associate of the Institute for Science in Society at the University of Nottingham. I am also here partly representing the Royal Academy of Engineering.

Q237 Chairman: Can I start with a question to Professor Dezateux and Professor Wessely. One of the things that you argue very strongly about the accumulation of databases is that they have been of great public benefit and that there have been gains to public health from the use of personal data for medical research. Could you indicate very briefly again what those benefits have been but also what you think the benefits might be in the future, looking at the databases and the science?

Professor Dezateux: Thank you for this opportunity to talk to you about the benefits of using patient data, which is sometimes called secondary research because it is using information about patients rather than necessarily contacting them. Really, without patient data, we would not be able to obtain the evidence on which improvements in health care have been based over some decades now. There are five groups of research that benefit from using patient data. Firstly, by using such data, we are able to identify causes of disease reliably. That is very important often for public health questions but also in terms of allowing us to move forward towards finding treatments. Secondly, it allows us to identify effective treatment precisely, quickly and in the longer term, and also to look at the potential adverse effects of treatments, which are often much harder to study. Thirdly, it is absolutely essential to have access to this kind of data to provide any public health monitoring in terms of control of infections and epidemics and pandemics, and also for us to be able to understand the effectiveness of any interventions, either at a health service level or at another level, that are designed to control and constrain any epidemics. This leads on to the fourth point, which is really about patient and public safety. I do not think we can over-emphasise to you the value of this infrastructure in terms of being able to answer quickly, reliably and precisely in response to concerns about the safety of medicines, the safety of the environment or the safety of vaccines. We can give you lots of examples of this. It is always the thing that you have not thought of that comes up and knocks you on the shoulder. Unless you have an infrastructure that allows you to do this, you are very disabled as a society in responding competently to these concerns. Finally, without the ability to look at patient data, we cannot evaluate how well our health services are doing and how well they are doing relative to one another. That needs high quality data that is complete and that is given priority in the health service. What I would want to say really is that although these are called secondary uses, they are addressing primary functions of a health system, which are to protect and promote the health of our population; for that we want reliable information. In fact, we would not want to be looked after in a health service that did not provide an opportunity to learn from the data that we have collected and constantly improve health care.

Q238 Chairman: You will have seen the signs in the House of Commons on the way in that it is to be smoke free from 1 July. Is it fair to say that probably that sort of public health change would not have come about without the sort of analysis of patient data that you are talking about?

Professor Dezateux: Yes, that is an absolutely wonderful example. The original observation by Sir Richard Doll linking smoking to lung cancer relied exactly on patient data. As we have gone through the whole tobacco control process, it has been informed at every stage by this kind of data, and now we are looking to use this kind of data to see whether we are getting the correct response and results to this kind of intervention, and whether there are any sectors of society that are being excluded or that are continuing to be exposed - children, for example - and where perhaps we need different measures. It is important to think about these things in a dynamic way, and smoking is a very good example.

Q239 Chairman: Professor Wessely, in the Academy of Medical Science report, a reference was made to "inappropriate constraints on the use of personal health data". Given all of the positive things that Professor Dezateux has told us about the use of this information and data, what do you regard as the inappropriate constraints?

Professor Wessely: What we meant by that is that there is a framework that allows this kind of research to go ahead, a very well worked out ethical, legal and governance framework, but there are times when many people are intimidated by things like the Data Protection Act or the common law - usually, we found, through ignorance of the legislation - and do not allow research to go ahead. Our studies of cancer in Gulf War veterans, for example, had great difficulties in being done because people felt they could not release data from cancer registries. It took about three years to overcome that. That is our general point. We have a well-established, very careful - possibly over-cautious - governance framework to allow this, but we found innumerable examples of good research that was being impeded by people's ignorance of things like data protection, although to be fair, if you read the Data Protection Act, which I had to do, that way madness lies. It is not written to make it easy, but in fact it is a perfectly sensible piece of legislation that, if you work it through, allows proportionate invasion of privacy for public health research, but you would not know it if you read it.

Q240 Chairman: The phrase "surveillance society" conjures up a rather Big Brother image, which is why we put a question mark at the end of the title to our inquiry because we want to take a balanced view. Could I ask the two professors: do you regard the sort of work that you are advocating as part of a surveillance society? Do you feel happy with that tag?

Professor Wessely: It depends what you mean. At the moment, we are carrying out health surveillance into the health of the Armed Forces. We are looking at the rates of post‑traumatic stress disorder, the rates of cancer and the rates of all sorts of other adverse outcomes. That is surveillance because it is based on medical records, cohort studies and research, but most people in that context would think that is a good thing. Certainly the members of the Armed Forces think that is a good thing and they are appreciative that this has finally been done. It all depends on the context. Health surveillance is actually a good phrase and we would agree with looking at the effects of MMR vaccine or surveying the effects of the Vioxx drug, and we are not talking about hidden cameras in supermarkets.

Q241 Chairman: Dr Forbes, we are going to come to your evidence a little later on but do you have any comments on that opening exchange?

Dr Forbes: I am delighted to hear this positive use of the term "surveillance" because I do not believe there is enough awareness of the way that society is constantly 'surveilled' by a range of systems, mostly governance systems, to increase our knowledge of ourselves and to provide information which leads to better knowledge, better insights and wisdom, so that we can make really quite crucial and large social decisions based on the information that we provide by just living or dying. This is a good example of how data can be collected and how it is managed. If you think about the safeguards that attend to medical records, they are extremely sophisticated. There is a worldwide practice about how to do this and how to manage anonymity and privacy, and yet it is used in ways that are extremely constructive and that provide us with the sort of data that we need, and I think that is missing in other areas of society.

Q242 Ms Buck: Can I pursue the line about data? Are you confident that the medical research that you are undertaking requires analysis of databases rather than research that would be done by using volunteers, for example?

Professor Dezateux: I think that we need very large-scale evidence for a lot of the questions that are facing us now. At one level, therefore, just approaching individuals is simply not feasible. The reason we need large-scale evidence is that we are dealing often with things that are quite uncommon but about which it is important that we have reliable answers. These could range perhaps from the association of birth defects with certain drugs that are given to women in pregnancy to the relationship of birth defects to power lines, mobile phones or any of those sorts of things. Birth defects are a good example because they are uncommon and you need data from the whole country. It is also clearly data that you are not necessarily going to be able to go back and get. The importance, from an epidemiological point of view, is that our science is served by giving answers that are not biased and that are not misleading in any way, that are precise and timely and where we can also compare groups of people who are not exposed to the thing we have been asked to look at. All those things really support the need to have large-scale evidence. I can give you any number of examples of issues that in fact parliamentary committees have sat on, such as assisted reproduction and so on, where we really are tied by not knowing answers to the sorts of questions to which we should have answers because we have not been able to get access to large-scale evidence. The other point to make about this is that the kind of research that epidemiologists do is concerned to get information right at the individual level, but we are not concerned to identify or know who that individual is; we are interested in that individual because they are part of a group of individuals and we are interested in things at the group level. From that point of view, large-scale evidence based on patient data is one of the best ways that you can look at highly sensitive information because it is possible to have very good safeguards and security, and you do not run the risk, when writing to somebody, of inadvertently revealing to somebody else in their house that you know something about them, as you might do when you have to approach people at an individual level.

Q243 Ms Buck: Professor Wessely, you said earlier that there was a robust system of governance for undertaking this kind of research and yet, at times, problems arise and research does not necessarily go ahead, or there is controversy over progress, because people do not understand it. If it is as good as you say it is, why is it that people then doubt it? To what extent is that to do with concern over issues around individual consent and the anonymisation of data?

Professor Wessely: First, most research goes ahead with consent, and that is the default position and you start from that. Obviously, for any interventional research - if you are going to give people a drug, a new test or some procedure - you have consent; if you do not, you are committing assault. We put that to one side. Research based on data usually goes with consent, where that is practical and possible. That is normally what we do, so our studies - and I have just mentioned the surveillance of the Armed Forces - are based on consent, but there are times when that is not possible. If I gave you a practical example, then it would make sense to you. We wanted to look at the association between depleted uranium and cancer, and this may appeal to Patrick. Therefore, we have 100,000 people who have been potentially exposed in the Armed Forces to depleted uranium and many of them wanted to know whether or not this had led to cancer. To do that study, it is not possible to approach 100,000 soldiers, most of whom have left the Armed Forces. Nobody knows where they live; they do not use landlines because they tend to use mobile phones now; they are almost untraceable and it would cost millions of pounds and you would miss the very ones that you want to find. First of all, you need ethics approval. In fact you need two sets of ethics approval. You cannot do any research without ethics approval. You need to use a system called the Caldicott Guardian. This is for the people who hold that data, namely the MoD or the cancer registries, to permit you to see that data. It is a very complicated procedure which you have to go through. They hold the data and they have to decide if this is a reasonable thing to do. You need permission from something called PIAG, which is a Department of Health committee that oversees this system and adds an additional layer of governance. You have to comply with the law. You have to show that there is not any other way of doing it. You have to show that no one is going to be upset or distressed by this. You have to show that no one is going to get any individual detriment from the loss of this data. You have to show that you owe a duty of confidentiality and that there are sanctions in place if you break that. That has never happened. There have been no instances of medical researchers leaking confidential data. That is not to say it will not happen but it has not happened yet. You have to belong to an organisation that says that if you do that, you are fired. There is a whole complicated system of checks and balances in place before finally you are allowed to link members of the Armed Forces and their rates of cancer. In fact, the answer was that there is not an association. There is no other way of doing that kind of research.

Q244 Ms Buck: What I do not understand then is with so many locks on protection of the data, why it is that anybody anywhere should ever raise concerns about proceeding with particular research?

Professor Wessely: There is no question that many people do not know this system. I think that is quite clear from the Academy's report. A lot of people, including within the profession itself, to be frank, are ignorant of this framework, partly because it is very complicated. I think also there have been instances of misconduct in research or misconduct in the health services that tar everyone with the Alder Hey brush, if you will, which are not relevant at all to what we are talking about but have created a climate of suspicion. Finally, having said that, when you look at what the patient charities in heart disease, Alzheimer's and Parkinson's want, unquestionably they want this kind of research to go ahead.

Q245 Ms Buck: I am sure we will return to many of these points, but as my last question: the Academy's report criticised what it calls the over-rigid application of the principle of "consent or anonymise". What you have described to me is a system that has so many locks in it that it does not seem to justify any deviation from this principle. Why is it that you would be looking for, and what benefit would there be from, a reduction in protection through consent or anonymisation?

Professor Wessely: First, "consent or anonymise" was the principle, for example behind PIAG, the idea that eventually you would either have consent or anonymisation, but that is clearly wrong. The whole point is that there are many times in research where you cannot proceed with that. Our study of Gulf veterans could not proceed on that basis because if the data had been anonymised, we would not know who they were and we would not be able to link them up with their cancer rates. So it was a flawed principle. Therefore, there are occasions when you have to be able to proceed without consent and without anonymisation. I have explained that that is unusual but it does happen. What the Academy is saying is that, first, people are not necessarily aware that these examples exist; they are not necessarily aware of the governance framework; or most of them misinterpret it to say it is "consent or anonymise", which is absolutely not the legal framework we have. It is not the framework behind Connecting for Health. It is not the framework envisaged in the Data Protection Act or the law of confidentiality.

Professor Dezateux: The first thing to say is that one person's anonymised data and identifier is another person's research data, and so it is very difficult to look at the same piece of information and make a clear decision one way or another. Given that is the case, we have to use common sense and we have to ask what the safeguards are when we use this information to avoid disclosing the identity of a person while still being able to answer precisely the question that we think is important. One of the interesting issues is that it all hinges on individual consent. We need to move away from that paradigm a bit and ask what processes there are for community assent, because it is ridiculous that we cannot answer some of these questions, and that is the side on which we often find ourselves. The other thing that I hope we will have a chance to discuss is greater clarity of the processes because, with a lot of the public misunderstandings and understandings, we could do with a better communication strategy. That applies to people operating and interpreting these things at a health service level. The sort of people we need to negotiate with about access to data are sometimes also confused, and that is not surprising. At a scientific level, there are many reasons why we need to know things about people. Again, we are not interested in who that individual is or where they live, but we need to know certain things about them in order to practise better science and to produce a more reliable answer. We need to know that we are not double-counting. You need a couple of common identifiers to make sure that you only count an individual once in your data. We need to be able to follow people up in the very long term. That is becoming increasingly important. We are interested in diseases and treatments for things like cancers which might take a long time to develop, and so we need that kind of information to be able to trace people. If the data are anonymised at the beginning, we cannot do that; we cannot get back to them. We need to make sure that the data we produce from our health services is respected and that it is of high quality, and that when it says this person has a condition, they really do have that condition. In fact, we know that sometimes there are mistakes in the basic health service data. Research can help the health service improve the quality of its data by being able to do what we call pseudonymise - to have ways of getting back to say, "Was this really what we thought it was? Do these people have this condition or have they had this operation?" Then, as I mentioned right at the beginning, without data that is potentially disclosive, such as postcodes and so on, we are not going to be able to say whether cancers are related to power lines or living near nuclear power stations or any of those kinds of questions that I believe you would have an interest in getting the answers to as well.

Q246 Gary Streeter: You have talked about checks and balances but does it not worry you at all that some of the information that you are overseeing the gathering of is being used and will increasingly be used to produce some of the genetic modification type science of the future? Does that ever cause you sleepless nights?

Professor Dezateux: There are serious issues that society needs to debate about the use and application of genetic advances. I do not think that those issues will be resolved by saying, "We are just not going to use the data at all". Indeed, I think that where it is helpful for us for example to be able to link somebody's biological data to their health and their future health statuses, you can begin to put boundaries round your uncertainty about what the meaning of this genetic change is - whether it is helpful to know about it, whether it has any implications at all - and then inform the health services as to how appropriately we should be using that test within the health service and, bearing in mind that there is a very big private market in genetic testing, to inform the public really about whether it is wise or advisable to get tests that might identify a prediction for disease or not.

Professor Wessely: If you look at the UK Biobank, which is an example of research in genetics, 60,000 people were approached and asked to take part and only 50 people objected and of those 50, 30 then consented to take part in the study. So, within the framework of research, people are confident in the use of genetic information to study disease and are willing to participate.

Q247 Martin Salter: Professor Wessely and Professor Dezateux, obviously you would be disappointed if we had not put you under intense surveillance before you came before us. Because we have a phalanx of staff, we have dug up your interchange with Dr Richard Taylor at the Health Committee on 7 June, which is very informative. My question relates to that. Just to paraphrase, Richard was questioning your contention that perhaps medical researchers should be given more access to patient records than even the police. We want to give you an opportunity to expand on that. It appears from the exchange that you were implying that the public would trust you perhaps more than they would trust the police. You may be right.

Professor Wessely: I am not implying that. I think that is a statement of fact, is it not?

Q248 Martin Salter: And your evidence is?

Professor Wessely: I think I mention it. We did a study around the polonium incident about who do you trust to manage this incident, and doctors and scientists rated much higher than the Home Office and the police force. More seriously, there is a misapprehension there. If you want to look at personal data, the Data Protection Act actually says quite specifically that you cannot do that to make any decision about an individual without their consent. The police would only want to access data to make a decision about an individual: is he a crook, or whatever. That is specifically illegal. You cannot do that and we cannot do that, but that is not why we want to look at data. We are interested in not one child with autism or one child who has had a vaccine. We are interested in all children with autism and all children who have had the MMR. In that sense, it is not a personal piece of data about that individual; it is about all the children who came under that category. That is the framework for which we use that data, and that is the framework that is already permitted by the law for the public good. That is the specific framework that denies the police access to the same data for making individual decisions. As a normal citizen, I can tell the difference between wanting to find out if living near a power station causes leukaemia and wanting to find out if I have been naughty with my taxes or whatever it is. These are chalk and cheese and I think most people accept and can understand that.

Professor Dezateux: The answer is that both the police and biomedical researchers are subject to constraints, and I do not think we have unconstrained access to data. Simon has indicated quite how many approvals and permissions we have to have. The issue of whether we should be constrained in the same way as the police depends on why you are constraining the police and why you would constrain medical researchers. I suppose the thing that unites the police and biomedical researchers is that we are both interested in the elimination of doubt and the reduction of uncertainty but for our purposes, we need to have large-scale unbiased evidence to reduce uncertainty and make sure we have the right answer. The public's concern about the police of course is that they will get the wrong answer and somebody will go down who is innocent. In our instance, getting the wrong answer has public implications that it is really important to avoid. We do need that access but I think we should make the point here that we are not asking for unconstrained access, and we very much support the checks and balances that are there, but they need to be looked at in a flexible way. Also, I hope we will be able to come to discussions about how they can be improved.

Professor Wessely: You have already foreseen this in the legislation by saying that personal information cannot be used to that individual's detriment. If I was to do that - I cannot think how I would or why I would - I would be breaking the law and committing an offence and I am going to be in deep trouble. So there already is a framework to prevent the Orwellian implications, as it were.

Professor Dezateux: May I add that I think it is important that we are accountable. I consider myself a public servant in my research, as a lot of epidemiologists do. We are quite happy to be accountable for the work that we do and our approaches. Indeed, we are audited and have had our systems looked at in relation to the Data Protection Act and so on, on quite a regular basis, within our university.

Q249 Martin Salter: Thank you for your very full answers. How confident are you that medical researchers, having navigated the various checks and balances that you have in place, then are able to access for perfectly legitimate reasons personal patient information? How confident are you that that remains secure and could not be leaked to people who should not receive it?

Professor Wessely: There are two answers. First, when we took evidence for the report, the Information Commissioner confirmed that they had had no reports of what you are describing happening. They had had reports of receptionists seeing things that they should not, and all sorts of data violations within medicine, but not involving medical research. I am not saying that will not ever happen but so far it has not. The second thing is that if we were to do that, and particularly in the new electronic health systems, you would leave a massive electronic fingerprint all over the place. It is quite straightforward. In my university, and I think in all universities, that is the end of your job, and it is also the end of your career because you would be up before the GMC. Even if you were to do that, you would leave a trace and that is it.

Professor Dezateux: When I started in research and I went into a family care record, the Lloyd George, I could read everything about that patient just by being handed the envelope, but now with technological developments, there is access control and audit trails. I think the chances of leaving a set of notes out on the table that somebody else can read are very much less. The computing infrastructure is much stronger.

Q250 Mrs Dean: Moving on to the NHS database, what opportunities will it open up in terms of medical research and epidemiology and what safeguards are being built in to protect against unauthorised access to patients' records? Are those satisfactory?

Professor Dezateux: From what we have said before, it is clear that electronic patient records will provide a huge advance because they will allow us very effective access to the large-scale data that we need. I will not rehearse the issues about that. One of the things it allows us to do is to be inclusive in our research so that we do not leave certain sections of the population out. It can help us get swift answers. It helps us look at areas of medicine that we are often criticised for not spending enough time on in our research: rare disorders, under-served populations. It helps us look at demographic change in a dynamic way because we are a very changing population in the UK. One of the areas that we are interested in is that it allows us to link inter-generationally, so that a lot of the issues that we are concerned about are what happens to mothers/parents and their children and subsequent generations? You can be very powerful in those sorts of things and you can use electronic records that can be linked by a single identifier. They are cost-effective. I think that we need to understand that after the Cooksey Report, there is a real recognition that unless we make the most of these electronic health records, we will not be able to maintain globally our competitiveness in terms of our science, and that will have economic implications for society. That is not what I come to work day to day to do but it is a very important issue, and it is an important issue for trials and providing an infrastructure for trials, just as much as it is an important infrastructure for understanding the safety of medicines. There will be investments in research that we will be able to attract if we are able to get this right and use the electronic health records effectively. The safeguards that are in place that we have described already are quite sophisticated for the kind of research that we do, and they would be appropriate for this kind of research with the electronic health records. There are plenty of examples in Scotland, in parts of England already, the Nordic countries, Australia, the US and Canada where there are systems in place that allow the data to be kept in its own home, as it were, but for the linkage to be done by somebody that does not need to look at the data, so that the researcher at the end of the day just gets the information that they want and need on a need-to-know basis. I think those kinds of safeguards are very important.

Q251 Mrs Dean: Do you have anything to add to that?

Professor Wessely: I do not really. On the technical side, I would rather hand over to our engineer.

Dr Forbes: I am not an engineer but engineers become very nervous about single source databases, which have to be used by a large number of people and have to be designed for ease of input on a regular basis, and so there is a large number of users. The compromises that need to be made in engineering terms inevitably compromise the security of any single database. They are concerned about that and they say it is better to build in rather than build on. They say, "Let us think about how this system could fail and will fail, either by misuse or by abuse or accident, all the ranges of human and technical possibilities". That needs to be thought about in advance. This has nothing to do with the way the research is conducted or the way that the protocols for research are developed. I am very happy with that. I am sure they are going to work very well. It is just that a huge amount of data being input by a wide range of users into a single system means that it has greater vulnerability, and that needs to be acknowledged. It is a dilemma. You want the single database for all the social benefits that become possible by having the single database but that has to be weighed against the dangers. It is all about balancing those things. The engineers would say you should think about this in advance, design up-front, do lots of upstream thinking about how any human system is going to fail at some point in some ways, and work out what you are going to do about it when it does fail. I do not think you can get any further than that. You cannot produce a completely fool-proof system, so let us design it to assume that it will fail in some ways.

Q252 Mrs Dean: When it is up and running, how can patients be reassured that their own medical details are kept confidential?

Professor Wessely: First, if you read, for example, the Care Record Guarantee, which I think is a very sensible and remarkably plainly written document, it talks about the various checks and balances on confidentiality. Beyond that, there is also the moral and ethical framework - and I am talking specifically about medical research now because that is what I am interested in, and patient care in particular - in which we work, and I do not think that climate has changed. I do not think doctors, or the GMC, have become any less concerned with confidentiality with the advent of electronic patient records. It presents new systems but the ethical framework for the conduct of those systems is just the same. Personally, if I had gone to my wife's surgery in Kennington a few years ago to collect her, I could see on the desk the notes of everyone she had seen that day. They were in a big pile. Now I cannot do that; I cannot see them. We sometimes have a view of a rosy-tinted past in which doctors clutched case notes to their bosom and never let them out of their sight 24 hours a day. That is just not true. I personally feel more confident in the security of electronic records, not least because, if I misuse them, the constraints on the system and the recriminations are so vast that that is a greater deterrent. In the past, I could wander into medical records and, quite frankly, if I wore a white coat, I could take out medical records; nobody would challenge me. You are right, there will be mistakes. Of course there will be errors but there is a system for correcting them and there is also a system for governance to make sure that if those are done maliciously, there will be severe penalties.

Q253 Mr Benyon: Before I turn to Dr Forbes, a doctor in a surgery in my constituency received a fax the other day from a hospital up north about a patient on their register who had self-harmed. This was just attached to his medical notes. The next time he came in, the doctor questioned him about it, and it was perfectly obvious that they had got the wrong person; he had the same surname and the same date of birth but he was a different person. Should we be concerned that information is floating around the country when mistakes are made of that nature that can have a huge impact on that person's life and job prospects if it became public?

Professor Wessely: Of course you should be concerned about that but that has always been the case. It has been far worse in the past. Notes could just get lost and you would never see them for years.

Q254 Mr Benyon: I should clarify that nobody seemed to know who should correct this information, whether it was the hospital that had made the original error or whether it was the GP surgery. There seemed to be no understanding about who owns that fault.

Professor Wessely: That is a governance issue. I am not au fait with how that works. I do know it is much easier to correct that kind of mistake now than it was in the past. It has always happened. Mistakes will be made and there will be a lot of John Smiths with a certain date of birth. Now, whoever it is, either the GP or the hospital, can alter the record, whereas previously the notes would be there in perpetuity down in the bowels of the hospital and years later they would turn up and nobody would know they were mistaken, and, most importantly, the patient would not know that they were mistaken. I am not going into how this happens but now I know that patients will be able to check their records and they will be the first people who will say, "That is not me. You have made a mistake". Previously they would not know. They would have no idea what was lying on discharge summaries all round the country in the bottom of hospitals.

Professor Dezateux: The electronic health record will improve this because if we use a unique identifier, then one John Smith will not be mixed up with another. That is important. The second point is that you will not fax it so that this terribly disclosive information is left for anyone to read; you will send it by the existing system which works with Connecting for Health, which is encrypted emails and messaging services. Thirdly, to find that out, if you can access a single electronic health record, the information on the spine is visible to the person who cared for that patient and the GP who is looking after them, so there is instant communication, and to the patient, who should be able to access and look at their own record.

Q255 Mr Benyon: Dr Forbes, you have set out six clear principles to govern the use of surveillance data, the fourth of which says, "In general, public agencies should not be allowed access to private databases". Should the police and the security services be an exception to this and, if so, what conditions should be put on those exceptions?

Dr Forbes: I would say there should be no blanket exceptions. I would say there needs to be justification for access to a private database by a public agency because when I give information to a private agency, I am giving it to them; I am consenting for them to use it for their common purposes. There may well be cases where there needs to be or there is a very good case for access to a private database. The case needs to be made on a case-by-case basis. If, after a time, you think that there are so many of these cases, we need to have a rule or a rubric which would allow the police to invoke it, I do not see any problem with that in terms of a governance procedure. The NHS is a very good example that says, "This is what we want to use this data for and we therefore generate a series of mechanisms to make sure that it is not misused". They do not say, "Do what you like with the data". I do not think there is any case anywhere for saying, "There is data. Do what you like with it" just because you are the police or the security services.

Q256 Mr Benyon: Your fifth principle is: "Public record databases should be under the control of autonomous agencies, not government." What difference does it make?

Dr Forbes: There is a huge difference. The difference here is between the state and the government. I do not mind providing information to the state which uses that information for purposes which are about the collective good and the benefits will be indivisible; they may or may not come to me. Governments have purposes for which I may or may not have consented. I may have voted for them, I may not. They may be doing something I like, or they may not. I think it is a good principle to say that the government has to justify the use of the data of its citizens. The government does not own me; it does not have any right over me. It is the other way round in fact. It is there because the people have put the government there. We consent to the state and we elect a government, which we can get rid of. I think that is an important principle. Public trust is very important. Trust in governments goes up and down, this way and that way, for good or bad reasons. If trust in the state also goes up and down in the same way, I think that is potentially damaging for society and for politics in general because ultimately you want people to honour their commitment to the state and do what they like with the government.

Q257 Mr Benyon: Would you call the NHS an autonomous agent?

Dr Forbes: Yes, it is an autonomous agency. As far as I know, the government cannot say, "Give me that" and just have it.

Q258 Mr Benyon: Your sixth principle relates to the penalties for misuse and you say that they should reflect the damage and distress that the system failure or crime causes. However, we all know that sentencing usually reflects not only the consequence of the offence but the culpability of the offender. Do you accept that?

Dr Forbes: Yes, I think that is a fair consideration.

Q259 Mr Benyon: Do you think, for example, that leniency should be shown in the case of a teenager who is particularly skilled at hacking and finds his way into personal data for kicks rather than for any malicious intent?

Dr Forbes: I do not know what leniency would mean. I think that you would treat a teenager as a teenager, first of all. I do not know about you but I am not lenient with bad behaviour.

Q260 Mr Benyon: In the United States, for example, they throw away the key. It is Alcatraz if you breach data protection. We have a slightly different attitude in some respects in this country.

Dr Forbes: That is about sentencing policy and how you treat adults and children. I do not think that is an issue about data, frankly. Teenage hackers will show you how vulnerable your systems are, so they are very useful in fact. To punish them for your own failures in your own systems I think is cruel. If I could just come back to your example of the fax, I think it is a terrific example where the specifications for a database, for example the NHS database, will have been, "We want a database that does this, this and this". Has anyone gone round and asked, "I want to know from all of you medical professionals what things have gone wrong that the system should look at and come up with some way of dealing with?" Instead of just a new specification, it is a problem specification. We know that this is going to happen; we know that this is typical; we know that that happens. Design it, please, not so that it is going to do all these lovely things but so that it will address some of these common problems with records that the medical service knows about. I think that is the way that you can build in protections.

Q261 Gary Streeter: Dr Forbes, you mention in your paper some concerns about the invasion of privacy caused by the four million CCTV cameras we now have in this country, although they make our constituents feel safe and they want more of them, not fewer, I think. What are your concerns and can you give some concrete examples of this invasion of privacy?

Dr Forbes: I think having four million cameras is already an invasion of public privacy, which seems not to have been a consideration by members of the public. They have just given it away, in a way. There are examples of the way that cameras have been used to the detriment of particular individuals and groups of individuals, women for example. The next problem is going to be an extension of that, if there is no check or consideration about what the invasion that is taking place might turn into, because now coverage by cameras is mostly digitally stored, so it is there for ever. Like anything else, it is just data which can be mined and explored, and new technologies and new software can look at that data again and again and pull more things out of it. Effectively, your act of walking down the street may become interpreted as something very different in the future. At no point has consent been given by an individual entering a public space. At most, they are warned that they are being watched, if at all, if there are signs around. Basically the message is: we are watching you, do not misbehave. It is an incredibly negative and critical message to be sending out to any citizen, it seems to me. The idea that at some point in the future somebody could say, "Right, this person wants to stand for public office. Let us Google them to see what is available in the past. Let us run some of this software and say, 'See the way this politician walks - completely dishonest, and we know this from gait recognition technology. Why are they over there? What is going on?' " I can see parties that would be interested in doing that sort of negative take on a person's past, either a party or the press or the media.

Q262 Gary Streeter: What is the solution then? Is it not to take the pictures in the first place?

Dr Forbes: I do not think you can stop taking the pictures. They are there now. The cameras are there. If you think about health and safety legislation, more or less everybody is asked to do a risk assessment on what is happening in a particular situation and you get a proliferation of warnings and signs and a lot more awareness of what your behaviour might end up as or the harm that might come to you. I would have thought you need more signs saying, "You have come into this area, we are going to have a record of you and this is what we are going to do with it". I would like to know if it is going to be stored and where, how and who gets access to it. The other side of it is to say: let us think about this in a positive way instead of in a negative way. What might I want to know about what happens in my public space that I enter into and go out of on a daily or weekly basis? I would like to have access to see what is happening. I also would like to know why people are watching this. What is the use value of watching that space? What is their justification for it? What are their reasons and what are they looking for? Mostly they are looking for bad behaviour but the community might want to ask: when we are surveilling this piece of public space, let us think not about justice and crime issues, security issues, but about care issues. Might we look at this much in the way that the NHS does and say: we are not looking for behaviour; we are not looking for an individual who might be criminal, but we are looking for things that happen to the detriment of society. We might say that there is a problem here for this group of people. It is hard for them to get around; they are not serviced by the way this space is configured. There are lots of ways we can think about how we care for ourselves in our community by looking at what we capture on our images, on our webcams.

Q263 Gary Streeter: This is what you mean by new and socially beneficial uses of surveillance technologies. Does that not mean that basically more people will be looking at these images so that there is even more of an invasion of privacy?

Dr Forbes: Then the people might say, "Let's not look there". We might say that we want this camera a bit further away. We can see the benefit of watching this area but we do not see the point of intrusive watching. We might say, "Let's have some information of a different kind collected. When do we need lights on or not need lights on?" There are all sorts of things. You do not really know. The community is being watched all the time but we do not get to say from our perspective that something else might be done. There is no opportunity for creativity and innovation coming from people. The technology is there. It is a bit like text messages. The techies did not design texting for us. People decided that it was quite handy and they used it, and it became prolific and ubiquitous. We have already got the surveillance which is ubiquitous but the uses of it are not in our possession, even though it is always of us in our public space.

Q264 Chairman: Can I be clear here that what you are suggesting is that communities should be invited to come up with ideas about how community-based surveillance should take place. You are not suggesting, or are you, that every member of the community should have the same access to the cameras and television pictures as, for example, the people working in the CCTV control centre would, who are sackable, dismissible, prosecutable should they breach regulations?

Dr Forbes: Why not introduce reciprocity? If you can see me without my consent, then I think I ought to be able to see what you are watching.

Q265 Chairman: One reason might be that I am happy for the images to be looked at by somebody who has been through a reasonable recruitment process, who is properly managed, who will be sacked if he breaches it and, as we have seen in a tiny handful of cases, actually prosecuted, whereas my next-door neighbour may just be a nosey parker and the last thing I want them to do is keep an eye on who is walking the street with whom.

Dr Forbes: They probably do that anyway by looking out of the window! I want to shift the balance here really. There is a dilemma of privacy and security but there are not any other creative possibilities going on of care, concern and interest of people saying, "Actually we do not want it". There is no opportunity for that. I just think (a) that people should always be consulted before cameras are set up and they should be asked why and how and contribute to that; and (b), yes, let them see what is going on, let them be bored, if they like, as well and see what happens.

Q266 Gwyn Prosser: Dr Forbes, I want to ask you about privacy impact assessments. The Information Commissioner came before the committee and he described them as nothing much more than a discipline and a risk management tool and he seemed quite keen on them. You seem to conclude that risk impact assessments might actually work against privacy, which seems counter-intuitive. Can you give us the grounds for that view?

Dr Forbes: First, if there were risk impact assessments, I would not have a problem with that but they do not say that. They call them privacy impact assessments. I have not seen one that says, "This will impact upon your privacy in the following way". They all seem to say, "This will not affect your privacy because we have terrific systems which never fail and, in any case, if they do, we will fix it almost straight away".

Q267 Gwyn Prosser: We have heard a little of that from our other two witnesses this morning.

Dr Forbes: No, I do not think that is the case at all.

Q268 Gwyn Prosser: This is a system with treble locks which will not affect privacy.

Dr Forbes: Yes, and it is about protecting that privacy, which is assumed to exist, so there is not really a discussion about what privacy is in the first place - is it privacy for me as an individual, or as a member of a family or a group or a profession or career? None of those things are clear and so I do not see how you can actually do a privacy impact statement unless you are clear about what the privacy is supposed to be. Mostly they seem to be compliance statements or best practice statements. I do not think any of them actually say, "This is your privacy and this is how it will impact upon it for good or ill". If they did, that might be interesting, but they do not.

Q269 Gwyn Prosser: You have nothing positive to say about their possible introduction at all?

Dr Forbes: No, because I think they are mis-named and they give you the impression that they are looking after your privacy but they do not do anything about that at all. If I want to know how good a system is, please tell me how good your system is for managing data.

Q270 Gwyn Prosser: Would it help, with regard to public assurance, to assure the public that the impact had been considered - that the risks to privacy would be considered if the system was put in place? Would that be possible?

Dr Forbes: I would like to see a consultation on what people think is private and what needs to be kept private. Most of them just conform to the legislation, it seems to me. You want to introduce some legislation that says: this is privacy, this is what it means, this is how it might be damaged, and do a check list that way. Then it might be interesting, but at the moment I think they are misleading.

Q271 Gwyn Prosser: Can you tell us anything about the experiences in the States and in New Zealand and Canada for instance where they are already in place to a degree?

Dr Forbes: They all seem to be the same. They are about compliance. I read the Homeland Security one yesterday and it was a joke really because it basically said, "We have a very good system and these are the three ways we protect our data and they trust us. If it breaks, we will fix it pretty soon" - if you find out, but you cannot find out. You cannot be compensated. If we think back to the popular environmental impact assessments, the evidence is that 90% of the time they do not really have an impact on outcomes. They have got to be able to say: yes, no, or do not know. If they say "yes" they are accepted pretty much. If they say "no", they might have an impact but mostly they do not. That is what I worry about with privacy impact assessments. If somebody really did say, "Look, this is going to affect our privacy", and I do not know who is going to do them, usually it is in‑house, then it is doubtful that anything would change.

Q272 Patrick Mercer: Turning now, if we may, to profiling, to all of you, what particular problems are associated in your view with predictive profiling to target deviant or unusual behaviour?

Dr Forbes: The key problem here is that there is a shift that is often unacknowledged but is crucial from a person's behaviour to the identification of that person as something. I might see your behaviour but that does not mean I understand who you are or know who you are. Criminal activity does not mean that person is a criminal. They are a person engaging in criminal activity but the shift from one to the other is very quickly made once you go for predictive profiling. A person comes before you. They are scanned through your profiling system and then they are labelled. They are labelled, not their behaviour. They are labelled. That is the problem. They are then treated as if they are equivalent to that label. It is just as lazy as stereotyping. You need cohorts and you need to understand your data, but it is a way of using new stereotypes.

Q273 Patrick Mercer: What can we do about it?

Dr Forbes: I think that information is crucial. If somebody wants to gather my data and work up a profile of me, I need to know that. That would impact on my privacy. That I would like to know about in a privacy impact statement. This data is going to be used to profile me. That would impact on my privacy because I would not really know what was going on. I do not know the routines. If you think way back to the St George's Medical School, it had a fantastic points system for admitting students until somebody realised that if you had the lowest number of points, you got in but if you were a woman you got an extra 10 points; if you were an ethnic minority person, you got an extra 10 points, just because it was in the system. So perfectly reasonable people who were not wanting to discriminate were running this system and producing discriminatory results. You do not always know what is going into those assumptions that construct the profile and you cannot really be sure what is coming out. Most of this stuff is done by companies for their convenience and for their maximisation. It is not really public interest profiling that we are talking about, of a kind to which you might agree.

Q274 Patrick Mercer: Do you accept that profiling may have a legitimate part to play in crime fighting, counter-terrorism or to enable the police effort to be concentrated in the most effective way?

Dr Forbes: Yes, but it is full of dilemmas, is it not? Yes, you want them to target their efforts. However, past experience shows that the targeting of the efforts often turns out to be discriminatory in practice on the ground, so that its use is complicated. It may well be that there was more crime amongst a certain group, but why is that? It may be because that group is already targeted and so more crimes were picked up. There was a report recently that shows how much middle class crime there is, which is just not picked up. Why is the profiling not targeting all these middle class criminals?

Q275 Patrick Mercer: Could NHS patient records, for instance of psychiatric patients, not be of assistance to the police in allowing them to profile people who potentially pose a threat to the public?

Dr Forbes: I think that sort of data is so difficult to get right that I would be very concerned about that.

Professor Wessely: I never thought that I would even discuss this but 20 years ago I did my PhD on the prediction of violent behaviour in people with schizophrenia. The problem is that it is incredibly inaccurate. It is okay for a large group of people and so you can make predictions about large samples in populations, but when it comes to the individual, it is incredibly inaccurate. The risk of hazard and detriment to that individual being deprived of their liberty for things that they are not going to do is very high as opposed to the one person who is going to commit a serious offence. Back when I did the research, you would be locking up something like 30 people who were not going to commit a serious crime - and this is for schizophrenia - for one who was, and I do not think it has changed that much. I am not up to date. The second point is: I cannot see any circumstances in which the police would be allowed access to, of all things, mental health records. Of all the things that are sensitive personal information, speaking as a consultant psychiatrist, that would not happen. The only way that it would happen would be through a court order, which already we would have to obey but it would be fought tooth and nail. It would be so destructive to how you deal with psychiatric patients and how you manage mental health services, it would just be quite an appalling future. I have not heard that proposal.

Professor Dezateux: In fact it might be helpful if the police were to come and talk to epidemiologists, because they know quite a lot about the fallacy of turning group-level associations into predictions about individuals.

Q276 Chairman: Professor, that is one of the areas we said we might question you about, but you are a child health expert. The Government is constructing a database of children, apparently, and one of the aims is some sort of predictive profiling to recognise children who are seen to have a bigger set of risk factors. Can I ask you what your view is about that? Do you share the general concern about the inaccuracies of profiling or, given there are so many cases where children have slipped through the net through the failure to share information between different professions, and so on, is there actually a value in that database that is being created?

Professor Dezateux: Yes, firstly, I do believe there is, but I think you need to make the distinction between how it allows you to deliver effective care to an individual child and avoid some of the Climbié, and so on, tragedies that we see repeatedly, and stepping back and saying: how does that information at a group level, at a population level, help you in other ways? If we take, first of all, the opportunities to identify whether there have been concerns about a child, we know that quite a few children do end up in contact with healthcare before they are harmed and that it is at the moment very difficult for anyone to get access to information that would help them know that there had been any concern. Because people are conservative, there are often many more concerns expressed about a child than would ever be formally in the domain, even to the point of being registered at risk. So I think this information can be useful, and it obviously needs to be accurate, and, again, it needs to link across a unique identifier to avoid children being incorrectly identified. The same point applies here: just because certain factors are associated with an increased likelihood of a behaviour, it does not mean that, because they are present in an individual, that individual will behave in that way, and I think that healthcare people need to be aware of that; but in terms of Every Child Matters and the kinds of child protection issues that we know are terribly important, this is an advance.

Q277 Chairman: One final question, if I may. I want to go back to the concept that you floated and then moved on to about community assent as an alternative to individual decision-making about this. Dr Forbes has perhaps floated one model or one approach to be used in relation to CCTV, but could you say briefly what you have in mind? We can say we have all been elected by communities and, therefore, if we all say it is all right, that is community assent, but I do not think many of us would push that out too far with our constituents. If the focus on individual control of data is not quite the right one, how would you express this community assent?

Professor Dezateux: I think there are certain types of activity that are a class of activity where one can actually debate the principle and come to a position for an infrastructure with checks and balances that would be acceptable. Currently, as it is, we do not actually have a process that engages the public. So, I think that trust is very important, but Onora O'Neill has shown very clearly that trust which relies upon individual consent is problematic: whenever studies have been done, they show that informed consent is an ideal that is very, very hard to achieve at an individual level and that, in fact, you may have a better process of assent, but I think it needs public engagement, accountability, communication and transparency in the systems. I think that happens within some of our ethics committees and those processes, and I think that needs to be perhaps much more explicit in our system, so that people are aware that, if they can go and visit their doctor and talk confidentially, their data can visit me as a researcher and will be treated with exactly the same respect as they would get from their GP.

Chairman: Thank you. Can I thank all three of you? That is an enormously helpful session. It gives us a great deal to think about. Thank you very much indeed.


Memoranda submitted by Dr Pounder, JUSTICE and Liberty

 

Examination of Witnesses

Witnesses: Dr Chris Pounder, Editor, Data Protection and Privacy Practice, Dr Eric Metcalfe, Director of Human Rights Policy, JUSTICE, Ms Shami Chakrabarti CBE, Director, and Mr Jago Russell, Policy Officer, Liberty, gave evidence.

Q278 Chairman: Good morning. Thank you very much indeed. I know that you have largely, most of you, been able to hear the previous session, or most of it. Thank you very much indeed for coming to this session on "A Surveillance Society". I think you have all given evidence to the Select Committee in the past, but if each of you could introduce yourselves for the record.

Mr Russell: I am Jago Russell, policy officer at Liberty.

Ms Chakrabarti: Shami Chakrabarti, Director of Liberty.

Dr Metcalfe: Eric Metcalfe, Director of Human Rights Policy at JUSTICE.

Dr Pounder: Dr Chris Pounder, Editor, Data Protection and Privacy Practice.

Q279 Chairman: Can I start by asking perhaps Liberty and JUSTICE to be as precise as you can about what you see from a civil libertarian point of view as the real practical risks for individuals of the sort of surveillance society that has been conjured up by the Information Commissioner and which was responsible for us having this inquiry?

Ms Chakrabarti: It is a wonderful phrase, is it not, "surveillance society"? If it has got us all talking about the issue, and it has got your Committee engaged, then that is a really good thing, because our concern would be that, alongside other very important societal concerns, like security, like public health, as we have heard, sometimes the value of personal privacy can be lost. There are very good reasons why that value can be lost and forgotten on occasion. Of course, by definition, privacy is a qualified right, unlike some of the rights that Liberty and JUSTICE defend sometimes - the right not to be tortured, the right not to be arbitrarily detained. Privacy, by definition, is a qualified right. We know that we are social creatures. The moment we come together, even in very primitive societies, or when we come together in families, let alone complex modern societies, we do give up varying degrees of personal privacy, sometimes voluntarily and sometimes not voluntarily, but in a way that is, of course, necessary and proportionate in that society. The danger is that, because it is a qualified right for the individual, but also, I would argue, because some of it is very important to society more generally, to the flavour of democratic society, if we are not quite rigorous enough about the defence of something that has to be balanced against other great concerns like security, health and so on, we can, without really noticing and without having proper public debate perhaps, lose very important things from democratic society. For example, without a significant degree of value placed on personal privacy, you would have a society where the dignity of the individual has been compromised; intimacy between people, confidence between people and trust in big institutions, whether it is the Health Service or the Government, would be lost. Where we are, I would argue (and I think Mr Thomas would agree), perhaps in Britain in 2007, is at a place where there are great technological opportunities to interfere with privacy, often for very good reasons, and we just need to make sure that the ethical, political and legal debate keeps pace with all of this technological development.

Q280 Chairman: Moving on to you, Dr Metcalfe, can I perhaps put the question this way. Should my concern be that somebody will actually find out something about me and do something to me as a result - if you take the point of Dr Forbes, the previous witness, that all my neighbours can watch the CCTV as well as the CCTV control room, or somebody finds out something about my credit record and damages me - or is it almost a more philosophical objection, that some people would say, "Even if nobody does anything to harm me, I have somehow lost out as a free citizen by the fact that other people have got access to information about me that I would rather they did not have"? Where in our inquiry should we be focusing: on the practical damage that can be done to individuals or on the philosophical concern that we are less free if other people have our private information?

Dr Metcalfe: I am sorry to say that you have to focus on both. It is entirely true that you have to focus on the practical, but also, yes, you are harmed, in a way, if the information is stored, even if the information is never actually seen by anyone else, because your own sense of personal privacy is affected by the knowledge that people have access. For example, if I write a diary and I leave it in a room and I am subsequently aware that maybe ten people have gone through that room and had the opportunity to read my personal thoughts sitting on the desk - maybe none of them did, but already there has been an effect on my personal privacy. If you think about all your personal data as being in that diary, and if you think about not merely ten people passing through that room but, say, all the relevant agencies having access, then you have reason to be concerned, and your own sense of personal privacy, which we think has a very important value because it allows us to do so many things that we take for granted as being part of a good life, is affected as a result. There is a chilling effect that comes about in that kind of situation.

Chairman: When you talk about your diary, I feel very much the same about my blog. Anybody could read it, but nobody seems to bother!

Q281 Mrs Cryer: Shami, congratulations on your CBE. I just want to take it a bit further from what the Chairman has been saying. I want to ask you all if you accept that there could be real and pressing needs for data sharing, particularly in the light of what happened on 7/7 two years ago and given the fact that we all recognise that the most precious human right is the right to life itself and to keep our bodies intact. Therefore, how do you compare that need for the public to know what is going on and protect our citizens with the overriding consideration for individual privacy?

Ms Chakrabarti: I think you have to do it on a case by case or policy by policy basis. I think that the principles in the European Convention, and in this country they are older - the justificatory principle for interfering with the individual - still work very well. So, rather than balancing these issues at an abstract philosophical level, we would look at a particular policy, or a particular interference, a particular need to match data or to access data. I am assuming you are talking about the law enforcement context or the investigation context, possibly by compulsion rather than voluntarily, though in other contexts sometimes voluntary sharing is good enough. You say, "Is this policy, is this measure, is this particular accessing of data truly necessary and proportionate for this?" and it is a balance. That is why it is so difficult. If I may say so, that is why Parliament is actually better suited to protecting privacy (and I think it has got a long way to go, and I hope this is the start of it) than the courts are. In my experience the courts are almost uniquely well qualified for dealing with a situation where what is at stake for the individual is torture or incarceration, and that is being balanced against other factors, but the courts are not best placed where the balance is between two great societal objectives, where the interference with the individual's right is not that great actually. Take the example where my DNA is taken from me when I am arrested for shop-lifting, even though they got the wrong woman and the police apologised to me and sent me on my way, and the DNA is now kept forever because someone says one day I might be a terrorist or I might be guilty of shop-lifting: the courts have not so far been very good at conducting that proportionality exercise. But I would hope that, because that taking of DNA is as much an issue for hundreds of thousands of people as it is for me individually, Parliament is actually much better suited, and in the future I hope that the debate about privacy and various policies could be really enhanced by greater Parliamentary involvement.

Q282 Mrs Cryer: Would anyone else like to comment?

Dr Pounder: One comment in relation to trust and trusting in the data sharing arrangements. I think the issue is one of trust, and possibly the risk is the global erosion of trust. The previous speaker, Professor Wessely, said in relation to medical research that there was a lack of trust in the system and that he had experience of people refusing to give consent for medical research. If you look at the data sharing arrangements, all the trusting is from the public. The public has to trust that the data sharing is limited in accordance with the rules, the public have to trust that staff who do the data sharing are properly trained and follow the rules, the public have to trust that the procedures for authorising the data sharing are properly maintained and the public have to trust that Parliament does not enact legislation that provides for function creep. All this trusting is in one direction. What there needs to be, as Shami said, is a strong counterbalance to that public trust. All the trust is coming from the public to the authorities with very little counterbalance, in my view.

Ms Chakrabarti: Ironically, if this trust is broken, on occasion or generally, it manifests itself not just in a way that is of detriment to the individual but in great harm to public policy as well. For argument's sake, take a health collection of data - and, of course, we have heard from people who care about protecting trust and privacy, your previous witnesses - but if you got to a point where the public no longer trusted the protection of the confidential information that they share with their doctor, people would say less to their doctor, and then, suddenly, you have got a counterproductive policy, where you thought you were being so expedient by saying more and more people within the Health Service, etcetera, etcetera, can have access to this data because we are going to do such great research and we are going to help people wherever they are in the country. That all seems very laudable, but if you lose trust, then the woman who has been battered does not confide in her doctor any more. So, it is this very difficult balancing exercise which, as Dr Pounder has said, can also be enhanced by saying information is taken for a specific purpose. We put more robust ethics and laws and practice and culture in place to make sure that there is not just a general free-for-all or a general presumption of sharing where it is expedient rather than sharing when it is truly necessary and proportionate.

Dr Pounder: Can I add---

Chairman: No, I am sorry, to get through the questions, we have got four witnesses, we cannot have everybody having two goes at every answer, so if everybody can be brief and if people have said the main points, please can we move on.

Q283 Mrs Cryer: We were talking mainly about public authorities and their knowledge of people. Can we move on to private authorities, private concerns, and their accessing and holding information on individuals and, even more complicated, where the functions are contracted out from public authorities to private authorities. Would you comment on those areas about access to private information?

Mr Russell: I think there are a number of similarities and a number of differences between large databases held by private bodies and large databases held by public bodies. Liberty has concerns about, or is interested in, both but has mainly focused on public bodies, it has to be said. For example, in the context of our concerns about CCTV, that applies both to private bodies and public bodies. I think one of the main differences is this question of consent. In terms of giving information to a private body, it is very much based on consent, but in the context of providing information to a public body, it is often compulsory or, if it is not compulsory, then in order to receive a public service which people are paying for through their taxes you basically have to provide that information. I think that is quite a key difference between these two types of database, but, of course, there is a big question as well about informed consent in terms of providing information to private databases and whether people are really aware of the value of what they are providing to those kinds of companies.

Dr Metcalfe: I think there is a significant problem with private companies in that they are not always motivated by the same issues as the public sector, obviously. In fact we received a letter very recently about the use of fingerprinting technology being sold to schools. A number of private security companies are selling schools security systems whereby, where you used to be able to access the school library by way of a library card, and, indeed, to pay for school lunches, you now have a fingerprint system. The kids just swipe their fingerprint across a scanner and that is matched against a record of their fingerprints, which are stored. So, you now have private companies holding fingerprint databases of school children. There are, obviously, various legal measures which can apply to that kind of situation, but I think it is a very good example of the way in which technological change is impacting upon personal privacy without very much appreciation of the impact on that personal privacy.

Q284 Chairman: It has been suggested to us that public sector bodies are covered by the ECHR but private sector companies are not, and that possibly, going by the recent court ruling last week, for example, if the DWP at some point contracted out its work on investigating incapacity benefit to a private contractor, the private contractor would not be covered by the ECHR provisions. Is that correct, and is that a significant issue to worry about?

Ms Chakrabarti: Sadly, it is not completely clear. What is clear from, in my view, a very disappointing decision last week is that residential care homes have not been considered to be public authorities, regardless of Parliamentary intention or the vulnerability of the people concerned. The case is confined to that situation, and their Lordships did try to distinguish a number of other potential scenarios, but there is a lack of clarity. You would not be able to say that all public functions that are contracted out are definitely caught, and so there will be parliamentary work to be done. I would argue, on a sector specific basis, to be absolutely certain, that where Parliament is allowing local government or central government to contract out a particular service, Parliament should make the decision, at the time of passing that sector specific legislation, whether it intends the Convention to apply, because I do think it could be an important safeguard.

Mr Russell: Can I give you an example of where this particular issue is arising in a bill that is before Parliament at the moment? It is the Serious Crime Bill, and there is a power in there for the Audit Commission to mine data in order to identify potential fraudsters. There is a power in that bill for the Audit Commission to subcontract the power to do that data-mining, this kind of mechanical, computerised fishing expedition, to a subcontractor, to a private body. I think what was said is that, given the doubt in the court's mind about whether that body would be covered by the Human Rights Act, Parliament could clarify in the Serious Crime Bill that, for the avoidance of doubt, any private contractor will be covered.

Dr Pounder: Could I quickly add on this point: the Data Protection Act has its concept of a data controller, and the data controller is the person who has the statutory duty. If somebody contracts out the statutory duty, the delivery of the service, to a data processor, I think the data controller would still be in control of the data. That is my own view of it.

Chairman: That is a very useful comment. We can rehearse current issues around it.

Q285 Ms Buck: Can we pursue this issue of the difference between the approach of the private sector and the public sector, and just ask, particularly Dr Pounder, but others may have a view, what could be done. If we assume that the consent element in the private sector is a strength in terms of data protection, what could be done within the public sector systems, if not exactly to follow down that line, perhaps for some of the reasons we heard from the earlier witness, to try as much as possible to build in that kind of informed consent? What would be the systems requirements and how feasible is it?

Dr Pounder: It depends on what you are doing. The previous witnesses said something about the police and consent which personally I did not think was quite right. I cannot see the police seeking consent for anything. If you have a statutory duty you do not need to seek consent, end of argument. What you can build in, in certain circumstances, is the right to object to the processing of personal data. So, with a private sector body - say, for example, Tesco - I have consented to Tesco processing my personal data and, if I do not like it, I am able to withdraw consent quite easily, for example, in relation to marketing or, possibly, in relation to their databases that look into my sales and purchases. So, for some areas of data sharing in the public sector, where there is a statutory gateway that permits the sharing but the sharing does not involve, say, for example, law enforcement, that kind of area, you could have an easy right to object to the processing. When the UK Government implemented the right to object in the Directive, they implemented it in the narrowest possible terms, and that could be broadened. I am thinking particularly, for example, of the facilities in the identity card legislation that allow for disclosures for efficient and effective delivery of public services. You could have a right to object there.

Q286 Ms Buck: Having listened to those witnesses, particularly on health, to what extent do you accept that there is a tension between public good, in terms, for example, of the benefits of using accurate epidemiological data, and the kind of protection and the potential right to opt out or to change data?

Dr Metcalfe: I think a very good example is the police DNA database, because we have already seen applications being made by medical researchers to use that information; and it is all very well to say that the information is being stored for one particular law enforcement purpose but, as we know, the definition can go very broadly. So you might say that the storing of DNA for a law enforcement purpose means that it should only ever be used in relation to a specific crime and a specific forensic investigation, but what we find happening is that medical researchers will go along to the police database and say, "We are interested in the idea of perhaps a gene for criminality. Can we do a speculative search in relation to your database to see if there is a link with, say, for example, people with inherent criminal behaviour?" Potentially, given the breadth of the scope of the law enforcement purpose, that could actually fall within it. Obviously the police DNA database has its own regulatory framework and there are high ethical standards in relation to medical research, but I am not going to say it is impossible. I know that searches for medical research have already been approved in relation to that.

Ms Chakrabarti: There comes a point, I think, where you really do need to start saying: is the Information Commissioner well-resourced enough? Does he have enough powers really to police even the existing Data Protection Act? You have to say, given all the possibilities that we have at the moment and which are coming, that Parliament is going to have to take a more robust role, because there is a tension, there will be a tension at times, and I am not going to say that the previous witnesses are all wrong about the enormous potential benefits, but someone has got to make that judgment. When they say the normal paradigm has been consent or anonymity but that paradigm has to change, I would argue that it is you and your colleagues who should be making that judgment ultimately on behalf of your constituents, and, frankly, if that kind of paradigm is going to be ignored on occasion because they are going to cure cancer, then I think maybe there should be a specific bill and there should be a robust parliamentary debate. Generally speaking, law enforcement and the state have powers of compulsion, but in return there has been greater accountability. That is generally the trade-off. The private sector has generally been taking information by consent and there is less accountability. The lines between the private and public sector are increasingly merging, to the point where I am not even sure the distinction is that helpful. The real question is the purpose for which the interference is taking place, who sanctions the interference and what the protections against abuse are.

Q287 Ms Buck: A last question on that really, which is, I think, particularly for Dr Pounder. What about the scope for actually changing and adding to data, in the way that it is theoretically possible - although I suspect in practice it is not quite as easy as that - to change data on your credit rating? To what extent should it be possible within public databases to actually amend and correct data?

Dr Pounder: There is specific legislation (the Consumer Credit Act) that permits that. In relation to the NHS discussion that we had, the NHS Act 2006 allows the disclosure of medical records without patient consent, subject to the Patient Information Advisory Group giving permission. I was a bit puzzled about why the medical researchers do not use the statutory routes that are available to them. In relation to public and private sector merging, what I would say is if you look at, say, the credit reference agencies - that is private data - credit reference agencies collect a whole pile of transactions from the banks, the telecommunications companies providing data to the authorities on a regular basis, the public and private sector is---. The barrier is not there in large databases. I think Shami is right, you have got to treat the whole thing case by case.

Q288 Mr Winnick: Liberty and JUSTICE, in particular, the paper we received from Liberty, paragraph 12, the final sentence of that paragraph states, "There is growing public unease about the extent of the surveillance society." What evidence do you have of such public unease?

Ms Chakrabarti: I am going to call on Mr Russell to answer that, but can I apologise at the outset for the author of this evidence not being here. Gareth Crossman is our privacy expert, he will publish a report later in the year, but I am afraid that the rights of privacy and family life seem to allow people to take personal holidays when they are working at Liberty. I did take advice on this.

Chairman: We will hold you collectively to the evidence, I am sure.

Q289 Mr Winnick: Mr Russell what evidence do you have?

Mr Russell: First of all the anecdotal evidence is that we do receive hundreds and hundreds of queries from the press, and I suspect that you receive hundreds of letters through your mail bags about privacy type issues, but it is definitely something that we receive a lot of mail on.

Q290 Mr Winnick: Mr Russell, can I interrupt you. I do not know about my colleagues; I cannot recall in recent times a single letter from constituents complaining about lack of privacy. I am being the devil's advocate, because to a large extent, as often with Liberty and JUSTICE, I intend to take the same view as you, but my job, like my colleagues, is to cross-examine you and find evidence for your statements. When you say there is great unease, that everyone is trembling in our constituencies that their privacy is being invaded, pray, give us some evidence.

Mr Russell: There has been some limited polling done on this. There was an article at the end of last year in the Telegraph reporting a YouGov survey, and that said that 78% of people felt that they lived in a surveillance society. Only 2% thought, for example, that the Government could be trusted to run an ID card scheme which did not contain serious errors. Fifty-two per cent were fairly unhappy, or very unhappy, at the idea that personal data could be recorded on government databases. So there is some data. One of the things that we will propose and will consider in this report to be published later in the year is the idea that more polling needs to be done, more information needs to be gathered about public attitudes to surveillance, but there is some suggestion in this limited data that there are public concerns.

Q291 Mr Winnick: I am going to ask you this question, Mr Russell. If there is such concern, why is it that - leaving aside that perhaps my colleagues have a different sort of post bag - I have not received such correspondence, and my constituents, certainly those who write letters, are not usually reluctant to express their point of view? I get quite a number of letters of a different kind asking, in fact, for CCTV cameras. Of course they take the view (perhaps it is exaggerated) that CCTV cameras, in the view, presumably, of the large majority of people in this country, play some part in undermining criminality. If there is such a feeling of concern, why do I receive letters along the lines I have just indicated?

Ms Chakrabarti: In my experience it is extremely dangerous for Liberty to fall into the trap that you are setting, which would be to suggest that general elections are going to be won or lost on CCTV. We are not in a position to argue that. Of the issues that people write to us about, that is already a more limited class. People do not ask us to build a Health Service for them, etcetera. It does seem to be a very high concern. When MPs write to us, which they do as well, to ask for help, on many occasions they are writing to us with concerns about fingerprinting in schools, DNA and so on. It may be a healthy minority of the public. I do not think that there is going to be a revolution about CCTV, but CCTV is really interesting. There is an interesting cultural point if you compare Britain to other European countries, because even as far as privacy interferences go, there are big cultural differences about which particular interference people are concerned about. In Germany or other parts of the Continent you put a CCTV camera in the wrong place and there literally will be riots, and maybe that is the non-democratic past. As a result, the authorities go through a much more rigorous process of community consultation and analysis before they decide where to place cameras. They put them up for the Oktoberfest in Munich because they are expecting anti-social behaviour and trouble. At the end of the festival they take the cameras down. In Britain we seem to have had a much higher tolerance of lots and lots of cameras, which seem to make a lot of people comfortable, but we still have concerns that, from an efficacy point of view, having lots of cameras everywhere, many of them not particularly well looked at or maintained, is not necessarily the best use of public money, and also it is largely unregulated. Mr Denham made the point that you would feel better about the cameras if you thought that the people who were operating them were properly trained and properly recruited. That is not always the case, and it is not really regulated as an industry. I am not going to sit here and say that every single CCTV camera that has ever been erected is a complete violation of human rights, but I do think proportionality has a lot to contribute.

Q292 Mr Winnick: Next time I receive letters about that I will bear in mind your comments on the spot. Dr Metcalfe, do you believe on behalf of JUSTICE that there is a large feeling in the country that we are on the verge of 1984, big brother and the rest?

Dr Metcalfe: I think there is public unease. I do not think there is enough. There should be more public unease.

Q293 Mr Winnick: There should be more, but it does not exist at the moment.

Dr Metcalfe: There is public unease. We get the same letters and emails and telephone calls that Liberty get inviting us to take up concerns and, generally speaking, we go along, we have our club card points, we have our credit cards, we walk along the street, we are monitored by CCTV and we really do not think about the impact these things have on our personal privacy. Maybe someone is arrested. It is a case of mistaken identity, but someone makes a complaint about them being, say, a sex offender. They are acquitted or maybe charges are not even brought, and they think nothing of it until the next time they try to apply for a job working with children, and then they find they cannot because they have failed the child protection test, because the fact they have been arrested in relation to a sex offence means that that information has to be disclosed. That is the point at which people recognise that personal privacy has some importance. I am not saying for a moment that that kind of information should not be disclosed; I am saying that we do not have very much appreciation of the way in which information is transferred, even with our consent, because we all tick the box on the credit card form, not being aware that it says, "This information may be transferred and shared with other third parties", but we never fully appreciate, until we start receiving marketing letters from other people on the credit card list, how that information is precisely being used. So there is public unease - a lot of issues, like, for example, fingerprinting in schools, came to our attention by way of a letter - but is there enough? No, there is not.

Q294 Mr Winnick: Can I put this question to Dr Pounder. Is there a contradiction between what we were just dealing with, the concern, and how far it extends, regarding intrusion into private lives, and the fact that an increasing number of the public seem to take what could be described as a remarkably casual attitude to publishing large amounts of personal data about themselves? For example, on the Facebook or MySpace websites. For all we know, on Mr Denham's blog he might be openly speculating what sort of job he is likely to be offered later this week!

Dr Pounder: People have their own view of privacy. Lots of people are ex-directory; lots of people are not ex-directory. Some people when they fill out an application form tick the box before filling in the form. If people want to put their personal information on the Internet, then, basically, that is their permission, but coming back to the point here---

Q295 Mr Winnick: Pursuing that for a moment, it does demonstrate - I do not do it myself - the fact that there seem to be so many people, perhaps younger people, putting such information on the websites which I have mentioned. It does not seem to me to express a fear that their personal privacy is in some way being invaded.

Dr Pounder: Well, they take the risk. Whether they know the risks, I do not know, but coming back to the point here - seriously, it has to be faced - there were 20,000 complaints to the Information Commissioner last year.

Q296 Mr Winnick: How many?

Dr Pounder: Twenty thousand in the annual report. The annual report also has a tracking survey for privacy that picks up Liberty's issues. You already have people thinking of the big opt-out in relation to the Summary Care Record of the NHS, you have people, in a sense, questioning (and I am sure you have had this) why the police have DNA data on somebody who has not committed a crime, and you have even got people questioning the electronic tag on their rubbish collection. If that is not concern about surveillance, I do not know what is.

Ms Chakrabarti: To interrupt---

Chairman: No, we are not going to have two attempts at the question. Can we move on?

Mr Winnick: I assure you, Mr Pounder, we share your view, although it might not appear to be in my question.

Chairman: Meanwhile, I am composing ten pictures of my favourite members of the Select Committee! Carry on, Mr Winnick.

Q297 Mr Winnick: I am sure I would be foremost. Dr Metcalfe, JUSTICE, you argue that the interests of the private individual and public good are not opposed - this is the point of view you have expressed - but is not the job for parliamentarians somewhat different, a question of personal liberty versus the common good, and trying, as far as we are concerned, to reach a balance between the two?

Dr Metcalfe: The point I was trying to make, and it was probably the most philosophical section of our evidence, is that personal liberty is ultimately part of the common good; we benefit from having privacy, not merely as individuals but as a society, because people do things in their private space, in their private time, and the benefits from that flow on to society as a whole. You could give a moving example - a writer, for example. We would not have much of the great literature that we have today if, say, all our great writers took the view that everything that they wrote down was likely to be surveyed. It was just a very abstract philosophical point about the way in which privacy exists, not only for the individual, but also for the common good, and that we should be very careful about the impact of new technologies that threaten that, and I think MySpace and Facebook are very good examples. It is great that we have these new communication networks, but I do not actually think that a lot of young people think very clearly ahead about the way in which their personal data could be disclosed and could be used, in the same way that young people do not think ahead about an awful lot of things, like their educational choices and how much they drink on a Friday night. So, in the position of responsibility that Parliament is in, we need to establish greater safeguards to ensure that other bodies, other agencies, other companies take responsibility as well.

Q298 Mr Winnick: Presumably that is Liberty's point of view?

Ms Chakrabarti: Absolutely. You were all elected in secret ballots, and the concept of a secret ballot is essential to free elections. This right is sometimes regarded, even in the human rights community, as a bit low-level, a bit trivial - it is not torture, it is not arbitrary detention - but you cannot have free elections, freedom of thought, conscience and religion, or freedom of speech in some circumstances without that little bit of personal space and respect for it. I completely agree with Eric on the young people and the Facebook point. The threats do not just come from the Government or big business; if we are not careful we will rear a generation of young people who have not really known the value of privacy as part of dignity, as part of respect. People can take pictures of each other with their mobile phones; they put pictures of their girlfriends on the Internet in states of undress. We as citizens, if you do not help us to resurrect the importance of privacy and dignity, could be a great enemy to each other in relation to this value.

Q299 Patrick Mercer: Turning now to automated data exchange and Shami Chakrabarti, this is for you, please: do you think that the creation of databases sometimes provides an easy or a lazy solution to problems that actually require better communication and co-ordination between responsible professionals?

Ms Chakrabarti: Yes, I do. That is a very helpful and leading question, but, yes, I do. At Liberty we try to take a balanced view of these issues. We are not against all databases, how could we be, let alone all automated databases, but sometimes, we would argue, when something bad happens it is easy to say that the answer, for example, to a Climbié situation is to build an ever bigger database, whereas in the specific tragic case of Victoria Climbié the problem was not the lack of a data entry for every child in the country: a lot of bad things happened to that girl before she came to her tragic end and people did not communicate about the specifics. Obviously, sometimes when you are looking for a needle in a haystack - it has been said many times before - do not build an ever bigger haystack, where you increase the risks of accidents, and so on and so forth.

Q300 Patrick Mercer: I am referring exactly to that sort of case. Do you think there is a real danger that a focus on automated data-sharing can actually make getting across essential information harder, and there is simply too much information out there? It confuses rather than helps.

Mr Russell: The thing we said on the children's index was that, actually, in principle, there is nothing wrong with a children's index that is a targeted database. Targeted amounts of information on children at risk can be helpful. The problem is, when you have got every child on a database, as Shami said, it is incredibly difficult to see the wood for the trees. In certain circumstances, yes, a database is important, but we need to be---. These human rights principles that we started off with - is this necessary, is there a legitimate aim, is it going to work - those are the questions we think Parliament should be asking when a new proposal for a new government database is being made.

Chairman: Thank you, Margaret Moran.

Q301 Margaret Moran: I, like David, am interested in the evidence base of some of the things you have been asserting to us. You say in your submission to us that the extent to which every person in the UK is subjected to surveillance has increased disproportionately to any justified social need or benefit. Could you give us the research evidence for that just as a reference? If you cannot do it now could you, please, send it to us? You also make reference to the National DNA Database and say that there is an intention to make that database compulsory. Could you give us what evidence you have for that statement?

Ms Chakrabarti: It is, of course, compulsory even now as a matter of law, because this is a criminal justice policing measure. Your DNA is compulsorily taken from you under pain of criminal sanction if you do not agree to it being taken.

Q302 Margaret Moran: I think the suggestion is that it applies universally?

Ms Chakrabarti: That there is a desire in certain quarters to make it---

Q303 Margaret Moran: You have stated that you believe that a compulsory universal DNA database---

Ms Chakrabarti: The present, soon to be outgoing, Prime Minister has stated that he thinks it would be desirable to have a universal DNA database after a public debate. Various chief constables have taken that view. It is a perfectly respectable, if slightly terrifying, view. There is logic to it. There is a logic that says, "Let us have the DNA of every man, woman and child in the country, and then, when something bad happens and there is a crime scene, we will match it." There is also a logic, I would argue, to our position, which is to have a smaller, more ring-fenced DNA database of people who have been convicted of a particular threshold level of crime. What there is not a logic to, in our view, is the current situation where anyone who has been arrested for an offence can have their DNA taken and, even if they are let go - as in my shop-lifting example, where the police apologise and say, "We have got the wrong woman", and I am never charged, let alone convicted - my DNA can be kept forever.

Q304 Margaret Moran: I was not actually asking for a treatise on DNA, I was asking for the evidence-base?

Ms Chakrabarti: That is the evidence; that is the law.

Q305 Margaret Moran: Various comments do not constitute a research evidence base either to the initial point I made or to the second of those points. Have you got something substantial other than people's comments?

Ms Chakrabarti: Well, the legal position is clear and not in contention as to what the basis for taking and keeping people's DNA is at the moment. That is a statement of the law.

Q306 Margaret Moran: I was referring to your assertion about a universal---

Ms Chakrabarti: If the Prime Minister says he thinks it would be a good idea, I think that is a pretty good suggestion of intention, and, as I have said, it is a logical position, I just do not think it is proportionate.

Q307 Margaret Moran: Mr Russell, earlier you made reference to the Serious Crime Bill. The reason I have been out of the room is because I am sitting on the Serious Crime Bill. You referred effectively to function creep, to what is now known in technical circles as the possibility of fishing, data-mining and data-sharing. What evidence have you got for that function creep, and are you aware of what the Minister said at the second reading of the Serious Crime Bill in relation to that, in answer to the specific question that I raised?

Mr Russell: The specific point about function creep and where my concern about the function creep comes from is the fact that in the bill there is a very clear provision which says that the Home Secretary, Secretary of State, may by order increase the functions for which data-mining may be undertaken. So, that is how function creep most often happens: if you have got a power to do something with personal information and then, by regulation, the reasons for which you can process that information can be extended. That is where the concern about function creep comes from. There is a clear power in the bill. I cannot remember the clause reference, but there is one there which says that the purposes can be extended. So that is the function creep point.

Q308 Margaret Moran: That contradicts what the Minister said at the second reading: that the Audit Commission will not be able to use the powers to predict who might commit fraud in the future - in other words, fishing - and that it is right and proper that we put safeguards in place to prevent data-mining and data-fishing.

Mr Russell: Can I come back on that point? That is absolutely right. We pushed in the House of Lords for an amendment to the bill which would prevent data-mining being used to profile people's future behaviour. The Government agreed with us that that was a concern in the current legislation and, therefore, agreed in the House of Lords to put an amendment in to stop profiling of individual suspects in terms of their future behaviour, and we are delighted they have put that in. That is slightly different to the question of function creep, because the question of function creep is about what purposes this data-mining is going to be used for, and I would be very surprised if the Minister had said that there was no risk of function creep in relation to this aspect of the Serious Crime Bill, because the provision is there.

Dr Pounder: Just a comment on the Serious Crime Bill. The Audit Commission can do data-matching in relation to serious crime, not so serious crime and debt collection. In relation to debt recovery, one wonders whether the Serious Crime Bill is the correct vehicle for this. There is a real problem of over-indebtedness in the UK. Whether or not that should be treated by separate legislation is another thing, but if you look at Schedule 7, you will see that debt recovery is part of the Audit Commission's remit in the Serious Crime Bill.

Dr Metcalfe: Can I make an additional point about function creep. Before I was at JUSTICE I was a lawyer in the immigration and judicial review section of the Treasury Solicitors Department and I was responsible for helping to arrange advice in relation to the Asylum Registration Card System or ARC System, so that was an identity card system which involved fingerprinting of asylum seekers. I am not saying anything that is not in the public realm at this point. The original purpose of the Asylum Registration Card was to reduce fraud in relation to asylum seekers, but it is very easy to see, just as a practical measure, how the information stored for one purpose can be used in relation to others. If you had that information stored in relation to asylum seekers and you are a law enforcement agency, why would you not want to check information to see whether any of the people that you now have on your database match unsolved crimes? Why would you not want to see if any of those people are also involved in relation to mainstream benefit fraud, if in some way they have managed to fraudulently obtain documents in relation to mainstream benefits? Why would you not, if you were a medical researcher, want to cross-reference the biometric information that you might have on that database in relation to preventing genetic diseases? You do not have to be a conspiracy theorist to see how function creep happens. It happens perfectly naturally, in that people see information which is useful and then seek to gain it; and no-one can deny that these databases are useful; the point that we are trying to make in this situation is that what people do not see when they see the utility of information is the danger and risks. I thought the evidence this morning from the people involved in medical research was extremely interesting. Yes, it is true that in the old days you could go into a doctor's surgery and get a patient's medical records off the doctor's desk, but, generally speaking, that would mean going down to a quiet street in Basingstoke, finding the doctor's surgery and going in there. Now, anyone with a computer can access that information. Just to give you some idea of the extent to which---

Q309 Chairman: Just a minute. It is not actually true, is it, that anyone with a computer can access the NHS database? If you want to let that lie as your evidence that anyone with a computer can access the NHS database, I think you need to justify it.

Dr Metcalfe: Obviously, I am generalising to a degree. The computer has to be networked and also has to be able to access the NHS network.

Q310 Chairman: That is quite a big difference, is it not, between "anyone with a computer"?

Dr Metcalfe: We are currently extraditing a man to the United States because he was able from the United Kingdom to hack into the United States Department of Defence database. Do we really suppose---. I do not think literally everyone with a computer can access that information, but I mean anyone who is skilled enough with networks, and there are a large number of people like that out there nowadays. If someone in the United Kingdom can access what is arguably the most secure defence network in the United States from here in the United Kingdom, I do not think we can afford to be blasé about the possibility that someone in China could at some point hack into our NHS database.

Q311 Chairman: Nonetheless, you take our point about being a little bit more accurate.

Ms Chakrabarti: He qualified it.

Q312 Margaret Moran: The suggestion you are making there is that these other uses should not be occurring. What would you advocate to prevent fishing? Are there limitations that could be placed on the use of this data that would give sufficient assurance, in your view, to the general public - or to yourselves, rather, because maybe the general public have a different idea?

Dr Metcalfe: I think really it has to be taken on a case by case basis, because obviously not all databases are equal and different databases work in different ways. One major source of concern, for example, is the recent European Framework Directive, which allows law enforcement agencies from across the European Union to access information held in UK law enforcement databases, which means that information could potentially be passed from police criminal records to a law enforcement agency in Lithuania. One major concern there is what assurance do we have that the end user in Lithuania will not misuse that data, because they are not subject to the same data protection standards as we are here in the United Kingdom? I think that is a very good illustration of a potential gap. We need to make sure that every end user, every person who has access to official government data is bound by the same standards. So, that is one global point I would make, particularly in relation to data-sharing across the European Union. In relation to the specific---

Q313 Margaret Moran: I want to be clear. You are saying there should not be sharing of data across Europe or beyond until all of those protocols are in place. I think the parents of young Maddie might have a different view on that.

Dr Metcalfe: Certainly, I would hope so, but I would also like to think that they do not want her personal data being shared willy-nilly with people in another European Union country without sufficient data protection standards. Think of the potential risks, for example, if you allowed access to our children's database to be given to any accession country, and think of the potential risk to children that might arise from that situation, because we are not asking the same data protection standards of an accession country that we do of our own public officials in this country.

Q314 Gwyn Prosser: You have all argued in your various ways that the current legislation does not provide comprehensive data protection, that it is out of date, out of step and fails to keep pace with technological changes. I wonder if I can ask each of you briefly to describe a revision or improvement to the legislation which would correct that, and to say how we can ensure that such provision does not get outpaced by the rapid improvement in technology?

Dr Pounder: My starting position is that there needs to be a counterbalance to the data surveillance and the data-sharing that occurs. I think there are three elements to this counterbalance: one is parliamentary, the second is regulatory and the third is the individual. Starting with the individual, I think the time has come to look at a right to information privacy. The Culture, Media and Sport Committee toyed with this idea and recommended that Parliament should grasp this particular nettle. My own view is that it can be done via the Data Protection Act, a right to information privacy, and the advantage of that is that it would not disturb the relationships with the press; it would avoid that problem. On the parliamentary element, what I would like to see is a feedback loop into Parliament that could possibly result in, say, a show-stopper in respect of some proposed surveillance activity. I will try to explain what I mean. At the moment the Home Secretary, and many secretaries of state, are responsible for setting the procedures that safeguard as well as for the interference itself, and I would like to see Parliament having more of a role in setting those safeguards. For example, the Home Secretary could produce a Code of Practice in relation to X and could approach the Information Commissioner for the Commissioner's views. Instead of the Code of Practice simply being laid before Parliament, it could be approved by Parliament. So, if the Information Commissioner had problems with the Code of Practice, he could bring those problems to Parliament and Parliament could set social policy as to where the balance lay. I also think that the regulator, the Data Protection Commissioner, should have the ability to challenge regulations passed by this House (and, as you know, in the identity card legislation there are some wide-ranging powers): to go straight to the court and say, "I think these regulations are awful", so that somebody can actually challenge the lawfulness, in human rights terms, of the regulations that are made. I also think Parliament needs more information about what government intends. The bulk of the appendix in my evidence relates to how I thought that Parliament was not informed as to the true intent of the identity card, and I hope that, under the new arrangements and with respect to Gordon Brown's proposals, Parliament will be able to get the information it seeks to make informed decisions. The final thing I would say is not about the regulator but a general matter: there has to be absolute transparency about what is going on in relation to data-sharing or any surveillance, and that absolute transparency has to be backed up by people being able to do something with the information. It is pointless telling you, "Oh, there is a camera here", and so on, unless, once you have been given that information, you can do something with it. That is one reason why I think a right to information privacy is inevitable: at least the individual who is subject to the surveillance can do something with the information that he gets.

Q315 Gwyn Prosser: Dr Metcalfe, would you concur with that?

Dr Metcalfe: I would concur with that. It is very difficult for me to add anything further. Perhaps there is one point I should identify, if we are setting out wish-lists: we would argue that there needs to be prior judicial authorisation of any interception of private communications under Part I of the Regulation of Investigatory Powers Act. Currently a law enforcement agency can intercept email, telephone calls, letters and text messages simply by going to the Home Secretary and asking for a warrant. I am not saying that the Home Secretary grants them willy-nilly, but in every other common law country you find that the prior authorisations are made by an independent judicial authority. That does not happen in this country and it should.

Q316 Gwyn Prosser: Ms Chakrabarti or Mr Russell?

Mr Russell: Again, we agree with the comments that have been made, and I will not repeat them. There are another couple of points that we would make. We need to look at the Data Protection Act with specific reference to CCTV, because a large number of CCTV cameras are not regulated by the Data Protection Act at all, and we think there should be sensible, legally binding guidance or regulations on whether people have to be informed about where a CCTV camera is, who operates it, what training the operators need and the appropriateness of the placing of cameras. So, we think CCTV should be looked at. On the DNA database, we think there should be a presumption in favour of the removal of DNA from somebody who is not charged or convicted, a rebuttable presumption, because in some cases retention may be necessary. I am thinking of something like Ian Huntley: it may be necessary to keep somebody's DNA even if they are not convicted, if there are repeat allegations, but generally we think there should be a presumption in favour of removal.

Q317 Chairman: Thank you. Could I just press the parliamentary scrutiny point a bit. Dr Pounder, to some extent your evidence is slightly embarrassing for this Committee in the sense that it suggests the Home Office were able to put one over on us and on Parliament. We very clearly said there should not be a Citizens Information Project. You may have been given the impression there would not be one, and you track how officialdom kept the Citizens Information Project going for months, if not years, and how it then re-emerged as the core of the National Identity Register. Given that experience where, certainly when we were discussing the Identity Cards Bill, none of us knew that the officials were carrying on with this secret project, how can Parliament actually do the scrutiny role you want us to do?

Dr Pounder: You have invited me to say it: that is why I recommended that this Committee should recommend removing section 1(4)(e) of the Identity Cards Act.

Q318 Chairman: Remind us, for any who may be watching on the Internet link, which section that was.

Dr Pounder: It is to do with the ability to share information, using the identity card database, for a general public administration purpose. The other thing I would say is that this public administration purpose is subject to a review, the Crosby Review, which is due to report first. I have given my evidence to the Crosby reviewers and said to them that I hope that, if they are going to progress their ideas on identity management, it will be through primary legislation and not through section 1(4)(e) of the Identity Cards Act.

Q319 Chairman: Thank you. Ms Chakrabarti.

Ms Chakrabarti: I would agree with that. There are more general points here about doing more in primary legislation and less by way of regulations after the event; they do not just apply to privacy protection but to parliamentary scrutiny more generally.

Q320 Chairman: Am I right in thinking, though, that the sort of parliamentary role that you would like us as Members of Parliament to play does require some quite profound reworking of the way in which Parliament operates? You are fairly regular witnesses to this Committee, all of you actually. You know the Select Committee's strengths, but also that we are not full-time and have many other commitments. How realistic is it to ask Parliament, as you actually see it, to play the level of scrutiny role that clearly you all think, in one way or another, is the answer to some of these problems? I am not saying it is wrong, but it is a major change, is it not, to the way in which the Commons, in particular, works?

Ms Chakrabarti: Yes. There are general problems, but there is also a great opportunity at this moment to address some of them, because we have a new Government and a new Prime Minister talking very much about trying to enliven Parliament. Privacy, for the reasons we have discussed, is a particular area that would benefit. You may at some point consider having a specific privacy committee, just because the terrain is so considerable and the issues are not just constitutional, they are technological. So, with respect to your wonderful staff, you may consider some enhancement in your resources to do that job. I personally, and Liberty, would like to see the Information Commissioner enhanced too: we would like to see the Information Commissioner report to Parliament and be appointed by Parliament, and that could be true of some of the other public roles of that kind. But privacy in particular is such a qualified right, requiring such a constant public policy balancing act, that Parliament really is going to be the court that enhances and defends it.

Q321 Mr Benyon: In relation to the point that Dr Metcalfe was making a moment ago about the wish that both your organisations share, transferring the power to intercept communications from the Home Secretary to the courts: you are quite happy to quote polling evidence that supported an argument you made earlier. I suggest that the thousands of my constituents who use public transport in London, if they were polled on that, would say they would prefer it to stay with the Home Secretary because, if it went through a judicial process, it would be likely to take longer and, therefore, might put them at more risk, and at least they can get rid of the Home Secretary, if they feel he is failing, because he is elected. What do you say to that approach?

Dr Metcalfe: If the specific criticism is that prior judicial authorisation takes longer, it is worth pointing out that in Canada, Australia and the United States it is possible to get an emergency warrant without prior authorisation so long as the agency goes back to the court within 48 hours, sets out the reasons why it had to act as it did, given the nature of the emergency, and explains to the court what happened.

Ms Chakrabarti: It is not a full-blown criminal trial we are discussing here; it is just about who you trust to make this authorisation in a particular context, and we think one way to add to trust is to say a judge: not a High Court judge, perhaps something more akin to a magistrate---. It just seems appropriate that, where there is such an intrusion into the individual's private life, this is a particular role, something that a judge could do. There are many times in the context of anti-terror legislation where you and your colleagues say to the public that a control order, or this, that or the other measure, would be enhanced by judicial involvement. Sometimes we at Liberty and JUSTICE agree with you; sometimes we do not think it cures the defect; but we do think that these issues of trust can partly be addressed, not necessarily, as I say, by a very involved process, but by a judge, not a politician, issuing the warrant. We also argue in other contexts that there could be greater use made of intercept product in criminal trials, and that is a debate that rages elsewhere, including in this Committee. So, if that were to happen, and that debate is being conducted, you are going to see greater transparency in any event.

Q322 Mr Benyon: Very quickly, you are saying you can have greater safeguards and a speedy process?

Ms Chakrabarti: Yes, you can.

Q323 Mr Benyon: As opposed to what we have at the moment?

Ms Chakrabarti: It is just about who constitutionally might be the better person to issue the warrant. When people's houses are searched, as the police do and must do, because there is contraband or evidence of criminality, that is done under a warrant issued by a magistrate. Nobody finds that odd.

Dr Metcalfe: Courts make emergency orders and grant late-night injunctions all the time. You have judges who are available 24 hours a day to grant injunctions or to make orders. It would be no different with intercept.

Dr Pounder: Can I add that at the moment the Home Secretary is responsible for these organisations, for the interference as well as for the safeguards. This is an example of where you would separate the two.

Chairman: Margaret Moran, last question.

Q324 Margaret Moran: This is to Dr Pounder. I think we have touched on the issues. You have referred to data protection legislation being out of date and increasingly disjointed, as government services become more joined up and, indeed, as the technology changes. Some would argue, indeed some of the earlier witnesses and some of the research that has been done suggest, that the greater problem is not so much increasingly disjointed data protection legislation as ignorance of what the data protection legislation actually says. Can you comment on what you would do in respect of the issue of disjointed data protection and on the role of the Information Commissioner?

Dr Pounder: I did not catch the last part of the question?

Q325 Margaret Moran: What would your response be? What would you be looking to do in respect of what you see as increasingly disjointed data protection legislation? And would you like to comment on the role of the Information Commissioner in that context?

Dr Pounder: There are two elements. One of the problems, and one reason why I think the Data Protection Act is in a sense weak, is that its safeguards can be undercut by the legislation that Parliament enacts, which is why the scrutiny element matters. For example, if you look at the data protection principles, they hinge on the word "purpose". So if you have a very broad purpose, for example the efficient and effective delivery of public services, that actually negates the principle, and the Information Commissioner cannot do anything. From the Information Commissioner's perspective, what I would like to see is the ability for him to exercise powers of audit, and I think the Commissioner has asked for those. In relation to misuse of personal data, I think the Commissioner should have, shall we say, enhanced powers of prosecution. One thing I would say on the transparency area: the Government knows that the European Commission has started, or threatened, infraction proceedings on the ground that the Data Protection Act is not a proper implementation of the Data Protection Directive, and for two or three years all attempts to get at why the European Commission thinks the UK Data Protection Act is defective have basically come to nought. Of course, the Data Protection Act is central to what we are discussing today. One thing I would ask the Committee to do is to find out why the Government is refusing to publish the letter sent from the European Commission to the Department of Justice explaining why it thinks the UK Data Protection Act is deficient, and to ask the UK department, for its part, to publish why it thinks the Data Protection Act is a proper implementation, because I think that would help sort out quite a lot of the problems of understanding how data protection relates to the so-called "surveillance society".

Chairman: Thank you. That is a very helpful suggestion. Can I thank you very much indeed. I think it has been an extremely useful morning from both sets of witnesses, but particular thanks to the four of you.