Select Committee on Home Affairs Minutes of Evidence


Examination of Witnesses (Questions 240 - 259)

TUESDAY 26 JUNE 2007

PROFESSOR CAROL DEZATEUX, DR IAN FORBES AND PROFESSOR SIMON WESSELY

  Q240  Chairman: The phrase "surveillance society" conjures up a rather Big Brother image, which is why we put a question mark at the end of the title to our inquiry because we want to take a balanced view. Could I ask the two professors: do you regard the sort of work that you are advocating as part of a surveillance society? Do you feel happy with that tag?

  Professor Wessely: It depends what you mean. At the moment, we are carrying out health surveillance into the health of the Armed Forces. We are looking at the rates of post-traumatic stress disorder, the rates of cancer and the rates of all sorts of other adverse outcomes. That is surveillance because it is based on medical records, cohort studies and research, but most people in that context would think that is a good thing. Certainly the members of the Armed Forces think that is a good thing and they are appreciative that this has finally been done. It all depends on the context. Health surveillance is actually a good phrase and we would agree with looking at the effects of MMR vaccine or surveying the effects of the Vioxx drug, and we are not talking about hidden cameras in supermarkets.

  Q241  Chairman: Dr Forbes, we are going to come to your evidence a little later on but do you have any comments on that opening exchange?

  Dr Forbes: I am delighted to hear this positive use of the term "surveillance" because I do not believe there is enough awareness of the way that society is constantly `surveilled' by a range of systems, mostly governance systems, to increase our knowledge of ourselves, to provide information which leads to better knowledge and better insights to wisdom so that we can make really quite crucial and large social decisions based on the information that we provide by just living or dying. This is a good example of how data can be collected, how it is managed. If you think about the safeguards that attend to medical records, they are extremely sophisticated. There is a worldwide practice about how to do this and how to manage anonymity and privacy, and yet it is used to be extremely constructive and to provide us with the sort of data that we need and I think that is missing in other areas of society.

  Q242  Ms Buck: Can I pursue the line about data? Are you confident that the medical research that you undertaking requires analysis of databases rather than research that would be done by using volunteers, for example?

  Professor Dezateux: I think that we need very large-scale evidence for a lot of the questions that are facing us now. At one level, therefore, just approaching individuals is simply not feasible. The reason we need large-scale evidence is that we are dealing often with things that are quite uncommon but about which it is important that we have reliable answers. These could range perhaps from the association of birth defects with certain drugs that are given to women in pregnancy to the relationship to birth defects with power lines, mobile phones or any of those sorts of things. Birth defects are a good example because they are uncommon and you need data from the whole country. It is clearly also obviously data that you are not going to be able to go back to necessarily and get. The importance, from an epidemiological point of view, is that our science is served by giving answers that are not biased and that are not misleading in any way, that are precise and timely and where we can also compare groups of people who are not exposed to the thing we have been asked to look at. All those things really support the need to have large-scale evidence. I can give you any number of examples of issues that in fact parliamentary committees have sat on, such as assisted reproduction and so on, where we really are tied by not knowing answers to the sorts of questions to which we should have answers because we have not been able to get access to large-scale evidence. The other point to make about this is that the kind of research that epidemiologists do is concerned to get information right at the individual level, but we are not concerned to identify or know who that individual is; we are interested in that individual because they are part of a group of individuals and we are interested in things at the group level. From that point of view, large-scale evidence based on patient data is one of the best ways that you can look at highly sensitive information because it is possible to have very good safeguards and security and you do not stand the risk when writing to somebody of exposing to somebody in their house that you know something about them inadvertently, as you might do when you have to approach at an individual level.

  Q243  Ms Buck: Professor Wessely, you said earlier that there was a robust system of governance for undertaking this kind of research but yet, at the time, problems arise and research does not necessarily go ahead or there is controversy over progress because people do not understand that. If it is as good as you say it is, why is it that people then doubt it? To what extent is that to do with concern over issues around individual consent and the anonymisation of data?

  Professor Wessely: First, most research goes ahead with consent, and that is the default position and you start from that. Obviously, for any interventional research—if you are going to give people a drug, a new test or some procedure—you have consent; if you do not, you are committing assault. We put that to one side. Research based on data usually goes with consent, where that is practical and possible. That is normally what we do, so our studies—and I have just mentioned the surveillance of the Armed Forces—are based on consent, but there are times when that is not possible. If I gave you a practical example, then it would make sense to you. We wanted to look at the association between depleted uranium and cancer, and this may appeal to Patrick. Therefore, we have 100,000 people who have been potentially exposed in the Armed Forces to depleted uranium and many of them wanted to know whether or not this had led to cancer. To do that study, it is not possible to approach 100,000 soldiers, most of whom have left the Armed Forces. Nobody knows where they live; they do not use landlines because they tend to use mobile phones now; they are almost untraceable and it would cost millions of pounds and you would miss the very ones that you want to find. First of all, you need ethics approval. In fact you need two sets of ethics approval. You cannot do any research without ethics approval. You need to use a system called the Caldicott Guardian. This is for the people who hold that data, namely the MoD or the cancer registries, to permit you to see that data. It is a very complicated procedure which you have to go through. They hold the data and they have to decide if this is a reasonable thing to do. You need permission from something called PIAG, which is a Department of Health committee, that oversees this system and adds an additional layer of governance. You have to comply with the law. You have to show that there is not any other way of doing it. You have to show that no one is going to be upset or distressed by this. You have to show that no one is going to get any individual detriment from the loss of this data. You have to show that you owe a duty of confidentiality and that there are sanctions in place if you break that. That has never happened. There have been no instances of medical researchers leaking confidential data. That is not to say it will not happen but it has not happened yet. You have to belong to an organisation that says that if you do that, you are fired. There is a whole complicated system of checks and balances in place before finally you are allowed to link members of the Armed Forces and their rates of cancer. In fact, the answer was that there is not an association. There is no other way of doing that kind of research.

  Q244  Ms Buck: What I do not understand then is with so many locks on protection of the data, why it is that anybody anywhere should ever raise concerns about proceeding with particular research?

  Professor Wessely: There is no question that many people do not know this system. I think that is quite clear from the Academy's report. A lot of people, and within the profession itself, to be frank, are ignorant of this framework, partly because it is very complicated. I think also there have been instances of misconduct in research or misconduct in the health services that tar every one with the Alder Hey brush, if you would, which are not relevant at all to what we are talking about but have created a climate of suspicion. Finally, having said that, when you look at what the patient charities in heart disease, Alzheimer's and Parkinson's want, unquestionably they want this kind of research to go ahead.

  Q245  Ms Buck: I am sure we will return to many of these points, but as my last question: the Academy's report criticised what it calls the over-rigid application of the principle of "consent or anonymise". What you have described to me is a system that has so many locks in it that it does not seem to justify any deviation from this principle. Why is it that you would be looking for, and what benefit would there be from, a reduction in protection through consent or anonymisation?

  Professor Wessely: First, "consent or anonymise" was the principle, for example behind PIAG, the idea that eventually you would either have consent or anonymisation, but that is clearly wrong. The whole point is that there are many times in research where you cannot proceed with that. Our study of Gulf veterans could not proceed on that basis because if the data had been anonymised, we would not know who they were and we would not be able to link them up with their cancer rates. So it was a flawed principle. Therefore, there are occasions when you have to be able to proceed without consent and without anonymisation. I have explained that that is unusual but it does happen. What the Academy is saying is that, first, people are not necessarily aware that these examples exist; they are not necessarily aware of the governance framework; or most of them misinterpret it to say it is "consent or anonymise", which is absolutely not the legal framework we have. It is not the framework behind Connecting for Health. It is not the framework envisaged in the Data Protection Act or the law of confidentiality.

  Professor Dezateux: The first thing to say is that one person's anonymised data and identifier is another person's research data, and so it is very difficult to look at the same piece of information and make a clear decision one way or another. Given that is the case, we have to use common sense and we have to ask what are the safeguards when we use this information to avoid disclosing the identity of a person while still being able to answer the question that we think is important. One of the interesting issues is that it all pins on individual consent. We need to move away from that paradigm a bit and ask what processes there are for community assent. It is ridiculous that we cannot answer some of these questions because that is the side on which we often find ourselves. The other thing that I hope we will have a chance to discuss is the greater clarity of the processes because with a lot of the public misunderstandings and understandings, we could do with a better communication strategy. That applies to people operating and interpreting these things at a health service level. The sort of people we need to negotiate with about access to data are sometimes also confused, and that is not surprising. At a scientific level, there are many reasons why we need to know things about people. Again, we are not interested in who that individual is, where they live, but we need to know certain things about them in order to practise better science and to produce a more reliable answer. We need to know that we are not double-counting. You need a couple of common identifiers to make sure that you only count an individual once in your data. We need to be able to follow people up in the very long term. That is becoming increasingly important. We are interested in diseases and treatments for things like cancers which might take a long time to develop, and we need that kind of information to be able to trace people. If the data are anonymised at the beginning, we cannot do that; we cannot get back to them. We need to make sure that the data we produced from our health services is respected and that it is of high quality, and that when it says this person has a condition, they really do have that condition. In fact, we know that sometimes there are mistakes in the basic health service data. Research can help the health service improve the quality of its data by being able to what we call pseudonomise, to have ways of getting back to say, "Was this really what we thought it was? Do these people have this condition or have they had this operation?" Then, as I mentioned right at the beginning, without data that is potentially disclosive, such as postcodes and so on, we are not going to be able to say whether cancers are related to power lines or living near nuclear power stations or any of those kinds of questions that I believe you would have an interest in getting the answers to as well.

  Q246  Gary Streeter: You have talked about checks and balances but does it not worry you at all that some of the information that you are overseeing the gathering of is being used and will increasingly be used to produce some of the genetic modification type science of the future? Does that ever cause you sleepless nights?

  Professor Dezateux: There are serious issues that society needs to debate about the use and application of genetic advances. I do not think that those issues will be resolved by saying, "We are just not going to use the data at all". Indeed, I think that where it is helpful is for us for example to be able to link somebody's biological data to their health and their future health status, you can begin to put boundaries round your uncertainty about what the meaning of this genetic change is—whether it is helpful to know about it, whether it has any implications at all—and then inform the health services as to how appropriately we should be using that test within the health service and, bearing in mind that there is a very big private market in genetic testing, to inform the public really about whether it is wise or advisable to get tests that might identify a prediction for disease or not.

  Professor Wessely: If you look at the UK Biobank, which is an example of research in genetics, 60,000 people were approached and asked to take part and only 50 people objected and of those 50, 30 then consented to take part in the study. So, within the framework of research, people are confident in the use of genetic information to study disease and are willing to participate.

  Q247  Martin Salter: Professor Wessely and Professor Dezateux, obviously you would be disappointed if we had not put you under intense surveillance before you came before us. Because we have a phalanx of staff, we have dug up your interchange with Dr Richard Taylor at the Health Committee on 7 June, which is very informative. My question relates to that. Just to paraphrase, Richard was questioning your contention that perhaps medical researchers should be given more access to patient records than even the police. We want to give you an opportunity to expand on that. It appears from the exchange that you were implying that the public would trust you perhaps more than they would trust the police. You may be right.

  Professor Wessely: I am not implying that. I think that is a statement of fact, is it not?

  Q248  Martin Salter: And your evidence is?

  Professor Wessely: I think I mention it. We did a study around the polonium incident about who do you trust to manage this incident, and doctors and scientists rated much higher than the Home Office and the police force. More seriously, there is a misapprehension there. If you want to look at personal data, the Data Protection Act actually says quite specifically that you cannot do that to make any decision about an individual without their consent. The police would only want to access data to make a decision about an individual: is he a crook, or whatever. That is specifically illegal. You cannot do that and we cannot do that, but that is not why we want to look at data. We are interested in not one child with autism or one child who has had a vaccine. We are interested in all children with autism and all children who have had the MMR. In that sense, it is not a personal piece of data about that individual; it is about all the children who came under that category. That is the framework for which we use that data, and that is the framework that is already permitted by the law for the public good. That is the specific framework that denies the police access to the same data for making individual decisions. As a normal citizen, I can tell the difference between wanting to find out if living near a power station causes leukaemia and wanting to find out if I have been naughty with my taxes or whatever it is. These are chalk and cheese and I think most people accept and can understand that.

  Professor Dezateux: The answer is that both the police and biomedical researchers are subject to constraints, and I do not think we have unconstrained access to data. Simon has indicated quite how many approvals and permissions we have to have. The issue of whether we should be constrained in the same way as the police depends on why you are constraining the police and why you would constrain medical researchers. I suppose the thing that unites the police and biomedical researchers is that we are both interested in the elimination of doubt and the reduction of uncertainty but for our purposes, we need to have large-scale unbiased evidence to reduce uncertainty and make sure we have the right answer. The public's concern about the police of course is that they will get the wrong answer and somebody will go down who is innocent. In our instance, getting the wrong answer has public implications that it is really important to avoid. We do need that access but I think we should make the point here that we are not asking for unconstrained access, and we very much support the checks and balances that are there, but they need to be looked at in a flexible way. Also, I hope we will be able to come to discussions about how they can be improved.

  Professor Wessely: You have already foreseen this in the legislation by saying that personal information cannot be used to that individual's detriment. If I was to do that—I cannot think how I would or why I would—I would be breaking the law and committing an offence and I am going to be in deep trouble. So there already is a framework to prevent the Orwellian implications, as it were.

  Professor Dezateux: May I add that I think it is important that we are accountable. I consider myself a public servant in my research, as a lot of epidemiologists do. We are quite happy to be accountable for the work that we do and our approaches. Indeed, we are audited and have had our systems looked at in relation to the Data Protection Act and so on on quite a regular basis within our university.

  Q249  Martin Salter: Thank you for your very full answers. How confident are you that medical researchers, having navigated the various checks and balances that you have in place, then are able to access for perfectly legitimate reasons personal patient information? How confident are you that that remains secure and could not be leaked to people who should not receive it?

  Professor Wessely: There are two answers. First, when we took evidence for the report, the Information Commissioner confirmed that they had no reports to them of what you are describing happening. They had had reports of receptionists seeing things that they should not and all sorts of things within medicine of data violations but not involving medical research. I am not saying that will not ever happen but so far it has not. The second thing is that if we were to do that, and particularly in the new electronic health systems, you would leave a massive electronic fingerprint all over the place. It is quite straightforward. In my university, and I think in all universities, that is the end of your job, and it is also the end of your career because you would be up before the GMC. Even if you were to do that, you would leave a trace and that is it.

  Professor Dezateux: When I started in research and I went into a primary care record, the Lloyd George record, I could read everything about that patient just by being handed the envelope, but now with technological developments, there is access control and audit trails. I think the chances of leaving a set of notes out on the table that somebody else can read are very much less. The computing infrastructure is much stronger.

  Q250  Mrs Dean: Moving on to the NHS database, what opportunities will it open up in terms of medical research and epidemiology and what safeguards are being built in to protect unauthorised access to patients' records? Are those satisfactory?

  Professor Dezateux: From what we have said before, it is clear that electronic patient records will provide a huge advance because they will allow us very effective access to the large-scale data that we need. I will not rehearse the issues about that. One of the things it allows us to do is to be inclusive in our research so that we do not leave certain sections of the population out. It can help us get swift answers. It helps us look at areas of medicine that we are often criticised for not spending enough time on in our research: rare disorders, under-served populations. It helps us look at demographic change in a dynamic way because we are a very changing population in the UK. One of the areas that we are interested in is that it allows us to link inter-generationally, so that a lot of the issues that we are concerned about are what happens to mothers/parents and their children and subsequent generations? You can be very powerful in answering those sorts of questions by using electronic records that can be linked by a single identifier. They are cost-effective. I think that we need to understand that after the Cooksey Report, there is a real recognition that unless we make the most of these electronic health records, we will not be able to maintain globally our competitiveness in terms of our science, and that will have economic implications for society. That is not what I come to work day to day to do but it is a very important issue, and it is an important issue for trials and providing an infrastructure for trials, just as much as it is an important infrastructure for understanding the safety of medicines. There will be investments in research that we will be able to attract if we are able to get this right and use the electronic health records effectively. The safeguards that are in place that we have described already are quite sophisticated for the kind of research that we do, and they would be appropriate for this kind of research with the electronic health records. There are plenty of examples in Scotland, in parts of England already, the Nordic countries, Australia, the US and Canada where there are systems in place that allow the data to be kept in its own home, as it were, but for the linkage to be done by somebody who does not need to look at the data, so that the researcher at the end of the day just gets the information that they want and need on a need-to-know basis. I think those kinds of safeguards are very important.

  Q251  Mrs Dean: Do you have anything to add to that?

  Professor Wessely: I do not really. On the technical side, I would rather hand over to our engineer.

  Dr Forbes: I am not an engineer but engineers become very nervous about single source databases, which have to be used by a large number of people and have to be designed for ease of input on a regular basis, and so there is a large number of users. The compromises that need to be made in engineering terms inevitably compromise the security of any single database. They are concerned about that and they say it is better to build in rather than build on. They say, "Let us think about how this system could fail and will fail either by misuse or by abuse or accident, all the ranges of human and technical possibilities". That needs to be thought about in advance. This has nothing to do with the way the research is conducted or the way that the protocols for research are developed. I am very happy with that. I am sure they are going to work very well. It is just that the matter of a huge amount of data being input by a wide range of users means that it is a single system; it has greater vulnerability and that needs to be acknowledged. It is a dilemma. You want the single database for all the social benefits that become possible by having the single database but that has to be weighed against the dangers. It is all about balancing those things. The engineers would say you should think about this in advance, design up-front, do lots of upstream thinking about how any human system is going to fail at some point in some ways, and work out what you are going to do about it when it does fail. I do not think you can get any further than that. You cannot produce a completely fool-proof system, so let us design it to assume that it will fail in some ways.

  Q252  Mrs Dean: When it is up and running, how can patients be reassured that their own medical details are kept confidential?

  Professor Wessely: First, if you read, for example, the Care Record Guarantee, which I think is a very sensible and remarkably plainly written document, it talks about the various checks and balances on confidentiality. Beyond that, there is also the moral and ethical framework—and I am talking specifically about medical research now because that is what I am interested in and patient care in particular—in which we work, and I do not think that climate has changed. I do not think doctors have become any less concerned with confidentiality, or the GMC has become less concerned with the advent of electronic patient records. It is presenting new systems but the ethical framework for the conduct of those systems is just the same. Personally, if I had gone to my wife's surgery in Kennington a few years ago to collect her, I could see on the desk the notes of everyone she had seen that day. They were in a big pile. Now I cannot do that; I cannot see them. We sometimes have a view of a rosy-tinted past in which doctors clutched case notes to their bosom and never let them out of their sight 24 hours a day. That is just not true. I personally feel more confident in the security of electronic matters and not least because again it is really important that if I mis-use them, the constraints on the system and the recriminations are so vast that that is a greater deterrent. In the past, I could wander into medical records and, quite frankly, if I wore a white coat, I could take out medical records; nobody would challenge me. You are right, there will be mistakes. Of course there will be errors but there is a system for correcting them and there is also a system for governance to make sure that if those are done maliciously, there will be severe penalties.

  Q253  Mr Benyon: Before I turn to Dr Forbes, a doctor in a surgery in my constituency received a fax the other day from a hospital up north about a patient on their register who had self-harmed. This was just attached to his medical notes. The next time he came in, the doctor questioned him about it, and it was perfectly obvious that they had got the wrong person; he had the same surname and the same date of birth but he was a different person. Should we be concerned that information is floating around the country when mistakes are made of that nature that can have a huge impact on that person's life and job prospects if it became public?

  Professor Wessely: Of course you should be concerned about that but that has always been the case. It has been far worse in the past. Notes could just get lost and you would never see them for years.

  Q254  Mr Benyon: I should clarify that nobody seemed to know who should correct this information, whether it was the hospital that had made the original error or whether it was the GP surgery. There seemed to be no understanding about who owns that fault.

  Professor Wessely: That is a governance issue. I am not au fait with how that works. I do know it is much easier to correct that kind of mistake now than it was in the past. It has always happened. Mistakes will be made and there will be a lot of John Smiths with a certain date of birth. Now, whoever it is, either the GP or the hospital, can alter the record whereas previously the notes will be there in perpetuity down in the bowels of the hospital and years later they will turn up and nobody would know they were mistaken, and, most importantly, the patient would not know that they were mistaken. I am not going into how this happens but now I know that patients will be able to check their records and they will be the first people who will say, "That is not me. You have made a mistake". Previously they would not know. They would have no idea what was lying on discharge summaries all round the country in the bottom of hospitals.

  Professor Dezateux: The electronic health record will improve this because if we use a unique identifier, then one John Smith will not be mixed up with another. That is important. The second point is that you will not fax it so that this terribly disclosive information is left for anyone to read; you will send it by the existing system which works with Connecting for Health, which is encrypted emails and messaging services. Thirdly, to find that out, if you can access a single electronic health record, the information on the spine is visible to the person who cared for that patient and the GP who is looking after them, so there is instant communication, and the patient who should be able to access and look at their own record.

  Q255  Mr Benyon: Dr Forbes, you have set out six clear principles to govern the use of surveillance data, the fourth of which says, "In general, public agencies should not be allowed access to private databases". Should the police and the security services be an exception to this and, if so, what conditions should be put on those exceptions?

  Dr Forbes: I would say there should be no blanket exceptions. I would say there needs to be justification for access to a private database by a public agency because when I give information to a private agency, I am giving it to them; I am consenting for them to use it for their common purposes. There may well be cases where there needs to be or there is a very good case for access to a private database. The case needs to be made on a case-by-case basis. If, after a time, you think that there are so many of these cases, we need to have a rule or a rubric which would allow the police to invoke it, I do not see any problem with that in terms of a governance procedure. The NHS is a very good example that says, "This is what we want to use this data for and we therefore generate a series of mechanisms to make sure that it is not misused". They do not say, "Do what you like with the data". I do not think there is any case anywhere for saying, "There is data. Do what you like with it" just because you are the police or the security services.

  Q256  Mr Benyon: Your fifth principle is: "Public record databases should be under the control of autonomous agencies, not government." What difference does it make?

  Dr Forbes: There is a huge difference. The difference here is between the state and the government. I do not mind providing information to the state which uses that information for purposes which are about the collective good and the benefits will be indivisible; they may or not come to me. Governments have purposes for which I may or may not have consented. I may have voted for them, I may not. They may be doing something I like, or they may not. I think it is a good principle to say that the government has to justify the use of the data of its citizens. The government does not own me; it does not have any right over me. It is the other way round in fact. It is there because the people have put the government there. We consent to the state and we elect a government, which we can get rid of. I think that is an important principle. Public trust is very important. Trust in governments goes up and down, this way and that way, for good or bad reasons. If it also is going up and down in the same way on the state, I think that is potentially damaging for society and for politics in general because ultimately you want people to honour their commitment to the state and do what they like with the government.

  Q257  Mr Benyon: Would you call the NHS an autonomous agent?

  Dr Forbes: Yes, it is an autonomous agency. As far as I know, the government, cannot say, "Give me that" and just have it.

  Q258  Mr Benyon: Your sixth principle relates to the penalties for misuse and you say that they should reflect the damage and distress that the system failure or crime causes. However, we all know that sentencing usually reflects not only the consequence of the offence but the culpability of the offender. Do you accept that?

  Dr Forbes: Yes, I think that is a fair consideration.

  Q259  Mr Benyon: Do you think, for example, that leniency should be shown in the case of a teenager who is particularly skilled at hacking and finds his way into personal data for kicks rather than for any malicious intent?

  Dr Forbes: I do not know what leniency would mean. I think that you would treat a teenager as a teenager, first of all. I do not know about you but I am not lenient with bad behaviour.


 
previous page contents next page

House of Commons home page Parliament home page House of Lords home page search page enquiries index

© Parliamentary copyright 2008
Prepared 8 June 2008