Select Committee on Home Affairs Written Evidence


Memorandum submitted by Caspar Bowden

  Privacy Enhancing Technologies (PETs) span disparate fields of computer science research[260] (eg the cryptography of "private credentials", untraceable and unlinkable data transport mechanisms, biometric encryption, human-computer interaction and usability, policy control languages, database/statistical privacy) which in combination can create complete "privacy systems". PETs should not be regarded as a "toolbox" whose components can rectify specific privacy problems in isolation.

  Industry and academia are able and willing to develop effective PETs and privacy systems, but there is a chronic lack of awareness and interest from both data controllers and most regulators. Since data protection compliance obligations fall on data controllers, in the absence of clear incentives (regulatory or economic) to deploy PETs, it is unreasonable to expect them to become widespread through market forces or regulator exhortation alone. Sanctions sufficient to deter organisations from treating breaches of information privacy as an acceptable business contingency, and greater public awareness of privacy risk, are probably necessary to generate market demand.

  Design issues in identity systems and privacy systems are complementary and inseparable, and they require "threat modelling" of risks to both the individual and the organisation, risks which are not necessarily similar or symmetrical. The data protection principle that processing should not be "excessive" in relation to specified and legitimate purposes is crucial for reconciling conflicting interests, but the policy of the ICO to date has generally eschewed major interventions to halt disproportionate processing,[261] in contrast to some other European DPAs. The policy climate for discussion of enterprise information security in the US has been profoundly affected by the broad scale of security breaches that have come to light since the passage of security breach disclosure laws; the ICO, however, remains ambivalent about whether individuals have a right to be informed of breaches.

  Privacy is often fatally compromised in designs for large-scale identity systems through preconceptions that all transactions need ultimately to be traceable or (re-)identifiable, through extreme hypothetical cases used to justify overbroad processing of personal information, and through reliance on procedural rather than technical safeguards. Sophisticated PETs can provide much more robust bulwarks against function-creep than policy controls alone, but it must be understood that the express purpose of these technologies is to reduce the disclosure of personal information to the absolute minimum required.

  Advanced PETs for identity management have some very curious properties, painstakingly designed by a small number of world-class cryptographers over the past twenty years to achieve better outcomes for both security and privacy. However, the subtlety of the problems these techniques are designed to solve can seem very abstruse to non-specialists, and the techniques themselves can accomplish things which might seem logically impossible, or contrary to the intuition of the layperson.

  For example, it is possible to authenticate a transaction in such a way that if a cryptographic token (which proves entitlement to some service) is used only once (ie honestly and as intended), it can be mathematically guaranteed that the individual cannot be identified. However, if an attempt is made to forge or copy a token, or to use it more than once, then only in that contingency does it become possible to trace and identify the person to whom it was issued.
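
  The arithmetic behind this "one-show" property is surprisingly compact. The following Python sketch is illustrative only: the field choice, names and values are my own assumptions, and it omits the blind signatures and commitments that a deployable scheme (in the spirit of Chaum's e-cash or Brands' credentials) requires. It shows how an identity embedded as the slope of a random line remains information-theoretically hidden after a single challenge-response, yet is fully recoverable from two:

    import secrets

    P = 2**255 - 19  # a large prime modulus (illustrative choice)

    def issue_token(identity: int) -> dict:
        """Embed the holder's identity as the slope of a random line
        r = a + c*identity (mod P).  The intercept a is uniformly random,
        so any single point on the line reveals nothing about the slope."""
        a = secrets.randbelow(P)
        return {"a": a, "id": identity}

    def spend(token: dict, challenge: int) -> int:
        """Answer a verifier's random challenge with one point on the line."""
        return (token["a"] + challenge * token["id"]) % P

    def trace_double_spender(c1: int, r1: int, c2: int, r2: int) -> int:
        """Two transcripts with distinct challenges determine the line,
        and hence the embedded identity (its slope)."""
        return ((r1 - r2) * pow(c1 - c2, -1, P)) % P

    # Honest, single use: one transcript, identity unconditionally hidden.
    token = issue_token(identity=424242)
    c1 = secrets.randbelow(P)
    r1 = spend(token, c1)

    # Dishonest reuse: a second transcript lets anyone solve for the identity.
    c2 = secrets.randbelow(P)  # distinct from c1 with overwhelming probability
    r2 = spend(token, c2)
    assert trace_double_spender(c1, r1, c2, r2) == 424242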

  The above idea was essentially conceived as a way of implementing online payment services with the privacy properties of cash. A more recent and much more widely applicable technique is the ability to revoke the validity of long-lived untraceable tokens with service providers other than those that issued them, without needing to identify the owner of the token.
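
  As a toy sketch of the revocation mechanics, the following Python fragment makes strong simplifying assumptions of my own: a shared secret stands in for the zero-knowledge proofs used by real blacklistable-credential schemes, and all identifiers are invented. Each token yields a different pseudonymous tag at each service provider, so a tag can be blacklisted at one provider without identifying the holder and without linking his transactions elsewhere:

    import hashlib
    import secrets

    def tag(token_secret: bytes, verifier_id: str) -> str:
        """A per-verifier pseudonym: stable at one provider, but unlinkable
        across providers (different hash inputs, no identity inside)."""
        return hashlib.sha256(token_secret + verifier_id.encode()).hexdigest()

    # Issuance: the issuer retains a revocation handle for each token.
    # (In real schemes issuance is blind, so even the issuer cannot link
    # honest uses; keeping the raw secret is a deliberate simplification.)
    token_secret = secrets.token_bytes(32)
    revocation_handles = {"token-0001": token_secret}

    # Each service provider keeps only a local blacklist of tags.
    blacklists = {"congestion-charge": set(), "library": set()}

    def present(token_secret: bytes, verifier_id: str) -> bool:
        """The holder presents its per-verifier tag; the provider accepts
        unless that tag has been revoked locally."""
        return tag(token_secret, verifier_id) not in blacklists[verifier_id]

    # Honest use at two providers: the tags share no visible structure,
    # so colluding providers cannot link the two visits.
    assert present(token_secret, "congestion-charge")
    assert present(token_secret, "library")

    # Revocation: the issuer pushes the token's tag to one provider's
    # blacklist, identifying nobody and leaving other providers untouched.
    blacklists["library"].add(tag(revocation_handles["token-0001"], "library"))
    assert not present(token_secret, "library")
    assert present(token_secret, "congestion-charge")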

  There is now a family of such innovative cryptographic techniques which together constitute a very powerful new paradigm for combating fraud and abuse, whilst strongly preserving the privacy of honest users. It means that privacy is compromised if and only if dishonesty is detected, and thus potentially forms the basis of a new kind of social contract with citizens. The potential applications include:

    —    road-pricing and congestion charging;

    —    welfare benefits, healthcare, and social services entitlement;

    —    private sector use of data from the National Identity Register; and

    —    use of a national identity card in over-the-counter transactions.

  The fundamental legal and policy issue this raises is that, if one takes the Human Rights Act (and Article 8(1) of the ECHR) seriously, the state has a duty to limit intrusions into privacy to what is necessary, not in a general sense but case-by-case, according to the circumstances of the individual. The use of these advanced PET techniques is therefore mandated by the HRA (subject to reasonable feasibility), because they infringe privacy only to an extent that is individually proportionate. This is in stark contrast to schemes (such as the Oyster card) which rely on blanket collection of identifiable transaction data, and which are thus highly vulnerable to "function creep".

  As things stand, systems which collect all transactional data identifiably (on the basis that it cannot be predicted in advance which transactions may need to be retrospectively traced for fraud investigation) have the "side-effect" that a database of all transactions is retained for some period; and because that database exists, there is a temptation to use it for general surveillance or other purposes unrelated to its primary function.

  However, to see the connection between human rights and these advanced technologies, one has to appreciate both the counter-intuitive possibility of such "conditional identifiability" and the implications of existing ECHR jurisprudence. So far, no parliamentary inquiry in any ECHR jurisdiction has spanned this legal and technical gulf. Policy makers are simply unaware that these technical possibilities exist. Thus the legality of blanket retention of identifiable transaction data is never fundamentally questioned, because there appears to be no logical alternative if a realistic capability for audit and fraud control is to be provided.

  This is the main point I would wish the Committee to consider in this inquiry. However I would also make the following recommendations for specific reform of the Data Protection Act 1998.

    —    The right of the data subject to access their personal data should in general be exercisable online and without charge. I have more detailed proposals for the necessary technical and policy reforms.

    —    The definition of personal data in section 1, which presently might exclude data that is only "indirectly" identifiable from counting as personal, should be altered to implement Recital 26 of the Directive in full ("or by any other person").

    —    There should be a presumption that the consent of the data subject is required for processing personal data, with the onus on the controller to specify why derogation from obtaining consent is justified. The emergence of user-centric identity management technologies now makes this feasible in a way that it was not in the pre-Internet decades of data protection policy.

  Disclosure: although I am submitting this memorandum as a private individual, and not to represent the views of my employer, I feel it is proper to disclose that, between the initial drafting of this note and its completion, Microsoft has (partly in consequence of my recommendations) acquired the intellectual property of Credentica Inc and hired Dr Stefan Brands, one of the leading cryptographers in the field of advanced PETs.

8 April 2008








260   Digital Privacy: Theory, Technologies, and Practices, Alessandro Acquisti, Stefanos Gritzalis, Costas Lambrinoudakis and Sabrina De Capitani di Vimercati (eds), Auerbach Publications, 2007.

261   The affair of fingerprinting at Heathrow Terminal 5 is a rare exception.


 