Select Committee on Home Affairs Written Evidence


Memorandum submitted by Hewlett-Packard Laboratories


  Attainment of the Government's vision regarding digital services is threatened by many individuals' concern that consuming such services would increase the potential for surveillance of them and their actions. Adoption of digital service delivery infrastructures whose designs avoid the need to know the absolute identity of the service consumer would significantly reduce that potential and the concerns it creates. The technologies needed in such designs exist today and are available for use. Clear support by the Government for such system designs would provide the necessary catalyst for their widespread deployment, and in turn reduce the threat to attainment of that vision.


  1.  Hewlett-Packard strongly supports the Government's vision of:

    "Creating a country at ease in the digital world, where all have the confidence to access the new and innovative services that are emerging, whether delivered by computer, mobile phone, digital television or any other device, and where we can do so in a safe environment."[161]

  It is clear that for the desired confidence and feeling of ease to exist, all consumers of such services must not only consume them in a safe environment, but must also regard that environment as safe. In turn, such positive regard requires that their concerns about being the subject of surveillance, whether based on their own experience of surveillance, the reported experiences of others or simply a personal desire to enjoy their human right to privacy,[162] be addressed and resolved.

  2.  As the corporate research laboratory of Hewlett-Packard, we wish to submit comments to help the Committee to understand the potential role of information technology to address the privacy and trust concerns that many citizens have about surveillance. The scope of our comments includes the roles of privacy-enhancing technologies and trusted computing technologies, and the necessary rooting of their use in the human/social concept of trust.


  3.  Where surveillance, whether by government/public agencies or others, has a clear and specific purpose that is generally viewed positively, where surveillance is seen to serve that purpose efficiently, and where its actual conduct is strictly limited to that purpose, few would be expected to object to it. However, unless all those instigating, sponsoring and operating surveillance both meet those criteria and are seen to meet them, individuals' concerns are raised, which in turn influence their behaviour. Scaling this argument up from individuals to society as a whole, it can be seen that attainment of the Government's vision will be affected by the feelings that individuals en masse have about surveillance.

  4.  Applying the above logic to the online world, it is clear that surveillance can be performed by a number of parties, both on the actions of digital service consumers and on static information about them. Government statements about schemes such as the retention of communications traffic data, the National Identity Register and Cards, the National DNA Database, the NHS database and various child-oriented services and databases (eg Connexions, the Electronic Social Care Record) lack precision, clarity and stability regarding their specific purposes, operational details, controls and limitations on the use of personal data. They therefore provide no assurance that any surveillance by government/public agencies would be exclusively of the acceptable nature referred to in paragraph 3. The same can be said of private sector providers of digital services, many of whom appear to pay minimal regard to the spirit, if not the letter, of the data protection regulations' requirements regarding their privacy policies. We therefore look next at some work we have done which provides an insight into the feelings aroused in individuals by the possibility of negatively-viewed surveillance.


  5.  A research project, named Trustguide,[163] was undertaken over a period of 15 months to October 2006 by HP and BT, sponsored in part by the DTI Sciencewise[164] programme. It took the form of workshops which explored the opinions of, in total, approximately 250 citizens with a wide mix of backgrounds, ages, interests and personal values, regarding the tensions in the provision of internet enabling technologies that also fulfil personal expectations of trust, privacy and security.

  6.  It is not our intention here to describe or summarise all the findings from this project,[165] but we wish to highlight the following findings that are relevant to the purpose of this submission:

    —    Lack of control and openness leads to mistrust. Citizens want more responsibility to be taken by government, the banks and ISPs (Internet Service Provider) and guarantees to be provided.

    —    Virtually all participants referred to "risk" rather than "trust" when describing their ICT-mediated experiences, and felt more comfortable and secure when a means of restitution existed.

    —    A majority of participants believe that it is impossible to guarantee that electronic transactions or electronically held data can be secure from increasingly innovative forms of attack.


  7.  Many of the concerns uncovered by Trustguide can be addressed by breaking (or, better, never forming) the link between data that describes an individual's characteristics (or his/her actions) and data that defines that individual's absolute identity, eg, full name plus date and place of birth, or National Insurance number. For many types of digital service, the service consumer's absolute identity is not needed, and only a means of paying for the service is required. Such services can be thought of as being similar to real-world services that are paid for in cash and around which the purchaser retains anonymity, eg, a bus journey, a haircut, an entry to a cinema. For some other digital services, eg, online personal healthcare, a link between an individual's characteristics and his/her absolute identity has to exist, in order to ensure that the service is consumed by the intended person. However, even in such situations that link does not always have to be direct; as long as the service consumer can provide proof of some sort that he/she is the intended recipient of the personalised service and has the resources and mechanism to pay for it, then all the needs of both him/her and the service provider are met—the consumer's absolute identity is just not needed for this.
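The indirect link described above can be illustrated with a minimal sketch (in Python, using an entirely hypothetical issuer key and token format of our own devising, not any real scheme): an issuer signs an entitlement asserting that the bearer may consume a service and has paid, and the service provider verifies that signature without the token carrying any absolute identity.

```python
import base64
import hashlib
import hmac
import json

# Hypothetical issuer key, shared with the service provider for this sketch only;
# a real deployment would use public-key or anonymous-credential cryptography.
ISSUER_KEY = b"issuer-secret"

def issue_entitlement(service: str) -> str:
    """Issuer signs a claim that the (unnamed) bearer may consume the service."""
    claim = json.dumps({"service": service, "paid": True}).encode()
    tag = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return base64.b64encode(claim).decode() + "." + tag

def verify_entitlement(token: str, service: str) -> bool:
    """Service provider checks the signature; no identity is learned."""
    blob, tag = token.rsplit(".", 1)
    claim = base64.b64decode(blob)
    expected = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected) and json.loads(claim)["service"] == service

token = issue_entitlement("bus-journey")
print(verify_entitlement(token, "bus-journey"))  # True, yet the provider learns no identity
```

The token mirrors the cash-purchase analogy: the provider learns only that the service is paid for, just as a bus driver learns only that the fare was tendered.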

  8.  Adoption, by public and private sector entities, of digital service delivery systems whose designs minimize (or, better, avoid) the need to know the absolute identity of the service consumer and also minimize (or, better, avoid) the need for information about the service consumer, from which his/her absolute identity can be (easily) derived, would:

    —    reduce the opportunities for surveillance activity to identify observed individuals absolutely;

    —    limit those opportunities to situations where there already exists a valid need for absolute identity to be used for service-delivery reasons;

    —    ameliorate the concerns of individuals about their actions or personal information being linked to their absolute identity, for purposes they have not specifically agreed to, as a result of surveillance activities;

    —    reduce the risks of theft, loss and abuse of absolute identity information and the consequent costs to individuals and society of the associated frauds;

    —    ameliorate the concerns of individuals that their online actions increase the risk of falling victim to such fraud or even just receiving unwanted communications;

    —    reduce the costs borne by service providers to keep large volumes of absolute identity information safe from unnecessary access, secure against loss or corruption due to process/equipment failures and up-to-date;

    —    enable the observation of online activity en masse and the mining of data in large databases to continue to be done, by service providers and others, in order to provide useful aggregated information without the risk of infringing individuals' privacy;

and so increase the perceived safety of, and hence confidence in participation in, the digital economy by individuals, thus helping the Government's vision to be attained.

  9.  We do not advocate the total replacement of identity-based digital service delivery systems by those in which no identity information at all is required; to do so would allow individuals the freedom to break laws and contracts without risk of being traced and held to account. Rather, we wish to inform the Committee of the benefits to be gained if identity information demanded by a service provider, whether public or private sector, were just that required to deliver the service, and no more, thus mirroring the requirements found in the real world. Except where there is a real need otherwise, service delivery systems could be designed to allow consumers to indicate their (partial) identities by means of a set of pseudonyms, ie, tags which are not readily linkable to an absolute identity.
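The pseudonym approach just described can be sketched minimally as follows (in Python; the master secret and provider names are hypothetical illustrations). The consumer derives a distinct tag per service provider from a private master secret, so tags seen by different providers are not readily linkable to each other or to an absolute identity.

```python
import hashlib
import hmac

# Hypothetical consumer-held master secret; never disclosed to any provider.
MASTER_SECRET = b"consumer-private-secret"

def pseudonym(provider: str) -> str:
    """Derive a stable, per-provider tag not readily linkable to an absolute identity."""
    return hmac.new(MASTER_SECRET, provider.encode(), hashlib.sha256).hexdigest()[:16]

print(pseudonym("library.example"))    # stable tag seen only by this provider
print(pseudonym("transport.example"))  # a different, unlinkable tag
```

Each provider can still recognise a returning consumer by their stable tag, but two providers comparing records see unrelated tags, limiting the scope for cross-service surveillance.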

  10.  We also wish to inform the Committee that, following that principle, in many situations digital credentials that assert the right of an individual to consume a service, or assert his/her competence or capability to perform an action (eg, make payment), could be used in place of absolute identity. To repeat a point already made, for many purposes a digital service provider does not need to know the absolute identity of the service consumer—it is merely a convenient way of discovering, labelling, linking and/or tracking the various characteristics of the consumer, which in the process also permits surveillance and exposes the consumer to a range of risks.

  11.  Some credential-based systems that control access to services, both in the digital and real worlds, require the existence and participation of third parties that are trusted by both the service provider and the service consumer. Typically, such trusted third parties (TTPs) issue credentials and can later revoke them, and know the absolute identity of a service consumer; they can therefore provide a means for the link between a pseudonym or credential and its owner (ie, the service consumer) to be followed in the event that his/her absolute identity is required, eg, for law enforcement purposes.
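The escrow role of such a TTP can be illustrated with a much-simplified sketch (in Python; the class and data are hypothetical, and in practice the release decision would be governed by legal process rather than a simple flag): the TTP alone holds the mapping from pseudonym to absolute identity, releasing it only under lawful authority.

```python
# Hypothetical sketch of a TTP's identity-escrow role.
class TrustedThirdParty:
    def __init__(self) -> None:
        self._escrow: dict[str, str] = {}  # pseudonym -> absolute identity

    def register(self, pseudonym: str, absolute_identity: str) -> None:
        """Record the only copy of the pseudonym-to-identity link."""
        self._escrow[pseudonym] = absolute_identity

    def reveal(self, pseudonym: str, warrant_valid: bool) -> str:
        """Follow the link from pseudonym to owner, eg for law enforcement."""
        if not warrant_valid:
            raise PermissionError("identity released only under lawful authority")
        return self._escrow[pseudonym]

ttp = TrustedThirdParty()
ttp.register("anon-7f3a", "the consumer's absolute identity record")
print(ttp.reveal("anon-7f3a", warrant_valid=True))
```

Service providers thus never hold the absolute identity themselves; accountability is preserved without routine identification.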

  12.  These abilities of a TTP both to revoke credentials and to reveal absolute identity imply that the digital service consumer must place a high degree of trust in the TTP. However, that is no more than the high level of trust that a digital service consumer today must place in most of the service providers with whom he/she interacts; this is especially true in the case of online financial service providers and most government agencies.


  13.  There is a variety of technical approaches to providing the individual with the means to manage his/her digital identity information and to control its release and subsequent use. These range from approaches in which all communication and interaction between digital service provider and consumer is done on the basis of anonymous credentials (ie, no identity information is transferred) to those in which the service provider's identity management systems are designed to follow all the consumer's requirements regarding his/her identity information (and thus act as his/her proxy) and are verified as actually doing so.

  14.  Some of these technical approaches are being further researched and developed within the PRIME project,[166] a 4-year co-operation between 20 industrial and academic research institutions, that aims to advance the state of the art of privacy-enhancing technologies. It is part-funded by the European Union, and its scope includes technologies and system architectures, reference prototypes and application trials, all within a context provided by legal, social, economic and human factors requirements for these. Hewlett-Packard Laboratories is one of the leaders of the project. Within it we have undertaken research and development of technologies that:

    —    aid a service provider to manage the identity information, provided by a service consumer, according to the requirements of that consumer;

    —    aid the service consumer to assess the trustworthiness of the service provider's systems, ie, that they will actually manage his/her identity and other information in accordance with his/her wishes;

    —    aid the service consumer to manage the trust aspects of the device he/she uses to access the digital service;

  and work continues on these.

  15.  Note that two of the above-listed items refer to the trustworthiness of a device or a system. This term is used in a technical sense, and can be defined as the degree of reliance that a device or system will behave as specified, ie, that it has not been corrupted or subverted. Given the present level of cybercrime and likely continuation or steepening of its rate of increase, there is a growing need for both service providers and individual service consumers to have trusted mechanisms for ensuring that their systems and devices are protected against attack and to provide assurance that they have not been subverted (and warnings if they have).
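The idea of providing assurance that a system has not been subverted can be illustrated with a much-simplified measurement sketch (in Python; the file and baseline are hypothetical stand-ins, and real trusted computing roots such measurements in tamper-resistant hardware rather than software alone): components are hashed and compared against a known-good baseline, with a mismatch indicating possible corruption or subversion.

```python
import hashlib
import pathlib
import tempfile

def measure(path: str) -> str:
    """Measure a component by hashing its contents."""
    return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()

def attest(baseline: dict) -> list:
    """Return components whose measurement no longer matches the known-good baseline."""
    return [p for p, good in baseline.items() if measure(p) != good]

# Demonstration with a temporary file standing in for a critical system component.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"known-good component")
baseline = {f.name: measure(f.name)}
print(attest(baseline))  # []: measurements match, no warning needed
pathlib.Path(f.name).write_bytes(b"subverted component")
print(attest(baseline))  # the changed component, so a warning should be raised
```

Hardware-rooted schemes make the baseline and measurement process themselves resistant to tampering, which a purely software check such as this cannot guarantee.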

  16.  Hewlett-Packard Laboratories has been conducting world-leading research into such mechanisms for many years, the results of which have led to open, industry standard specifications[167] for the necessary system components and their use, and to the commercial availability of these components (eg, PCs, laptops, etc.) from a number of vendors. This research and development work continues.

  17.  Rigorously provable assertions that devices and systems are "trustworthy" are, however, only as valuable as the trust that is placed in the entity making the assertion by the individual or organisation that is considering whether or not to rely on such assertions.


  18.  The Trustguide project also found that there exists a high degree of distrust of ICT-mediated applications and services (ie, those delivered using a range of technologies), and that citizens want more responsibility to be taken by government, the banks and ISPs (Internet Service Providers), with guarantees provided. This implies that citizens would be willing to trust these entities, which in turn opens up the possibility for them to take on the roles of TTPs for individuals, and also to be part of the chain of trust that supports technical verifiers of software and systems.

  19.  The existence of such a trust infrastructure would enable the design of digital service delivery systems that rely much less on needing to know the absolute identities of their consumers.

  20.  To bring this into being would probably require initial support from government. Some reassurance that a critical mass of demand for such a trust infrastructure would be generated within a reasonably short timescale would probably be necessary to reduce adequately the business risk of investing to create the infrastructure. This may be less of an issue for financial service enterprises.

  21.  The Government's ability to provide that support satisfactorily, by itself being a pathfinder provider and operator of a trust infrastructure, is currently questionable, because of the points raised in paragraph 4. However, by making clear statements in support of reducing the use of absolute identities in digital services and by providing open commercial incentives to encourage private sector pathfinders, the Government would be widely seen to be acting to reduce the risks and incidence of exposure to unacceptable digital surveillance (refer to paragraph 3).

  22.  The Government could further enhance its trust rating by supporting the wider use of clear, precise statements of the purposes for which a digital service requests any piece of personal information, thereby helping such best practice become the norm.

  23.  Such an enhanced trust rating would increase and widen popular support for other IT-intensive government initiatives that are aimed at fighting crime and terrorism and at providing joined-up government services.


  24.  Hewlett-Packard Laboratories believes that privacy-enhancing and trusted computing technologies have a strong role to play in addressing the privacy issues raised by the increased potential for surveillance over digital service consumers. Clear statements and actions by Government to support the use of these and other technologies to reduce the use of absolute identities in digital service infrastructures will assist in removing the concerns of (existing and potential) digital service consumers over surveillance and cybercrime, and hence help attain the Government's vision of creating a country at ease in the digital world.

April 2007

161   Connecting the UK: the Digital Strategy, Cabinet Office, Prime Minister's Strategy Unit, joint report with the Department of Trade and Industry, March 2005.

162   Article 8, European Convention on Human Rights.

163   Trustguide website:

164   Sciencewise website:

165   The Trustguide Final Report is available at

166   PRIME website:

167   These have been developed by, and are available via, the Trusted Computing Group, whose website is


© Parliamentary copyright 2008
Prepared 8 June 2008