Responsible Use of Data - Science and Technology Committee


4  Informed consent and 'terms and conditions'

36. The issue of how informed consent was obtained from users for the right to process their personal data was raised early in the inquiry and remained a central issue throughout.[85] Informed consent to participate in research is defined by the Council for International Organizations of Medical Sciences as that given:

    by a competent individual who has received the necessary information; who has adequately understood the information; and who, after considering the information, has arrived at a decision without having been subjected to coercion, undue influence or inducement, or intimidation.[86]

37. As the use of social media data and its financial value continue to grow, it is clear that social media platforms potentially have a very important asset at their disposal, but the ability of organisations and researchers to use that data is limited by law. In the UK, the relevant law is the Data Protection Act 1998, which provides that consent must be obtained from individuals before their data can be used for research purposes.[87]

38. The University of Cambridge told us that

    Signing social media platforms' terms and conditions does not necessarily correlate to informed consent, as research has shown that users sign these complicated documents without reading them in order to open their accounts.[88]

The University of Manchester described the example of an individual using the social media platform, Twitter, where "there is a process of consent as part of creating a Twitter account" but even then, "it may not be entirely clear to the account holder how their Tweets might be used for secondary purposes, including research".[89]

39. The call for ethical behaviour was not limited to the users of services. Sureyya Cansoy, techUK, reported that an "increasing number of companies" were "concerned about ethics questions" and Professor McAuley, Horizon Digital Economy Research Institute, said that "many small companies, even large ones, want to be seen to be behaving ethically and are getting somewhat annoyed at some of the unethical behaviours of others".[90] Timo Hannay, Digital Science, told us that "we want to be able to abide by any reasonable requests that users may make to remove things".[91]

40. The primary method for obtaining consent on social media platforms and their equivalents is to ask users to agree to terms and conditions when they register to use the service. Terms and conditions can be defined as "special and general arrangement[s], rule[s], requirements, standards etc. forming integral parts of a contract or agreement".[92] Dr Kevin Macnish, University of Leeds, commented on the problems with this process:

    It is widely known that these forms are rarely read in detail or understood. The print is small and the eventual use often obscure. Even if signing the form could be considered an act of consent, it is often not an act of informed consent. Caveat emptor is an unfair response if a large number are signing up for the service on a (widely recognized) limited understanding of the future use of the information they will provide.[93]

41. Professor John Preston, University of East London, told us that "people treat social media a bit like they treat the pub".[94] He continued:

    They feel that if they go into a pub and have a private conversation, it does not belong to the pub; it is their conversation. They interpret Twitter or Facebook in the same way—as a place to have a conversation.[95]

He thought that "people need to know what they are signing up to".[96]

42. It could be argued that users are compelled either to accept terms and conditions (with which they might disagree) or to relinquish access to services like social media platforms. The Internet Association maintains that "companies compete on privacy and understand that there are few barriers to switching among providers should consumers lose confidence in an online platform or service"; in practice, however, if a Tesco customer wants to collect points using the Tesco Clubcard, they have to sign up to Tesco's terms and conditions.[97] Similarly, if a person wants to connect with their friends on Facebook, it is necessary to use the Facebook service and sign up to its data use policy.[98]

43. During this inquiry we sought to understand the issues surrounding informed consent: its importance in ensuring users understood what they were agreeing to, the manner in which companies communicated that information to users, the difficulties posed by services that are often provided from outwith UK jurisdictional boundaries, and how these issues might be addressed.

Length and complexity

44. Professor David De Roure, Economic and Social Research Council, highlighted that a primary problem with terms and conditions is that "few people read [what] they are signing up to".[99] Dr Mark Elliot, University of Manchester, speculated that users just want to "get to the good stuff", and Carl Miller, Demos, told us that there was "a wonderful statistic that if you read all the terms and conditions on the internet you would spend a month every year on it".[100] Mr Miller concluded that "there has been little incentive for many platform providers to do anything other than issue 100-page documents, because everyone clicks 'Yes'", and, to change the status quo, "pressure" would have to be applied from "outside the company".[101] The lack of attention people pay to the terms and conditions was emphasised when, in 2010, GameStation temporarily added a clause to its terms and conditions that stated the company now owned the user's "immortal soul".[102] Given the widespread acceptance of the clause, GameStation concluded that 88% of signatories did not fully read the terms and conditions documents.[103]

45. To compound the problem of length, terms and conditions contracts have been found to employ unnecessarily complex language. Dr Elliot identified the complex language used in terms and conditions as a barrier to reading them: "even if you made your terms and conditions only 500 words, I still do not think people are going to read them, partly because of the language they are written in".[104] Sir Nigel Shadbolt, Web Science Trust, labelled the contracts "totally impenetrable" and "more complex than Shakespeare", requiring a reading age of "19.2 years to get through".[105] Dr Kevin Macnish, University of Leeds, noted that these terms and conditions were often accepted by children: "if we are going to make the terms and conditions understandable to schoolchildren, that gives us a level of English and understanding that I think is sensible to be aiming at".[106]

46. We wrote to a number of social media platforms (or their parent companies) to ask questions about their commitment to informed consent and their use of terms and conditions. Yahoo wrote that it provided information to users via dedicated privacy webpages;[107] LinkedIn aimed to "provide clarity" to members about how the company used their information in a similar manner,[108] claiming it had earned the trust of users by respecting members' "privacy and properly protecting their personal information".[109] Facebook explained that it had "introduced a simplified Data Use Policy", which offered "a slimmed-down, jargon-free guide to the way that Facebook manages data"[110] and is published on its privacy pages.[111]

However, with the exception of a comment by Twitter stating that "during the registration process users must give their consent to our Terms of Service, Privacy Policy and Cookie Use"[112] and a brief reference by Facebook to its Terms,[113] there was little comment on how the terms and conditions documents (and their role in indicating consent for data use) were used by those organisations.

47. Steve Wood, Information Commissioner's Office, told us that often terms and conditions documents "will come from a culture within an organisation where lawyers are heavily involved. […] It comes from the point of view that you need to cover everything".[114] He speculated that "there may be some organisations that sometimes are able to exploit the opacity of the notice".[115]

48. The Minister agreed that terms and conditions were too complex, saying that

    the idea that people read 150 pages of terms and conditions is simply laughable; it is a complete nonsense. We all know what lawyers are like—every t is crossed and every i is dotted. But the consumer needs something that is easy to understand and straightforward.[116]

He reported that "the Information Commissioner's Office is going to be the conduit for this kind of work, with the industry coming forward with proposals to simplify terms and conditions online".[117]

49. We are not convinced that users of online services (such as social media platforms) are able to provide informed consent based simply on the provision of terms and conditions documents. We doubt that most people who agree to terms and conditions understand the access rights of third parties to their personal data. The terms and conditions currently favoured by many organisations are lengthy and filled with jargon. The opaque, literary style of such contracts renders them unsuitable for conveying an organisation's intent for processing personal data to users. These documents are drafted for use in American court rooms, and no reasonable person can be expected to understand a document designed for such a niche use. We commend the Information Commissioner's Office for investigating ways to simplify the contents of terms and conditions contracts and ask the Government, in its response to this report, to detail how the public at large will be involved in arriving at more robust mechanisms for achieving truly informed consent from users of online services. Clear communication with the public has been achieved in the past, for example in the use of graphic health warnings on cigarette packets. Effective communication with the public can be achieved again.

Communicating the intentions for data use

50. When questioned about user expectations on the use of their data by social media platforms, witnesses did not believe that terms and conditions documents allowed for informed consent to be provided for the use of that data.[118] Carl Miller, Demos, held up Twitter as an example of better practice in this regard; he told us that Twitter was "unapologetically open" and the "clear statements" it used early on in the application process were "helpful in informing people's reasonable expectation about what can happen and what is possible with their [data]".[119]

51. The Information Commissioner's Office's report, Big Data and Data Protection, pointed out that the Data Protection Act 1998 requires an organisation to tell people:

    what it is going to do with their data when it collects it. It should state the identity of the organisation collecting the data, the purposes for which they intend to process it and any other information that needs to be given to enable the processing to be fair.[120]

Despite this, witnesses to this inquiry considered that the communication of how collected data may be used was rarely clear and helpful: the University of Leeds even suggested that the Government should consider "introducing legislation which requires social media platforms to provide clearer information about what happens to users' data than currently exists".[121]

52. Social media platforms outlined their policies about transparency. For example, LinkedIn told us that its transparency reports were intended:

    to provide [their] members and the general public with information about the numbers and types of requests for member data that [LinkedIn] receive from governments around the world, as well as the number of [its] responses.[122]

Similarly, Twitter told us that it "publishes a bi-annual Transparency Report to inform our users about key elements of disclosures".[123] Facebook said that it had taken "extensive steps" to put "transparency and usability at the heart of our work on privacy, data use and consent" and outlined a number of methods it used to promote transparency including "Ad Preferences", a feature that is currently available to users in the US, which is "a way for people to learn why they are seeing a particular ad, and control how we use information about them, both on and off Facebook, to decide which ads to show them".[124]

53. The Government was supportive of efforts to improve the transparency of data use, stating that it "expects businesses to be transparent and open about their use of consumer data. This transparency is essential if consumers are to feel safe and empowered in an increasingly digital marketplace".[125]

54. We consider it vital that companies effectively communicate how they intend to use the data of individuals and that if terms and conditions themselves cannot be made easier to understand, then the destination of data should be explained separately. We recommend that the Government drives the development of a set of information standards that companies can sign up to, committing themselves to explain to customers their plans to use personal data, in clear, concise and simple terms. In its response, the Government should outline who will be responsible for this policy and how it plans to assess the clarity with which companies communicate to customers. Whilst we support the Government in encouraging others to meet high standards, we expect it to lead by example. The Government cannot expect to dictate to others, when its own services, like care.data, have been found to be less than adequate. We request that the Government outline how it plans to audit its own services and what actions it plans to take on services that do not meet a satisfactory level of communication with users about the use of their personal data.

Requiring versus requesting information

55. When users sign up to access services, organisations often require that they provide personal information, usually without any accompanying explanation to justify such requirements. One Committee member observed that a flashlight (mobile phone torch) application required him to allow it to access his location before he could download it.[126] Professor Derek McAuley, Horizon Digital Economy Research Institute, agreed that this was an example of an unjustified request and suspected that the information was not needed for the application to do its job but that the company was "obviously after it for some other reason".[127] He indicated that companies were being "duplicitous in their behaviour" as they were "presenting one experience, yet asking for a lot more information".[128] In an article published by the Huffington Post entitled 'The Insidiousness of Facebook Messenger's Mobile App Terms of Service', marketing expert Sam Fiorella wrote:

    Facebook's Messenger App, which boasts more than 200 million monthly users, requires you to allow access to an alarming amount of personal data and, even more startling, direct control over your mobile device. I'm willing to bet that few, if any, of those using Messenger on Android devices, for example, fully considered the permissions they were accepting when using the app.[129]

Some of the clauses in the contract highlighted in his article allow the app to:

·  Record audio with microphone. This permission allows the app to record audio at any time without your confirmation.

·  Read personal profile information stored on your device, such as your name and contact information. This means the app can identify you and may send your profile information to others.

·  Take pictures and videos with the camera. This permission allows the app to use the camera at any time without your confirmation.[130]

To clarify why this array of permission requests was made by the Facebook Messenger application, Facebook published an explanation page on its website, stating that:

    Almost all apps need certain permissions to run on Android, and we use these permissions to help enable features in the app and create a better experience for you. Keep in mind that Android controls the way the permissions are named, and the way they're named doesn't necessarily reflect how the Messenger app and other apps use them.[131]

It listed a number of examples for permissions justifications, which included:

    Take pictures and videos: This permission allows you to take photos and videos within the Messenger app to easily send to your friends and other contacts.[132]
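
The permissions quoted above are defined and named by the Android operating system rather than by Facebook: an app declares which system-defined permissions it needs, and Android presents them to the user in the operating system's own wording, which is the point Facebook's explanation makes. As a purely illustrative sketch (not drawn from Facebook's own code), the Kotlin fragment below shows how an app on later versions of Android, which ask for permissions at run time rather than at install time, might check for and request the camera and microphone permissions; the activity name and request code are hypothetical.

    import android.Manifest
    import android.content.pm.PackageManager
    import android.os.Bundle
    import androidx.appcompat.app.AppCompatActivity
    import androidx.core.app.ActivityCompat
    import androidx.core.content.ContextCompat

    // Illustrative sketch only: a hypothetical activity requesting two of the
    // system-defined permissions discussed above (camera and microphone).
    // On Android 6.0 and later these are granted or refused by the user at run time.
    class MessagingActivity : AppCompatActivity() {

        private val permissionRequestCode = 42  // arbitrary identifier for this request

        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)

            // The permission identifiers (android.permission.CAMERA and so on) and the
            // wording the user sees are fixed by the operating system, not by the app.
            val missing = listOf(
                Manifest.permission.CAMERA,
                Manifest.permission.RECORD_AUDIO
            ).filter { permission ->
                ContextCompat.checkSelfPermission(this, permission) !=
                    PackageManager.PERMISSION_GRANTED
            }

            if (missing.isNotEmpty()) {
                // Android shows its own dialog describing each permission; the app can
                // only state which permissions it wants and explain, in its own
                // interface beforehand, why it wants them.
                ActivityCompat.requestPermissions(this, missing.toTypedArray(), permissionRequestCode)
            }
        }
    }

Under the install-time model that applied to the Messenger app discussed above, the same system-defined permissions are instead declared in the app's manifest and shown to the user as a single list to accept before installation.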

56. We were informed that the Information Economy Council, a body co-chaired by the Government and the techUK president that "brings together Government, industry and academia to drive the information economy in the UK", was "creating a set of data principles to address how we can reassure consumers in this new digital age without losing the opportunity to get the most out of technological innovations".[133] One working group, led by Professor McAuley, was working on how companies process personal information, something, he indicated, that "industries are crying out for".[134]

57. There is a qualitative difference between requesting personal information when registering for a service and requiring that same information. Companies should have a greater responsibility to explain their need to require (and retain) personal information than when they simply request it. We welcome the work of the Information Economy Council and recommend that the Government use that work to provide organisations with guidelines to aid them in deciding what information they should require and how that information, and the subsequent use of the data, might be managed responsibly. We expect the Government, in its response to this inquiry, to outline a draft timetable for when businesses might expect to receive Government-endorsed guidelines in this area.

Companies based in foreign jurisdictions

58. Several of the larger social media platforms, like Facebook and LinkedIn, are headquartered in the USA, outside the jurisdiction of the UK and the EU, and are thus subject to different pressures when producing terms and conditions.[135]

59. Steve Wood, Information Commissioner's Office, noted that data protection issues possess a "global dimension".[136] The global nature of the internet can blur the traditional physical boundaries between legal jurisdictions and cause confusion when considering regulation. Carl Miller, Demos, stated that "legal jurisdiction" was coming up against "rapid technological change and globalised information architectures".[137] This issue was aptly demonstrated by a recent dispute between Microsoft and the US courts. US Magistrate Judge James Francis in New York ruled that "internet service providers such as Microsoft Corp or Google Inc. cannot refuse to turn over customer information and emails stored in other countries when issued a valid search warrant from U.S. law enforcement agencies".[138] The judge ordered Microsoft to hand over the contents of emails stored on a server in Ireland, despite Microsoft having previously reassured global customers that their "data should not be searchable by U.S. authorities" and having said that it would fight such requests.[139]

60. Dr Mathieu d'Aquin told us that "the organisations that will have the best ability to misuse personal data are certainly private companies, especially large-scale private companies not located in the UK".[140] It was revealed, for example, that Facebook had manipulated the information presented to users in an experiment to assess users' emotional reactions to being shown posts containing positive or negative sentiments.[141] The experiment attracted controversy as it was not clear that Facebook's users had consented to take part in research which "manipulated" people's thoughts and emotions.[142] Dr Mark Elliot, University of Manchester, said that Facebook's experiment was a "clear example of misuse" of data and that a "boundary [had been] crossed" by Facebook.[143] However, Facebook's Data Use Policy indicates that users' data may be shared if Facebook has:

·  received your permission;

·  given you notice, such as by telling you about it in this policy; or

·  removed your name and any other personally identifying information from it.[144]

61. According to the Information Commissioner's Office, the proposal for the EU General Data Protection Regulation (GDPR) would "extend the scope of data protection to apply to data controllers outside the EU that are processing the personal data of people in the EU, if the processing relates to offering them goods or services or monitoring their behaviour (article 3)".[145] We understand that negotiations about the contents of the GDPR are ongoing and that, in June 2014, the Council agreed a partial general approach on limited elements of the proposal, including territorial scope and Article 3.

62. The United States has also been wrestling with where the balance should lie between facilitating new data-based businesses and protecting privacy. In response to a request for comment concerning big data and the Consumer Privacy Bill of Rights, the Internet Association, a trade association representing internet companies such as Yahoo!, wrote:

    We are concerned that any legislative proposal to address "big data" may create a "precautionary principle problem" that hinders the advancement of technologies and innovative services before they even develop.[146]

The Internet Association considered that internet service users "trust" member companies (of the Internet Association) to "use their data responsibly" and that it should be sufficient for member companies to "voluntarily abide by self-regulatory codes such as the Interactive Advertising Bureau (IAB)'s Self-Regulatory Principles, Digital Advertising Alliance's (DAA) Self-Regulatory Program, and the Network Advertising Initiative (NAI) Code of Conduct, which are subject to enforcement by the FTC [Federal Trade Commission]".[147]

63. We are also aware of the "US-EU safe harbour" [sic] agreement, which operates as an "interoperability mechanism", outlining "internationally accepted data protection principles".[148] The website for the agreement says that:

    The European Commission's Directive on Data Protection went into effect in October of 1998, and would prohibit the transfer of personal data to non-European Union countries that do not meet the European Union (EU) "adequacy" standard for privacy protection. While the United States and the EU share the goal of enhancing privacy protection for their citizens, the United States takes a different approach to privacy from that taken by the EU. In order to bridge these differences in approach and provide a streamlined means for U.S. organizations to comply with the Directive, the U.S. Department of Commerce in consultation with the European Commission developed a "safe harbor" framework and this website to provide the information an organization would need to evaluate—and then join—the U.S.-EU Safe Harbor program.[149]

The Internet Association thought that the US-EU Safe Harbour agreement must "remain strong for Internet businesses".[150]

64. In our report Malware and cybercrime we noted that the UK Government has a responsibility to protect UK citizens online, in an extension of the protections that are conferred on citizens in the offline world: a responsibility the Government accepted in its written evidence to this inquiry. As the majority of popular social media platforms are headquartered in the US, we find it essential that the Government revisit all international agreements, including the US-EU safe harbour, to ensure that they protect UK citizens. We ask that, in its response to us, the Government outline the international agreements that currently exist under which it has ensured that the data of UK citizens will be protected as well as it would be within the UK's legal jurisdiction.

A Kitemark

65. One solution to the lack of international governmental agreements on data protection, discussed during our inquiry, was the use of a 'kitemark' on the contents of terms and conditions documents.[151] Professor Derek McAuley, Horizon Digital Economy Research Institute, stated that this would provide users with confidence that any particular set of terms and conditions met a "higher standard".[152] He thought that as a result, people might be able to "reflect on what they do and do not use".[153]

66. According to Carl Miller, Demos, the potential benefits of a kitemark system may include incentivising companies "to put in plain English in a few pages what the implications of people putting their data on those platforms really is" and an independent and authoritative authentication of "whether or not the terms and conditions were clear".[154]

67. One initiative that already exists to look specifically at the clarity of documents is the Plain English Campaign, a citizen-led campaign. Since 1990 the organisation has awarded the "crystal mark", which is a "seal of approval for the clarity of a document" and is currently used internationally, applying to 21,000 documents in the UK, USA, Australia, Denmark, New Zealand and South Africa.[155] The Campaign states that it is "the only internationally-recognised mark of its kind".[156]

68. The Minister was attracted to the introduction of a kitemark. He told us that the Information Economy Council, during its work on terms and conditions, would be engaging industry to come up with proposals for a kitemark, something he considered would be welcomed by industry.[157]

69. We consider an internationally recognised kitemark to be the first step in ensuring the responsible use of the data of UK citizens by both social media platforms and other organisations. We are pleased that the Government seems to be working toward this end and recommend that, in its response to this report, it provides a draft timetable for when proposals for a kitemark can be expected.


85   See, for example, Q18 [Carl Miller]. Back

86   Council for International Organizations of Medical Sciences, 'International Ethical Guidelines for Biomedical Research Involving Human Subjects', 2002. Available at cioms.ch. Accessed 24 October 2014. Back

87   Data Protection Act, 1998.  Back

88   SMD 013 [para 13]. Note: this witness cites the research of Beninger et al, 'Research using Social Media; Users' Views', NatCen Social Research, February 2014. Available at natcen.ac.uk. Accessed 24 October 2014. Back

89   SMD 015 [para 17] Back

90   Q16 [Sureyya Cansoy]; Q89 [Professor McAuley] Back

91   Q15 [Timo Hannay] Back

92   The Law Dictionary, 'Terms and conditions', thelawdictionary.org. Accessed 24 October 2014.  Back

93   SMD 001 [para 7] Back

94   Q67 [Professor Preston] Back

95   Q67 [Professor Preston] Back

96   Q67 [Professor Preston] Back

97   The Internet Association, Request for Comments Concerning Big Data and the Consumer Privacy Bill of Rights (Docket No. 140514424-4424-01), published 5 August 2014, page 3. See also Tesco Clubcard, 'Terms and conditions', tesco.com. Accessed 24 October 2014. Back

98   Facebook, 'Data use policy', en-gb.facebook.com. Accessed 24 October 2014. Back

99   Q91 [Professor De Roure] Back

100   Q180 [Dr Elliot]; Q18 [Carl Miller] Back

101   Q27 [Carl Miller] Back

102   Joe Martin, 'GameStation: "We own your soul"', bitgamer, 15 April 2010, bit-tech.net. Accessed 24 October 2014.  Back

103   Joe Martin, 'GameStation: "We own your soul"', bitgamer, 15 April 2010, bit-tech.net. Accessed 24 October 2014.  Back

104   Q180 [Dr Elliot] Back

105   Q91 [Sir Nigel Shadbolt] Back

106   Q176 [Dr Kevin Macnish] Back

107   SMD 028. See also Yahoo, 'Yahoo privacy centre', yahoo.com. Accessed 24 October 2014.  Back

108   SMD 027 Back

109   SMD 027 Back

110   SMD 026 Back

111   Facebook, 'Data use policy', facebook.com/about/privacy. Accessed 9 September 2014. Back

112   SMD 029 Back

113   SMD 026 Back

114   Q172 [Steve Wood] Back

115   Q172 [Steve Wood] Back

116   Q209 [Ed Vaizey MP] Back

117   Q209 [Ed Vaizey MP] Back

118   Q67 [Professor Yates] Back

119   Q19 [Carl Miller] Back

120   Information Commissioner's Office, Big data and data protection, July 2014, paragraph 106, page 33. Available at ico.org.uk. Accessed 24 October 2014. Back

121   SMD 011 [para 20] Back

122   SMD 027. See LinkedIn, 'Our transparency report', linkedin.com. Accessed 24 October 2014. Back

123   SMD 029 Back

124   SMD 026 [para 5] Back

125   SMD 020 [para 11] Back

126   Q89 [Jim Dowd MP] Back

127   Q89 [Professor McAuley] Back

128   Q89 [Professor McAuley] Back

129   Sam Fiorella, 'The Insidiousness of Facebook Messenger's Mobile App Terms of Service', The Huffington Post, huffingtonpost.com. Accessed 24 October 2014. Back

130   Sam Fiorella, 'The Insidiousness of Facebook Messenger's Mobile App Terms of Service', The Huffington Post, huffingtonpost.com. Accessed 24 October 2014.  Back

131   Facebook, 'Why is the Messenger app requesting permission to access features on my Android phone or tablet?', en-gb.facebook.com. Accessed 24 October 2014. Back

132   Facebook, 'Why is the Messenger app requesting permission to access features on my Android phone or tablet?', en-gb.facebook.com. Accessed 24 October 2014. Back

133   Q5 [Sureyya Cansoy] Back

134   Q89 [Professor McAuley] Back

135   See Q67 [Professor Preston]. Back

136   Q180 [Steve Wood] Back

137   Q19 [Carl Miller] Back

138   Joseph Ax, 'U.S. judge rules search warrants extend to overseas email accounts', reuters.com, 25 April 2014. Accessed 24 October 2014.  Back

139   Joseph Ax, 'U.S. judge rules search warrants extend to overseas email accounts', reuters.com, 25 April 2014. Accessed 24 October 2014.  Back

140   Q123 [Dr d'Aquin] Back

141   BBC, 'Facebook emotion experiment sparks criticism', BBC News Online, 30 June 2014, bbc.co.uk. Accessed 24 October 2014. Back

142   BBC, 'Facebook emotion experiment sparks criticism', BBC News Online, 30 June 2014, bbc.co.uk/news/technology-28051930. Accessed 5 September 2014. Back

143   Q159 [Dr Elliot] Back

144   https://www.facebook.com/about/privacy/your-info  Back

145   Information Commissioner's Office, Big data and data protection, July 2014, paragraph 124, page 39. Available at ico.org.uk. Accessed 24 October 2014. Back

146   The Internet Association, Request for Comments Concerning Big Data and the Consumer Privacy Bill of Rights (Docket No. 140514424-4424-01), published 5 August 2014, page 2. See also National Telecommunications and Information Administration, 'NTIA Seeks Comment on Big Data and the Consumer Privacy Bill of Rights', ntia.doc.gov. Accessed 24 October 2014. Back

147   The Internet Association, Request for Comments Concerning Big Data and the Consumer Privacy Bill of Rights (Docket No. 140514424-4424-01), published 5 August 2014, pages 3 and 4. Back

148   The Internet Association, Request for Comments Concerning Big Data and the Consumer Privacy Bill of Rights (Docket No. 140514424-4424-01), published 5 August 2014, page 15.  Back

149   Export.gov, 'Welcome to the U.S.-EU Safe Harbor', export.gov. Accessed 24 October 2014. Back

150   The Internet Association, Request for Comments Concerning Big Data and the Consumer Privacy Bill of Rights (Docket No. 140514424-4424-01), published 5 August 2014, page 15. Back

151   The Kitemark is a registered certification mark owned and used by the British Standards Institute. References in this report are to an analogous mark.  Back

152   Q91 [Professor McAuley] Back

153   Q91 [Professor McAuley] Back

154   Qq 27 and 18 [Carl Miller] Back

155   Plain English Campaign, 'Crystal Mark', plainenglish.co.uk. Accessed 24 October 2014. Back

156   Plain English Campaign, 'Crystal Mark', plainenglish.co.uk. Accessed 24 October 2014. Back

157   Q210 [Ed Vaizey] Back


 

