Select Committee on Science and Technology Minutes of Evidence


Memorandum by Jonathan Zittrain

  1.  My name is Jonathan Zittrain. I hold the Chair in Internet Governance and Regulation at Oxford, and much of my work focuses on PC and Internet security.

2.  The fundamental engine of digital innovation has been the generativity of both the Internet and the PCs attached to it. By "generativity" I mean the openness of each to third-party innovation. Anyone, once connected to the Internet, can offer any service or functionality without permission from gatekeepers. Similarly, PC architecture allows third parties to introduce new code to users without the PC or operating-system maker serving as a gatekeeper. With PCs and the Internet together, new code can spread remarkably easily as one user after another simply clicks "install."

  3.  This crucial benefit is also the basis for threat. Users can run new and unfamiliar code from unknown sources near-instantly, and when the code they choose to install turns out to be bad, their machines and the data they contain can be just as quickly compromised. With the advent of always-on, broadband-connected PCs, a compromised machine can become a "zombie," open to further instructions from afar and able to execute those instructions continuously, usually entirely unbeknownst to its owner.

  4.  In one notable experiment conducted in the fall of 2003, a researcher connected to the Internet a PC that simulated running an "open proxy," a condition in which many users' PCs can unintentionally find themselves.[1] Within nine hours the computer had been found by spammers, who began attempting to send mail through it. Sixty-six hours later the computer had recorded attempts to send 229,468 distinct messages to 3,360,181 would-be recipients.[2] (The researcher's computer pretended to forward the spam on, but in fact threw it away.)
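By way of illustration only (the details below are my own assumptions rather than a description of the researcher's actual setup), such a simulated open relay can be instrumented in a few dozen lines: it speaks just enough SMTP to let spammers believe their mail is being forwarded, counts the messages and intended recipients, and silently discards everything.

    # Minimal sketch (an assumption, not the researcher's actual code) of a fake
    # "open relay" honeypot: it accepts SMTP relay attempts, counts messages and
    # would-be recipients, and throws the mail away instead of forwarding it.
    import socketserver

    messages = 0
    recipients = 0

    class FakeRelayHandler(socketserver.StreamRequestHandler):
        def reply(self, line):
            self.wfile.write((line + "\r\n").encode("ascii"))

        def handle(self):
            global messages, recipients
            self.reply("220 mail.example.invalid ESMTP")   # pretend to be a mail server
            in_data = False
            while True:
                raw = self.rfile.readline()
                if not raw:
                    break
                line = raw.decode("ascii", errors="replace").rstrip("\r\n")
                if in_data:
                    if line == ".":                         # end of message body
                        in_data = False
                        messages += 1
                        self.reply("250 OK: queued (actually discarded)")
                    continue                                # discard the body itself
                verb = line.split(" ", 1)[0].upper()
                if verb in ("HELO", "EHLO"):
                    self.reply("250 mail.example.invalid")
                elif verb == "MAIL":
                    self.reply("250 OK")
                elif verb == "RCPT":
                    recipients += 1                         # count intended recipients
                    self.reply("250 OK")
                elif verb == "DATA":
                    in_data = True
                    self.reply("354 End data with <CR><LF>.<CR><LF>")
                elif verb == "QUIT":
                    self.reply("221 Bye")
                    break
                else:
                    self.reply("250 OK")
            print(f"totals so far: {messages} messages, {recipients} recipients")

    if __name__ == "__main__":
        # Port 2525 avoids the need for root privileges; a real honeypot would listen on 25.
        with socketserver.ThreadingTCPServer(("0.0.0.0", 2525), FakeRelayHandler) as srv:
            srv.serve_forever()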

  5.  Statistics from the US Computer Emergency Response Team Co-ordination Center (CERT/CC) reflect a sea change. The organization documented the number of attacks against Internet-connected systems—called "incidents"—from its founding in 1988; the trend in those figures is summarized in the next paragraph.

  6.  The increase in incidents since 1997 has been roughly geometric, doubling each year through 2003. CERT/CC announced in 2004 that it would no longer keep track of the figure, since attacks had become so commonplace and widespread as to be indistinguishable from one another.[3]


  7.  There are several undesirable ways to address the security problem. The transformation from open, generative PC to Internet appliance is one. In the face of a major security breach, or fear of one, consumers will rightfully clamour for the kind of reliability in PCs that they demand of nearly every other appliance, whether a coffeemaker, a television set, a BlackBerry, or a mobile phone. This reliability may be offered through a clamp on the ability of code to instantly run on PCs and spread to other computers, a clamp applied either by the network or by the PC itself. The infrastructure is in place to apply such a clamp. Both Apple and Microsoft, recognizing that most PCs these days are Internet-connected, now configure their operating systems to be updated regularly by the companies, often automatically. This stands to turn vendors of operating-system products into service-providing gatekeepers with the potential to regulate what can and cannot run on a PC. So far, consumers have chafed at clamps that would limit their ability to copy digital books, music, and movies; they are likely to look very differently at those clamps when their PCs are crippled by a worm.

  8.  To be effective, a clamp must assume that nearly all executable code is suspect until the operating system manufacturer or some other trusted authority determines otherwise. This creates, in essence, a need for a license to code, one issued not by governments but by private gatekeepers. Like a driver's license, which identifies and certifies its holder, a license to code could identify and certify software authors. It could be granted to a software author as a general form of certification, or it could be granted for individual software programs.
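To make the mechanism concrete, the sketch below (a hypothetical illustration of mine, not any vendor's actual scheme) shows the simplest form such a clamp could take: the operating system computes a digest of a program and refuses to run anything the gatekeeper has not certified.

    # Illustrative sketch (an assumption, not a real vendor mechanism) of a
    # "license to code" clamp: before running a program, the system checks it
    # against a list of digests certified by a trusted gatekeeper.
    import hashlib
    import subprocess
    import sys

    # Hypothetical allowlist published by the gatekeeper: SHA-256 digests of
    # approved programs. In practice it would be signed and updated remotely.
    CERTIFIED_DIGESTS = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def digest_of(path: str) -> str:
        """Return the SHA-256 digest of the file at `path`."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def run_if_certified(path: str) -> None:
        """Run the program only if the gatekeeper has certified it."""
        if digest_of(path) not in CERTIFIED_DIGESTS:
            sys.exit(f"refused: {path} is not certified by the gatekeeper")
        subprocess.run([path], check=False)

    if __name__ == "__main__":
        run_if_certified(sys.argv[1])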

  9.  The downside to licensing may not be obvious, but it is enormous. Clamps and licenses managed by self-interested operating-system makers would have a significant impact upon the ability of new applications to be widely disseminated. What might seem like a gated community—offering safety and stability to its residents, and a predictable landlord to complain to when something goes wrong—would actually be a prison, isolating its users and blocking their capacity to try out and adopt new applications. As a result, the true value of these applications would never be fully appreciated, since so few people would be able to use them. Techies using other operating systems would still be able to enjoy generative computing, but the public would no longer be brought along for the ride.

  10.  An additional incomplete fix is the dual-machine option. Consumers, rightly fearful of security vulnerabilities latent in the generative Internet/PC grid, will demand a future in which locked-down information appliances predominate over generative PCs. One may seek the best of both worlds, however, by creating both generativity and security within a single device. To accomplish this compromise, we might build PCs with physical switches on the keyboard — switching between "red" and "green".[4] A PC switched to red mode would be akin to today's PCs: it would be capable of running any software it encountered. This mode would maximize user choice, allowing participation in unanticipated applications, such as PC-to-PC telephony, whose value in part depends on uptake by other users. Such a configuration would retain a structural vulnerability to worms and viruses, however. Hence the availability of green mode, by which the computer's processor would be directed to a different OS and different data within the same hardware. In green mode, the computer might run only approved or vetted software — less interesting, but much more reliable. The consumer could then switch between the two modes, attempting to ensure that valuable or sensitive data is created and stored in green mode and leaving red mode for experimentation. A crude division such as this has the benefit of being eminently understandable to the consumer—just as a driver can understand putting a sport utility vehicle into all-wheel drive for off-roading—while retaining much of the leverage and adaptability of today's PC.
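A rough sketch of the policy difference between the two modes follows; the mode names, data partitions, and vetted list are hypothetical and serve only to illustrate the division, not to describe a real hardware design.

    # Minimal sketch (my illustration, not a real design) of the "red"/"green"
    # split: the physical switch selects which policy and data partition applies.
    from dataclasses import dataclass

    @dataclass
    class Mode:
        name: str
        data_partition: str     # where this mode's files live
        vetted_only: bool       # whether unapproved code is refused

    RED   = Mode("red",   data_partition="/data/red",   vetted_only=False)
    GREEN = Mode("green", data_partition="/data/green", vetted_only=True)

    # Hypothetical list of programs approved by the green mode's vetting authority.
    VETTED_PROGRAMS = {"word-processor", "web-browser", "tax-software"}

    def may_run(mode: Mode, program: str) -> bool:
        """Red mode runs anything; green mode runs only vetted programs."""
        return (not mode.vetted_only) or (program in VETTED_PROGRAMS)

    if __name__ == "__main__":
        for mode in (RED, GREEN):
            for program in ("web-browser", "experimental-p2p-phone"):
                verdict = "runs" if may_run(mode, program) else "refused"
                print(f"{mode.name:5} mode, {program:22}: {verdict} "
                      f"(data kept in {mode.data_partition})")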

  11.  But such PCs give rise to new problems. For example, ISPs might offer a lower rate for connecting a green PC and a higher rate for a red one—presuming the green to be less burdensome for customer service and less amenable to network abuse. Corporate environments might offer only green PCs and thus limit the audience for available innovation. Or the green PC might be so restrictively conceived that most users would find it unpalatable and would thus continue to choose between traditional PCs and vendor-specific information appliances. Even to hypothesize a green PC is to ask that some way be found to determine which software is suitable for use on an open PC and which is not.

  12.  We can and should develop new technologies to underpin an open Net. A long-term solution doing minimal damage to the generative capacity of the network can be found in two areas of research: development of ways to measure the overall health of the Internet and of the PCs connected to it, and development of programs or methods that allow mainstream users to make informed decisions about how they use the network. A distributed application that makes large numbers of Internet-connected people aware of downloadable software and other relevant behavior could have a serious positive impact on the badware problem. An important advantage of such a program over other badware protections, such as anti-virus software, is that it can take into account the human factors in this security problem, and it can offer protection without creating new centralized gatekeepers that would reduce the overall generativity of networked PCs. Such an initiative would allow members of the general Internet public to trade simple but useful information about the code they encounter. Each would download a simple program that included a digital dashboard to display information such as how many other computers in the world were running a candidate piece of software and whether their users were, on average, more or less satisfied with their computers than those who did not run it. A gauge showing that a piece of software was nonexistent last week but is unusually popular this week might signal a cautious PC user to wait before running it. Explicit user judgments about code could be augmented with automatically generated statistics, such as how often a PC reboots or generates popup windows. By aggregating across thousands or millions of users, the dashboard could isolate and display the effects of a single piece of code. Users could then make informed individual decisions about what code to run or not run, taking into account their appetite for risk. Its success would depend on uptake, turning users into netizens, citizens of the Net.
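The following sketch (the field names and figures are hypothetical assumptions of mine, not a specification) shows how such a dashboard might aggregate volunteered reports to compare machines that run a candidate piece of software with machines that do not.

    # Sketch of dashboard aggregation: field names and thresholds are assumptions,
    # not a specification of any deployed system.
    from dataclasses import dataclass
    from statistics import mean

    @dataclass
    class Report:
        machine_id: str
        runs_candidate: bool      # does this PC run the software being judged?
        reboots_per_week: float   # automatically gathered measurement
        popups_per_day: float
        owner_satisfaction: int   # explicit user judgment, 1 (unhappy) to 5 (happy)

    def dashboard(reports: list[Report]) -> dict:
        """Compare machines running the candidate software with those that do not."""
        with_it    = [r for r in reports if r.runs_candidate]
        without_it = [r for r in reports if not r.runs_candidate]
        def summary(group):
            return {
                "machines": len(group),
                "avg_reboots_per_week": round(mean(r.reboots_per_week for r in group), 2),
                "avg_popups_per_day": round(mean(r.popups_per_day for r in group), 2),
                "avg_satisfaction": round(mean(r.owner_satisfaction for r in group), 2),
            }
        return {"running_it": summary(with_it), "not_running_it": summary(without_it)}

    if __name__ == "__main__":
        sample = [
            Report("a", True,  4.0, 12.0, 2),
            Report("b", True,  3.5,  9.0, 2),
            Report("c", False, 0.5,  1.0, 4),
            Report("d", False, 0.7,  0.5, 5),
        ]
        # A cautious user seeing more reboots and popups, and lower satisfaction,
        # among machines running the candidate might choose to wait before installing.
        print(dashboard(sample))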

  13.  Such distributed solutions can work beyond badware. They can help us to detect Internet filtering by national governments, paranoid employers, or mercenary ISPs playing with "network neutrality." They can help us to break down rising geographical barriers on the Net, pushing those who still cling to notions like regional windowing of movies and other content to consider business models more in line with abundance than scarcity.
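As a hypothetical illustration of the same principle applied to filtering, the sketch below compares what volunteers at different vantage points report seeing for the same web address and flags any vantage point whose result diverges from the consensus; the vantage names and digests are invented for the example.

    # Sketch (an illustration under my own assumptions, not a description of any
    # deployed system) of spotting possible filtering from volunteered reports.
    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class Observation:
        vantage: str          # e.g. country or ISP of the reporting volunteer
        url: str
        status: int           # HTTP status the volunteer saw
        content_digest: str   # digest of the page body the volunteer received

    def suspicious_vantages(observations: list[Observation]) -> list[str]:
        """Flag vantages whose result differs from what most reporters see."""
        outcomes = Counter((o.status, o.content_digest) for o in observations)
        consensus = outcomes.most_common(1)[0][0]
        return [o.vantage for o in observations
                if (o.status, o.content_digest) != consensus]

    if __name__ == "__main__":
        reports = [
            Observation("country-A", "http://example.com/news", 200, "digest-1"),
            Observation("country-B", "http://example.com/news", 200, "digest-1"),
            Observation("country-C", "http://example.com/news", 403, "digest-2"),
        ]
        print("possible filtering at:", suspicious_vantages(reports))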

  14.  The information produced by the community can be openly accessible and available to all, and the processes of judging applications can be completely transparent. The database and the decision-making power need not rest with a profit-driven company. With a distributed application making people aware of others' decisions about programs and of how others are protecting their computers, it becomes possible to harness the community of Internet users to inform and empower one another, so that each can decide what level of risk to undertake for what sorts of benefits.

  15.  We need to bring together people of good faith in government, academia, and the private sector for the purpose of shoring up the miraculous information technology grid that is too easy to take for granted and whose seeming self-maintenance has lulled us into undue complacency. Such a group's charter would embrace the ethos of amateur innovation while being clear-eyed about the ways in which the research Internet and hobbyist PC of the 1970s and 1980s are straining under the pressures of serving as the world's information backbone.[5]

21 October 2006




1   Luke Dudney, Internet Service Providers: The Little Man's Firewall. A Case Study in ISP Port Blocking (Dec. 9, 2003), available at http://www.securitydocs.com/library/1108 (discussing port blocking, packet blocking, and other methods that Internet service providers could employ to prevent the spread of computer viruses).

2   Id. at 5.

3   CERT has also noted the exploding number of application attacks as a threat, as websites increasingly link webpages to company databases. The Risk of Application Attacks Securing Web Applications, SecurityDocs (Jan. 7, 2005), at http://www.securitydocs.com/library/2839.

4   For a preliminary sketch of such a division, see Butler Lampson, Accountability and Freedom (2005), available at http://www.ics.uci.edu/~cybrtrst/Posters/Lampson.pdf.

5   For more information on these issues, please refer to: J. Zittrain, "The Generative Internet," Harvard Law Review, vol. 119, May 2006 and J. Zittrain, "Without a Net," Legal Affairs, January/February 2006.


 