Select Committee on Science and Technology Second Report



3.1  This Chapter provides a sketch of the development of computing over the years, and outlines some of the key concepts discussed later in the Report. Greater detail on some of the technical matters is set out in Appendix 9.

Computing over the years

Early computing

3.2  Strictly speaking, a computer is any device on which computation can be performed. Throughout history, people have used tools or machines to extend their powers of mental arithmetic. The abacus has ancient origins and remains in wide use in many parts of the world.

3.3  In 1614, John Napier published a table of the logarithms he had invented. These reduced the multiplication and division of numbers to the simpler task of adding or subtracting their logarithms. He later produced an elementary calculating device based on this principle using a series of rods known as Napier's Bones. These were the precursor to the slide rules that were the constant companion of scientists and engineers before the electronic calculator.
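The principle can be illustrated with a short calculation (the code below is an illustrative sketch; the figures chosen are arbitrary examples, not drawn from the Report):

```python
import math

# Multiplying two numbers by adding their logarithms (Napier's principle):
# log(a x b) = log(a) + log(b), so adding the logarithms and taking the
# antilogarithm recovers the product.
a, b = 123.0, 456.0
log_sum = math.log10(a) + math.log10(b)   # add the logarithms...
product = 10 ** log_sum                   # ...then take the antilogarithm
print(product)                            # approximately 56088, i.e. 123 x 456
```

A slide rule performs the same addition mechanically, by placing two logarithmic scales side by side.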

Mechanical devices

3.4  In 1642, Blaise Pascal invented the first machine for addition and subtraction. Thirty years later, Gottfried Wilhelm Leibniz improved the machine so that it could also perform multiplication and division.

3.5  The most ambitious mechanical devices were those conceived by Charles Babbage in the early 19th Century. Concerned at the inaccuracies in manually (and laboriously) calculated logarithmic and other mathematical tables, he designed in the 1820s a mechanical Difference Engine to automate the generation of such tables. He pursued this, and an even more ambitious general-purpose Analytical Engine[9], until his death in 1871. None of his machines was completed in his lifetime[10].

3.6  However, many sophisticated mechanical devices were developed for a wide variety of specific tasks — particularly for financial accounting — as well as for more general purposes. The general principle of all these devices was that an ingenious series of cogs and levers would mechanically convert a variable starting position to the desired output.

3.7  During the Second World War, the British code-breaking effort at Bletchley Park was greatly helped by the calculating Bombe designed by the British mathematician, Alan Turing, drawing on the thinking in his pre-war paper about a universal computing machine. This device was more electrical than mechanical: the linkage through cogs and levers was replicated by the passage of an electrical current through electrically-operated switches (or relays). While effective, this machine was relatively slow.

The arrival of electronics

3.8  Taking Turing's pre-war thinking a stage further, experts at Bletchley Park and the Post Office Research Centre designed and built Colossus. Commissioned in 1943, this was the world's first real computing device using electronics. The electro-mechanical switching of the Bombe was replaced by much faster electronic switching using thermionic valves that were then in common use for radios and other applications.

3.9  Wartime and post-war secrecy meant that, for many years, the significant advance represented by Colossus was unrecognised — indeed, the machine and blueprints were deliberately destroyed at the end of the war[11]. However, the wheel was reinvented, not least by the US developers of ENIAC (commissioned in 1945 and used for calculating artillery trajectories). After a number of test machines in the late 1940s, electronic computing became a valuable part of advanced science.

3.10  For example, the Pilot ACE was built at the National Physical Laboratory (NPL) in 1950 and, for six years, was used for advanced scientific and engineering work, including aeronautics, crystallography and calculating bomb trajectories. The Ferranti Mark 1, closely based on developments at Manchester University[12], was the first commercial general-purpose computer when it went on the market in 1951. That was the same year in which the LEO I computer, developed by J Lyons and Company, ran the first routine office computer jobs.

3.11  Valve-driven computers were expensive to make, with much of the circuitry having to be built by hand. They were also expensive to operate and maintain. As their name suggests, thermionic valves need to be heated to work. The heating gave the valves limited lives and also weakened the associated circuitry. These early computers used tens of thousands of valves, and a machine might operate for only a few minutes before yet another valve replacement or circuit repair was required.


3.12  The power consumption and limited life of thermionic valves had inspired research into ways of replicating their functions in solid-state devices. This was fuelled by the increasing understanding of semiconducting materials that were beginning to be produced to the requisite purity. The race was won by a team at the Bell Laboratories in 1947, but their point-contact device was difficult to make.

3.13  Bell Laboratories went on to develop a more robust junction device, called a bipolar transistor. Subsequent development led to the field-effect transistor (FET), now the dominant type. Box 2 illustrates the structure of an FET and describes its operation. Key points are the following.

(a)  It consists of three zones of alternating types of semiconducting material, each of which has been carefully doped with other substances to make it either electron-rich (n-type) or electron-poor (p-type);

(b)  The flow of electricity through the transistor — between the source and the drain through the region under the gate — is regulated by a control voltage on the gate (hence the name, "gate"); and

(c)  The smaller the distance that electricity has to travel across the gate, the quicker the transistor can be operated. Transistors on modern chips are very small, with gate-lengths of about 100 nanometres (nm)[13] or about a thousandth of the thickness of the paper on which this Report is printed.
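The scale comparison in (c) can be checked with simple arithmetic (the paper thickness below is an assumed typical value of about 0.1 mm, not a figure given in the Report):

```python
# A rough check of the scale comparison in paragraph 3.13(c).
gate_length_m = 100e-9      # 100 nanometres, as stated in the text
paper_thickness_m = 0.1e-3  # about 0.1 millimetres (an assumed typical value)
ratio = gate_length_m / paper_thickness_m
print(ratio)                # 0.001, i.e. about a thousandth of the thickness
```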

3.14  It was not until the mid-1950s that transistors were made in quantity. As they run at much lower voltages, do not have to warm up and are also much smaller, they quickly replaced thermionic valves for all but a few specialised (and particularly high-power) applications. One form of the thermionic valve that remains commonplace is the cathode ray tube (CRT) used to display the image on a television or computer monitor; only now are new technologies — liquid crystal display (LCD) and plasma screens — beginning to displace the CRT.

3.15  Early transistors were made singly and thus had to be wired into circuits one at a time. Many computers were built using individual transistors. A notable example was the Ferranti Atlas (like the Mark 1, see paragraph 3.10, developed jointly with Manchester University). When it first ran in 1962, this was the most powerful computer in the world. While its commercial impact was limited, it left an inheritance of innovations that are central to high-speed processors to this day: multiprogramming, pipelining, interrupts, spooling and paged virtual memory were all first employed on the Atlas computer.

Integrated circuits

3.16  The next breakthrough came from Fairchild Semiconductor where, capitalising on the process devised by Jean Hoerni in 1957 for creating a layered structure on a slice or chip of silicon, Robert N Noyce developed the integrated circuit in which transistors and associated circuitry were fabricated together. (A parallel breakthrough was made by Jack Kilby at Texas Instruments.) The first commercial integrated circuits[14] were made in 1958. As these greatly simplified product assembly, such chips became the standard building block for computer and other manufacturers.


3.17  For some ten years, the market was dominated by customised and increasingly complex chips. Robert N Noyce and his Fairchild colleague, Gordon E Moore, had founded Intel to pursue these opportunities. In 1971, Intel received an order for 13 complex integrated circuits for scientific calculators. That was beyond what the company could tackle, so it met the demand by producing a single general-purpose chip which could perform all 13 functions and more. This was the first microprocessor — the first computer on a chip. The recent and future development of microprocessor technology is considered further in Chapter 4.

Electronic computers

3.18  Nowadays, the term computer is invariably taken to mean an electronic machine. Moreover, it is also taken to mean a digital machine — that is, one which works by manipulating numbers.

3.19  In the early days of electronic computing, there were analogue machines that represented aspects of the physical world through continuously variable electric voltage. These had some strengths, but their weakness was inflexibility and a degree of imprecision depending on the accuracy with which voltage levels could be read. A digital computer represents its subject matter wholly as numbers[15]. Moreover, those numbers are managed in binary form[16], and can thus be represented with complete accuracy by transistor switches that are either off (for 0) or on (for 1).
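The binary principle can be illustrated briefly (the number 13 below is an arbitrary example):

```python
# Representing a number in binary (base 2), as a digital computer does.
# Each binary digit corresponds to a transistor switch: 0 = off, 1 = on.
n = 13
bits = bin(n)[2:]          # '1101' — the binary form of 13
# Reconstruct the value: each digit is a power of 2 (8 + 4 + 0 + 1 = 13).
value = sum(int(d) * 2 ** i for i, d in enumerate(reversed(bits)))
print(bits, value)         # 1101 13
```

Because each digit can only ever be 0 or 1, a row of on/off switches reproduces the number exactly, with no dependence on reading a voltage level precisely.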

3.20  Unless indicated otherwise, we use "computer" throughout this Report to mean a binary digital electronic machine.

Hardware and software

3.21  While it is the underlying technology that is at the heart of our Inquiry, computers are useful only for what they can facilitate. It is the applications that drive purchasers of computing power. Users should not have to think about the chip technology itself.

3.22  The first electronic computers were "hard-wired" — that is, the way the machines processed the input data was fixed by their circuitry. This was acceptable for dedicated applications[17], but changing the processing arrangements could require days of rewiring.

3.23  Alan Turing's vision in the 1930s was of a universal machine, the internal workings of which were configured by a program. Essentially, part of the input data would determine how the remaining input data would be processed, enabling the computer to be operated by the programmer in effectively unlimited ways. That vision was first realised by Freddie Williams and Tom Kilburn in the Manchester Baby machine which ran the world's first computer program on 21 June 1948. The following year, Maurice Wilkes[18] completed EDSAC in Cambridge, the first stored-program computer to operate a regular computing service.

3.24  The physical elements of a computer (as noted in the next paragraph) are known as the hardware. Programs — the electronic instructions loaded into a computer to make it perform in the desired way — are known as software. The hardware, however, is not a completely blank sheet. At the very least, a computer has to be hard-wired or pre-configured so that, on being switched on, it awaits its first instructions and will recognise what to do with them. Generally, computer designers build in a range of basic operations which the software is designed to exploit.

Elements of a computer

3.25  A computer has five basic physical components:-

(a)  some means of accepting input and converting it into binary form — typically a keyboard or sensor;

(b)  memory in which to store program instructions and other data (whether as input or the result of computation);

(c)  a central processor unit (CPU) for executing the instructions and manipulating the data;

(d)  some means of producing output — typically a screen, printer or instructions to some other device; and

(e)  appropriate connections between the various components.

3.26  Our main interest is in the CPU — the device that does the actual computation — where the most advanced chip technology is found. (Chips also provide a computer's main operating memory, and also help power its input and output devices.) However, a fast CPU can make a computer fast only if it is supported by adequate memory and sufficiently fast internal and external connections.

3.27  The design of CPUs and the general architecture within which they operate are essential complements of the hardware itself. These matters are discussed further in Chapter 5.

Computing performance

3.28  At the machine level, a computer program is a series of numbers to which the CPU reacts in appropriate ways. Computers have no intelligence. They need to be given highly detailed instructions covering every single step of moving and manipulating numbers around the machine to accomplish the outcome the programmer designed the application to achieve. That computers appear clever is the result of well-designed applications and the blistering speed at which CPUs work through the step-by-step instructions — modern PCs deal with billions[19] of instructions per second.
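The idea that a program is just a series of numbers can be sketched with a toy example (the "instruction set" below is entirely hypothetical and does not correspond to any real chip):

```python
# A toy illustration of paragraph 3.28: the program is a list of numbers,
# and the CPU fetches each one and reacts to it in turn.
# Hypothetical opcodes: 1 = load a value into the accumulator,
#                       2 = add a value, 3 = halt.
program = [1, 5,   # load 5
           2, 7,   # add 7
           3]      # halt

def run(program):
    acc = 0                 # accumulator register
    pc = 0                  # program counter
    while True:
        op = program[pc]    # fetch the next instruction (a number)
        if op == 1:         # load
            acc = program[pc + 1]
            pc += 2
        elif op == 2:       # add
            acc += program[pc + 1]
            pc += 2
        elif op == 3:       # halt
            return acc

print(run(program))         # 12
```

A real CPU works through exactly this kind of fetch-and-react cycle, but billions of times per second and with a much richer set of operations.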

3.29  The greater the computer speed available, the more sophisticated the applications can become. Speed can be increased in three ways:

(a)  by making the transistors and other components on the chip smaller[20] — the consequently reduced gate-length[21] means that they can switch more quickly and, because the speed of electrical signals is effectively fixed, the smaller distance between components reduces the time for signals to travel between them;

(b)  by using the greater number of components per chip to design more powerful CPUs; and

(c)  by using multiple CPUs, appropriately configured for parallel processing.

The first is achieved by improvements in manufacturing techniques as discussed in Chapter 4. The last two flow from improvements in design and architecture, discussed in Chapter 5. (The parallel processing of the last also makes specialised demands on the software, see paragraph 5.8.)
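The point in (a) about signal travel time can be made concrete with a rough calculation (the clock rate below is a representative assumed figure, and the speed of light is an upper bound on how fast electrical signals actually propagate through a chip):

```python
# Why shorter distances matter (paragraph 3.29(a)): how far a signal can
# travel, at most, in one clock cycle.
signal_speed = 3.0e8        # metres per second (speed of light, an upper bound)
clock_rate = 2.0e9          # 2 GHz, an assumed representative clock rate
distance_per_cycle = signal_speed / clock_rate
print(distance_per_cycle)   # 0.15 metres: at most 15 cm per cycle
```

At such clock rates a signal cannot cross even a desktop circuit board within a single cycle, so shrinking the distances between components on a chip directly raises the speed at which it can be driven.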

3.30  Driving CPUs at the highest possible speed provides top of the range performance. This is highly desirable for supercomputers and other computers (such as internet servers) handling large volumes of data. This is also proving essential for PCs: as users find to their cost, machines more than a few years old cannot deal with the latest applications.

3.31  However, for some purposes — particularly for dedicated computing that is built into products[22] — top speed may not be of the essence. In these cases, the miniaturisation of components means that the same speed can be achieved but at lower cost and power consumption, with obvious advantages for, respectively, sales and battery life in mobile appliances.

9   By common consent, this was the first real conception of a computer as we have come to understand it.

10   In 1992, to mark the bicentenary of Babbage's birth, the Science Museum built the Difference Engine No 2 to his original designs. It has 4,000 moving parts (excluding the printing mechanism) and weighs 2.6 tonnes.

11   The technical achievement that Colossus represented did not become public knowledge until the 1980s.

12   See paragraph 3.23.

13   A nanometre is a thousand millionth of a metre.

14   Early integrated circuits were based on bipolar transistors, but FETs are now the primary transistor technology employed on computer chips.

15   Some parts of digital computers may still include analogue aspects - for example, to convert analogue sound signals to digital data and vice versa.

16   That is, in base 2 which uses only the two digits 0 and 1 - see paragraph 15 of Appendix 9.

17   Such as the Bletchley Park Colossus, see paragraph 3.9.

18   Sir Maurice remains active in the field. His written evidence to the Inquiry is on p 232.

19   Thousands of millions.

20   As noted in paragraph 4.24, the driving force for miniaturisation is more to do with reducing the unit cost of components. Increased speed is an incidental benefit.

21   The distance across the gate of the transistor, see Box 2.

22   Commonly referred to as embedded computing.



© Parliamentary copyright 2002