Monday, April 30, 2012

A Personal History of Computer Hardware

Reading Herb Sutter's comments on changes in computer hardware ("The Free Lunch Is Over", from 2004, http://www.gotw.ca/publications/concurrency-ddj.htm, and "Welcome to the Jungle", from 2011, http://herbsutter.com/welcome-to-the-jungle) led me to think about the computers I have engaged with over the years.

I had fleeting encounters with computers as a university student; this was at a time when a whole university had just a handful of computers.  My first real engagement with computers was in the late 1960s, when I got a summer job at a computing laboratory run by CSIRO, the Commonwealth (of Australia) Scientific and Industrial Research Organisation.  The machine was a Control Data 3200, which had (I think) 32,000 24-bit words of memory.  That is 96 kilobytes (32,000 × 24 bits = 768,000 bits, or 96,000 8-bit bytes, though the term "byte" wasn't in common use then), less than one thousandth of the memory of any video card today, let alone the memory of a whole computer.  It occupied the whole of a large room, being made of discrete transistors (not integrated circuits, i.e. "chips").  Input was by punched card, one card per line of program; you put the bundle of cards in a box and waited some hours for the program to be run, since the computer required specialised human operators.  Then you looked at the printed output, found the missing comma in your program, and tried again.  The machine had four magnetic tape units (one tape held about 5 megabytes), and there was a monstrous line printer.  I think there was also a pen plotter, though I didn't use it.  As a great privilege I got to go once or twice into the machine room and actually sit at the console and type commands.

Despite all the obvious differences, the basic architecture of both the hardware and the software was remarkably similar to that which prevailed across the whole of Sutter's "Free Lunch" period, 1975-2005.  There was a single processing unit, a quantity of memory (RAM), and slower but more capacious external storage, in this case provided by the magnetic tape drives.  I did some programming in assembly language, and the underlying operations that the machine carried out (load, store, add, shift, jump, and so forth) are still there, though the way these operations are carried out inside the CPU has become much more complex, and there are new types of operation (I don't think there were any stack manipulation instructions then, let alone vector instructions).  The higher-level language was Fortran, and as far as I remember the cycle of compile (separately for each "compilation unit"), link, load, run was the same as that still used today with languages like C++.
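
To give a flavour of what those basic operations look like, here is a minimal sketch in Python of a toy single-accumulator machine.  The instruction set is invented for illustration (it is not the CDC 3200's), but the load/store/add/jump cycle is the same idea: one processor, one accumulator register, a flat memory.

    # A toy single-accumulator machine -- purely illustrative,
    # not the CDC 3200's actual instruction set.
    def run(program, memory):
        acc = 0   # the single accumulator register
        pc = 0    # program counter
        while pc < len(program):
            op, arg = program[pc]
            pc += 1
            if op == "LOAD":
                acc = memory[arg]        # memory -> accumulator
            elif op == "STORE":
                memory[arg] = acc        # accumulator -> memory
            elif op == "ADD":
                acc += memory[arg]
            elif op == "SHIFT":
                acc <<= arg              # shift left by arg bits
            elif op == "JUMP":
                pc = arg                 # unconditional jump
            elif op == "HALT":
                break

    # Compute memory[2] = memory[0] + memory[1]
    mem = [3, 4, 0]
    run([("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)], mem)
    print(mem[2])   # prints 7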

I went to England for further study, and encountered my first "departmental" computer, meaning that it belonged to the Mathematics Department, not the University as a whole.  It was a PDP-8, about the size of a bar fridge; it had (I think) the equivalent of 8 kilobytes of memory, and programs were input via paper tape.  I took a course on Lisp using this machine; it was the first interactive language I encountered, one where I could change things on the fly.  Around this time I visited a friend at Cambridge University and encountered for the first time the arrangement of numerous terminals connected to a single computer.  By this time integrated circuits were being used, though the single-chip microprocessor didn't arrive until a little later.  Hard drives were also arriving, though they were the size of washing machines or bigger.

My working life was spent in University mathematics departments, so computers were always there, though often just in the background.  The system of numerous terminals connected to a single computer, probably in another building, remained dominant for quite some time.  For a while the terminals were teletypes; they physically typed onto paper.  The Control key on computer keyboards dates from the teletype era: it was used to control the teletype by, for example, advancing the paper a line (Control-J) or ringing the bell on the teletype (Control-G).  The resulting non-printing "control characters" are still used in computer text files.  In the 1960s a character set held only 64 characters, including the control characters; there was only room for UPPER CASE letters.  When character sets with 128 characters (7 bits) came into use, lower case letters became available, and computer output became much more readable.
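
That mapping survives unchanged: holding Control strips the high bits from the letter's ASCII code (code & 0x1F), so Ctrl-G gives character 7 (BEL, the bell) and Ctrl-J gives character 10 (LF, line feed).  A quick check in Python:

    # Control-<letter> = the letter's ASCII code with the high
    # bits cleared, a convention inherited from the teletype era.
    for letter, name in [("G", "BEL (ring the bell)"),
                         ("J", "LF (advance the paper a line)")]:
        code = ord(letter) & 0x1F
        print(f"Ctrl-{letter} -> character {code}: {name}")
    # Ctrl-G -> character 7: BEL (ring the bell)
    # Ctrl-J -> character 10: LF (advance the paper a line)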

The teletypes gave way to the ubiquitous green-screen monitors, 80 characters across and 24 or 25 lines deep.  What look like descendants of these can still be seen at shop checkout counters.

At some point the mathematics typesetting program TeX arrived, and we all became amateur typesetters.  Before that, mathematical typing was a specialised skill, done by administrative staff using IBM golfball typewriters.  TeX allowed the production of better-looking results than any typewriter could achieve, but it wasn't easy to use, and really only people from mathematics and related disciplines took to it.  It was and is open-source software, and it remains the standard method of producing mathematical documents.

The next big change was the spread of personal computers.  The first one of these I got to use was an Apple II that belonged to a friend.  I went round to his place, and he sat me down in front of the machine and then went out to do some errand.  I knew that in principle I couldn't harm the computer just by pressing keys, but I was still a bit nervous (it was expensive).  I touched a key, there was a loud bang, and the computer stopped working.  The machine was full of plug-in cards, and it turned out that a sharp protrusion on one card had managed to eat its way into a capacitor on a neighbouring card, resulting in a destructive short circuit.

The first computer that I owned myself (1985) was a Commodore 64; the name indicated that it had 64 kilobytes of memory in its small plastic box, two thirds of the memory of the room-filling machine of the late 1960s.  It also had an inbuilt sound synthesiser chip, and it was the only computer I have ever used that had a genuine random number generator.  Usually there is a pseudo-random number generator, a small program that generates a deterministic sequence of numbers once the starting point is set, but the Commodore 64 could read the analogue noise generator circuit in the sound chip, which gave genuine physically-based random numbers.  The Commodore was much cheaper than the Apple, but it didn't have a floppy disk drive, only a very slow unit that stored data on audio cassettes.  It has been said that the Commodore 64 was the last computer that one person could understand all of; it even came with a circuit diagram.
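
For contrast, a pseudo-random generator of the kind just described can be written in a few lines.  The sketch below is a linear congruential generator (the constants are commonly used ones, chosen here purely for illustration); given the same seed it always produces the same sequence, which is precisely what the Commodore's noise-based generator avoided.

    # A minimal linear congruential generator: the sequence is
    # completely determined by the seed (constants illustrative).
    def lcg(seed, a=1664525, c=1013904223, m=2**32):
        state = seed
        while True:
            state = (a * state + c) % m
            yield state / m   # scale into [0, 1)

    gen = lcg(seed=42)
    print([round(next(gen), 4) for _ in range(3)])
    # Re-running with seed=42 reproduces exactly the same numbers.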

These home computers had some of the attributes of a video game console and certainly helped the evolution of computers into multi-media machines.

In 1989 the Internet proper arrived in Australia, via a satellite link to the mainland U.S. through Hawaii and the establishment of what was called AARNET by a consortium of Australian universities and the CSIRO.  Previously there had been more localised Australian networks, with international email available, though not easy to use.  A lot of the network developments happened in University computer science departments, with mathematics, physics and engineering departments not far behind.  General use outside Universities didn't start in Australia until about 1993.

At home I bought an Atari, also in 1989; I was getting involved in electronic music, and the Atari was well adapted for that.  Meanwhile, at work, workstations had arrived: desktop computers in their own right, with much better displays than the old terminals, and networked together.  A little later I got a Sun desktop computer at work.  It had 4 megabytes of memory (I think), but by default only an 80 megabyte hard drive.  This was nowhere near enough, and I got an additional 600 megabyte disk drive, which cost over $2000.  Twenty years later, a drive with 1,000 times the capacity costs around one twentieth of the price, not allowing for inflation.  I don't think anyone foresaw this extraordinary increase in hard drive capacity.

The Sun workstation had an additional piece of hardware that could be used as a sound card, though it was actually a general scientific data collector.  It contained a so-called DSP (Digital Signal Processor) chip, which for certain purposes was much faster than the main processor.  DSP chips are still used in specialised applications, including sound cards.

After that the World Wide Web appeared, via the Mosaic browser.  The IBM PC and its clones gradually became dominant; at work they were connected to a central server, and were more likely to run Linux than Windows.  I also used a PC at home; I changed to the Macintosh in 2006.

A computing-related development that came at work shortly before I retired was the establishment of an "access grid room", essentially a well-equipped and well-connected video conferencing room allowing the sharing of specialised mathematics courses between universities.  Another development late in my working life, and one related to Sutter's comments, was the building of supercomputer-class machines by hooking together a network of 100 or more PCs.  Smaller versions of these clusters were within the reach of individual University departments or research centres.  I didn't have an excuse to seek access to them.

The electronic computer was born a little before I was, but stored-program machines did not arrive until after I was born; the earliest electronic computers were not stored-program.  The transistor was also born shortly after I was, so the twin revolutions of computing as we know it and of micro-electronics have taken place in my lifetime.
