Can you remember a milestone year, or moment in your career? Or in the broader landscape of technology? I’ve lived through a few, and perhaps we are living through such a moment right now (I believe so).
And that’s prompted me to look back on milestone years on the path to today’s computing.
Years of Transformation, the series
- Years of transformation–the prehistory of how we got to here and where we go next
- Years of transformation: 1981. The IBM PC
- Years of transformation: 1984 and the Mac
- Years of transformation: 1985, Desktop Publishing
- Years of transformation: 1989? 1990? 1991? and the birth of the Web
- Years of transformation: 1993 and Mosaic
I didn’t know it in my early teens, but something was shifting in the nature of computing that would shape the coming decades in ways few, if any, could really imagine at the time.
Computers had until relatively recently been large, extremely expensive capital investments requiring teams of expert operators. They processed quite large amounts of data, even by today’s standards, but did it in batches.
First came IBM’s near-monopoly mainframes, then a slew of minicomputers from the likes of DEC, Wang, HP and Texas Instruments. While less expensive than mainframes, these still cost, in today’s terms, hundreds of thousands of dollars.
But the invention of the transistor and the integrated circuit, and Moore’s Law (coined by Intel co-founder Gordon Moore in 1965–broadly, the idea that the number of transistors you could put in a given area for the same cost would double every 18 months or so), meant that by the late 1970s the computing power of just a decade before was close to free. Suddenly that $100,000 computer (give or take) cost a few hundred dollars.
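That "close to free" claim is easy to check with a rough back-of-envelope sketch (my own illustration, not from any source): if the price of a fixed amount of computing power halves every 18 months, a dozen years is 8 halvings, a factor of 256.

```python
def price_after(initial_price, years, doubling_period_years=1.5):
    """Price of a fixed amount of computing power after `years`,
    assuming cost halves once per Moore's-law doubling period."""
    doublings = years / doubling_period_years
    return initial_price / (2 ** doublings)

# A hypothetical $100,000 mid-1960s machine's worth of compute,
# twelve years on: 12 / 1.5 = 8 doublings, so a factor of 256.
print(round(price_after(100_000, 12)))  # -> 391
```

So a $100,000 machine's worth of compute lands at a few hundred dollars, which is exactly the order of magnitude the microcomputer makers hit.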
This environment saw the explosion of microcomputers (the term itself was first used by none other than Isaac Asimov, in 1956)–from the likes of Apple, Commodore and Tandy (my first computer was a local Australian knock-off of the Tandy TRS-80, the Dick Smith System 80).
It was a time of experimentation–in hardware (all kinds of different CPUs) and in operating systems (all manner of DOSes, or Disk Operating Systems–MS-DOS was just one of many, with CP/M the heavy hitter in the space). The story of why MS-DOS, not CP/M, became the OS for IBM’s new PC a few years later is worth looking up–how close we might have come to a world without Microsoft.
Then, after this flurry of innovation in the late 1970s, IBM cobbled together the IBM Personal Computer–uncharacteristically quickly for a company known for its slow pace of development until then.
At the time Apple, Tandy, Commodore, and Atari sold computers priced at a few hundred dollars, while minicomputers sold for $20,000 or more (in today’s terms, as mentioned, $100,000 or more). Was this a market IBM could enter? And how could they do it at a fraction of the price of their existing hardware–and quickly too?
In an almost complete about-face from their closed, proprietary approach of the past (famously, you didn’t even own an IBM mainframe, you leased it from the company), IBM based their PC on existing commodity hardware, like the Intel 8088 chip. (Motorola’s 68000 family, which not long after powered the original Macintosh, was also considered and almost became the CPU of the PC, but it wasn’t ready for production. The IBM PC could easily have had a different OS and CPU, and Intel’s–and the x86 architecture’s–lock on computing for the following quarter century might never have happened.)
Several operating systems were supported, but at launch it was PC-DOS (IBM’s branding of Microsoft’s DOS) that was available, and it became the de facto standard.
But why would the famously controlling IBM enable anyone to make a clone of their PC by using commodity hardware and software? Well, they had what they thought was an ace up their sleeve.
A critical piece of the system, the BIOS (which enabled the computer to start up and the operating system to run), remained IBM’s proprietary, copyrighted code. Without it, even with all the other software and hardware, you didn’t have a functional PC. IBM’s long-running dominance of computing looked assured to continue for decades in this new era.
But several companies, most famously Compaq, legally reverse engineered the BIOS, enabling the rise of the PC clone–machines that would run (well enough) all the software written for PC-DOS–which brought about an explosion in competition for PC sales and a dramatic fall in hardware prices. PCs proliferated, finding their way into workplaces, schools and homes.
1981 set the scene for computing for the coming decades. It placed Intel and Microsoft at the centre of personal computing. By today’s standards these were incredibly slow underpowered devices. They were barely ever locally networked, let alone connected to the then nascent internet.
But our age of personal computing had begun.