
Years of transformation: 1984 and the Mac

Can you remember a milestone year, or moment in your career? Or in the broader landscape of technology? I’ve lived through a few, and perhaps we are living through such a moment right now (I believe so).

And that’s prompted me to look back on milestone years on the path to today’s computing.

Years of Transformation, the series

Last time it was 1981, and the birth of the PC. Today it's not long after: 1984, and the birth of the Macintosh and the GUI.

[Image: a 1980s computer in an office setting–what Midjourney imagines an early Mac looked like]

Apple had been a dominant player in the late 1970s early microcomputer days, with the Apple II first released in 1977 (the earlier Apple I was strictly a hobbyist device, and preceded the Apple II by only a year or so).

Unlike IBM’s approach of using largely third-party, commodity hardware and software, Apple’s approach (a philosophy that has remained essentially unchanged ever since) was a mixture of off-the-shelf and custom-designed components. No one was going to clone this bad boy.

But while the Apple II gained a strong foothold in education (a market position that, with this and subsequent products, probably did more than anything to keep Apple alive for many years) and even among business users (the first spreadsheet, VisiCalc, was originally developed for the Apple II), IBM’s 1981 Personal Computer came to dominate the emerging personal computer market–indeed giving these machines their name. Until then they had typically been called microcomputers (hence ‘Microsoft’).

Like other microcomputers, and the PC, the user interface of the Apple II was text based (what we’d now call a command line interface).

Meanwhile at Xerox’s PARC, many of what we now think of as modern personal computing’s core facets were being developed–from laser printing to Ethernet, and from the mouse to, most pertinently here, the Graphical User Interface (often referred to at the time as WIMP UIs–Windows, Icons, Mouse, Pull-down menus).

Xerox themselves attempted to capitalise on their research with the Alto and Star computers, though the high cost and genuine novelty of these systems contributed to their commercial failure.

But people were taking notice, including Apple’s Steve Jobs, who got “Apple engineers three days of access to the PARC facilities in return for the option to buy 100,000 shares” (you might think Xerox got the raw end of that deal, but if you do the math, had they held on to those shares we’re likely talking billions of dollars now).

Apple’s first GUI-driven effort was the Lisa, a project from which Jobs was ousted, despite it being named after his daughter. Ouch.

Jobs’s response was to take over an existing low-cost, text-based computer project–a successor to the Apple II that the other Steve, Steve Wozniak, had been working on–turning it into a low(er)-cost, GUI-based device. Hell hath no fury…

The Macintosh launched with the famous ‘1984’ advertisement, made by Ridley Scott, the legendary director of Blade Runner and Alien (among numerous subsequent movies, including Gladiator), which aired just once on television, during the 1984 Super Bowl. The Mac brought the GUI to mainstream computing, reshaping what it was to interact with a computer, and was marketed as ‘The computer for the rest of us’.

While the Mac never reclaimed for Apple the dominance it had had with the Apple II before the advent of the PC (but the iPhone sure did), its launch in 1984 was as significant as, or more significant than, the PC’s release three years earlier. The PC capitalised on existing paradigms and use cases. The Mac created whole new ones.

The Mac introduced to the mainstream the paradigm for computing that brought computational power to non-experts, and without which desktop publishing, the Web, and the subsequent revolutions we’ll cover would have been unlikely to take root.

It wasn’t until Windows 95, a decade later, that something remotely like the original Mac GUI came to be really widely used. But without the Mac and 1984, would computing have ever become mainstream?
