The HitchHiker’s Guide to the Macintosh: Part 1

Written by: Adam Christianson

Categories: Editorial

A Brief and Warped History of the Mac, Part 1 (in which the Mac doesn’t appear).

Welcome to the first column in my series. I would at this point like to tell you what the column is going to be about, but sadly I can’t really say at the moment. I did ask Adam for some guidance on topics, but he said he’d leave it up to me – which was probably a big mistake.

All I can tell you is that this column is not going to be useful. However, it may be interesting, informative and occasionally controversial – though if it is, please don’t send too much abuse my way.

One thing I have always found incredibly interesting about the Mac is not just how it is now, but how we got here. Apple have revolutionised the world of computers time and time again, in both software and hardware. As if that weren’t ambitious enough, they’ve now revolutionised the whole music industry too.

The history of Apple is probably well documented in scraps around the internet and in several other ‘brief history’ guides out there – most of which are better written. But here is my take on the whole sordid affair – serialised over six parts.

However, if you’ve not read about the history of Apple before, sit back – but not so far back that you can’t comfortably read your screen – and enjoy the sometimes-dramatic story of a company that has done more for technology than you might think.

Our story starts way back in the 1830s. It starts as most computer stories start (in England at any rate) with a man called Charles Babbage. Babbage was an intellectual performer. He would invite wealthy people to his house and try to impress them with the wacky machines he had come up with. Soon he came up with the grandest scheme of all. While all the machines he had previously devised were designed to do a single task, he wanted to devise a machine capable of doing a multitude of tasks depending on how it was configured. Babbage’s machine was known as the Analytical Engine and was powered by a series of highly complicated gears. Alas, it was never completed in Babbage’s lifetime – although boffins at London’s Science Museum did build his earlier Difference Engine No. 2 from the original plans in 1991.

Others had been enthused by Babbage’s idea of a ‘thinking machine’ that could handle tasks that humans found complicated. However, it took an extraordinary leap of imagination by an extraordinary man working under extraordinary conditions to take the next step.
War is always a massive catalyst for technology, as scientists on each side work to outwit scientists on the other. Around 1940, during the Second World War, the British government set up a code-breaking operation at Bletchley Park – known as Station X – to decipher the codes used by the German military. It was here that Alan Turing – who many would describe as the father of modern computing – developed an electromechanical machine, the Bombe, to do much of the decoding of Enigma messages automatically.
Alan Turing was a genius, but what should not be overlooked is the contribution of a telecoms engineer called Tommy Flowers. It was Flowers who designed and built Colossus, the fully electronic machine used to break the even tougher Lorenz cipher.
The machines were so successful that the intelligence they deciphered helped win the war. Of course, all of this was top secret until fairly recently. Indeed, when in the mid-1950s the GPO (which later became British Telecom) asked Tommy Flowers to work on a computer for them, he could not say that he had ever worked on such a machine before.

In the following decades, it was largely efforts in America that began to harness the power of the electronic computer. Soon the valves used in early computers were replaced with more reliable transistors. These in turn were replaced with the revolutionary silicon microchip – the integrated circuit – invented independently by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor in the late 1950s.
In 1965, Gordon Moore observed that the number of transistors on a chip was doubling at a regular rate – a prediction now known as Moore’s Law, usually quoted as a doubling every two years. It has remained freakishly accurate even today.
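To get a feel for just how dramatic that doubling is, here is a minimal sketch of the arithmetic. The starting figure (2,300 transistors for the Intel 4004 in 1971) is a commonly quoted number used purely for illustration; real chips never tracked the curve this exactly.

```python
def transistors(year, base_year=1971, base_count=2300):
    """Estimate a transistor count assuming a doubling every two years."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

# Each decade of doublings multiplies the count 32-fold.
for year in (1971, 1981, 1991, 2001):
    print(year, round(transistors(year)))
```

Thirty years of doubling turns a few thousand transistors into tens of millions, which is why the prediction being even roughly right is so remarkable.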

Computers were generally huge machines that filled rooms until the late 1960s, when Digital Equipment Corporation, among others, developed the minicomputer. Around this time devices such as CRT displays, page printers and graphing devices were developed, but people did not consider using computers outside of business applications such as accounting, automated payrolls, ordering and billing.

Strangely enough, however, another piece of the puzzle – one that I promise will eventually resolve itself into the history of Apple – was developed years before this, in the mid-1960s, by Douglas Engelbart. In 1970 he was awarded a patent for a wooden shell with two metal wheels. The patent outlined the use of this device for inputting x,y position data into a computer system. If you haven’t already realised, we’re talking about the first computer mouse.

In the late 1960s, several computer companies and academic institutions came together to work on a solution to a problem that plagued computers at the time. Up until this point, every new computer that was built was unable to run the applications that had been designed for other equipment. This was largely due to the lack of a universal operating system that could run on a number of machines – each machine had to have its operating system developed along with the hardware.
The team included people from MIT and Bell Labs, and they produced an operating system called Multics. The effort fell on its face and, after some experimentation, Bell pulled out of the project. Forced to downgrade back to an older system, a group of engineers at Bell – Ken Thompson, Dennis Ritchie, Doug McIlroy, and J. F. Ossanna – decided to come up with their own operating system. In 1969, the team set about developing what would become Unix. Alongside it came another massive innovation: Dennis Ritchie’s C programming language.
The history of Unix could form a whole other series of columns, but it will crop up again later in our story.

This brings us to the 1970’s. It was around this time that the personal computer that we know today was beginning to take shape.
It is at this point that we can introduce the hero of our saga – if you can really call him that. He is of course Steve Jobs.
Dropping out of college, Jobs worked for Atari, where he worked on a game that a lot of people might know – Breakout (or, if you have an iPod, the game you probably know as Brick).
To help him with this project, Jobs enlisted the help of his friend Steve Wozniak. Woz had been dabbling with computer electronics and had built a small personal computer of his own. Impressed by this, and spotting a good opportunity, Jobs persuaded Woz to start selling the machine.
Apple Computer was born on 1st April 1976 to sell these machines – the Apple I. The Apple I actually came as a bare circuit board, and the owner had to build their own casing. Most Apple I cases were made of wood, and no two looked quite alike.

Building on this success, Woz came up with a new and improved design for his computer, marketed as the Apple II. I’ve not been able to confirm it from any of my online sources or magazines, but I have a hunch that it was Steve Jobs who pushed for the Apple II’s biggest feature – its plastic casing.
The Apple II could probably claim to be the first modern personal computer because, while its applications were far from personal, it was the first computer to come in the now-notorious beige case – designed by Jerry Manock. It also had a built-in keyboard and was one of the first personal computers to offer colour graphics. These features, combined with the funky striped logo designed by Rob Janoff, made the Apple II an instant hit. Sales escalated as various programmers created great applications for the Apple II, including VisiCalc.

In 1981, IBM introduced their personal computer. Running Microsoft’s DOS, it soon became the Apple II’s biggest competitor. The problem was that the PC’s architecture was developing at a much faster pace than the Apple II’s, and it was clear that Apple would have to bring out a new machine to compete with the fast-evolving IBM machine. Again, the history of the PC would probably make a nice weekly column, but not on a Mac-related website.

Read this column next week to discover the first Mac…

Please address all comments to
This column is written by Richard Tanner for the MacCast. Any errors or opinions included are entirely my fault and nothing to do with the MacCast.

There are 8 comments on The HitchHiker’s Guide to the Macintosh: Part 1:

  1. Buckdog | Jun 03 2005 - 09:40

    Wow! This is really interesting. I am looking forward to the next one!

  2. Elliot | Jun 04 2005 - 04:47


    when is the next episode goin to come out!
    cant wait any longer!

  3. rickt42uk | Jun 05 2005 - 06:05

    The idea is that I will be writing an article on a weekly basis, so I guess the next part will go up Thursday / Friday depending when Adam is able to upload it. Glad you’re enjoying it so far!

  4. Carl | Jun 08 2005 - 02:19

    Super! more please!

  5. Bruno | Jul 16 2005 - 08:38

    This is great… maybe you should have shared the whole episode when woz built that contraption that allowed people to make free phone call from public phones… I’ve always found that to be a really great factoid!

  6. Business Card Printing | Mar 15 2006 - 01:18

    Anybody know how we get an RSS feed for this blog? I am not very tech savvy and would really like to get updated info on this blog. Thanks!

  7. Bob Cohen | Dec 28 2006 - 11:36

    Thanks for this great history, but you left out a very important person in the history of computing, Admiral Grace Hopper. She was very influential in the development of computer languages and also worked during WWII on the Mark II computer calculating naval gunnery tables. She worked on the Univac I and II, for Sperry Rand, where she developed the first compiler. She was also influential in the development of both COBOL and FORTRAN.

    Admiral Hopper would make guest appearances on the Tonight Show, with Johnny Carson. She would carry a length of wire, just under a foot long, in her purse. She would pull out the piece of wire to describe a nanosecond in lay terms: the distance electricity would flow in a billionth of a second. She also coined the term “computer bug.” While working with the Mark II computer she found that a moth had jammed a relay and she called the problem a “bug”. She is also credited with originating one of my favorite phrases, “It’s easier to ask forgiveness than it is to get permission.” Usually I have to beg for forgiveness.

  8. webhosting | Jun 08 2009 - 07:00

    Can someone post the link for all of the episodes?