A Brief and Warped History of the Mac, part 1 (in which the Mac doesn't appear).
Welcome to the first column in my series. I would at this point like to tell you what the column is going to be about, but sadly I can't really at the moment. I did ask Adam for some guidance on topic, but he said he'd leave it up to me – which was probably a big mistake.
All I can tell you is that this column is not going to be useful. However, it may be interesting, informative and occasionally controversial – though if it is, please don't send too much abuse my way.
One thing I have always found incredibly interesting about the Mac is not just how it is now, but how we got here. Apple have revolutionised the world of computers time and time again, in both software and hardware. And as if that weren't ambitious enough, they've now revolutionised the whole music industry too.
The history of Apple is probably well documented in scraps around the internet and in several other 'brief history' guides out there – most of which are better written. But here is my take on the whole sordid affair, serialised over six parts.
However, if you've not read about the history of Apple, sit back – but not so far back that you can't comfortably read your screen – and enjoy the sometimes-dramatic history of a company that has done more for technology than you might think.
Our story starts way back in the 1830s. It starts as most computer stories start (in England at any rate) with a man called Charles Babbage. Babbage was an intellectual performer. He would invite wealthy people to his house and try to impress them with the wacky machines he had come up with. Soon he came up with the grandest scheme of all. While all the machines he had previously devised were designed to do a single task, he wanted a machine capable of doing a multitude of tasks depending on how it was configured. Babbage's machine was known as the Analytical Engine and was to be powered by a series of highly complicated gears. Alas, it was never completed in Babbage's lifetime – although boffins at London's Science Museum did build his earlier Difference Engine No. 2 from Babbage's original plans, completing it in 1991.
Others were enthused by Babbage's idea of a 'thinking machine' that could handle tasks humans found complicated. However, it took an extraordinary leap of imagination by an extraordinary man working under extraordinary conditions to take the next step.
War is always a massive catalyst for technology, as scientists on each side work to outwit those on the other. Around 1940, during the Second World War, the British government set up a research facility at Bletchley Park – known as Station X – to decipher the codes used by the German military. It was here that Alan Turing – who many would describe as the father of modern computing – developed a machine, the Bombe, to do much of the decoding automatically.
Alan Turing was a genius, but what should not be overlooked is the contribution of a Post Office telephone engineer called Tommy Flowers. It was Flowers who designed and actually built Bletchley's electronic computer – called Colossus – which was turned against the even tougher German teleprinter ciphers.
The machine was so successful that the intelligence it deciphered helped win the war. Of course, all of this was top secret until fairly recently. Indeed, when in the mid-1950s British Telecom (then the GPO) asked Tommy Flowers to work on a computer for them, he could not say that he had worked on such a machine before.
In the following decades, it was largely efforts in America that began to capture the power of the electronic computer. Soon the valves used in early computers were replaced with more reliable transistors. These in turn were replaced with the revolutionary silicon microchip, invented independently in 1958–59 by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor.
By 1965, Gordon Moore had come up with the theory that the number of transistors that could be squeezed onto a chip would double every two years. This prediction has in fact proved freakishly accurate even today.
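To get a feel for what that doubling really means, here is a quick back-of-the-envelope sketch. The starting figure is illustrative (roughly the 2,300 transistors of Intel's 4004 from 1971, not a number from this column):

```python
def transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count forward, doubling every `doubling_period` years."""
    doublings = (year - start_year) // doubling_period
    return start_count * 2 ** doublings

# From ~2,300 transistors in 1971, thirty years of doubling every two years
# gives about 75 million - the same order of magnitude as real chips of the
# early 2000s (the Pentium 4 had roughly 42 million).
print(transistors(2300, 1971, 2001))
```

Fifteen doublings is a factor of 32,768 – the exponential growth that made the machines in the rest of this story possible.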
Computers were generally huge machines that filled rooms until the late 1960s, when Digital Equipment Corporation, among others, developed the minicomputer. Around this time devices such as CRT displays, page printers and graph plotters were developed, but people did not consider using computers outside of business applications such as accounting, automated payroll, ordering and billing.
Strangely enough, another piece of the puzzle – one that, I promise, will eventually slot into the history of Apple – had been developed years before this, in the mid-60s, by Douglas Engelbart. In 1970 he was awarded a patent for a wooden shell with two metal wheels, described as a device for inputting x,y position data into a computer system. If you haven't already realised, we're talking about the first computer mouse.
In the late 1960s, several computer companies and academic institutions came together to tackle an increasing problem that plagued computers at the time. Up until this point, every new computer that was built was unable to run applications designed for other equipment. This was largely due to the lack of a universal operating system that could run on a number of machines – each machine had to have its operating system developed along with the hardware.
The team included people from MIT and Bell Labs, and they produced an operating system called Multics. The effort fell on its face, and after some experimentation Bell pulled out of the project. Forced to downgrade back to an older system, a group of engineers at Bell – Ken Thompson, Dennis Ritchie, Doug McIlroy, and J. F. Ossanna – decided to come up with their own operating system. In 1969, the team set about developing what would become Unix. Alongside it came another massive innovation: the C programming language.
The history of Unix could fill a whole other series of columns, but it will crop up again later in our story.
This brings us to the 1970s. It was around this time that the personal computer as we know it today was beginning to take shape.
It is at this point that we can introduce the hero of our saga – if you can really call him that. He is of course Steve Jobs.
Dropping out of college, Jobs went to work for Atari, where he worked on a game a lot of people might know – Breakout (which, if you have an iPod, you will probably know as Brick).
To help him with this project, Jobs enlisted the help of a high school friend – Steve Wozniak. Woz had been dabbling in computer electronics and had built a small personal computer of his own. Impressed by this, and spotting a good opportunity, Jobs persuaded Woz to start selling the machine.
Apple Computer was born on 1st April 1976 to sell these machines – the Apple I. The Apple I actually came as a bare circuit board, and owners had to build their own casing. Most Apple I cases were made of wood, and most looked pretty different from one another.
Building on this success, Woz came up with a new and improved design for his computer, marketed as the Apple II. I've not been able to confirm it from any of my online sources or magazines, but I have a hunch it might have been Steve Jobs who insisted on the Apple II's biggest feature – its plastic casing.
The Apple II could probably claim to be the first modern personal computer: while its applications were far from personal, it was the first computer to come in the now-famous beige case – designed by Jerry Manock. It also had a built-in keyboard and, remarkably, can boast to be the first personal computer to offer colour graphics. These features, combined with the funky striped logo designed by Rob Janoff, made the Apple II an instant hit. Sales escalated as programmers created great applications for it, including the spreadsheet VisiCalc.
In 1981, IBM introduced their personal computer. Running Microsoft's DOS, it soon became the Apple II's biggest competitor. The problem was that the PC's architecture was developing at a much faster pace than the Apple II's, and it was clear that Apple would have to bring out a new machine to compete with the fast-evolving IBM machine. Again, the history of the PC would probably make a nice weekly column, but not on a Mac-related website.
Read this column next week to discover the first Mac…
Please address all comments to firstname.lastname@example.org.
This column is written by Richard Tanner for the MacCast. Any errors or opinions included are entirely my fault and nothing to do with the MacCast.