A potted history of the Personal Computer

This week Andy Cormack looks at the history of the personal computer – a device we take for granted, yet one that over the last forty or so years has revolutionised the way in which the human race works, plays, communicates and interacts.

Andy looks at the beginnings of the personal computer, drags up a few good and bad memories and some surprises along the way, and considers where we have been in terms of computing and where we might be heading.

In the beginning

To talk about the origins of the Personal Computer (PC), we first probably need to talk about electronic computers in general; these laid the foundations for the modern household PC we take for granted every day, along with many other innovations along the way.

What exactly is a computer?

If you distil the term computer down to its core essence, beyond the modern electronic machines we think of today, you can trace its roots right back to manual counting devices such as the abacus: simply a tool to aid counting and basic calculations.

Of course, things have moved on significantly over the literal millennia since the first confirmed appearance of the abacus, somewhere around 2700 – 2300 BC among the Sumerians – albeit in a rather different form, owing to the sexagesimal (base 60) number system they used, as opposed to the decimal (base 10) system in use today.

One of the largest strides in the advancement of computers came about in the early years of the Industrial Revolution, which began in the early 1760s in England. The reason this era was so important to computing is that during this time mechanical devices began to be used to automate long, repetitive and complicated tasks – one obvious example being the mechanisation of the weaving loom.

Of course, no article about the origins of computers would be complete without the inclusion of Charles Babbage, a renowned mechanical engineer and polymath from London, England. Commonly referred to as the “father of the computer” for having originated the concept of a programmable computer, Babbage first designed the “Difference Engine”, a mechanical calculator intended to compute polynomial functions and so aid in the production of navigational tables.

After the Difference Engine, Babbage realised he could make a more generalised machine that wasn’t just useful for specific polynomial functions. Thus his next creation, the “Analytical Engine”, was born. This was a significant milestone on the road to the modern computer, as the machine took input in the form of punched cards – a method already in use for mechanical looms – which directed the machine in various ways. The machine would then output its results via a linked printer.

From here, things progressed towards much more complex analogue electrical machines designed for large calculations in the early 20th century, and from there to the first digital calculating machines developed during World War II. This then led to the first computer as we would think of it in a modern context.

Definition of the Personal Computer

Now that the very origins of modern computing have been discussed, we can start to consider the evolution of the Personal Computer, from its infancy, all the way to today and beyond.

Before we delve further, it’s worth noting that sources differ wildly on what key factors define a PC, as opposed to a more generic computer. A great deal of the confusion can be ascribed to the term itself. The term PC has traditionally been associated with IBM, dating back to the early years of personal computing when the main competition for the market was between IBM and Apple. Branded “IBM-compatible PCs”, these computers were originally built with Intel CPUs and ran DOS, MS-DOS, or Windows. Conversely, Apple’s own computers, the Macintosh line, typically used Motorola CPUs and an operating system built by the company itself. For the most part, though, those sources do agree on a couple of things:

A PC is a computer intended to be operated by a single user at a time.

Before the advent of the personal computer, computers were designed for, and only really affordable by, companies, and since companies typically have many employees, they were designed with multiple terminals attached to one large computer, with users sharing its resources.

Traditionally a PC is thought of as a personal computer that is not made by Apple, thanks to the long-standing difference in branding explained above; so even though a Macintosh is a personal computer, it is not, strictly speaking, a PC.

There are other factors you could ascribe to personal computers, albeit fairly inconsistently, such as affordability for an individual consumer; however, since PCs these days range wildly in price and power, any such cutoff would be an arbitrary bracket based on estimates of our own making.

All this being said, we’re somewhat splitting hairs here. Although the contraction PC is used throughout this article, we are really referring to the broader term personal computer, which certainly includes Apple’s line of computers. With that clarification out of the way, the definition of “personal computer” is incredibly broad and spans a wide array of devices from its inception until now.

Back to the future

Going by the extremely broad definition of a computer operated by one person, the Electronic Numerical Integrator and Computer (ENIAC), among the earliest examples of a general-purpose electronic computer, would certainly fit. Completed in 1946, this incredibly early machine was originally built to calculate artillery firing tables for the US Army, yet some of its first programs included a study of the feasibility of thermonuclear weapons.

The speed of this computer was unparalleled at the time, clocking in at around one thousand times faster than rival electro-mechanical machines. The general-purpose nature of the ENIAC, combined with its raw power, also caught the attention of many – especially scientists and industrialists looking to harness its computational power for other purposes.

Courtesy of Magnus Manske

Moving on a number of years, we reach a crop of contenders from the late 1960s through to the mid-to-late 1970s that could each claim to be a personal computer in a truer sense, and the point is still widely contested, with sources varying greatly in their choice of the first personal computer. The Computer Museum in Boston, Massachusetts, which opened its doors in 1979, held a competition in 1986 to try to answer this question. The judges eventually accepted the Kenbak-1 as the first personal computer, which is probably as close to any kind of “official” answer as we’re likely to get.

The Kenbak-1 – The First “Real” PC

The Kenbak-1, created by John Blankenbaker at the Kenbak Corporation, was completed in 1970 and went on sale in early 1971 for $750 USD. It is a rather obscure machine: only around 50 units were ever built, and production ended in 1973 when the company folded and was bought by CTI Education Products Inc., at which point the Kenbak-1 was rebranded as the H5050 – with similarly unsuccessful sales.

The key thing that sets the Kenbak-1 apart from most personal computers that succeeded it is that it was created before the invention of the microprocessor, so the machine didn’t have a single chip as its central processing unit (CPU) but instead used a number of Transistor-Transistor Logic (TTL) chips. The machine was 8-bit, with an instruction cycle time of around 1ms – roughly a thousand instructions per second, and often fewer due to its slow memory – and the total memory was just 256 bytes. The system was programmed purely in machine code using the buttons on the front panel, with output in the form of a row of lights.

Xerox Alto – A Leap Forwards

The Xerox Alto, so named because it was developed at Xerox’s Palo Alto Research Center (PARC), was created in 1973 and was arguably the birth of the graphical user interface (GUI), later serving as the inspiration for Apple’s Macintosh and Microsoft’s Windows operating systems. Despite this, the Alto was a research machine that was never put on general sale: the expensive parts that made up the system would have made it prohibitively costly as a product.

BASIC and the Desktop Computer

That same year, 1973, Hewlett-Packard ushered in a range of desktop computers fully programmable in the Beginner’s All-purpose Symbolic Instruction Code (BASIC), each fitting on a desk and including a keyboard, a small single-line display, and a printer.

This was closely followed by the Wang 2200 from Wang Laboratories, another BASIC-programmable computer complete with a built-in Cathode Ray Tube (CRT) monitor and cassette tape storage. The Wang 2200 sold roughly 65,000 systems worldwide over its lifetime and found most of its use in small and medium-sized businesses.

Apple I and Apple II

Steve Wozniak and Steve Jobs produced and sold the Apple I computer circuit board in 1976, a fully prepared kit-style board containing around 30 chips. Although the system was originally created as a kit board, it differed from most other hobby kit boards at the time.

The pair received their first order of 50 Apple I computers from the Byte Shop, on the condition that they were fully assembled and tested systems, not kit computers. Paul Terrell, owner of the Byte Shop, wanted to sell computers to a wider audience than the experienced electronics hobbyists that kit boards attracted – customers who didn’t need to know how to solder and assemble their own boards. Despite this, the Apple I was still technically a kit computer, since it was delivered to the Byte Shop without a power supply, case, or keyboard.

In June 1977 its successor, the Apple II, often referred to simply as the “Apple”, was released. It was a significant advancement over its predecessor, building on the foundations the Apple I had brought to market. The Apple II was arguably one of, if not the, first commercially successful personal computers; it launched Apple as a company and continued to sell, relatively unchanged, into the early 1990s.

The IBM PC

The IBM 5150, or more commonly the IBM PC, is the personal computer that launched an entire line of hardware termed “IBM PC compatible”. It was launched in August 1981 after being developed in just a single year – the company’s fastest hardware development time up until that point – and, by dint of its design, also arguably its biggest marketing mistake.

Prices started at $1,565 USD for a base model with 16 kilobytes of RAM, a Color Graphics Adapter, and not a single disk drive. This base model was aimed at home users, attaching to a tape cassette player and a TV in order to lower the price by not bundling a dedicated monitor or floppy disk drives. IBM had very intentionally priced its PC models to compete with Apple and other rivals of the time, with the company publicly stating that the pricing “invites comparison”. Other models added more features at a higher price; a more “typical” system, with 64 kilobytes of RAM, a floppy disk drive, and its own monitor, was priced at around $3,000 USD.

Although the original 5150 motherboard only supported up to 64KB of RAM, later revisions allowed 256KB on the board, or up to 640KB if you included RAM expansion cards.

The operating system IBM was shipping at the time, PC-DOS, was originally an IBM-branded version of Microsoft’s MS-DOS. That arrangement lasted until 1993, when IBM and Microsoft went their separate ways and IBM released the separately developed PC-DOS 6.1 in June of that year, after Microsoft’s last release under the partnership, MS-DOS 6. The problem for the base model was that PC-DOS was not made available on cassette, so a machine without a floppy disk drive had only Microsoft’s BASIC environment to fall back on, because that was built into every PC.

Attack of the clones

Because of the design of the IBM PC, the majority of its components were off the shelf. That expediency meant IBM had a product it couldn’t effectively protect as its own design. Its competitors and other companies quickly realised this and made their own cloned versions. These clones were not only IBM compatible, they were also modified and enhanced at a rapid rate – so much so that IBM soon had a fight on its hands to keep up with the competition.

Let’s delve into some of the more significant ones that made it to market after the release of the IBM 5150.

First up is the Compaq Portable, released in March 1983 for around $2,995 to $3,590 USD, depending on whether you chose the single or double disk drive option. While certainly not the first portable computer, it was one of the first 100% IBM PC compatible systems to follow the 5150. In terms of portability, the entire system weighed in at around 28lb (13kg) and folded up into a form that could be carried. This computer prompted IBM to make its own portable system to compete, launching the IBM Portable in February 1984.

Next is the Olivetti M24, also launched in 1983, which sported the more powerful Intel 8086 CPU running at 8MHz, as opposed to the 5150’s 8088 running at 4.77MHz. The M24 was also highly compatible with the IBM PC and came with an enhanced CGA video card, capable not only of the typical 200-line video modes of the day but also of a 640×400, two-colour mode – essentially doubling the number of lines. The M24 was also sold in rebadged AT&T and Xerox versions; both companies bought the rights to sell rebranded versions of the same system, and the machine became AT&T’s first foray into the IBM PC compatible market when it launched as the AT&T 6300 in June 1984.

Courtesy of Marcin Wichary

A little further forward in 1985, and we have the Toshiba T1100, which was an early laptop computer with system specs comparable to the IBM 5150 and released at a price point of $1,899 USD. It sported the same speed (4.77MHz) CPU, albeit a newer model, 256KB of RAM with room for 512KB, an internal 3.5” floppy disk drive (but no hard drive), and a monochrome LCD display running at a resolution of 640×200.

Finally, the PC1512 from Amstrad, released in 1986, was both Amstrad’s most IBM PC compatible computer up to that point and one of the first relatively cheap PCs to be released in Europe, at just £499. It was heavily marketed towards home users thanks to its pricing, and it sold well. The specs for the 1512 were fairly impressive for the price too: 512KB of RAM (expandable to 640KB via an expansion pack known as a “top hat”), an Intel 8086 CPU running at 8MHz, up to two 5¼ inch floppy disk drives, and an optional 10 or 20MB hard disk drive. It also included CGA-compatible video output with the ability to use all 16 colours at 640×200 resolution.

These companies, and others like Opus, Dell, and Time, all latched onto the PC and gave it their all.

The home front

On the home front, the home computer was also on the rise. Smaller than the IBM PC and its clones, these machines were designed for education, entertainment and home computing. Marketing machines for business was one thing; companies now started to look at other areas where the computer could excel – lifestyle and the home, manufacturing and gaming. Hybrids and nuanced variations began to spring up, creating a number of new business opportunities around computing and the PC, and there were plenty of budget options as enthusiasm for the home computer exploded and people started to dip their toes into the water.

Sinclair and the ZX Series

Sinclair Research, a UK-based company originally named Science of Cambridge Ltd, produced a series of computers in the early 1980s: the ZX80 in 1980, the ZX81 in – you guessed it – 1981, and the ZX Spectrum in 1982. Although the ZX80 shipped only around 100,000 units, it proved hugely popular thanks to its affordability, coming in a build-it-yourself kit form for just £79.95 and ready-built for £99.95. It was so sought after that mail-order waiting lists sometimes stretched to several months.

By March 1981 Sinclair had produced its successor, the ZX81. Also designed as an affordable personal computer for the home, the ZX81 saw its sales skyrocket on the back of the ZX80’s popularity, selling some 1.5 million units before it was discontinued in 1984.

Then, in April 1982, came the ZX Spectrum, so named because it was the first in the series to include colour output. Roughly 5 million Spectrums were manufactured at the Timex factory in Dundee, Scotland over the machine’s ten-year lifetime. The Spectrum was released in eight different models, ranging from an entry-level 16KB of RAM up to 128KB, with a model featuring a built-in floppy disk drive introduced in 1987.

Commodore 64

Following hot on the heels of the ZX Spectrum’s success, Commodore Business Machines (CBM) created the Commodore 64, America’s equivalent of the highly popular, affordable home computer in terms of success and mass adoption. It sold a staggering number of units over its lifetime – somewhere between 12.5 million and 17 million, depending on which sources you believe – earning its place in the Guinness World Records as the highest-selling single computer model of all time.

The Commodore 64 dominated the market, even outselling the likes of IBM, Apple and Atari in terms of units sold. Part of its success can undeniably be attributed to it being sold in ordinary retail stores, not just the electronics or specialist computer hobbyist stores where far fewer consumers would come across one.

The Amiga 1000 and Atari ST

In July 1985 the Amiga 1000 – often referred to simply as the “Amiga”, and also created by Commodore – went to market. This was the first computer in the Amiga line and was quite revolutionary in a number of ways.

Firstly, it included a Motorola 68000, a 16/32-bit processor that was relatively powerful for its time. It was sometimes called simply a 16-bit processor because its data bus was 16 bits wide; however, it can also be considered 32-bit, because its registers were 32 bits wide and most of its arithmetic instructions operated on 32-bit values.

Secondly, the system boasted one of the most advanced graphics and sound systems of its day, offering up to a 12-bit colour palette (4,096 colours), along with 256KB of RAM that was upgradeable to 512KB with a memory module.

Lastly, the operating system the Amiga ran featured windowed multitasking – rare at the time and quite a significant leap forward for computer user interfaces. Its most significant rival was the Atari 520ST-FM, comparable in price and equally popular.

BBC B and Acorn

The BBC Microcomputer System was a whole series of computers made by Acorn Computers Ltd under the BBC’s Computer Literacy Project. The BBC created the project’s TV shows and reading material, and then ran a bidding competition for the hardware side of the project. Acorn won the contract with the Proton, a prototype successor to its Atom whipped up on short notice for the bid; once the contract was won, the Proton was dubbed the BBC Micro. Acorn’s rivals for the contract included Sinclair, with a version of the ZX Spectrum.


Thanks to the BBC’s project and backing, the system went on to be sold, in a variety of models, to most schools across the UK, and that nationwide exposure also netted it some success as a home computer, despite a cost of between £235 and £335 depending on the model.

Acorn also used these systems to develop the ARM architecture which, as many may know or at least have heard, is now the basis of the processor architecture in the majority of mobile phones in the world today.

Apple Macintosh

Steve Jobs, co-founder of Apple, introduced the Macintosh to the world in January 1984, after some 4-5 years of development. The original Macintosh computer marked Apple’s first mass-market computer to feature a Graphical User Interface (GUI) and a mouse input device.

The Commodore 64, the IBM PC and its various clones were already dominating the market by the time the Macintosh launched. Combined with the Macintosh’s relatively high price, that meant it faced an uphill battle in sales.

Despite all of this, the system was still relatively successful among education and desktop publishing users, solidifying Apple’s position as the second-largest PC manufacturer for a number of years, until improvements in the Windows platform – first Windows 3 and subsequently Windows 95 – took market share away from the more expensive Macintosh systems.

After the very public ousting of Steve Jobs in 1985, just over a year after the release of the Macintosh, it initially seemed like a good move. His recklessness and impulsiveness were seen as a hindrance to the Fortune 500 company that Apple had become, sales regained momentum, and the new leadership received positive press. John Sculley was now at the helm of Apple and, for a while at least, things went well.

By 1988, however, things took a turn. Apple raised its prices to the point that Macs were no longer selling, and the newly released models were both prolific and confusing to consumers. To top it off, when Sculley announced to the world in 1993 that Apple had released the first modern mobile device – a Personal Digital Assistant called the Newton – it received terrible reviews and performed even worse in sales. Compared to devices such as the Psion Organiser and its replacement, the Series 3, IBM’s Simon and the Nokia 9000 Communicator, it simply didn’t catch on, and Sculley was soon replaced. Even those devices, good as they were, proved technological flashes in the pan – the PDA died out as soon as mobile phones took on the same functionality and overtook them.

The company subsequently had multiple missteps, replacing CEOs several times and trying many things to right the ship as shares and sales plummeted. By 1997 Apple had just a 3.3% share of the computer market, and its share price was down to around $14. The CEO at the time then made the move to buy NeXT, the company Jobs had created after leaving Apple, bringing Jobs back along with the acquisition.

Just months later that CEO was also removed, and the CFO – next in line for company leadership – announced that a search for a new CEO would begin and that Jobs would serve as an advisor to the board. In contrast to Apple’s problems, Jobs was having a much better 1997: he had bought the computer animation company Pixar nearly a decade earlier and was now revelling in the incredible success of its first film, Toy Story, released in 1995. With NeXT bought by Apple for a tidy sum and the company floundering without a CEO to lead the ship, Jobs slyly slid back into control.

Jobs brokered a truce with Microsoft, convincing them to invest $150 million in non-voting stock and to continue producing their Office suite for the Mac. He then brought Jonathan Ive on board to head up design and Tim Cook to lead manufacturing – two powerful allies who have helped run Apple ever since.

By 1998, Apple’s fortunes had drastically improved. Jobs had steered Apple into consolidating its multiple desktop models into just one, the iMac G3. The G3 became a tremendous success, revitalising the Mac brand and becoming one of the major factors in turning around the woes of the previous decade. Thanks to Ive’s design, it was also seen as an instant classic, very much of its time.

By Michael Gorzka [CC BY 2.0 (http://creativecommons.org/licenses/by/2.0)], via Wikimedia Commons

By 2007, the mobile phone market had taken off, with a large percentage of the population owning one, and Apple had set itself up to take on this market too. It released the first generation iPhone in June 2007, running the iOS mobile operating system, and blew the competition out of the water – along with the whole concept of what a mobile phone was and what it could be capable of. For many, owning an iPhone was a must-have, a lifestyle accessory and a weapon of choice.

This would spawn numerous competing devices, not least the now market-dominant Android devices. Android is an open source platform developed by Google, which partnered with multiple mobile hardware manufacturers to bring phones of varying capabilities to market, giving consumers a much wider range of device options.

The rise of processing power

The main processor in the original IBM PC was the Intel 8088, a variant of the Intel 8086. With the 8086 and 8088, Intel had begun solidifying an extensible, backward-compatible instruction set, later termed x86 after the many successors to the 8086, including the 80186, 80286, 80386, and 80486.

Numerous additions to the instruction set have been made over the years since, with near-unwavering backwards compatibility along the way. As of 2017, desktop and laptop computers are still predominantly powered by x86-based CPUs.

The success of the x86 architecture, which prevails to this day, all started with the 16-bit 8086 back in 1978, running at a clock rate of between 5 and 10MHz. From there Intel went from strength to strength, with improvements to the instruction set that led to the 80286, 80386, and 80486 – more commonly known as the 286, 386, and 486. The clock speed is one measure of a PC’s processing power, and over the years these speeds increased spectacularly.

The 286, 386 and 486 ran at clock speeds ranging from 6 – 25MHz, 12 – 40MHz, and between 16MHz and a theoretical 150MHz respectively. AMD went on to make competing versions of the 386 and 486 in the form of the Am386 and Am486, although Intel beat AMD to market by several years in both cases. AMD did, however, end up selling its processors at a lower price – a trend that would continue for quite a long time.

Thanks to a multitude of improvements between the 386 and the 486, the 486 was considered roughly twice as fast as its predecessor at the same clock rate, mostly due to a host of technical improvements I won’t get into too deeply here, such as an on-chip instruction cache, data cache and FPU, and a better bus interface.

By 1993, Intel had produced the first of its Pentium line of CPUs, with an architecture dubbed P5, followed by the Pentium MMX in 1996, which added the MMX instruction set, larger caches and various other small improvements. These chips ranged in speed over the years, from the original P5 models at 60 and 66MHz up to the later models of 1996 running at up to 233MHz.

Following on from that came the Pentium II, Pentium III and Pentium 4 series of CPUs, spanning the years from 1997, with the Pentium II, until the final Pentium 4 parts as late as 2008. Clock speeds increased dramatically over that span: the first Pentium II CPUs ran at between 233 and 450MHz, while the later Pentium 4 lines reached a staggering 3.8GHz – well over an order of magnitude more raw performance in just over a decade.

From there, AMD rose from underdog to top dog for a while with its Athlon 64 series in 2003, the first consumer-grade 64-bit x86 processor. Clock rates were slowly becoming less central to measuring CPU performance – something evident as far back as the 486’s advances over the 386, which doubled performance at the same clock. The Athlon series certainly proved the point, roundly beating Intel’s Pentium 4 series while including a number of feature improvements and running at similar clock speeds to its rival.
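To make that point a little more concrete, here is a minimal sketch (my own illustration, not taken from any benchmark or from this article) of the usual rule of thumb: useful performance is roughly instructions-per-clock (IPC) multiplied by clock speed, which is why a chip with a lower clock but a more efficient core can outrun a higher-clocked rival. The function name and the figures are purely hypothetical.

```python
# Rough rule of thumb: performance ~ IPC x clock speed.
# All figures below are hypothetical and purely for illustration.

def relative_performance(ipc: float, clock_ghz: float) -> float:
    """Crude relative performance score: instructions per clock times clock rate."""
    return ipc * clock_ghz

# A deep-pipeline, high-clock design versus a wider, lower-clock design.
chip_a = relative_performance(ipc=1.0, clock_ghz=3.8)
chip_b = relative_performance(ipc=2.0, clock_ghz=2.4)

print(f"chip A: {chip_a:.1f}, chip B: {chip_b:.1f}")
# chip A: 3.8, chip B: 4.8 - the lower-clocked chip comes out ahead.
```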

Intel, having had a less than stellar time with its Pentium 4 series and having watched its main competitor steal the number-one spot, powered through to produce its Core 2 series of processors in 2006. Not only was this a 64-bit CPU to rival its competitor, it was also a multi-core design aimed squarely at consumers. Again, these processors made barely any headway in raw clock speed, with the fastest out-of-the-box clock coming in at 3.33GHz, but as noted above that was far from the only performance indicator any more – and the Core 2 outperformed AMD’s processors quite handily, even on single-threaded tasks that didn’t take advantage of the extra cores.

From then until now, Intel has largely maintained its position at the top, with consistent improvements to its line-up over the years: more cores, better performance, improved feature sets, bigger and better caches – the list goes on. AMD, meanwhile, has largely been content to provide cheaper alternatives that don’t quite perform on par with the competition, which has certainly won it the favour of budget-conscious buyers.

In an article written a couple of years ago, Phoronix dug up some relatively old CPUs from the early 2000s to compare them with the more modern hardware running in their office at the time. The results are what you would expect if you’ve been taking in the rest of this article: the performance difference across even a decade is staggering, with numbers ranging so widely in just a few years that it’s somewhat hard to believe. You can check out all the benchmark results in the article.

It isn’t only sheer horsepower that has improved so vastly, but also power efficiency. Comparing the power consumption of an old Pentium 4 with a more modern Core i3 gives startling results: the Core i3 not only outperforms the older processor by miles, it does so at around a fifth of the power draw.

Storage space, miniaturisation and cost

Storage space is another area that has grown exponentially since the early days of computing, and whenever there was a lull in capacity increases, there was instead either a decrease in physical size or a decrease in price as the technologies behind them matured and became cheaper to manufacture.

Consumer-grade hard drives have now reached around 10TB (10 terabytes, or roughly 10,000 gigabytes), though admittedly at a price – over £300 at the time of writing. Then again, just a few years back we were giddy at the prospect of 4TB, and before that even 1TB seemed far-fetched.

If we go right back to the beginning of data storage on PCs, many machines didn’t have internal storage at all, simply loading and running data from floppy disks. By 1986 the most you could generally hope for was 10 – 20MB, which by modern standards is so small it’s almost impossible to imagine using, especially when you consider that modern operating systems like Windows 10 and Apple’s OS X take up gigabytes of space. How could so little do so much? Well, back then “so much” wasn’t all that much by today’s standards.

To go back to the roots of the hard disk drive itself, we have to go much further, to IBM in 1956, when the company announced the first one, the RAMAC 350. It was made of fifty – yes, fifty – 24-inch disk platters, weighed close to a ton, and stored only 5MB of data: one average-sized MP3 by today’s standards, if you’re lucky. Given the materials involved it was far from cheap, coming in at a whopping $1,000 USD per megabyte.

From there the technology slowly shrank to accommodate a wider market. By the early 1980s you had drives like the RA80, which could store around 12MB on a 31-sector platter; by the 1990s capacities were creeping into gigabyte territory with drives such as the IBM 0663 “Corsair” of 1991, which weighed in at 1,004MB. Things really kicked up a notch after that: a 500GB hard drive was released in 2005 by Hitachi Global Storage Technologies (Hitachi GST), just two years after IBM sold its disk drive division to Hitachi, and around the time the SATA standard was established.

After 2005 the sky really was the limit. The first terabyte drive arrived just two years later in 2007, also made by Hitachi, at which point competitors joined a real storage capacity arms race, with Seagate and Western Digital taking the crowns for the first 1.5TB drive a year later and the first 2TB drive the year after that, in 2009, respectively.

To step aside from traditional hard disk storage for a moment, let’s look at miniaturisation. With the advent of USB sticks and memory cards, physical data storage suddenly reached a new era of portability.

In 1987, Toshiba announced the very first NAND flash storage: a technology that, while small, retained data even without power. The capacities involved at the time were tiny, but it was a proof of concept that would lead to greater things.

By 1995 they had released a 40MB memory card, so that in the new age of digital cameras images could be stored without the need for film, and these sizes grew exponentially year after year, to the point where today you can store hundreds of gigabytes on a microSD card the size of your thumbnail. A simpler way of looking at this is the average storage size of a USB stick ten years ago: in 2007 it was 64MB, whereas in 2017 it is 64GB – in short, a roughly thousand-fold increase, which works out at capacity doubling approximately every year.

The NAND technology that spawned all these USB sticks and memory cards also had another effect: many disk drive manufacturers switched their focus to the same idea, eliminating the moving parts from hard disk drives entirely. Thus the SSD (Solid State Drive) was born – a more compact and lightweight drive without the metal platters of old. Initially, of course, the technology was incredibly expensive for even small amounts of storage; a 128GB SSD when the technology was fairly new would set you back a pretty penny. But, as with all technology, manufacturing became more efficient and cheaper, the market filled with competitors and better models, and today you can pick up a 1 – 2TB SSD for between £200 and £400.

The iPhone, Android, and tablet computers

With the release of the iPhone in 2007, Apple revolutionised the public’s perception of what a mobile phone was. It transformed the phone from a device centred on the basic voice and text communication we’d come to expect into a multi-functional device capable of so much more – so much so that one could argue it was a form of personal computer in its own right. It certainly fits the definition.

By the end of the same year, in November 2007, Google announced the Android operating system: completely open source, easily proliferated and free for anyone to use or adapt to their needs. This led many manufacturers to partner with Google or build smartphones on the Android platform, and just a year later, in November 2008, the G1 Android smartphone was launched. Though it lacked some of the iPhone’s touchscreen capabilities, it made up for it with a sliding hardware keyboard.

It bears mentioning at this point that Microsoft had its own stake in the smartphone market up until then. Its platform, Windows Mobile, found itself sandwiched between the iPhone and Android phones, and Microsoft eventually threw in the towel on that product line after realising it simply couldn’t compete with the two. It later went on to develop a new platform, Windows Phone.

In January 2010, Apple announced the first iteration of the iPad, a tablet computer with the same touchscreen sensibilities as its iPhone roots and running essentially the same operating system. Tablet computers were far from a new concept, though most offerings before the iPad were clumsy at best, with middling support and lacking features. It was around this time that Android phones began to catch up with the iPhone in features, gaining proper multi-touch screens and climbing to just under 10% of the market share as of April that year.

The Android platform once again followed suit in competing with Apple, with its own tablet offerings beginning with the Samsung Galaxy Tab in September 2010. The very next month, Microsoft announced its new Windows Phone product line, to fairly meagre initial sales and a somewhat tepid response from consumers. Amazon also entered the fray with its Amazon Fire tablet, still going strong today, having already produced the Kindle, an e-book reader. Like a tablet, but for book readers everywhere, its main function was to store e-books: you could now take your entire library anywhere and read, on demand, whenever you wanted to unwind.

Android phones have grown into a behemoth of the smartphone market, commanding a 43% market share, with Samsung, an Android smartphone manufacturer, taking the top spot as the largest smartphone vendor by far to date. The smartphone battle continues, with the main competitors constantly trying to outdo one another in an attempt to claim the top spot. Smartphones are part of daily life for a large majority of the world’s population – a global phenomenon that puts the power of a personal computer in the palm of your hand, and a truly magnificent invention that only furthers personal computing as a whole.

The rise of the Internet and Dial Up

While the origins and minutiae of how the internet was born are beyond the scope of this article, its rise to predominance among the activities carried out on personal computers deserves a mention (see Starjammer Bulletin’s article, Twenty five years of the World Wide Web). With the advent of what was termed “Web 1.0”, there was a sort of gold rush for market dominance online. The proliferation of the internet amongst home users continues to this day, with a vast majority of the world having access to it through one device or another – desktop PC, laptop or mobile phone – via numerous technologies, including wireless data for more portable use.

By secretlondon123 (Flickr: analogue modem) [CC BY-SA 2.0 (http://creativecommons.org/licenses/by-sa/2.0)], via Wikimedia Commons

When the internet first reached high adoption rates on personal computers across the world, it was still in its relative infancy. Social media and video sharing sites like YouTube were but a twinkle in someone’s eye. Even web browsers were in their early stages, supporting a handful of basic features, with the web itself essentially a collection of text pages with some basic formatting and maybe a few pictures – or even sound, if you were lucky.

The primary home desktop uses for the internet were rooted in communication – mailing lists, email, forums, newsgroups and the like – expanding to some of the first popular online retailers such as Amazon and eBay. Then things really took off with the advent of social media and online software applications: a new-age gold rush where the gold was almost infinite and the only limits were our imaginations and the technologies of the time.

Of course, because all of these things were initially quite limited, it wasn’t until the early 2000s that the web really came into its own, with the spread of languages like PHP, Java, and JavaScript (no relation beyond the name), which, together with other technologies, helped the internet grow into a core part of many computers’ day-to-day use.

Moore’s Law and Present Day

The computer market continues to thrive to this day, despite numerous claims to the contrary that the PC is dead, killed off by more portable options like smartphones and tablets. You will find fewer “branded” pre-built PCs from some of the big names of the past, though: IBM, for example, sold its PC division to Lenovo in 2005 in order to refocus its business, and it still sells server hardware, just not PCs.

Technological advancement in computing has never truly lost its stride since the mid-1960s, following – at least somewhat – Moore’s Law, the observation made by, and named after, Gordon Moore, co-founder of Fairchild Semiconductor and Intel. His 1965 paper described the doubling of the number of components per integrated circuit year on year, projecting that this would continue for at least another decade. Ten years later, in 1975, he revised the estimate to a doubling every two years instead of one. The often-quoted figure of 18 months is misattributed to Moore and should actually be credited to David House, an Intel executive who predicted an 18-month doubling of chip performance from the combined factors of more and faster transistors.
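As a rough illustration of why the quoted doubling period matters, here is a minimal sketch (my own, not from the article; the function name is purely illustrative) comparing Moore’s revised two-year doubling with House’s 18-month figure over a single decade.

```python
# Compare growth under two assumed doubling periods over a decade.
# Purely illustrative; no real transistor counts are used.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Overall multiplication factor after `years`, given a doubling period."""
    return 2 ** (years / doubling_period_years)

span = 10  # one decade
print(f"2-year doubling over {span} years:   x{growth_factor(span, 2.0):.0f}")
print(f"18-month doubling over {span} years: x{growth_factor(span, 1.5):.0f}")
# Roughly x32 versus x102 - the choice of period changes the claim enormously.
```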

A more up-to-date graph of Moore’s Law as of 2015, dating back as far as the inception of “modern” computing, shows that the old axiom still holds fairly true to this day.

Que sera sera…

Nanotechnology? Quantum computing? The future of computers certainly has many interesting avenues open to it.

In a recent article in New Scientist, they report on Google “leading the pack” in quantum computing – a developing area of computing that will leave many unable to adequately explain how it even works, and even more scratching their heads at the very premise.

The future will truly hold many great things for computing, though some of it might take time to arrive while the rest of us await the fruits of that labour with bated breath and wallets at the ready.

As someone who grew up around computers and has had an affinity and interest in them from a young age, my views on personal computers might be a bit skewed in their favour. They’ve been a consistent part of my formative years, arguably even forming and determining my goals and job aspirations before I even perhaps knew myself what they were.

My first PC was a 486DX2, when I was just 4 years old. I remember the excitement and wonder I felt back then: writing BASIC code, painting squares that I would later learn were pixels, and playing games that were, by today’s standards, incredibly crude, with even cruder graphics and audio owing to the hardware and software limitations of the time. Computers have always been an important part of my life, as have the friendships I’ve formed and the people I’ve met through the shared fascination and hobby of computer enthusiasts – people I would probably never have met if it weren’t for the internet and all the revolutions it has brought.

Probably some of my fondest memories will forever be rooted in technology, though obviously that isn’t the whole story. Regardless, I couldn’t imagine how different my life would be had I grown up without the computers, and technology behind them, that we as a society so readily take for granted today. They’re a marvel of ingenuity and a testament to the kinds of incredible things that humans can create in spite of all the darkness in the world. The benefits that computers have brought the human race as a whole are innumerable, and I believe the future holds only more spectacular and wondrous things to come.
