Have you ever wondered how we got here?
Not on this planet, nor within our embodied human form, but here – in a world where practically everyone has a supercomputer in their pocket, and many people can’t do an effective day’s work without being connected to our global data-sharing network (i.e., the Internet).
Depending on your definition of computing, the history of computing is surprisingly old – we’re talking thousands of years BCE. So, let’s start with that definition, shall we?
What is Computation?
At its simplest, computation is the act of mathematical calculation. By that definition, anything that has helped people do maths or handle data through the ages can be classed as a computer.
Understandably, this is a far cry from the highly sophisticated computers that we use every day, but at the end of the day, it’s all maths. From Microsoft 365 to Minecraft; from counting devices to the Commodore 64; from the abacus to Apple.
“Computers” Through Prehistory
By the above definition, we could argue that the earliest computer dates back to around 19,000 BCE. The Ishango bone features a number of somewhat orderly engravings and appears to be some kind of tally stick, but its precise use or relevance has long been lost to history.
Next up, we have the computing system with arguably the most staying power. Quipu are recording devices fashioned from knotted cord, developed around 2,600 BCE by civilisations native to Andean South America. Research supports the idea that they were used for complex record-keeping, including things like tax records, censuses, and economic output. Their longevity is immense, with some variants still being used to this day.
Next up is a familiar proto-computer, the abacus. Invented by the Babylonians around 2,500 BCE, it was used to help with simple arithmetic. The abacus arguably laid the groundwork for computing as a whole. Later, the suanpan was recorded in China around 190 CE, a type of abacus which enables speedy multiplication and division, as well as square and cube roots.
And last but certainly not least, there is the Antikythera Mechanism, considered to be the world’s first analogue computer. Though discovered in 1901, it is thought to have originated in Second Century BCE Greece. It’s an “orrery” – a mechanical device that predicts astronomical positions and eclipses.
Also worth mentioning here is Hero of Alexandria, a prolific inventor, engineer, and mathematician who developed “automata,” one of our first steps towards robotics, and “sequence control”, a kind of rudimentary programming.
Medieval Mathematicians & Mechanics
For many of the mathematical advances of the medieval period, we need to look to the East.
The first devices we would consider true “computers”, arriving around the 1700s, were heavily influenced by the technology of clocks and clockmaking. Therefore, we have to mention the first mechanical water clock (slash astronomical device), developed by Liang Lingzan and Yi Xing in 725.
Beginning around 800 CE we have the Islamic Golden Age, wherein incredible strides were made in numerous areas of study, but it’s the mathematicians and engineers that we need to talk about here.
Algebra as a discipline is an Arabic development, and the word comes from “al-jabr,” meaning a reuniting of broken parts. We also have its great pioneer, Muḥammad ibn Mūsā al-Khwārizmī, to credit with another word that is particularly important to computing nowadays – the Latinised “al-Khwārizmī” became “algorithm”. Al-Khwārizmī made incredible strides in arithmetic, algebra, astronomy, and more.
The Persian Banū Mūsā brothers’ 850 CE Book of Ingenious Devices features a number of fascinating automata and mechanical devices, including the first mechanical musical instrument, automatic water fountains, and even a boiler with a tap for retrieving hot water. The 1100s saw Ismāʿīl al-Jazarī invent a whole host of mechanisms and automata, including a programmable musical robot band, a peacock fountain for hand-washing, and a flush mechanism much like the ones used in modern toilets. And anyone who created a programmable musical robot band is alright by us.
And from the Middle East, we make a cheeky stop-off in the Med to discuss one of history’s great polymaths, Leonardo Da Vinci. Da Vinci created a number of true masterworks, but perhaps most relevant to our journey are his concept drawings for a mechanical calculator – made around 150 years before the machine many credit as the first such device.
Early Modern Mechanical Calculators
Then our travels take us to Scotland to discuss another lauded polymath, John Napier. Amongst a host of other works and interests, in 1617 Napier introduced “Napier’s Bones” – a set of calculating rods that operate through lattice multiplication. Napier’s work with logarithms also led to William Oughtred’s 17th Century invention of the slide rule.
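For the curious, here’s a rough Python sketch of the bookkeeping that lattice multiplication (and hence a set of Napier’s rods) performs – single-digit products summed along diagonals, with carries. The function name and layout are ours, purely for illustration:

```python
def lattice_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers the lattice way:
    form every single-digit product, then sum along the diagonals."""
    digits_a = [int(d) for d in str(a)]
    digits_b = [int(d) for d in str(b)]
    # Each diagonal corresponds to one place value in the final answer.
    diagonals = [0] * (len(digits_a) + len(digits_b))
    for i, da in enumerate(reversed(digits_a)):
        for j, db in enumerate(reversed(digits_b)):
            diagonals[i + j] += da * db
    # Resolve the carries along the diagonals, least significant first.
    result, carry = 0, 0
    for place, total in enumerate(diagonals):
        total += carry
        result += (total % 10) * 10 ** place
        carry = total // 10
    return result + carry * 10 ** len(diagonals)

print(lattice_multiply(425, 28))  # 11900 – the same answer the rods give
```

The rods simply pre-print all of those single-digit products, so the user only has to do the diagonal additions by hand.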
In 1642, Blaise Pascal invented the mechanical calculator (though Da Vinci fans would say otherwise!). This “Pascaline”, an adding machine, received royal privilege from Louis XIV, granting Pascal the right to manufacture and sell his calculating machines – nine of which still exist in museums to this day.
Just 30 years later, German mathematician Gottfried Leibniz began work on his “Stepped Reckoner”. It was effectively a multiplication machine, capable of giving a result of up to 16 digits. One of its crucial elements is exquisite in its simplicity – the Leibniz wheel, or stepped drum.
Leibniz appears to have had an idea to link his Reckoner’s functionality and Pascal’s calculator, though there is no evidence that Leibniz ever built this “link”. However, Pascal’s and Leibniz’s work did inspire Thomas de Colmar to create his Arithmometer in 1820. A simplified version of the device eventually became the first mass-produced calculating device in 1851 and pioneered the mechanical calculator industry.
In 1774, Philipp Matthäus Hahn created the first portable calculating device capable of carrying out all four basic arithmetic operations.
The Babbage Era & The Difference Engine
You may have heard that a device called the “Difference Engine” is the granddaddy of modern computing – but what is it? Well, without getting too deep into logarithms and polynomials, a difference engine is a type of mechanical calculator that works out and tabulates the values of polynomial functions, using the “method of finite differences” to reduce everything to repeated addition – exactly the kind of job gears and wheels are good at.
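To give a flavour of how that works – and this is our own illustrative sketch, not Babbage’s design – here is the method of finite differences in a few lines of Python. Notice that the loop only ever adds numbers together:

```python
def tabulate(initial_differences, steps):
    """Tabulate a polynomial using only repeated addition.

    initial_differences holds [f(0), Δf(0), Δ²f(0), ...]: the starting value
    and its successive differences. For a degree-n polynomial the n-th
    difference is constant, so no multiplication is ever needed.
    """
    diffs = list(initial_differences)
    table = []
    for _ in range(steps):
        table.append(diffs[0])
        # Each level absorbs the level below it: f += Δf, Δf += Δ²f, ...
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return table

# Example: f(x) = x² has f(0) = 0, Δf(0) = 1 and a constant second difference of 2.
print(tabulate([0, 1, 2], 8))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

A difference engine does essentially this with columns of gear wheels, one column per difference, cranking out each new table entry as a cascade of additions.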
Though Charles Babbage is perhaps best known for developing the Difference Engine from 1822 onwards, the idea was initially conceived by Johann Helfrich von Müller back in 1786. Sadly, von Müller never got the funding to build the machine; Babbage made more progress, having received some funding from the British Government to develop the device, but none of his engines was completed within his lifetime.
By 1832, Babbage and engineer Joseph Clement had built one part of the Difference Engine. The precision gears and mechanisms needed for the device hit the limits of what 19th Century engineering could offer, though this work arguably set the gold standard for precision machinery for decades to come.
Though the Difference Engine is the name that gets talked about, we feel this is unfair on Babbage’s next project: the Analytical Engine. This was a more general-purpose calculating machine that could be programmed using punched cards. During the development of the Analytical Engine, Babbage worked with Ada Lovelace, who is largely credited with creating the first computer algorithm for use with the device.
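Lovelace’s program (described in her famous “Note G”) was designed to compute Bernoulli numbers on the Engine. Purely to give a flavour of the task – using a modern textbook recurrence rather than her actual step-by-step procedure – here is a short Python sketch:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions,
    via the standard recurrence sum_{j<=m} C(m+1, j) * B_j = 0 (m >= 1)."""
    numbers = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, j) * numbers[j] for j in range(m))
        numbers.append(-acc / (m + 1))
    return numbers

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```

Lovelace, of course, had to express her version as a table of operations on punched cards for a machine that was never built – rather more impressive than a few lines of Python.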
In the early 1840s, Babbage’s funding was cut. This was due to significant cost overruns; the fact that Babbage had moved on to the Analytical Engine (rendering the original Difference Engine obsolete); and that no complete, working Engine had yet emerged. By 1849, Babbage had started work on his Difference Engine No. 2. Again, this was not constructed during his lifetime, but two were eventually built: the first was completed in 1991 (with a printer designed by Babbage added in 2002), and the second in 2008.
However, that’s not to say that Babbage’s invention was the only difference engine in play. Just one example is Sweden’s Per Georg Scheutz and his son Edvard, who finalised the Scheutzian calculation engine in 1843. The rest of this era is coloured by improvements in calculating and tabulating tech, as well as advances in the mass production of some devices.
Also worth mentioning during this era is the work of Dorr Felt, who patented the Comptometer in 1887 – the first commercially successful key-driven mechanical calculator. Herman Hollerith invented an electromechanical tabulating machine, using punched cards as inputs, to assist in processing US census data – his Tabulating Machine Company eventually became part of IBM.
The 1900s and Beginnings of Electrical Computing Devices
In 1906, Hollerith developed a tabulator with a plugboard, which provided a means to rewire the machine to carry out different functions. Plugboards remained in use to “program” calculating functionality until the 1950s. In 1909, Irishman Percy Ludgate designed his own analytical engine – conceived largely independently of Babbage’s work.
In 1913, Leonardo Torres Quevedo proposed a new branch of engineering that would become a bit of a tech buzzword today – automation. He also developed an electromechanical analytical engine.
By 1911, the company that Hollerith started had been merged with several others to form the Computing-Tabulating-Recording Company (CTR), and in 1924, this company was renamed International Business Machines Corporation (IBM). Hollerith’s legacy continued through IBM, which standardised the punched card for data input. Cards like these shaped data processing for around half a century, and the technology surrounding them continued to develop.
1938 saw Konrad Zuse complete the Z1 (originally the V1), the first mechanical, binary, programmable computer. It was a kind of proto-version of many modern machines, not least in its use of binary arithmetic. Zuse built on this invention with the Z2, the Z3 in 1941, and the Z4 in 1945 – one of the first commercial digital computers.
In 1939, we see another familiar company name emerge, the brainchild of William Hewlett and David Packard. Because the pair were based in Palo Alto, California, Hewlett-Packard’s founding is often considered to mark the inception of Silicon Valley.
As World War II rumbled on, the Allies were making great strides in computing for code breaking purposes. There’s not enough space in this article to extol the full genius of the likes of Max Newman, C. E. Wynn-Williams, Vannevar Bush, Tommy Flowers, and of course Alan Turing. Howard Aiken and his team developed the Harvard Mark I by 1944, which was used in the war effort. It weighed 4.3 tonnes and was 16 metres long.
The 1940s and 50s saw the invention of many devices and concepts we use today: the transistor; the trackball; Kathleen Booth developing the first assembly language; and the SSEM (or “Manchester Baby”) – the world’s first electronic, vacuum-tube-based, stored-program-controlled computer. The handheld electronic calculator we know today didn’t arrive until the 1960s, but a notable, massively portable, handheld mechanical calculator was released in 1948 – the Curta.
Also worth noting is Cambridge University’s EDSAC computer, which ran its first stored program in 1949. This was another room-sized computer which even used basic self-modifying code.
The Modern Era
The 50s and 60s saw us slowly move on from room-sized, vacuum-tube computers to digital devices that could sit by (but not yet on) your desk. But without delving too deeply into individual machines like SEAC, ACE, UNIVAC, and EDVAC, let’s touch on some of the themes from this era.
In 1950, Alan Turing published a paper describing what we now call the Turing Test. Turing rather presciently foretold the evolution of computer intelligence – and described a test that would evaluate a machine’s ability to exhibit human-like intelligence. Some AIs have arguably passed this test – the chatbot “Eugene Goostman” was claimed to have passed it back in 2014, and ChatGPT and Google’s LaMDA either have or haven’t passed the Turing Test, depending on who you ask.
This was an era when computers started doing some of the things we now take for granted or have even long surpassed: operating in real time; handling both text and numbers; using magnetic tape; making music; and using transistors rather than vacuum tubes. It was an age of firsts – the first programming language (as we know them today); the first silicon chips; mainframe computers; microprocessors; early computer gaming; the mouse; networks using packet switching; and more.
The 70s and 80s are where things really start to get recognisable by today’s standards. Desktop computers – that can actually sit on a desktop – emerged in the early 70s. ARPANET, the first wide area network, sent its first packets in 1969, with what is considered the first email sent in 1971. Scientific calculators hit the market, graphical user interfaces emerged, Ethernet networking was developed, and the 8-inch floppy disk was introduced.
After the 1968 inception of Intel, some other household names started to emerge during the 70s, like Microsoft, Apple, and Motorola. Retro gaming fans may recognise some names popping up during the 70s too, like Pong, Space Invaders, Atari, Activision, and Commodore.
Familiar home computer names continued into the 80s, with the ZX Spectrum, Amstrad, and the BBC Micro. The 80s also saw the release of the command-line operating system MS-DOS in 1981, Microsoft Word in 1983, and Windows in 1985, while the CD-ROM drive was developed and the first domain name (symbolics.com) was registered in 1985.
But arguably the biggest thing to happen to tech in the 80s was Tim Berners-Lee’s work on what would become the World Wide Web, which was eventually released to the public in the early 1990s.
The use of the Web gradually caught on during the 90s in both commercial and residential settings. The tech world’s focus on multimedia, graphics, gaming, connectivity, and speed also began in earnest, a legacy that continues today.
All of this has played a part in enormous technological strides, enriched countless lives, and even led to you reading this article today!
If you’ve enjoyed this article and would like us to cover computing’s recent history in more detail, let us know over on LinkedIn!