From little acorns: the humble history of computing

We started with scratched bones, and now we’re fearing the robot takeover. Today we’ve become accustomed to carrying almost all human knowledge in our back pockets, and we take smart technology for granted.
But who would have thought that computing began with baboon bones, pebbles, and clockwork counting? How did bones evolve into artificial intelligence, widespread automation and space exploration? The chronicles of computing wind back through the centuries, spanning a journey that’s longer and more diverse than you might think.
From humble origins to humbling advancements, we explore the history of computing.
BC: Before Computers
Would you believe it if we told you that the history of computing dates as far back as 19,000 BC?
There might not have been Microsoft in the late stone age, but there was a calculator. The earliest calculation tool was an African device: the Ishango bone. This was a notched baboon bone that (experts believe) was used for simple calculations.
The first certain instance of a calculator came much later, in 2500 BC. This was known as a counting board, which later evolved into the abacus. The abacus mimicked counting on the fingers, since at the time counting and addition were usually done by hand.
They might not be as complex as a modern computer, but these early tools mark the first instances of calculation aids – simple computing, in other words.
The history of computing’s clockwork foundations
Bones and pebbles aside, the BC years also saw the description of the number system that underpins modern computing. In around 200 BC, the binary number system was first described by the Indian mathematician Pingala. Despite being so old, binary lives on today – providing the foundation of 0s and 1s still used in modern computing equipment.
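To make those 0s and 1s a little more concrete, here is a minimal sketch in Python (our choice of language for illustration, nothing Pingala would recognise) showing how an everyday number breaks down into binary digits:

```python
# A quick look at binary: every whole number can be written
# using only the digits 0 and 1.
number = 13

# Python's built-in bin() shows the binary form as a string.
print(bin(number))          # -> '0b1101', i.e. 8 + 4 + 0 + 1

# The same idea by hand: repeatedly divide by 2 and keep the remainders.
digits = []
n = number
while n > 0:
    digits.append(n % 2)    # the remainder is the next binary digit
    n //= 2
print(digits[::-1])         # -> [1, 1, 0, 1]
```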
Just 75 years later (ish), the ‘Antikythera Mechanism’ came into being. It was a clockwork computer that could track the relative positions of heavenly bodies (stars and planets, not angels, sadly).
The medieval dream of robots
The latter half of the medieval era (5th century to the 15th) saw basic devices evolving into dreams of automated humans. Al-Jazari invented – among some 100 other technical innovations – automata. These included plans for a clockwork mannequin and a drink-serving mannequin, forming the first designs for humanoid automated robots, way back in 1206.
These classic automata later influenced Leonardo da Vinci. In 1493, he added his own rich contribution to the history of computing with plans for building a robot, including detailed designs of a mechanical man. While none of these plans ever left the page, they are among the earliest blueprints for the automation and robots that are so prevalent today.
1600s – Automated calculators
The history of computing doesn’t just see automation in the form of robots. In the 1600s people began to work on mechanical calculators, though the first true inventor is a topic for debate.
In 1623, German inventor Wilhelm Schickard drew up plans for a calculating machine that could add and subtract six-digit numbers. Unfortunately, the original machine was reportedly destroyed in a fire before it was completed. Evidence of these plans was not discovered until much later.
20 years after Schickard, in 1642, French teenager Blaise Pascal invented and built a (sort of) working calculator, which he called the Pascaline. Its calculations weren’t always accurate, but he was still celebrated as the sole inventor of the mechanical calculator.
It wasn’t until over 100 years later that Schickard’s letters were found. Once they were, Pascal became one of two credited inventors of the early mechanical calculator – still an impressive accomplishment.
1800s – Programs, computers and automation
In the 19th century, the history of computing finally shifted towards the technology we’d recognise today, with the invention of programmable computers.
This started with the creation of ‘punch cards’ by Joseph Jacquard in 1801, which told mechanical looms what pattern to weave. Building on this technology, Charles Babbage began designing the first programmable computer, which he called the Analytical Engine.
Ada Lovelace, a friend of Babbage’s, was the first computer programmer. She designed a program, which would have been delivered on punch cards, that would have worked had the Analytical Engine ever been completed.
At the end of the 1800s, Herman Hollerith invented the Hollerith desk, a punch-card machine that automated counting. It reduced the time it took to complete the 1890 US census and, beyond that, laid some of the foundations of the data processing industry.
Hollerith’s legacy has stood the test of time. His company would go on to become part of IBM, and punch card automation became the norm – used for everything from calculating gas bills to monitoring workers.
Wartime wonders
Not even WWII could halt technological progress. Despite wartime shortages of resources, some exciting new pages were turned in the history of computing.
In 1941, German inventor Konrad Zuse created the latest in a series of digital computers – arguably the first software-controlled machines. Unfortunately, three out of four of them were destroyed in the war. Another invention that claims the title of first programmable digital computer is Colossus, built in Great Britain in 1943.
With the production of programmable computers, the first computer bug shortly followed (of course). Grace Hopper recorded it in 1947 – amusingly, an actual bug (a moth) caused the fault when it flew into the mechanisms of the machine.
PCs and portability
The history of computing saw a boom in the late 20th century and at the turn of the millennium. Companies such as Apple and IBM started releasing personal computers, making computing more widespread and accessible than ever before. Apple’s Newton, released in the early 1990s, shaped handheld devices such as the tablets we have today.
After the invention of the World Wide Web by Sir Tim Berners-Lee in 1990, chatbots such as Jabberwacky and, later, IBM’s Watson started to develop alongside the chatrooms that preceded social media.
Smartphones and tablets evolved as computers became more portable, and they grow more powerful with every passing month. Now AI is in the limelight, and computing looks set to continue developing into an automated, artificial intelligence-powered future.
Present computing presents
The history of computing has survived wars and revolutions to bring us where we are today. It’s been humble from the start – a few scratches on a monkey bone – but all signs indicate that we’re headed for a thrilling future. Who would have guessed that our screen addiction started with scratched monkey bones in ancient Africa?