Sam Sattel


Before It’s Time – Computers Were Around a Century Ago, We Just Never Heard About Them

Some inventions come about before their time and seem to disappear into the vacuum of history overnight, not to be seen again for ages. Take solar cells, first invented in 1883 by Charles Fritts but not widely adopted until over a century later. Then there’s something closer to home: the computer. Despite what many think, computers weren’t invented in the 1940s. What started in the 1940s may be the computers we know today, but they were built on an existing foundation, on the shoulders of the engineers who came before.

The first computer reaches all the way back to the 1800s, but like other great inventions that arrived before their time, it didn’t survive the passage of time well. Maybe we just weren’t ready for it, or it wasn’t ready for us? Whatever the reason, the events that transpired over a century ago laid the foundation for today’s digital computer revolution, and without them, you wouldn’t be reading this blog.

So who do we have to thank?

The Land Before Bits and Bytes

Back before computers were even associated with mechanical devices, the word “computer” was first used in 1613 as a label for a person who performed calculations. That definition would stick to its human counterpart for over two centuries, until the 1800s arrived. It was during this time, 1822 to be exact, that English mathematician, philosopher, and inventor Charles Babbage first introduced the concept of the computer, only he called it the Difference Engine.


Mr. Charles Babbage himself, known as the Father of Computing. (Image source)

The Difference Engine was 100% mechanical, capable only of computing numbers and recording its results on physical materials. The limitations were pretty clear to Babbage, and to make the leap from simple calculations to beefier computations, he was going to need a more general-purpose tool. And so, as funding for the project from the British government started to dry up, the famous inventor turned his sights to something bigger: a general-purpose computing machine he called the Analytical Engine.
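Before moving on, it’s worth seeing what the Difference Engine actually did. It mechanized the method of finite differences: once a polynomial’s starting value and its differences are set on the machine, every later value falls out of nothing but repeated addition. Here’s a minimal sketch of that method in modern code (the function name and layout are ours, not Babbage’s):

```python
# The Difference Engine tabulated polynomials using only addition:
# given f(0) and its finite differences, each successive value comes
# from cascading additions, with no multiplication needed.

def tabulate(initial_differences, steps):
    """Tabulate a polynomial from its initial finite differences.

    initial_differences[0] is f(0); initial_differences[k] is the
    k-th finite difference at 0. A degree-n polynomial needs n+1 values.
    """
    diffs = list(initial_differences)
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        # Cascade: fold each difference into the one above it.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# Example: f(x) = x^2 has f(0) = 0, first difference 1, second difference 2.
print(tabulate([0, 1, 2], 6))  # [0, 1, 4, 9, 16, 25]
```

That addition-only trick is exactly what made the machine practical to build out of gears and wheels, and it’s why its tables were limited to polynomial calculations.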


The Analytical Engine in all its beauty with included memory and processing power. (Image source)

The Analytical Engine was, by all means, the foundation for the digital computers we know and use today. While still mechanical in nature, it contained a set of systems that closely parallel today’s technology, including:

  • A Store, which acted like the memory in today’s computers, holding numbers and the results of calculations.
  • A Mill, the equivalent of today’s Central Processing Unit (CPU) found in everything from desktops to smartphones, which performed arithmetic calculations.
  • Flow Control, still in use in today’s programming environments for things like conditional branching, looping, parallel processing, latching, and polling.
  • Outputs, used to print the results of programmed computations on physical materials like punch cards, which we’ve since replaced with monitors.
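To make those parallels concrete, here’s a toy sketch of a machine with a Store, a Mill, flow control, and output. The instruction set is made up for illustration; it is not Babbage’s actual punched-card operation set:

```python
# Conceptual sketch only: Store -> memory, Mill -> arithmetic,
# cards -> program, printer -> output. The ops are invented, not Babbage's.

def run(program, store):
    """Execute (op, *args) instructions against the Store; return the output."""
    pc = 0           # position among the "operation cards"
    output = []
    while pc < len(program):
        op, *args = program[pc]
        if op == "ADD":              # the Mill performs arithmetic
            a, b, dest = args
            store[dest] = store[a] + store[b]
        elif op == "JUMP_IF_ZERO":   # flow control: conditional branching
            var, target = args
            if store[var] == 0:
                pc = target
                continue
        elif op == "JUMP":           # flow control: looping
            pc = args[0]
            continue
        elif op == "PRINT":          # output to a physical medium
            output.append(store[args[0]])
        pc += 1
    return output

# Sum 1..5 with a loop; the store's named slots play the Engine's number columns.
program = [
    ("JUMP_IF_ZERO", "i", 4),        # 0: done when the counter hits zero
    ("ADD", "total", "i", "total"),  # 1: total += i
    ("ADD", "i", "minus_one", "i"),  # 2: i -= 1
    ("JUMP", 0),                     # 3: back to the top
    ("PRINT", "total"),              # 4: emit the result
]
print(run(program, {"i": 5, "total": 0, "minus_one": -1}))  # [15]
```

Memory, arithmetic, branching, looping, and output: every ingredient of this little interpreter has a counterpart in the Engine’s design.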

As you can see, what Babbage created laid the groundwork for the computers of our digital age, and it could all be programmed. This machine would take input in the form of a program, do the heavy lifting with its Mill, store the results in its memory, and output them on a physical medium. These fundamental processes are exactly how today’s computers work; Babbage was a hundred years ahead of his time! And he wasn’t alone in his ingenuity. He had a partner who understood his inventions just as deeply and saw the future of their possibilities with programming.

Her name? Ada Lovelace.

Enchanting Numbers

To understand Ada Lovelace, considered by the world of computer science to be the first programmer, you first have to understand her parents. Ada was the daughter of the famous poet Lord Byron, and if there’s anything to know about this man, it’s that he had some violent mood swings.

And as you can imagine, the relationship between Lord Byron and Ada’s mother, Lady Anne Isabella, didn’t last long; they ended up splitting just a few weeks after Ada was born. From that moment on, everything shifted in Ada’s life. Instead of having her taught poetry and art, Ada’s mother steered all of her daughter’s studies toward science, philosophy, and mathematics, with the goal of ensuring that Ada would never turn out like her father.


Ada Lovelace, the mother of all programmers. (Image source)

Her mother’s strategy worked. Ada was tutored in the mathematics and sciences of the day and flourished. At the age of 17, she met Charles Babbage, and a decades-long friendship began. Despite the huge age difference, Babbage and Lovelace were intellectual equals. Babbage would mentor Ada, and in turn, Ada learned about his Difference Engine and Analytical Engine, and she was entranced.

At one point, Babbage asked Ada to translate an article on his Analytical Engine written by an Italian engineer. In the process, Ada not only translated all of the text from French to English, she also added her own set of notes about the machine and its implications for the future, and she didn’t hold back.


Just some of the notes Lovelace included in her translation; this one is considered to be the first computer program ever written. (Image source)
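That program, from what is known as Note G, laid out how the Engine could compute the Bernoulli numbers. As a modern illustration, here is the same computation using the standard recurrence for Bernoulli numbers rather than Lovelace’s exact tabular steps:

```python
# Compute Bernoulli numbers with exact rational arithmetic, via the
# standard recurrence: sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1.
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return [B_0, ..., B_n] (using the B_1 = -1/2 convention)."""
    B = [Fraction(0)] * (n + 1)
    B[0] = Fraction(1)
    for m in range(1, n + 1):
        # Sum the known terms, then solve the recurrence for B_m.
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B[m] = -acc / (m + 1)
    return B

print(bernoulli(8))  # B_2 = 1/6, B_4 = -1/30, B_6 = 1/42, B_8 = -1/30
```

Lovelace worked the equivalent sequence of operations out by hand for a machine that was never built, which is precisely why that note is celebrated as the first computer program.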

Ada understood, at a very young age, that these machines Babbage had invented could do more than just work with numbers: they could manipulate any data that numbers could represent, and with that, the possibilities were endless. Ada saw a glimpse of the future in the Analytical Engine, with possibilities like:

  • Being able to create elaborate pieces of music of any degree of complexity.
  • Being able to manipulate symbols for complex computation, not just calculation.
  • Being able to use a computer not just for computation but also for graphical drawings.

In short, Ada was a prophet of the coming computer age that would one day dominate our entire society. Except she was over 100 years too early. At the time, Ada’s published work disappeared into the vacuum of history, and so did Ada.

A Century Down the Road

It wasn’t until over 100 years later that Ada Lovelace’s contributions to computer science and the foundations Babbage laid for modern computing were finally brought to light. Ada’s writings on the potential of computer programming resurfaced when her notes on Babbage’s invention were republished by B.V. Bowden in the 1953 book Faster Than Thought: A Symposium on Digital Computing Machines. Since then, Ada has become known worldwide as the first computer programmer, and the United States Department of Defense even named a programming language after her, called Ada.

All of the groundwork that Charles Babbage laid in the 1800s also came to fruition in the form of the first concept for the modern computer, described by Alan Turing in 1936. Did Turing base his invention on the work Babbage created a century earlier? Who knows. What he did conceive was a machine controlled by a program whose coded instructions could be processed, stored, and output. All of these systems, the memory, the processing capability, the input of data, and the output of results, had been accomplished a century earlier by Babbage.


Here we have a modern replica of what a Turing Machine might look like today. (Image source)

The rest of the history of computer development seems to rush by in a blur. The first electronic programmable computer, the Colossus, was built in 1943 and helped British codebreakers read encrypted German messages during World War 2. From there we have the first general-purpose electronic digital computer in 1946, the ENIAC, which took up over 1,800 square feet, packed in 18,000 vacuum tubes, and weighed in at about 30 tons. By 1974, we had the first personal computer that could be purchased by the masses, the Altair 8800. And today, we’ve got computers we can strap to our wrists; the progress is just mind-blowing.


The first personal computer that started the personal computing craze, the Altair 8800. (Image source)

Laying the Foundations

This blog isn’t just a history lesson; it’s a reminder. A reminder of the importance of foundations, and of how most great inventions are built on top of them. Without Charles Babbage’s early success with a mechanical computational machine or Ada Lovelace’s insight into the possibilities of computer programming, we would never be where we are today. And this is how progress in engineering works on a large scale, even beyond computers.

All of the work that we do, day in and day out, is done because we stand on the shoulders of the engineers who came before us. We’re not drawing the same circuits over and over in every new design, or recreating the same parts from scratch, because we know that what we or someone else created in the past can be trusted and relied upon. In many ways, the success we experience today is only possible because of the work done in the past, whether through your own engineering efforts or from inventors and visionaries like Charles Babbage and Ada Lovelace. These two saw the future clearly and pointed us in a new direction, and without them, our digital computer age would never be as it is today.

Autodesk EAGLE helps you use the technologies of the past to design the future. Check out Module Design Blocks today!
