A brief history of computing

Humans have always built tools and devised innovations to make life more comfortable and to let us do more, faster and more efficiently. We need to go back in time a few hundred years to see the first attempts at building a tool that could resemble a computer. Before we do that, however, we might want to define what a computer is. Wikipedia offers the following definition:

A computer is a machine that can be instructed to carry out sequences of arithmetic or logical operations automatically via computer programming.

So, a computer is a programmable machine that performs arithmetic or logical operations. Let's review a few inventions from the past using this definition to ascertain which of them could be considered a computer.

To begin, we can rule out the Jacquard machine, an automated loom invented in the early years of the 19th century. These looms could be programmed using punch cards, but they produced woven silk, which, of course, is not the result of an arithmetic or logical operation. Programmability using punch cards was an idea that survived well into the computer age, but these looms were not computers.

If we go even further back in time, we find devices such as the abacus that helped us perform arithmetic operations; however, they were not programmable.

In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, created some mechanical dolls that he called automata. These dolls could read instructions and could thereby be considered programmable, but they did not perform arithmetic or logical operations. Instead, he created one doll that could play music, one that could make drawings, and one that could write letters (they are referred to as the musician, the draughtsman, and the writer):

Figure 1.1: The Jaquet-Droz automata (photograph by Rama, Wikimedia Commons; CC BY-SA 2.0 FR)

In order to see something that resembles a computer, we need to look at Charles Babbage's inventions. He originated the concept of a programmable computer with his designs for a machine called the Difference Engine and, later, a more advanced version called the Analytical Engine. Of the two, the Analytical Engine was particularly groundbreaking, as it could be programmed and could therefore be used to solve different kinds of problems. Babbage presented his work in the first half of the 19th century, and even though the machines were never completed, we can agree that he is a very important figure behind the basic concept of the programmable computer.

During the first half of the 20th century, we witnessed some analog computers, but it was not until the Second World War, and the years that followed, that we saw the birth of real digital computers. The difference between the two is that an analog computer works with continuously variable physical quantities, such as voltage, temperature, or pressure, whereas a digital computer works with input that can be represented by discrete numbers.

Many people consider the Electronic Numerical Integrator and Computer (ENIAC), constructed by J. Presper Eckert and John Mauchly between 1943 and 1946, to be the first digital computer, because it was the first one that was both completed and fully functional:

Figure 1.2: Betty Jean Jennings and Fran Bilas, both programmers, operate ENIAC's main control panel (U.S. Army photo, public domain)

Since then, we have seen tremendous development, right up to the present day. However, even though our modern computers can do so much more, and at a much faster rate, than these earlier inventions, the basic principles of how they operate remain the same.