From Big to Small: A Look into the History of Computers

The modern computer is a great example of how technology has achieved wonders over the years. It’s crazy to think that not so long ago, only a very few people owned these machines, and if you were one of those lucky few, you probably needed an entire room to house it. Nowadays, computers are more compact than ever. The story of how they became so convenient to use is a long one, but this article will try to keep it bite-sized. Here is a brief look into the history of computers.

The Humble Abacus

Believe it or not, computers started from a simple invention: the abacus. This simple computing device was ahead of its time and served Chinese merchants and scholars well. The Chinese version dates back to around 1200 CE and is often considered the origin of computing machines, though earlier variations existed in Greece and Rome, some of them more compact than the Chinese design.

The Loom Machine

Interestingly enough, it took centuries for computing to advance from the abacus to a functioning machine that required minimal human intervention. In 1801, Joseph Marie Jacquard invented a loom that used punched cards to produce various woven designs. The loom did almost all of the work, and humans only needed to check the finished product. Later, in 1890, Herman Hollerith used a similar punched-card design to tabulate the U.S. census; his company eventually became part of IBM.

A Failed Design

It was Charles Babbage who designed what is often considered the first mechanical computer, the difference engine, demonstrating a small working portion of it in 1832. The structure and the ideas behind the difference engine made it possible for future inventors to build computers along similar lines. However, it took many years before that happened, because the difference engine never took off. The difference engine and the later analytical engine were both massive calculators. The former was accurate to up to 31 digits, while the latter (which was never built) was designed to handle even larger numbers.

Alan Turing and AI

Alan Turing is widely regarded as the father of modern computing. In 1936 he described a device known as the Turing machine, which, given enough time, could in principle compute anything that is computable. A literal Turing machine would take thousands of years to work through the mathematical problems we tackle today, but this foundation gave us the foothold we needed to create the supercomputers we have now. Turing also proposed the idea of artificial intelligence and devised the first test to determine whether computers can think for themselves. It’s called the Turing test, and it is still referenced to this day when we discuss whether machines can learn and think. But that’s a whole different story.

The Grandfather of Digital Computers

If the difference engine was the first mechanical computer, the Electronic Numerical Integrator and Computer (ENIAC) is the grandfather of digital computers. Built by John Mauchly and J. Presper Eckert between 1943 and 1945, it was big enough to fill a room and contained about 18,000 vacuum tubes. What it could do is similar to what the modern calculators we have today can do, and those now fit in the palm of your hand!

Bill Gates

Bill Gates comes into the picture in 1975, when he and his partner Paul Allen wrote software for the Altair 8800, an early personal computer built by MITS. The Altair itself wasn’t a big success, but the software business that grew out of it became the company we know today: Microsoft.

The First Personal Computer

IBM released its first personal computer in 1981. Code-named Acorn during development, it ran Microsoft’s MS-DOS operating system. This personal computer could fit on anyone’s desk, though it would still be considered bulky by modern standards. It could do everything the ENIAC could do and so much more. And it was much faster!

Video Games

By the mid-1990s, personal computers had become highly coveted because of the entertainment they could provide. Games like Command & Conquer made computers fascinating to use and introduced many people to the world of computing.

AMD’s Athlon 64

In 2003, AMD released the Athlon 64, the first 64-bit processor aimed at everyday consumers, and it could fit in the palm of your hand. Orders of magnitude more powerful than the ENIAC, it was one of the fastest processors of its time. This was also a period when microchips were becoming widely available in the consumer market.

Microchips

From then on, computers steadily became smaller. Chips were produced smaller every year, and these microchips made it possible for computers to become more efficient even as they shrank. Microchips made circuit boards easier to fabricate and lighter, which in turn made motherboards and processors much smaller. They ushered in the era of smartphones and modern computers, which can work through complex mathematical calculations in the blink of an eye.

The Era of Smartphones and Modern Times

The era of smartphones has recently dawned on us. Devices that fit in the palm of our hand now have more than enough computing power to send us to the moon. Of course, that’s only in theory; each of us would still need billions of dollars’ worth of spacecraft to actually make the trip, however capable our smartphones may be.

Still, that’s an amazing benchmark for how far we have progressed in science and technology over the years. This is where we will end this brief look into the history of computers. The future remains unknown, because we keep discovering new things we can do with this technology every year. Hopefully, it will be enough to make us a spacefaring civilization in the distant future.
