A brief history of the world’s greatest invention: The CPU

The current CPU is made of silicon, the same material as its 1970s predecessors. (Photo: FreePik)
We all remember attending math classes in school where the teachers would solemnly insist that it was imperative for students to learn how to do the math by hand rather than with a calculator. That statement was invariably followed by “you won’t carry a calculator around with you in real life!”

Fast forward to today: we now have devices that fit in the palm of our hands and not only do everything your average calculator can, but pack enough processing power to perform thousands of tasks rapidly, efficiently, and, most importantly, on demand.

Interestingly enough, the creation of the modern-day silicon CPU was something of an accident; or, more accurately, it was never intended to be used in the way that it is today.

The current CPU is made of silicon, the same material as its 1970s predecessors. A transistor is the essential building block of any processor, and silicon is the major component that makes up a transistor.

Transistors are essentially “switches” that are activated and deactivated by an electric current. The 1s and 0s that make up the language that computers speak are fundamentally represented by this “on” and “off” function (binary).
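
To make that concrete, here is a tiny Python sketch of the idea; the number 42 and the eight-bit width are arbitrary choices for illustration, not anything specific to a real CPU.

# Every value a CPU works with is ultimately a pattern of transistor
# states: "on" (1) and "off" (0). Here, the number 42 becomes such a pattern.
value = 42
bits = format(value, "08b")  # "00101010": eight switches, some on, some off
print(f"{value} is stored as the switch pattern {bits}")

# Reading the pattern back: every switch that is "on" contributes a power of two.
reconstructed = sum(2 ** i for i, bit in enumerate(reversed(bits)) if bit == "1")
assert reconstructed == value  # the switches encode exactly the same number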

Today we’re going to go over the history of the CPU and the ways it has shaped the human experience over the last five decades.

The company that started it all: Intel

Intel, at this point in time, is a household name.

Whether you’re a tech geek, an average Joe, or someone who knows next to nothing about technology, at some point in your life you have either heard of Intel, recognized its logo, seen its ads, or all of the above.

In the 1970s, Intel partnered with a Japanese calculator manufacturer, Nippon Calculating Machine Corporation (better known as Busicom), and agreed to develop the next generation of chips for its advanced calculators.

After many trials and errors, Intel successfully developed the 4004: an extraordinary, first-of-its-kind microprocessor that would be sold to the general public for around $60.

This chip was revolutionary for its time, and it kick-started the technological boom of the ‘80s and ‘90s.

Developed by Federico Faggin, Marcian Hoff, and Masatoshi Shima, the 4004 squeezed an entire general-purpose processor onto a single silicon chip. It contained around 2,300 random-logic transistors and boasted five times the speed of other chips on the market, all while being 50 percent cheaper than the alternatives.

Previously, computers were bulky, incredibly expensive, and almost entirely inaccessible to the general public. But as a result of this invention, along with its patented silicon-gate technology, Intel was able to launch an ambitious research effort into bringing smaller, faster, and more affordable chips to the general marketplace.

IBM, an American computer company, chose Intel’s 16-bit 8088 — a then highly advanced chip — as the CPU for its first mass-produced personal computer (PC) in 1981. Intel also supplied microprocessors to companies that created PC “clones” compatible with IBM’s product, leading to a significant spike in demand for its chips.

The 80386 — a 32-bit chip introduced in 1985 — established the company’s promise to make all future microprocessors backward-compatible with earlier CPUs. This was possibly the most important of the numerous microprocessors Intel has manufactured, as application developers and PC customers could be confident that software written for older Intel machines would run on the latest versions.

Fast forward to 1993, when Intel outdid itself yet again with the Pentium microprocessor.

New era of power

When Intel introduced the Pentium microprocessor, it abandoned its number-oriented product naming practices in favor of trademarked names. The Pentium was the first Intel processor for PCs to employ parallel, or superscalar, processing, which enhanced its performance dramatically.

The Pentium had 3.1 million transistors, compared with the 1.2 million of its predecessor, the 80486.

In just over two decades, Intel had managed to go from 2,300 transistors to a staggering 3.1 million.

The substantially faster Pentium CPU, combined with Microsoft’s Windows 3.x operating system, aided the meteoric rise of the PC industry. Although most PCs were still purchased by companies, the higher-performance Pentium machines enabled consumers to use PCs for multimedia and graphics programs, such as games, that demanded more processing power.

By the end of the century, Intel processors and similar chips from firms such as AMD could be found in virtually every PC except Apple Inc.’s Macintosh, which had relied on Motorola CPUs since 1984.

However, Apple’s then-CEO Steve Jobs stunned the industry in 2005 when he announced that future Apple computers would employ Intel CPUs.

Today, one of the most technologically advanced chip families, Apple’s M1, packs anywhere from roughly 16 billion transistors on the base chip to more than 100 billion on its largest variant. This relentless growth in transistor density is described by “Moore’s Law”: the observation that the number of transistors — and, as a result, the power — of CPUs doubles roughly every two years.
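
To get a feel for just how aggressive that doubling is, here is a small back-of-the-envelope Python sketch. It assumes a strict two-year doubling starting from the 4004’s roughly 2,300 transistors in 1971; the real cadence has of course been less tidy, so treat the output as a rough projection rather than a measurement.

START_YEAR = 1971          # the Intel 4004 shipped with ~2,300 transistors
START_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2  # Moore's two-year doubling assumption

def projected_transistors(year: int) -> int:
    """Project a transistor count under a strict two-year doubling schedule."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return round(START_TRANSISTORS * 2 ** doublings)

for year in (1993, 2021):
    print(year, f"{projected_transistors(year):,}")

# 1993 -> roughly 4.7 million (the real Pentium shipped with 3.1 million)
# 2021 -> roughly 77 billion (in the same ballpark as today's largest chips)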

Thus far, we have been able to keep up with the law that was ingeniously predicted in 1965; but we are now reaching a crossroads.

The atomic limit

Computer engineers around the globe are now beginning to test the limits of what is thought to be possible. As it stands, packing additional transistors into our modern-day CPUs is proving to be a challenge; with every shrink, we edge closer to hard physical limits. And as we move along Moore’s curve, it becomes ever more evident that the future of the CPU is not merely microscopic in scale, but atomic.

Whether we continue along the path Gordon Moore predicted, or begin to see the growth of processing power stagnate, the CPU has undoubtedly been one of humanity’s most revolutionary inventions.

