Who Invented the Computer? Microprocessors

This is the thirty-fourth installment in our ongoing series, “Who Invented the Computer?”

The microprocessor is the beating heart of the information age.

In July of 1968, in the region of California that would eventually become widely known as “Silicon Valley,” Gordon E. Moore (just barely becoming known for Moore’s law) and Robert Noyce (co-inventor of the integrated circuit) opened the doors of a small start-up company with the goal of building and selling integrated circuits.

They named their company Intel and backed it with an initial $2.5 million investment. It would eventually grow to become the world’s largest manufacturer (by revenue) of semiconductor chips. In the process their creations would forever alter the computer industry and, by extension, the world in unimaginable and wondrous ways.

Numbers Game

Moore and Noyce’s first product was nothing to brag about, but it was their second that put them on the map. The 1103, a 1024-bit dynamic Random Access Memory (DRAM) chip, was a huge success that enabled the company to go public and abruptly ended the industry’s reliance on magnetic core memory.

Intel’s next chip, and its first true microprocessor, was the famous 4004, the result of a three-year joint project with Japan’s Busicom Corporation. Designed by engineers Federico Faggin, Ted Hoff, Stanley Mazor, and Masatoshi Shima, the 4004 proved the superiority of silicon-gate MOS technology, which enabled twice as many transistors to be placed on a chip while delivering roughly five times the operating speed.

The 4004 was the world’s first general-purpose computer on a chip. It consisted of three functional units: an Arithmetic Logic Unit (ALU) to perform mathematical and logical operations on data, registers that acted as temporary data storage locations, and a Control Unit that directed the flow of data between the chip and the rest of the system and dictated how the computer’s internal memory should respond to instructions.
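
The three units described above can be sketched as a toy fetch-and-dispatch loop. This is an illustrative model only, with a made-up three-instruction set; it is not the 4004’s actual architecture or opcodes:

```python
# Toy model of the three units named above (NOT the real 4004 design):
# registers hold data, the "if/elif" dispatch plays the ALU, and the
# loop itself plays the control unit routing instructions and data.

def run(program, registers):
    """Control unit: step through instructions and route data."""
    for op, dst, src in program:
        if op == "ADD":                  # ALU: arithmetic
            registers[dst] = registers[dst] + registers[src]
        elif op == "AND":                # ALU: logic
            registers[dst] = registers[dst] & registers[src]
        elif op == "MOV":                # register transfer
            registers[dst] = registers[src]
    return registers

# Registers act as temporary storage for the data being worked on.
regs = {"r0": 3, "r1": 5, "r2": 0}
program = [("MOV", "r2", "r0"), ("ADD", "r2", "r1")]  # r2 = r0 + r1
print(run(program, regs)["r2"])  # → 8
```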

With 2,300 transistors, the 4004 had a 4-bit word size, enough to represent the decimal digits 0 through 9 in binary-coded decimal, making it a perfect fit for its original design purpose, the Busicom 141-PF calculator. In November of 1971 the 4004 became the first commercially available microprocessor.
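
The reason a 4-bit word suits a calculator is that each decimal digit fits in one 4-bit “nibble,” a scheme known as binary-coded decimal (BCD). A small illustrative sketch (the helper names here are our own, not anything from the 4004):

```python
# A 4-bit word can hold values 0-15, which comfortably covers the
# decimal digits 0-9 used in BCD calculator arithmetic.

def to_bcd_nibbles(n):
    """Split a non-negative decimal number into one digit per nibble."""
    return [int(d) for d in str(n)]

def nibble_bits(d):
    """Render a single decimal digit as its 4-bit binary pattern."""
    assert 0 <= d <= 9, "BCD nibbles only hold the digits 0-9"
    return format(d, "04b")

# The number 141 (as in the Busicom 141-PF) becomes three nibbles:
print([nibble_bits(d) for d in to_bcd_nibbles(141)])
# → ['0001', '0100', '0001']
```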

Since it was a general-purpose chip, it was soon being used in various other pieces of computerized equipment such as cash registers, teller machines, billing machines, and even a microprocessor-controlled pinball machine built for a Las Vegas casino.

Double Your Pleasure

During the development of the 4004, Intel was simultaneously developing a processor called the 1201 under contract with the Computer Terminal Corporation (CTC, later renamed Datapoint). Although the device failed to meet CTC’s performance requirements, the agreement between the companies permitted Intel to market it to other customers.

Feeling the 1201 had market value, Intel continued working on it, renaming it the 8008 to maintain a sense of continuity with the 4004.

The 8008 was unveiled in April of 1972 as the first 8-bit programmable microprocessor. It contained 3,500 transistors, roughly 50 percent more than the 4004. Although its initial clock speed (the number of cycles a CPU executes per second) was no higher than the 4004’s, its 8-bit architecture let it move far more data per cycle, and it could execute tens of thousands of instructions per second. The biggest advantage of the 8008 was its ability to manipulate data and characters alike, which gave it a much larger range of applications.
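
The relationship between clock speed and instruction throughput can be shown with a line of arithmetic: a processor’s instruction rate is its clock rate divided by the average number of cycles each instruction takes. The figures below are round illustrative values, not exact 8008 specifications:

```python
# Illustrative arithmetic (assumed round numbers, not measured specs):
# throughput = clock rate / average cycles per instruction, which is
# why clock rate alone doesn't tell the whole performance story.

def instructions_per_second(clock_hz, avg_cycles_per_instruction):
    return clock_hz / avg_cycles_per_instruction

# e.g. a 500 kHz clock averaging 10 cycles per instruction:
print(instructions_per_second(500_000, 10))  # → 50000.0
```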

Soon after its release, the 8008 became the must-have chip, and seats at Intel product seminars were in such high demand that the company began charging entrance fees.

Intel could easily have used its microprocessors to become a computer manufacturer, but instead made the wise decision to keep pushing the 4004 and 8008 as chips that could be embedded in control applications for cash registers and other machines.

More (or Moore?) Transistors!

Faggin, who had played the central role in the development of both chips, still wasn’t satisfied. While his team was finishing up the 8008, he began pushing management to let him create a true single-chip microprocessor that combined speed with practicality. The board gave him the green light and, in 1974, the Intel 8080 (the “eighty-eighty”) went on the market at a price of $360.

Utilizing a 6-micron process, the 8080 contained an impressive 4,500 transistors, up from the 8008’s 3,500. In addition to being roughly 10 times faster than its predecessor, capable of performing more than 290,000 operations per second, the 8080 offered far greater application flexibility. It was quickly designed into thousands of different devices, most importantly various early microcomputers such as the Altair 8800 and the IMSAI 8080.

Intel had proven the viability of the microprocessor, and before long other companies, such as Motorola, were hard at work designing and producing their own. In doing so, Intel blew open the doors of development for ever more powerful chips and, eventually, the personal computer.

Impact of Microprocessors

Microprocessors are rightly called the “engines of the digital age.” Their invention was a true revolution in computing, enabling a single tiny chip to replace room-sized computers containing thousands of vacuum tubes. Without them, our modern world could not exist.

Today, it’s almost impossible to find an everyday device that does not contain a microprocessor. They are literally everywhere: in automobiles, household appliances, electric shavers, billions of computers and smartphones, sewer and water systems, vending machines, traffic lights, construction equipment, gaming systems, and supercomputers.

As microprocessors continue to grow in power, new fabrication technologies allow manufacturers to place billions of transistors, and dozens of processor cores, on a single chip. No one knows what the future of computing will be, but it’s certain that until something better comes along, it will involve ever more microprocessors.

About the Author
Calvin Harper

Calvin Harper is a writer, editor, and publisher who has covered a variety of topics across more than two decades in media. Calvin is a former GoCertify associate editor.