Intel Introduces the First “Computer on a Chip”

Engineers at Intel Corporation publicly introduced the first microprocessor, the Intel 4004, combining the basic logic circuits of a computer on a small square of silicon called a “chip.”


Summary of Event

The microelectronics industry began shortly after World War II with the invention of the transistor. During the war, radar researchers discovered that certain crystalline substances, such as germanium and silicon, possessed unique electrical properties that made them excellent signal detectors. This class of materials became known as semiconductors, because they were neither true conductors nor true insulators. Immediately after the war, Bell Telephone Laboratories began to conduct research on semiconductors in the hope that they might yield benefits for communications. The Bell physicists learned to control the electrical properties of semiconductor crystals by “doping” them with minute impurities. When two thin wire contacts were attached to such a crystal, the result was a crude device that could amplify a voice signal. The transistor, as this device was called, was developed late in 1947. The transistor duplicated many functions of vacuum tubes but was smaller, required less power, and generated less heat. The three Bell Labs scientists who headed its development—William Shockley, Walter H. Brattain, and John Bardeen—won the 1956 Nobel Prize in Physics.

Shockley left Bell Laboratories and returned to Palo Alto, California, where he formed his own company, Shockley Semiconductor Laboratories, as a subsidiary of Beckman Instruments. Palo Alto is the home of Stanford University, which, in 1954, set aside 655 acres of its land as a high-technology industrial park, known as Stanford Research Park. One of the first small companies to lease a site there was Hewlett-Packard. Many others followed, and the surrounding area of Santa Clara County gave rise in the 1960’s and 1970’s to a booming community of electronics firms known as “Silicon Valley.”

On the strength of his prestige, Shockley recruited eight young scientists from the eastern United States to work for him. One was Robert Norton Noyce, an Iowa-bred physicist with a doctorate from the Massachusetts Institute of Technology. After working at Philco’s transistor division, Noyce came to Shockley’s company in 1956. The “Shockley Eight,” as they became known in the industry, soon found themselves at odds with their boss over issues of research and development. Seven of the dissenting scientists negotiated with industrialist Sherman Fairchild, and they convinced the remaining holdout, Noyce, to join them as their leader. The Shockley Eight defected in 1957 to form a new company, Fairchild Semiconductor, in nearby Mountain View. Shockley’s company never recovered from the loss of these scientists and soon went out of business.

At the same time, an established Dallas company, Texas Instruments, began its own production of transistors. In 1957, the Soviets launched Sputnik, Earth’s first artificial satellite. The American aerospace industry struggled to catch up, and government contracts beckoned to the companies that could find a reliable manufacturing process for transistors. Texas Instruments was one of the first companies to obtain defense contracts for the manufacture of transistors.

Research efforts at Fairchild Semiconductor and Texas Instruments now focused on putting several transistors on one piece, or chip, of silicon. The first step involved making miniaturized electrical circuits. Jack St. Clair Kilby, a researcher at Texas Instruments, succeeded in making a circuit on a chip, consisting of tiny resistors, transistors, and capacitors, all connected with gold wires. He and his company filed for a patent on this “integrated circuit” in February, 1959. Noyce and his associates at Fairchild Semiconductor followed in July of that year with an integrated circuit manufactured by a “planar process,” which involved laying down several layers of semiconductor, isolated by layers of insulating material. Kilby and Noyce are generally recognized as coinventors of the integrated circuit, and both were eventually inducted into the National Inventors Hall of Fame for their efforts.

Marcian Edward Hoff, Jr., holds one of the computer chips he invented while with Intel in 1971. (Intel)

A significant development in semiconductor technology was the introduction of photolithography in the mid- and late 1950’s. This process enabled manufacturers to etch circuit patterns into a semiconductor by projecting a photographically reduced image of the circuit diagram onto its surface. The development of the integrated circuit, the planar manufacturing process, and photolithography were all crucial steps in the evolution of microelectronics. They reduced the need for labor-intensive attachment of individual components to a circuit board, greatly facilitating mass production.

In the early 1960’s, Fairchild Semiconductor and Texas Instruments led the semiconductor industry, in great measure because of contracts for missile guidance system components. By the time International Business Machines (IBM) introduced its System/360 line of computers in 1964, computer manufacturers were using “solid state” (semiconductor) components in their logic circuits. Computer memory had developed from vacuum tubes to “magnetic core,” which consisted of thousands of tiny ferrite rings through which wires were run. Federal regulations mandated that new television sets be able to receive ultrahigh-frequency (UHF) broadcasts, which was not practical with vacuum tubes. Semiconductor industry leaders saw enormous potential markets in computer memory and television components.

By 1968, Fairchild Semiconductor had grown to a point where many of its key Silicon Valley managers had major philosophical differences with the East Coast management of their parent company. This led to a major exodus of top-level management and engineers. Many started their own companies. Noyce, Gordon E. Moore, and Andrew Grove left Fairchild to form a new company in Santa Clara called Intel with $2 million from venture capitalist Arthur Rock. Intel’s main business was the manufacture of computer memory integrated circuit chips. By 1970, the Intel team was able to develop and bring to market a random-access memory (RAM) chip that was purchased in mass quantities by several major computer manufacturers, providing large profits for Intel.

In 1969, Marcian Edward Hoff, Jr., an Intel research and development engineer, met with engineers from Busicom, a Japanese firm. Busicom wanted Intel to design a set of integrated circuits for its desktop calculators, but Hoff told them their specifications were too complex. Nevertheless, Hoff began to think about the possibility of incorporating all the logic circuits of a computer central processing unit (CPU) into one chip. He began to design such a chip, called a microprocessor, which, when combined with a chip to hold the program and another to hold the data, would become a small general-purpose computer. Noyce encouraged Hoff and his associates to continue their work on the microprocessor, and Busicom contracted with Intel to produce the chip. Federico Faggin was hired from Fairchild, and he did the chip layout and circuit drawings. In January, 1971, the Intel team had its first working microprocessor, called the 4004. The following year, Intel made a higher-capacity microprocessor, the 8008, for Computer Terminal Corporation. That company also contracted with Texas Instruments to produce a chip with the same specifications as the 8008, which Texas Instruments produced in June, 1972. Other manufacturers soon produced their own microprocessors.
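
Hoff’s division of labor is easy to see in miniature. The short Python sketch below models a CPU’s fetch-decode-execute loop reading instructions from a program chip and operating on a separate data chip; the four-instruction repertoire (LOAD, ADD, STORE, HALT) is purely illustrative and is not the 4004’s actual instruction set.

    # A minimal sketch of Hoff's three-chip scheme: CPU logic in one place,
    # a program held in read-only memory, and data in read/write memory.
    # The tiny instruction set here is hypothetical, not the 4004's real one.
    ROM = [                # program chip (read-only)
        ("LOAD", 0),       # accumulator <- RAM[0]
        ("ADD", 1),        # accumulator += RAM[1]
        ("STORE", 2),      # RAM[2] <- accumulator
        ("HALT", None),
    ]
    RAM = [2, 3, 0]        # data chip (read/write)

    acc, pc = 0, 0         # accumulator and program counter live on the CPU
    while True:
        op, arg = ROM[pc]  # fetch
        pc += 1
        if op == "LOAD":   # decode and execute
            acc = RAM[arg]
        elif op == "ADD":
            acc += RAM[arg]
        elif op == "STORE":
            RAM[arg] = acc
        else:              # HALT
            break

    print(RAM)             # prints [2, 3, 5]

Changing the program in ROM changes what the machine does without touching the CPU at all, which is what made a single mass-produced chip general purpose.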

Hoff later admitted, “If we had not made the 4004 in 1971, someone else would have invented the microprocessor in a year or two.” Indeed, Texas Instruments also claims to have invented both the microprocessor and the microcomputer. Furthermore, according to a 1990 article in Computerworld, a California engineer named Gilbert Hyatt received a patent on July 17, 1990, for a single-chip microcomputer. The patent had been filed in December, 1970, for work dating back to 1968. Regardless of who actually invented the microprocessor, its appearance in the early 1970’s was a watershed event.



Significance

The evolution of integrated circuits produced a corresponding evolution of computer technology. Profits from the sales of integrated circuit memory chips to computer companies allowed the semiconductor makers to research and develop new, more powerful integrated circuits. In 1965, Moore, the director of research at Fairchild Semiconductor and one of the original Shockley Eight, noted that the number of elements in the most advanced integrated circuits had doubled each year since 1959 and predicted that the trend, which became known as Moore’s law, would continue. Coupled with Moore’s law is the “learning curve,” which describes the 20 to 30 percent decline in an integrated circuit’s price with each doubling of cumulative production, as manufacturing efficiency improves.
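
The arithmetic behind these two rules compounds quickly. The Python sketch below works through both; the specific numbers (a 64-component chip in 1965 doubling annually, a 25 percent price decline per doubling of output) are illustrative assumptions, not figures from this article.

    import math

    def components(year, base_year=1965, base_count=64):
        # Moore's law in its original form: component count doubles each year.
        return base_count * 2 ** (year - base_year)

    def unit_price(cumulative_units, first_unit_price=100.0, decline=0.25):
        # Learning curve: each doubling of cumulative production cuts the
        # unit price by `decline` (20 to 30 percent is the usual range).
        doublings = math.log2(cumulative_units)
        return first_unit_price * (1 - decline) ** doublings

    for year in (1965, 1970, 1975):
        print(year, components(year))          # 64, 2,048, 65,536 components
    print(unit_price(1), round(unit_price(1024), 2))   # 100.0 versus 5.63

At those assumed rates, a chip’s component count grows a thousandfold in a decade while ten doublings of output cut its price by more than 90 percent; that compounding is the economic engine behind the developments described below.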

The integrated circuit industry became a fiercely competitive arena in which companies such as Texas Instruments, Intel, Motorola, and Fairchild Semiconductor contended. Many other new companies were created in the process. After leaving Fairchild, Shockley’s original team eventually spawned more than thirty corporate descendants in Silicon Valley.

By 1975, microcomputer kits were sold to hobbyists for less than a thousand dollars. More powerful computers made by Apple, Commodore, and Radio Shack soon appeared in stores for the same price. Intel was assured a dominant role in the industry when its 8088 microprocessor was used as the basis of the IBM personal computer (PC), introduced in 1981. IBM’s prestige as the world’s largest computer company legitimated the PC for business use. The PC also created a standard that served to unify a growing number of divergent microcomputer architectures.

The learning curve of microprocessor technology made available, for a few dollars, computers the size of a fingernail that a decade earlier would have filled a room and cost hundreds of thousands of dollars. Computers had previously existed behind closed doors as the mysterious domain of a few scientists and technicians, but the microprocessor allowed almost anyone to own a computer, often without even being aware of it. The microprocessor found its way into automobiles, microwave ovens, wristwatches, telephones, and an abundance of other ordinary items encountered in everyday life.

On the cutting edge of technology, ever more powerful computers were developed, often using a parallel architecture of several CPUs. The declining cost of microelectronic technology made it possible for more and more companies to invest in and make use of a global network of communications satellites.

After several decades, Moore’s law was still an accurate measure of integrated circuit development, although Noyce and others acknowledged that chip densities have an upper limit imposed by the laws of physics. More than four decades after the 4004, integrated circuits contained hundreds of millions of transistors. The future of this technology is difficult to predict in detail, but computer technology shows every sign of continuing to develop rapidly.



Further Reading

  • Berlin, Leslie. The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley. New York: Oxford University Press, 2005. This well-researched biography includes interviews with Andy Grove, Steve Jobs, and Gordon Moore, among others.
  • Hanson, Dirk. The New Alchemists: Silicon Valley and the Microelectronics Revolution. Boston: Little, Brown, 1982. Chronicles the evolution of semiconductor technology from the days of Thomas Alva Edison. The history, methods, and culture of the semiconductor industry are richly described, along with the industry’s impact on communications, the military, consumer products, and larger social issues such as freedom and unemployment.
  • Harrington, Maura J. “Micro Chip Patent Rewrites History.” Computerworld 24 (September 3, 1990): 4. This short article contains the startling report that twenty years after filing with the U.S. Patent Office, Gilbert Hyatt was granted a patent for the first microcomputer chip.
  • Jackson, Tim. Inside Intel: The Story of Andrew Grove and the Rise of the World’s Most Powerful Chip Company. 2d ed. New York: Plume Books, 1998. History of the genesis and rise of Intel, the world’s largest semiconductor company.
  • Large, Peter. The Micro Revolution Revisited. London: Frances Pinter, 1984. An English view of the impact of microprocessors on human society. Contains a description of how chips are made and their history.
  • Malone, Michael S. The Microprocessor: A Biography. Santa Clara, Calif.: Telos, 1995. Presents a general history of the microprocessor, how it works, and the people involved with its development.
  • Noyce, Robert N. “Microelectronics.” Scientific American 237 (September, 1977): 63-69. Cites Moore’s law as the basis for predicting the rapid growth and proliferation of computing power. The revolutionary impact on society that Noyce predicted continues to unfold.
  • Osborne, Adam. An Introduction to Microcomputers. 3d ed. Berkeley, Calif.: Osborne/McGraw-Hill, 1982. For those who desire to learn about how microprocessors work, Osborne provides a good introduction. The book is filled with illustrations and diagrams that help explain the components and functions of microprocessors and their related devices to the nontechnical reader.
  • _______. Running Wild: The Next Industrial Revolution. Berkeley, Calif.: Osborne/McGraw-Hill, 1979. Industry pioneer Osborne’s account of the microprocessor’s impact on technology, society, and the future. Unfortunately, the unusual insight evident in this book was not enough to ensure the success of the company Osborne founded in 1980 to manufacture the Osborne 1 portable computer. After two years of huge initial sales, production and cash flow problems caused Osborne Computer to go bankrupt.
  • Reid, T. R. The Chip: How Two Americans Invented the Microchip and Launched a Revolution. Rev. ed. New York: Random House, 2001. Biography of inventors Jack Kilby and Robert Noyce. Provides a useful introduction to the science of microchips.
  • Rogers, Everett M., and Judith K. Larsen. Silicon Valley Fever: The Growth of High-Technology Culture. New York: Basic Books, 1984. Thorough chronicle of Silicon Valley, the hub of the U.S. semiconductor industry, with an emphasis on the companies that form its history, economics, and culture. Chapters 2 and 6 are particularly informative. The authors spent two years interviewing industry leaders. Contains many personal accounts that offer an intimate look at key events.


Texas Instruments Introduces the Pocket Calculator

Jobs and Wozniak Found Apple Computer

Apple II Becomes the First Successful Preassembled Personal Computer

IBM Introduces Its Personal Computer

Introduction of Optical Discs for Data Storage

Introduction of the Apple Macintosh

IBM and Apple Agree to Make Compatible Computers

Intel Introduces the Pentium Processor