Computer industry

The computer industry is a major locus of innovation that has enabled all kinds of American businesses to manage information more efficiently.


The computer industry had its beginnings during the 1880’s, when the United States government faced a seemingly insurmountable problem in counting its population. By law, the government was required to perform a census every ten years to determine apportionment in the House of Representatives. However, the 1880 census had taken nine years to tabulate by hand, while immigration and new births were rapidly swelling the national population. It was recognized that tabulating the 1890 census quickly enough to make its data actually useful would require mechanical assistance.



The First Computers

Herman Hollerith, an engineer hired by the U.S. Bureau of the Census to collect and analyze industrial statistics, devised a machine that could input the necessary information in the form of punched cards and show the resulting count on clocklike dials. After lengthy negotiations, he installed a number of his machines and kept them working throughout the tabulation process. As a result of his diligent work, the 1890 census was tabulated in a year and a half.

The Electronic Numerical Integrator and Computer (ENIAC), pictured here, was one of the first general-purpose electronic computers; it was publicly unveiled in 1946. (National Archives)

Hollerith was soon receiving requests from other countries for his machines to be used in their periodic censuses. In addition, the railroads and other large companies were interested in such equipment to streamline their accounting departments. As a result, Hollerith formed a business, the Tabulating Machine Company, to build and market his machines. After a series of mergers, it would ultimately form part of International Business Machines (IBM), an early giant of the computer industry. IBM’s best-known competitors came to be known as the Seven Dwarfs: Burroughs, Control Data Corporation (CDC), General Electric, Honeywell, National Cash Register (NCR), the Radio Corporation of America (RCA), and Sperry Rand.

During the early decades of computing, each device was built specially for the agency or corporation that would use it. These devices were in many ways as experimental as those built by research institutions to study computing. The lack of uniformity among installations meant their production and operation were more craft than an industry. Individual components might be mass-produced, but beyond that level, economies of scale could not be brought to bear.

In 1964, IBM introduced the System/360, the first mass-produced mainframe computer using a standardized architecture and instruction set. For the first time, it was possible for a business or government agency to order a computer and software from stock. The shift from computer as a custom-designed item to computer as a product was as critical to the creation of the modern computer industry as the technological progression from electromechanical relay to vacuum tube to transistor. However, mainframe computers and their smaller siblings, the minicomputers (machines about the size of an entire desk), were all sold on the same service-contract model as the original computers. The computer company did not sell its customers a device, but a long-term relationship of integrated software and support.

This concept is critical to understanding just how revolutionary the microcomputer was. Rather than being the end point of a steady shrinking of the mainframe, the microcomputer had its roots in the youth culture of electronics enthusiasts in California’s Silicon Valley. Like the radio enthusiasts of the 1920’s, they were in love with the pioneering spirit of the new technology. With the development of the microprocessor, which put all the components of the traditional mainframe central processing unit onto a single piece of silicon, they could build up from this one chip to create a tiny computer.



Microcomputers

Apple Computer (later simply Apple) had its beginnings when Steve Wozniak built the original Apple I to prove to his friends in the Homebrew Computer Club that he could build a better computer with fewer parts, but his friend Steve Jobs saw a potential market for a preassembled computer that could be used by anyone. That computer became the Apple II, and it took off in the market so rapidly that it soon made the founders of Apple wealthy. Their success was noted by other companies, and soon there were a large number of microcomputers on the market, all using incompatible formats and proprietary software.

Once IBM entered the microcomputer market in 1981 with the Personal Computer (PC), its reputation landed it a solid market among businesses that wanted to put small computers on the desks of their workers. Eager to cash in on this market, a number of other companies took advantage of certain loopholes in IBM’s patent and licensing arrangements to build machines that would work the same way as an IBM PC. These low-cost IBM compatibles, often called “clones,” secured such a large portion of sales that competing approaches to the microcomputer were driven from the market, with one notable exception.

Jobs of Apple Computer refused to get on the bandwagon of IBM compatibility. In the Orwellian year of 1984, Apple showed a puzzling but prophetic advertisement in one of the coveted Super Bowl advertising slots. Known as “1984,” the advertisement featured hordes of gray-clad drones listening to a corporate talking head in a vast, grimy theater until they were interrupted by a muscular youth who flung a sledgehammer into the screen. A poke at the corporate domination of IBM, this advertisement introduced the Macintosh, a revolutionary new design in computing.

Jobs had become convinced that the microcomputer could succeed in the mass market only if it became an appliance. It must require no more understanding of its technology on the part of the user than a refrigerator or washing machine did. The Macintosh did away with the command line and its arcane commands, replacing them with an object-oriented graphical user interface built on a desktop metaphor. Anyone could sit down at a Mac and start doing useful work without needing to memorize commands.

However, the Mac’s success in the marketplace was limited by its high cost. Unlike IBM, Apple jealously guarded its proprietary architecture. Although Apple did license the technology for connecting peripherals to a Macintosh, consumers could buy the computer itself only from Apple, on its terms. Even with deep educational discounts, the Mac remained beyond the reach of many cash-strapped students. Many people chose an IBM-compatible computer on the basis of price (and because of IBM’s domination of the business market).

Apple did try some innovative marketing strategies to push the Macintosh in its early years. One of the most unusual was Test Drive a Mac, in which people could take a Mac for two days and try it out in the comfort of their home before deciding to buy it. The idea was that consumers would be so enchanted with the Mac experience that they could not bear to part with the Mac at the end of the trial period. However, Apple made one disastrous mistake: It rolled out the campaign during the 1985 Christmas season. Computer dealers already busy with the Christmas rush did not want the additional hassle of processing applications for loaner computers. As a result, the loaner program failed and was soon discontinued.



At the same time, users of the disk operating system (DOS) looking to simplify their experience were buying and installing shell software to interpret the command line for them. Most DOS shells offered a simplified set of menus, but Bill Gates’s company, Microsoft, offered one that used the visual metaphor that had been so successful for Apple. The earliest versions of Windows were primitive, but by Windows 3.1, the interface was smooth enough that Apple sued Microsoft for copyright infringement on the basis of look and feel. In a ferocious court battle, Microsoft won on the basis that Apple had taken the Macintosh Finder largely from the experimental Alto interface developed by Xerox. After that legal battle secured its future, Windows became the dominant microcomputer operating system, capturing 85 percent of the market by 1995.

By the middle of the 1990’s, Apple Computer seemed to have lost its way and was in danger of being put out of business altogether. Ironically, it was Microsoft’s own success that saved Apple. Because of its dominant position in the microcomputer operating system market, Microsoft became the target of a U.S. Department of Justice antitrust suit alleging that it had used illegal monopolistic practices to secure its predominance. As a result, Gates became increasingly willing to work out a joint venture deal with Jobs, who had returned to Apple as its new chief executive officer. The newly reinvigorated Apple simplified its line of products with the four-cell grid marketing scheme (personal vs. business, desktop vs. laptop) and secured its small but steady portion of the market share.



Mainframes and More

Although by 1990 the microcomputer in its various permutations had become people’s primary image of a “computer,” the mainframe had not vanished. In this market sector, IBM remained the dominant driving force. Critical as the IBM PC and its successors may have been in establishing microcomputer standards, mainframes remained IBM’s bread and butter. The use of microprocessors and superscalar architecture permitted mainframes to shrink from the size of entire rooms to that of small cabinets, but they generally continued to be purchased on the full-service model. With the growth of the Internet and particularly the World Wide Web, mainframes grew popular once again for use as server farms by companies such as Yahoo!, eBay, and Google, running the infrastructure that served the information superhighway.

At the uppermost end of the mainframe market, a new subtype of computer had appeared: the supercomputer. These giant number crunchers were more the descendants of university research computers such as ILLIAC than of business mainframes, but with the rise of companies such as Cray, they became manufactured items that research universities could order from an established model line.

The beginning of the twenty-first century saw the convergence of several information technology industries. The bottom of the mainframe industry began to blur into the high end of the microcomputer workstation market, and some of the smallest laptop and notebook microcomputers began to share features with high-end scientific calculators, digital cellular telephones, and digital cameras. In addition, an increasing portion of the computer industry was devoted to the production and implementation of ubiquitous yet almost entirely invisible microcontrollers built into ordinary household appliances, automobiles, and other mechanical systems to make them run more efficiently and serve their users better. It was often cheaper for manufacturers to buy bulk lots of a standard microcontroller and hire a programmer to write a program to control the appliance’s operations than to design and build a mechanical control system.
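The following is a minimal illustrative sketch, in C, of the kind of control program such a manufacturer might commission for a standard microcontroller, here a simple thermostat governing an electric heater. The sensor and switch routines (read_temperature_c and set_heater) are hypothetical stand-ins for whatever input and output hardware a real appliance would use; they are simulated here so that the sketch compiles and runs on an ordinary computer.

#include <stdio.h>

/* Hypothetical hardware interface: on a real microcontroller these routines
   would read an analog-to-digital converter and switch a relay. Here they
   are simulated so the sketch can be compiled and run anywhere. */
static double simulated_temp = 18.0;   /* room temperature, degrees Celsius */
static int heater_on = 0;

static double read_temperature_c(void) {
    /* The simulated room warms while the heater runs and cools while it is off. */
    simulated_temp += heater_on ? 0.5 : -0.3;
    return simulated_temp;
}

static void set_heater(int on) {
    heater_on = on;
}

int main(void) {
    const double target = 21.0;     /* desired temperature */
    const double hysteresis = 0.5;  /* dead band to avoid rapid on-off switching */

    /* A simple thermostat loop: the whole control system is a few comparisons,
       which is why a cheap standard microcontroller plus a short program can
       replace a purpose-built mechanical control system. */
    for (int tick = 0; tick < 20; tick++) {
        double temp = read_temperature_c();
        if (temp < target - hysteresis) {
            set_heater(1);
        } else if (temp > target + hysteresis) {
            set_heater(0);
        }
        printf("tick %2d: %.1f C, heater %s\n", tick, temp, heater_on ? "on" : "off");
    }
    return 0;
}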



Further Reading

  • Berlin, Leslie. The Man Behind the Microchip: Robert Noyce and the Invention of Silicon Valley. New York: Oxford University Press, 2005. Argues that Noyce and Fairchild Semiconductor were primarily responsible for Santa Clara County, California, becoming a major center of the computer industry.
  • Chandler, Alfred D., Jr. Inventing the Electronic Century: The Epic Story of the Consumer Electronics and Computer Industries. New York: Free Press, 2001. Overview of the rise of the computer industry.
  • Cringely, Robert X. Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition, and Still Can’t Get a Date. Reading, Mass.: Addison-Wesley, 1992. Focuses on the business culture of the computer industry.
  • Malone, Michael S. Infinite Loop: How Apple, the World’s Most Insanely Great Computer Company, Went Insane. New York: Doubleday, 1999. Business history of Apple, the first company to make a microcomputer for consumers.
  • Pugh, Emerson W. Building IBM: Shaping an Industry and Its Technology. Cambridge, Mass.: MIT Press, 1995. Business history of IBM, the giant of computer companies.
  • Reid, T. R. The Chip: How Two Americans Invented the Microchip and Launched a Revolution. New York: Random House, 2001. A basic history of the development of the microchip, critical to the development of modern computers.
  • Wallace, James, and Jim Erickson. Hard Drive: Bill Gates and the Making of the Microsoft Empire. New York: Harper Business, 1993. Looks at Gates’s role in dominating the microcomputer operating system market.



See Also

Apple

Automation in factories

Business crimes

Digital recording technology

eBay

E-mail

Fiber-optic industry

Bill Gates

Google

International Business Machines

Internet

Online marketing