At the same time that transistors were replacing vacuum tubes, Jack S. Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor were independently developing the integrated circuit (IC). Working by different methods, each found that the components of electronic circuits could be placed together, or integrated, onto small chips. Soon a single silicon chip less than one-eighth of an inch square could hold sixty-four complete circuits. That seems crude today, when a single chip may contain several million transistors.
These chips marked the beginning of third-generation computers: computers that used less power, cost less, and were smaller and much more reliable than previous machines. Although the computers themselves became smaller, their internal memories grew larger because memory could now be placed on chips.
A major third-generation innovation came when IBM realized that it was turning out too many incompatible products. The company responded by designing the System/360 computers, which served both scientific and business applications and introduced the family concept of computers. The first series consisted of six computers designed to run the same programs and use the same input, output, and storage equipment, with each model offering a different memory capacity. For the first time, a company could buy a computer knowing that its investment in programs and peripheral equipment would not be wasted when the time came to move to a machine with more memory. Other manufacturers followed IBM's lead, and before long more than 25,000 such computer systems were installed in the United States.
Other developments of this period included minicomputers. Although these machines had many of the same capabilities as large computers, they were much smaller, had less storage space, and cost less. The use of remote terminals also became common. Remote terminals are computer terminals located some distance from a main computer and linked to it through communication lines, such as telephone lines.
The software industry also began to emerge in the 1960s. Programs to handle payroll, billing, and other business tasks became available at fairly low cost. Yet software was rarely free of "bugs," or errors. The computer industry experienced growing pains as software development lagged behind advances in hardware technology. Rapid advances in hardware meant that old programs had to be rewritten to suit the circuitry of each new machine, and programmers skilled enough to do this were scarce. Software problems led to a rash of computer-error horror stories: a $200,000 water bill, for example, or $80,000 worth of duplicate welfare checks.