1920s-1950s:
The EAM Era From the 1920s through the mid-1950s,
punched-card technology improved with the addition of more
punched-card devices and more sophisticated capabilities. The
electromechanical accounting machine (EAM) family of punched-card
devices includes the card punch, verifier, reproducer, summary
punch, interpreter, sorter, collator, and accounting machine. Most of
the devices in the 1940s machine room were "programmed" to perform a
particular operation by the insertion of a prewired control panel. A
machine-room operator in a punched-card installation had the
physically challenging job of moving heavy boxes of punched cards
from one device to the next.
1904-1995:
Dr. John V. Atanasoff In 1939 Dr. John V. Atanasoff, a professor
at Iowa State University, and graduate student Clifford E. Berry
assembled a prototype of the ABC (for Atanasoff Berry Computer) to
cut the time physics students spent making complicated calculations.
A working model was finished in 1942. Atanasoff's decisions—to use
an electronic medium with vacuum tubes, the base-2 numbering system,
and memory and logic circuits—set the direction for the modern
computer. Ironically, Iowa State failed to patent the device and
IBM, when contacted about the ABC, airily responded, "IBM will never
be interested in an electronic computing machine." A 1973 federal
court ruling officially credited Atanasoff with the invention of the
automatic electronic digital computer.
1942:
The First Computer, The ABC During the years 1935 through 1938,
Dr. Atanasoff had begun to think about a machine that could reduce
the time it took for him and his physics students to make long,
complicated mathematical calculations. The ABC was, in fact, born of
frustration. Dr. Atanasoff later explained that one night in the
winter of 1937, "nothing was happening" with respect to creating an
electronic device that could help solve physics problems. His
"despair grew," so he got in his car and drove for several hours
across the state of Iowa and then across the Mississippi River.
Finally, he stopped at an Illinois roadhouse for a drink. It was in
this roadhouse that Dr. Atanasoff overcame his creative block and
conceived ideas that would lay the foundation for the evolution of
the modern computer.
1944:
The Electromechanical Mark I Computer The first
electromechanical computer, the Mark I, was completed by Harvard
University professor Howard Aiken in 1944 under the sponsorship of
IBM. A monstrous 51 feet long and 8 feet high, the Mark I was
essentially a serial collection of electromechanical calculators and
was in many ways similar to Babbage's analytical machine. (Aiken was
unaware of Babbage's work, though.) The Mark I was a significant
improvement, but IBM's management still felt electromechanical
computers would never replace punched-card equipment.
1946:
The Electronic ENIAC Computer Dr. John W. Mauchly (middle)
collaborated with J. Presper Eckert, Jr. (foreground) at the
University of Pennsylvania to develop a machine that would compute
trajectory tables for the U.S. Army. (This was sorely needed; during
World War II, only 20% of all bombs came within 1000 feet of their
targets.) The end product, the first fully operational electronic
computer, was completed in 1946 and named the ENIAC (Electronic
Numerical Integrator and Computer). A thousand times faster than its
electromechanical predecessors, it occupied 15,000 square feet of
floor space and weighed 30 tons. The ENIAC could do 5000 additions
per second and 500 multiplications per second. Unlike computers of
today that operate in binary, it operated in decimal and required 10
vacuum tubes to represent one decimal digit. The ENIAC's use of
vacuum tubes signaled a major breakthrough. (Legend has it that the
ENIAC's 18,000 vacuum tubes dimmed the lights of Philadelphia
whenever it was activated.) Even before the ENIAC was finished, it
was used in the secret research that went into building the first
atomic bomb at Los Alamos.
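The cost of ENIAC's decimal design can be illustrated with a short sketch. A decimal machine typically held each digit in a ring counter, one tube "on" out of ten (a one-hot encoding), whereas a binary register packs the same digit into four bits. The code below compares the two encodings only; it is an illustration of the idea, not a model of ENIAC's actual circuitry.

```python
# Illustration: one-hot "ring counter" storage of a decimal digit
# (roughly one tube per digit value) versus a 4-bit binary register.
# This sketches the encodings only, not ENIAC's real hardware.

def ring_counter(digit: int) -> list[int]:
    """One-hot encoding: element i is on only when i equals the digit."""
    return [1 if i == digit else 0 for i in range(10)]

def binary(digit: int) -> list[int]:
    """4-bit binary encoding of a decimal digit, most significant bit first."""
    return [(digit >> b) & 1 for b in (3, 2, 1, 0)]

print(ring_counter(7))  # ten positions, exactly one of them on
print(binary(7))        # the same digit in just four bits
```

For a ten-digit number, the one-hot scheme needs 100 storage elements where binary needs about 34, which is one reason Atanasoff's base-2 choice proved so influential.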
1951:
The UNIVAC I and the First Generation of Computers The first
generation of computers (1951-1959), characterized by the use of
vacuum tubes, is generally thought to have begun with the
introduction of the first commercially viable electronic digital
computer. The Universal Automatic Computer (UNIVAC I for short),
developed by Mauchly and Eckert for the Remington-Rand Corporation,
was installed in the U.S. Bureau of the Census in 1951. Later that
year, CBS News gave the UNIVAC I national exposure when it correctly
predicted Dwight Eisenhower's victory over Adlai Stevenson in the
presidential election with only 5% of the votes counted. Mr. Eckert
is shown here instructing news anchor Walter Cronkite in the use of
the UNIVAC I.
1954:
The IBM 650 Not until the success of the
UNIVAC I did IBM make a commitment to develop and market computers.
IBM's first entry into the commercial computer market was the IBM
701 in 1953. However, the IBM 650 (shown here), introduced in 1954,
is probably the reason IBM enjoys such a healthy share of today's
computer market. Unlike some of its competitors, the IBM 650 was
designed as a logical upgrade to existing punched-card machines. IBM
management went out on a limb and estimated sales of 50—a figure
greater than the number of installed computers in the entire nation
at that time. IBM actually installed 1000. The rest is history.
1907-1992:
"Amazing" Grace Murray Hopper Dubbed "Amazing Grace" by her many
admirers, Dr. Grace Hopper was widely respected as the driving force
behind COBOL, the most popular programming language, and a champion
of standardized programming languages that are hardware-independent.
In 1959 Dr. Hopper led an effort that laid the foundation for the
development of COBOL. She also created a compiler that enabled COBOL
to run on many types of computers. Her reason: "Why start from
scratch with every program you write when a computer could be
developed to do a lot of the basic work for you over and over
again?" To Dr. Hopper's long list of honors, awards, and
accomplishments, add the fact that she found the first "bug" in a
computer—a real one. She repaired the Mark II by removing a moth
that was caught in Relay Number II. From that day on, every
programmer has debugged software by ferreting out its bugs, or
errors, in programming syntax or logic.
1959:
The Honeywell 400 and the Second Generation of Computers The
invention of the transistor signaled the start of the second
generation of computers (1959-1964). Transistorized computers were
more powerful, more reliable, less expensive, and cooler to operate
than their vacuum-tubed predecessors. Honeywell (its Honeywell 400
is shown here) established itself as a major player in the second
generation of computers. Burroughs, Univac, NCR, CDC, and
Honeywell—IBM's biggest competitors during the 1960s and early
1970s—became known as the BUNCH (the first initial of each
name).
1963:
The PDP-8 Minicomputer During the 1950s and early 1960s, only
the largest companies could afford the six- and seven-digit price
tags of mainframe computers. In 1963 Digital Equipment Corporation
introduced the PDP-8 (shown here). It is generally considered the
first successful minicomputer (a nod, some claim, to the playful
spirit behind the 1960s miniskirt). At a mere $18,000, the
transistor-based PDP-8 was an instant hit. It confirmed the
tremendous demand for small computers for business and scientific
applications. By 1971 more than 25 firms were manufacturing
minicomputers, although Digital and Data General Corporation took an
early lead in their sale and manufacture.
1964:
The IBM System 360 and the Third Generation of Computers The
third generation was characterized by computers built around
integrated circuits. Of these, some historians consider IBM's System
360 line of computers, introduced in 1964, the single most important
innovation in the history of computers. System 360 was conceived as
a family of computers with upward compatibility; when a company
outgrew one model it could move up to the next model without
worrying about converting its data. System 360 and other lines built
around integrated circuits made all previous computers obsolete, but
the advantages were so great that most users wrote the costs of
conversion off as the price of progress.
1964:
BASIC—More than a Beginner's Programming Language In the early
1960s, Dr. Thomas Kurtz and Dr. John Kemeny of Dartmouth College
began developing a programming language that a beginner could learn
and use quickly. Their work culminated in 1964 with BASIC. Over the
years, BASIC gained widespread popularity and evolved from a
teaching language into a versatile and powerful language for both
business and scientific applications. From micros to mainframes,
BASIC is supported on more computers than any other language.
1971:
Integrated Circuits and the Fourth Generation of Computers
Although most computer vendors would classify their computers as
fourth generation, most people pinpoint 1971 as the generation's
beginning. That was the year large-scale integration of circuitry
(more circuits per unit of space) was introduced. The base
technology, though, is still the integrated circuit. This is not to
say that two decades have passed without significant innovations. In
truth, the computer industry has experienced a mind-boggling
succession of advances in the further miniaturization of circuitry,
data communications, and the design of computer hardware and
software.
1975:
Microsoft and Bill Gates In 1968, seventh grader Bill Gates and
ninth grader Paul Allen were teaching the computer to play Monopoly
and commanding it to play millions of games to discover gaming
strategies. Seven years later, in 1975, they set a course that
would revolutionize the computer industry. While at Harvard,
Gates and Allen developed a BASIC programming language for the first
commercially available microcomputer, the MITS Altair. After
successful completion of the project, the two formed Microsoft
Corporation, now the largest and most influential software company
in the world. Microsoft was given an enormous boost when its
operating system software, MS-DOS, was selected for use by the IBM
PC. Gates, now the richest man in America, provides the company's
vision on new product ideas and technologies.
1977:
The Apple II Not until 1975 and the introduction of the Altair
8800 personal computer was computing made available to individuals
and very small companies. This event has forever changed how society
perceives computers. One prominent entrepreneurial venture during
the early years of personal computers was the Apple II computer
(shown here). Two young computer enthusiasts, Steven Jobs and Steve
Wozniak (then 21 and 26 years of age, respectively), collaborated to
create and build their Apple II computer on a makeshift production
line in Jobs' garage. Seven years later, Apple Computer earned a
spot on the Fortune 500, a list of the 500 largest corporations in
the United States.
1981:
The IBM PC In 1981, IBM tossed its hat into the personal
computer ring with its announcement of the IBM Personal Computer, or
IBM PC. By the end of 1982, 835,000 had been sold. When software
vendors began to orient their products to the IBM PC, many companies
began offering IBM-PC compatibles or clones. Today, the IBM PC and
its clones have become a powerful standard for the microcomputer
industry.
1982:
Mitchell Kapor Designs Lotus 1-2-3 Mitchell Kapor is one of the
major forces behind the microcomputer boom in the 1980s. In 1982,
Kapor founded Lotus Development Company, now one of the largest
applications software companies in the world. Kapor and the company
introduced an electronic spreadsheet product that gave IBM's
recently introduced IBM PC (1981) credibility in the business
marketplace. Sales of the IBM PC and the electronic spreadsheet,
Lotus 1-2-3, soared.
1984:
The Macintosh and Graphical User Interfaces In 1984 Apple
Computer introduced the Macintosh desktop computer with a very
friendly graphical user interface—proof that computers
can be easy and fun to use. Graphical user interfaces (GUIs) began
to change the complexion of the software industry. They have changed
the interaction between human and computer from a short,
character-oriented exchange modeled on the teletypewriter to the now
familiar WIMP interface—Windows, Icons, Menus, and Pointing devices.
1985 to present:
Microsoft Windows Microsoft introduced Windows, a
GUI for IBM-PC-compatible computers, in 1985; however, Windows did
not enjoy widespread acceptance until 1990 with the release of
Windows 3.0. Windows 3.0 gave a huge boost to the software industry
because larger, more complex programs could now be run on IBM-PC
compatibles. Subsequent releases, including Windows 95, Windows NT,
and Windows 98, make personal computers even easier to use, fueling
the PC explosion of the 1990s.