Boulder/Longmont Computer Repair – History of the Computer – Computer Physicians, LLC  

Computer Physicians provides data recovery, computer troubleshooting, virus removal, networking, and other computer repair services.

Here is a good article about the history of computers from marygrove.edu.

History of the Computer

The history of the computer can be divided into six generations, each of which was
marked by critical conceptual advances.
The Mechanical Era (1623-1945)
The idea of using machines to solve mathematical problems can be traced at least as
far back as the early 17th century, to mathematicians who designed and implemented
calculators that were capable of addition, subtraction, multiplication, and division.
Among the earliest of these was Gottfried Wilhelm Leibniz (1646-1716), the German
philosopher and co-inventor (with Newton) of the calculus. Leibniz proposed the idea
that mechanical calculators (as opposed to humans doing arithmetic) would function
fastest and most accurately using a base-two, that is, binary system.
Leibniz actually built a digital calculator and presented it to the scientific authorities
in Paris and London in 1673. His other great contribution to the development of the
modern computer was the insight that any proposition that could be expressed
logically could also be expressed as a calculation, “a general method by which all the
truths of the reason would be reduced to a kind of calculation” (Goldstine 1972).
Inherent in the argument is the principle that binary arithmetic and logic were in some
sense indistinguishable: zeroes and ones could as well be made to represent positive
and negative or true and false. In modern times this would result in the understanding
that computers were at the same time calculators and logic machines.
Charles Babbage’s Difference Engine, begun in 1823 but never completed, was a
special-purpose calculating machine for producing mathematical tables. His more
ambitious Analytical Engine, designed in 1842, was probably the first multi-purpose,
i.e. programmable, computing device, but unfortunately it too was only partially
completed by Babbage.
That the modern computer is actually capable of doing something other than
numerical calculations is probably to the credit of George Boole (1815-1864), to
whom Babbage and his successors were deeply indebted. By showing that formal logic
could be reduced to equations whose results could only be zero or one, he made it
possible for binary calculators to function as logic machines (Goldstine 1972).
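To make the point concrete, here is a minimal sketch (not from the article; the helper names are purely illustrative) of Leibniz’s and Boole’s shared insight: the usual logical connectives can be written as ordinary arithmetic on the values 0 and 1.

```python
# Illustrative only: logical connectives expressed as arithmetic on 0 and 1.

def AND(a: int, b: int) -> int:
    # Conjunction is multiplication: the result is 1 only when both inputs are 1.
    return a * b

def OR(a: int, b: int) -> int:
    # Disjunction as saturating addition: a + b - a*b never leaves {0, 1}.
    return a + b - a * b

def NOT(a: int) -> int:
    # Negation simply flips 0 and 1.
    return 1 - a

# The arithmetic reproduces the familiar truth tables.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "NOT a:", NOT(a))
```

In this sense a binary calculator is already a logic machine: zeroes and ones serve equally well as numbers and as truth values.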
First Generation Electronic Computers (1937–1953)
Three machines have been promoted at various times as the first electronic computer.
These machines used electronic switches, in the form of vacuum tubes, instead of
electromechanical relays. Electronic components had one major benefit: they could
“open” and “close” about 1,000 times faster than mechanical switches.
The earliest of the three was the Atanasoff-Berry Computer (ABC), built by John
Atanasoff and Clifford Berry at Iowa State between 1939 and 1942. A second early
electronic machine was Colossus, built by Tommy Flowers for the British codebreakers
in 1943. This machine played an important role in breaking ciphers used by the
German military in World War II. Alan Turing, who also worked on wartime
codebreaking, made his main contribution to the field of computer science with the
idea of the “Turing machine,” a mathematical formalism, indebted to George Boole,
concerning computable functions.
The machine could be envisioned as a binary calculator with a read/write head
inscribing the equivalent of zeroes and ones on a movable and indefinitely long tape.
The Turing machine held the far-reaching promise that any problem that could be
calculated could be calculated with such an “automaton,” and, picking up from
Leibniz, that any proposition that could be expressed logically could, likewise, be
expressed by such an “automaton.”
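As a concrete illustration of the formalism, the sketch below simulates a toy Turing machine under the usual textbook assumptions (a finite transition table, a single read/write head, an unbounded tape); the particular machine shown does nothing more than invert a string of binary digits.

```python
# A minimal Turing-machine simulator (illustrative, not Turing's own notation):
# a read/write head moves over an unbounded tape, driven by a finite
# transition table that maps (state, symbol) to (write, move, next state).

def run_turing_machine(tape, transitions, state="scan", blank=" "):
    """Step the machine until it enters the 'halt' state; return the tape contents."""
    cells = dict(enumerate(tape))   # sparse tape: extensible indefinitely in both directions
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip()

# Transition table for a toy machine: flip each bit and move right; halt on blank.
flip_bits = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", " "): (" ", "R", "halt"),
}

print(run_turing_machine("10110", flip_bits))   # prints 01001
```

Even this trivial machine has the essential ingredients Turing identified: a finite control, a tape of symbols, and purely local read/write/move operations.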
The first general purpose programmable electronic computer was the Electronic
Numerical Integrator and Computer (ENIAC), built by J. Presper Eckert and John V.
Mauchly at the University of Pennsylvania. The machine wasn’t completed until 1945,
but then it was used extensively for calculations during the design of the hydrogen
bomb.
ENIAC’s successor, the EDVAC project, was significant as an example of the power of
interdisciplinary projects that characterize modern computational science.
By recognizing that functions, in the form of a sequence of instructions for a
computer, can be encoded as numbers, the EDVAC group knew the instructions could
be stored in the computer’s memory along with numerical data (a “von Neumann
Machine”).
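A few lines of code can make the stored-program idea concrete: instructions and data sit side by side in one memory as plain numbers, and a fetch-decode-execute loop interprets them. The opcodes below are invented for the illustration and have nothing to do with EDVAC’s actual instruction set.

```python
# Toy stored-program machine: the same memory holds instructions and data,
# all encoded as numbers (the essence of the "von Neumann machine").
# Invented opcodes: 1 = LOAD addr, 2 = ADD addr, 3 = STORE addr, 0 = HALT.

memory = [
    1, 8,     # LOAD  the value at address 8 into the accumulator
    2, 9,     # ADD   the value at address 9 to the accumulator
    3, 10,    # STORE the accumulator at address 10
    0, 0,     # HALT
    40, 2,    # data: the two operands
    0,        # data: the result will be written here
]

pc, acc = 0, 0
while memory[pc] != 0:            # fetch-decode-execute loop
    op, addr = memory[pc], memory[pc + 1]
    if op == 1:
        acc = memory[addr]
    elif op == 2:
        acc += memory[addr]
    elif op == 3:
        memory[addr] = acc
    pc += 2

print(memory[10])                 # prints 42
```

Because the program is just numbers in memory, it could in principle be read, copied, or even modified by the machine itself, which is exactly the flexibility the EDVAC group recognized.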
The notion of using numbers to represent functions was a key step used by Gödel in
his incompleteness theorem of 1931, work with which von Neumann, as a logician,
was quite familiar. Von Neumann’s own role in the development of the modern digital
computer is profound and complex, having as much to do with brilliant administrative
leadership as with his foundational insight that the instructions for dealing with data,
that is, programming, and the data themselves, were both expressible in binary terms
to the computer, and in that sense indistinguishable one from the other. It is that
insight which laid the basis for the “von Neumann machine,” which remains the
principal architecture for most actual computers manufactured today.
Second Generation Computers (1954–1962)
The second generation saw several important developments at all levels of computer
system design, from the technology used to build the basic circuits to the
programming languages used to write scientific applications.
Memory technology was based on magnetic cores which could be accessed in random
order, as opposed to mercury delay lines, in which data was stored as an acoustic
wave that passed sequentially through the medium and could be accessed only when
the data moved past the I/O interface.
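The difference in access patterns can be sketched as follows; this is an illustration of the idea, not a model of the hardware, and the timing is counted in abstract “word times”.

```python
# Illustrative contrast: core memory is directly addressable, while a delay
# line circulates its words and a read must wait for the wanted word to
# arrive at the I/O point.

from collections import deque

core = [10, 20, 30, 40]                 # random access: any address in one step
print(core[2])                          # prints 30 immediately

delay_line = deque([10, 20, 30, 40])    # words circulate in a fixed order

def read_delay_line(line, position):
    """Rotate the line until the wanted word reaches the read point."""
    waited = 0
    for _ in range(position):
        line.rotate(-1)                 # the acoustic wave moves on: wait one word time
        waited += 1
    value = line[0]
    line.rotate(waited)                 # restore the original order for clarity
    return value, waited

value, waited = read_delay_line(delay_line, 2)
print(value, "after", waited, "word times")   # prints 30 after 2 word times
```

The worst-case wait grows with the length of the line, which is why random-access core memory was such an important step.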
During this second generation many high level programming languages were
introduced, including FORTRAN (1956), ALGOL (1958), and COBOL (1959).
Important commercial machines of this era include the IBM 704 and its successors,
the 709 and 7094. The latter introduced I/O processors for better throughput between
I/O devices and main memory.
Third Generation Computers (1963–1972)
The third generation brought huge gains in computational power. Innovations in this
era include the use of integrated circuits, or ICs (semiconductor devices with several
transistors built into one physical component), semiconductor memories starting to be
used instead of magnetic cores, microprogramming as a technique for efficiently
designing complex processors, the coming of age of pipelining and other forms of
parallel processing, and the introduction of operating systems and time-sharing.
Fourth Generation Computers (1972–1984)
The next generation of computer systems saw the use of large scale integration (LSI —
1000 devices per chip) and very large scale integration (VLSI — 100,000 devices per
chip) in the construction of computing elements. At this scale entire processors could
fit onto a single chip, and for simple systems the entire computer (processor, main
memory, and I/O controllers) could fit on one chip. Gate delays dropped to about 1 ns
per gate.
Two important events marked the early part of the fourth generation: the development
of the C programming language and the UNIX operating system, both at Bell Labs. In
1972, Dennis Ritchie, seeking to meet the design goals of CPL and generalize
Thompson’s B, developed the C language.
Fifth Generation Computers (1984–1990)
The development of the next generation of computer systems is characterized mainly
by the acceptance of parallel processing. The fifth generation saw the introduction of
machines with hundreds of processors that could all be working on different parts of a
single program. The scale of integration in semiconductors continued at an incredible
pace — by 1990 it was possible to build chips with a million components — and
semiconductor memories became standard on all computers.
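The pattern described here, one program divided across several processors, can be sketched with ordinary worker processes; this is a toy illustration, nowhere near the scale or the architectures of the actual machines of the era.

```python
# Illustrative only: one computation (a large sum) is split into slices and
# the slices are handed to separate worker processes.

from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n, workers = 1_000_000, 4
    step = n // workers
    slices = [(i * step, (i + 1) * step) for i in range(workers)]

    with Pool(workers) as pool:
        pieces = pool.map(partial_sum, slices)   # each slice runs in its own process

    print(sum(pieces) == sum(range(n)))          # prints True
```

Each worker touches a different part of the same problem, which is the sense in which hundreds of processors could all be working on different parts of a single program.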
Sixth Generation Computers (1990–)
Many of the developments in computer systems since 1990 reflect gradual
improvements over established systems, and thus it is hard to claim they represent a
transition to a new “generation”; other developments, however, will prove to be
significant changes.
One of the most dramatic changes in the sixth generation will be the explosive growth
of wide area networking. Network bandwidth has expanded tremendously in the last
few years and will continue to improve for the next several years.