Wednesday, May 20, 2009

History Of Computers

The development of the modern-day computer was the result of advances in technology and man's need to quantify. Papyrus helped early man to record language and numbers, and the abacus was one of the first counting machines. Some of the earlier mechanical counting machines lacked the technology to make the design work; for instance, some had parts made of wood before metal manipulation and manufacturing matured. Imagine the wear on wooden gears. This history of computers site includes the names of early pioneers of math and computing, along with links to related sites about the History of Computers for further study. This site would be a good Web adjunct to accompany any book on the History of Computers or Introduction to Computers. The "H" section includes a link to the History of the Web Beginning at CERN, which includes a bibliography and related links. Hitmill.com strives to always include related links for a broader educational experience. The material was originally divided into Part 1 and Part 2.
Generations of Computers:

In the late 1960s and early 1970s, there was much talk about "generations" of computer technology, which was commonly divided into three generations:

First Generation Computers (1940-1956):

The first generation of computers is said by some to have started in 1946 with ENIAC, the first 'computer' to use electronic valves (i.e., vacuum tubes). Others would say it started in May 1949 with the introduction of EDSAC, the first stored program computer. Either way, the distinguishing feature of the first generation computers was the use of electronic valves.
My personal take on this is that ENIAC was the world's first electronic calculator and that the era of the first generation computers began in 1946, because that was the year when people consciously set out to build stored program computers (many won't agree, and I don't intend to debate it). The first past the post, as it were, was the EDSAC in 1949. The period closed about 1958 with the introduction of transistors and the general adoption of ferrite core memories.
OECD figures indicate that by the end of 1958 about 2,500 first generation computers were installed world-wide. (Compare this with the number of PCs shipped world-wide in 1997, quoted as 82 million by Dataquest).
Two key events took place in the summer of 1946 at the Moore School of Electrical Engineering at the University of Pennsylvania. One was the completion of the ENIAC. The other was the delivery of a course of lectures on "The Theory and Techniques of Electronic Digital Computers". In particular, they described the need to store the instructions to manipulate data in the computer along with the data. The design features worked out by John von Neumann and his colleagues and described in these lectures laid the foundation for the development of the first generation of computers. That just left the technical problems!
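The central idea described in those lectures, keeping instructions and data side by side in one memory, can be sketched with a toy machine. Everything below (the instruction names, the memory layout, the single accumulator) is invented purely for illustration and does not model any real first-generation machine:

```python
# A minimal sketch of the stored-program idea: instructions and data
# occupy the SAME memory, and the machine fetches, decodes, and
# executes instructions in sequence.

def run(memory):
    """Execute a tiny, hypothetical instruction set held in `memory` itself."""
    acc = 0  # single accumulator register
    pc = 0   # program counter: an address into the same memory as the data
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":      # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":     # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":   # memory[arg] <- acc
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program occupies cells 0-3; data occupies cells 4-6 of the same memory.
memory = [
    ("LOAD", 4),   # 0: acc = memory[4]
    ("ADD", 5),    # 1: acc += memory[5]
    ("STORE", 6),  # 2: memory[6] = acc
    ("HALT", 0),   # 3: stop
    2, 3, 0,       # 4, 5, 6: data
]
result = run(memory)
print(result[6])   # -> 5
```

Because the program is ordinary memory content, it can be loaded, replaced, or even modified like any other data, which is exactly what distinguished stored-program machines from fixed-wiring calculators such as the original ENIAC.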

One of the projects to commence in 1946 was the construction of the IAS computer at the Institute for Advanced Study at Princeton. The IAS computer used a random access electrostatic storage system and parallel binary arithmetic. It was very fast when compared with the delay line computers, with their sequential memories and serial arithmetic.
The Princeton group was liberal with information about their computer and before long many universities around the world were building their own, close copies. One of these was the SILLIAC at Sydney University in Australia.

First Generation Technologies:
In 1946 there was no 'best' way of storing instructions and data in a computer memory. There were four competing technologies for providing computer memory: electrostatic storage tubes, acoustic delay lines (mercury or nickel), magnetic drums (and disks?), and magnetic core storage.
A high-speed electrostatic store was the heart of several early computers, including the computer at the Institute for Advanced Study in Princeton. Professor F. C. Williams and Dr. T. Kilburn, who invented this type of store, described it in Proc. I.E.E. 96, Pt. III, 40 (March, 1949).
The great advantage of this type of "memory" is that, by suitably controlling the deflector plates of the cathode ray tube, it is possible to redirect the beam almost instantaneously to any part of the screen: random access memory.
Acoustic delay lines are based on the principle that electricity travels at the speed of light while mechanical vibrations travel at about the speed of sound. So data can be stored as a string of mechanical pulses circulating in a loop, through a delay line with its output connected electrically back to its input. Of course, converting electric pulses to mechanical pulses and back again uses up energy, and travel through the delay line distorts the pulses, so the output has to be amplified and reshaped before it is fed back to the start of the tube.
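That recirculating loop can be sketched in a few lines of code. The class and method names below are hypothetical, invented only to illustrate the cycle of pulse, amplifier/reshaper, and re-injection, and the sequential-access cost it implies:

```python
from collections import deque

# A rough sketch (not a model of any specific machine) of a recirculating
# delay-line memory: bits travel through a fixed-length delay, and the
# output is reshaped, re-amplified, and fed back to the input.

class DelayLineMemory:
    def __init__(self, bits):
        self.line = deque(bits)   # pulses currently "in flight" in the line

    def tick(self):
        """One clock step: a pulse emerges from the line, is reshaped
        (here trivially, standing in for the amplifier), and re-enters
        at the input end."""
        pulse = self.line.popleft()
        reshaped = 1 if pulse else 0
        self.line.append(reshaped)
        return reshaped

    def read_word(self):
        """Access is sequential: to read the word you must wait for
        every bit to come around the loop, unlike random access."""
        return [self.tick() for _ in range(len(self.line))]

mem = DelayLineMemory([1, 0, 1, 1])
print(mem.read_word())   # -> [1, 0, 1, 1]; contents survive the full cycle
```

The `read_word` method makes the contrast with the electrostatic store above concrete: a delay line can only hand you bits in the order they circulate, so average access time depends on where in the loop the wanted bit happens to be.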

Second Generation - 1956-1963:

Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat, which could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These computers also stored their instructions in memory, which moved from magnetic drum to magnetic-core technology.
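The jump from binary machine language to symbolic assembly can be illustrated with a toy assembler. The mnemonics, opcode numbers, and 8-bit word format below are invented for the example and do not correspond to any real instruction set:

```python
# Why symbolic assembly was easier than raw binary: the programmer writes
# mnemonics ("LOAD 4") and a translator produces the bit patterns that
# previously had to be hand-coded. Opcode values here are hypothetical.

OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "HALT": 0b1111}

def assemble(lines):
    """Translate 'MNEMONIC operand' lines into 8-bit machine words:
    high 4 bits are the opcode, low 4 bits the operand address."""
    words = []
    for line in lines:
        mnemonic, _, operand = line.partition(" ")
        word = (OPCODES[mnemonic] << 4) | int(operand or 0)
        words.append(word)
    return words

program = ["LOAD 4", "ADD 5", "STORE 6", "HALT 0"]
print([f"{w:08b}" for w in assemble(program)])
# -> ['00010100', '00100101', '00110110', '11110000']
```

The right-hand column of bits is what a first-generation programmer would have prepared by hand; the left-hand mnemonics are what the symbolic languages of the second generation let programmers write instead.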
The first computers of this generation were developed for the atomic energy industry.


Third Generation - 1964-1971:



Development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.


Fourth Generation - 1971:

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer - from the central processing unit and memory to input/output controls - on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.


Fifth Generation - Present and Beyond:

Fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.