It is undeniable how impressively the generations of Computer Technology have flourished. The famous saying “history is history” does not apply to this subject, for it is still history in the making. Today, the generation of Computer Technology is at its brightest.
As of now, there are five generations, each defined by the improvements in computer technology over the ages. The first generation, dating from 1940 to 1959, is the generation of the Vacuum Tube.
During the first generation, computers were enormous, bulky, slow, undependable due to malfunctions caused by over-heating, and overly expensive considering their poor capabilities compared to today’s technology. They used binary machine language, the lowest-level language used by computers, and could only solve one problem at a time. Input was based on punched cards and paper tape, and output was shown on printouts. They had 1,000 circuits per cubic foot.
Magnetic Drums were used for memory, while Vacuum Tubes were used for circuitry. The Vacuum Tube grew out of Thomas Edison’s experiments with the light bulb. Its purposes were to act as an amplifier, making weak signals stronger, and as a switch, starting and stopping the flow of electricity.
The ENIAC, a first generation computer, was built by the Americans J. Presper Eckert and John Mauchly in 1946. It used thousands of Vacuum Tubes and took up a great deal of room. It also emanated a great amount of heat, which had to be carried off by huge air conditioners. Even so, it still overheated regularly, causing malfunctions. The ENIAC led to the invention of the EDVAC (Electronic Discrete Variable Automatic Computer) and the UNIVAC I (Universal Automatic Computer). The UNIVAC I was the first commercial computer; it was used by the U.S. Census Bureau in 1951.
Other examples of first generation computers were: Harvard Mark I (electromechanical), Whirlwind, EDSAC, UNIVAC II, UNIVAC 1101, RCA BIZMAC, NCR CRC 102A, NCR CRC 102D, Honeywell Datamatic 1000, Burroughs E101, Burroughs 220, IBM 604, IBM 650, IBM 701, IBM 702, IBM 704, IBM 705, and IBM 709.
The second generation, very well known for the debut of the Transistor, dated from 1956 to 1964.
In place of the Vacuum Tube, computers used the Transistor. In 1947, scientists John Bardeen, William Shockley, and Walter Brattain, all working at AT&T’s Bell Labs, invented the Transistor. Like the Vacuum Tube, it could amplify and switch electronic signals. It was also faster, more reliable, smaller, and much cheaper to build; one Transistor could replace approximately 40 Vacuum Tubes. It was energy-efficient, made of silicon (and other solid materials), and did not emit large quantities of heat. It also made space travel in the 1960s possible.
Computers still used punched cards and printouts but moved up from binary machine language to symbolic (or assembly) languages, which allowed programmers to specify instructions in words. They were also the first computers to store their instructions in memory, using magnetic core technology instead of the magnetic drum. They were developed especially for the atomic energy industry. They had 100,000 circuits per cubic foot.
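The step from raw binary machine language to symbolic assembly can be illustrated with a toy assembler sketch in Python. The three-instruction mini-language below is invented purely for illustration and does not correspond to any real machine’s instruction set:

```python
# Toy assembler: maps symbolic mnemonics to made-up binary opcodes,
# showing how assembly language spared programmers from writing raw
# machine code by hand. The instruction set is purely hypothetical.
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

def assemble(program):
    """Translate lines like 'LOAD 5' into 8-bit binary machine words."""
    words = []
    for line in program:
        mnemonic, operand = line.split()
        # 4-bit opcode followed by a 4-bit operand address
        words.append(OPCODES[mnemonic] + format(int(operand), "04b"))
    return words

print(assemble(["LOAD 5", "ADD 3", "STORE 7"]))
```

A programmer writes the readable mnemonics on the left of the mapping; the assembler emits the binary words that the machine actually executes, which is exactly the convenience second-generation languages introduced.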
Examples of second generation computers were: UNIVAC 1107, UNIVAC III, RCA 501, Philco Transact S-2000, NCR 300 series, IBM 7030 Stretch, IBM 7070, IBM 7080, IBM 7090, IBM 1400 series, IBM 1600 series, Honeywell 400 series, Honeywell 800 series, GE (General Electric) 635, GE 645, GE 200, CDC (Control Data Corp.) 1604, CDC 3600, CDC 160A, LARC, Burroughs B500, and Burroughs 200 series.
The invention of the Integrated Circuit by Jack Kilby and Robert Noyce (co-founder of Intel Corp.) marked the start of the third generation, dating from 1964 to 1971.
An Integrated Circuit, also known as a Semiconductor Chip, packs a huge number of Transistors onto a single wafer of silicon. Since its invention, the number of transistors it can accommodate has doubled roughly every two years, increasing the power of a single computer while lowering its cost. Devices today use numerous Integrated Circuits mounted on printed circuit boards, otherwise known as motherboards.
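The two-year doubling described above (popularly known as Moore’s Law) compounds very quickly, as a short Python sketch shows. The starting figure of 2,300 transistors is the commonly cited count for the Intel 4004 of 1971; the projection itself is only an idealized illustration, not a record of actual chips:

```python
# Idealized Moore's Law projection: the transistor count doubles
# every two years. The 1971 base of 2,300 transistors is the
# commonly cited figure for the Intel 4004.
def transistors(year, base_year=1971, base_count=2300):
    doublings = (year - base_year) // 2  # one doubling per two years
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001):
    print(year, transistors(year))
```

Even under this simple model, the count grows from thousands to tens of millions within three decades, which is why a single chip could eventually hold an entire computer.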
The computers of this age could carry out millions of instructions per second. Instead of punched cards and printouts, users used keyboards and monitors. They also had an OS (Operating System), which allowed these computers to run several different applications at the same time while a central program managed the memory. Their size dropped to that of a file cabinet, making them more accessible to the public, and they were also cheaper. They had a million circuits per cubic foot.
Examples of third generation computers were: Burroughs 6700, CDC 3300, CDC 6600, CDC 7600, Honeywell 200, IBM System 360, IBM System 3, IBM System 7, NCR Century series, RCA Spectra 70 series, UNIVAC 9000 series, GE 600, and GE 235.
The Microprocessor was the star of the fourth generation of computer technology, dating from 1971 to the present.
From Integrated Circuits, computers advanced to Monolithic Integrated Circuits, in which millions of Transistors are placed on a single chip. Because of this, more calculations and faster processing could be accomplished. This generation also saw the invention of the Microprocessor.
Originally designed for use in a calculator, the chip Ted Hoff invented was the size of a pencil eraser yet could do all the computation and logic work of a computer. Thus the Microprocessor was born, which in turn led to the innovation of the PC (Personal Computer), or Microcomputer.
It was during the 70s that people opened up to the idea of computers for personal use. Early on, one could purchase an Altair 8800 Computer Kit, one of the earliest personal computer models, and assemble it at home. IBM introduced its first personal computer in 1981, while Apple introduced the Macintosh in 1984; before the Macintosh, the Apple II had been sold to the public starting in 1977.
Computers could now be placed on table tops, were more powerful, and could be linked to form networks. These developments made possible the Internet, GUIs, the mouse, and handheld devices. The Intel 4004 chip (1971) placed all the components of the computer, including the CPU (Central Processing Unit), memory, and input/output controls, on a single chip. These computers had billions of circuits per cubic foot.
Examples of fourth generation computers are: IBM System 3090, IBM RISC 6000, IBM RT, ILLIAC IV, Cray 2XMP, and HP 9000.
The fifth and, so far, last generation, spanning the present and beyond, can be called the Era of Artificial Intelligence.
Computers now make use of parallel processing and superconductors. They also feature extremely large-scale integration, high-speed logic and memory chips, very high performance levels, ongoing micro-miniaturization, voice/data integration, knowledge-based platforms, expert systems, virtual reality generation, satellite links, and, most important, not to mention most advanced, Artificial Intelligence.
Artificial Intelligence, or AI, is a branch of computer science that focuses on improving the intelligence of machines. John McCarthy, who coined the term Artificial Intelligence in 1956, defined it as “the science and engineering of making intelligent machines.” AI research is very complex. It is divided into and involved with many fields and sub-fields of science, such as Quantum Computation, Molecular Technology, and Nanotechnology. It is a very young field full of possibilities, but still many decades away from complete exploration. The central problems of AI, which stand between researchers and their goal of creating General Intelligence (or “strong AI”), are traits such as reasoning, knowledge, planning, learning, communication, perception, and the ability to move and manipulate objects. Basically, the goal is to realize a man-made brain.
Examples of applications using a fragment of AI technology are: voice recognition, speech recognition, problem-solving software, autonomous vehicles, computer chess, mathematical theorem proving, scientific classification of outer-space entities (NASA technology), and advanced user interfaces.
It is truly remarkable that the human mind could create such advanced technology in so short a time. Even now, no one can predict exactly how far or how high it can go. What can be done, invented, and discovered is amazingly endless. The sky’s the limit, and it is still happening. Who knows? Maybe in the near future another generation will be added to the long history of Computer Technology.