Computers

A common misconception about computers is that they are smarter than humans. In reality, what looks like intelligence in a computer is only the speed at which it works through its own ignorance. Today's complex computers are not really intelligent at all; the intelligence is in the people who design them. Therefore, in order to understand the intelligence of computers, one must first look at the history of computers, the way computers handle information, and, finally, the methods of programming the machines.

The predecessors of today's computers were nothing like the machines we use now. The first known computer was Charles Babbage's Analytical Engine, designed in 1834. (Constable 9) It was a remarkable device for its time. In fact, the Analytical Engine required so much power and was so far beyond the manufacturing methods of its time that it could never be built.

No more than twenty years after Babbage's death, Herman Hollerith designed an electromechanical machine that used punched cards to tabulate the 1890 U.S. Census. His tabulating machines were so successful that he formed the company that would become IBM to supply them. (Constable 11) The computers of that era worked with gears and mechanical computation.

Unlike today's chip-based computers, the first computers were electromechanical and vacuum-tube machines that could not be programmed in the modern sense. No one would ever confuse the limited power of those early machines with the wonder of the human brain. An example was the ENIAC, or Electronic Numerical Integrator and Computer, a huge, room-sized machine designed to calculate artillery firing tables for the military. (Constable 9) ENIAC was built with more than 19,000 vacuum tubes, nine times the number ever used in a single device before. The internal memory of ENIAC was a paltry twenty decimal numbers of ten digits each. (Constable 12) (Today's average home computer can hold roughly 20,480 times this amount.)

Today, the chip-based computer easily packs the power of more than 10,000 ENIACs into a silicon chip the size of an infant's fingertip. (Reid 64) The chip itself was invented by Jack Kilby and Robert Noyce in 1958, but their crude devices looked nothing like the sleek, paper-thin chips common now. (Reid 66) The first integrated circuit had only four transistors and was half an inch long and narrower than a toothpick. Chips found in today's PCs, such as the Motorola 68040, cram more than 1.2 million transistors onto a chip half an inch square. (Poole 136)

The ENIAC was an extremely expensive, huge, and complex machine, while today's PCs are shoebox-sized gadgets costing only a few thousand dollars. Because of the incredible miniaturization that has taken place, and because of the seemingly "magical" speed at which a computer accomplishes its tasks, many people look at the computer as a replacement for the human brain. Once again, though, the computer can only accomplish its amazing feats by breaking down every task into its simplest possible choices.

Of course, the computer must receive, process, and store data in order to be a useful tool. Data can be text, programs, sounds, video, graphics, and so on. Devices for entering data include keyboards, mice, scanners, pressure-sensitive tablets, and any other instrument that tells the computer something. The keyboard is the most popular input device for entering text, commands, programs, and the like. (Tessler 157) Newer computers that use a GUI (pronounced "gooey"), or Graphical User Interface, rely on a mouse as the main device for entering commands. A mouse is a small tool with at least one button on it and a small tracking ball on the bottom. When the mouse is slid across a surface, the ball registers the movement and sends the information to the computer, which moves the pointer on the screen accordingly. (Tessler 155) A pressure-sensitive tablet is used mainly by graphic artists to draw with the computer: the artist draws on the large tablet with a special pen, and the tablet sends the data to the computer.

Once the data is entered into the computer, it does no good until the computer can process it. This is accomplished by the millions of transistors compressed into the thumbnail-sized chip in the computer. These transistors are not placed at random; they form a sequence, and together they make a circuit. A transistor alone can only turn on and off: in the "on" state it permits electricity to flow, and in the "off" state it keeps electricity from flowing. (Poole 136) However, when all the microscopic transistors are interconnected, they have the ability to control, manipulate, and move data according to the condition of other data. A computer's chip is so ignorant that it must use a series of sixteen transistors and two resistors just to add two and two. (Poole 141) Nevertheless, this calculation can be made in just a microsecond, an example of the incredible speed of the PC. The type of chip mainly used now is known as a CISC, or Complex Instruction Set Computer. (Constable 98)
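
To make the idea concrete, here is a minimal sketch (not taken from the essay's sources) of how simple on/off logic can add two and two. It is written in Python for readability, but the AND, OR, and XOR functions stand in for the transistor gates described above, and the chain of one-bit adders mirrors the way a chip passes a carry from one bit position to the next.

# A transistor circuit can only answer on/off questions, so addition is
# built from simple logic gates. Each gate takes bits (0 or 1) and returns a bit.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    # Add three bits; return (sum_bit, carry_out).
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

def add(x, y, bits=4):
    # Chain one full adder per bit position, least significant bit first,
    # the way a ripple-carry adder on a chip does.
    result, carry = 0, 0
    for i in range(bits):
        a = (x >> i) & 1
        b = (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result

print(add(2, 2))   # prints 4: "two plus two" done entirely with on/off logic

The point is the principle rather than the code: every arithmetic feat the chip performs reduces to this kind of on/off switching, repeated millions of times per second.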

Newer workstation-class computers use the RISC type of chip, which stands for Reduced Instruction Set Computer. While the "complex" type might sound better, the architecture of the RISC chip permits it to work faster. The first generation of CISC chips was called SSI, or Small Scale Integration; SSI chips have fewer than one hundred components. (Reid 124) The late 1960s are known as the era of MSI, or Medium Scale Integration, with chips ranging from one hundred to one thousand components each. (Reid 124) LSI, or Large Scale Integration, was used primarily in the 1970s, with each chip containing up to ten thousand components. Chips used in the 1990s are known as VLSI, or Very Large Scale Integration, with a million or more components per chip. In the not-so-distant future, ULSI, or Ultra Large Scale Integration, will mark the final limit of the miniaturization of the chip.

The transistors will then be on the atomic level, and the interconnections will be one atom apart. (Reid 124) Because further miniaturization will not be practical beyond that point, "parallel" systems that split jobs among hundreds of processors will become common in the future.
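
As a rough illustration of what splitting a job among processors means (again, a sketch of the general idea rather than anything from the essay's sources), the following Python example divides one large task, summing a million numbers, among several worker processes and then combines their partial answers.

# A simple "parallel" job: sum a long list of numbers by dealing the data
# out to several worker processes and adding up their partial results.
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker handles its own slice of the data independently.
    return sum(chunk)

if __name__ == "__main__":
    numbers = list(range(1_000_000))
    workers = 4                                    # hundreds, on a large parallel machine
    chunks = [numbers[i::workers] for i in range(workers)]   # deal the data out round-robin

    with Pool(workers) as pool:
        partials = pool.map(partial_sum, chunks)   # each chunk is summed by its own process

    print(sum(partials))                           # combine the partial results

On a true parallel machine the same pattern would simply be spread across hundreds of processors instead of four.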

Once data is entered and processed, it will be lost forever if it is not stored. Computers can store information in a variety of ways. The permanent instructions a computer needs for basic tasks, such as system checks, are kept in ROM, or Read-Only Memory. Programs, files, and system software are stored on either a hard disk or a floppy disk in most systems.

The hard disk and floppy disk function similarly, but hard disks can hold much more information. They work by magnetizing and demagnetizing small areas on a plastic or metal platter. The "read" head then moves along the tracks to read the binary information. When

...