History of Computers
Essay • December 5, 2010
World War II was a scientific war: Its outcome was determined largely by the effective deployment of scientific research and technical developments. The best-known wartime scientific program was the Manhattan Project at Los Alamos to develop the atomic bomb. Another major program of the same scale and importance as atomic energy was radar, in which the Radiation Laboratory at MIT played a major role. It has been said that while the bomb ended the war, radar won it.
Emphasis on these major programs can overshadow the rich tapestry of the scientific war effort. One of the threads running through this tapestry was the need for mathematical computation. For the atomic bomb, for example, massive computations had to be performed to perfect the explosive lenses that assembled a critical mass of plutonium. At the outbreak of war, the only computing technologies available were analog machines such as differential analyzers, primitive digital technologies such as punched-card installations, and teams of human computers equipped with desktop calculating machines. Even relatively slow one-of-a-kind electromechanical computers such as the Harvard Mark I still lay some years in the future.
The scientific war effort in the United States was administered by the Office of Scientific Research and Development (OSRD). This organization was headed by Vannevar Bush, a former professor of electrical engineering at MIT and inventor of the differential analyzer. Bush was a brilliantly effective research director. Although he had developed analog computing machines in the 1930s, by the outbreak of war he had ceased to take any active interest in computing, despite understanding its importance to scientific research.
...
...