Monday, January 24, 2011

History of Computing I

The Abacus
The first efforts toward mechanical assistance aided in counting, not computation. An abacus is a mechanical device with beads sliding on rods, used as an aid to counting. It dates to at least the Roman Empire, and its ancestor, the counting board, was in use as far back as 500 B.C. The abacus is considered a counting device rather than a computing device because all the computation is still done by the person using it. The abacus did show, however, that a machine could be used to store numbers.


Jacquard’s Mechanical Loom
A loom is a machine for weaving a pattern into fabric. Early loom designs were operated by hand. For each “line” in the design, certain threads were “pulled” by an experienced weaver (or a poor worker under the direction of a weaver) to get the finished pattern. As you might guess, this process was slow and tedious.

In 1801, Frenchman Joseph-Marie Jacquard invented a mechanical loom in which the threads to be pulled at each stage were determined by a pattern stored on a series of punch cards. A punch card encodes data with holes in specific locations. In the case of weaving, every thread that could be pulled had a location on the card. If there was a hole in that location, the thread was pulled. Jacquard’s loom used a series of these punch cards on a belt. The loom would weave the line dictated by the current card, then automatically advance to the next card.
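To make the encoding concrete, here is a minimal Python sketch of the idea (obviously not Jacquard’s mechanism; the thread names and card patterns are invented for illustration): each thread has a fixed position on a card, and a hole at that position means the thread is pulled for that line of the pattern.

```python
# A toy model of a punch-card-driven loom (illustrative only).
# Each card is one line of the design; True marks a punched hole,
# meaning "pull this thread" for that line.

threads = ["red warp", "blue warp", "gold warp", "black warp"]  # hypothetical threads

cards = [
    [True,  False, True,  False],   # line 1 of the pattern
    [False, True,  False, True],    # line 2 of the pattern
]

for line_number, card in enumerate(cards, start=1):
    pulled = [name for name, hole in zip(threads, card) if hole]
    print(f"line {line_number}: pull {pulled}")
```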

Jacquard’s loom is not necessarily a computer, because it does no mathematical calculations, but it introduced the important idea of a programmable machine. The loom is a “universal weaver” that processes different sets of punch cards to make different woven fabrics.


Babbage’s Counting Machine
Eventually someone put the two ideas together, a machine that could store numbers and a machine that could follow a program, and wondered whether a machine could be made to compute numbers. Charles Babbage, an English researcher, spent much of the 1800s trying to develop just such a machine.

One of Babbage’s early designs was for a device he called the “Difference Engine,” which produced successive terms in a mathematical series while an operator turned a crank. This may not seem a dramatic development, but at the time, mathematicians relied on tables of mathematical functions in which each value had been painstakingly calculated by hand. Thus, the Difference Engine was revolutionary.
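As a rough illustration of the underlying principle, the "method of differences," here is a short Python sketch (a modern reconstruction of the idea, not Babbage’s design): once a polynomial’s value and its finite differences at the starting point are known, every later term of the series can be produced using only addition.

```python
# Method of differences (illustrative sketch, not Babbage's mechanism):
# advance a polynomial one step at a time using only additions.

def difference_engine(initial_differences, steps):
    """Yield successive polynomial values.

    initial_differences -- [f(0), first difference, second difference, ...];
    for a degree-n polynomial the n-th difference is constant.
    """
    diffs = list(initial_differences)
    for _ in range(steps):
        yield diffs[0]
        # Each entry absorbs the next, higher-order difference.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]

# Example: f(x) = x**2 + x + 41.
# f(0) = 41, first difference = 2, second difference = 2 (constant).
print(list(difference_engine([41, 2, 2], 6)))  # [41, 43, 47, 53, 61, 71]
```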

Its success led Babbage to a more ambitious design: the Analytical Engine. Rather than being tied to a specific task like the Difference Engine, the Analytical Engine was conceived as a general-purpose computing device. Different programs would be fed to the machine using a belt of punch cards, just as in Jacquard’s loom.

The Analytical Engine was never built, because Babbage ran out of money. Like many researchers today, he was dependent on government grants to continue his work. In addition, his design may not have been possible to implement using the technology of the day. He was undoubtedly a man ahead of his time.


Hollerith’s Punch Cards
Under its Constitution, the U.S. government is required every ten years to count how many people reside in each state, a process known as the census. These numbers are used to determine the proportion of representatives each state receives in the House of Representatives.

Originally, this process was done entirely by hand. Census takers would fill out forms for each household, and then the results of these forms would be tabulated by state. This method was so onerous that the 1880 census took more than ten years to complete, which meant that the next census was starting before the results of the previous one were known. Clearly, something had to be done.

The government created a contest to find the best solution to the problem. In 1890 it was won by a census agent named Herman Hollerith. In his design, each census form was encoded into a punch card. Machines called “tabulators” could rapidly process stacks of these cards.

This method was not only dramatically faster than manual tabulation; it also allowed the government to track demographics as never before, asking more questions of each citizen and breaking the data down along multiple categories. For example, rather than counting only the men who were above a certain age or the men who were veterans, the tabulators could count the men who were in both categories, which allowed the government to better anticipate the funds that would be needed for veterans’ pensions.
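A small Python sketch of that kind of cross-tabulation (the records and field names here are hypothetical, and of course the tabulators were electromechanical counters, not software):

```python
# Counting records that fall into several categories at once
# (hypothetical census records, illustrative only).

census_cards = [
    {"age": 72, "veteran": True},
    {"age": 35, "veteran": True},
    {"age": 68, "veteran": False},
    {"age": 80, "veteran": True},
]

# Men over 65 who are also veterans, not merely one or the other.
elderly_veterans = sum(
    1 for card in census_cards if card["age"] > 65 and card["veteran"]
)
print(elderly_veterans)  # 2
```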

The system was a resounding success and led to Hollerith’s founding of the Tabulating Machine Company, which, several mergers later, became International Business Machines, or IBM, a company that would go on to dominate the world of computers for decades.


ABC
In the period from 1939 to 1942, John Atanasoff, a professor at Iowa State University, and Clifford Berry, a graduate student at the same school, created what is now considered the first modern computer. Their machine, which they called the Atanasoff-Berry Computer, or ABC, weighed about 700 pounds and had to be housed in the basement of the physics department. By current standards, it was a terribly slow machine, reportedly capable of only a single calculation every fifteen seconds. In contrast, a computer today can perform billions of calculations a second.

Atanasoff and Berry never completed the patent process on their work, and the machine itself was dismantled a few years after it was built, when the physics department needed its basement space back. This was unfortunate, as their pioneering work in the field was underappreciated. Credit for the first modern computer was instead bestowed on a more famous project: ENIAC.


ENIAC
As with Herman Hollerith’s punch cards, the story of ENIAC is one of governmental need. When World War II began, the United States was woefully underprepared for military operations. The army needed to develop and test a large number of weapons in a short period of time. In particular, it had to perform a number of ballistics tests to create artillery tables: in essence, a book showing how far an artillery shell would fly from a specific gun, given wind conditions, the angle of the gun barrel, and so on.

Like the mathematical tables of Babbage’s time, these artillery tables had been created by hand, but by now the army already had some devices to assist in calculation. Called differential analyzers, they operated on mechanical principles (much like Babbage’s machines), not on electronics. Something better was needed, so the army hired John Mauchly and J. Presper Eckert, researchers at the University of Pennsylvania. The machine they designed, completed in 1946, was called ENIAC, which stands for Electronic Numerical Integrator and Computer. Like the ABC, it was a truly modern computer.

The term “modern” might seem too strong if you actually saw this machine. Computers of that era relied on the vacuum tube, a device resembling a lightbulb in which one electrical current can control another. This controlling aspect is what made it possible to build logical circuits, because by itself one vacuum tube doesn’t do much. Indeed, ENIAC required about 19,000 vacuum tubes to do its work, filled an entire room, weighed thirty tons, and drew about 200 kilowatts (that is, 200,000 watts) of power. In comparison, a desktop computer purchased today draws about 400 watts, which means ENIAC consumed about 500 times more power, even though its actual ability to compute is dwarfed by the most inexpensive desktop computers of today.

What makes ENIAC so important is its reliance on electronics to solve a real-world problem. It had as few mechanical parts as possible, although some were inevitable. For example, ENIAC still used punch cards for input and output, and the parts that read and produced these cards were mechanical. The vacuum tubes were organized into small circuits that performed elementary logical functions; these were combined into larger circuits, which were in turn built into even larger ones, a design idea that is still used today.
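The same layering idea can be sketched in a few lines of Python (software stand-ins for circuits, nothing to do with ENIAC’s actual vacuum-tube engineering): elementary logical functions are combined into a slightly larger circuit, a half adder, which could in turn become a building block of a full adder, and so on.

```python
# Elementary logical functions...
def AND(a, b):
    return a and b

def XOR(a, b):
    return a != b

# ...combined into a larger circuit: a one-bit half adder.
def half_adder(a, b):
    """Add two one-bit values, returning (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

print(half_adder(True, True))   # (False, True): 1 + 1 = binary 10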

For decades the ENIAC was considered the first computer, but in the 1970s the judge in a patent infringement case determined that ENIAC was based on the designs of the ABC. Other claims were also made, including those of Konrad Zuse, whose work in wartime Germany wasn’t known to the rest of the world for decades; and the Mark I, a computer developed around the same time at Harvard. The question of what was the first modern computer may never be settled.


Knuth’s Research
To this point, computers were seen as increasingly useful tools, but computer science was not yet considered a serious discipline separate from mathematics. One of the leading figures who changed this was Donald Knuth.

As an undergraduate studying physics and mathematics at the Case Institute of Technology in Cleveland in the 1950s, Knuth had his first contact with the school’s IBM computer. From then on, computers and programs were his obsession. He wrote programs for the IBM machine to analyze the college’s basketball statistics and published research papers while still an undergraduate. When he completed the work for his bachelor’s degree, the school was so impressed with his computing work that it awarded him a master’s degree at the same time. His most famous accomplishment is The Art of Computer Programming, a proposed masterwork of seven volumes, of which three have been completed. It is no exaggeration to say that Donald Knuth’s writings are to computer science what Albert Einstein’s are to physics.

Source of Information: Computer Science Made Simple, Broadway, 2010
