Sunday, November 1, 2009
In 1936, graduate student Claude Shannon arrived at the Massachusetts Institute of Technology.
In the best tradition of grad students, Shannon was short of money, and happy to be recruited by his professor, Vannevar Bush, to tend Bush's unwieldy mechanical computing device - the Differential Analyser.
The Differential Analyser, while a marvel of scientific engineering for its time, was a lot of hard work to maintain. The machine was essentially an assembly of shafts and gears, and the gears had to be manually configured to specific ratios before any problem could be ‘fed’ to the machine - a boring, laborious (and extremely messy) business:
"I had to kind of, you know, fix [it] from time to time to keep it going".
Encouraged by Bush to base his master's thesis on the logical operation of the Differential Analyser, Shannon inevitably considered ways of improving it - perhaps by replacing its cumbersome collection of mechanical parts with electrical circuits.
Not long afterwards, it dawned on Shannon that the Boolean algebra he'd learned as an undergraduate described the behaviour of electrical switching circuits: a switch is either open or closed, just as a proposition is either false or true. The obvious next step was to lay out circuitry according to Boolean principles, allowing circuits to test the truth of propositions as well as calculate problems.
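By way of illustration - a minimal sketch in modern Python, not anything from Shannon's thesis - the correspondence takes only a few lines: switches wired in series behave as Boolean AND, switches wired in parallel as OR, and a normally-closed relay contact as NOT:

```python
# Illustrative sketch of the switch/Boolean correspondence; the names here
# are hypothetical, chosen for this example rather than drawn from Shannon.

def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed: Boolean AND."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed: Boolean OR."""
    return a or b

def inverted(a: bool) -> bool:
    """A normally-closed contact conducts when its control is off: Boolean NOT."""
    return not a

# Laws of Boolean algebra become statements about circuits. De Morgan's law,
# for instance, holds for every combination of switch positions:
for a in (False, True):
    for b in (False, True):
        assert inverted(parallel(a, b)) == series(inverted(a), inverted(b))
```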
Shannon incorporated his musings into his 1937 master's thesis, 'A Symbolic Analysis of Relay and Switching Circuits'. The paper, and its author, were hailed as brilliant, and its ideas were almost immediately put into practice in the design of telephone switching systems. Later, Shannon's thesis came to be seen as a turning point in the development of modern computers.
A half-century later, Shannon laid it all at the feet of Lady Luck: "It just happened that no one else was familiar with both fields at the same time."
Shannon's later work, 'A Mathematical Theory of Communication' (1948), laid out what we now know as information theory: it described how information can be measured in binary digits, each representing a yes-no alternative - the fundamental basis of today's telecommunications.
Luckily, 'A Mathematical Theory of Communication' was written while Shannon was employed by Bell Labs - because Shannon wasn't planning on publishing his work, and only did so at the urging of fellow employees.
The paper gave a mathematical definition of information, and - probably drawing on his cryptographic work during the war - Shannon showed that the information carried by a message could be measured by the amount of disorder, or 'entropy', in its source.
(Information, in this sense, includes messages that occur in any communications medium - television, radio, telephone, data processing devices such as computers and servo-mechanisms, even neural networks.)
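As a rough sketch of the idea (modern notation and code, not Shannon's own), the entropy of a source whose symbols occur with probabilities p_i is H = -Σ p_i log₂ p_i, measured in bits, and a single fair yes-no alternative carries exactly one bit:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits; impossible outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # 1.0 bit: one fair yes-no alternative
print(entropy_bits([0.9, 0.1]))  # ~0.47 bits: a predictable source conveys less
print(entropy_bits([1.0]))       # 0.0 bits: a certain outcome tells us nothing
```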
