From building two-seater unicycles and juggling robots to creating chess-playing machines, Claude Elwood Shannon was not just an information theorist. The gifted mathematician also used his skills to analyse the stock market with a system of his own design, though his methods remained unpublished. The American electrical engineer and cryptographer was the grandson of an inventor and a distant cousin of Thomas Edison, and as a schoolboy he earned money by repairing radios. He went on to study electrical engineering and mathematics at the University of Michigan, graduating in 1936, and obtained his PhD in mathematics at the Massachusetts Institute of Technology (MIT) in 1940. During the Second World War he designed equipment to intercept V1 and V2 missiles and worked on Axis code-breaking, and while at Bell Labs he is believed to have met codebreaker Alan Turing, though there is no record of their meeting.
Halfway through the last century, information became a thing. It became a commodity, a force -- a quantity to be measured and analyzed. It's what our world runs on. Information is the gold and the fuel. Shannon is the father of information theory, an actual science devoted to messages and signals and communication and computing.
Prior to the conference, John McCarthy, then Assistant Professor of Mathematics at Dartmouth, and Claude Shannon of MIT had been co-editing the then-forthcoming Volume 34 of the Annals of Mathematics Studies series, on Automata Studies (Shannon & McCarthy, 1956). Automata are self-operating machines designed to follow predetermined sequences of operations or respond to predetermined instructions. As engineering mechanisms they appear in a wide variety of everyday applications, such as mechanical clocks in which a hammer strikes a bell or a cuckoo appears to sing.

"At the time I believed if only we could get everyone who was interested in the subject together to devote time to it and avoid distractions, we could make real progress." -- John McCarthy

The initial group McCarthy had in mind included Marvin Minsky, whom he had known since they were graduate students together at Fine Hall in the early 1950s. The two had talked about artificial intelligence then, and Minsky's PhD dissertation in mathematics had been on neural nets (Moor, 2006) and the structure of the human brain (Nasar, 1998).
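As a toy illustration of that definition (not anything from Shannon and McCarthy's volume), the sketch below models a cuckoo clock as a finite automaton: a fixed table of states and transitions that the machine steps through in response to predetermined inputs. The state and event names are invented for the example.

```python
# A minimal finite automaton: fixed states, a fixed transition table,
# no deliberation -- it only follows predetermined rules.
# (Illustrative only; the states and events are invented.)
TRANSITIONS = {
    ("idle", "tick"): "idle",          # ordinary seconds pass
    ("idle", "hour_mark"): "singing",  # the hour arrives, the cuckoo appears
    ("singing", "done"): "idle",       # song finished, retreat into the clock
}

def step(state: str, event: str) -> str:
    """Return the next state for a given event; stay put if undefined."""
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["tick", "tick", "hour_mark", "done", "tick"]:
    state = step(state, event)
    print(event, "->", state)
```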
Twelve years ago, Robert McEliece, a mathematician and engineer at Caltech, won the Claude E. Shannon Award, the highest honor in the field of information theory. During his acceptance lecture, at an international symposium in Chicago, he discussed the prize's namesake, who died in 2001:

"Claude Shannon: Born on the planet Earth (Sol III) in the year 1916 A.D. Generally regarded as the father of the information age, he formulated the notion of channel capacity in 1948 A.D. Within several decades, mathematicians and engineers had devised practical ways to communicate reliably at data rates within one per cent of the Shannon limit."

As is sometimes the case with encyclopedias, the crisply worded entry didn't quite do justice to its subject's legacy. That humdrum phrase--"channel capacity"--refers to the maximum rate at which data can travel through a given medium without losing integrity.
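To make the idea concrete, here is a minimal sketch of the Shannon-Hartley formula, C = B log2(1 + S/N), which gives the capacity in bits per second of a bandwidth-limited channel at a given signal-to-noise ratio. The bandwidth and noise figures below are illustrative assumptions, not numbers from the article.

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity in bits per second.

    bandwidth_hz: channel bandwidth in hertz.
    snr_linear:   signal-to-noise ratio as a plain ratio (not decibels).
    """
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Illustrative numbers only: a 3 kHz voice-grade line with a 30 dB
# signal-to-noise ratio (assumed values, not from the source).
snr_db = 30.0
snr = 10.0 ** (snr_db / 10.0)           # convert dB to a linear ratio
capacity = shannon_capacity(3000.0, snr)
print(f"Capacity: {capacity:.0f} bits per second")  # roughly 29,900 bit/s
```

Codes that operate within one per cent of this bound are the "practical ways" the mock encyclopedia entry alludes to.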