Claude Shannon – the Father of Information Theory

Claude E. Shannon (1916-2001) was an American mathematician, electrical engineer, and cryptographer known as "the father of information theory".

On April 30, 1916, American mathematician, electrical engineer, and cryptographer Claude Elwood Shannon was born, the "father of information theory", whose groundbreaking work ushered in the Digital Revolution. Shannon is of course famous for founding information theory with a single landmark paper published in 1948. But he is also credited with founding both digital computer and digital circuit design theory in 1937, when, as a 21-year-old master's student at MIT, he wrote a thesis demonstrating that electrical switching circuits could implement Boolean algebra, and could therefore construct and resolve any logical or numerical relationship. It has been called the most important master's thesis of all time. Shannon also contributed to the field of cryptanalysis during World War II and afterwards, including foundational work on code breaking.

Claude Shannon was born in a hospital in Petoskey, Michigan, and grew up in nearby Gaylord, his parents' home town. His father was a judge, his mother a language teacher of German descent. During his high school years he worked as a messenger for Western Union. He followed his sister Catherine to the University of Michigan in 1932, and in 1936, having earned degrees in mathematics and electrical engineering, he moved to MIT. In his master's thesis (1937), A Symbolic Analysis of Relay and Switching Circuits, he applied Boolean algebra to the design of digital circuits. The work arose from his analysis of the relay circuits in Vannevar Bush's Differential Analyzer analog computer,[2] which Shannon programmed for users. In 1940 he received his doctorate in mathematics at MIT with a thesis on theoretical genetics, An Algebra for Theoretical Genetics.
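The core insight of the thesis can be sketched in a few lines: contacts in series behave like Boolean AND, contacts in parallel like OR, and from such switching elements any logical or arithmetic relationship can be built. The sketch below (function names are illustrative, not Shannon's) assembles a one-bit binary adder from nothing but these relay-style operations.

```python
# A sketch of the idea in Shannon's master's thesis: switching circuits
# realize Boolean algebra, so networks of relays can compute.
# The naming is illustrative only, not Shannon's own notation.

def relay_and(a, b):  # two relay contacts in series
    return a and b

def relay_or(a, b):   # two relay contacts in parallel
    return a or b

def relay_not(a):     # a normally-closed contact
    return not a

def full_adder(a, b, carry_in):
    """One-bit binary addition built purely from switching logic."""
    axb = relay_or(relay_and(a, relay_not(b)),
                   relay_and(relay_not(a), b))        # a XOR b
    total = relay_or(relay_and(axb, relay_not(carry_in)),
                     relay_and(relay_not(axb), carry_in))
    carry_out = relay_or(relay_and(a, b), relay_and(axb, carry_in))
    return total, carry_out

# 1 + 1 with no carry in: sum bit 0, carry bit 1 (binary 10 = decimal 2)
print(full_adder(True, True, False))  # (False, True)
```

Chaining such one-bit adders gives binary addition of arbitrary width, which is exactly why the thesis is seen as the founding document of digital circuit design.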

After a short stay as a researcher at the Institute for Advanced Study in Princeton, New Jersey, he joined AT&T Bell Labs as a mathematician in 1941. In 1948 Claude Shannon published his groundbreaking work A Mathematical Theory of Communication, which introduced the word "bit" as the fundamental unit of information for the first time.[3] In this paper he examined the conditions under which information encoded by a transmitter and sent through a noisy communication channel can be recovered at the destination, i.e. decoded without loss of information. Shannon showed that adding extra bits to a signal allows transmission errors to be corrected,[1] and he successfully carried the concept of entropy, known from physics, over into information theory. At the same time he published Communication in the Presence of Noise, in which he combined the representation of band-limited functions by the cardinal series with earlier work on maximum data rates, in particular by Harry Nyquist, into a theory of channel capacity for digital signal transmission. Before him, and unknown to him, Vladimir Alexandrovich Kotelnikov had published an identical result in 1933: to reconstruct an analog signal without loss of information, the sampling rate must be at least twice the highest frequency the signal contains (the Nyquist-Shannon sampling theorem).
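Shannon's entropy measures the average information content of a source in bits per symbol, H = -Σ p·log₂(p). A minimal computation makes the "bit" concrete: a fair coin carries exactly one bit per toss, while a biased coin carries less, because its outcomes are partly predictable.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))  # fair coin: 1.0 bit per toss
print(entropy_bits([0.9, 0.1]))  # biased coin: about 0.469 bits per toss
```

In the 1948 paper this quantity sets hard limits: a source of entropy H cannot be compressed below H bits per symbol on average, and a channel's capacity bounds the rate at which such bits can cross it reliably.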

Another notable article appeared in 1949, Communication Theory of Secrecy Systems,[4] in which Shannon clarified the formal foundations of cryptography and elevated it to the rank of an independent science. Shannon had wide-ranging interests and a playful streak; he is said to have juggled while riding a unicycle through the corridors of Bell Labs. Side products of his professional activity include a juggling machine, rocket-powered frisbees, motorized pogo sticks, a mind-reading machine, a mechanical mouse that could find its way through mazes using a simple memory built from relay circuits, and, as early as the 1960s, an early chess computer. His influential 1950 paper on programming a computer to play chess led to the first computer chess games, played on the MANIAC computer at Los Alamos in 1956. He also built the "ultimate machine", a box with a switch that a mechanical hand flipped back off as soon as it was turned on. The unit of information content of a message, the shannon, is named after him.
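One celebrated result of the 1949 paper is that the one-time pad achieves perfect secrecy: if the key is truly random, as long as the message, and never reused, the ciphertext reveals nothing about the plaintext. A minimal sketch of the scheme (this illustrates the concept, not a production cipher):

```python
import secrets

def one_time_pad(data: bytes, key: bytes) -> bytes:
    """XOR each byte with the key; encryption and decryption are identical."""
    assert len(key) == len(data), "key must be exactly as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))   # random, message-length, single-use
ciphertext = one_time_pad(message, key)
assert one_time_pad(ciphertext, key) == message  # XOR is its own inverse
```

Shannon proved that any cipher with perfect secrecy needs at least as much key material as message, which is why practical cryptography settles for computational rather than information-theoretic security.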

In the mid-1960s he became interested in financial markets and gave several well-attended lectures at MIT (one of his listeners was Paul Samuelson). He proposed a method, now called the constant proportion rebalanced portfolio, to profit from random market fluctuations: after each transaction, the capital was divided into exactly two halves, one for speculation and the other held as a cash reserve. Shannon received many honours for his work. Among a long list of awards were the Alfred Noble Prize in 1940, the National Medal of Science in 1966, the Audio Engineering Society Gold Medal in 1985, and the Kyoto Prize in 1985.[1]
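The effect of this 50/50 rebalancing, sometimes called "Shannon's demon", can be shown with a toy simulation (my own illustration, not Shannon's lecture material): an asset that alternately doubles and halves leaves a buy-and-hold investor flat, yet the constantly rebalanced portfolio grows.

```python
def rebalanced_growth(returns, fraction=0.5):
    """Final wealth when `fraction` of capital is held in a risky asset
    and the split is restored to `fraction` after every period."""
    wealth = 1.0
    for r in returns:
        wealth = wealth * (1 - fraction) + wealth * fraction * (1 + r)
    return wealth

# Toy asset: +100% then -50%, repeated. Buy-and-hold ends where it started;
# the 50/50 rebalanced portfolio gains a factor of 1.125 per cycle.
moves = [1.0, -0.5] * 20
print(rebalanced_growth(moves))                # ≈ 10.5
print(rebalanced_growth(moves, fraction=1.0))  # buy-and-hold: 1.0
```

The gain comes from volatility itself: rebalancing systematically sells after rises and buys after falls, harvesting the fluctuations, though in real markets transaction costs and trends complicate the picture.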

At yovisto academic video search, there are many references to the work of Shannon: on the one hand because he laid some of the foundations of computer science (and information theory), and on the other because many introductory lectures touch on these topics. There is also a very nice documentary about Claude Shannon that explores his life, and the major influence his work has had on today's digital world, through interviews with his friends and colleagues.
