Claude Shannon, Father of Information Theory
Video: Claude Shannon: Father of the Information Age. Co-produced by Cal-(IT)² and UCSD-TV, based on the Shannon Symposium sponsored by Cal-(IT)² and UCSD's Jacobs School of Engineering, February 2002.

It is difficult to overstate the importance of Claude Shannon's contributions to the field of communications. In a landmark 1948 paper, Shannon laid out the basic principles underlying digital communications and storage, revolutionizing the way that engineers approached the subject. He paved the way for the powerful codes now used in telephony, wireless communications, satellite communications, deep space communications, and storage devices such as CD players and hard drives; and he laid the groundwork for compression algorithms used for audio and video signals.

Many concepts that seem obvious today were first introduced in his paper, "A Mathematical Theory of Communication," published in two parts in the Bell System Technical Journal. The use of the term "bit" as a unit of information first appeared in this paper. In an age in which communication took place with continuous, analog waveforms, it was a startling notion that information traveling across any communications system could be defined mathematically as some quantity of binary symbols. It was entirely new that information of any kind-- whether for use on a telegraph, telephone, radio, or television-- could be decomposed into zeros and ones, encoded, transmitted, and decoded at the other end.
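
To make the idea concrete, the same 1948 paper measures the information produced by a source whose symbols occur with probabilities p_i by its entropy (the rendering below in modern notation is ours, not a quotation from the paper):

    H = -\sum_i p_i \log_2 p_i   (bits per symbol)

A fair coin toss carries exactly one bit; a heavily biased coin carries less. This common currency is what allows any message, whatever its physical form, to be counted in binary units.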

That was groundbreaking, but it was only one component of his theory. He went on to present the concept of the maximum rate of transmission over a channel-- the capacity, or "Shannon limit"-- which provides the benchmark against which all codes and modulations are measured. He introduced the idea of adding redundancy to a transmitted signal so that transmission errors can be corrected at the receiving end. He also provided the foundation for data compression, years ahead of any widespread implementation of digital communications or storage.
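
A worked example may help (the capacity formula is Shannon's; the numbers below are only illustrative). For a channel of bandwidth W hertz disturbed by white Gaussian noise, with average signal power S and noise power N, the capacity is

    C = W \log_2 (1 + S/N)   (bits per second)

so a 3 kHz telephone channel with a 30 dB signal-to-noise ratio (S/N = 1000) can carry at most about 3000 × log2(1001), roughly 30,000 bits per second, no matter how the signal is modulated or coded.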

In this one paper, Shannon founded the field of information theory. In a world in which no one knew the concept of a bit, asking the question, "How many bits per second can be transmitted over a channel?" and doing nothing more would have been momentous. Shannon did this and then proceeded to state the answer clearly, prove it, and begin to show how to design systems that achieve the limit. He presented a clear picture of the whole field of information theory, all at once.

Shannon's theory was an immediate success with communications engineers and stimulated the technology which led to today's Information Age. Error-correcting codes and data compression are used in virtually every form of electronic communications.

Shannon published many more provocative and influential articles in a variety of disciplines. His master's thesis, "A Symbolic Analysis of Relay and Switching Circuits," used Boolean algebra to establish the theoretical underpinnings of the field of digital logic. This work has broad significance because digital circuits are fundamental to the operation of modern computers and telecommunications systems.
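
The thesis's central observation can be sketched in a few lines of code (an illustrative sketch, not Shannon's notation): switches wired in series behave as a Boolean AND, and switches wired in parallel as a Boolean OR, so a series-parallel relay network can be analyzed and simplified with ordinary Boolean algebra.

def series(*switches):
    # Current flows through a series chain only if every switch is closed (logical AND).
    return all(switches)

def parallel(*switches):
    # Current flows through parallel branches if any one switch is closed (logical OR).
    return any(switches)

# Truth table for a lamp wired as (A AND B) OR C.
for A in (False, True):
    for B in (False, True):
        for C in (False, True):
            lamp = parallel(series(A, B), C)
            print(A, B, C, "->", "on" if lamp else "off")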

He is also generally credited with transforming cryptography from an art into a science with his 1949 paper, "Communication Theory of Secrecy Systems."
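
One result from that paper is easy to illustrate (the sketch below is ours, not taken from the article): Shannon proved that the Vernam cipher, or one-time pad, achieves perfect secrecy when the key is truly random, as long as the message, and never reused.

import secrets

def xor_bytes(data, key):
    # XOR each message byte with the corresponding key byte.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))  # random key, same length as the message, used only once

ciphertext = xor_bytes(message, key)     # without the key, every plaintext of this length is equally likely
recovered = xor_bytes(ciphertext, key)   # applying the same key again recovers the message
assert recovered == message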

Shannon was born in Petoskey, Michigan, on April 30, 1916. He graduated from the University of Michigan in 1936 with bachelor's degrees in mathematics and electrical engineering. In 1940 he earned both a master's degree in electrical engineering and a Ph.D. in mathematics from the Massachusetts Institute of Technology (MIT).

Shannon joined the mathematics department at Bell Labs in 1941 and remained affiliated with the Labs until 1972. He became a visiting professor at MIT in 1956, a permanent member of the faculty in 1958, and a professor emeritus in 1978.

In the 1990s, he developed Alzheimer's disease and slowly withdrew from public life. He was unable to attend a statue dedication in his hometown of Gaylord, Michigan, in 2000, and he died in 2001 at the age of 84.

The Information Theory Society of the IEEE established the Shannon Award in 1974 to honor lifetime achievement in the field of information theory. The winner of the Shannon Award gives a lecture at the International Symposium on Information Theory.

©2003, by Jon Hamkins

