Thursday, December 4, 2014

Information Theory: Intellectual Foundation for the Digital Age

Claude Shannon
1948: “A Mathematical Theory of Communication,” by Bell Labs researcher Claude Shannon, is published, laying the intellectual foundation for the Digital Age.
Shannon’s paper transformed information from a vague, everyday term into a measurable, calculable quantity. Computers, which are machines for manipulating information, would have been impossible to program without Shannon’s work; his paper was the first published to use the “bit” as a unit of information. Information theory would prove vital for everything from cryptography to statistical modeling of the weather and the stock market, and from spam filters and computer animation to speech-recognition software.
One key equation defines what Shannon called “entropy,” the informational content of a piece of communication. The concept was revolutionary and simple: the informational value of a message, whether it’s spoken, written, or encoded in silicon, is a function of how much the recipient already knows. If an outcome is highly unpredictable, a message communicating that outcome carries more information (and requires more bits) than if the outcome is predictable; learning that a two-headed coin came up heads, for example, tells you nothing you didn’t already know. Shannon’s entropy gives programmers a hard lower limit on the number of bits a message can be squeezed into without loss, which is what lets compression software pack the most information into the smallest file.
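
The formula itself is compact: H = −Σ p(x) log₂ p(x), summed over every possible outcome x. As a minimal sketch (ours, not the post’s; the function name shannon_entropy is just a placeholder), a few lines of Python make the coin examples concrete:

    import math

    def shannon_entropy(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p)), over outcomes with p > 0.
        return sum(-p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))   # 1.0   -- a fair coin: one full bit per flip
    print(shannon_entropy([1.0]))        # 0.0   -- a two-headed coin: nothing to learn
    print(shannon_entropy([0.9, 0.1]))   # ~0.47 -- a biased coin falls in between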
Imagine two selfies on your iPhone. One is of you against a plain white background; the other is of you in front of a beautiful, intricately patterned tapestry. In Shannon’s terms, the latter has far more to convey, and his theory lets programmers figure out how to communicate it both accurately and efficiently, as the sketch below suggests.
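
To make that concrete, here is a rough sketch of our own (zlib and random bytes are crude stand-ins for real photo data) that measures the byte-level entropy of two mock “images” and runs each through a general-purpose compressor:

    import math
    import os
    import zlib
    from collections import Counter

    def bits_per_byte(data):
        # Empirical entropy of a byte string, in bits per byte.
        n = len(data)
        return sum(-(c / n) * math.log2(c / n) for c in Counter(data).values())

    # Stand-ins for the two selfies: a flat white background versus a busy
    # tapestry (random bytes are an extreme proxy for fine visual detail).
    white = bytes([255] * 10_000)
    tapestry = os.urandom(10_000)

    for name, img in (("white", white), ("tapestry", tapestry)):
        packed = zlib.compress(img)
        print(f"{name:8s} {bits_per_byte(img):.2f} bits/byte -> {len(packed)} bytes compressed")

    # The low-entropy white image shrinks to a few dozen bytes; the
    # near-maximum-entropy tapestry barely compresses at all.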
