S-72.340 Information Theory (3 cr) P fall 2004

In 1948, Claude Shannon published his paper "A mathematical theory of communication" -- and information theory was born. The results took the scientific community by surprise. It was generally believed that increasing the transmission rate of information over a communication channel necessarily increased the probability of error. Shannon proved that this is not the case as long as the communication rate stays below the channel capacity.

This course gives an introduction to information theory and its most important applications to communications. The course is mathematically oriented. The basic concepts of entropy, relative entropy, and mutual information are defined, and their connections to channel capacity, coding, and data compression are presented. In addition to limits for error-free communication, information theory also provides limits for data compression. Whereas coding methods for error control are discussed in the companion course S-72.341, data compression methods are discussed here in more detail, including Huffman, Lempel-Ziv, and Shannon coding. The course book covers a variety of interesting areas of application outside communications, including gambling and investment (the stock market).
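As a small taste of the ideas above, the following sketch (in Python; not part of the official course material) computes the Shannon entropy of a discrete source and the average codeword length produced by Huffman's algorithm. For the dyadic distribution used in the example, the two coincide, illustrating the compression limit mentioned above.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code_lengths(probs):
    """Codeword length assigned to each symbol by Huffman's algorithm."""
    # Heap entries: (probability, unique tiebreaker, symbol indices under this node)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)  # fresh tiebreakers for merged nodes
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # each symbol under the merged node gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_code_lengths(probs)))
print(H, L)  # prints 1.75 1.75 -- dyadic probabilities, so Huffman meets the entropy
```

In general Huffman coding satisfies H(X) <= L < H(X) + 1; equality on the left holds exactly when all probabilities are powers of two, as here.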

The teaching language is English. (P = may be included in postgraduate studies.)

If you have any questions, do not hesitate to contact the teacher (or the assistant).

Last update: September 16, 2004.