Nuggets of Shannon Information Theory
In his 1948 article “A Mathematical Theory of Communication”, Claude E. Shannon introduced the word “bit”. The article laid the foundations of the field of information theory, which in turn opened the way to digital information processing.
In this overview talk, I will present, in an accessible way, three nuggets from Shannon's information theory:
- Shannon entropy: a mathematical quantification of the uncertainty of a probability distribution (see the first code sketch below).
- Information compression: Shannon entropy provides a fundamental lower bound on how far information from a source can be compressed while still being recoverable later (second sketch below).
- Error correction: when digital information is transmitted over a noisy channel, error-correcting codes provide ways to protect it from noise. Once again, Shannon entropy determines the fundamental limit on how much information can be reliably transmitted over a noisy channel (third sketch below).
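
A minimal Python sketch of the first nugget (my own illustration, not taken from the talk materials): the Shannon entropy of a distribution p is H(p) = -Σ_x p(x) log₂ p(x), measured in bits.

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: H(p) = -sum_x p(x) * log2(p(x))."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit, maximal uncertainty
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.469 bits
print(shannon_entropy([1.0]))       # certain outcome: 0.0 bits, no uncertainty
```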
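
For the second nugget, a quick empirical illustration (again my own sketch): Shannon's source coding theorem says that no lossless compressor can, on average, use fewer than H bits per symbol. Compressing samples of the biased coin from above with Python's standard zlib module stays above that bound, though a purpose-built entropy coder could get closer to it.

```python
import random
import zlib
from math import log2

random.seed(0)
n = 100_000
p = 0.1  # biased coin: entropy H ≈ 0.469 bits per symbol

# One byte per sample keeps the demo simple; a real encoder would pack bits.
data = bytes(int(random.random() < p) for _ in range(n))

compressed = zlib.compress(data, 9)
H = -(p * log2(p) + (1 - p) * log2(1 - p))
print(f"zlib: {8 * len(compressed) / n:.3f} bits/symbol")
print(f"entropy lower bound: {H:.3f} bits/symbol")
```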
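
For the third nugget, a toy error-correction sketch (my own, not from the repository): the simplest code sends every bit three times over a binary symmetric channel that flips each bit with probability p, and decodes by majority vote. Shannon's channel coding theorem says reliable communication is possible at any rate below the capacity C = 1 - H(p); the repetition code trades rate 1/3 for a much smaller, but still nonzero, error probability.

```python
import random
from math import log2

random.seed(1)
p = 0.1  # bit-flip probability of the binary symmetric channel

def channel(bit):
    """Flip the bit with probability p."""
    return bit ^ (random.random() < p)

message = [random.randint(0, 1) for _ in range(10_000)]
# Encode each bit as three copies, transmit them, decode by majority vote.
decoded = [
    1 if channel(b) + channel(b) + channel(b) >= 2 else 0
    for b in message
]

errors = sum(m != d for m, d in zip(message, decoded))
print(f"residual error rate: {errors / len(message):.4f}")  # ≈ 0.028 < p = 0.1
capacity = 1 - (-(p * log2(p) + (1 - p) * log2(1 - p)))
print(f"channel capacity: {capacity:.3f} bits per channel use")
```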
While the content of this talk is mathematical in nature, I will try my best to make it accessible to anybody with (very) basic knowledge of probability and programming.
**All material for this talk (including the presentation, Jupyter notebooks, etc.) is available at https://github.com/cschaffner/ITNuggets**
Since 2014, I have been teaching a yearly master's course on information theory at the University of Amsterdam. Together with my PhD student Yfke Dulek, I have written lecture notes on the topic and developed additional learning tools based on them.
I love the mathematical beauty of Shannon's information theory, and I believe that the three concepts above can be appreciated by a much wider audience than the one that regularly reads mathematical scientific papers. While I will focus on making the fundamental theoretical aspects accessible, all of these concepts also have interesting (and challenging) programming aspects that can be explored further after my talk.
Christian Schaffner