Information Theory
Course Purpose

The goal of this course is to understand the notion and significance of information-theoretic quantities, as well as source coding and channel coding.
Learning Goals

Students should:
- learn the definitions and the underlying significance of information-theoretic quantities such as entropy and mutual information, and be able to compute them for a given probability distribution (see the illustrative sketch below),
- understand the source coding theorem and some efficient coding schemes, and
- understand the system model for channel coding and the associated channel coding theorem.
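As a rough illustration of the first goal (not part of the official course materials), the following Python sketch computes H(X), H(Y), H(X,Y), and I(X;Y) for a small hypothetical joint distribution of two binary random variables:

```python
from math import log2

# Hypothetical joint distribution p(x, y) of two binary random variables X and Y,
# chosen only for illustration.
p_xy = {
    (0, 0): 0.4, (0, 1): 0.1,
    (1, 0): 0.1, (1, 1): 0.4,
}

def entropy(dist):
    """Shannon entropy in bits: H = -sum p log2 p, skipping zero-probability outcomes."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distributions p(x) and p(y), obtained by summing out the other variable.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

h_x, h_y, h_xy = entropy(p_x), entropy(p_y), entropy(p_xy)
mi = h_x + h_y - h_xy  # I(X;Y) = H(X) + H(Y) - H(X,Y)

print(f"H(X) = {h_x:.3f} bits, H(Y) = {h_y:.3f} bits, H(X,Y) = {h_xy:.3f} bits")
print(f"I(X;Y) = {mi:.3f} bits")
```

The identity I(X;Y) = H(X) + H(Y) - H(X,Y) used here is one of the equivalent characterizations of mutual information covered in Sessions 2-4.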
Session | Topic |
---|---|
Session 1 | Introduction: outline of the lecture and an overview of information theory |
Session 2 | Amount of information: Entropy of random variables |
Session 3 | Amount of information: Entropy of two random variables |
Session 4 | Amount of information: Mutual information |
Session 5 | Source model |
Session 6 | Source encoding concept |
Session 7 | Source coding theorem: forward theorem |
Session 8 | Source coding theorem: converse theorem |
Session 9 | Source encoding method |
Session 10 | Channel model |
Session 11 | Channel coding concept |
Session 12 | Channel coding theorem |
Session 13 | Basics of error-correcting codes |
Session 14 | Comprehensive exercise |
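
For Sessions 5-9, a concrete example of a source coding method is Huffman coding. The sketch below is an illustration under assumed symbol probabilities, not material from the course itself; it builds a binary Huffman code for a hypothetical memoryless source and compares its average codeword length with the source entropy, which the source coding theorem shows is a lower bound on the average length of any uniquely decodable code.

```python
import heapq
from math import log2

# Hypothetical source symbol probabilities, chosen only for illustration.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

def huffman_code(probabilities):
    """Return a prefix-free code {symbol: bitstring} built by Huffman's algorithm."""
    # Each heap entry: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probabilities.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, code1 = heapq.heappop(heap)  # two least probable subtrees
        p2, _, code2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in code1.items()}
        merged.update({s: "1" + c for s, c in code2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

code = huffman_code(probs)
avg_length = sum(probs[s] * len(code[s]) for s in probs)
source_entropy = -sum(p * log2(p) for p in probs.values())

print("code:", code)
print(f"average length = {avg_length:.3f} bits/symbol, entropy = {source_entropy:.3f} bits/symbol")
```

For the dyadic probabilities used here the average length equals the entropy exactly; in general, Huffman coding achieves an average length within one bit of the entropy.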
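
For Sessions 10-12, a standard worked example related to the channel coding theorem is the binary symmetric channel, whose capacity is C = 1 - H2(p), where H2 is the binary entropy function and p the crossover probability. The short sketch below (illustrative only, not course material) evaluates this formula for a few values of p.

```python
from math import log2

def binary_entropy(p):
    """Binary entropy H2(p) in bits, with H2(0) = H2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.01, 0.1, 0.5):
    print(f"p = {p:4}: C = {bsc_capacity(p):.3f} bits per channel use")
```

The channel coding theorem states that rates below C are achievable with arbitrarily small error probability, while its converse states that rates above C are not.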