| Class | Date | Sections | Topic |
|-------|-------|-------------|-------|
| 1 | 01/11 | 1.0 | Introductions and Key Concepts |
| 2 | 01/13 | 1.1 | Functions of Random Variables; The Binary Symmetric Channel; Entropy Calculations |
| 3 | 01/15 | 2.1, 2.2 | Entropy, Joint Entropy, and Conditional Entropy |
| 4 | 01/20 | 2.3, 2.4, 2.5 | Mutual Information, Chain Rules, Conditional Relative Entropy |
| 5 | 01/22 | 2.6, 2.7 | Convex Functions, Jensen's Inequality |
| 6 | 01/25 | 2.8 | Log Sum Inequality, Concavity of Entropy |
| 7 | 01/27 | 2.9-2.11 | Sufficient Statistics, Fano's Inequality |
| 8 | 01/29 | 3.1 | Weak Law of Large Numbers, The Asymptotic Equipartition Property |
| 9 | 02/01 | 3.2, 3.3 | Data Compression, The Typical Set |
| 10 | 02/03 | N/A | Introduction to Markov Processes |
| 11 | 02/05 | 4.1, 4.2 | Stationary Markov Chains |
| 12 | 02/08 | 4.3, 4.4 | Entropy Rate Bounds on Markov Processes |
| 13 | 02/10 | 5.1 | Definitions of the Types of Codes, The Kraft Inequality |
| 14 | 02/12 | 5.3, 5.4 | Bounds on Optimal Codes, Uniquely Decodable Codes |
| 15 | 02/15 | 5.5 | McMillan's Theorem |
| 16 | 02/17 | 5.6-5.8 | Huffman Coding |
| 17 | 02/19 | Chps. 1-3 | Exam No. 1 |
| 18 | 02/22 | 5.9, 5.10 | Shannon Coding |
| 19 | 02/24 | 5.11 | Competitive Optimality of Optimal Codes |
| 20 | 02/26 | 5.12, 6.4 | Discrete Distributions, Bounds on the Number of Fair Bits, Markov Models of English Text |
| 21 | 03/01 | 6.1-6.6 | Gambling, Wealth, Side Information, and The Shannon Guessing Game |
| 22 | 03/03 | 7.1, 7.2 | The Definition of Kolmogorov Complexity, Models of Computation, Turing Machines |
| 23 | 03/05 | 7.3, 7.4 | Theorems Relating to Kolmogorov Complexity |
| 24 | 03/15 | 7.5-7.12 | Incompressible Sequences, Occam's Razor, Sufficient Statistics |
| 25 | 03/17 | 8.1-8.3 | Examples of Binary Channels, Symmetric Channels, Properties of Channel Capacity |
| 26 | 03/19 | 8.4, 8.5 | Preview of the Channel Coding Theorem, Definitions, Jointly Typical Sequences |
| 27 | 03/22 | 8.6, 8.7 | The Channel Coding Theorem |
| 28 | 03/24 | 8.8, 8.9 | Fano's Inequality |
| 29 | 03/26 | 8.11 | Hamming Codes |
| 30 | 03/29 | 8.13 | Joint Source Channel Coding |
| 31 | 04/01 | Chps. 4-7 | Exam No. 2 |
| 32 | 04/05 | 9.1-9.6 | Continuous Random Variables, Differential Entropy |
| 33 | 04/07 | 10.1, 10.2 | Power-Limited Gaussian Channels |
| 34 | 04/09 | 11.1-11.6 | Maximum Entropy Spectral Estimation |
| 35 | 04/12 | 12.1, 12.2 | The Method of Types, The Law of Large Numbers |
| 36 | 04/14 | 12.3, 12.4 | Universal Source Coding, Large Deviation Theory |
| 37 | 04/16 | 12.5, 12.6 | Sanov's Theorem, The Conditional Limit Theorem |
| 38 | 04/19 | 12.7-12.9 | Hypothesis Testing, Neyman-Pearson Lemma |
| 39 | 04/21 | 12.10, 12.11 | Lempel-Ziv Coding, Fisher and Chernoff Information |
| 40 | 04/23 | Chps. 8-13 | Exam No. 3 |
| 41 | 04/26 | 13.1, 13.2 | Introduction to Rate Distortion Theory, Binary Sources |
| 42 | 04/28 | 13.3-13.5 | Gaussian Source, Parallel Gaussian Sources |
| 43 | 04/30 | 13.6-13.8 | Computation of the Channel Capacity and Rate Distortion Function |
| 44 | 05/03 | 14.1, 14.4 | Gaussian Multiple Access Channels, Encoding of Correlated Sources |
| 45 | 05/05 | 16.1-16.9 | Review of the Communications Channel Model (Student Participation) |
| 46 | 05/10 | Comprehensive | Final Exam (3-6 PM) |