3F1 - Information Theory Slides
Michaelmas Term 2011
The slides I used for the lecture are available for download:
- Lecture_1 21 November 2011: Course organisation, historical background, Shannon's entropy
- Lecture_2 23 November 2011: Prefix-free codes, converse source coding theorem, Shannon-Fano coding, Huffman coding
- Lecture_3 28 November 2011: Coding for a memoryless source, block coding, arithmetic coding, equivocation, entropy rate, coding for discrete stationary sources, Shannon's twin experiment
- Lecture_4 30 November 2011: Coding a unifilar Markov chain (revision), channel coding, channels, mutual information, capacity, coding theorem, information theory for continuous random variables, additive white Gaussian noise (AWGN) channel
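As a small illustration of two ideas from the lectures above, the sketch below computes Shannon's entropy and the expected length of a Huffman code for a given source distribution. It is an informal example written for this page, not code from the course handouts; the function names and the choice of a dyadic example distribution are my own.

```python
import heapq
import itertools
import math

def entropy(ps):
    """Shannon entropy H(X) = -sum p*log2(p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

def huffman_lengths(ps):
    """Codeword lengths of an optimal binary Huffman code for distribution ps."""
    counter = itertools.count()  # tie-breaker so the heap never compares lists
    heap = [(p, next(counter), [i]) for i, p in enumerate(ps)]
    heapq.heapify(heap)
    lengths = [0] * len(ps)
    while len(heap) > 1:
        # Merge the two least probable subtrees; every symbol inside them
        # moves one level deeper in the code tree.
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(counter), s1 + s2))
    return lengths

# For a dyadic distribution the Huffman code meets the entropy bound exactly:
ps = [0.5, 0.25, 0.125, 0.125]
avg_len = sum(p * l for p, l in zip(ps, huffman_lengths(ps)))
print(entropy(ps))  # 1.75 bits
print(avg_len)      # 1.75 bits per symbol
```

For non-dyadic distributions the average Huffman length exceeds the entropy by less than one bit, which is exactly the gap the source coding theorem from Lecture_2 quantifies.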
Note that some proofs are not included in the notes, as they were done on the blackboard. The proofs of all statements in this part of the course are easy and should be approachable as an exercise for undergraduate students. Please do not hesitate to contact me if you have any difficulty with a proof.
The supervision version of the handouts is available for download. I recommend that you attempt to fill in the gaps yourself and consult the supervisor notes only as a last resort.