A Short Course in Information Theory
8 lectures by David J.C. MacKay
January 1995. Cavendish Laboratory, Cambridge, Great Britain.
Summary
Is it possible to communicate reliably from one point
to another if we only have a noisy communication
channel? How can the information content of a random
variable be measured? This course will discuss the
remarkable theorems of Claude Shannon, starting from
the source coding theorem, which motivates the entropy
as the measure of information, and culminating in the
noisy channel coding theorem. Along the way we will
study simple examples of codes for data compression
and error correction.
This will be an informal course. All are welcome to
attend. The level of presentation is intended to be
appropriate for graduate students and final year
undergraduates.
You might also be interested in my book on Information Theory, Inference and Learning Algorithms (640 pages long, published by C.U.P. Sept 2003, and available online), which grew out of this short course.
The postscript files can be obtained not only from my UK web server but also from a
mirror in North America (Toronto). Please click appropriately.
- Course outline (postscript, 1 page) | ps mirror | pdf | pdf mirror
- Lecture 1 notes (postscript, 2 pages) | ps mirror | pdf | pdf mirror
- Definitions of Probabilities and Entropies.
These notes do not include the main part of lecture 1, viz. the 45-minute overview of the noisy channel coding theorem. (You can find that in the first chapter of my book.)
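The entropy defined in these notes is easy to compute directly. A minimal sketch (in Python, which is my choice for illustration; the example distributions are mine, not the notes'):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H(X) = -sum_x p(x) log2 p(x)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin conveys one bit per toss; a biased coin conveys less.
entropy([0.5, 0.5])   # 1.0 bit
entropy([0.9, 0.1])   # about 0.47 bits
```

The `p > 0` guard implements the convention that terms with p(x) = 0 contribute nothing to the sum.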
- Lecture 2 notes (postscript, 3 pages) | ps mirror | pdf | pdf mirror
- Why is entropy a fundamental measure of information content? Asymptotic equipartition and the source coding theorem.
Note: a figure on page 1 of this document does not appear under ghostview for some reason.
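Asymptotic equipartition can be checked numerically in a few lines. A hedged sketch (Python; the bias of 0.1 and the sequence length are illustrative choices of mine, not values from the lecture): for a long sequence drawn from a biased binary source, the per-symbol information content concentrates near the entropy H.

```python
import math
import random

random.seed(1)
p = 0.1  # illustrative bias of the binary source, not a value from the notes
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # entropy, ~0.469 bits

n = 100000
ones = sum(1 for _ in range(n) if random.random() < p)
# per-symbol information content -(1/n) log2 P(x) of the sampled sequence
info = -(ones * math.log2(p) + (n - ones) * math.log2(1 - p)) / n
# AEP: for large n the sequence is almost surely "typical", so info is near H
```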
- Lecture 3 notes (postscript, 3 pages) | ps mirror | pdf | pdf mirror
- Data compression I: Symbol codes
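For a flavour of what a symbol code looks like in practice, here is a hedged sketch of Huffman's algorithm for building an optimal prefix code (Python; the dyadic example distribution is mine, chosen so the average length meets the entropy exactly, and is not taken from the notes):

```python
import heapq

def huffman_code(freqs):
    """Build a prefix code from symbol probabilities (Huffman's algorithm)."""
    # heap entries: (total probability, tie-breaking counter, partial codebook)
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)  # two least probable subtrees
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
# codeword lengths 1, 2, 3, 3: average length 1.75 bits = the entropy here
```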
- Lecture 4 notes (postscript, 2 pages) | ps mirror | pdf | pdf mirror
- Data compression II: Arithmetic coding
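The interval-narrowing idea at the heart of arithmetic coding fits in a few lines. A minimal sketch (Python; the exact-fraction arithmetic and the two-symbol example are my illustrative choices, not the notes' presentation):

```python
from fractions import Fraction

def message_interval(probs, message):
    """Narrow [0, 1) once per symbol; any number inside the final
    interval identifies the message uniquely (arithmetic coding's core)."""
    cum, c = {}, Fraction(0)
    for s in sorted(probs):
        cum[s] = c          # cumulative probability below symbol s
        c += probs[s]
    lo, width = Fraction(0), Fraction(1)
    for s in message:
        lo += width * cum[s]
        width *= probs[s]
    return lo, lo + width

probs = {"a": Fraction(1, 2), "b": Fraction(1, 2)}
lo, hi = message_interval(probs, "ab")
# the final width equals P(message) = 1/4, so about
# log2(1/width) = 2 bits suffice to point into the interval
```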
- Lecture 5 notes (postscript, 4 pages) | ps mirror | pdf | pdf mirror
- Noisy Channel Coding Theorem I
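The headline quantity of the theorem, the channel capacity, is easy to compute for the binary symmetric channel. A sketch (Python; the flip probability 0.1 is an illustrative value of mine, not one from the notes):

```python
import math

def bsc_capacity(f):
    """Capacity of the binary symmetric channel with flip probability f:
    C = 1 - H2(f) bits per channel use, H2 being the binary entropy."""
    if f in (0, 1):
        return 1.0  # a noiseless (or deterministically inverting) channel
    h2 = -f * math.log2(f) - (1 - f) * math.log2(1 - f)
    return 1.0 - h2

bsc_capacity(0.1)  # about 0.53 bits per channel use
```

Shannon's theorem says communication at any rate below this capacity is possible with arbitrarily small error probability.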
- Lecture 6 notes (postscript, 2 pages) | ps mirror | pdf | pdf mirror
- Noisy Channel Coding Theorem II
- Lectures 7 and 8 notes (postscript, 4 pages) | ps mirror | pdf | pdf mirror
- Error-Correcting Codes and Real Channels.
In lecture 7 I also described work on
Decoding by variational free energy minimization.
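As the simplest concrete instance of an error-correcting code, here is a hedged sketch (Python, mine rather than the notes') of the rate-1/3 repetition code R3, which corrects any single bit flip per block by majority vote:

```python
def encode_r3(bits):
    """Repeat every source bit three times."""
    return [b for b in bits for _ in range(3)]

def decode_r3(received):
    """Majority vote within each block of three received bits."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

sent = encode_r3([1, 0, 1])      # [1,1,1, 0,0,0, 1,1,1]
sent[4] = 1                      # one channel error in the middle block
decode_r3(sent)                  # still [1, 0, 1]: the flip is corrected
```

The price is a rate of only 1/3, and the residual error probability falls only polynomially with more repetitions; the point of the noisy channel coding theorem is that far better codes exist.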
- Additional notes (postscript, 3 pages) | ps mirror | pdf | pdf mirror
- Bayesian inference (notes prepared for a lecture that didn't happen)
David MacKay / mackay@mrao.cam.ac.uk / home page