
Information and coding theory

Louis Wehenkel
Université de Liège, Institut Montefiore


Organization: first course on Thursday Feb. 8, 2018, 2PM-6PM, room II/93, Institut Montefiore

Course description - follow this link

New on 25.02.2015: Project 1 statement: link to the web page of A. Sutera

New on 18.9.2013:

- Video lectures and book by David MacKay (University of Cambridge): video lectures web page; Introduction and Chapter 1 of the book

New on 01.05.2017:

- Guidelines for preparing the written exam (May 2019)

New on 03.05.2018:

- Questions given for the written exam of May 2017

New on 9.10.2009:

- Reference book about probabilistic graphical models (pdf)

Overall goals:
Information theory provides a quantitative measure of the information carried by a message or an observation. This notion was introduced by Claude Shannon in 1948 to establish the limits of what is achievable in data compression and in transmission over noisy channels. Since then, the theory has found many applications in telecommunications, computer science and statistics.
The course is composed of three main parts.
1. The foundations of information theory.
2. An introduction to coding theory for data compression and error-free communication.
3. An overview of other applications of information theory.
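The central quantity of the first part is Shannon entropy, H(X) = -Σ p(x) log2 p(x), measured in bits. As a minimal sketch (not taken from the course notes), it can be computed in a few lines of Python:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum_i p_i * log2(p_i).

    Terms with p_i = 0 contribute nothing (0 * log 0 := 0).
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
```

This illustrates the link to part 2: the entropy of a source lower-bounds the average number of bits per symbol that any lossless compression scheme can achieve.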
Organization:
Tuesday PM (2 PM), second semester. Room II.93, Institut Montefiore
Personal projects
Oral exam in June.

Course Notes and reference book:
You can find below the original course notes in French.

.pdf with hyperlinks (10 Mbytes)

printable .ps.gz (3.6 Mbytes)

Reference book in English

David MacKay's book "Information Theory, Inference, and Learning Algorithms" (web page with downloadable pdf)
0. Motivation/Organization 125699 bytes.

1. Measures of information and uncertainty 125699 bytes.

2. Algebra of information and entropy 114757 bytes.

3. Inference and learning 82754 bytes.

4. Probabilistic and graphical independence models 310490 bytes.

5. Source coding (theory) 92784 bytes.

6. Data compression 96561 bytes.

7. Image compression 1248655 bytes.

8. Channel coding I 86238 bytes.

9. Channel coding II 252622 bytes.

Last update: 01/02/2015