# ECE/TCOM-5583: Information Theory

This is a graduate introductory course on information theory. I have offered this course for almost ten years now, and I have decided to make some significant changes to the material for this offering. Classic topics such as the Kraft inequality and the separation theorem are intellectually interesting. Unfortunately, I have found that most of my students (myself included) will not end up becoming information theorists, and thus most of them do not feel connected to those topics.

Therefore, I plan to follow the approach of Professor MacKay for this offering. That is, I will try to incorporate core information theory material (for example, information measures, the AEP, and source and channel coding theory) into statistical learning. Accordingly, the main textbook will be Professor MacKay's textbook, Information Theory, Inference, and Learning Algorithms. Even so, I probably will not follow the book very closely, and I am inclined to borrow heavily from materials on the Internet. The two other most important reference texts are

• Elements of Information Theory by Cover and Thomas

• Pattern Recognition and Machine Learning by Bishop

And a nice reference on network information theory is

• Network Information Theory by El Gamal and Kim

However, we won't go into network information theory for this introductory course.

### Office Hours

There are no “regular” office hours, but you are very likely to find me in my office any afternoon. You are welcome to catch me anytime or contact me through email.

### Course Syllabus (Tentative)

• Probability review

• Maximum likelihood estimator, MAP, Bayesian estimator

• Graphical models and message passing algorithms

• Lossless source coding theory, Huffman coding, and introduction to arithmetic coding

• Asymptotic equipartition property (AEP), typicality and joint typicality

• Entropy, conditional entropy, mutual information, and their properties

• Channel coding theory, capacity, and Fano’s inequality

• Continuous random variables, differential entropy, Gaussian source, and Gaussian channel

• Error-correcting codes, linear codes, and introduction to low-density parity-check codes

• Method of types, large deviation theory, maximum entropy principle

N.B. You should expect to be exposed to some Python and MATLAB. You won't become an expert in these tools after this class, but it is good to get your hands dirty and play with them early.
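To give a small taste of the kind of Python you might see in class, here is a minimal sketch (my own illustrative function, not from the textbook) that computes the Shannon entropy of a discrete distribution, one of the first quantities we will define:

```python
import math

def entropy(p):
    """Shannon entropy H(X), in bits, of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A fair coin carries exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin carries less.
print(entropy([0.9, 0.1]))   # about 0.469
```

A few lines like these are often enough to sanity-check the formulas we derive on the board.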