
EECS 769: Information Theory

Fall 2015


Announcements

  • [08/23] Course webpage is up!

Brief Course Description

<WRAP justify> Information theory is the science of operations on data such as compression, storage, and communication. It is one of the few scientific fields fortunate enough to have an identifiable beginning: Claude Shannon's 1948 paper. The main topics of mutual information, entropy, and relative entropy are essential for students, researchers, and practitioners in such diverse fields as communications, data compression, statistical signal processing, neuroscience, and machine learning. The topics covered in this course include mathematical definitions and properties of information, mutual information, the source coding theorem, lossless data compression, optimal lossless coding, noisy communication channels, the channel coding theorem, the source-channel separation theorem, multiple access channels, broadcast channels, Gaussian noise, time-varying channels, and network information theory. </WRAP>
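
As a quick, unofficial illustration of the quantities named above, the following minimal Python sketch computes entropy, relative entropy (Kullback-Leibler divergence), and mutual information for a small discrete joint distribution. The example probabilities and function names are chosen for this sketch only and are not part of the course materials.

<code python>
import numpy as np

# Minimal illustration of entropy, relative entropy (KL divergence), and
# mutual information for discrete distributions, in bits (log base 2).
# The joint distribution below is an arbitrary example, not from the textbook.

def entropy(p):
    """H(X) = -sum_x p(x) log2 p(x), ignoring zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """D(p || q) = sum_x p(x) log2(p(x)/q(x)); assumes q(x) > 0 wherever p(x) > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def mutual_information(p_xy):
    """I(X;Y) = D(p(x,y) || p(x)p(y)) for a joint pmf given as a 2-D array."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y
    return kl_divergence(p_xy.ravel(), (p_x * p_y).ravel())

if __name__ == "__main__":
    p_xy = np.array([[0.25, 0.25],
                     [0.40, 0.10]])          # example joint pmf of (X, Y)
    print("H(X)   =", entropy(p_xy.sum(axis=1)))
    print("H(Y)   =", entropy(p_xy.sum(axis=0)))
    print("I(X;Y) =", mutual_information(p_xy))
</code>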

Prerequisite

EECS 461 or an equivalent undergraduate probability course.

Time and Location

Class: MWF 12:00 - 12:50 PM, LEA 3153 - Lawrence, KS
Office hours: W 1:00 - 2:00 PM or by appointment, 2028 Eaton Hall
Instructor: Dr. Lingjia Liu (lingjialiu at ku dot edu)

Course Schedule

The detailed course schedule can be found <wrap em>HERE</wrap>.

Course Outline

<WRAP justify>

  1. The notion of information: entropy, mutual information, analysis of the mathematical properties of entropy, and its various applications.
  2. The asymptotic equipartition property (AEP), typical sequences and applications.
  3. Entropy rates of stochastic processes.
  4. Data compression: lossy and lossless compression, Shannon's source coding theorems, uniquely and instantaneously decodable codes, Kraft's inequality, Huffman codes (see the Huffman/Kraft sketch after this outline), Elias codes, arithmetic coding, universal coding schemes.
  5. Channel capacity: introduction to various channel models, Shannon's channel coding theorems, channel capacity, feedback capacity, capacity of channels with intersymbol interference, and the Blahut-Arimoto algorithm (see the capacity sketch after this outline).
  6. Gaussian channels: band-limited, parallel, with colored noise, with feedback, and applications.
  7. Rate-distortion and distortion-rate theory.
  8. Introduction to multi-user information theory and network information theory.

</WRAP>
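
To make item 4 concrete, here is a minimal, unofficial Python sketch that builds a Huffman code for an example source distribution and checks Kraft's inequality for the resulting codeword lengths. The source probabilities and helper names are illustrative assumptions, not a prescribed course implementation.

<code python>
import heapq

# Minimal sketch of Huffman coding for a discrete memoryless source.
# The probabilities below are an arbitrary example; any pmf works.

def huffman_code(pmf):
    """Return a prefix-free binary code {symbol: bitstring} via Huffman's algorithm."""
    # Each heap entry: (probability, unique tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(pmf.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p0, _, code0 = heapq.heappop(heap)   # two least-probable subtrees
        p1, _, code1 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in code0.items()}
        merged.update({s: "1" + c for s, c in code1.items()})
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    return heap[0][2]

def kraft_sum(code):
    """Kraft's inequality: sum_x 2^(-l(x)) <= 1 for any prefix-free binary code."""
    return sum(2.0 ** -len(word) for word in code.values())

if __name__ == "__main__":
    pmf = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    code = huffman_code(pmf)
    avg_len = sum(pmf[s] * len(code[s]) for s in pmf)
    print(code)                            # e.g. {'a': '0', 'b': '10', ...}
    print("average length:", avg_len)      # equals the entropy for this dyadic pmf
    print("Kraft sum:", kraft_sum(code))   # equals 1 for a complete Huffman code
</code>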
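
Similarly, the sketch below illustrates item 5 by running the Blahut-Arimoto iteration on a binary symmetric channel. The crossover probability, tolerances, and function names are illustrative assumptions only; the result can be compared against the closed-form capacity 1 - H(eps).

<code python>
import numpy as np

# Sketch of the Blahut-Arimoto algorithm for the capacity of a discrete
# memoryless channel with transition matrix W[x, y] = p(y|x).
# The binary symmetric channel below is an illustrative example only.

def blahut_arimoto(W, n_iter=200, tol=1e-12):
    """Return (capacity in bits, capacity-achieving input distribution)."""
    W = np.asarray(W, dtype=float)
    p = np.full(W.shape[0], 1.0 / W.shape[0])     # start from the uniform input

    def divergences(p_in):
        # D( W(.|x) || q ) for each input x, in bits; 0*log 0 treated as 0.
        q = p_in @ W                               # induced output distribution
        with np.errstate(divide="ignore", invalid="ignore"):
            return np.where(W > 0, W * np.log2(W / q), 0.0).sum(axis=1)

    for _ in range(n_iter):
        d = divergences(p)
        new_p = p * np.exp2(d)                     # multiplicative update
        new_p /= new_p.sum()
        if np.max(np.abs(new_p - p)) < tol:
            p = new_p
            break
        p = new_p
    capacity = float(np.sum(p * divergences(p)))   # I(X;Y) at the final input
    return capacity, p

if __name__ == "__main__":
    eps = 0.1                                      # crossover probability
    bsc = [[1 - eps, eps], [eps, 1 - eps]]         # binary symmetric channel
    C, p_star = blahut_arimoto(bsc)
    print("capacity:", C)                          # approaches 1 - H(0.1) ~ 0.531 bits
    print("optimal input:", p_star)                # uniform for the BSC
</code>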

Textbooks

  • Elements of Information Theory, Second Edition, by Thomas M. Cover and Joy A. Thomas, Wiley Series in Telecommunications and Signal Processing, July 18, 2006, ISBN-10: 0471241954; ISBN-13: 978-0471241959.

Additional references

  • Information Theory, Inference, and Learning Algorithms, by David MacKay, Cambridge University Press, 2003, ISBN-10: 0521642981; ISBN-13: 978-0521642989.
  • Information Theory and Reliable Communication, by Robert G. Gallager, John Wiley & Sons, January 15, 1968, ISBN-10: 0471290483; ISBN-13: 978-0471290483.
  • A Mathematical Theory of Communication, by Claude E. Shannon, Bell System Technical Journal, 1948.

Grade Components

  • 30% Homework Assignments (no collaboration)
  • 30% Midterm Exam
  • 30% Final Exam/Project
  • 10% Quizzes & Participation

A Note on Homework

<WRAP justify> Homework assignments will be handed out on Mondays (except for HW 1) and will be due the following Monday (one week later). Assignments that are submitted electronically as a LaTeX file, with a printed PDF copy handed in during class, will receive 10% bonus points. </WRAP>

Academic Misconduct

<WRAP justify> Instances of cheating (receiving or giving unauthorized help) may result in expulsion from the class and referral to the Dean. Cheating includes, but is not limited to: copying another student's exam, copying another student's homework, copying from solution manuals or previous students' homework, and having another student do your work. </WRAP>
