<WRAP justify> Information theory is the science of operations on data, such as compression, storage, and communication. It is one of the few scientific fields fortunate enough to have an identifiable beginning: Claude Shannon's 1948 paper. Its central topics of entropy, mutual information, and relative entropy are essential for students, researchers, and practitioners in fields as diverse as communications, data compression, statistical signal processing, neuroscience, and machine learning. The topics covered in this course include mathematical definitions and properties of information, mutual information, the source coding theorem, lossless data compression, optimal lossless coding, noisy communication channels, the channel coding theorem, the source-channel separation theorem, multiple-access channels, broadcast channels, Gaussian noise, time-varying channels, and network information theory. </WRAP>
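As a first taste of the quantities listed above, the Shannon entropy of a discrete distribution, H(X) = -Σ p(x) log₂ p(x), can be computed in a few lines. This is an illustrative sketch, not course-supplied code; the `entropy` helper is hypothetical:

```python
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), measured in bits.

    Terms with p == 0 are skipped, following the convention 0*log(0) = 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty.
print(entropy([0.5, 0.5]))   # -> 1.0
# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # -> about 0.469 bits
```

Note that entropy is maximized by the uniform distribution and drops to zero for a deterministic outcome, a fact used throughout the source coding results in this course.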
EECS 461 or an equivalent undergraduate probability course.
Class: TTh 1:00 - 2:15 PM, NIC 203 - Lawrence, KS
Office hours: T 12:00 - 1:00 PM or by appointment, 2028 Eaton Hall
Instructor: Dr. Lingjia Liu (lingjialiu at ku dot edu)
The detailed course schedule can be found <wrap em>HERE</wrap>.
<WRAP justify> Homework assignments will be handed out on Mondays (except for HW 1) and are due the following Monday (one week later). Assignments submitted electronically as a LaTeX file, with a printed PDF copy handed in during class, will receive a 10% bonus. </WRAP>
<WRAP justify> Instances of cheating (receiving or giving unauthorized help) may result in expulsion from the class and referral to the Dean. Cheating includes, but is not limited to: copying from another student's exam, copying from another student's homework, copying from solution manuals or previous students' homework papers, having another student do your work, etc. </WRAP>