
EECS 965: Estimation and Detection Theory

Spring 2013


Announcements

  • [11/26] Course webpage is set up!
  • [02/21] The lecture is canceled due to the closure of the university!
  • [02/26] The lecture is canceled due to the closure of the university!

Brief Course Description

This course will introduce students to two fundamental concepts in statistical signal processing, detection and estimation, along with their applications to electrical and computer engineering. The initial part of the course will focus on techniques in hypothesis testing and their performance analysis. This will be followed by the treatment of statistical estimation theory. Properties of optimal estimators will be explored for the estimation of deterministic and random parameters, including linear and nonlinear estimation and filtering. Various applications of detection and estimation theory will be introduced and further explored as part of course projects.

Prerequisite

EECS 861 or a relevant course on random processes.

Time and Location

Class: TR 04:00 - 05:15 PM, LEA 3153 - Lawrence, KS
Office hours: T 02:00 - 04:00 PM or by appointment, 2028 Eaton Hall
Instructor: Dr. Lingjia Liu (lingjialiu at ku dot edu)

Course Schedule

A detailed course schedule can be found HERE.

Major Goals

  1. Introduce the student to inference problems and the process of decision making. Provide real-world examples pertaining to the material covered in this course.
  2. Formulate the problem of binary hypothesis testing. Distinguish between the Bayesian framework and the Neyman-Pearson approach. Extend the concept of decision making to composite hypothesis testing and M-ary tests. (Examine the properties of an optimal detector and determine its receiver operating characteristic (ROC); a minimal detection sketch follows this list.)
  3. Provide an overview of classical estimation. Derive the Cramer-Rao lower bound and discuss the notion of sufficient statistics. Study fundamental estimation techniques including the best linear unbiased estimator (BLUE), the maximum likelihood estimator (MLE), and least squares estimation (LSE). (Define the expectation-maximization (EM) algorithm; see the estimation sketch after this list.)
  4. Provide an overview of Bayesian parameter estimation. Gain the ability to employ the maximum a posteriori (MAP) estimator, the minimum mean square error (MMSE) estimator, and the linear minimum mean square error (LMMSE) estimator. Review iterative estimation techniques including the least mean squares (LMS) algorithm and the recursive least squares (RLS) algorithm. (See the adaptive-filtering sketch after this list.)
  5. Explore fundamental concepts in signal estimation. Study Kalman-Bucy filtering, linear estimation, and Wiener-Kolmogorov filters. (See the Kalman filtering sketch after this list.)
  6. Carry out performance analysis based on large sample sizes. Apply the Chernoff bound to give performance guarantees. Develop the theory of large deviations for independent observations and survey asymptotic performance metrics for inference problems. (See the Chernoff bound sketch after this list.)
  7. Engage the student in an active learning experience. Provide an opportunity for the student to conduct original research through small group projects. Introduce the student to teamwork and collaborative efforts.

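
The following minimal Python (NumPy/SciPy) sketch is not part of the official course material; it illustrates the Neyman-Pearson approach of goal 2 on the textbook example of detecting a known DC level in white Gaussian noise. The signal level A, noise standard deviation sigma, sample size N, and false-alarm rate are assumptions chosen only for illustration.

  # Neyman-Pearson detection of a known DC level in white Gaussian noise.
  # Illustrative sketch; the model parameters below are assumed, not prescribed
  # by the course.
  import numpy as np
  from scipy.stats import norm

  rng = np.random.default_rng(0)
  A, sigma, N = 0.5, 1.0, 10               # DC level, noise std, samples per trial
  trials = 100_000
  P_FA = 0.1                               # target false-alarm probability

  # H1: x[n] = A + w[n] vs H0: x[n] = w[n]. The NP likelihood ratio test reduces
  # to comparing the sample mean against a threshold set by P_FA.
  thresh = norm.ppf(1 - P_FA) * sigma / np.sqrt(N)

  x0 = rng.normal(0.0, sigma, (trials, N))       # data under H0
  x1 = rng.normal(A, sigma, (trials, N))         # data under H1
  P_FA_hat = np.mean(x0.mean(axis=1) > thresh)   # empirical false-alarm rate
  P_D_hat = np.mean(x1.mean(axis=1) > thresh)    # empirical detection rate

  # Theoretical operating point: P_D = Q(Q^{-1}(P_FA) - sqrt(N)*A/sigma)
  P_D_theory = 1 - norm.cdf(norm.ppf(1 - P_FA) - np.sqrt(N) * A / sigma)
  print(f"P_FA ~ {P_FA_hat:.3f}, P_D ~ {P_D_hat:.3f} (theory {P_D_theory:.3f})")

Sweeping P_FA over (0, 1) and recording the resulting P_D traces out the detector's receiver operating characteristic.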
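
As a companion to goal 3, the sketch below compares the empirical variance of the maximum likelihood estimator of a DC level in Gaussian noise against the Cramer-Rao lower bound; the true level, noise variance, and sample size are again assumed values used only to make the comparison concrete.

  # MLE of an unknown DC level in white Gaussian noise versus the CRLB.
  import numpy as np

  rng = np.random.default_rng(1)
  A_true, sigma, N, trials = 2.0, 1.0, 50, 20_000   # assumed values

  # For x[n] = A + w[n] with w ~ N(0, sigma^2), the MLE of A is the sample mean
  # and the CRLB on the variance of any unbiased estimator is sigma^2 / N.
  x = rng.normal(A_true, sigma, (trials, N))
  A_hat = x.mean(axis=1)                   # MLE computed on each trial

  print(f"empirical variance of the MLE: {A_hat.var():.5f}")
  print(f"Cramer-Rao lower bound:        {sigma**2 / N:.5f}")

For this linear Gaussian model the sample-mean MLE is efficient, so the two numbers essentially coincide.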
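
For goal 4, the sketch below runs the least mean squares (LMS) algorithm to identify an unknown FIR system from its noisy output; the "true" taps, step size, and noise level are hypothetical values chosen for illustration.

  # LMS adaptation of a 4-tap FIR filter toward an unknown system.
  import numpy as np

  rng = np.random.default_rng(2)
  h_true = np.array([1.0, -0.5, 0.25, 0.1])    # unknown system to identify (assumed)
  L, mu, n_samples = len(h_true), 0.01, 5000   # filter length, step size, data length

  w = np.zeros(L)                              # adaptive filter weights
  x = rng.standard_normal(n_samples)           # white excitation
  d = np.convolve(x, h_true)[:n_samples] + 0.01 * rng.standard_normal(n_samples)

  for n in range(L, n_samples):
      u = x[n:n - L:-1]                        # most recent L input samples
      e = d[n] - w @ u                         # a priori estimation error
      w = w + mu * e * u                       # LMS weight update

  print("estimated taps:", np.round(w, 3))
  print("true taps:     ", h_true)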
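
For goal 5, the sketch below applies a scalar Kalman filter to a random-walk state observed in additive Gaussian noise; the process and measurement noise variances (q and r) are assumed values, and this discrete-time example is only the simplest analogue of the filtering theory covered in the course.

  # Scalar Kalman filter tracking a random-walk state from noisy measurements.
  import numpy as np

  rng = np.random.default_rng(3)
  n, q, r = 200, 0.01, 1.0                 # steps, process variance, measurement variance

  # State model:        x[k] = x[k-1] + w[k],  w ~ N(0, q)
  # Measurement model:  z[k] = x[k]   + v[k],  v ~ N(0, r)
  x = np.cumsum(np.sqrt(q) * rng.standard_normal(n))
  z = x + np.sqrt(r) * rng.standard_normal(n)

  x_hat, P = 0.0, 1.0                      # initial estimate and its variance
  est = np.empty(n)
  for k in range(n):
      x_pred, P_pred = x_hat, P + q        # predict
      K = P_pred / (P_pred + r)            # Kalman gain
      x_hat = x_pred + K * (z[k] - x_pred) # update with the innovation
      P = (1 - K) * P_pred
      est[k] = x_hat

  print(f"raw measurement MSE: {np.mean((z - x)**2):.3f}")
  print(f"Kalman estimate MSE: {np.mean((est - x)**2):.3f}")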
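
For goal 6, the sketch below compares the Chernoff bound on the tail probability of the sample mean of i.i.d. standard Gaussian observations with a Monte Carlo estimate; the sample size, threshold, and number of trials are arbitrary assumed values.

  # Chernoff bound versus Monte Carlo for a Gaussian sample-mean tail probability.
  import numpy as np

  rng = np.random.default_rng(4)
  N, a, trials = 20, 0.5, 200_000          # sample size, threshold, Monte Carlo trials

  # For the mean of N i.i.d. N(0, 1) variables, optimizing the Chernoff bound gives
  # P(mean > a) <= exp(-N * a^2 / 2), an exponential (large-deviations) decay in N.
  bound = np.exp(-N * a**2 / 2)

  x = rng.standard_normal((trials, N))
  p_hat = np.mean(x.mean(axis=1) > a)      # empirical tail probability

  print(f"Monte Carlo estimate: {p_hat:.5f}")
  print(f"Chernoff bound:       {bound:.5f}")

The bound is loose in absolute terms but captures the correct exponential decay rate in N, which is the large-deviations viewpoint of goal 6.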

Textbooks

  • Fundamentals of Statistical Signal Processing, Vol 1: Estimation Theory by S.M. Kay, Prentice Hall
  • Fundamentals of Statistical Signal Processing, Vol 2: Detection Theory by S.M. Kay, Prentice Hall

Additional Reference

  • Statistical Inference by G. Casella and R. Berger, Duxbury Press

Grade Components

  • 30% Homework Assignments
  • 30% Midterm Exam
  • 30% Final Exam/Project
  • 10% Quizzes & Participation