Jeffrey A. Bilmes
E E 596
Topics of current interest in signal and image processing. Content may vary from offering to offering. Prerequisite: permission of instructor.
This course will serve as a thorough overview of dynamic graphical models including hidden Markov models, dynamic Bayesian networks, sequential conditional random fields, deep belief networks and deep models in sequential processing, Kalman filters, switching Kalman filters, linear and non-linear dynamical systems, and other time-series methods.
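To give a concrete flavor of the linear dynamical systems and Kalman filters listed above, here is a minimal sketch of one Kalman-filter predict/update step. The matrices and numbers are hypothetical illustrations, not course material.

```python
import numpy as np

def kalman_step(mu, P, F, Q, H, R, z):
    """One predict/update step of a Kalman filter.

    mu, P : previous state mean and covariance
    F, Q  : state transition matrix and process-noise covariance
    H, R  : observation matrix and observation-noise covariance
    z     : new observation vector
    """
    # Predict: propagate the state estimate through the dynamics
    mu_pred = F @ mu
    P_pred = F @ P @ F.T + Q
    # Update: correct the prediction with the new observation
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    mu_new = mu_pred + K @ (z - H @ mu_pred)
    P_new = (np.eye(len(mu)) - K @ H) @ P_pred
    return mu_new, P_new

# Tiny scalar example with made-up numbers: prior N(0, 1), no dynamics,
# observation z = 1 with unit noise; the posterior mean lands halfway.
mu, P = np.array([0.0]), np.array([[1.0]])
F, Q = np.array([[1.0]]), np.array([[0.0]])
H, R = np.array([[1.0]]), np.array([[1.0]])
mu, P = kalman_step(mu, P, F, Q, H, R, np.array([1.0]))
print(mu, P)
```

Switching Kalman filters and non-linear dynamical systems generalize this same predict/update recursion, which is where the approximate-inference machinery discussed below becomes necessary.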
For each of the above, we will discuss the issues behind and methods for computing various forms of exact and approximate inference, and how this is distinct from the static graphical model case. The approximation schemes we discuss will be particularly suited to dynamic models. These include forward/backward algorithms, temporal junction trees and dynamic triangulation, conditioning and factorization approaches, variational approaches, sampling and particle-filter approaches (including importance sampling and Rao-Blackwellization), beam pruning strategies, and multi-pass coarse-to-fine strategies. We'll also discuss the island algorithm and other time-space trade-off strategies, including methods to run sequential models on large parallel machines.
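As a small example of exact inference in a dynamic model, here is a sketch of the forward algorithm for a discrete HMM, computing the probability of an observation sequence by recursive marginalization. The two-state model and its parameters are made up purely for illustration.

```python
import numpy as np

def forward(pi, A, B, obs):
    """Compute p(observation sequence) for a discrete HMM.

    pi  : (S,)   initial state distribution
    A   : (S, S) transition matrix, A[i, j] = p(state j | state i)
    B   : (S, O) emission matrix,  B[i, k] = p(symbol k | state i)
    obs : sequence of observation indices
    """
    alpha = pi * B[:, obs[0]]          # alpha_1(s) = pi(s) p(o_1 | s)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # recursion: sum over previous state
    return alpha.sum()                 # marginalize over the final state

# Hypothetical two-state, two-symbol HMM
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])
B = np.array([[0.9, 0.1],
              [0.3, 0.7]])
print(forward(pi, A, B, [0, 1, 0]))
```

The recursion runs in O(T S^2) time versus O(S^T) for brute-force enumeration over state paths; the approximate methods above become relevant when the state space is too large even for this.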
We will also cover learning algorithms, including generative and discriminative training (e.g., MMIE, MDI, and MCE), model-specific approximations (often from the speech recognition community), and structured max-margin approaches (often from the machine learning community).
We'll cover a number of applications including speech recognition, natural language processing, bioinformatics, econometrics, robotics (e.g., localization and mapping), activity recognition, and temporal models for inference with multi-rate sensing devices.
Prerequisites: basic probability, statistics, and random processes (e.g., EE505, a Stat 5xx class, or consent of the instructor). Knowledge of Matlab. Familiarity with static graphical models would be useful but is not necessary, since the inference methods and applications differ considerably between the static and dynamic cases.
The course is open to students in all UW departments.
Texts: We will mainly use written material that will be made available on the web page, class presentation slides, as well as other printed and online material.
Grades and Assignments: Grades will be based on a combination of a final project, class attendance, class participation, and several homework assignments.
Meeting time/place: MW 1:30-3:20 PCAR 492
Final project: The final project will consist of a 4-page paper (conference style) and a 15-minute final project presentation on material related to the course (which may be a paper review).