Course Schedule - Spring Semester 2020
Meeting location information can now be found on student schedules in ESTHER (for students) or on the Course Roster in ESTHER (for faculty and instructors).

ELEC 631 001 (CRN: 21943)

GENERATIVE MODELS FOR ML

Long Title: ADVANCED TOPICS IN SIGNAL PROCESSING AND MACHINE LEARNING
Department: Electrical & Computer Eng.
Instructor: Baraniuk, Richard G
Meeting: 2:00PM - 4:30PM F (13-JAN-2020 - 24-APR-2020) 
Part of Term: Full Term
Grade Mode: Standard Letter
Course Type: Lecture
Language of Instruction: Taught in English
Method of Instruction: Face to Face
Credit Hours: 3
Course Syllabus:
Course Materials: Rice Campus Store
 
Restrictions:
Must be enrolled in one of the following Level(s):
Graduate
Section Max Enrollment: 25
Section Enrolled: 11
Enrollment data as of: 7-OCT-2024 12:48AM
 
Additional Fees: None
 
Final Exam: GR Course-Dept Schedules Exam
 
Description: There is a long history of algorithmic development for solving the inference and estimation problems that play a central role in a variety of learning, sensing, and processing systems, including medical imaging scanners, numerous machine learning algorithms, and compressive sensing, to name just a few. Until recently, most algorithms for solving these problems have iteratively applied static models derived from physics or intuition. In this course, we will explore a new approach based on “learning” various elements of the problem, including i) the step sizes and parameters of iterative algorithms, ii) regularizers, and iii) inverse functions. For example, we will explore a new approach for solving inverse problems that transforms an iterative, physics-based algorithm into a deep network whose parameters can be learned from training data. For a range of inverse problems, such deep networks have been shown to converge faster to a better-quality solution.

Specific topics to be discussed include: ill-posed inverse problems, iterative optimization, deep learning, neural networks, and learned regularizers.

This is a “reading course”: students will read classic and recent papers from the technical literature and present them to the rest of the class in a lively debate format. Discussions will aim at identifying common themes and important trends in the field. Students will also get hands-on experience with optimization problems and deep learning software through a group project. Repeatable for Credit.
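The “unrolling” idea in the description — turning a fixed number of iterations of a physics-based algorithm into the layers of a network whose per-layer parameters could then be learned — can be sketched with ISTA for the sparse inverse problem min_x 0.5‖Ax − b‖² + λ‖x‖₁. The step sizes and thresholds below are hand-set for illustration; in a learned (LISTA-style) network they would instead be fit to training data. This is a minimal sketch, not course material.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def unrolled_ista(A, b, step_sizes, thresholds):
    """Run ISTA for a fixed number of 'layers', one (step, threshold)
    pair per layer. In a learned network these per-layer parameters
    would be trained; here they are simply hand-set."""
    x = np.zeros(A.shape[1])
    for eta, lam in zip(step_sizes, thresholds):
        grad = A.T @ (A @ x - b)            # gradient of 0.5*||Ax - b||^2
        x = soft_threshold(x - eta * grad, eta * lam)
    return x

# Tiny demo: recover a sparse vector from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 50)) / np.sqrt(30)
x_true = np.zeros(50)
x_true[[3, 17, 42]] = [1.0, -2.0, 1.5]
b = A @ x_true

K = 20                                      # number of unrolled layers
eta = 1.0 / np.linalg.norm(A, 2) ** 2       # step size <= 1/L for stability
x_hat = unrolled_ista(A, b, [eta] * K, [0.05] * K)
```

Each loop iteration corresponds to one network layer; replacing the shared `(eta, lam)` pair with per-layer learnable values is what allows faster convergence than the classical algorithm.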