Course Schedule - Spring Semester 2026


Meeting location information can now be found on student schedules in ESTHER (for students) or on the Course Roster in ESTHER (for faculty and instructors).

CMOR 537 001 (CRN: 24504)

COMPUTER-ASSISTED ALG DESIGN

Long Title: COMPUTER-ASSISTED ALGORITHM DESIGN FOR OPTIMIZATION AND MACHINE LEARNING
Department: Comp Appl Math Operations Rsch
Instructor: Das Gupta, Shuvomoy
Meeting: 2:00PM - 4:50PM W (12-JAN-2026 - 24-APR-2026) 
Part of Term: Full Term
Grade Mode: Standard Letter
Course Type: Lecture
Language of Instruction: Taught in English
Method of Instruction: Face to Face
Credit Hours: 3
Course Syllabus:
Course Materials: Rice Campus Store
 
Restrictions:
Must be enrolled in one of the following Level(s):
Graduate
Prerequisites: CMOR 531 OR CMOR 532 OR CMOR 533 OR CMOR 536 OR INDE 517 OR INDE 577
Section Max Enrollment: 30
Section Enrolled: 7
Enrollment data as of: 25-NOV-2025 9:20AM
 
Additional Fees: None
 
Final Exam: GR Course-Dept Schedules Exam
 
Description: In a traditional first course in continuous optimization, students learn to use and analyze existing algorithms, such as applying Stochastic Gradient Descent to train neural networks, using Accelerated Gradient Descent to solve image processing problems, or studying their convergence analysis. In contrast, this course methodically addresses the fundamental questions: Where do these algorithms come from? How do we know they are any good? And, most importantly, could we design something provably better?

The core of this course is a transition from merely using and analyzing optimization algorithms to actively designing them, treating the algorithm design process itself as an optimization problem. The goal is to construct algorithms that are the provably fastest methods for the specific classes of problems under consideration. In particular, we will focus on designing optimal first-order methods (FOMs), that is, methods that rely solely on gradient or subgradient information. These are the most commonly used optimization algorithms today, given their efficacy in solving high-dimensional problems, compatibility with large-scale datasets, and applicability to machine learning.

The curriculum is structured in two main parts. First, we will learn to build a mathematical model, itself an optimization problem, that calculates the absolute worst-case performance of a given algorithm. This provides a rigorous framework for computing tight convergence guarantees for a known algorithm. Subsequently, we will formulate the search for the provably fastest algorithm as a new, higher-level optimization problem. The solution to this problem is, in fact, the optimal algorithm we seek. This modern, computer-assisted approach to algorithm design is known as Performance Estimation Programming (PEP).

This is a hands-on course with a strong emphasis on implementation, where students will learn to build from scratch the tools that discover these algorithms using PEP. By the end, students will not only understand the theory behind cutting-edge algorithm design but will also have the skills to discover and analyze new, high-performance algorithms for a variety of challenges in optimization and data science.
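
To give a concrete sense of the first part of the curriculum, the sketch below shows how a worst-case guarantee for a known algorithm can be computed numerically. It is only an illustration, not course material: it assumes the open-source PEPit Python package (a performance-estimation toolbox not mentioned in this listing), and the choices of algorithm (plain gradient descent), smoothness constant, step size, and initial-distance bound are arbitrary examples.

    # Illustrative sketch (not from the course materials): computing a tight
    # worst-case bound for gradient descent via performance estimation,
    # using the open-source PEPit package (pip install pepit).
    from PEPit import PEP
    from PEPit.functions import SmoothConvexFunction

    def worst_case_gradient_descent(L=1.0, gamma=1.0, n=5):
        """Tight worst-case value of f(x_n) - f(x*) for n steps of gradient
        descent with step size gamma on an L-smooth convex f, ||x_0 - x*|| <= 1."""
        problem = PEP()

        # Declare the function class over which the guarantee must hold.
        f = problem.declare_function(SmoothConvexFunction, L=L)
        xs = f.stationary_point()   # a minimizer x*
        fs = f(xs)                  # optimal value f(x*)

        # Initial point, constrained to start within distance 1 of x*.
        x0 = problem.set_initial_point()
        problem.set_initial_condition((x0 - xs) ** 2 <= 1)

        # The algorithm being analyzed: n steps of plain gradient descent.
        x = x0
        for _ in range(n):
            x = x - gamma * f.gradient(x)

        # Maximize the final suboptimality over all admissible f and x_0.
        problem.set_performance_metric(f(x) - fs)
        return problem.solve(verbose=0)

    if __name__ == "__main__":
        # For gamma = 1/L this should recover the known tight bound L / (4n + 2).
        print(worst_case_gradient_descent(L=1.0, gamma=1.0, n=5))

For step size 1/L, the returned value should match the known tight bound L/(4n + 2) on f(x_n) - f(x*); the second part of the course then treats quantities like the step sizes themselves as decision variables and searches for the method with the smallest such worst-case value.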