Scalable Convex Optimization with Applications to Semidefinite Programming
Dr. Alp Yurtsever, Laboratory for Information and Decision Systems, Massachusetts Institute of Technology
Tuesday, December 8
16:00 via Zoom
To request the event link, please contact the department.
Storage and arithmetic costs are critical bottlenecks that prevent us from solving semidefinite programs (SDPs) at the scale demanded by real-world applications. For instance, SDPs often have low-rank solutions, yet conventional methods operate on the full matrix decision variable. To address this problem, we ask a fundamental question: Suppose that the solution to an SDP has a compact representation. Can we develop algorithms that provably approximate a solution using storage that does not exceed the size of the problem data or the size of the solution?
This talk presents a convex optimization paradigm that achieves this goal. The key insight is to design algorithms that maintain only a small sketch of the matrix variable. Combining this idea with conditional gradient methods, we introduce an algorithm that can solve large SDPs beyond the reach of other convex optimization methods.
(Based on joint work with Volkan Cevher, Olivier Fercoq, Madeleine Udell, and Joel Tropp.)
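The following sketch illustrates, on a toy problem, the mechanism described in the abstract: a conditional-gradient (Frank-Wolfe) method for a trace-constrained SDP whose iterates are rank-one updates, applied not to the full matrix variable but to a small randomized sketch of it together with the low-dimensional image of the variable under the measurement operator, with a Nyström-type reconstruction at the end. This is not the speaker's exact algorithm, only an illustration of the idea; the least-squares objective, the synthetic Gaussian data, the exact line search, and all parameters and variable names (A_ops, Omega, S, z, R, and so on) are assumptions made for this example.

```python
# Illustrative sketch (not the speaker's exact method): conditional-gradient steps for
#   minimize 0.5 * ||A(X) - b||^2  over  { X PSD, trace(X) = alpha },  A(X)_i = <A_i, X>,
# while storing only z = A(X) and a randomized Nystrom sketch S = X @ Omega,
# never the full n x n matrix X.
import numpy as np
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(0)
n, m, R, alpha, T = 100, 200, 10, 1.0, 300   # dimension, measurements, sketch size, trace bound, iterations

# Synthetic data: random symmetric measurement matrices A_i and b = A(X_true)
# for a low-rank, trace-alpha PSD ground truth (used only to generate b).
A_ops = [rng.standard_normal((n, n)) for _ in range(m)]
A_ops = [(Ai + Ai.T) / 2 for Ai in A_ops]
F = rng.standard_normal((n, 3))
X_true = F @ F.T
X_true *= alpha / np.trace(X_true)
b = np.array([np.sum(Ai * X_true) for Ai in A_ops])

# Storage-optimal state: z = A(X) (length m) and the sketch S = X @ Omega (n x R).
Omega = rng.standard_normal((n, R))
S = np.zeros((n, R))
z = np.zeros(m)

for t in range(T):
    g = z - b                                       # residual; gradient of the objective w.r.t. z
    G = sum(gi * Ai for gi, Ai in zip(g, A_ops))    # gradient w.r.t. X is sum_i g_i * A_i
    _, V = eigsh(G, k=1, which='SA')                # linear minimization oracle: min eigenvector of G
    v = V[:, 0]
    w = alpha * np.array([v @ Ai @ v for Ai in A_ops])    # A(alpha * v v^T)
    d = w - z
    eta = np.clip(-(g @ d) / (d @ d + 1e-32), 0.0, 1.0)   # exact line search for this quadratic objective
    # Rank-one update X <- (1 - eta) X + eta * alpha * v v^T, applied only to z and S.
    z = (1 - eta) * z + eta * w
    S = (1 - eta) * S + eta * alpha * np.outer(v, Omega.T @ v)

# Nystrom reconstruction: a rank-R PSD approximation of the final iterate from its sketch.
nu = np.sqrt(n) * np.finfo(float).eps * np.linalg.norm(S)   # small shift for numerical stability
Y = S + nu * Omega
B = Omega.T @ Y
L = np.linalg.cholesky((B + B.T) / 2)
E = np.linalg.solve(L, Y.T).T                       # E = Y L^{-T}, so E @ E.T = Y B^{-1} Y^T
X_hat = E @ E.T - nu * np.eye(n)

obj_tracked = 0.5 * np.linalg.norm(z - b) ** 2
obj_recon = 0.5 * sum((np.sum(Ai * X_hat) - bi) ** 2 for Ai, bi in zip(A_ops, b))
print(f"objective tracked via z:              {obj_tracked:.4e}")
print(f"objective of rank-{R} reconstruction:  {obj_recon:.4e}")
print(f"trace of reconstruction:              {np.trace(X_hat):.4f} (target {alpha})")
```

With this arrangement, the per-iteration state is z (length m) and S (size n x R), so storage scales with the problem data and the sketch rather than with the n x n matrix variable; the matrix is only (approximately) reconstructed at the end, and the reconstruction error is small when the final iterate is close to rank R or lower, which is the low-rank regime the abstract targets.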
Alp Yurtsever is a postdoctoral researcher in the Laboratory for Information and Decision Systems at the Massachusetts Institute of Technology. He received his PhD in Computer and Communication Sciences from École Polytechnique Fédérale de Lausanne (EPFL) in 2019. Before that, he received double-major BSc degrees in Electrical & Electronics Engineering and Physics from Middle East Technical University in Turkey in 2013. His work focuses on developing theory and algorithms for optimization problems with broad applications, and his doctoral dissertation, entitled “Scalable Convex Optimization Methods for Semidefinite Programming,” was honored with a Thesis Distinction by the EPFL Program Committee.