Yuxin Chen - [Lecture 1] The power of nonconvex optimization in solving random quadratic systems of equations
From Katie Gentilello
We consider the fundamental problem of solving random quadratic systems of equations in n variables, which spans many applications ranging from the century-old phase retrieval problem to various latent-variable models in machine learning. A growing body of recent work has demonstrated the effectiveness of convex relaxation, in particular semidefinite programming, for solving problems of this kind. However, the computational cost of such convex paradigms is often unsatisfactory, which limits their applicability to high-dimensional data.
This talk follows another route: by formulating the problem as a nonconvex program, we attempt to optimize the nonconvex objective directly. We demonstrate that for certain unstructured models of quadratic systems, nonconvex optimization algorithms return the correct solution in linear time, as soon as the ratio between the number of equations and unknowns exceeds a fixed numerical constant. We extend the theory to noisy systems and prove that our algorithms achieve minimax-optimal statistical accuracy. Numerical evidence suggests that the computational cost of our algorithm is about four times that of solving a least-squares problem of the same size.
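The abstract does not spell out the algorithm, but a common instance of this nonconvex strategy for quadratic systems is a spectral initialization followed by gradient descent on a least-squares loss (in the style of Wirtinger flow). The sketch below illustrates that pattern under the assumed measurement model y_i = (a_i^T x)^2 with Gaussian sensing vectors; the problem sizes, step size, and iteration count are illustrative choices, not the tuned values from the talk.

```python
import numpy as np

# Minimal sketch: spectral initialization + gradient descent on
# f(z) = (1/4m) * sum_i ((a_i^T z)^2 - y_i)^2, assuming y_i = (a_i^T x)^2
# with Gaussian sensing vectors a_i. All constants are illustrative.

rng = np.random.default_rng(0)
n, m = 100, 800                          # unknowns and equations; m/n is a fixed constant
x_true = rng.standard_normal(n)
A = rng.standard_normal((m, n))          # rows are the sensing vectors a_i
y = (A @ x_true) ** 2                    # quadratic (phaseless) measurements

# Spectral initialization: leading eigenvector of (1/m) * sum_i y_i a_i a_i^T,
# rescaled so that ||z_0|| matches the estimated signal norm sqrt(mean(y)).
Y = (A.T * y) @ A / m
_, eigvecs = np.linalg.eigh(Y)
z = eigvecs[:, -1] * np.sqrt(y.mean())

# Gradient descent on the nonconvex objective; step scaled by 1/||z_0||^2.
step = 0.2 / (np.linalg.norm(z) ** 2)
for _ in range(2000):
    Az = A @ z
    grad = A.T @ ((Az ** 2 - y) * Az) / m
    z -= step * grad

# The solution is identifiable only up to a global sign.
err = min(np.linalg.norm(z - x_true), np.linalg.norm(z + x_true)) / np.linalg.norm(x_true)
print(f"relative error: {err:.2e}")
```

Each iteration costs only a few matrix-vector products with A, which is why the overall runtime is comparable (up to a constant factor) to solving a least-squares problem of the same size.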