We present new stochastic methods for solving two important classes of
nonconvex optimization problems. We first introduce a randomized
accelerated proximal gradient (RapGrad) method for solving a class of
nonconvex optimization problems whose objective function consists of the
summation of m components, and show that it can significantly reduce
the number of gradient computations, especially when the condition number
(i.e., the ratio between the Lipschitz constant and the negative curvature)
is large. Inspired by RapGrad, we also develop a new randomized
accelerated proximal dual (RapDual) method for solving a class of
multi-block nonconvex optimization problems whose blocks are coupled
through linear constraints. We demonstrate that RapDual can also
significantly reduce the number of block updates compared with its
deterministic counterpart. To the best
of our knowledge, all the complexity results associated with RapGrad
and RapDual seem to be new in the literature. We also illustrate
potential advantages of these algorithms through our preliminary
numerical experiments.
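
To make the first problem class concrete, the following is a schematic
LaTeX formalization of the finite-sum setting and the condition number
mentioned above; the symbols $\psi$, $f_i$, $L$, $\mu$, and $\kappa$ are
illustrative notation introduced here, not taken verbatim from the paper.

    % Finite-sum nonconvex problem targeted by RapGrad (schematic; notation ours).
    \begin{equation*}
      \min_{x \in \mathbb{R}^n} \; \psi(x) := \frac{1}{m} \sum_{i=1}^{m} f_i(x),
    \end{equation*}
    % where each gradient \nabla f_i is L-Lipschitz and each component has
    % lower curvature bound -\mu, i.e., f_i(\cdot) + \tfrac{\mu}{2}\|\cdot\|^2
    % is convex. The condition number is then \kappa := L / \mu.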
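Similarly, a hedged sketch of the linearly constrained multi-block class
addressed by RapDual; the number of blocks $k$, the matrices $A_i$, and
the vector $c$ are assumptions made for illustration, not necessarily the
paper's exact formulation:

    % Multi-block problem with blocks coupled by a linear constraint (schematic).
    \begin{equation*}
      \min_{x_1, \dots, x_k} \; \sum_{i=1}^{k} f_i(x_i)
      \quad \text{s.t.} \quad \sum_{i=1}^{k} A_i x_i = c,
    \end{equation*}
    % where each f_i is smooth but possibly nonconvex, and the blocks x_i
    % interact only through the linear constraint.

On this reading, randomization acts over the blocks $x_i$, which is
consistent with block updates, rather than full gradient computations,
being the complexity measure quoted above.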