We analyze stochastic gradient algorithms for optimizing nonconvex problems. In particular, our goal is to find local minima (second-order stationary points) rather than just first-order stationary points, which may be bad, unstable saddle points. We show that a simple perturbed version of the stochastic recursive gradient descent algorithm (called SSRGD) can find an (ε, δ)-second-order stationary point with Õ(√n/ε² + √n/δ⁴ + n/δ³) stochastic gradient complexity for nonconvex finite-sum problems. As a by-product, SSRGD finds an ε-first-order stationary point with O(n + √n/ε²) stochastic gradients. These results are almost optimal, since Fang et al. provided a lower bound of Ω(√n/ε²) for finding even just an ε-first-order stationary point. We emphasize that the SSRGD algorithm for finding second-order stationary points is as simple as the one for finding first-order stationary points, requiring only the occasional addition of a uniform perturbation, whereas all other algorithms that find second-order stationary points with similar gradient complexity need to be combined with a negative-curvature search subroutine (e.g., Neon2). Moreover, the simple SSRGD algorithm admits a simpler analysis. Finally, we also extend our results from nonconvex finite-sum problems to nonconvex online (expectation) problems, and prove the corresponding convergence results.
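To make the "perturbed recursive gradient" idea concrete, the following is a minimal NumPy sketch of an SSRGD-style loop: a SARAH-style recursive variance-reduced gradient estimator, with a perturbation drawn uniformly from a small ball whenever the anchor gradient is small. All names, step sizes, and stopping rules here are illustrative assumptions; the paper's actual algorithm uses more careful super-epoch and termination conditions.

```python
import numpy as np

def ssrgd(grad_i, n, x0, eta=0.1, b=None, epoch_len=None,
          grad_tol=1e-3, perturb_radius=1e-3, num_epochs=50, rng=None):
    """Sketch of perturbed stochastic recursive gradient descent.

    grad_i(x, idx): returns the average gradient of the component
    functions f_i for i in idx, evaluated at x (user-supplied).
    n: number of component functions in the finite sum.
    """
    rng = np.random.default_rng() if rng is None else rng
    b = int(np.sqrt(n)) if b is None else b        # minibatch size ~ sqrt(n)
    epoch_len = b if epoch_len is None else epoch_len
    x = x0.copy()
    for _ in range(num_epochs):
        # Full gradient at the epoch anchor point.
        v = grad_i(x, np.arange(n))
        # If the anchor gradient is already small, inject a perturbation
        # sampled uniformly from a ball so the iterates can leave a saddle.
        if np.linalg.norm(v) <= grad_tol:
            xi = rng.standard_normal(x.shape)
            xi *= perturb_radius * rng.random() ** (1 / x.size) / np.linalg.norm(xi)
            x = x + xi
            v = grad_i(x, np.arange(n))
        x_prev = x.copy()
        x = x - eta * v
        for _ in range(epoch_len):
            idx = rng.integers(0, n, size=b)
            # Recursive (SARAH-style) estimator: correct the previous
            # estimate by a minibatch gradient difference.
            v = grad_i(x, idx) - grad_i(x_prev, idx) + v
            x_prev = x.copy()
            x = x - eta * v
    return x
```

The minibatch size b ≈ √n in this sketch mirrors the choice behind the √n/ε² term in the stated complexity; the escape step is the only addition over the plain first-order variant.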
| Original language | English (US) |
| Title of host publication | 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019 |
| Publisher | Neural Information Processing Systems Foundation |
| State | Published - Jan 1 2019 |