Article

Stochastic Auto-conditioned Fast Gradient Methods with Optimal Rates

We propose stochastic AC-FGM, a parameter-free accelerated method for stochastic composite convex optimization that is fully adaptive to the Lipschitz constant, iteration horizon, …

Yao Ji

High-Order Accumulative Regularization for Gradient Minimization in Convex Programming

A unified high-order framework that closes the gap between fast function-value residual convergence and slow gradient norm convergence in convex optimization.

Yao Ji