Stochastic Auto-conditioned Fast Gradient Methods with Optimal Rates

Mar 20, 2026
Yao Ji, Guanghui Lan
Abstract
Achieving optimal rates for stochastic composite convex optimization without prior knowledge of problem parameters remains a central challenge. We propose a stochastic variant of the auto-conditioned fast gradient method, referred to as stochastic AC-FGM. The proposed method is fully adaptive to the Lipschitz constant, the iteration horizon, and the noise level, enabling both adaptive stepsize selection and adaptive mini-batch sizing without line-search procedures. Under standard bounded conditional variance assumptions, we show that stochastic AC-FGM achieves the optimal iteration complexity of $\mathcal{O}(1/\sqrt{\epsilon})$ and the optimal sample complexity of $\mathcal{O}(1/\epsilon^2)$.
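The abstract names the ingredients of the method (stepsizes chosen without line search or a known Lipschitz constant, mini-batch sizes adapted to damp noise) but not the update rule itself. As a loose illustration of those ingredients only, here is a generic stochastic proximal-gradient sketch on a toy composite problem. It is *not* the stochastic AC-FGM update: the AdaGrad-style stepsize, the linearly growing batch schedule, the toy objective, and all function names are assumptions made for this example.

```python
import math
import random

def prox_l1(x, step, lam):
    """Proximal operator of lam*|x| with stepsize `step` (soft-thresholding)."""
    return math.copysign(max(abs(x) - step * lam, 0.0), x)

def adaptive_stochastic_prox_grad(x0=5.0, lam=0.1, sigma=0.5,
                                  iters=400, eta=1.0, seed=0):
    """Illustrative loop (NOT stochastic AC-FGM) on the composite problem
        F(x) = E[(x - xi)^2] / 2 + lam * |x|,   xi ~ N(1, sigma^2),
    whose minimizer is x* = 1 - lam = 0.9.  Two ingredients loosely mirror
    the abstract: an AdaGrad-style stepsize that uses no Lipschitz constant
    and no line search, and a mini-batch that grows with the iteration
    count to damp stochastic gradient noise."""
    rng = random.Random(seed)
    x = x0
    grad_sq_sum = 0.0  # running sum of squared stochastic gradient norms
    for k in range(1, iters + 1):
        batch = k  # hypothetical growing mini-batch schedule
        g = sum(x - rng.gauss(1.0, sigma) for _ in range(batch)) / batch
        grad_sq_sum += g * g
        step = eta / math.sqrt(grad_sq_sum)  # adaptive stepsize, no line search
        x = prox_l1(x - step * g, step, lam)
    return x
```

Running `adaptive_stochastic_prox_grad()` lands near the minimizer 0.9; the point is only that both the stepsize and the batch size are set from observed quantities rather than from a known smoothness constant or noise level, which is the parameter-free behavior the abstract claims for stochastic AC-FGM.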
Type: Publication
2026 INFORMS Optimization Society Conference, Atlanta, GA — Talk