Stochastic Auto-conditioned Fast Gradient Methods with Optimal Rates
Abstract
Achieving optimal rates for stochastic composite convex optimization without prior knowledge of problem parameters remains a central challenge. We propose a stochastic variant of the auto-conditioned fast gradient method, referred to as stochastic AC-FGM. The proposed method requires no prior knowledge of the Lipschitz constant, the iteration horizon, or the noise level, enabling both adaptive stepsize selection and adaptive mini-batch sizing without line-search procedures. Under the standard bounded conditional variance assumption, we show that stochastic AC-FGM achieves the optimal iteration complexity of $\mathcal{O}(1/\sqrt{\epsilon})$ and the optimal sample complexity of $\mathcal{O}(1/\epsilon^2)$.
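To make the two adaptive ingredients in the abstract concrete, the Python sketch below shows one generic way a line-search-free accelerated stochastic loop can be organized: a local Lipschitz estimate is maintained from successive stochastic gradients, the stepsize is set from that estimate, and the mini-batch size grows with the iteration count. This is a minimal illustration, not the AC-FGM recursion from the talk; the function names, momentum weight, stepsize constant, and batch schedule are all assumptions made for the example.

```python
import numpy as np

def stochastic_acfgm_sketch(stoch_grad, x0, n_iters=200, L0=1.0):
    """Illustrative line-search-free accelerated stochastic gradient loop.

    `stoch_grad(x, batch)` returns a stochastic gradient at x averaged
    over `batch` samples.  The local-smoothness stepsize rule and the
    growing mini-batch schedule are generic stand-ins (assumptions),
    not the exact stochastic AC-FGM recursion.
    """
    x = y = z_prev = np.array(x0, dtype=float)
    L_hat = L0                          # running local Lipschitz estimate
    g_prev = stoch_grad(z_prev, batch=1)
    for t in range(1, n_iters + 1):
        beta = 2.0 / (t + 1)            # Nesterov-style momentum weight
        z = (1 - beta) * y + beta * x
        g = stoch_grad(z, batch=t)      # growing mini-batch (assumed schedule)
        # update the smoothness estimate from successive stochastic gradients
        dz = np.linalg.norm(z - z_prev)
        if dz > 1e-12:
            L_hat = max(L_hat, np.linalg.norm(g - g_prev) / dz)
        eta = 1.0 / (2.0 * L_hat)       # stepsize adapts to L_hat, no line search
        x = x - (eta / beta) * g        # aggressive (dual-averaging-like) sequence
        y = z - eta * g                 # gradient-descent sequence
        z_prev, g_prev = z, g
    return y

# Toy usage on noisy least squares, f(x) = 0.5 * ||A x - b||^2:
A = np.random.default_rng(1).normal(size=(50, 10))
b = A @ np.ones(10)

def noisy_grad(x, batch, rng=np.random.default_rng(2)):
    # exact gradient plus zero-mean Gaussian noise averaged over the batch
    noise = rng.normal(scale=0.1, size=(batch, x.size)).mean(axis=0)
    return A.T @ (A @ x - b) + noise

x_hat = stochastic_acfgm_sketch(noisy_grad, x0=np.zeros(10))
print(np.linalg.norm(A @ x_hat - b))
```

Averaging the noise over a batch of size $m$ shrinks the gradient variance by a factor of $m$, which is what lets a growing-batch schedule trade extra samples per iteration for the accelerated $\mathcal{O}(1/\sqrt{\epsilon})$ iteration count; the specific schedule achieving the optimal $\mathcal{O}(1/\epsilon^2)$ sample complexity is the subject of the talk, not of this sketch.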
Type
Talk
Publication
2026 INFORMS Optimization Society Conference, Atlanta, GA
Stochastic Optimization
Convex Optimization
Parameter-Free Optimization
Accelerated Gradient Methods

Authors
H. Milton Stewart Postdoctoral Fellow
I am an H. Milton Stewart Postdoctoral Fellow in the H. Milton Stewart School of Industrial and Systems Engineering (ISyE) at the Georgia Institute of Technology. My postdoctoral mentor is Prof. Guanghui (George) Lan.
I earned my Ph.D. in Industrial Engineering at Purdue University (2024). Prior to that, I received my B.S. (2016) and M.S. (2019) degrees from the School of Mathematical Sciences at Beijing Normal University.