Convex Optimization

Stochastic Auto-conditioned Fast Gradient Methods with Optimal Rates

We propose stochastic AC-FGM, a parameter-free accelerated method for stochastic composite convex optimization that is fully adaptive to the Lipschitz constant, the iteration horizon, …

Yao Ji

Stochastic Auto-conditioned Fast Gradient Methods with Optimal Rates

Talk at 2026 INFORMS Optimization Society Conference on stochastic AC-FGM, a parameter-free accelerated method achieving optimal convergence rates.

Yao Ji

High-Order Accumulative Regularization for Gradient Minimization in Convex Programming

A unified high-order framework that closes the gap between the fast convergence of the function-value residual and the slower convergence of the gradient norm in convex optimization.

Yao Ji

High-order Accumulative Regularization Methods for Gradient Minimization

Talk at 2025 INFORMS Annual Meeting on a unified high-order framework for optimal gradient minimization in convex optimization.

Yao Ji