Stochastic Auto-conditioned Fast Gradient Methods with Optimal Rates

Talk at the 2026 INFORMS Optimization Society Conference on stochastic AC-FGM, a parameter-free accelerated method achieving optimal convergence rates.

Yao Ji

High-order Accumulative Regularization Methods for Gradient Minimization

Talk at the 2025 INFORMS Annual Meeting on a unified high-order framework for optimal gradient minimization in convex optimization.

Yao Ji