High-Order Accumulative Regularization for Gradient Minimization in Convex Programming
A unified high-order framework that closes the gap between fast function-value residual convergence and slow gradient norm convergence in convex optimization.