This work introduces a moving anchor acceleration technique for extragradient algorithms applied to smooth, structured minimax problems. The moving anchor generalizes the original algorithmic anchoring framework, namely the EAG method of \cite{yoon_ryu}, in the hope of further acceleration.
We show that our new method achieves the optimal order of convergence in terms of worst-case complexity on the squared gradient norm, $O(1/k^{2})$, where $k$ is the number of iterations.
We also extend the moving anchor to a more general class of nonconvex-nonconcave saddle point problems using the framework of \cite{lee2021fast}, which slightly generalizes \cite{yoon_ryu}.
We obtain similar order-optimal complexity results in this extended case.
We also introduce and analyze a preconditioned version of our algorithms, which matches the optimal theoretical convergence rates.
Our final theoretical contribution is the development and analysis of a moving anchor with a stochastic oracle, which matches accelerated convergence rates for convex-concave problems.
Underlying these theoretical contributions is a family of new Lyapunov (energy) functional techniques that account for the moving anchor structure while maintaining the optimal order of complexity under minimal assumptions.
Various numerical experiments demonstrate the efficacy of the moving anchor extragradient algorithms compared to their fixed anchor counterparts, and in many cases suggest a smaller constant in the big-$O$ complexity bound than that of the traditional fixed anchor methods.
We conclude by discussing current challenges and future directions for this work.