Guarantees of Riemannian Optimization for Low Rank Matrix Recovery
Published Web Location
https://arxiv.org/pdf/1511.01562.pdf

Abstract
We establish theoretical recovery guarantees for a family of Riemannian optimization algorithms for low rank matrix recovery, i.e., recovering an $m\times n$ rank $r$ matrix from $p < mn$ linear measurements. The algorithms are first interpreted as iterative hard thresholding algorithms with subspace projections. Based on this connection, we show that, provided the restricted isometry constant $R_{3r}$ of the sensing operator is less than $C_\kappa/\sqrt{r}$, the Riemannian gradient descent algorithm and a restarted variant of the Riemannian conjugate gradient algorithm are guaranteed to converge linearly to the underlying rank $r$ matrix when initialized by one step of hard thresholding. Empirical evaluation shows that the algorithms are able to recover a low rank matrix from nearly the minimum number of measurements necessary.
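As a concrete illustration of the iteration the abstract describes, the following is a minimal NumPy sketch of Riemannian gradient descent for low rank matrix recovery: one step of hard thresholding for initialization, a gradient step projected onto the tangent space of the rank-$r$ matrix manifold, and a truncated-SVD retraction. The function names (`riemannian_gradient_descent`, `hard_threshold`), the exact line search step size, and the iteration count are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def hard_threshold(X, r):
    """H_r(X): project X onto the set of rank-r matrices via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

def riemannian_gradient_descent(A, At, y, r, iters=300):
    """Sketch of Riemannian gradient descent for rank-r matrix recovery.

    A  : linear sensing operator, (m, n) matrix -> R^p
    At : its adjoint, R^p -> (m, n) matrix
    y  : measurements y = A(M) of the unknown rank-r matrix M
    (Illustrative sketch, not the paper's implementation.)
    """
    # One-step hard thresholding initialization: X_0 = H_r(A^*(y)).
    X = hard_threshold(At(y), r)
    for _ in range(iters):
        # Euclidean gradient of 0.5 * ||A(X) - y||^2.
        G = At(A(X) - y)
        # Project onto the tangent space of the rank-r manifold at X:
        # P_T(G) = P_U G + G P_V - P_U G P_V.
        U, _, Vt = np.linalg.svd(X, full_matrices=False)
        U, Vt = U[:, :r], Vt[:r, :]
        PUG = U @ (U.T @ G)
        PG = PUG + (G - PUG) @ (Vt.T @ Vt)
        # Exact line search along -P_T(G), then retract back to rank r
        # (the hard thresholding / subspace projection view of the step).
        APG = A(PG)
        alpha = np.vdot(PG, PG) / max(np.vdot(APG, APG), 1e-16)
        X = hard_threshold(X - alpha * PG, r)
    return X

# Example with a Gaussian sensing operator: p = 800 measurements of a
# 40 x 40 rank-2 matrix, well above the r(m + n - r) information limit.
rng = np.random.default_rng(0)
m, n, r, p = 40, 40, 2, 800
Amat = rng.standard_normal((p, m * n)) / np.sqrt(p)
A = lambda X: Amat @ X.ravel()
At = lambda z: (Amat.T @ z).reshape(m, n)
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
X_hat = riemannian_gradient_descent(A, At, A(M), r)
print(np.linalg.norm(X_hat - M) / np.linalg.norm(M))  # small relative error
```

Gaussian sensing operators of this kind are a standard model satisfying the restricted isometry property with high probability once $p$ is sufficiently large, which is the setting in which the linear convergence guarantee above applies.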