Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery


We propose a new Iteratively Reweighted Least Squares (IRLS) algorithm for the problem of recovering a matrix $X \in \mathbb{C}^{d_1 \times d_2}$ from incomplete linear observations by solving a sequence of quadratic problems. The easily implementable algorithm, which we call Harmonic Mean Iteratively Reweighted Least Squares ($\texttt{HM-IRLS}$), outperforms state-of-the-art algorithms for the low-rank recovery problem in several respects. More specifically, the strategy that $\texttt{HM-IRLS}$ uses to optimize a non-convex Schatten-$p$ penalization promoting low-rankness has three major strengths, in particular in the matrix completion setting. First, we show empirically that, starting from a trivial initialization, it converges globally to a low-rank minimizer of the non-convex Schatten-$p$ objective function in a large number of interesting cases for which other (non-)convex optimization approaches fail to recover a low-rank matrix. Secondly, as our main theoretical result, we prove that $\texttt{HM-IRLS}$, unlike other IRLS variants, exhibits a locally superlinear rate of convergence if the linear observations fulfill a suitable null space property. The rate is of order $2-p$, depending on the non-convexity parameter $p$, and thus arbitrarily close to quadratic for $p \ll 1$, which our numerical experiments confirm accurately. Thirdly, our experiments show that $\texttt{HM-IRLS}$ exhibits an empirical recovery probability close to $1$ even for sample complexities very close to the theoretical optimum, i.e., for significantly fewer linear observations than any other tractable approach in the literature requires.
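To make the iteration concrete, the following is a minimal numerical sketch of the IRLS idea described above for the matrix completion setting, using a harmonic mean of left and right Schatten-$p$ weight operators. It is an illustrative toy implementation under our own simplifying assumptions (dense Kronecker-product linear algebra, a simple smoothing-parameter schedule tied to the $(r+1)$-st singular value, a fixed iteration count), not the authors' reference code; all function names (`hm_irls`, `_mat_power`) are ours.

```python
import numpy as np

def _mat_power(A, t):
    """Power of a symmetric positive definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.clip(w, 1e-14, None) ** t) @ V.T

def hm_irls(M, mask, r, p=0.5, n_iter=30):
    """Toy IRLS for matrix completion with a harmonic-mean weight operator.

    M    : matrix whose entries on `mask` (boolean array) are observed
    r    : target rank used in the smoothing-parameter schedule
    p    : Schatten-p non-convexity parameter, 0 < p <= 1
    """
    d1, d2 = M.shape
    obs = mask.flatten(order='F')            # column-major vectorization
    x = np.zeros(d1 * d2)
    x[obs] = M.flatten(order='F')[obs]       # trivial init: observed entries, zeros elsewhere
    eps = 1.0
    for _ in range(n_iter):
        X = x.reshape((d1, d2), order='F')
        # Shrink the smoothing parameter with the (r+1)-st singular value
        # (a common IRLS schedule; floored to keep the systems well conditioned).
        s = np.linalg.svd(X, compute_uv=False)
        eps = max(min(eps, s[r]), 1e-6)
        # Inverses of the left/right Schatten-p weights:
        # W_L = (X X^T + eps^2 I)^{(p-2)/2},  W_R = (X^T X + eps^2 I)^{(p-2)/2}.
        WL_inv = _mat_power(X @ X.T + eps**2 * np.eye(d1), (2 - p) / 2)
        WR_inv = _mat_power(X.T @ X + eps**2 * np.eye(d2), (2 - p) / 2)
        # Harmonic mean 2 (A^{-1} + B^{-1})^{-1} of the left- and
        # right-multiplication weight operators, in Kronecker form.
        H = 2 * np.linalg.inv(np.kron(np.eye(d2), WL_inv) + np.kron(WR_inv, np.eye(d1)))
        # Quadratic problem: minimize x^T H x with observed entries held fixed,
        # i.e. solve for the free entries from the first-order conditions.
        free = ~obs
        x_new = x.copy()
        x_new[free] = -np.linalg.solve(H[np.ix_(free, free)], H[np.ix_(free, obs)] @ x[obs])
        x = x_new
    return x.reshape((d1, d2), order='F')
```

On small synthetic rank-$1$ completion problems this sketch typically drives the reconstruction error down rapidly once the smoothing parameter starts to shrink, mirroring the locally superlinear behavior discussed above; the full algorithm and its convergence guarantees are developed in the paper.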

In Proceedings of the 12th International Conference on Sampling Theory and Applications (SampTA), Tallinn, Estonia, pp. 489-493.
This work was later extended into a paper published in the Journal of Machine Learning Research (JMLR).