Understanding and Enhancing Data Recovery Algorithms: From Noise-Blind Sparse Recovery to Reweighted Methods for Low-Rank Matrix Optimization

Abstract

We prove new results about the robustness of noise-blind decoders for the problem of reconstructing a sparse vector from underdetermined linear measurements. Our results imply provable robustness of equality-constrained ℓ1-minimization for random measurements with heavy-tailed distributions and, furthermore, correspond to a generalization of bounds on inscribed bodies of random polytopes.
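For reference, the equality-constrained ℓ1 decoder in question is the basis pursuit program: given a measurement matrix A \in \mathbb{R}^{m \times n} with m < n and measurements y, it solves

  \min_{x \in \mathbb{R}^n} \|x\|_1 \quad \text{subject to} \quad Ax = y.

Noise-blindness here refers to the decoder using y as given, without any estimate of the noise level; robustness then means that the minimizer stays close to the sparse ground truth even when y carries unmodeled noise.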

We further propose a new algorithm for the reconstruction of low-rank matrices from few linear observations or from missing data, based on the iterative minimization of well-designed quadratic models of a non-convex objective. Our method, which is an instance of Iteratively Reweighted Least Squares (IRLS), is the first of its kind to combine data efficiency with computational scalability and fast local convergence rates. We show that the method attains a superlinear local convergence rate under near-optimal assumptions on the sample complexity for several random observation models. These theoretical statements are supported by computational experiments which suggest an improved data efficiency compared to the state-of-the-art. We provide an implementation of the proposed IRLS algorithm that resolves computational issues of previous related methods.
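To make the iteration concrete, the following is a minimal Python sketch of a generic IRLS scheme for matrix completion, in the spirit described above but not identical to the algorithm analyzed in the dissertation. Each step minimizes the quadratic model tr(X^T W X) of a smoothed nuclear norm subject to the data constraint, with the weight W = (X X^T + eps^2 I)^{-1/2} recomputed from the current iterate; the function name, the eps update rule, and all parameters are illustrative assumptions.

import numpy as np

def irls_completion(M_obs, mask, rank_est, eps0=1.0, n_iter=50, tol=1e-9):
    """Generic IRLS sketch for matrix completion (illustrative only).

    Each iteration minimizes the quadratic model tr(X^T W X) over matrices
    that match the observed entries, with the spectral weight
    W = (X X^T + eps^2 I)^{-1/2} taken from the current iterate X.
    """
    m, n = M_obs.shape
    X = np.where(mask, M_obs, 0.0)                # zero-filled initialization
    eps = eps0
    for _ in range(n_iter):
        # Weight from the current iterate: W = U diag(1/sqrt(s^2 + eps^2)) U^T
        U, s, _ = np.linalg.svd(X, full_matrices=True)
        s_full = np.zeros(m)
        s_full[:s.size] = s
        # One common heuristic (an assumption here): tie the smoothing to the
        # (rank_est+1)-th singular value so eps -> 0 as iterates become low-rank.
        eps = max(min(eps, s_full[rank_est]), 1e-12)
        W = (U / np.sqrt(s_full**2 + eps**2)) @ U.T
        # tr(X^T W X) = sum_j x_j^T W x_j decouples over columns; the free
        # (unobserved) entries of column j solve W_FF x_F = -W_FO x_O.
        X_new = np.where(mask, M_obs, 0.0)
        for j in range(n):
            obs, free = mask[:, j], ~mask[:, j]
            if obs.any() and free.any():
                X_new[free, j] = np.linalg.solve(
                    W[np.ix_(free, free)],
                    -W[np.ix_(free, obs)] @ M_obs[obs, j])
        if np.linalg.norm(X_new - X) <= tol * max(np.linalg.norm(X), 1.0):
            return X_new
        X = X_new
    return X

# Example usage: complete a random rank-2 matrix from ~60% of its entries.
rng = np.random.default_rng(0)
M = rng.standard_normal((30, 2)) @ rng.standard_normal((2, 30))
mask = rng.random(M.shape) < 0.6
X_hat = irls_completion(np.where(mask, M, 0.0), mask, rank_est=2)

Forming and factoring dense weight matrices per iteration, as in this naive sketch, is exactly the kind of computational burden that the implementation accompanying the dissertation is designed to overcome.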

Finally, we extend the framework to the completion of structured low-rank matrices such as low-rank Hankel or Toeplitz matrices. For this more flexible model, we propose an IRLS algorithm with a quadratic local convergence rate under weak assumptions on the number and distribution of the provided samples.
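To illustrate the structured model: a vector of n = d1 + d2 - 1 samples lifts to a d1 x d2 Hankel matrix H with H[i, j] = x[i + j], and a signal that is a sum of r complex exponentials yields a rank-r Hankel matrix, so recovering missing samples of such a signal becomes a structured low-rank completion problem. A minimal Python sketch (dimensions and frequencies are arbitrary illustrations):

import numpy as np
from scipy.linalg import hankel

# A sum of r = 2 complex exponentials sampled at n = 15 points
n, d1 = 15, 8
t = np.arange(n)
x = np.exp(2j * np.pi * 0.11 * t) + 0.5 * np.exp(2j * np.pi * 0.27 * t)

# Hankel lift: H[i, j] = x[i + j], shape (d1, n - d1 + 1)
H = hankel(x[:d1], x[d1 - 1:])
print(np.linalg.matrix_rank(H))  # prints 2: rank equals the number of exponentials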

Publication
Ph.D. Dissertation, Technical University of Munich, 2019