Non-Convex Optimization

Linear Convergence of Iteratively Reweighted Least Squares for Nuclear Norm Minimization

Low-rank matrix recovery problems are ubiquitous in many areas of science and engineering. One approach to solving these problems is Nuclear Norm Minimization, which is itself computationally challenging. Iteratively Reweighted Least Squares …
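
For intuition, here is a minimal sketch of an IRLS iteration for nuclear norm minimization, specialized to matrix completion; the function names, the column-wise solve, and the geometric smoothing schedule are illustrative choices, not taken from the paper.

```python
import numpy as np

def irls_nuclear_completion(M, mask, n_iter=60, eps_min=1e-9):
    """Sketch of IRLS for min ||X||_* s.t. X agrees with M on the mask.
    Each iteration solves a weighted least-squares problem whose weights
    come from the previous iterate (illustrative, not the paper's exact scheme)."""
    d1, d2 = M.shape
    X = np.where(mask, M, 0.0)
    eps = max(np.linalg.norm(X, 2), eps_min)    # smoothing parameter
    for _ in range(n_iter):
        # Weight matrix W = (X X^T + eps^2 I)^{-1/2}, computed via the SVD of X.
        U, s, _ = np.linalg.svd(X, full_matrices=True)
        s2 = np.zeros(d1)
        s2[: len(s)] = s**2
        W = (U / np.sqrt(s2 + eps**2)) @ U.T
        # Minimizing tr(X^T W X) under the entry constraints decouples over
        # columns: the free entries u of a column solve W_uu u = -W_uf f.
        for j in range(d2):
            obs = mask[:, j]
            free = ~obs
            if not obs.any():
                X[:, j] = 0.0
                continue
            if free.any():
                X[free, j] = -np.linalg.solve(
                    W[np.ix_(free, free)], W[np.ix_(free, obs)] @ M[obs, j]
                )
            X[obs, j] = M[obs, j]
        eps = max(0.9 * eps, eps_min)           # drive the smoothing to zero
    return X
```

The column-wise decoupling keeps the sketch short; practical implementations typically solve the full weighted least-squares step with iterative solvers instead of forming dense weight matrices.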

Learning Transition Operators From Sparse Space-Time Samples

We consider the nonlinear inverse problem of learning a transition operator $A$ from partial observations at $T$ different times, in the form of sparse observations of entries of the powers $A, A^2, \ldots, A^T$. We address the nonlinearity of this …
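
To make the observation model concrete, the sketch below samples sparse entries of the powers $A, A^2, \ldots, A^T$ and fits $A$ by plain gradient descent on the resulting least-squares objective. The data sizes, step size, and the use of vanilla gradient descent are our assumptions; the difficulty of this non-convex landscape is exactly what the paper's treatment of the nonlinearity addresses.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T, m = 8, 4, 60                                # operator size, time horizon, samples per time

A_true = rng.standard_normal((n, n)) / np.sqrt(n) # ground-truth transition operator

# Sparse space-time samples: randomly chosen entries of A^t for t = 1, ..., T.
samples = []
P = np.eye(n)
for t in range(1, T + 1):
    P = P @ A_true                                # P = A_true^t
    for k in rng.choice(n * n, size=m, replace=False):
        i, j = divmod(int(k), n)
        samples.append((t, i, j, P[i, j]))

def loss_and_grad(A):
    """Least-squares fit of the observed entries of the powers of A, with the
    exact gradient d(A^t)_{ij}/dA = sum_k (A^k)^T e_i e_j^T (A^{t-1-k})^T."""
    powers = [np.eye(n)]
    for _ in range(T):
        powers.append(powers[-1] @ A)
    loss, grad = 0.0, np.zeros_like(A)
    for t, i, j, y in samples:
        r = powers[t][i, j] - y
        loss += 0.5 * r * r
        for k in range(t):
            grad += r * np.outer(powers[k][i, :], powers[t - 1 - k][:, j])
    return loss, grad

A = 0.1 * rng.standard_normal((n, n))             # random initialization
for _ in range(300):
    loss, grad = loss_and_grad(A)
    A -= 0.02 * grad                              # small illustrative step; no guarantee
```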

Recovering Simultaneously Structured Data via Non-Convex Iteratively Reweighted Least Squares

We propose a new algorithm for the problem of recovering data that adheres to multiple, heterogeneous low-dimensional structures from linear observations. Focusing on data matrices that are simultaneously row-sparse and low-rank, we propose and …
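
A sketch of the two weight operators such an IRLS scheme combines: diagonal weights built from row norms (promoting row-sparsity) and spectral weights built from singular values (promoting low rank). The smoothing parameters and the way the weights would be combined are placeholders; the paper's actual weight operator differs.

```python
import numpy as np

def simultaneous_weights(X, eps_sp, eps_lr):
    """Weight operators for one IRLS step on a row-sparse and low-rank X
    (illustrative construction, not the paper's exact combination)."""
    d1 = X.shape[0]
    # Row-sparsity: smoothed inverse row norms, as in IRLS for l1/l2-type norms.
    row_norms = np.sqrt(np.sum(X**2, axis=1) + eps_sp**2)
    W_rows = np.diag(1.0 / row_norms)
    # Low-rankness: W = (X X^T + eps_lr^2 I)^{-1/2}, as in IRLS for the nuclear norm.
    U, s, _ = np.linalg.svd(X, full_matrices=True)
    s2 = np.zeros(d1)
    s2[: len(s)] = s**2
    W_rank = (U / np.sqrt(s2 + eps_lr**2)) @ U.T
    return W_rows, W_rank
```

Each IRLS iteration then minimizes a quadratic form such as $\mathrm{tr}(X^T (W_{\mathrm{rows}} + W_{\mathrm{rank}}) X)$ subject to the linear observations; how the two structures are balanced is precisely where such algorithms differ.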

On the Convergence of IRLS and Its Variants in Outlier-Robust Estimation

Outlier-robust estimation involves estimating some parameters (e.g., 3D rotations) from data samples in the presence of outliers, and is typically formulated as a non-convex and non-smooth problem. For this problem, the classical method called …
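
As a concrete instance of the classical IRLS method the abstract refers to, here is a sketch for robust linear regression with an $\ell_1$ loss; outlier-robust rotation estimation would replace the weighted least-squares step with a weighted rotation averaging step. The truncation threshold delta and the setup are our choices.

```python
import numpy as np

def irls_robust_regression(A, b, n_iter=50, delta=1e-8):
    """IRLS sketch for the non-smooth robust objective min_x sum_i |a_i^T x - b_i|:
    large residuals are downweighted, so outliers lose influence."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(n_iter):
        r = A @ x - b
        w = 1.0 / np.maximum(np.abs(r), delta)   # weights from current residuals
        Aw = A * w[:, None]                      # diag(w) @ A
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)  # weighted normal equations
    return x
```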

Global Linear and Local Superlinear Convergence of IRLS for Non-Smooth Robust Regression

We advance both the theory and practice of robust $\ell_p$-quasinorm regression for $p \in (0,1]$ by using novel variants of iteratively reweighted least-squares (IRLS) to solve the underlying non-smooth problem. In the convex case, $p=1$, we prove …
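
A minimal sketch of smoothed IRLS for the $\ell_p$-quasinorm objective with $p \in (0,1]$: the weights come from the current residuals and a smoothing parameter that is decreased across iterations. The geometric schedule below is illustrative; the paper's adaptive rules are what yield the stated convergence guarantees.

```python
import numpy as np

def irls_lp_regression(A, b, p=0.5, n_iter=80, eps0=1.0, eps_min=1e-10):
    """Sketch of smoothed IRLS for min_x ||Ax - b||_p^p, p in (0, 1]."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    eps = eps0
    for _ in range(n_iter):
        r = A @ x - b
        w = (r**2 + eps**2) ** (p / 2.0 - 1.0)   # IRLS weights for the l_p loss
        Aw = A * w[:, None]
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)
        eps = max(0.9 * eps, eps_min)            # drive the smoothing to zero
    return x
```

For $p = 1$ the weights reduce to (smoothed) inverse absolute residuals, recovering the classical robust-regression iteration.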

A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples

We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method or a variable metric proximal gradient method applied to a …
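
The rank-aware ingredient can be sketched in a few lines: the smoothing parameter of the quadratic model is tied to the $(r+1)$-st singular value of the current iterate, so the model stays faithful in the leading $r$ directions regardless of their conditioning. This is a sketch of the idea only; the paper's weight operator and its scalable solver are more refined.

```python
import numpy as np

def rank_aware_smoothing(X, r, eps_prev):
    """Couple the smoothing parameter to the (r+1)-st singular value of the
    iterate, never letting it increase (illustrative version of the rule)."""
    s = np.linalg.svd(X, compute_uv=False)
    return min(eps_prev, s[r]) if r < len(s) else eps_prev
```

With this smoothing level, a weighted least-squares step as in the completion sketch above completes one iteration; the scalability in this line of work comes, roughly, from solving that step iteratively while exploiting low-rank structure rather than forming dense weight matrices.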

Escaping Saddle Points in Ill-Conditioned Matrix Completion with a Scalable Second Order Method

We propose an iterative algorithm for low-rank matrix completion that can be interpreted as both an iteratively reweighted least squares (IRLS) algorithm and a saddle-escaping smoothing Newton method applied to a non-convex rank surrogate objective. …
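
For the smoothing-Newton reading, the underlying object is a smoothed non-convex rank surrogate. As a stand-in for the paper's surrogate, the sketch below uses a log-determinant of smoothed squared singular values, a common choice in this literature.

```python
import numpy as np

def smoothed_rank_surrogate(X, eps):
    """Log-det style rank surrogate: sum_i log(sigma_i(X)^2 + eps^2).
    As eps -> 0 its sublevel sets increasingly favor low-rank X, but the
    landscape is non-convex with saddle points, hence the need for a
    saddle-escaping method (surrogate choice here is illustrative)."""
    s = np.linalg.svd(X, compute_uv=False)
    return float(np.sum(np.log(s**2 + eps**2)))
```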

Understanding and Enhancing Data Recovery Algorithms - From Noise-Blind Sparse Recovery to Reweighted Methods for Low-Rank Matrix Optimization

We prove new results about the robustness of noise-blind decoders for the problem of reconstructing a sparse vector from underdetermined linear measurements. Our results imply provable robustness of equality-constrained $\ell_1$-minimization for random …
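
The noise-blind decoder in question can be written down directly. Here is a sketch of equality-constrained $\ell_1$-minimization (basis pursuit) via the standard linear-programming split $x = u - v$ with $u, v \ge 0$; the LP reformulation is standard, the solver choice is ours.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Equality-constrained l1-minimization: min ||x||_1 s.t. Ax = b,
    solved as an LP over the nonnegative variables (u, v) with x = u - v."""
    m, n = A.shape
    c = np.ones(2 * n)                  # objective: sum(u) + sum(v) = ||x||_1
    A_eq = np.hstack([A, -A])           # encodes A(u - v) = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v
```

Note that the decoder takes no noise-level parameter, which is what "noise-blind" refers to; the robustness results concern how its error degrades under measurement noise it was not told about.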

Completion of Structured Low-Rank Matrices via Iteratively Reweighted Least Squares

We propose a new Iteratively Reweighted Least Squares (IRLS) algorithm for the problem of completing a low-rank matrix that is linearly structured, e.g., that possesses a Hankel, Toeplitz or block-Hankel/Toeplitz structure, which is of relevance for …
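
The linear structure is what makes the problem special. Below is a sketch of the Hankel structure map together with the classical fact connecting it to low rank: a signal that is a sum of $r$ (damped) complex exponentials yields a Hankel matrix of rank $r$, which is how spectral estimation problems land in this setting.

```python
import numpy as np

def hankel_from_params(h, d1, d2):
    """Linear structure map: a vector h of length d1 + d2 - 1 determines
    the Hankel matrix H via H[i, j] = h[i + j]; a structured IRLS
    algorithm can therefore optimize over h instead of all d1*d2 entries."""
    i, j = np.indices((d1, d2))
    return h[i + j]

# A damped cosine is a sum of two complex exponentials, giving rank 2:
d1 = d2 = 16
t = np.arange(d1 + d2 - 1)
h = (0.95**t) * np.cos(0.4 * np.pi * t)
H = hankel_from_params(h, d1, d2)
print(np.linalg.matrix_rank(H))   # -> 2
```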

Denoising and Completion of Structured Low-Rank Matrices via Iteratively Reweighted Least Squares

We propose a new Iteratively Reweighted Least Squares (IRLS) algorithm for the problem of completing or denoising low-rank matrices that are structured, e.g., that possess a Hankel, Toeplitz or block-Hankel/Toeplitz structure. The algorithm optimizes …
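
For the denoising variant, the hard data constraint is replaced by a penalized fit. The sketch below shows the kind of objective over the structure parameters; the weighting lam and the exact rank surrogate are our placeholders, and the paper's IRLS minimizes such an objective through quadratic models rather than directly.

```python
import numpy as np

def structured_denoising_objective(h, y, idx, d1, d2, lam, eps):
    """Data fit on observed (noisy) entries plus a smoothed rank surrogate,
    both as functions of the Hankel parameter vector h; idx holds the flat
    indices of the observed entries (illustrative objective)."""
    i, j = np.indices((d1, d2))
    H = h[i + j]                                  # Hankel structure map
    fit = np.sum((H.ravel()[idx] - y) ** 2)
    s = np.linalg.svd(H, compute_uv=False)
    return lam * fit + float(np.sum(np.log(s**2 + eps**2)))
```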