Iteratively Reweighted Least Squares

Sample-Efficient Geometry Reconstruction from Euclidean Distances using Non-Convex Optimization

The problem of finding suitable point embeddings or geometric configurations given only Euclidean distance information for point pairs arises both as a core task and as a sub-problem in a variety of machine learning applications. In this paper, we aim …
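
To fix ideas, the classical multidimensional scaling (MDS) baseline below recovers a point configuration when the distance matrix is fully observed; the paper's subject, sample-efficient recovery from few distances via non-convex optimization, is precisely what this baseline cannot do. A minimal numpy sketch (function name and toy check are illustrative, not from the paper):

```python
import numpy as np

def classical_mds(D2, dim):
    """Recover a point configuration from a COMPLETE matrix of squared
    Euclidean distances D2 via classical multidimensional scaling.
    Baseline for the fully observed case only."""
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    G = -0.5 * J @ D2 @ J                      # Gram matrix of centered points
    vals, vecs = np.linalg.eigh(G)             # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:dim]         # keep the top `dim` components
    scale = np.sqrt(np.maximum(vals[idx], 0.0))
    return vecs[:, idx] * scale                # points as rows, up to rigid motion

# toy check: pairwise distances of random 2-D points are reproduced exactly
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 2))
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
Xhat = classical_mds(D2, 2)
D2hat = ((Xhat[:, None, :] - Xhat[None, :, :]) ** 2).sum(-1)
assert np.allclose(D2, D2hat, atol=1e-8)
```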

Linear Convergence of Iteratively Reweighted Least Squares for Nuclear Norm Minimization

Low-rank matrix recovery problems are ubiquitous in many areas of science and engineering. One approach to solving these problems is Nuclear Norm Minimization, which is itself computationally challenging. Iteratively Reweighted Least Squares …
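
The IRLS template the title refers to replaces the nuclear norm by quadratic problems with iteratively updated weights. Below is a minimal sketch of the textbook iteration for $\min \|X\|_*$ s.t. $\mathcal{A}(X) = y$, assuming a dense measurement matrix acting on the column-major vectorization; the smoothing schedule is a common heuristic, not the rule whose linear convergence the paper analyzes:

```python
import numpy as np

def irls_nuclear(A, y, shape, iters=50, eps=1.0):
    """Sketch of IRLS for min ||X||_* s.t. A @ vec(X) = y (column-major vec).

    The weight W = (X X^T + eps^2 I)^{-1/2} is recomputed from the current
    iterate; each step solves min vec(X)^T (I kron W) vec(X) s.t. A vec(X) = y
    in closed form: x = H^{-1} A^T (A H^{-1} A^T)^{-1} y with H = I kron W."""
    d1, d2 = shape
    x = np.linalg.lstsq(A, y, rcond=None)[0]        # least-norm feasible start
    for _ in range(iters):
        X = x.reshape(d1, d2, order="F")
        vals, U = np.linalg.eigh(X @ X.T + eps**2 * np.eye(d1))
        Winv = (U * np.sqrt(vals)) @ U.T            # (X X^T + eps^2 I)^{1/2}
        Hinv = np.kron(np.eye(d2), Winv)            # inverse of I kron W
        x = Hinv @ A.T @ np.linalg.solve(A @ Hinv @ A.T, y)
        # heuristic smoothing decrease tied to the smallest singular value
        eps = max(min(eps, np.linalg.svd(X, compute_uv=False).min() / 2), 1e-12)
    return x.reshape(d1, d2, order="F")
```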

Learning Transition Operators From Sparse Space-Time Samples

We consider the nonlinear inverse problem of learning a transition operator $A$ from partial observations at $T$ different times, in the form of sparse observations of entries of the powers $A, A^2, \ldots, A^T$. We address the nonlinearity of this …
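
Only the setup is excerpted above; as a plain baseline (not the authors' treatment of the nonlinearity), one can run gradient descent on the non-convex least-squares objective over the masked powers. A sketch, with the gradient of $A \mapsto A^{t+1}$ expanded by the product rule; all names and the untuned step size are illustrative:

```python
import numpy as np

def fit_transition_operator(masks, obs, n, T, steps=500, lr=1e-2):
    """Sketch: fit A to sparse observations of entries of A, A^2, ..., A^T
    by gradient descent on f(A) = 0.5 * sum_t ||P_t(A^{t+1}) - y_t||_F^2.

    masks[t], obs[t]: boolean mask and observed values for the power A^{t+1}.
    Plain first-order baseline, not the method of the paper."""
    A = 0.01 * np.random.default_rng(1).standard_normal((n, n))
    for _ in range(steps):
        P = [A]                                   # cache powers A^1 .. A^T
        for _t in range(1, T):
            P.append(P[-1] @ A)
        G = np.zeros_like(A)
        for t in range(T):
            R = np.where(masks[t], P[t] - obs[t], 0.0)   # masked residual
            # d(A^{t+1})[E] = sum_k A^k E A^{t-k}, so the gradient picks up
            # (A^T)^k R (A^T)^{t-k} for k = 0..t:
            for k in range(t + 1):
                left = P[k - 1].T if k > 0 else np.eye(n)
                right = P[t - k - 1].T if t - k > 0 else np.eye(n)
                G += left @ R @ right
        A -= lr * G
    return A
```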

Recovering Simultaneously Structured Data via Non-Convex Iteratively Reweighted Least Squares

We propose a new algorithm for the problem of recovering data that adheres to multiple, heterogeneous low-dimensional structures from linear observations. Focusing on data matrices that are simultaneously row-sparse and low-rank, we propose and …
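
One way to make "IRLS for simultaneous structures" concrete is to combine the standard row-sparsity weights with the standard low-rank weights in a single quadratic model. The sketch below simply sums the two weight operators; how to couple the structures correctly, and with which guarantees, is exactly what the paper itself addresses:

```python
import numpy as np

def irls_sparse_lowrank(A, y, shape, iters=50, eps=1e-1):
    """Illustrative IRLS step for row-sparse AND low-rank X with A vec(X) = y.

    Row-sparsity weights D = diag((||X_i||^2 + eps^2)^{-1/2}) and low-rank
    weights W = (X X^T + eps^2 I)^{-1/2} are summed into the quadratic model
    vec(X)^T (I kron (W + D)) vec(X); summing them is an assumption of this
    sketch, not the paper's construction."""
    d1, d2 = shape
    x = np.linalg.lstsq(A, y, rcond=None)[0]
    for _ in range(iters):
        X = x.reshape(d1, d2, order="F")
        vals, U = np.linalg.eigh(X @ X.T + eps**2 * np.eye(d1))
        W = (U / np.sqrt(vals)) @ U.T                       # low-rank weights
        D = np.diag(1.0 / np.sqrt((X**2).sum(1) + eps**2))  # row-sparsity weights
        Hinv = np.kron(np.eye(d2), np.linalg.inv(W + D))
        x = Hinv @ A.T @ np.linalg.solve(A @ Hinv @ A.T, y)
        eps *= 0.9                                          # simple smoothing decay (assumption)
    return x.reshape(d1, d2, order="F")
```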

On the Convergence of IRLS and Its Variants in Outlier-Robust Estimation

Outlier-robust estimation involves estimating some parameters (e.g., 3D rotations) from data samples in the presence of outliers, and is typically formulated as a non-convex and non-smooth problem. For this problem, the classical method called …
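
The classical method in question is IRLS with weights derived from a robust loss. On a linear-regression toy (rather than the rotation problems studied in the paper), the Huber-weighted iteration looks as follows; this is the textbook scheme, not the variants whose convergence the paper analyzes:

```python
import numpy as np

def irls_huber(X, y, delta=1.0, iters=100):
    """Classical IRLS for the Huber M-estimator in linear regression:
    residuals beyond `delta` are down-weighted by w_i = delta / |r_i|,
    so outliers contribute less to each weighted least-squares solve."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        r = y - X @ beta
        w = np.minimum(1.0, delta / np.maximum(np.abs(r), 1e-12))
        XtW = X.T * w                       # apply weights row-wise
        beta = np.linalg.solve(XtW @ X, XtW @ y)
    return beta
```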

Global Linear and Local Superlinear Convergence of IRLS for Non-Smooth Robust Regression

We advance both the theory and practice of robust $\ell_p$-quasinorm regression for $p \in (0,1]$ by using novel variants of iteratively reweighted least-squares (IRLS) to solve the underlying non-smooth problem. In the convex case, $p=1$, we prove …
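
The textbook IRLS iteration for $\ell_p$-quasinorm regression smooths the non-smooth objective with a parameter $\epsilon$ and solves one weighted least-squares problem per step. A minimal sketch; the annealing rule for $\epsilon$ is an assumption of this sketch rather than the schedule from the paper:

```python
import numpy as np

def irls_lp_regression(X, y, p=0.8, iters=100, eps=1.0):
    """IRLS sketch for min_beta ||y - X beta||_p^p with p in (0, 1].

    Each step solves a weighted least-squares problem with weights
    w_i = (r_i^2 + eps^2)^(p/2 - 1), the standard majorization of the
    smoothed lp-quasinorm objective."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        r = y - X @ beta
        w = (r**2 + eps**2) ** (p / 2 - 1)
        XtW = X.T * w
        beta = np.linalg.solve(XtW @ X, XtW @ y)
        eps = min(eps, np.abs(r).mean())    # simple annealing rule (assumption)
    return beta
```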

Iteratively Reweighted Least Squares for Basis Pursuit with Global Linear Convergence Rate

The recovery of sparse data is at the core of many applications in machine learning and signal processing. While such problems can be tackled using $\ell_1$-regularization as in the LASSO estimator and in the Basis Pursuit approach, specialized …
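
For basis pursuit, each IRLS step has a closed form. The sketch below follows the classical Daubechies–DeVore–Fornasier–Güntürk scheme, including its $\epsilon$-update with a sparsity estimate $s$; the global linear convergence rate is what the paper establishes on top of this iteration:

```python
import numpy as np

def irls_basis_pursuit(A, y, s, iters=100, eps=1.0):
    """IRLS sketch for basis pursuit: min ||x||_1 subject to A x = y.

    With D = diag(sqrt(x_i^2 + eps^2)), the weighted step
    min x^T D^{-1} x s.t. A x = y is solved in closed form:
    x = D A^T (A D A^T)^{-1} y."""
    n = A.shape[1]
    x = np.linalg.lstsq(A, y, rcond=None)[0]            # least-norm feasible start
    for _ in range(iters):
        d = np.sqrt(x**2 + eps**2)                      # inverse weights
        x = d * (A.T @ np.linalg.solve((A * d) @ A.T, y))
        # eps-update with sparsity estimate s: (s+1)-th largest entry over n
        eps = min(eps, np.sort(np.abs(x))[::-1][s] / n)
    return x
```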

A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples

We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method, or a variable metric proximal gradient method applied to a …
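
The paper's algorithm is second order; as a minimal first-order stand-in, one can run projected gradient descent on a smoothed Schatten-$p$ rank surrogate while keeping the observed entries fixed. The sketch below only illustrates the kind of surrogate being minimized, not the IRLS/smoothing-Newton method itself; names and step size are illustrative:

```python
import numpy as np

def smoothed_schatten_completion(Y, mask, p=0.5, eps=1e-1, steps=500, lr=1e-2):
    """First-order sketch on the smoothed Schatten-p rank surrogate
    f(X) = tr((X^T X + eps^2 I)^{p/2}), with observed entries held fixed.

    Gradient: grad f(X) = p * X (X^T X + eps^2 I)^{p/2 - 1}.
    Y holds the observed values on `mask`; other entries of Y are ignored."""
    X = np.where(mask, Y, 0.0)
    n = X.shape[1]
    for _ in range(steps):
        vals, V = np.linalg.eigh(X.T @ X + eps**2 * np.eye(n))
        G = p * X @ (V * vals ** (p / 2 - 1)) @ V.T
        X -= lr * G
        X = np.where(mask, Y, X)            # re-impose the observed entries
    return X
```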

Escaping Saddle Points in Ill-Conditioned Matrix Completion with a Scalable Second Order Method

We propose an iterative algorithm for low-rank matrix completion that can be interpreted as both an iteratively reweighted least squares (IRLS) algorithm and a saddle-escaping smoothing Newton method applied to a non-convex rank surrogate objective. …

Understanding and Enhancing Data Recovery Algorithms - From Noise-Blind Sparse Recovery to Reweighted Methods for Low-Rank Matrix Optimization

We prove new results about the robustness of noise-blind decoders for the problem of reconstructing a sparse vector from underdetermined linear measurements. Our results imply provable robustness of equality-constrained $\ell_1$-minimization for random …
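
Equality-constrained $\ell_1$-minimization, whose noise-blind robustness is analyzed here, is itself a linear program. A standard reformulation and solve via SciPy (the splitting $x = u - v$ is the usual textbook device):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit_lp(A, y):
    """Equality-constrained l1-minimization (basis pursuit) as a linear
    program: split x = u - v with u, v >= 0 and minimize sum(u + v)
    subject to A (u - v) = y."""
    m, n = A.shape
    c = np.ones(2 * n)                      # objective: sum of u and v
    A_eq = np.hstack([A, -A])               # A u - A v = y
    res = linprog(c, A_eq=A_eq, b_eq=y,
                  bounds=[(0, None)] * (2 * n), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v
```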