Sample-Efficient Geometry Reconstruction from Euclidean Distances using Non-Convex Optimization

The problem of finding suitable point embeddings or geometric configurations given only Euclidean distance information for pairs of points arises both as a core task and as a sub-problem in a variety of machine learning applications. In this paper, we aim …
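
The abstract is cut off here, but the classical baseline for this reconstruction task, recovering a point configuration (up to rigid motion) from its full pairwise distance matrix, is classical multidimensional scaling. A minimal NumPy sketch of that baseline, not of the paper's sample-efficient method; the function name and parameters are ours:

```python
import numpy as np

def classical_mds(D, dim=2):
    """Recover a point configuration (up to rigid motion) from a full
    matrix of pairwise Euclidean distances via classical multidimensional
    scaling: double centering followed by a truncated eigendecomposition."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ (D ** 2) @ J               # Gram matrix of centered points
    vals, vecs = np.linalg.eigh(B)            # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:dim]        # keep the top `dim` eigenpairs
    L = np.sqrt(np.maximum(vals[idx], 0.0))   # clip tiny negative eigenvalues
    return vecs[:, idx] * L                   # embedding, one point per row

# sanity check: embed 4 random planar points from their exact distances
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 2))
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
X_hat = classical_mds(D, dim=2)
D_hat = np.linalg.norm(X_hat[:, None, :] - X_hat[None, :, :], axis=-1)
```

With noiseless, fully observed distances the recovered configuration reproduces the input distances exactly; the harder regime of few sampled distances is what motivates the paper.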

Linear Convergence of Iteratively Reweighted Least Squares for Nuclear Norm Minimization

Low-rank matrix recovery problems are ubiquitous in many areas of science and engineering. One approach to solving these problems is Nuclear Norm Minimization, which is itself computationally challenging. Iteratively Reweighted Least Squares …
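
For readers unfamiliar with the scheme, the core IRLS idea for nuclear norm problems can be illustrated on the simplest (denoising) instance: minimize ||X - Y||_F^2 + lam * ||X||_* by replacing the nuclear norm with the smoothed surrogate tr(W X X^T), W = (X X^T + eps^2 I)^(-1/2), so that each iteration is a closed-form weighted least squares step. This is a hedged sketch of the general mechanism under these assumptions, not the constrained algorithm or the convergence analysis of the paper:

```python
import numpy as np

def irls_nuclear_denoise(Y, lam=1.0, eps=1e-2, iters=100):
    """IRLS sketch for min_X ||X - Y||_F^2 + lam * ||X||_*.
    The nuclear norm is smoothed via tr(W X X^T) with
    W = (X X^T + eps^2 I)^(-1/2); for fixed W the minimizer of the
    resulting quadratic is the closed form X = (I + lam W)^(-1) Y."""
    m = Y.shape[0]
    I = np.eye(m)
    X = Y.copy()
    for _ in range(iters):
        s, U = np.linalg.eigh(X @ X.T + eps**2 * I)  # SPD eigendecomposition
        W = (U / np.sqrt(s)) @ U.T                   # matrix inverse square root
        X = np.linalg.solve(I + lam * W, Y)          # weighted least squares step
    return X

# shrinkage sanity check on a random matrix
rng = np.random.default_rng(1)
Y = rng.standard_normal((8, 8))
X_hat = irls_nuclear_denoise(Y, lam=1.0)
nn = lambda M: np.linalg.svd(M, compute_uv=False).sum()  # nuclear norm
```

Because W is positive definite, each step contracts the iterate, so the output has strictly smaller nuclear norm than the input, which is the qualitative effect nuclear norm regularization is meant to have.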

Recovering Simultaneously Structured Data via Non-Convex Iteratively Reweighted Least Squares

We propose a new algorithm for the problem of recovering data that adheres to multiple, heterogeneous low-dimensional structures from linear observations. Focusing on data matrices that are simultaneously row-sparse and low-rank, we propose and …

On the Convergence of IRLS and Its Variants in Outlier-Robust Estimation

Outlier-robust estimation involves estimating some parameters (e.g., 3D rotations) from data samples in the presence of outliers, and is typically formulated as a non-convex and non-smooth problem. For this problem, the classical method called …
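
The best-known instance of IRLS in outlier-robust estimation is Weiszfeld's algorithm for the geometric median; the rotation setting of the paper is analogous but lives on a manifold. A sketch of the Euclidean special case (fixed iteration count and smoothing guard are our simplifications):

```python
import numpy as np

def geometric_median(P, iters=100, eps=1e-12):
    """Weiszfeld's algorithm: IRLS for min_x sum_i ||x - p_i||_2, the
    classical outlier-robust location estimate. Each step solves a
    weighted least squares problem with weights 1 / ||x - p_i||."""
    x = P.mean(axis=0)
    for _ in range(iters):
        d = np.linalg.norm(P - x, axis=1)
        w = 1.0 / np.maximum(d, eps)          # IRLS weights, guarded near 0
        x = (w[:, None] * P).sum(axis=0) / w.sum()
    return x

# nine samples at the origin plus one gross outlier: the mean is dragged
# to (10, 10), while the geometric median stays at the origin
P = np.vstack([np.zeros((9, 2)), [[100.0, 100.0]]])
m = geometric_median(P)
```

The non-smoothness at data points (where the weight blows up) is exactly the difficulty that the convergence analyses of IRLS variants have to address.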

Global Linear and Local Superlinear Convergence of IRLS for Non-Smooth Robust Regression

We advance both the theory and practice of robust $\ell_p$-quasinorm regression for $p \in (0,1]$ by using novel variants of iteratively reweighted least-squares (IRLS) to solve the underlying non-smooth problem. In the convex case, $p=1$, we prove …
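
In its simplest form, the IRLS scheme for such problems alternates residual-based reweighting with a weighted least squares solve. The NumPy sketch below (fixed smoothing parameter, no adaptive rules; parameter names are ours, and it is not the paper's variant) shows the convex case p = 1 on a line fit with one gross outlier:

```python
import numpy as np

def irls_lp_regression(A, b, p=1.0, iters=100, eps=1e-8):
    """IRLS sketch for min_x ||A x - b||_p^p with p in (0, 1]:
    reweight with w_i = max(|r_i|, eps)^(p - 2) and solve the
    weighted normal equations (A^T W A) x = A^T W b."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # least squares initialization
    for _ in range(iters):
        r = A @ x - b
        w = np.maximum(np.abs(r), eps) ** (p - 2)
        Aw = A * w[:, None]                   # rows of A scaled by weights
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)
    return x

# robust line fit: y = 1 + 2 t with one grossly corrupted sample
t = np.arange(6.0)
A = np.column_stack([np.ones_like(t), t])
b = 1.0 + 2.0 * t
b[5] = 100.0                                  # corrupt one observation
x_l1 = irls_lp_regression(A, b, p=1.0)
```

For p = 1 the smoothed objective is convex, so the iteration recovers the clean line despite the outlier; for p < 1 the problem becomes non-convex, which is where the paper's analysis comes in.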

Iteratively Reweighted Least Squares for Basis Pursuit with Global Linear Convergence Rate

The recovery of sparse data is at the core of many applications in machine learning and signal processing. While such problems can be tackled using $\ell_1$-regularization as in the LASSO estimator and in the Basis Pursuit approach, specialized …
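
The underlying IRLS idea for Basis Pursuit (min ||x||_1 subject to A x = b) replaces the ℓ1 norm by a sequence of smoothed weighted ℓ2 problems whose constrained minimizers are available in closed form. The sketch below uses a plain geometric decay of the smoothing parameter rather than an adaptive rule; it is illustrative, not the algorithm analyzed in the paper:

```python
import numpy as np

def irls_basis_pursuit(A, b, iters=100, eps=1.0, eps_min=1e-10):
    """IRLS sketch for min ||x||_1 subject to A x = b.
    Each step minimizes sum_i x_i^2 / d_i over {x : A x = b} with
    d_i = sqrt(x_i^2 + eps^2); the closed-form constrained minimizer
    is x = D A^T (A D A^T)^{-1} b with D = diag(d)."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # least squares initialization
    for _ in range(iters):
        d = np.sqrt(x**2 + eps**2)            # smoothed inverse weights
        AD = A * d[None, :]                   # A D
        x = d * (A.T @ np.linalg.solve(AD @ A.T, b))
        eps = max(eps * 0.5, eps_min)         # decay the smoothing parameter
    return x

# recover a 3-sparse vector from 30 random measurements in dimension 60
rng = np.random.default_rng(0)
m, n, k = 30, 60, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
b = A @ x0
x_hat = irls_basis_pursuit(A, b)
```

Each iterate satisfies A x = b exactly by construction; only the weighting changes, which is why every step reduces to a single m x m linear solve.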

A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples

We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method or a variable metric proximal gradient method applied to a …

Escaping Saddle Points in Ill-Conditioned Matrix Completion with a Scalable Second Order Method

We propose an iterative algorithm for low-rank matrix completion that can be interpreted as both an iteratively reweighted least squares (IRLS) algorithm and a saddle-escaping smoothing Newton method applied to a non-convex rank surrogate objective. …

Completion of Structured Low-Rank Matrices via Iteratively Reweighted Least Squares

We propose a new Iteratively Reweighted Least Squares (IRLS) algorithm for the problem of completing a low-rank matrix that is linearly structured, e.g., that possesses a Hankel, Toeplitz or block-Hankel/Toeplitz structure, which is of relevance for …
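
The reason this structure matters: a signal that is a short sum of complex exponentials yields a Hankel matrix of correspondingly low rank, which is the prior exploited in spectral estimation and system identification. A small self-contained illustration (NumPy only; the Hankel matrix is built by hand rather than with scipy.linalg.hankel):

```python
import numpy as np

def hankel_matrix(x, L):
    """L x (len(x)-L+1) Hankel matrix H with H[i, j] = x[i + j]."""
    n = len(x)
    return np.array([x[i:i + n - L + 1] for i in range(L)])

# a sum of two complex exponentials gives a Hankel matrix of rank 2:
# each exponential c * z**t contributes one rank-1 term (z**i)(z**j)^T
t = np.arange(20)
x = np.exp(1j * 0.3 * t) + 0.7 * np.exp(1j * 1.1 * t)
H = hankel_matrix(x, L=10)
s = np.linalg.svd(H, compute_uv=False)    # singular values, descending
```

Only the first two singular values are nonzero (up to floating point), so completing missing samples of x is a structured low-rank matrix completion problem of exactly the kind the paper addresses.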

Denoising and Completion of Structured Low-Rank Matrices via Iteratively Reweighted Least Squares

We propose a new Iteratively Reweighted Least Squares (IRLS) algorithm for the problem of completing or denoising low-rank matrices that are structured, e.g., that possess a Hankel, Toeplitz or block-Hankel/Toeplitz structure. The algorithm optimizes …