Speaker: Massimo Fornasier (Princeton University)

Title: A unified approach to iterative thresholding algorithms for sparse recovery

Abstract:

Since the work of Donoho and Johnstone, soft and hard thresholding operators have been extensively studied for the denoising of digital signals, mainly in a statistical framework. Usually associated with wavelet or curvelet expansions, thresholding makes it possible to eliminate those coefficients which encode noise components. The assumption is that signals are originally sparse with respect to such expansions and that the effect of noise is essentially to perturb the sparsity structure by introducing nonzero coefficients of relatively small magnitude. While simple, direct thresholding suffices for the statistical estimation of the relevant components of an explicitly given signal, and for discarding those considered disturbance, computing the sparse representation of a signal given only implicitly, as the solution of an operator equation or of an inverse problem, requires more sophistication. We refer, for instance, to deconvolution and superresolution problems, image recovery and enhancement, and problems arising in geophysics and brain imaging. In these cases, thresholding has been combined with classical Landweber iterations to compute the solutions.

In this talk we present a general theory of iterative thresholding algorithms which includes soft, hard, and the so-called firm thresholding operators. In particular, we develop a unified variational approach to such algorithms which allows for a complete characterization of their convergence properties. In fact, despite their simplicity, which makes them very appealing to users, and their enormous impact on applications, iterative thresholding algorithms converge very slowly and may be impracticable in certain situations. By analyzing their typical convergence dynamics we propose acceleration methods based (1) on projected gradient iterations and (2) on alternating subspace corrections (domain decompositions). For both of these latter families of algorithms as well, a variational approach is fundamental in order to correctly analyze the convergence.
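
For orientation, the following minimal Python sketch illustrates the three thresholding operators mentioned above and the basic thresholded Landweber iteration; it is not the speaker's exact formulation, and the matrix A, data y, thresholds, and step size are illustrative assumptions.

# A minimal sketch (illustrative, not the talk's exact algorithm) of iterative
# thresholding for sparse recovery from linear measurements y = A x.
import numpy as np

def soft(x, tau):
    # Soft thresholding: shrinks every coefficient toward zero by tau.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def hard(x, tau):
    # Hard thresholding: keeps coefficients above tau, zeroes the rest.
    return np.where(np.abs(x) > tau, x, 0.0)

def firm(x, tau1, tau2):
    # Firm thresholding: interpolates between soft (near tau1) and hard (beyond tau2).
    return np.where(np.abs(x) > tau2, x,
                    np.sign(x) * tau2 * np.maximum(np.abs(x) - tau1, 0.0) / (tau2 - tau1))

def iterative_thresholding(A, y, tau, thresh=soft, n_iter=500):
    # Landweber iteration combined with a thresholding operator:
    #   x_{k+1} = T( x_k + mu * A^T (y - A x_k) ),
    # with step size mu chosen so that mu * ||A||^2 <= 1.
    mu = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = thresh(x + mu * A.T @ (y - A @ x), mu * tau)
    return x

# Toy example (hypothetical data): approximately recover a sparse vector
# from a small number of random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200)) / np.sqrt(50)
x_true = np.zeros(200)
x_true[[3, 77, 150]] = [1.0, -2.0, 1.5]
y = A @ x_true
x_hat = iterative_thresholding(A, y, tau=0.01)

The accelerations discussed in the talk modify this basic loop, for instance by replacing the thresholding step with a projection in a projected gradient iteration, or by splitting the problem across subspaces in a domain decomposition; those variants are not reproduced in this sketch.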

The talk partially summarizes recent joint results with Ingrid Daubechies, Ron DeVore, Sinan Gunturk, and Holger Rauhut.