Authors: Alyson Fletcher (UCLA), Parthe Pandit (UCLA), Sundeep Rangan (NYU), Subrata Sarkar (The Ohio State University), Phil Schniter (The Ohio State University)
Abstract:
Estimating a vector $\mathbf{x}$ from noisy linear measurements $\mathbf{Ax+w}$ often requires the use of prior knowledge or structural constraints on $\mathbf{x}$ for accurate reconstruction. Several recent works have considered combining linear least-squares estimation with a generic or plug-in ``denoiser'' function that can be designed in a modular manner based on the prior knowledge about $\mathbf{x}$. While these methods have shown excellent performance, it has been difficult to obtain rigorous performance guarantees. This work considers plug-in denoising combined with the recently developed Vector Approximate Message Passing (VAMP) algorithm, which is itself derived via Expectation Propagation techniques. It is shown that the mean squared error of this ``plug-in'' VAMP can be exactly predicted for a large class of high-dimensional random matrices $\mathbf{A}$ and denoisers. The method is illustrated in image reconstruction and parametric bilinear estimation.