Bilinear Compressed Sensing under Known Signs via Convex Programming

2019 
We consider the bilinear inverse problem of recovering two vectors, $\boldsymbol{x} \in\mathbb{R}^L$ and $\boldsymbol{w} \in\mathbb{R}^L$, from their entrywise product. We consider the case where $\boldsymbol{x}$ and $\boldsymbol{w}$ have known signs and are sparse with respect to known dictionaries of size $K$ and $N$, respectively. Here, $K$ and $N$ may be larger than, smaller than, or equal to $L$. We introduce $\ell_1$-BranchHull, a convex program posed in the natural parameter space that requires neither an approximate solution nor an initialization in order to be stated or solved. Under the assumptions that $\boldsymbol{x}$ and $\boldsymbol{w}$ satisfy a comparable-effective-sparsity condition and are $S_1$- and $S_2$-sparse with respect to a random dictionary, we present a recovery guarantee in the noisy case. We show that $\ell_1$-BranchHull is robust to small dense noise with high probability if the number of measurements satisfies $L\geq\Omega\left((S_1+S_2)\log^{2}(K+N)\right)$. Numerical experiments show that the scaling constant in the theorem is not too large. We also introduce variants of $\ell_1$-BranchHull for tolerating noise and outliers and for recovering piecewise constant signals. We provide an ADMM implementation of these variants and show that they can extract piecewise constant behavior from real images.
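As a concrete illustration of the measurement model in the abstract, the following NumPy sketch generates two dictionary-sparse signals, forms their entrywise product, and checks the sample-complexity condition $L \geq (S_1+S_2)\log^2(K+N)$ up to constants. The symbols $B$, $C$, $h$, $m$ and the Gaussian dictionary choice are illustrative assumptions, not necessarily the paper's exact notation or setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem sizes: L measurements, dictionaries with K and N columns.
L, K, N = 256, 64, 64
S1, S2 = 4, 4  # sparsity levels of the two coefficient vectors

# Random (here Gaussian) dictionaries, as in the random-dictionary model.
B = rng.standard_normal((L, K))
C = rng.standard_normal((L, N))

# Sparse coefficient vectors: h synthesizes w, m synthesizes x.
h = np.zeros(K)
h[rng.choice(K, S1, replace=False)] = rng.standard_normal(S1)
m = np.zeros(N)
m[rng.choice(N, S2, replace=False)] = rng.standard_normal(S2)

w = B @ h   # first signal
x = C @ m   # second signal
y = w * x   # observed entrywise product

# The "known signs" side information: the sign patterns of w and x.
s_w = np.sign(w)
s_x = np.sign(x)

# Sanity checks on the model: |y| factors as |w|*|x|, and signs multiply.
assert np.allclose(np.abs(y), np.abs(w) * np.abs(x))
assert np.allclose(np.sign(y), s_w * s_x)

# Measurement-count condition from the recovery theorem (up to constants):
# L >= C * (S1 + S2) * log^2(K + N).
print(L >= (S1 + S2) * np.log(K + N) ** 2)  # True for these sizes
```

Given $y$, the dictionaries, and the sign patterns, the recovery task is to find the sparse pair $(h, m)$; the $\ell_1$-BranchHull program does this by convexifying the bilinear constraint using the known signs.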