Learned deconvolution using physics priors for structured light-sheet microscopy

2021 
Structured propagation-invariant light fields, such as the Airy and Bessel beams, can encode high-resolution spatial information over an extended field of view. Their use in microscopy, however, has been limited due to the need for deconvolution, a challenging inverse problem. Here, we introduce a deep learning method that can deconvolve and super-resolve structured light-sheet images using such fields without the need for paired experimental data. We make use of the known physics of light propagation by constraining a generative adversarial network with estimated, simulated image data. We combine this with unpaired experimental data via a saliency constraint based on perceptual loss. The combined model results in an experimentally unsupervised network that is robust and lightweight, and that can be trained solely on a few regions of interest from one light-sheet volume. We demonstrate its performance on Airy light-sheet volumes of calibration beads, oocytes, preimplantation embryos, and excised brain tissue. This democratises the use of structured light fields and deconvolution, as it does not require data acquisition outwith the conventional imaging protocol.
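The abstract describes combining a physics-constrained adversarial objective on simulated light-sheet data with a perceptual ("saliency") loss on unpaired experimental data. Below is a minimal, hypothetical sketch of that loss combination; the network architectures, feature extractor, and loss weights are illustrative assumptions, not the paper's actual implementation.

```python
# Sketch: generator trained with (i) adversarial + pixel losses on pairs simulated
# from the known light-propagation physics and (ii) a perceptual loss on unpaired
# experimental images. All names and weights here are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Toy deconvolution network: structured light-sheet image -> deblurred estimate."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )
    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Toy discriminator enforcing the adversarial constraint on simulated pairs."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),
        )
    def forward(self, x):
        return self.net(x)

# Frozen stand-in feature extractor for the perceptual loss (a pretrained network
# is typically used; a small frozen CNN keeps this sketch self-contained).
feature_net = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
).eval()
for p in feature_net.parameters():
    p.requires_grad_(False)

def perceptual_loss(a, b):
    """L2 distance between frozen feature maps of two images."""
    return F.mse_loss(feature_net(a), feature_net(b))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

# Dummy batches: simulated pairs from the light-propagation model,
# plus unpaired experimental regions of interest.
sim_blurred = torch.rand(4, 1, 64, 64)   # simulated object convolved with Airy PSF
sim_truth   = torch.rand(4, 1, 64, 64)   # corresponding simulated ground truth
exp_image   = torch.rand(4, 1, 64, 64)   # unpaired experimental light-sheet data

# Discriminator step (simulated pairs only).
opt_d.zero_grad()
d_real = D(sim_truth)
d_fake = D(G(sim_blurred).detach())
loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
loss_d.backward()
opt_d.step()

# Generator step: adversarial + pixel losses on simulated pairs,
# perceptual (saliency) constraint on unpaired experimental data.
opt_g.zero_grad()
fake = G(sim_blurred)
d_out = D(fake)
loss_adv = bce(d_out, torch.ones_like(d_out))
loss_pix = F.l1_loss(fake, sim_truth)
loss_sal = perceptual_loss(G(exp_image), exp_image)  # preserve salient experimental content
loss_g = loss_adv + 10.0 * loss_pix + 1.0 * loss_sal  # weights are illustrative only
loss_g.backward()
opt_g.step()
```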