Photorealistic Reconstruction of Visual Texture From EEG Signals

2021 
Recent advances in brain decoding have made it possible to classify image categories based on neural activity. Increasing numbers of studies have further attempted to reconstruct the image itself. However, because images of objects and scenes inherently involve spatial layout information, such reconstruction usually requires retinotopically organized neural data with high spatial resolution, such as fMRI signals. In contrast, spatial layout does not matter in the perception of 'texture', which is known to be represented as spatially global image statistics in the visual cortex. This property of 'texture' enables us to reconstruct the perceived image from EEG signals, which have low spatial resolution. Here, we propose an MVAE-based approach for reconstructing texture images from visual evoked potentials measured from observers viewing natural textures, such as the textures of various surfaces and object ensembles. This approach allowed us to reconstruct images that perceptually resemble the original textures with a photographic appearance. A subsequent analysis of the dynamic development of the internal texture representation in the VGG network showed that the reproducibility of texture rapidly improves at a latency of 200 ms in the lower layers but improves more gradually in the higher layers. The present approach can be used as a method for decoding the highly detailed 'impression' of sensory stimuli from brain activity.
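To make the abstract's pipeline concrete, the sketch below shows the general shape of a VAE that maps EEG epochs to reconstructed texture images. It is a minimal illustration only, assuming MVAE denotes a multimodal variational autoencoder, PyTorch as the framework, and arbitrary channel counts, epoch lengths, and layer sizes; the full model in the paper would also include an image-side encoder used jointly during training, which is omitted here.

import torch
import torch.nn as nn

class EEGEncoder(nn.Module):
    """Encodes an EEG epoch into the mean/log-variance of a Gaussian latent."""
    def __init__(self, n_channels=64, n_samples=256, z_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_channels * n_samples, 512), nn.ReLU(),
        )
        self.mu = nn.Linear(512, z_dim)
        self.logvar = nn.Linear(512, z_dim)

    def forward(self, x):
        h = self.net(x)
        return self.mu(h), self.logvar(h)

class ImageDecoder(nn.Module):
    """Decodes a latent vector into a 64x64 RGB texture image."""
    def __init__(self, z_dim=128):
        super().__init__()
        self.fc = nn.Linear(z_dim, 256 * 8 * 8)
        self.deconv = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, 2, 1), nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, 2, 1), nn.Sigmoid(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 256, 8, 8)
        return self.deconv(h)

def reparameterize(mu, logvar):
    # Standard VAE reparameterization trick: z = mu + sigma * eps
    std = torch.exp(0.5 * logvar)
    return mu + std * torch.randn_like(std)

# Illustrative usage: encode a batch of EEG epochs and decode texture images.
eeg = torch.randn(16, 64, 256)          # (batch, channels, time samples)
enc, dec = EEGEncoder(), ImageDecoder()
mu, logvar = enc(eeg)
z = reparameterize(mu, logvar)
recon = dec(z)                          # (16, 3, 64, 64) reconstructed textures

Because texture perception depends on spatially global image statistics rather than precise layout, a latent of this kind learned from low-spatial-resolution EEG can still support perceptually convincing reconstructions, which is the key point of the abstract.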