Interactions between auditory statistics processing and visual experience emerge only in late development

2021 
The human auditory system relies on both detailed and summarized representations to recognize different sounds. Because local features can exceed storage capacity, average statistics are computed over time to generate more compact representations, at the expense of the availability of temporal detail. This study aimed to identify whether these fundamental sound analyses develop and function exclusively under the influence of the auditory system or interact with other modalities, such as vision. We employed a validated computational synthesis approach that allows direct control of the statistical properties embedded in sounds. To address whether the two modes of auditory representation (local feature processing and statistical averaging) are influenced by the availability of visual input at different phases of development, we tested samples of sighted controls (SC), congenitally blind (CB), and late-onset (> 10 years of age) blind (LB) individuals in two separate experiments that inferred auditory statistics computations from behavioral performance. In experiment 1, performance relied on the availability of local features at specific time points; in experiment 2, performance benefited from computing average statistics over longer durations. As expected, as sound duration increased, detailed representations gave way to summary statistics in SC. In both experiments, the CB sample displayed remarkably similar performance, revealing that neither local nor global auditory processing is altered by blindness from birth. Conversely, LB individuals performed poorly compared with the other groups when relying on local features, with no impact on statistical averaging. This performance decrement was not associated with the onset or duration of visual deprivation. The results provide clear evidence that vision is not necessary for the development of the auditory computations tested here. Remarkably, a functional interplay between the processing of acoustic details and vision emerges at later developmental phases. The findings are consistent with a model in which the efficiency of local auditory processing is vulnerable if sight becomes unavailable. Ultimately, the results favor a shared computational framework for auditory and visual processing of local features, one that emerges in late development.
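
To make the contrast between local features and time-averaged statistics concrete, the sketch below (in Python) measures a simple set of summary statistics from a sound: band envelopes are extracted and then collapsed over time into means, variances, and cross-band correlations. This is a minimal illustration under assumed choices (the envelope_statistics name, band count, and filter design are all hypothetical), not the paper's validated synthesis procedure, which imposes target statistics on synthetic sounds rather than merely measuring them.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt, hilbert

    def envelope_statistics(sound, fs, n_bands=8, fmin=100.0, fmax=6000.0):
        """Summarize a sound by time-averaged statistics of its band envelopes.

        Filter the waveform into log-spaced frequency bands (a rough
        stand-in for cochlear filtering), extract amplitude envelopes,
        and collapse the time axis into a few summary numbers.
        """
        edges = np.geomspace(fmin, fmax, n_bands + 1)
        envelopes = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
            band = sosfiltfilt(sos, sound)
            envelopes.append(np.abs(hilbert(band)))  # amplitude envelope
        env = np.stack(envelopes)  # shape: (n_bands, n_samples)

        # Averaging over time discards local temporal detail but yields
        # a compact description whose size is independent of duration.
        return {
            "mean": env.mean(axis=1),
            "var": env.var(axis=1),
            "corr": np.corrcoef(env),  # cross-band envelope correlations
        }

    # Toy usage: 2 s of noise as a stand-in "texture".
    fs = 16000
    noise = np.random.randn(fs * 2)
    stats = envelope_statistics(noise, fs)
    print(stats["mean"].shape, stats["corr"].shape)  # (8,) (8, 8)

The longer the excerpt, the more such statistics converge, so two different excerpts of the same texture become statistically interchangeable; this mirrors the trade-off described above, in which compact summary representations come at the cost of access to local temporal detail.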