Distinct Cortical Pathways for Processing Tool versus Animal Sounds

2005 
Human listeners can effortlessly categorize a wide range of environmental sounds. Whereas categorizing visual object classes (e.g., faces, tools, houses) preferentially activates different regions of visually sensitive cortex, it is not known whether the auditory system exhibits a similar organization for different categories of complex sounds outside of human speech. Using functional magnetic resonance imaging, we show that hearing and categorizing (correctly or incorrectly) animal vocalizations, as opposed to hand-manipulated tool sounds, preferentially activated middle portions of the left and right superior temporal gyri (mSTG). On average, the vocalization sounds had much greater harmonic and phase-coupling content (acoustically similar to human speech sounds), which may represent some of the signal attributes that preferentially activate the mSTG regions. In contrast, correctly categorized tool sounds (and even animal sounds miscategorized as tool-related) preferentially activated a widespread, predominantly left-hemisphere cortical “mirror network.” This network directly overlapped substantial portions of the motor-related cortices that were independently activated when participants pantomimed tool manipulations with their right (dominant) hand. These data suggest that recognition of some sounds involves a causal reasoning mechanism (a high-level auditory “how” pathway), automatically evoked when attending to hand-manipulated tool sounds, that associates the sounds with the dynamic motor actions likely to have produced them.
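
The abstract does not specify how the harmonic or phase-coupling content of the sounds was quantified. As an illustration only, the Python sketch below computes one common harmonicity measure, an autocorrelation-based harmonics-to-noise ratio (HNR); the function name harmonicity and the pitch-search bounds fmin/fmax are assumptions, and quadratic phase coupling (typically assessed with bispectral analysis) is not shown here.

import numpy as np

def harmonicity(signal, sample_rate, fmin=75.0, fmax=600.0):
    """Rough harmonics-to-noise ratio (dB) from the peak of the
    normalized autocorrelation within a plausible pitch range.
    Illustrative only; not the analysis used in the study."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]   # non-negative lags
    ac = ac / ac[0]                                      # normalize to lag 0
    lo = int(sample_rate / fmax)                         # shortest period searched
    hi = int(sample_rate / fmin)                         # longest period searched
    r = np.clip(np.max(ac[lo:hi]), 1e-6, 1 - 1e-6)       # strongest periodicity
    return 10.0 * np.log10(r / (1.0 - r))                # HNR in dB

# A strongly harmonic signal (vowel-like tone complex) scores far higher
# than broadband noise, mirroring the vocalization-versus-tool contrast.
sr = 16000
t = np.arange(sr) / sr
tone = sum(np.sin(2 * np.pi * 200 * k * t) / k for k in range(1, 6))
noise = np.random.randn(sr)
print(round(harmonicity(tone, sr), 1))    # large positive dB value
print(round(harmonicity(noise, sr), 1))   # near or below 0 dB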