Steering of audio input in hearing aids by eye gaze through electrooculography

2017 
The behavior of a person during a conversation typically involves both auditory and visual attention. Visual attention implies that the person directs their eye gaze towards the sound target of interest, and hence detection of the gaze may provide a steering signal for future hearing aids. Identification of the sound target of interest could be used to steer a beamformer or to select a specific audio stream from a set of remote microphones. We have previously shown that in-ear electrodes can be used to identify eye gaze through electrooculography (EOG) in offline recordings. However, additional studies are needed to explore the precision and real-time feasibility of the methodology. To evaluate the methodology, we performed a test with hearing-impaired subjects seated with their heads fixed in front of three targets positioned at −30°, 0°, and +30° azimuth. Each target presented speech from the Danish DAT material, which was available for direct input to the hearing aid through head-related transfer functions. Speech intelligibility was measured in three conditions: a reference condition without any steering; an ideal condition with steering based on an eye-tracking camera; and a condition in which eye gaze was estimated from EarEOG measurements to select the desired audio stream. The capabilities and limitations of the methods are discussed.
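To make the stream-selection idea concrete, the sketch below shows one simple way such steering could work: map a horizontal EOG amplitude to a gaze angle via a calibration gain, then pick the nearest of the three target azimuths. This is a minimal illustration only; the abstract does not describe the actual signal processing, and the function names, calibration gain, and signal values here are assumptions, not the authors' method.

import numpy as np

# Target talker positions from the experiment described above (degrees azimuth).
TARGET_AZIMUTHS = (-30.0, 0.0, 30.0)

def estimate_gaze_azimuth(eog_uv, uv_per_degree=10.0):
    """Map a window of horizontal EOG samples (microvolts) to a gaze angle.

    Horizontal EOG amplitude grows roughly linearly with eye rotation;
    uv_per_degree stands in for a per-subject calibration constant
    (the value used here is an assumption for illustration).
    """
    return float(np.median(eog_uv)) / uv_per_degree

def select_stream(eog_uv):
    """Select the audio stream whose azimuth is closest to the estimated gaze."""
    gaze = estimate_gaze_azimuth(eog_uv)
    idx = int(np.argmin([abs(gaze - az) for az in TARGET_AZIMUTHS]))
    return idx, TARGET_AZIMUTHS[idx]

# Example: a smoothed EOG window near +290 uV would suggest the listener
# is looking toward the +30 degree talker, so that stream is selected.
window = 290.0 + 5.0 * np.random.randn(500)
stream_index, azimuth = select_stream(window)
print(f"selected stream {stream_index} at {azimuth:+.0f} degrees")

In a real-time system the same decision would be recomputed on successive short windows, with smoothing or hysteresis to avoid switching streams on blinks and saccades; those details are beyond what the abstract specifies.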