Interactive sonification of U-depth images in a navigation aid for the visually impaired

2018 
In this paper we propose an electronic travel aid system for the visually impaired that utilizes interactive sonification of U-depth maps of the environment. The system comprises a depth sensor connected to a mobile device and a dedicated application that segments depth images and converts them into sounds in real time. An important feature of the system is that the user can interactively select the 3D scene region for sonification with simple touch gestures on the mobile device screen. The sonification scheme uses stereo panning to encode the azimuth of scene objects, loudness to encode their size, and frequency to encode their distance. Such a sonic representation of 3D scenes allows the user to identify the geometric structure of the environment and determine the distances to potential obstacles. The prototype application was tested by three visually impaired users, who successfully performed indoor mobility tasks. The system's usefulness was evaluated quantitatively by means of system usability and task-related questionnaires.
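The sonification mapping described above can be sketched as a small function that turns a segmented scene region into stereo sound parameters. This is a minimal illustration, not the paper's implementation: the function name, the frequency range, the maximum distance, and the polarity of the distance-to-pitch mapping (here, nearer objects sound higher) are all assumptions.

```python
import math

def sonify_region(azimuth_deg, size_frac, distance_m,
                  max_distance=5.0, f_min=200.0, f_max=2000.0):
    """Map one segmented depth-image region to stereo sound parameters.

    azimuth_deg -- horizontal angle of the region centre, -90 (left) .. +90 (right)
    size_frac   -- region size as a fraction of the field of view, 0 .. 1
    distance_m  -- distance to the region, clipped to max_distance

    Returns (gain_left, gain_right, amplitude, frequency_hz).
    All numeric ranges are illustrative assumptions.
    """
    # Equal-power stereo panning: azimuth -90 -> fully left, +90 -> fully right.
    theta = (azimuth_deg / 90.0 + 1.0) * math.pi / 4.0  # 0 .. pi/2
    gain_left, gain_right = math.cos(theta), math.sin(theta)

    # Loudness encodes object size.
    amplitude = max(0.0, min(1.0, size_frac))

    # Frequency encodes distance (assumed polarity: nearer -> higher pitch).
    d = max(0.0, min(distance_m, max_distance)) / max_distance
    frequency = f_max - d * (f_max - f_min)

    return gain_left, gain_right, amplitude, frequency
```

For example, a large nearby object straight ahead would be rendered as a loud, high-pitched tone with equal left/right gain, while a small distant object on the far left would be a quiet, low tone panned fully left.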