ANA: a virtual assistant that sees and hears to help tetraplegic online learning

2019 
To use a computer, users with spinal cord injury must employ special tools that, in most cases, demand great effort. In online courses, such tools often become a distraction, which can hinder the learning process. Solutions such as tongue-operated mice, smart glasses, and computer-vision systems, although promising, still face considerable usability problems. Our conversational agent ANA, which can listen and talk to the student, has proven quite promising [1]. Tests have shown that, while using our assistant, tetraplegic students were able to focus more on their studies than on the system’s interface. In this paper, we present the latest advances in ANA’s architecture, introducing a computational grammar that allows “learning objects” (LOs) to be inserted into online courses; students can talk to these LOs in a natural way, and the LOs respond by performing the requested action. We present the grammar used for processing commands in natural language and show how it is integrated with Google’s conversational AI assistant.
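As a rough illustration of the idea of a command grammar for learning objects, the sketch below maps a transcribed student utterance to an (LO, action) pair. The rule patterns, LO identifiers, and action names here are illustrative assumptions, not ANA's actual grammar or its Dialogflow integration:

```python
import re

# Hypothetical command grammar: each pattern maps a spoken request
# to a (learning object, action) pair. The LO ids and actions below
# are invented for illustration only.
GRAMMAR = {
    r"\b(play|start)\b.*\bvideo\b": ("video_player", "play"),
    r"\b(pause|stop)\b.*\bvideo\b": ("video_player", "pause"),
    r"\bnext\b.*\b(slide|page)\b": ("slide_deck", "next"),
    r"\b(previous|back)\b.*\b(slide|page)\b": ("slide_deck", "previous"),
}

def interpret(utterance: str):
    """Map a transcribed utterance to an (LO, action) pair, or None."""
    text = utterance.lower()
    for pattern, command in GRAMMAR.items():
        if re.search(pattern, text):
            return command
    return None  # no LO command recognized; hand off to general dialogue

print(interpret("please play the video"))  # ('video_player', 'play')
print(interpret("go to the next slide"))   # ('slide_deck', 'next')
```

In a full system the pattern matching would be replaced by the assistant's intent recognition, with each recognized intent dispatched to the corresponding LO embedded in the course page.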