Autonomous Multi-Sensory Robotic Assistant for a Drinking Task

2019 
Assistive robots have the potential to support people with disabilities in their Activities of Daily Living. The drinking task has a high priority and requires constant caregiver assistance to be performed regularly. Due to incapacitating disabilities such as tetraplegia, the paralysis of all four limbs, affected people cannot use classic control interfaces such as joysticks. This paper presents a robotic solution that enables independent, straw-less drinking using a smart cup and no elements physically attached to the user. The system's hardware and software components are presented and the overarching control scheme is described. The cup approaches the mouth using a user-friendly, vision-based robot control driven by head pose estimation. Once contact has been established, the user can drink by tilting the cup through a force sensor-based control setup. Two experimental studies were conducted in which the participants (mostly able-bodied, plus one tetraplegic) could separately experience the cup's contactless approach and the contact-based drinking sequence. Initial results show high user acceptance and consistently positive feedback. The evaluation of internal data showed high reliability of the safety-critical components, with the test groups perceiving the system as intuitive and easy to use.
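The abstract describes a two-phase control scheme: a contactless, vision-guided approach driven by head pose estimation, followed by a force sensor-based tilt sequence once the cup touches the user's lips. The following Python sketch illustrates one way such a phase-switching controller could be structured; it is not the authors' implementation, and all function names, thresholds, and gains are hypothetical placeholders.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Phase(Enum):
    APPROACH = auto()   # contactless: servo cup toward estimated mouth pose
    DRINKING = auto()   # in contact: user modulates tilt via sensed force
    RETREAT = auto()    # safety fallback / end of sip


@dataclass
class HeadPose:
    mouth_position: tuple  # (x, y, z) in the robot base frame, from vision
    confidence: float      # estimator confidence in [0, 1]


# Hypothetical thresholds and gains, chosen only for illustration.
CONTACT_FORCE_N = 2.0      # force indicating lip contact
MAX_SAFE_FORCE_N = 8.0     # safety limit that triggers a retreat
APPROACH_GAIN = 0.3        # proportional gain for the approach phase


def control_step(phase, head_pose, cup_position, contact_force):
    """One cycle of the assumed phase-switching controller.

    Returns the next phase and a simple velocity/tilt command.
    """
    # Safety first: excessive force always triggers a retreat.
    if contact_force > MAX_SAFE_FORCE_N:
        return Phase.RETREAT, {"retreat": True}

    if phase is Phase.APPROACH:
        # Contact detected: hand over to the force-based drinking phase.
        if contact_force > CONTACT_FORCE_N:
            return Phase.DRINKING, {"tilt_rate": 0.0}
        # Otherwise servo toward the mouth position from head pose estimation.
        error = [m - c for m, c in zip(head_pose.mouth_position, cup_position)]
        velocity = [APPROACH_GAIN * e * head_pose.confidence for e in error]
        return Phase.APPROACH, {"velocity": velocity}

    if phase is Phase.DRINKING:
        # The user drinks by pressing against the cup; force above the contact
        # threshold is mapped to a tilt rate (placeholder linear mapping).
        tilt_rate = 0.05 * max(0.0, contact_force - CONTACT_FORCE_N)
        return Phase.DRINKING, {"tilt_rate": tilt_rate}

    return Phase.RETREAT, {"retreat": True}
```

The key design idea sketched here is that the vision-based approach and the force-based drinking sequence never run simultaneously: the measured contact force both triggers the phase switch and, within the drinking phase, governs how far the cup tilts, while a hard force limit provides the safety-critical retreat behaviour.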