Brain-computer Interface Based on Visual Evoked Potentials to Command Autonomous Robotic Wheelchair

2010 
This paper proposes the integration of two systems: a brain-computer interface (BCI) based on steady-state visual evoked potentials (SSVEPs) and an autonomous robotic wheelchair, with the former being used to command the latter. The signals used in this work come from individuals who are visually stimulated. The stimuli are black-and-white checkerboard stripes flickering at different frequencies. Four experiments were performed for the BCI development. In all experiments, the volunteers were asked to watch a stimulation screen with either a single central stripe or four stripes presented simultaneously: one on each side, one at the top and one at the bottom. The EEG signal analysis consists of two steps: feature extraction, performed using a statistical test, and classification, performed by a rule-based classifier. This kind of classifier obtains good results in a short time and does not demand any training. The result is a system with a high classification rate (up to %), a high information transfer rate (ITR) (up to 1.01.66 bits/min), and a processing time of about 120 ms for each incremental analysis. Each frequency value can be associated with a user command or a user feeling. The user, who is seated on the wheelchair, can thus choose a specific place to move to. Upon such a choice, the control system onboard the wheelchair generates reference paths with low risk of collision, connecting the current position to the chosen one. The proposal of this work is therefore a system that allows people with severe motor dysfunction to have their quality of life improved.
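The SSVEP-based command scheme described above can be sketched in a few lines: each candidate stimulation frequency is scored in the frequency domain, and a simple rule decides whether one of them dominates. This is an illustrative sketch only; the Welch-periodogram feature and the peak-over-median rule below are assumptions for demonstration, not the statistical test or rule-based classifier actually used in the paper.

```python
import numpy as np
from scipy.signal import welch

def classify_ssvep(eeg, fs, stim_freqs, ratio_threshold=2.0):
    """Return the stimulation frequency whose spectral power dominates,
    or None if no candidate passes the rule-based threshold.

    Hypothetical detector: Welch PSD peak vs. median background power.
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=fs)  # ~1 Hz resolution
    # Power at each candidate frequency (nearest PSD bin)
    powers = {f: psd[np.argmin(np.abs(freqs - f))] for f in stim_freqs}
    baseline = np.median(psd)
    best = max(powers, key=powers.get)
    # Rule: accept only if the peak clearly stands out from background
    return best if powers[best] > ratio_threshold * baseline else None

# Synthetic example: a 12 Hz SSVEP response buried in noise
fs = 250                      # sampling rate in Hz (assumed)
t = np.arange(0, 4, 1 / fs)   # 4 s window
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)
print(classify_ssvep(eeg, fs, stim_freqs=[8, 10, 12, 15]))  # → 12
```

In a wheelchair scenario like the one proposed, each detected frequency would be mapped to a command (e.g., a destination choice), and the `None` outcome lets the system incrementally accumulate more signal, consistent with the ~120 ms incremental analysis mentioned in the abstract.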