Brain-computer Interface Based on Visual Evoked Potentials to Command Autonomous Robotic Wheelchair
Sandra Mara Torres Müller,
Wanderley Cardoso Celeste,
Teodiano Freire Bastos-Filho
This paper proposes the integration of two systems: a brain-computer interface (BCI) based on steady-state visual evoked potentials (SSVEPs) and an autonomous robotic wheelchair, with the former being used to command the latter. The signals used in this work come from individuals who are visually stimulated. The stimuli are black-and-white checkerboard stripes flickering at different frequencies. Four experiments were performed for the BCI development. In all experiments, the volunteers were asked to watch a stimulation screen with either a single central stripe or four stripes presented simultaneously, one on each side, one at the top, and one at the bottom. The EEG signal analysis consists of two steps: feature extraction, performed using a statistical test, and classification, performed by a rule-based classifier. This kind of classifier obtains good results in a short time and does not require any training. The result is a system with a high classification rate (up to 96%), a high information transfer rate (ITR) (up to 101.66 bits/min), and a processing time of about 120 ms for each incremental analysis. Each frequency value can be associated with a user command or a user feeling. The user, who is seated on the wheelchair, can thus choose a specific place to move to. Upon such a choice, the control system onboard the wheelchair generates reference paths with low risk of collision, connecting the current position to the chosen one. This work therefore proposes a system that can improve the quality of life of people with severe motor dysfunction.
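The reported ITR is consistent with the standard Wolpaw formula, which converts the number of targets, the classification accuracy, and the time per selection into bits per minute. Below is a minimal sketch of that computation; the function name and the assumed 1 s selection time are illustrative, not taken from the paper.

```python
import math

def wolpaw_itr(n_classes, accuracy, trial_time_s):
    """Information transfer rate in bits/min via the standard Wolpaw formula.

    n_classes    : number of selectable targets (e.g., 4 flickering stripes)
    accuracy     : classification accuracy P, in (0, 1]
    trial_time_s : time needed to make one selection, in seconds (assumed)
    """
    if accuracy >= 1.0:
        bits_per_trial = math.log2(n_classes)
    else:
        p = accuracy
        bits_per_trial = (math.log2(n_classes)
                          + p * math.log2(p)
                          + (1 - p) * math.log2((1 - p) / (n_classes - 1)))
    # Scale bits per selection to bits per minute.
    return bits_per_trial * (60.0 / trial_time_s)
```

With the abstract's figures (4 targets, 96% accuracy) and an assumed selection time of 1 s, `wolpaw_itr(4, 0.96, 1.0)` yields roughly 101.66 bits/min, matching the reported ITR.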