The role of audio-visual feedback in a thought-based control of a humanoid robot: a BCI study in healthy and spinal cord injured people

01 Journal article
TIDONI EMMANUELE, Gergondet Pierre, FUSCO GABRIELE, Kheddar Abderrahmane, AGLIOTI Salvatore Maria
ISSN: 1534-4320

The efficient control of our body and successful interaction with the environment are possible through the integration of multisensory information. Brain–computer interfaces (BCIs) may allow people with sensorimotor disorders to interact actively with the world. In this study, visual information was paired with auditory feedback to improve the BCI control of a humanoid surrogate. Healthy and spinal cord injured (SCI) people were asked to embody a humanoid robot and complete a pick-and-place task by means of a visual evoked potentials BCI system. Participants observed the remote environment from the robot's perspective through a head-mounted display. Human footstep and computer beep sounds were used as synchronous/asynchronous auditory feedback. Healthy participants achieved better placing accuracy when listening to human footstep sounds relative to a computer-generated sound. SCI people demonstrated more difficulty in steering the robot during asynchronous auditory feedback conditions. Importantly, subjective reports highlighted that the BCI mask overlaying the display did not limit the observation of the scenario or the feeling of being in control of the robot. Overall, the data suggest that sensorimotor-related information may improve the control of external devices. Further studies are required to understand how the contribution of residual sensory channels could improve the reliability of BCI systems.

© Università degli Studi di Roma "La Sapienza" - Piazzale Aldo Moro 5, 00185 Roma