Online decoding of brain activity is the most promising approach to restoring communication and interaction with the environment for people suffering from brain diseases resulting from acute accidents or degenerative processes. Reading these subjects' intentions from the electrophysiological activity of their brains, i.e. devising a brain-machine interface (BMI), has indeed proven to be a viable approach for extracting the information needed to rebuild an otherwise interrupted communication channel. However, information transfer rates and patient uptake of systems based on non-invasive approaches, such as those relying on EEG signals, are still very low, and recent results using invasive intracortical recordings in humans have yielded only moderate advantages.
The aim of the present project is to contribute a significant advance to the BMI state of the art by devising novel approaches that exploit the multiscale nature of the signals to be decoded. Information will be obtained from neuronal activity in macaque monkeys learning rank-ordered sets of symbols and then asked to manipulate the acquired relations to generate new knowledge. This cognitive ability will be taken as a model for studying how letters and syllables are combined to form words in the brain.
These findings will allow the development of algorithms, based on advanced recurrent neural networks, for optimal decoding and classification of stimuli and intentions, and thus of a next-generation speller able to efficiently support communication in people suffering from severe neurodegenerative diseases.
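As a purely illustrative sketch, and not the project's actual decoding pipeline, a recurrent classifier of the kind referred to above could take the following form; the channel count, epoch length, symbol set, and class names are hypothetical placeholders.

```python
# Minimal sketch of a recurrent decoder mapping multichannel neural epochs to
# symbol classes; all dimensions below are assumptions for illustration only.
import torch
import torch.nn as nn

class SymbolDecoder(nn.Module):
    def __init__(self, n_channels: int, n_symbols: int, hidden: int = 128):
        super().__init__()
        # GRU reads the multichannel time series sample by sample
        self.rnn = nn.GRU(input_size=n_channels, hidden_size=hidden, batch_first=True)
        # Linear readout maps the final hidden state to symbol-class logits
        self.readout = nn.Linear(hidden, n_symbols)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time, channels) epoch of neural activity
        _, h = self.rnn(x)
        return self.readout(h[-1])

# Example: decode 300-sample epochs from 64 channels into one of 8 symbols
model = SymbolDecoder(n_channels=64, n_symbols=8)
logits = model(torch.randn(16, 300, 64))   # (batch=16, n_symbols=8)
predicted_symbols = logits.argmax(dim=1)   # decoded symbol index per epoch
```

In a speller setting, the decoded symbol indices would then be mapped to letters or syllables and concatenated into candidate words.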
By combining multiscale probing of neural signals from non-human primates with simultaneous EEG recordings, the neuronal signature of word encoding will be identified in both the intracortical and scalp signals. This information will support the identification of the proper spatiotemporal patterns at the EEG level and thereby improve signal decoding for non-invasive approaches.
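To make the idea concrete, the hypothetical sketch below shows how EEG epochs, time-locked to word-encoding events defined from the intracortical recordings, could be tested for decodable spatiotemporal structure; the epoch dimensions and the synthetic data are placeholders, not project results.

```python
# Illustrative check of whether scalp EEG carries the spatiotemporal signature
# identified intracortically: flatten each epoch and cross-validate a classifier.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_epochs, n_channels, n_times = 200, 32, 150                 # assumed epoch shape
X = rng.standard_normal((n_epochs, n_channels, n_times))     # placeholder EEG epochs
y = rng.integers(0, 2, n_epochs)                             # labels from intracortically defined events

# Flatten each epoch into a spatiotemporal feature vector and classify
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X.reshape(n_epochs, -1), y, cv=5)
print(f"cross-validated decoding accuracy: {scores.mean():.2f}")
```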