Our master's student, Ruiqi Fu, and postdoctoral fellow, Yifeng Chen, published a study on Continuous Bimanual Trajectory Decoding of Coordinated Movement from EEG Signals in the IEEE Journal of Biomedical and Health Informatics.
While many voluntary movements involve bimanual coordination, few attempts have been made to simultaneously decode the trajectories of bimanual movements from electroencephalogram (EEG) signals. In this study, we proposed a novel bimanual brain-computer interface (BCI) paradigm to reconstruct the continuous trajectories of both hands during coordinated movements from EEG. The protocol required human subjects to complete a bimanual reaching task to the left, middle, or right target while EEG data were collected. A multi-task deep learning model combining EEGNet and a long short-term memory (LSTM) network was proposed to decode bimanual trajectories, including position and velocity. Decoding performance was evaluated in terms of the correlation coefficient (CC) and normalized root mean square error (NRMSE) between decoded and real trajectories. Experimental results from 13 human subjects showed that the grand-averaged combined CC values reached 0.54 and 0.42 for position and velocity decoding, respectively. The corresponding combined NRMSE values were 0.22 and 0.23. Both CC and NRMSE were significantly better than the chance level (p<0.05). Comparative experiments also indicated that the proposed model significantly outperformed several other commonly used methods in terms of CC and NRMSE for continuous trajectory decoding. These findings demonstrate the feasibility of simultaneously decoding bimanual trajectories from EEG, indicating the potential of bimanual BCI control for coordinated tasks.
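For readers unfamiliar with the two evaluation metrics, the minimal sketch below shows one common way to compute CC and NRMSE between a decoded and a real trajectory. The choice of normalizing the RMSE by the range of the real signal, and the synthetic toy trajectories, are illustrative assumptions and not necessarily the exact definitions used in the paper.

```python
import numpy as np

def pearson_cc(y_true, y_pred):
    """Pearson correlation coefficient between a real and a decoded trajectory."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.corrcoef(y_true, y_pred)[0, 1]

def nrmse(y_true, y_pred):
    """Root mean square error normalized by the range of the real trajectory
    (one common normalization; the paper's exact choice may differ)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / (np.max(y_true) - np.min(y_true))

# Toy usage with a synthetic 1-D trajectory (e.g., the x-position of one hand)
t = np.linspace(0, 2 * np.pi, 500)
real = np.sin(t)
decoded = np.sin(t) + 0.1 * np.random.randn(t.size)
print(f"CC = {pearson_cc(real, decoded):.2f}, NRMSE = {nrmse(real, decoded):.2f}")
```

In practice, such metrics would be computed per hand and per movement dimension and then averaged to obtain combined scores like those reported above.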
Ruiqi Fu, a master’s student at Southern University of Science and Technology, and Yifeng Chen, a postdoctoral fellow, are co-first authors. Mingming Zhang, assistant professor at Southern University of Science and Technology, is the corresponding author of this article.