An Upper-Limb Rehabilitation Exoskeleton System Controlled by MI Recognition Model With Deep Emphasized Informative Features in a VR Scene

The prevalence of stroke continues to increase with global population aging. Based on the motor imagery (MI) brain–computer interface (BCI) paradigm and virtual reality (VR) technology, we designed and developed an upper-limb rehabilitation exoskeleton system (VR-ULE) operating in VR scenes for stroke patients. The VR-ULE system uses an MI electroencephalogram (EEG) recognition model built from a convolutional neural network with squeeze-and-excitation (SE) blocks to decode the patient’s motion intentions and control the exoskeleton during rehabilitation training movements. Because EEG differs across individuals, the frequency bands containing the most informative MI features also differ from patient to patient. Therefore, SE blocks learn weights for the different feature channels to emphasize the informative frequency-band features. The MI cues presented in the VR-based virtual scenes can improve interhemispheric balance and the neuroplasticity of patients. This also compensates for the disadvantages of current MI-BCIs, such as limited usage scenarios, poor individual adaptability, and many interfering factors. We designed an offline training experiment to evaluate the feasibility of the EEG recognition strategy and an online control experiment to verify the effectiveness of the VR-ULE system. The results showed that the MI classification method with MI cues in the VR scenes improved the accuracy of MI classification (86.49% ± 3.02%); all subjects performed two typ...
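The abstract does not give implementation details of the SE-based channel reweighting; as a minimal sketch, assuming a PyTorch CNN whose feature maps correspond to frequency-band features of the MI EEG, a squeeze-and-excitation block of the kind described could look like the following (class name, reduction ratio, and tensor dimensions are illustrative assumptions, not the authors' code):

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-excitation block: learns per-channel weights so that
    informative feature channels (e.g. frequency-band features) are emphasized."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.squeeze = nn.AdaptiveAvgPool2d(1)           # global average pool per channel
        self.excite = nn.Sequential(                      # bottleneck MLP producing channel weights
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.squeeze(x).view(b, c)                    # squeeze: (batch, channels)
        w = self.excite(w).view(b, c, 1, 1)               # excitation: weights in (0, 1) per channel
        return x * w                                      # reweight feature channels

# Illustrative use: CNN feature maps over MI EEG
# (batch, 16 frequency-band feature maps, 32 electrodes, 250 time samples)
features = torch.randn(8, 16, 32, 250)
se = SEBlock(channels=16)
reweighted = se(features)
print(reweighted.shape)  # torch.Size([8, 16, 32, 250])
```

The sigmoid gating scales each channel softly rather than selecting a subset, which matches the stated goal of emphasizing, per patient, the frequency-band features that carry the most useful MI information.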
Source: IEEE Transactions on Neural Systems and Rehabilitation Engineering - Category: Neuroscience - Source Type: Research