Deep learning for processing electromyographic signals
Electromyographic (EMG) signals have gained popularity for controlling prostheses and exoskeletons, particularly in upper-limb applications for stroke patients. However, research on the lower limb remains limited, and standardized open-source datasets of lower limb EMG signals, especially recordings from Asian subjects, are scarce. Moreover, deep learning algorithms are rarely applied to EMG-based human motion intention recognition, especially for the lower limb. In response to these gaps, we present an open-source benchmark dataset of lower limb EMG with Asian demographic characteristics and a large data volume, the JJ dataset, which includes approximately 13,350 clean EMG segments covering 10 gait phases from 15 subjects. This is the first dataset of its kind to cover the nine main muscles involved in human walking gait. We used the processed time-domain signal as input and an adjusted ResNet-18 as the classifier. Our research explores and compares several key issues in this area: the sliding time window method versus other preprocessing methods, time-domain versus frequency-domain signal processing, cross-subject motion recognition accuracy, and the feasibility of relying on thigh and calf muscles for amputees. Our experiments demonstrate that the adjusted ResNet achieves high classification accuracy, with an average accuracy of 95.34% across human gait phases. Our dataset provides a valuable resource for future studies in this area, and our results demonstrate the potential of ResNet as a robust and effective method for lower limb motion intention pattern recognition.
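The sliding time window method mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the sampling rate (1 kHz), window length (250 ms), and stride (100 ms) are hypothetical parameters chosen for the example, and the actual values used for the JJ dataset are not stated here.

```python
import numpy as np

def sliding_windows(emg, window, stride):
    """Segment a (samples, channels) EMG recording into overlapping
    time-domain windows of shape (n_windows, window, channels)."""
    n_windows = (emg.shape[0] - window) // stride + 1
    return np.stack(
        [emg[i * stride : i * stride + window] for i in range(n_windows)]
    )

# Hypothetical example: 2 s of 9-channel EMG sampled at 1 kHz,
# segmented into 250 ms windows with a 100 ms stride.
emg = np.random.randn(2000, 9)
segments = sliding_windows(emg, window=250, stride=100)
print(segments.shape)  # (18, 250, 9)
```

Each resulting segment would then be labeled with its gait phase and fed to the classifier as a time-domain input; overlapping windows increase the number of training examples extracted from a single recording.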