Article

Transforming Motor Imagery Analysis: A Novel EEG Classification Framework Using AtSiftNet Method

1 College of Civil Aviation, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China
2 School of Automation, Northwestern Polytechnical University, Xi’an 710072, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Sensors 2024, 24(19), 6466; https://doi.org/10.3390/s24196466
Submission received: 9 September 2024 / Revised: 28 September 2024 / Accepted: 29 September 2024 / Published: 7 October 2024
(This article belongs to the Section Biomedical Sensors)

Abstract

This paper presents an innovative feature extraction approach using self-attention, combined with several feature selection techniques, collectively termed the AtSiftNet method, to enhance the classification performance of motor imagery activities from electroencephalography (EEG) signals. Initially, the EEG signals were sorted and then denoised using multiscale principal component analysis to obtain clean EEG signals; an experiment on non-denoised signals was also conducted for comparison. Subsequently, the clean EEG signals underwent self-attention feature extraction to compute the features of each trial (i.e., 350×18). The best 1 or 15 features were then selected through eight different feature selection techniques. Finally, five different machine learning and neural network classification models were employed to calculate the accuracy, sensitivity, and specificity of this approach. The BCI Competition III dataset IVa, comprising recordings from the five volunteers who participated in the competition, was utilized for all experiments. The experimental findings reveal that the average classification accuracy is highest for ReliefF (99.946%), Mutual Information (98.902%), Independent Component Analysis (99.62%), and Principal Component Analysis (98.884%), for both the 1 and 15 best-selected features from each trial. These accuracies were obtained for motor imagery using a Support Vector Machine (SVM) as the classifier. In addition, five-fold cross-validation was performed to assess fair performance estimation and the robustness of the model; the average accuracy obtained through five-fold validation is 99.89%. The findings indicate that the suggested framework provides a resilient biomarker with minimal computational complexity, making it a suitable choice for advancing motor imagery Brain–Computer Interfaces (BCIs).
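The attention-based feature extraction step described in the abstract can be illustrated with a minimal sketch. This is a hypothetical single-head implementation, not the paper's actual network: random Q/K/V projection matrices stand in for learned parameters, and the 350×18 trial shape follows the abstract's description.

```python
import numpy as np

def self_attention_features(trial, d_k=None, rng=None):
    """Single-head self-attention over one EEG trial (time x channels).

    Sketch only: random projections replace learned weights.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    t, c = trial.shape
    d_k = c if d_k is None else d_k
    Wq, Wk, Wv = (rng.standard_normal((c, d_k)) for _ in range(3))
    Q, K, V = trial @ Wq, trial @ Wk, trial @ Wv
    scores = Q @ K.T / np.sqrt(d_k)              # (t, t) attention logits
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)      # softmax over time steps
    return attn @ V                              # (t, d_k) attended features

# One trial shaped as in the abstract: 350 time points x 18 channels.
trial = np.random.default_rng(1).standard_normal((350, 18))
feats = self_attention_features(trial)
print(feats.shape)  # (350, 18)
```

Each output row is a weighted mixture of all time steps, so temporally distant but correlated activity can contribute to every feature vector.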
Keywords: brain–computer interface (BCI); attention sift network (AtSiftNet); motor imagery (MI); independent component analysis (ICA); principal component analysis (PCA)
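The selection-and-classification stage of the pipeline can likewise be sketched. This is a simplified stand-in, not the paper's implementation: Fisher-score ranking replaces ReliefF/mutual-information selection, and a nearest-centroid classifier replaces the SVM; the five-fold cross-validation loop mirrors the validation scheme described in the abstract.

```python
import numpy as np

def fisher_scores(X, y):
    """Rank features by Fisher score (between-class vs. within-class variance),
    a simple stand-in for ReliefF / mutual-information selection."""
    classes = np.unique(y)
    mu = X.mean(axis=0)
    num = sum((y == c).sum() * (X[y == c].mean(axis=0) - mu) ** 2 for c in classes)
    den = sum((y == c).sum() * X[y == c].var(axis=0) for c in classes) + 1e-12
    return num / den

def nearest_centroid_cv(X, y, k_best=15, folds=5, seed=0):
    """Five-fold CV: select top-k features on each training split,
    then classify held-out samples by nearest class centroid."""
    idx = np.random.default_rng(seed).permutation(len(y))
    accs = []
    for f in range(folds):
        test = idx[f::folds]
        train = np.setdiff1d(idx, test)
        sel = np.argsort(fisher_scores(X[train], y[train]))[::-1][:k_best]
        Xtr, Xte = X[train][:, sel], X[test][:, sel]
        cents = np.stack([Xtr[y[train] == c].mean(axis=0) for c in np.unique(y)])
        pred = np.unique(y)[np.argmin(((Xte[:, None] - cents) ** 2).sum(-1), axis=1)]
        accs.append((pred == y[test]).mean())
    return float(np.mean(accs))

# Toy two-class data: discriminative signal only in the first 5 features.
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 30))
y = np.repeat([0, 1], 50)
X[y == 1, :5] += 3.0
acc = nearest_centroid_cv(X, y)
```

Selecting features inside each fold (rather than once on the full dataset) is what makes the cross-validated accuracy a fair estimate, since the test split never influences the ranking.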

Share and Cite

MDPI and ACS Style

Xu, H.; Haider, W.; Aziz, M.Z.; Sun, Y.; Yu, X. Transforming Motor Imagery Analysis: A Novel EEG Classification Framework Using AtSiftNet Method. Sensors 2024, 24, 6466. https://doi.org/10.3390/s24196466

AMA Style

Xu H, Haider W, Aziz MZ, Sun Y, Yu X. Transforming Motor Imagery Analysis: A Novel EEG Classification Framework Using AtSiftNet Method. Sensors. 2024; 24(19):6466. https://doi.org/10.3390/s24196466

Chicago/Turabian Style

Xu, Haiqin, Waseem Haider, Muhammad Zulkifal Aziz, Youchao Sun, and Xiaojun Yu. 2024. "Transforming Motor Imagery Analysis: A Novel EEG Classification Framework Using AtSiftNet Method" Sensors 24, no. 19: 6466. https://doi.org/10.3390/s24196466

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
