Feature Generation with Genetic Algorithms for Imagined Speech Electroencephalogram Signal Classification
Abstract
1. Introduction
- The development of a feature generation stage to extract information from the raw signals and their respective frequency bands.
- The implementation of a GA to find the subset of electrodes that allows more efficient classification.
- The implementation of a GA to select the most relevant features that maximize classification accuracy.
- The comparison of ten different classification models.
- The dataset used was added to a public repository, allowing other techniques to be compared with this methodology.
Organization
2. Methodology
2.1. Dataset
2.2. Computational Resources
- CPU: Intel Core i5, 11th gen.
- GPU: NVIDIA RTX 3060 Ti.
- RAM: 16 GB.
- VRAM: 8 GB.
- OS: Windows 11.
- Libraries: numpy, pandas, scipy, sklearn, matplotlib, antropy, and signalityca.
2.3. Split Signals into Frequency Bands
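The band splitting described in this section can be sketched with zero-phase Butterworth band-pass filters over the cut-off frequencies listed in the band table. This is a minimal illustration, not the authors' code; the 128 Hz sampling rate, filter order, and function names are assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# EEG bands and cut-off frequencies (Hz) as listed in the band table.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 40)}

def split_into_bands(signal, fs=128.0, order=4):
    """Zero-phase Butterworth band-pass filtering of one EEG channel.

    fs and order are illustrative assumptions, not values from the paper.
    """
    out = {}
    for name, (lo, hi) in BANDS.items():
        sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
        out[name] = sosfiltfilt(sos, signal)
    return out

rng = np.random.default_rng(0)
raw = rng.standard_normal(512)      # one channel of synthetic "EEG"
bands = split_into_bands(raw)
```

Second-order sections (`output="sos"`) keep the very low delta cut-off numerically stable, and `sosfiltfilt` avoids introducing phase distortion.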
2.4. Feature Extraction
- Mean [25];
- Standard deviation [26];
- Coefficient variation [27];
- Median [26];
- Mode [26];
- Max [28];
- Min [28];
- First quartile [29];
- Third quartile [29];
- Interquartile range [29];
- Kurtosis [26];
- Skewness [26];
- Detrended fluctuation analysis [30];
- Activity Hjorth param [31];
- Mobility Hjorth param [31];
- Complexity Hjorth param [31];
- Permutation entropy [32];
- Approximate entropy [33];
- Spectral entropy [34];
- Higuchi fractal dimension [35];
- Total power spectral density [36];
- Centroid power spectral density [37];
- Determinism [38];
- Trapping time [38];
- Diagonal line entropy [38];
- Average diagonal line length [38];
- Recurrence rate [38];
- Spectral edge frequency 25 [39];
- Spectral edge frequency 50 [39];
- Spectral edge frequency 75 [39];
- Hurst exponent [40];
- Singular value decomposition entropy [41];
- Petrosian fractal dimension [42];
- Katz fractal dimension [43];
- Relative band power [44];
- Band amplitude [45].
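Several of the simpler statistical features in the list above can be computed directly with numpy and scipy. The sketch below covers only a handful of the 36 features, with hypothetical function and key names; it is not the authors' implementation.

```python
import numpy as np
from scipy.stats import kurtosis, skew

def basic_features(x):
    """A few of the statistical features listed above, computed per frame."""
    q1, q3 = np.percentile(x, [25, 75])
    return {
        "mean": np.mean(x),
        "std": np.std(x),
        "coeff_variation": np.std(x) / np.mean(x),
        "median": np.median(x),
        "max": np.max(x),
        "min": np.min(x),
        "iqr": q3 - q1,                 # interquartile range
        "kurtosis": kurtosis(x),
        "skewness": skew(x),
    }

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
feats = basic_features(x)
```

The nonlinear descriptors (entropies, fractal dimensions, recurrence measures) have ready-made implementations in libraries such as antropy, which the authors list among their dependencies.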
2.5. Optimization with Genetic Algorithm
- Number of individuals: 100;
- Type of gene: dichotomic {0,1};
- Number of genes: 228;
- Type of selection: rank;
- Type of crossover: homogeneous (uniform);
- Mutation: bit flip;
- Fitness function: accuracy.
2.5.1. Explanation of the GA and Its Inputs
2.5.2. Individuals in the GA: Not the Subjects
2.5.3. GA Works as Follows
- Initialization:
  - A population of 100 binary vectors is randomly generated (a common population size in GA).
- Selection and Reproduction:
  - These 100 individuals form pairs and recombine (cross over) to create offspring.
  - Since each pair produces two offspring, the population temporarily doubles to 200 individuals.
- Evaluation and Survival of the Fittest:
  - Each individual (binary vector) defines a specific subset of electrodes and features used to train and evaluate a classification model.
  - Individuals are then ranked based on the model’s accuracy.
  - The 100 best individuals survive to the next generation, while the 100 worst are discarded.
- Iterative Optimization:
  - This process is repeated over several generations, refining the selection of electrodes and features until the algorithm converges on an optimal subset.
- Final Solution Selection:
  - In the last generation, the highest-ranked individual (binary vector) is selected as the final solution.
  - This vector indicates, with 0 and 1, which electrodes and features provide the most valuable information to the model.
2.6. Data Standardization
2.7. Flatten Data
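Sections 2.6 and 2.7 can be sketched together: each electrode-feature pair is z-scored across trials, then the per-trial matrix is flattened into the row vector the classifiers expect. The shapes here are hypothetical (14 electrodes is the paper's maximum; the per-electrode feature count is assumed).

```python
import numpy as np

# Hypothetical shapes: trials x electrodes x per-electrode features.
n_trials, n_electrodes, n_features = 100, 14, 36
rng = np.random.default_rng(0)
X = rng.standard_normal((n_trials, n_electrodes, n_features))

# Standardization: z-score each electrode-feature pair across trials.
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

# Flatten: one row vector per trial, as scikit-learn classifiers require.
X_flat = X_std.reshape(n_trials, -1)            # (100, 14 * 36) = (100, 504)
```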
2.8. Classifier Model
- k-Nearest Neighbors (kNN);
- Logistic Regression (LogReg);
- Random Forest (RF);
- Decision Tree (DT);
- Gradient Boosting Machine (GBM);
- AdaBoost;
- Naïve Bayes (NB);
- Linear Discriminant Analysis (LDA);
- Multi-Layer Perceptron (MLP);
- Support Vector Machine (SVM).
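The ten models above can be compared with a uniform scikit-learn loop. This sketch uses synthetic data in place of the EEG feature matrix, default hyperparameters, and illustrative dictionary keys; it reproduces the comparison pattern, not the authors' exact configuration.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import (RandomForestClassifier,
                              GradientBoostingClassifier, AdaBoostClassifier)
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

MODELS = {
    "kNN": KNeighborsClassifier(),
    "LogReg": LogisticRegression(max_iter=1000),
    "RF": RandomForestClassifier(random_state=0),
    "DT": DecisionTreeClassifier(random_state=0),
    "GBM": GradientBoostingClassifier(random_state=0),
    "AdaBoost": AdaBoostClassifier(random_state=0),
    "NB": GaussianNB(),
    "LDA": LinearDiscriminantAnalysis(),
    "MLP": MLPClassifier(max_iter=500, random_state=0),
    "SVM": SVC(),
}

# Synthetic stand-in for the flattened, standardized EEG feature matrix.
X, y = make_classification(n_samples=200, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
scores = {name: model.fit(X_tr, y_tr).score(X_te, y_te)
          for name, model in MODELS.items()}
```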
3. Results
4. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
Abbreviation | Meaning |
---|---|
ACC | Accuracy |
ANN | Artificial Neural Network |
ACO | Ant Colony Optimization |
CNN | Convolutional Neural Network |
DL | Deep Learning |
DNN | Deep Neural Networks |
DT | Decision Tree |
EEG | Electroencephalogram |
FP | False positives |
FN | False negatives |
GBM | Gradient Boosting Machine |
kNN | k-Nearest Neighbors |
LDA | Linear Discriminant Analysis |
LogReg | Logistic Regression |
ML | Machine Learning |
MLP | Multi-Layer Perceptron |
NB | Naïve Bayes |
RF | Random Forest |
RNN | Recurrent Neural Networks |
SVM | Support Vector Machine |
TL | Transfer Learning |
TN | True negatives |
TP | True positives |
Appendix A
Appendix A.1. Mean
- x_i denotes each frame sample;
- N is the number of frames.
Appendix A.2. Standard Deviation
- x_i denotes each frame sample;
- N is the number of frames;
- Mean is the average of all samples.
Appendix A.3. Coefficient Variation
- σ is the standard deviation of the signal;
- Mean is the average of the signal.
Appendix A.4. Median
Appendix A.5. Mode
Appendix A.6. Max
Appendix A.7. Min
Appendix A.8. First Quartile
Appendix A.9. Third Quartile
Appendix A.10. Interquartile Range
Appendix A.11. Kurtosis
Appendix A.12. Skewness
Appendix A.13. Detrended Fluctuation Analysis
- F(n) is the fluctuation function that describes the signal’s variance over a window of size n;
- H is the Hurst parameter, which characterizes the scaling behavior;
- n is the window size over which the fluctuations are calculated [30].
Appendix A.14. Activity Hjorth Param
- x_i is the EEG signal amplitude value at time point i;
- N is the number of frames in the signal [31].
Appendix A.15. Mobility Hjorth Param
- x denotes the EEG signal;
- x′ stands for the first derivative of the signal;
- var(x) corresponds to the variance of the signal;
- var(x′) corresponds to the variance of the first derivative of the signal [31].
Appendix A.16. Complexity Hjorth Param
- x denotes the EEG signal;
- x′ is the first derivative of the signal;
- Mobility(x) corresponds to the mobility of the signal;
- Mobility(x′) corresponds to the mobility of the first derivative [31].
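The three Hjorth parameters of Appendices A.14-A.16 follow directly from the variances of the signal and its first differences. A minimal numpy sketch, using `np.diff` as the discrete derivative (an assumption; the paper does not specify its discretization):

```python
import numpy as np

def hjorth(x):
    """Hjorth parameters: activity = var(x); mobility = sqrt(var(x')/var(x));
    complexity = mobility(x') / mobility(x)."""
    dx = np.diff(x)                      # first derivative (finite difference)
    ddx = np.diff(dx)                    # second derivative
    var_x, var_dx, var_ddx = np.var(x), np.var(dx), np.var(ddx)
    activity = var_x
    mobility = np.sqrt(var_dx / var_x)
    complexity = np.sqrt(var_ddx / var_dx) / mobility
    return activity, mobility, complexity

t = np.linspace(0, 1, 500, endpoint=False)
act, mob, comp = hjorth(np.sin(2 * np.pi * 5 * t))
```

For a pure sinusoid the derivative is a sinusoid of the same frequency, so the complexity is close to 1, a useful sanity check.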
Appendix A.17. Permutation Entropy
Appendix A.18. Approximate Entropy
Appendix A.19. Spectral Entropy
Appendix A.20. Higuchi Fractal Dimension
Appendix A.21. Total Power Spectral Density
Appendix A.22. Centroid Power Spectral Density
Appendix A.23. Determinism
Appendix A.24. Trapping Time
Appendix A.25. Diagonal Line Entropy
Appendix A.26. Average Diagonal Line Length
Appendix A.27. Recurrence Rate
Appendix A.28. Spectral Edge Frequency 25
Appendix A.29. Spectral Edge Frequency 50
Appendix A.30. Spectral Edge Frequency 75
Appendix A.31. Hurst Exponent
- Calculate the cumulative sum of deviations from the mean:
- Calculate the rescaled range for different window sizes n:
- The Hurst exponent is then estimated from the relationship between the rescaled range and the window size n:
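The three steps above translate into a short rescaled-range (R/S) estimator: per window, accumulate deviations from the window mean, take the range over the standard deviation, then fit the log-log slope. The window sizes and function name below are illustrative choices, not from the paper.

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128)):
    """Rescaled-range (R/S) estimate of the Hurst exponent."""
    rs_means = []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            y = np.cumsum(w - w.mean())   # cumulative deviations from the mean
            r = y.max() - y.min()         # range of the cumulative sum
            s = w.std()                   # standard deviation of the window
            if s > 0:
                rs_vals.append(r / s)
        rs_means.append(np.mean(rs_vals))
    # H is the slope of log(R/S) versus log(n).
    h, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return h

rng = np.random.default_rng(0)
h = hurst_rs(rng.standard_normal(1024))   # white noise: H near 0.5
```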
Appendix A.32. Singular Value Decomposition Entropy
- Given a time series x_1, x_2, …, x_N, first construct a time-delay embedding matrix X with embedding dimension m and time delay τ:
- Apply Singular Value Decomposition (SVD) to the matrix X:
- Compute the entropy of the singular values (the diagonal elements of Σ):
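The three steps above can be sketched in a few lines of numpy; the embedding dimension and delay defaults are illustrative assumptions.

```python
import numpy as np

def svd_entropy(x, m=10, tau=1):
    """Shannon entropy (bits) of the normalized singular values of the
    time-delay embedding matrix."""
    n = len(x) - (m - 1) * tau
    # Rows are delayed windows: x[i], x[i+tau], ..., x[i+(m-1)*tau].
    X = np.array([x[i:i + (m - 1) * tau + 1:tau] for i in range(n)])
    s = np.linalg.svd(X, compute_uv=False)   # singular values
    p = s / s.sum()                          # normalize to a distribution
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
e = svd_entropy(rng.standard_normal(256))
```

The entropy is bounded by log2(m), since there are at most m singular values.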
Appendix A.33. Petrosian Fractal Dimension
- For a given time series x_1, x_2, …, x_N, compute the number of sign changes N_Δ, i.e., the number of times the signal crosses its local mean:
- Calculate the Petrosian Fractal Dimension using the following formula:
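A numpy sketch of the two steps above. As an assumption, the sign changes are counted on the first difference of the signal, a common variant of the crossing count described in the text.

```python
import numpy as np

def petrosian_fd(x):
    """Petrosian fractal dimension:
    PFD = log10(N) / (log10(N) + log10(N / (N + 0.4 * N_delta)))."""
    n = len(x)
    diff = np.diff(x)
    n_delta = np.sum(diff[1:] * diff[:-1] < 0)   # sign changes in the derivative
    return np.log10(n) / (np.log10(n) + np.log10(n / (n + 0.4 * n_delta)))

rng = np.random.default_rng(0)
pfd = petrosian_fd(rng.standard_normal(500))     # white noise: slightly above 1
```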
Appendix A.34. Katz Fractal Dimension
- Define the time series as x_1, x_2, …, x_N.
- Compute the total length of the curve using the following formula:
- Calculate the Katz Fractal Dimension using the following formula:
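The steps above can be sketched as follows, with L the total curve length, d the maximum distance from the first sample, and a the average step length (so n = L/a is the number of steps); the function name is illustrative.

```python
import numpy as np

def katz_fd(x):
    """Katz fractal dimension: KFD = log10(n) / (log10(n) + log10(d / L))."""
    dists = np.abs(np.diff(x))
    L = dists.sum()                    # total length of the curve
    a = dists.mean()                   # average step length
    d = np.max(np.abs(x - x[0]))       # max distance from the first point
    n = L / a                          # number of steps
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

rng = np.random.default_rng(0)
kfd = katz_fd(rng.standard_normal(500))
```

Since d never exceeds L, the denominator never exceeds log10(n) and the dimension is at least 1.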
Appendix A.35. Relative Band Power
- Calculate the total power of the signal across all frequencies. This can be carried out by summing the power spectral density (PSD) across the entire frequency range, from f_min to f_max:
- For each frequency band b, calculate the band power:
- Finally, compute the relative band power for each frequency band b by dividing the band power by the total power:
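The three steps above can be sketched with a Welch PSD; the choice of Welch's method, the segment length, and the 0.5-40 Hz total range are assumptions for illustration (the constant frequency step cancels in the ratio, so a plain sum over bins suffices).

```python
import numpy as np
from scipy.signal import welch

def relative_band_power(x, fs, band, total_range=(0.5, 40.0)):
    """Band power (summed Welch PSD over the band) divided by total power."""
    freqs, psd = welch(x, fs=fs, nperseg=256)
    def power(lo, hi):
        mask = (freqs >= lo) & (freqs <= hi)
        return psd[mask].sum()
    return power(*band) / power(*total_range)

fs = 128.0
t = np.arange(0, 8, 1 / fs)
x = np.sin(2 * np.pi * 10 * t)        # a 10 Hz tone falls in alpha (8-13 Hz)
rbp_alpha = relative_band_power(x, fs, (8, 13))
```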
Appendix A.36. Band Amplitude
- Calculate the amplitude: Once the signal has been filtered, calculate the instantaneous amplitude of the filtered signal. This can be completed by computing the envelope of the filtered signal using methods such as the Hilbert transform:
- Compute the mean amplitude: The mean amplitude over a period or window of time can be calculated to summarize the overall strength of oscillations within the band:
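The two steps above reduce to taking the magnitude of the analytic signal and averaging it; here a unit-amplitude sinusoid stands in for an already band-filtered EEG signal.

```python
import numpy as np
from scipy.signal import hilbert

fs = 128.0
t = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * 10 * t)          # stands in for a band-filtered signal

envelope = np.abs(hilbert(x))           # instantaneous amplitude (envelope)
mean_amplitude = envelope.mean()        # summary strength of the oscillation
```

For a unit sinusoid the envelope is close to 1 everywhere except for small edge effects, which is a convenient sanity check.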
References
- Brigham, K.; Kumar, B.V.K.V. Imagined Speech Classification with EEG Signals for Silent Communication: A Preliminary Investigation into Synthetic Telepathy. In Proceedings of the 2010 4th International Conference on Bioinformatics and Biomedical Engineering, Chengdu, China, 18–20 June 2010; pp. 1–4.
- Lee, S.H.; Park, J.H.; Kim, D.S. Imagined Speech and Visual Imagery as Intuitive Paradigms for Brain-Computer Interfaces. arXiv 2024, arXiv:2411.09400.
- Jäncke, L.; Langer, N.; Hänggi, J. Diminished Whole-brain but Enhanced Peri-sylvian Connectivity in Absolute Pitch Musicians. J. Cogn. Neurosci. 2012, 24, 1447–1461.
- Geva, S.; Jones, P.S.; Crinion, J.T.; Price, C.J.; Baron, J.C.; Warburton, E.A. The neural correlates of inner speech defined by voxel-based lesion–symptom mapping. Brain 2011, 134, 3071–3082.
- Kummer, A.W. Perceptual assessment of resonance and velopharyngeal function. In Proceedings of the Seminars in Speech and Language; Thieme Medical Publishers: New York, NY, USA, 2011; Volume 32, pp. 159–167.
- Broomfield, J. The nature of referred subtypes of primary speech disability. Child Lang. Teach. Ther. 2004, 20, 135–151.
- DeWitt, I. Phoneme and word recognition in the auditory ventral stream. Proc. Natl. Acad. Sci. 2012, 109, E505–E514.
- Cooney, C.; Folli, R.; Coyle, D. Optimizing Layers Improves CNN Generalization and Transfer Learning for Imagined Speech Decoding from EEG. In Proceedings of the International Conference on Systems, Man and Cybernetics (SMC), Bari, Italy, 6–9 October 2019.
- Min, B.; Kim, J.; Park, H.J.; Lee, B. Vowel Imagery Decoding toward Silent Speech BCI Using Extreme Learning Machine with Electroencephalogram. BioMed Res. Int. 2016, 2016, 2618265.
- Morooka, T.; Ishizuka, K.; Kobayashi, N. Electroencephalographic Analysis of Auditory Imagination to Realize Silent Speech BCI. In Proceedings of the 2018 IEEE 7th Global Conference on Consumer Electronics, GCCE, Nara, Japan, 9–12 October 2018; Institute of Electrical and Electronics Engineers Inc.: Piscataway, NJ, USA, 2018; pp. 73–74.
- Balaji, A.; Haldar, A.; Patil, K.; Ruthvik, S.; Baths, V. EEG-based Classification of Bilingual Unspoken Speech using ANN. In Proceedings of the 2017 39th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Jeju Island, Republic of Korea, 11–15 July 2017.
- Sereshkeh, A.R.; Trott, R.; Bricout, A.; Chau, T. EEG Classification of Covert Speech Using Regularized Neural Networks. IEEE/ACM Trans. Audio Speech Lang. Process. 2017, 25, 2292–2300.
- Nava, G.H. Predicción de eventos epilépticos mediante técnicas de aprendizaje profundo usando señales EEG 2023. Available online: https://ri-ng.uaq.mx/handle/123456789/9749 (accessed on 21 June 2024).
- Panachakel, J.T.; Ramakrishnan, A.G.; Ramakrishnan, A.G. Decoding Imagined Speech using Wavelet Features and Deep Neural Networks. In Proceedings of the 2019 IEEE 16th India Council International Conference (INDICON), Rajkot, India, 13–15 December 2019.
- Fernandez-Fraga, S.; Aceves-Fernandez, M.; Pedraza-Ortega, J.; Tovar-Arriaga, S. Feature Extraction of EEG Signal upon BCI Systems Based on Steady-State Visual Evoked Potentials Using the Ant Colony Optimization Algorithm. Discret. Dyn. Nat. Soc. 2018, 2018, 2143873.
- Li, Y.; Wu, L.; Wang, T.; Gao, N.; Wang, Q. EEG Signal Processing Based on Genetic Algorithm for Extracting Mixed Features. Int. J. Pattern Recognit. Artif. Intell. 2019, 33, 1958008.
- Ocak, H. Optimal classification of epileptic seizures in EEG using wavelet analysis and genetic algorithm. Signal Process. 2008, 88, 1858–1867.
- Albasri, A.; Abdali-Mohammadi, F.; Fathi, A. EEG electrode selection for person identification thru a genetic-algorithm method. J. Med. Syst. 2019, 43, 297.
- Takacs, A.; Toledano-Ayala, M.; Dominguez-Gonzalez, A.; Pastrana-Palma, A.; Velazquez, D.T.; Ramos, J.M.; Rivas-Araiza, E.A. Descriptor Generation and Optimization for a Specific Outdoor Environment. IEEE Access 2020, 8, 52550–52565.
- Koizumi, K.; Ueda, K.; Nakao, M. Development of a Cognitive Brain-Machine Interface Based on a Visual Imagery Method. In Proceedings of the 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018; pp. 1062–1065.
- García-Salinas, J.S.; Villaseñor-Pineda, L.; Reyes-García, C.A.; Torres-García, A.A. Transfer learning in imagined speech EEG-based BCIs. Biomed. Signal Process. Control 2019, 50, 151–157.
- Lee, S.H.; Lee, M.; Jeong, J.H.; Lee, S.W. Towards an EEG-based intuitive BCI communication system using imagined speech and visual imagery. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, Bari, Italy, 6–9 October 2019; Institute of Electrical and Electronics Engineers Inc.: Piscataway, NJ, USA, 2019; pp. 4409–4414.
- Saha, P.; Abdul-Mageed, M.; Fels, S. Deep Learning the EEG Manifold for Phonological Categorization from Active Thoughts. In Proceedings of the ICASSP 2019-2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Brighton, UK, 12–17 May 2019.
- Lara, E.; Rodríguez, J.; Takacs, A. EEGIS—Electroencephalogram Imagined Speech Dataset. Mendeley Data 2024.
- Mukherjee, S.P.; Banerjee, P.K.; Misra, B.P. Measures of central tendency: The mean. J. Pharmacol. Pharmacother. 2011, 2, 140–142.
- Livingston, E.H. The mean and standard deviation: What does it all mean? J. Surg. Res. 2004, 119, 117–123.
- Abdi, H. Coefficient of variation. Encycl. Res. Des. 2010, 1, 169–171.
- Edelbaum, T.N. Theory of maxima and minima. In Mathematics in Science and Engineering; Elsevier: Amsterdam, The Netherlands, 1962; Volume 5, pp. 1–32.
- Altman, D.G.; Bland, J.M. Statistics notes: Quartiles, quintiles, centiles, and other quantiles. BMJ 1994, 309, 996.
- Hu, K.; Ivanov, P.C.; Chen, Z.; Carpena, P.; Stanley, H.E. Effect of trends on detrended fluctuation analysis. Phys. Rev. E 2001, 64, 011114.
- Hjorth, B. EEG analysis based on time domain properties. Electroencephalogr. Clin. Neurophysiol. 1970, 29, 306–310.
- Bandt, C.; Pompe, B. Permutation Entropy: A Natural Complexity Measure for Time Series. Phys. Rev. Lett. 2002, 88, 174102.
- Pincus, S.M. Approximate entropy as a measure of system complexity. Proc. Natl. Acad. Sci. USA 1991, 88, 2297–2301.
- Gibson, J. What is the interpretation of spectral entropy? In Proceedings of the 1994 IEEE International Symposium on Information Theory, Trondheim, Norway, 27 June–1 July 1994; p. 440.
- Higuchi, T. Approach to an irregular time series on the basis of the fractal theory. Physica 1988, 31, 277–283.
- Youngworth, R.N.; Gallagher, B.B.; Stamper, B.L. An overview of power spectral density (PSD) calculations. Opt. Manuf. Test. VI 2005, 5869, 206–216.
- Massar, M.L.; Fickus, M.; Bryan, E.; Petkie, D.T.; Terzuoli, A.J. Fast computation of spectral centroids. Adv. Comput. Math. 2011, 35, 83–97.
- Webber, C.; Zbilut, J. Recurrence Quantification Analysis of Nonlinear Dynamical Systems. In Nonlinear Dynamics and Time Series; Springer: Berlin/Heidelberg, Germany, 2005.
- Schwender, D.; Daunderer, M.; Mulzer, S.; Klasing, S.; Finsterer, U.; Peter, K. Spectral edge frequency of the electroencephalogram to monitor depth of anaesthesia. Br. J. Anaesth. 1996, 77, 179–184.
- Mandelbrot, B.B.; Wallis, J.R. Robustness of the rescaled range R/S in the measurement of noncyclic long run statistical dependence. Water Resour. Res. 1969, 5, 967–988.
- Liu, H.; Li, Z.; Zhang, J. Singular Value Decomposition Entropy and Its Application to Time-Series Analysis. Remote Sens. 2022, 14, 5983.
- Petrosian, A. Kolmogorov complexity of finite sequences and recognition of different preictal EEG patterns. In Proceedings of the IEEE Symposium on Computer-Based Medical Systems, Lubbock, TX, USA, 9–10 June 1995; pp. 212–217.
- Katz, M.J. Fractals and the analysis of waveforms. Comput. Biol. Med. 1988, 18, 145–156.
- Torrente, F.A.; Cortés, J.M.; Nunez, P.; Gaitán, E. Relative Band Power Estimation of Brain Waves Using EEG Signals: An Overview and a New Method. Int. J. Environ. Res. Public Health 2023, 20, 1447.
- Baccigalupi, A.; Liccardo, A. The Huang Hilbert Transform for evaluating the instantaneous frequency evolution of transient signals in non-linear systems. Measurement 2016, 86, 1–13.
Band | Lower Cut-Off Frequency | Upper Cut-Off Frequency |
---|---|---|
Delta | 0.5 Hz | 4 Hz |
Theta | 4 Hz | 8 Hz |
Alpha | 8 Hz | 13 Hz |
Beta | 13 Hz | 30 Hz |
Gamma | 30 Hz | 40 Hz |
Feature | Raw Signal | Delta | Theta | Alpha | Beta | Gamma |
---|---|---|---|---|---|---|
Mean | X | X | X | |||
Standard deviation | X | X | X | |||
Coefficient variation | X | X | ||||
Median | X | X | X | |||
Mode | X | X | X | |||
Max | X | X | X | X | ||
Min | X | X | ||||
First quartile | X | X | ||||
Third quartile | X | X | ||||
Interquartile range | X | X | X | X | ||
Kurtosis | ||||||
Skewness | X | X | X | X | X | |
Detrended fluctuation analysis | X | X | X | X | ||
Activity Hjorth param | X | X | X | X | ||
Mobility Hjorth param | X | X | X | |||
Complexity Hjorth param | X | X | X | X | ||
Permutation entropy | X | X | X | X | X | X |
Approximate entropy | X | X | X | |||
Spectral entropy | X | X | X | |||
Higuchi fractal dimension | X | X | ||||
Total power spectral density | X | X | X | |||
Centroid power spectral density | X | X | X | X | ||
Determinism | X | |||||
Trapping time | X | X | ||||
Diagonal line entropy | X | X | X | X | ||
Average diagonal line length | X | X | ||||
Recurrence rate | X | X | ||||
Spectral edge frequency 25 | X | |||||
Spectral edge frequency 50 | X | X | X | X | ||
Spectral edge frequency 75 | X | X | X | X | ||
Hurst exponent | X | |||||
Singular value decomposition entropy | X | X |
Petrosian fractal dimension | X | X | ||||
Katz fractal dimension | X | X | X | |||
Relative band power | X | |||||
Band amplitude | X | X | X | X | X | |
Total features activated | 15 | 23 | 14 | 16 | 20 | 15 |
Classifier | Electrodes (Max: 14) | Features (Max: 214) | Recall | F1-Score | Precision | Accuracy |
---|---|---|---|---|---|---|
kNN | 8 | 103 | 0.97 | 0.94 | 0.95 | 0.96 |
LogReg | 5 | 107 | 0.87 | 0.85 | 0.85 | 0.87 |
RF | 7 | 106 | 0.75 | 0.76 | 0.84 | 0.80 |
Decision Tree | 8 | 101 | 0.82 | 0.81 | 0.81 | 0.82 |
GBM | 6 | 99 | 0.81 | 0.82 | 0.87 | 0.84 |
AdaBoost | 5 | 107 | 0.80 | 0.77 | 0.78 | 0.80 |
Naïve Bayes | 7 | 101 | 0.87 | 0.81 | 0.82 | 0.84 |
LDA | 9 | 106 | 0.88 | 0.84 | 0.85 | 0.87 |
MLP | 9 | 112 | 0.78 | 0.78 | 0.79 | 0.80 |
SVM | 5 | 110 | 0.89 | 0.76 | 0.78 | 0.82 |
Article | Dataset (Words) | Processing | Classifier | Accuracy |
---|---|---|---|---|
Our work | /Sí/, /No/ | Feature Extraction, Genetic Algorithm | kNN | 96% |
[8] | /a/,/e/,/i/,/o/,/u/ | ICA | CNN+TL | 35.7% |
[9] | /a/,/e/,/i/,/o/,/u/ | Statistical characteristics | ELM, LDA, SVM | 87% |
[10] | /a/,/e/,/i/,/o/,/u/ | Statistical characteristics | SVM, DT, LDA, QDA, PCA | 79.7% |
[11] | /Yes/,/No/,/Haan/,/Na/ | Digital filters | ANN, SVM, RF, ADA | 73.4% |
[12] | /Yes/,/No/ | ICA, digital filters and DWT | LDA, SVM, kNN, NB, MLP | 63.16% |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Lara-Arellano, E.; Takacs, A.; Tovar-Arriaga, S.; Rodríguez-Reséndiz, J. Feature Generation with Genetic Algorithms for Imagined Speech Electroencephalogram Signal Classification. Eng 2025, 6, 75. https://doi.org/10.3390/eng6040075