Using an Opportunity Matrix to Select Centers for RBF Neural Networks
Abstract
1. Introduction
1.1. Overview of RBF Neural Networks
1.2. Selecting Widths and Centers for RBF Nodes
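For context, a standard RBF regression network of the kind discussed in this section places a Gaussian node at each selected center, assigns each node a width, and then solves the output-layer weights in closed form by least squares (equivalently, via the Moore-Penrose pseudoinverse). The minimal sketch below illustrates only that generic pipeline; the function names and the shared-width simplification are illustrative assumptions, and the sketch does not reproduce the opportunity-matrix method proposed in this paper.

```python
import numpy as np

def gaussian_rbf(X, centers, width):
    """Hidden-layer activations: one Gaussian RBF node per center."""
    # Squared Euclidean distance between every sample and every center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf_network(X, y, centers, width):
    """Solve the output-layer weights in closed form (least squares /
    pseudoinverse), given fixed centers and a shared width (illustrative)."""
    Phi = gaussian_rbf(X, centers, width)
    Phi = np.hstack([Phi, np.ones((Phi.shape[0], 1))])  # bias column
    weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return weights

def predict_rbf_network(X, centers, width, weights):
    """Forward pass of the fitted RBF network."""
    Phi = gaussian_rbf(X, centers, width)
    Phi = np.hstack([Phi, np.ones((Phi.shape[0], 1))])
    return Phi @ weights
```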
1.3. Research Goal
2. Materials and Methods
2.1. The Opportunity Matrix
2.2. The Opportunity Matrix RBF Center Selection Algorithm
Algorithm 1: Using an opportunity matrix to select centers for an RBF neural network.
2.3. Evaluative Experiments
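The evaluative experiments compare the opportunity-matrix approach against two common baselines for selecting RBF centers: random sampling from the training data and k-means clustering (see the tables in Section 3). The sketch below shows one conventional way such baselines are implemented; the helper names are illustrative assumptions, and details such as data scaling or the exact k-means settings used in the experiments are not reproduced here.

```python
import numpy as np
from scipy.cluster.vq import kmeans  # SciPy's k-means implementation

def centers_by_random_sampling(X, n_centers, seed=None):
    """Baseline 1: draw centers uniformly at random, without replacement,
    from the training observations (X is an array of shape (n, d))."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=n_centers, replace=False)
    return X[idx]

def centers_by_kmeans(X, n_centers, seed=0):
    """Baseline 2: use k-means cluster centroids as the RBF centers."""
    codebook, _distortion = kmeans(X.astype(float), n_centers, seed=seed)
    return codebook
```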
3. Results and Discussion
4. Summary, Limitations, and Concluding Remarks
4.1. Summary of Contributions and Findings
4.2. Limitations and Opportunities for Future Research
4.3. Concluding Remarks
Funding
Data Availability Statement
Conflicts of Interest
References
- Cybenko, G. Approximation by Superpositions of a Sigmoidal Function. Math. Control Signals Syst. 1989, 2, 303–314.
- Anderson, J.A. An Introduction to Neural Networks; MIT Press: Cambridge, MA, USA, 1998.
- Soper, D.S. Hyperparameter Optimization using Successive Halving with Greedy Cross Validation. Algorithms 2022, 16, 17.
- Soper, D.S. Greed is Good: Rapid Hyperparameter Optimization and Model Selection using Greedy k-Fold Cross Validation. Electronics 2021, 10, 1973.
- Broomhead, D.; Lowe, D. Multivariable Functional Interpolation and Adaptive Networks. Complex Syst. 1988, 2, 321–355.
- Lagaris, I.E.; Likas, A.C.; Papageorgiou, D.G. Neural-Network Methods for Boundary Value Problems with Irregular Boundaries. IEEE Trans. Neural Netw. 2000, 11, 1041–1049.
- Yang, F.; Paindavoine, M. Implementation of an RBF Neural Network on Embedded Systems: Real-Time Face Tracking and Identity Verification. IEEE Trans. Neural Netw. 2003, 14, 1162–1175.
- Cho, S.-Y.; Chow, T.W. Neural Computation Approach for Developing a 3D Shape Reconstruction Model. IEEE Trans. Neural Netw. 2001, 12, 1204–1214.
- Jianping, D.; Sundararajan, N.; Saratchandran, P. Communication Channel Equalization Using Complex-Valued Minimal Radial Basis Function Neural Networks. IEEE Trans. Neural Netw. 2002, 13, 687–696.
- Wu, Y.; Wang, H.; Zhang, B.; Du, K.-L. Using Radial Basis Function Networks for Function Approximation and Classification. Int. Sch. Res. Not. 2012, 2012, 34.
- Poggio, T.; Girosi, F. Networks for Approximation and Learning. Proc. IEEE 1990, 78, 1481–1497.
- Ibrikci, T.; Brandt, M.E.; Wang, G.; Acikkar, M. Mahalanobis Distance with Radial Basis Function Network on Protein Secondary Structures. In Proceedings of the Second Joint 24th Annual Conference and the Annual Fall Meeting of the Biomedical Engineering Society, Houston, TX, USA, 23–26 October 2002; pp. 2184–2185.
- Schwenker, F.; Kestler, H.A.; Palm, G. Three Learning Phases for Radial-Basis-Function Networks. Neural Netw. 2001, 14, 439–458.
- Ben-Israel, A.; Greville, T.N. Generalized Inverses: Theory and Applications, 2nd ed.; Springer: New York, NY, USA, 2003.
- Deisenroth, M.P. Mathematics for Machine Learning; Cambridge University Press: Cambridge, UK, 2020.
- Moody, J.; Darken, C.J. Fast Learning in Networks of Locally Tuned Processing Units. Neural Comput. 1989, 1, 281–294.
- Kosko, B. Neural Networks for Signal Processing; Prentice Hall: Englewood Cliffs, NJ, USA, 1992.
- Park, J.; Sandberg, I.W. Universal Approximation using Radial-Basis-Function Networks. Neural Comput. 1991, 3, 246–257.
- Panchapakesan, C.; Palaniswami, M.; Ralph, D.; Manzie, C. Effects of Moving the Centers in an RBF Network. IEEE Trans. Neural Netw. 2002, 13, 1299–1307.
- Jaynes, E.T. Probability Theory: The Logic of Science; Cambridge University Press: Cambridge, UK, 2003.
- Du, K.-L. Clustering: A Neural Network Approach. Neural Netw. 2010, 23, 89–107.
- Du, K.-L.; Swamy, M.N. Neural Networks in a Softcomputing Framework; Springer: London, UK, 2006.
- Särndal, C.-E.; Swensson, B.; Wretman, J. Model Assisted Survey Sampling; Springer: New York, NY, USA, 2003.
- Himmelblau, D.M. Applied Nonlinear Programming; McGraw-Hill: New York, NY, USA, 1972.
- Jamil, M.; Yang, X.-S. A Literature Survey of Benchmark Functions for Global Optimisation Problems. Int. J. Math. Model. Numer. Optim. 2013, 4, 150–194.
- Haight, F.A. Handbook of the Poisson Distribution; John Wiley & Sons: New York, NY, USA, 1967.
- Student. The Probable Error of a Mean. Biometrika 1908, 6, 1–25.
- VE, S.; Shin, C.; Cho, Y. Efficient Energy Consumption Prediction Model for a Data Analytic-Enabled Industry Building in a Smart City. Build. Res. Inf. 2021, 49, 127–143.
- Kelly, M.; Longjohn, R.; Nottingham, K. The UCI Machine Learning Repository; University of California, Irvine: Irvine, CA, USA, 2023.
- Virtanen, P.; Gommers, R.; Oliphant, T.E.; Haberland, M.; Reddy, T.; Cournapeau, D.; Burovski, E.; Peterson, P.; Weckesser, W.; Bright, J.; et al. SciPy 1.0: Fundamental Algorithms for Scientific Computing in Python. Nat. Methods 2020, 17, 261–272.
- Abramowitz, M.; Stegun, I.A.; Romer, R.H. Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables; Wiley: New York, NY, USA, 1972.
- Wasserman, L. All of Statistics: A Concise Course in Statistical Inference; Springer: New York, NY, USA, 2013.
- Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. arXiv 2014, arXiv:1412.6980.
- Google. Google Colaboratory; Alphabet, Inc.: Mountain View, CA, USA, 2023.
- Welch, B.L. The Generalization of “Student’s” Problem When Several Different Population Variances are Involved. Biometrika 1947, 34, 28–35.
- Aloise, D.; Deshpande, A.; Hansen, P.; Popat, P. NP-Hardness of Euclidean Sum-of-Squares Clustering. Mach. Learn. 2009, 75, 245–248.
- Qiao, J.-F.; Meng, X.; Li, W.-J.; Wilamowski, B.M. A Novel Modular RBF Neural Network Based on a Brain-Like Partition Method. Neural Comput. Appl. 2020, 32, 899–911.
Average Observed Mean Absolute Error across 30 Trials:

| Method of Selecting Centers for RBF Nodes | 16 RBF Nodes | 32 RBF Nodes | 64 RBF Nodes | 128 RBF Nodes | 256 RBF Nodes |
|---|---|---|---|---|---|
| Random Sampling | 15.11282 | 4.99835 | 1.55582 | 0.78069 | 0.47679 |
| k-Means Clustering | 14.75855 | 4.64292 | 1.77474 | 0.70187 | 0.53191 |
| Opportunity Matrix | 16.09886 | 4.13144 | 1.16665 | 0.53522 | 0.29559 |

t-Values and Significances for Welch’s t-Tests:

| Method Comparison | 16 RBF Nodes | 32 RBF Nodes | 64 RBF Nodes | 128 RBF Nodes | 256 RBF Nodes |
|---|---|---|---|---|---|
| Opportunity Matrix vs. Random Sampling | 1.753 | −2.598 * | −3.930 *** | −5.179 *** | −3.828 *** |
| Opportunity Matrix vs. k-Means Clustering | 2.322 † | −1.679 | −5.303 *** | −3.596 *** | −4.005 *** |
Average Observed Mean Absolute Error across 30 Trials:

| Method of Selecting Centers for RBF Nodes | 16 RBF Nodes | 32 RBF Nodes | 64 RBF Nodes | 128 RBF Nodes | 256 RBF Nodes |
|---|---|---|---|---|---|
| Random Sampling | 0.00988 | 0.00305 | 0.00110 | 0.00059 | 0.00055 |
| k-Means Clustering | 0.01013 | 0.00285 | 0.00104 | 0.00054 | 0.00052 |
| Opportunity Matrix | 0.00592 | 0.00138 | 0.00067 | 0.00046 | 0.00035 |

t-Values and Significances for Welch’s t-Tests:

| Method Comparison | 16 RBF Nodes | 32 RBF Nodes | 64 RBF Nodes | 128 RBF Nodes | 256 RBF Nodes |
|---|---|---|---|---|---|
| Opportunity Matrix vs. Random Sampling | −6.657 *** | −10.463 *** | −10.064 *** | −3.027 ** | −4.203 *** |
| Opportunity Matrix vs. k-Means Clustering | −8.264 *** | −10.081 *** | −7.058 *** | −2.245 * | −3.726 *** |
Average Observed Mean Absolute Error across 30 Trials:

| Method of Selecting Centers for RBF Nodes | 16 RBF Nodes | 32 RBF Nodes | 64 RBF Nodes | 128 RBF Nodes | 256 RBF Nodes |
|---|---|---|---|---|---|
| Random Sampling | 0.00423 | 0.00068 | 0.00029 | 0.00022 | 0.00025 |
| k-Means Clustering | 0.00379 | 0.00062 | 0.00025 | 0.00021 | 0.00022 |
| Opportunity Matrix | 0.00176 | 0.00032 | 0.00021 | 0.00019 | 0.00020 |

t-Values and Significances for Welch’s t-Tests:

| Method Comparison | 16 RBF Nodes | 32 RBF Nodes | 64 RBF Nodes | 128 RBF Nodes | 256 RBF Nodes |
|---|---|---|---|---|---|
| Opportunity Matrix vs. Random Sampling | −9.190 *** | −6.932 *** | −5.547 *** | −2.723 ** | −4.023 *** |
| Opportunity Matrix vs. k-Means Clustering | −8.944 *** | −5.035 *** | −2.897 ** | −1.953 | −1.550 |
Average Observed Mean Absolute Error across 30 Trials:

| Method of Selecting Centers for RBF Nodes | 16 RBF Nodes | 32 RBF Nodes | 64 RBF Nodes | 128 RBF Nodes | 256 RBF Nodes |
|---|---|---|---|---|---|
| Random Sampling | 0.00349 | 0.00194 | 0.00194 | 0.00132 | 0.00154 |
| k-Means Clustering | 0.00297 | 0.00217 | 0.00183 | 0.00129 | 0.00175 |
| Opportunity Matrix | 0.00184 | 0.00135 | 0.00091 | 0.00063 | 0.00037 |

t-Values and Significances for Welch’s t-Tests:

| Method Comparison | 16 RBF Nodes | 32 RBF Nodes | 64 RBF Nodes | 128 RBF Nodes | 256 RBF Nodes |
|---|---|---|---|---|---|
| Opportunity Matrix vs. Random Sampling | −4.313 *** | −2.917 ** | −4.600 *** | −5.576 *** | −12.497 *** |
| Opportunity Matrix vs. k-Means Clustering | −3.406 ** | −2.849 ** | −4.785 *** | −5.552 *** | −15.383 *** |
Average Observed Mean Absolute Error across 30 Trials:

| Method of Selecting Centers for RBF Nodes | 16 RBF Nodes | 32 RBF Nodes | 64 RBF Nodes | 128 RBF Nodes | 256 RBF Nodes |
|---|---|---|---|---|---|
| Random Sampling | 2.56237 | 2.49915 | 2.43682 | 2.26433 | 1.94755 |
| k-Means Clustering | 2.62083 | 2.52551 | 2.47267 | 2.42276 | 2.03874 |
| Opportunity Matrix | 2.38783 | 2.35383 | 2.27311 | 1.87854 | 1.34394 |

t-Values and Significances for Welch’s t-Tests:

| Method Comparison | 16 RBF Nodes | 32 RBF Nodes | 64 RBF Nodes | 128 RBF Nodes | 256 RBF Nodes |
|---|---|---|---|---|---|
| Opportunity Matrix vs. Random Sampling | −2.264 * | −2.685 ** | −4.980 *** | −7.237 *** | −20.783 *** |
| Opportunity Matrix vs. k-Means Clustering | −3.079 ** | −3.778 *** | −7.162 *** | −15.374 *** | −25.258 *** |
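The significance values reported in the tables above come from Welch’s t-tests, which compare the 30 per-trial mean absolute errors of two center-selection methods without assuming equal variances. A minimal sketch of how such a test can be run with SciPy follows; the error arrays here are synthetic placeholders rather than the study’s actual measurements.

```python
import numpy as np
from scipy import stats

# Placeholder per-trial mean absolute errors for two center-selection
# methods (30 trials each); in practice these come from the experiments.
rng = np.random.default_rng(0)
errors_opportunity_matrix = rng.normal(loc=0.30, scale=0.05, size=30)
errors_random_sampling = rng.normal(loc=0.48, scale=0.08, size=30)

# Welch's t-test: equal_var=False relaxes the equal-variance assumption.
t_stat, p_value = stats.ttest_ind(
    errors_opportunity_matrix, errors_random_sampling, equal_var=False
)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```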
Average Peak Memory Requirement (MB):

| Method of Selecting Centers for RBF Nodes | 16 RBF Nodes | 32 RBF Nodes | 64 RBF Nodes | 128 RBF Nodes | 256 RBF Nodes |
|---|---|---|---|---|---|
| Random Sampling | 604.41 | 604.61 | 604.61 | 604.63 | 604.66 |
| k-Means Clustering | 683.20 | 695.75 | 714.12 | 714.52 | 727.37 |
| Opportunity Matrix | 601.33 | 610.98 | 616.45 | 621.48 | 628.59 |

Average Wall-Clock Time (Seconds):

| Method of Selecting Centers for RBF Nodes | 16 RBF Nodes | 32 RBF Nodes | 64 RBF Nodes | 128 RBF Nodes | 256 RBF Nodes |
|---|---|---|---|---|---|
| Random Sampling | <0.01 | <0.01 | <0.01 | <0.01 | <0.01 |
| k-Means Clustering | 2.53 | 5.83 | 11.27 | 23.31 | 55.74 |
| Opportunity Matrix | 534.05 | 1100.89 | 2236.67 | 4665.81 | 9720.67 |
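The resource-usage figures above report average peak memory and wall-clock time for each center-selection method. The paper’s exact instrumentation is not shown here; the sketch below illustrates one common way to capture both quantities around a center-selection call using Python’s standard library, with `select_centers` as a hypothetical placeholder for the method being measured.

```python
import time
import tracemalloc

def measure_resources(select_centers, *args, **kwargs):
    """Return (result, peak_memory_mb, elapsed_seconds) for a callable.

    Note: tracemalloc tracks Python-level allocations only, so the figures
    may differ from OS-level peak-RSS measurements.
    """
    tracemalloc.start()
    start = time.perf_counter()
    result = select_centers(*args, **kwargs)
    elapsed = time.perf_counter() - start
    _current, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, peak / (1024 ** 2), elapsed
```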
© 2023 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).