A Comparative Study for Stock Market Forecast Based on a New Machine Learning Model
Abstract
1. Introduction
2. Background
2.1. Literature Review
2.2. Artificial Hydrocarbon Networks
- Least-squares regression (LSR) to define the structure of each molecule.
- Gradient descent (GD) to optimize the position and number of molecules in the feature space; a sketch of both steps follows this list.
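To make these two mechanisms concrete, the following is a minimal sketch (not the authors' implementation) of the idea in one dimension: least squares fits each molecule's polynomial coefficients, while gradient descent adjusts the boundary between two neighboring molecules. The helper names, the polynomial degree, and the numeric-gradient step size are illustrative assumptions.

```python
import numpy as np

def fit_molecule(x, y, degree=3):
    """LSR step: least-squares fit of one molecule's polynomial coefficients."""
    return np.polyfit(x, y, degree)

def molecule_error(x, y, coeffs):
    """Sum of squared residuals of a fitted molecule."""
    return np.sum((y - np.polyval(coeffs, x)) ** 2)

def optimize_split(x, y, split, lr=0.005, steps=50, eps=0.01):
    """GD step: move the boundary between two molecules downhill on the
    total fitting error, using a numerical gradient estimate."""
    def total_error(s):
        err = 0.0
        for mask in (x <= s, x > s):
            if mask.sum() > 3:  # enough points for a cubic fit
                err += molecule_error(x[mask], y[mask], fit_molecule(x[mask], y[mask]))
        return err

    for _ in range(steps):
        grad = (total_error(split + eps) - total_error(split - eps)) / (2 * eps)
        split = np.clip(split - lr * grad, 0.05, 0.95)
    return split

# Toy usage: two regimes joined near x = 0.5
x = np.linspace(0, 1, 200)
y = np.where(x < 0.5, np.sin(8 * x), 1.5 - x) + 0.02 * np.random.randn(200)
print("optimized split:", optimize_split(x, y, split=0.4))
```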
2.2.1. Formerly Reported Comparison and Implementations of AHN
- Online sales prediction: the AHN algorithm has been applied to forecasting online retail sales, using a simple AHN topology featuring a linear and a saturated compound. The implementation was compared against other well-established learning methods, including cubic splines (CSs), model trees (MTs), random forest (RF), linear regression (LR), Bayesian regularized neural networks (BNs) and support vector machines with a radial basis function kernel (SVM), among others. Model accuracy was measured with the root-mean-squared error (RMSE), and the results revealed that AHN outperformed all the other models in this context [16].
- Forecast of currency exchange rates: the effectiveness of the AHN model was assessed in forecasting the exchange rates of the BRICS currencies to the USD, focusing specifically on the Brazilian Real (BRL/USD). In the experiments, the forecast closely tracked the observed series, with an error rate of 0.0102 [17].
- Mechanical fault diagnosis: a double-optimized Artificial Hydrocarbon Network was employed in an intelligent diagnosis system to identify mechanical faults in the in-wheel motor (IWM). The implementation was validated across multiple rotating speeds and load conditions and compared against other methods, including support vector machines (SVMs) and a particle swarm optimization-based SVM (PSO-SVM). The double-optimized AHN exhibited the best performance, achieving a diagnosis accuracy above 80% [18].
2.2.2. Artificial Halocarbon Compounds Approach
2.2.3. AHC Algorithm Implementation
Algorithm 1 AHC Algorithm: implementation of the Artificial Halocarbon Compounds using the AHC algorithm.
Input: the system, the maximum number of molecules, the tolerance value and the regularization factor.
Output: the structure of the compound C and the type of halogenation for each molecule in C. The coefficients are included within the structure C.
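Algorithm 1's control flow amounts to growing the compound until the fit is good enough. The sketch below captures only that loop; `fit_molecules`, `halogenate` and `compound_error` are hypothetical callbacks standing in for the operators of Algorithm 1 (the actual operators are given in the paper and the accompanying repository [25]).

```python
def ahc_train(X, y, max_molecules, tolerance, reg_factor,
              fit_molecules, halogenate, compound_error):
    """Control-flow sketch of Algorithm 1: grow the compound until the
    training error falls below `tolerance` or `max_molecules` is reached.
    The three callbacks are hypothetical stand-ins for the AHC operators."""
    n = 2  # start from the smallest compound
    compound = halogenate(fit_molecules(X, y, n, reg_factor), X, y)
    while compound_error(compound, X, y) > tolerance and n < max_molecules:
        n += 1  # add a molecule and refit the whole structure
        compound = halogenate(fit_molecules(X, y, n, reg_factor), X, y)
    return compound  # structure C with coefficients and halogenation types
```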
2.3. Genetic Algorithms
3. Methodology
3.1. Data
1. For each input, we applied an approximation using least-squares polynomial (LSP) regression; in this regard, the macroeconomic variables (MEVs) are treated as “continuous signals” instead of discrete information.
2. The data were standardized by removing the mean so that all variables share a common scale.
3. We used principal component analysis (PCA) to reduce the dimensionality of the data, keeping three principal components (PCs); a sketch of the full pipeline is given after this list.
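The sketch below implements the three steps with NumPy and scikit-learn; the polynomial degree is an assumed value, since it is not restated here, and the function is illustrative rather than the project's actual preprocessing code.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def preprocess(mev: pd.DataFrame, degree: int = 5, n_components: int = 3) -> np.ndarray:
    """Apply the three preprocessing steps to a table of macroeconomic variables."""
    t = np.arange(len(mev))
    # 1. LSP regression: treat each MEV as a continuous signal.
    smoothed = np.column_stack(
        [np.polyval(np.polyfit(t, mev[col].to_numpy(), degree), t) for col in mev.columns]
    )
    # 2. Standardize: remove the mean (and scale to unit variance).
    scaled = StandardScaler().fit_transform(smoothed)
    # 3. PCA: keep three principal components.
    return PCA(n_components=n_components).fit_transform(scaled)
```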
3.2. Forecast of the Stock Market Indices
4. Experimental Setup
4.1. AHC Parameter Tuning
4.2. GA Parameter Tuning
5. Results and Analysis
5.1. AHC Forecast
1. Figure 3 compares the original values of the IPC from the testing set (displayed in blue) with the forecast values (displayed in red). The graph shows that the forecast obtained from the AHC algorithm replicates the behavior of the original IPC very well.
2. Figure 4 shows the residuals of the model. The residuals display a satisfactorily homogeneous distribution, reinforcing the claim that the model is behaving well.
3. Figure 5 illustrates the behavior of the relative error together with its mean and SD, showing that the results on the test data keep a low error rate and low noise or residual variation; these statistics can be reproduced as sketched below.
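The sketch below computes these residual and relative-error statistics for a generic forecast; it is illustrative rather than the evaluation script itself, and the MAD is computed here as the median absolute deviation, which may differ from the paper's exact definition.

```python
import numpy as np

def error_report(actual: np.ndarray, forecast: np.ndarray) -> dict:
    """Residuals and relative-error statistics as discussed in Section 5.1."""
    residuals = actual - forecast
    rel_err = np.abs(residuals) / np.abs(actual)  # relative error per observation
    return {
        "mean": rel_err.mean(),
        "median": np.median(rel_err),
        "sd": rel_err.std(ddof=1),                # sample standard deviation
        "mad": np.median(np.abs(rel_err - np.median(rel_err))),
        "max": rel_err.max(),
        "min": rel_err.min(),
        "range": rel_err.max() - rel_err.min(),
    }
```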
5.2. Model Comparison with GA
- It has the capacity to perform a global search, since this method can explore the entire search space and can find global optima in complex spaces.
- It can balance exploration and exploitation, searching new areas of the solution space while also focusing on promising regions.
- Its stochastic characteristic allows it to escape local optima.
- It is computationally intensive: for complex problems and large solution spaces, it can require a significant amount of computational resources and time.
- It can converge prematurely to suboptimal solutions.
- Its performance is sensitive to the choice of parameters, making it susceptible to improper tuning, and optimal parameter tuning can be challenging.
- Due to its stochastic nature, the results can be more susceptible to white noise.
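For context, a minimal real-coded GA using the tuned values of Section 4.2 (population 650, 25 generations, mutation probability 0.25, genetic-operator probability 0.1, read here as a crossover rate) might look as follows. The chromosome length of 13 mirrors the computed GA model below; the fitness function, bounds and operator details are illustrative assumptions, not the study's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def run_ga(fitness, n_genes=13, pop_size=650, generations=25,
           p_mutation=0.25, p_crossover=0.1, bounds=(-12.0, 12.0)):
    """Generic GA sketch: tournament selection, one-point crossover,
    Gaussian mutation; maximizes `fitness` over real-valued chromosomes."""
    low, high = bounds
    pop = rng.uniform(low, high, size=(pop_size, n_genes))
    for _ in range(generations):
        scores = np.apply_along_axis(fitness, 1, pop)
        # Tournament selection: winners of random pairwise match-ups survive.
        i, j = rng.integers(pop_size, size=(2, pop_size))
        pop = np.where((scores[i] > scores[j])[:, None], pop[i], pop[j])
        # One-point crossover, applied with the genetic-operator probability.
        mate = rng.permutation(pop_size)
        cut = rng.integers(1, n_genes, size=pop_size)
        for k in np.flatnonzero(rng.random(pop_size) < p_crossover):
            pop[k, cut[k]:] = pop[mate[k], cut[k]:]
        # Gaussian mutation with the tuned mutation probability.
        mask = rng.random(pop.shape) < p_mutation
        pop = np.clip(pop + mask * rng.normal(0.0, 0.5, pop.shape), low, high)
    return pop[np.apply_along_axis(fitness, 1, pop).argmax()]

# Toy usage: recover a hidden target vector.
target = rng.uniform(-5, 5, 13)
best = run_ga(lambda c: -np.sum((c - target) ** 2))
```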
5.3. Cross-Reference Comparison
5.4. Complementary Analysis
6. Conclusions and Future Work
- Improving financial forecasts. Given that our results are evaluated with an out-of-sample forecast, switching to a one-day-ahead forecast could improve the performance of the predictions.
- Extending the comparison with other state-of-the-art methods. An extensive assessment of the performance of the new AHC algorithm against other techniques, such as random forest, neural networks, multilayer perceptrons, long short-term memory neural networks and genetic programming, can be carried out.
- Exploring other types of substitutions for the AHC halogenations. Specific kinds of polynomial expressions are used to produce the halogenations for the AHC algorithm; these expressions were chosen based on empirical reasons, leaving space to explore other types of substitutions to yield the halogenations while forming the compounds.
- Increasing and diversifying the application of the AHC algorithm to other fields. One immediate natural application is electricity load forecasting, considering that this task is also based on time series prediction [27]. Another usage that has gained importance in recent years due to its relevance in the medical field is image and pattern recognition, such as cancer detection or kidney stone identification [28]. Further applications where the original AHN algorithm proved to be efficient can be tested, such as signal processing, facial recognition, motor control and intelligent control systems for robotics, among many other possibilities.
- Extending the analysis of the results. An exhaustive examination of the results can be undertaken regarding more specific aspects, such as model robustness, variation in the results over time and consistency across countries.
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
1. Elliott, G.; Timmermann, A. Handbook of Economic Forecasting; Elsevier: Oxford, UK, 2013; Volume 1.
2. González, E.; Trejo, L.A. Artificial Organic Networks Approach Applied to the Index Tracking Problem. In Advances in Computational Intelligence, Proceedings of the 20th Mexican International Conference on Artificial Intelligence, MICAI 2021, Mexico City, Mexico, 25–30 October 2021; LNCS (LNAI); Springer: Cham, Switzerland, 2021.
3. Salman, O.; Melissourgos, T.; Kampouridis, M. Optimization of Trading Strategies Using a Genetic Algorithm under the Directional Changes Paradigm with Multiple Thresholds. In Proceedings of the 2022 IEEE Congress on Evolutionary Computation (CEC), Padua, Italy, 18–23 July 2022.
4. Salman, O.; Kampouridis, M.; Jarchi, D. Trading Strategies Optimization by Genetic Algorithm under the Directional Changes Paradigm. In Proceedings of the 2022 IEEE Congress on Evolutionary Computation (CEC), Padua, Italy, 18–23 July 2022.
5. Ayyıldız, N. Predicting Stock Market Index Movements With Machine Learning; Ozgur Press: Şehitkamil/Gaziantep, Turkey, 2023.
6. Saboor, A.; Hussain, A.; Agbley, B.L.Y.; ul Haq, A.; Ping Li, J.; Kumar, R. Stock Market Index Prediction Using Machine Learning and Deep Learning Techniques. Intell. Autom. Soft Comput. 2023, 37, 1325–1344.
7. Aliyev, F.; Eylasov, N.; Gasim, N. Applying Deep Learning in Forecasting Stock Index: Evidence from RTS Index. In Proceedings of the 2022 IEEE 16th International Conference on Application of Information and Communication Technologies (AICT), Washington, DC, USA, 12–14 October 2022.
8. Ding, Y.; Sun, N.; Xu, J.; Li, P.; Wu, J.; Tang, S. Research on Shanghai Stock Exchange 50 Index Forecast Based on Deep Learning. Math. Probl. Eng. 2022, 2022, 1367920.
9. Haryono, A.T.; Sarno, R.; Sungkono, R. Stock price forecasting in Indonesia stock exchange using deep learning: A comparative study. Int. J. Electr. Comput. Eng. 2024, 14, 861–869.
10. Pokhrel, N.R.; Dahal, K.R.; Rimal, R.; Bhandari, H.N.; Khatri, R.K.C.; Rimal, B.; Hahn, W.E. Predicting NEPSE index price using deep learning models. Mach. Learn. Appl. 2022, 9, 100385.
11. Singh, G. Machine Learning Models in Stock Market Prediction. Int. J. Innov. Technol. Explor. Eng. 2022, 11, 18–28.
12. Harahap, L.A.; Lipikorn, R.; Kitamoto, A. Nikkei Stock Market Price Index Prediction Using Machine Learning. J. Phys. Conf. Ser. 2020, 1566, 012043.
13. Ponce, H. A New Supervised Learning Algorithm Inspired on Chemical Organic Compounds. Ph.D. Thesis, Instituto Tecnológico y de Estudios Superiores de Monterrey, Mexico City, Mexico, 2013.
14. Ponce, H.; Ponce, P.; Molina, A. Artificial Organic Networks: Artificial Intelligence Based on Carbon Networks, 1st ed.; Springer: Berlin/Heidelberg, Germany, 2014.
15. Ponce, H.; Gonzalez, G.; Morales, E.; Souza, P. Development of Fast and Reliable Nature-Inspired Computing for Supervised Learning in High-Dimensional Data. In Nature Inspired Computing for Data Science; Springer: Cham, Switzerland, 2019; pp. 109–138.
16. Ponce, H.; Miralles, L.; Martínez, L. Artificial hydrocarbon networks for online sales prediction. In Advances in Artificial Intelligence and Its Applications, Proceedings of the 14th Mexican International Conference on Artificial Intelligence, MICAI 2015, Cuernavaca, Morelos, Mexico, 25–31 October 2015; Springer: Cham, Switzerland, 2015.
17. Ayala-Solares, J.R.; Ponce, H. Supervised Learning with Artificial Hydrocarbon Networks: An open source implementation and its applications. arXiv 2020, arXiv:2005.10348.
18. Xue, H.; Song, Z.; Wu, M.; Sun, N.; Wang, H. Intelligent Diagnosis Based on Double-Optimized Artificial Hydrocarbon Networks for Mechanical Faults of In-Wheel Motor. Sensors 2022, 22, 6316.
19. Ponce, H.; Acevedo, M. Design and Equilibrium Control of a Force-Balanced One-Leg Mechanism. In Advances in Computational Intelligence, Proceedings of the 17th Mexican International Conference on Artificial Intelligence, MICAI 2018, Guadalajara, Mexico, 22–27 October 2018; Springer: Cham, Switzerland, 2018.
20. Ponce, H.; Acevedo, M.; Morales, E.; Martínez, L.; Díaz, G.; Mayorga, C. Modeling and Control Balance Design for a New Bio-inspired Four-Legged Robot. In Advances in Soft Computing, Proceedings of the 18th Mexican International Conference on Artificial Intelligence, MICAI 2019, Xalapa, Mexico, 27 October–2 November 2019; Springer: Cham, Switzerland, 2019.
21. Ponce, H.; Ponce, P.; Molina, A. Stochastic parallel extreme artificial hydrocarbon networks: An implementation for fast and robust supervised machine learning in high-dimensional data. Eng. Appl. Artif. Intell. 2020, 89, 103427.
22. Ponce, H.; Martínez, L. Interpretability of artificial hydrocarbon networks for breast cancer classification. In Proceedings of the 2017 International Joint Conference on Neural Networks (IJCNN), Anchorage, AK, USA, 14–19 May 2017; IEEE: Piscataway, NJ, USA, 2017.
23. Ponce, H.; Bravo, M. A Novel Design Model Based on Genetic Algorithms. In Proceedings of the 2011 10th Mexican International Conference on Artificial Intelligence, Puebla, Mexico, 26 November–4 December 2011.
24. González, E.; Trejo, L.A. Datasets of Stock Market Indices. 2024. Available online: https://ieee-dataport.org/documents/datasets-stock-market-indices (accessed on 1 March 2024).
25. González, E. AHC Related Code. Available online: https://github.com/egonzaleznez/ahc (accessed on 1 March 2024).
26. Murphy, J.J. Technical Analysis of the Financial Markets; New York Institute of Finance: New York, NY, USA, 1999.
27. Zuniga-Garcia, M.A.; Santamaría, G.; Arroyo, G.; Batres, R. Prediction interval adjustment for load-forecasting using machine learning. Appl. Sci. 2019, 9, 5269.
28. Lopez-Tiro, F.; Flores, D.; Betancur, J.P.; Reyes, I.; Hubert, J.; Ochoa, G.; Daul, C. Boosting Kidney Stone Identification in Endoscopic Images Using Two-Step Transfer Learning. In Advances in Soft Computing, Proceedings of the 22nd Mexican International Conference on Artificial Intelligence, MICAI 2023, Yucatán, Mexico, 13–18 November 2023; LNCS (LNAI); Springer: Cham, Switzerland, 2023.
Method | Computational Complexity | Supervised | Unsupervised | Continuous | Discrete | Linear | Nonlinear | Static | Dynamic | Parametric | Nonparametric | Deterministic | Nondeterministic | Approximation | Classification | Optimization |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
General | ||||||||||||||||
linear regression | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | |||||||
general regression | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ||||||
running mean smoother | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ||||||||
kernel smoother | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ||||||||
decision trees | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | |||||
random forest | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | |||||
naive Bayes classifier | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ||||||||
Bayesian networks | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | |||||
support vector machine | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | |||||
k-nearest neighbor | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | |||||
k-means algorithm | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | |||||||
fuzzy clustering means | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | |||||||
simulated annealing | OP | X | X | NBD | X | X | ✓ | ✓ | ||||||||
Artificial Neural Networks | ||||||||||||||||
backpropagation | TD | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ||||
generalized Hebbian algorithm | TD | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | |||||
Hopfield’s nets | TD | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | |||||
Evolutionary | ||||||||||||||||
genetic algorithms | NBD | X | X | NBD | X | X | ✓ | ✓ | ||||||||
gene expression algorithms | NBD | X | X | NBD | X | X | ✓ | ✓ | ||||||||
Chemically Inspired | ||||||||||||||||
DNA computing | NBD | X | X | NBD | X | X | ✓ | ✓ | ||||||||
artificial hydrocarbon networks | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
Descriptive Statistics

Index | Mean | SD | Min | 25% | 50% | 75% | Max
---|---|---|---|---|---|---|---
IPC | 39,899.97 | 8813.83 | 16,653.15 | 33,262.48 | 41,960.44 | 46,190.08 | 56,609.53 |
S&P 500 | 2175.65 | 1022.05 | 676.53 | 1343.80 | 1963.29 | 2801.97 | 4796.56 |
DAX | 9730.27 | 3228.43 | 3666.40 | 6795.31 | 9662.18 | 12,427.14 | 16,275.37 |
DJIA | 18,905.68 | 7938.82 | 6547.05 | 12,397.85 | 16,837.42 | 25,371.71 | 36,799.65 |
FTSE | 6400.61 | 880.11 | 3512.10 | 5850.83 | 6486.40 | 7129.97 | 8014.31 |
N225 | 17,399.64 | 6332.78 | 7054.97 | 10,965.59 | 16,958.52 | 22,011.19 | 31,328.16 |
NDX | 5332.16 | 4030.66 | 1036.51 | 2084.62 | 4089.62 | 7307.99 | 16,573.33 |
CAC | 4794.71 | 1073.30 | 2519.29 | 3939.81 | 4799.87 | 5501.77 | 7577.00 |
AHC Parameter Tuning

Parameter | Value
---|---
Tolerance |
Maximum number of molecules | 12
Regularization factor |
GA Parameter Tuning

Parameter | Value
---|---
Training size | 0.85
Population size | 650
Mutation probability | 0.25
Genetic operator probability | 0.1
Generations | 25
Computed AHC Model

Molecule | 1 | 2
---|---|---
Halogenation | Cl | Cl
 | 1.0751 | 1.0431
 | 0 | 0
Comparison of the AHC Computed Compounds

Index | Cl Molecules | Ts Molecules | Total Molecules
---|---|---|---
IPC | 2 | 0 | 2 |
S&P 500 | 7 | 5 | 12 |
DAX | 9 | 3 | 12 |
DJIA | 2 | 0 | 2 |
FTSE | 9 | 3 | 12 |
N225 | 10 | 2 | 12 |
NDX | 8 | 4 | 12 |
CAC | 7 | 5 | 12 |
Testing Set Model Performance (AHC)

Index | RSS | SSR | TSS | R-Square
---|---|---|---|---
IPC | 0.0397 | 2.0127 | 2.0524 | 0.9806 |
S&P 500 | 1.6715 | 4.4964 | 6.1679 | 0.729 |
DAX | 38.4175 | 36.9056 | 75.3231 | 0.49 |
DJIA | 0.0444 | 1.9437 | 1.988 | 0.9777 |
FTSE | 2.2429 | 5.4236 | 7.6666 | 0.7074 |
N225 | 2.632 | 4.1367 | 6.7687 | 0.6111
NDX | 36.3486 | 47.3396 | 83.6882 | 0.5657 |
CAC | 0.819 | 4.0481 | 4.8671 | 0.8317 |
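The R-Square column follows directly from the sums of squares in this table, R² = 1 − RSS/TSS; for example, the DJIA row gives 1 − 0.0444/1.988 ≈ 0.9777. A one-line check:

```python
def r_square(rss: float, tss: float) -> float:
    """Coefficient of determination from residual and total sums of squares."""
    return 1.0 - rss / tss

print(round(r_square(0.0444, 1.988), 4))  # 0.9777, matching the DJIA row
```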
Relative Error of the Testing Set (AHC)

Index | Mean | Median | SD | MAD | Max | Min | Range
---|---|---|---|---|---|---|---
IPC | 0.0007 | 0.0006 | 0.0006 | 0.0004 | 0.0031 | 0.0000 | 0.0031 |
S&P 500 | 0.0049 | 0.0027 | 0.0058 | 0.0042 | 0.0291 | 0.0000 | 0.0291 |
DAX | 0.019 | 0.0052 | 0.0251 | 0.022 | 0.0753 | 0.0001 | 0.0752
DJIA | 0.0007 | 0.0005 | 0.0007 | 0.0005 | 0.0039 | 0.0000 | 0.0039 |
FTSE | 0.0063 | 0.005 | 0.0052 | 0.0038 | 0.0265 | 0.0000 | 0.0265 |
N225 | 0.0064 | 0.0065 | 0.0044 | 0.004 | 0.0141 | 0.0000 | 0.0141 |
NDX | 0.011 | 0.0033 | 0.0293 | 0.0122 | 0.1464 | 0.0000 | 0.1464 |
CAC | 0.0038 | 0.0025 | 0.0034 | 0.0028 | 0.0136 | 0.0000 | 0.0136 |
Computed GA Model

Gene | Value
---|---
 | 1.9749
 | 3.5800
 | −2.8185
 | 10.9748
 | 11.1065
 | 5.8242
 | 9.6453
 | 8.6060
 | 10.7133
 | 1.2196
 | −10.8794
 | −7.6908
 | 3.5332
Testing Set Model Performance (GA)

Index | RSS | SSR | TSS | R-Square
---|---|---|---|---
IPC | 11.8032 | 15.0688 | 26.8721 | 0.5607 |
S&P 500 | 543.1242 | 549.0494 | 1092.1737 | 0.5027 |
DAX | 342.9938 | 347.9404 | 690.9343 | 0.5035 |
DJIA | 34.3704 | 37.6600 | 72.0305 | 0.5228 |
FTSE | 106.3347 | 104.6426 | 210.9773 | 0.4959 |
N225 | 87.8507 | 88.8637 | 176.7144 | 0.5028 |
NDX | 30.3829 | 23.5269 | 53.9099 | 0.4364 |
CAC | 79.4216 | 84.3237 | 163.7454 | 0.5149 |
Relative Error of the Testing Set (GA)

Index | Mean | Median | SD | MAD | Max | Min | Range
---|---|---|---|---|---|---|---
IPC | 0.0144 | 0.0150 | 0.0054 | 0.0043 | 0.0275 | 0.0002 | 0.0273 |
S&P 500 | 0.1347 | 0.1290 | 0.0232 | 0.0169 | 0.2014 | 0.0903 | 0.1111 |
DAX | 0.0824 | 0.0893 | 0.0450 | 0.0396 | 0.1714 | 0.0000 | 0.1714 |
DJIA | 0.0263 | 0.0266 | 0.0077 | 0.0059 | 0.0510 | 0.0011 | 0.0499 |
FTSE | 0.0531 | 0.0492 | 0.0184 | 0.0127 | 0.1177 | 0.0232 | 0.0945 |
N225 | 0.0437 | 0.0437 | 0.0078 | 0.0060 | 0.0653 | 0.0180 | 0.0472 |
NDX | 0.0258 | 0.0278 | 0.0113 | 0.0095 | 0.0522 | 0.0009 | 0.0513 |
CAC | 0.0462 | 0.0473 | 0.0172 | 0.0136 | 0.1081 | 0.0006 | 0.1074 |
Statistics Comparison of the Relative Error

Index | AHC Mean | AHC Median | AHC SD | AHC R-Square | GA Mean | GA Median | GA SD | GA R-Square
---|---|---|---|---|---|---|---|---
IPC | 0.0007 | 0.0006 | 0.0006 | 0.9806 | 0.0144 | 0.0150 | 0.0054 | 0.5607 |
S&P 500 | 0.0049 | 0.0027 | 0.0058 | 0.729 | 0.1347 | 0.1290 | 0.0232 | 0.5027 |
DAX | 0.019 | 0.0052 | 0.0251 | 0.49 | 0.0824 | 0.0893 | 0.0450 | 0.5035 |
DJIA | 0.0007 | 0.0005 | 0.0007 | 0.9777 | 0.0263 | 0.0266 | 0.0077 | 0.5228 |
FTSE | 0.0063 | 0.005 | 0.0052 | 0.7074 | 0.0531 | 0.0492 | 0.0184 | 0.4959 |
N225 | 0.0064 | 0.0065 | 0.0044 | 0.6111 | 0.0437 | 0.0437 | 0.0078 | 0.5028 |
NDX | 0.011 | 0.0033 | 0.0293 | 0.5657 | 0.0258 | 0.0278 | 0.0113 | 0.4364 |
CAC | 0.0038 | 0.0025 | 0.0034 | 0.8317 | 0.0462 | 0.0473 | 0.0172 | 0.5149 |
Wilcoxon Signed-Rank Test Results

 | Mean | Median | R-Square
---|---|---|---
Test Statistic | 0 | 0 | 1 |
p-value | 0.0078 | 0.0078 | 0.0156 |
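These values can be reproduced with SciPy from the comparison table above: pairing the eight per-index mean relative errors of AHC against those of GA yields a test statistic of 0 and p = 0.0078, indicating a significant difference in favor of AHC.

```python
from scipy.stats import wilcoxon

# Per-index mean relative errors from the comparison table (AHC vs. GA).
ahc = [0.0007, 0.0049, 0.019, 0.0007, 0.0063, 0.0064, 0.011, 0.0038]
ga = [0.0144, 0.1347, 0.0824, 0.0263, 0.0531, 0.0437, 0.0258, 0.0462]

stat, p = wilcoxon(ahc, ga)  # exact two-sided test for n = 8 pairs
print(stat, round(p, 4))     # 0.0 0.0078: AHC's error is consistently lower
```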
Compared Results from the Testing Sets

Index | Method | Error | R-Square | Data Size (Years) | Time Period | Testing Set Size
---|---|---|---|---|---|---
KSE 1 | SVR | 10,615.67 * | −2.51 | 22 | 2000–2022 | 30% |
KSE 1 | RF | 12,113.12 * | −3.57 | 22 | 2000–2022 | 30% |
KSE 1 | KNN | 13,404.33 * | −4.60 | 22 | 2000–2022 | 30% |
KSE 1 | LSTM | 1844.47 * | 0.89 | 22 | 2000–2022 | 30% |
DSE 1 | SVR | 170.89 * | 0.82 | 9 | 2013–2022 | 30% |
DSE 1 | RF | 163.01 * | 0.84 | 9 | 2013–2022 | 30% |
DSE 1 | KNN | 186.20 * | 0.79 | 9 | 2013–2022 | 30% |
DSE 1 | LSTM | 48.42 * | 0.99 | 9 | 2013–2022 | 30% |
BSE 1 | SVR | 12,569.63 * | −1.35 | 13 | 2009–2022 | 30% |
BSE 1 | RF | 12,356.13 * | −1.27 | 13 | 2009–2022 | 30% |
BSE 1 | KNN | 13,155.32 * | −1.57 | 13 | 2009–2022 | 30% |
BSE 1 | LSTM | 3295.93 * | 0.84 | 13 | 2009–2022 | 30% |
RTS 2 | ARIMA-GARCH | 35.93 * | 0.977 | 22 | 2000–2022 | 10% |
RTS 2 | LSTM | 14.91 * | 0.996 | 22 | 2000–2022 | 10% |
SSE 3 | ARIMA | 9.838 * | 0.9675 | 1 | 2020–2021 | 25% |
SSE 3 | LSTM | 1.319 * | NA | 1 | 2020–2021 | 25% |
IDX 4 | CNN | 719.9594 * | −75.4127 | 1 | 2022 | 20% |
IDX 4 | LSTM | 638.0830 * | −33.0115 | 1 | 2022 | 20% |
IDX 4 | GRU | 553.3277 * | −40.1303 | 1 | 2022 | 20% |
NEPSE 5 | LSTM | 10.4660 * | 0.9874 | 4 | 2016–2020 | 20% |
NEPSE 5 | GRU | 12.0706 * | 0.9839 | 4 | 2016–2020 | 20% |
NEPSE 5 | CNN | 13.6554 * | 0.9782 | 4 | 2016–2020 | 20% |
NIFTY 50 6 | ANN | 36.865 * | 0.999 | 25 | 1996–2021 | 20% |
NIFTY 50 6 | SGD | 42.456 * | 0.999 | 25 | 1996–2021 | 20% |
NIFTY 50 6 | SVM | 68.327 * | 0.998 | 25 | 1996–2021 | 20% |
NIFTY 50 6 | AdaBoost | 2277.710 * | −0.930 | 25 | 1996–2021 | 20% |
NIFTY 50 6 | RF | 2290.890 * | −0.952 | 25 | 1996–2021 | 20% |
NIFTY 50 6 | KNN | 2314.720 * | −0.993 | 25 | 1996–2021 | 20% |
N225 7 | SVR | NA * | 0.81 | 3 | 2016–2019 | 10% |
N225 7 | DNN | NA * | 0.79 | 3 | 2016–2019 | 10% |
N225 7 | BPNN | NA * | 0.82 | 3 | 2016–2019 | 10% |
N225 7 | SVR | NA * | 0.58 | 3 | 2016–2019 | 20% |
N225 7 | DNN | NA * | 0.58 | 3 | 2016–2019 | 20% |
N225 7 | BPNN | NA * | 0.56 | 3 | 2016–2019 | 20% |
IPC | AHC | 0.0007 † | 0.9806 | 17 | 2006–2023 | 15% |
S&P 500 | AHC | 0.0049 † | 0.729 | 17 | 2006–2023 | 15% |
DAX | AHC | 0.019 † | 0.49 | 17 | 2006–2023 | 15% |
DJIA | AHC | 0.0007 † | 0.9777 | 17 | 2006–2023 | 15% |
FTSE | AHC | 0.0063 † | 0.7074 | 17 | 2006–2023 | 15% |
N225 | AHC | 0.0064 † | 0.6111 | 17 | 2006–2023 | 15% |
NDX | AHC | 0.011 † | 0.5657 | 17 | 2006–2023 | 15% |
CAC | AHC | 0.0038 † | 0.8317 | 17 | 2006–2023 | 15% |
Financial Analysis

Index | Return % | Risk-Free Rate % | Volatility % | Sharpe Ratio
---|---|---|---|---
IPC | 121.49 | 6.06 | 16.30 | 7.07 |
S&P 500 | 130.28 | 2.32 | 23.14 | 5.52 |
DAX | 84.04 | 0.66 | 30.14 | 2.76 |
DJIA | 114.42 | 2.32 | 13.85 | 8.09 |
FTSE | 91.01 | 2.15 | 27.00 | 3.29 |
N225 | 100.59 | −0.69 | 17.20 | 5.88
NDX | 170.45 | 2.32 | 50.76 | 3.31 |
CAC | 103.53 | 6.06 | 28.33 | 3.43 |
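The Sharpe ratios above follow the standard definition, (return − risk-free rate) / volatility; for example, the DJIA row gives (114.42 − 2.32) / 13.85 ≈ 8.09. A minimal check:

```python
def sharpe_ratio(ret_pct: float, risk_free_pct: float, volatility_pct: float) -> float:
    """Sharpe ratio from percentage return, risk-free rate and volatility."""
    return (ret_pct - risk_free_pct) / volatility_pct

print(round(sharpe_ratio(114.42, 2.32, 13.85), 2))  # 8.09, matching the DJIA row
```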
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).