A Synergistic Multi-Objective Evolutionary Algorithm with Diffusion Population Generation for Portfolio Problems
Abstract
1. Introduction
- A novel cooperative diffusion-model generative algorithm (DPG-SMOEA) is proposed that explicitly models the solution distribution (its probability density function). It demonstrates clear advantages on low-dimensional data (the Hang Seng Index test dataset) and comparable results on higher-dimensional data (such as the Nikkei 225 Index test dataset), consistent with the theoretical conjecture.
- The DPG-SMOEA establishes a mixed-memory optimization pool, storing high-quality solutions generated by the MOEA/D-AEE and utilizing samples from this mixed-memory optimization pool for pretraining the diffusion model. The trained model shows significant effectiveness in generating high-quality offspring, addressing the high uncertainty that traditional random sampling methods cannot handle.
- Diffusion models exhibit typical cold start characteristics, especially in the initialization and pretraining stages. This paper proposes a new cooperative strategy, utilizing the MOEA/D-AEE to generate high-quality solutions in the early stages and employing the diffusion model to generate offspring in the later stages, thereby avoiding the disadvantage of diffusion models’ poor performance during the cold start phase.
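The cooperative cold-start strategy described above can be sketched as a generation-dependent switch between classical evolutionary variation and diffusion sampling. This is an illustrative sketch, not the authors' implementation: the differential-evolution-style mutation and the `sample()` interface are assumptions.

```python
import numpy as np

def make_offspring(parents, generation, g, diffusion_model, rng):
    """Cold-start-aware offspring generation: evolutionary variation
    before generation g, diffusion-model sampling afterwards."""
    if generation < g or diffusion_model is None:
        # Early phase: classical variation (a simple
        # differential-evolution-style mutation as a stand-in).
        a, b, c = parents[rng.choice(len(parents), 3, replace=False)]
        child = a + 0.5 * (b - c)
    else:
        # Late phase: sample a new solution from the trained diffusion model.
        child = diffusion_model.sample()
    # Feasibility repair: weights must be non-negative and sum to one.
    child = np.clip(child, 0.0, None)
    s = child.sum()
    return child / s if s > 0 else np.full_like(child, 1.0 / len(child))
```

Switching on the generation counter sidesteps the diffusion model's poor early-phase behavior: it is only consulted once enough high-quality training samples exist.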
2. Related Work
2.1. Overview of Multi-Objective Evolutionary Algorithms
2.2. Overview of Portfolio Optimization Problem
- Cardinality constraints (CCs), which restrict the number of assets to be invested in to a specific number or within a range, allowing asset managers to more conveniently track assets and reduce trading costs;
- Floor and ceiling constraints (FCs), which stipulate that the weight of each asset must fall within a certain range, reflecting investors’ preferences for specific assets;
- Round-lot constraints (RL), which specify that asset purchases must be made in multiples of a certain quantity, bringing transactions closer to real-world practice;
- Pre-assignment constraints (PA), which determine in advance whether a certain group of assets must be invested in, also reflecting the preferences of investors for specific assets.
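As a concrete illustration, the four constraint classes above can be checked for a given weight vector as follows. This is a sketch with hypothetical threshold parameters; real formulations vary by problem instance.

```python
import numpy as np

def check_constraints(w, k_min, k_max, floor, ceiling, lot, preassigned):
    """Check a portfolio weight vector against the four practical
    constraint classes (illustrative thresholds only)."""
    held = w > 0
    n_held = int(held.sum())
    ok_cc = k_min <= n_held <= k_max                                  # cardinality
    ok_fc = bool(np.all((w[held] >= floor) & (w[held] <= ceiling)))   # floor/ceiling
    mult = w[held] / lot
    ok_rl = bool(np.allclose(mult, np.round(mult)))                   # round lots
    ok_pa = all(w[i] > 0 for i in preassigned)                        # pre-assignment
    return ok_cc and ok_fc and ok_rl and ok_pa
```

Note the round-lot test compares `w / lot` against its rounding rather than using the floating-point modulo, which is unreliable for non-binary lot sizes such as 0.05.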
2.3. Standardized Test Functions for Multi-Objective Evolutionary Algorithms
- Realism: These functions are designed to simulate real-world problems to better reflect actual multi-objective optimization challenges.
- Nonlinearity and multimodality: standard test functions are usually nonlinear and may contain multiple local optima (multimodality), which makes them suitable for probing an algorithm's ability to escape local optima.
- Scalability: these functions can often be scaled to different numbers of decision variables, making it possible to evaluate an algorithm's performance in high-dimensional spaces.
- DTLZ (Deb–Thiele–Laumanns–Zitzler) functions are a set of commonly used benchmarking functions that are scalable and nonlinear, and are often used to evaluate the effectiveness of an algorithm when dealing with multiple objectives.
- ZDT (Zitzler–Deb–Thiele) functions are another set of widely used test functions, which are usually used to evaluate the effectiveness of multi-objective optimization algorithms. They involve both linear and nonlinear relationships and are designed to test the robustness of the algorithm against different types of functions.
- UFs (unconstrained optimization test functions) are used to evaluate the robustness of an algorithm to non-linear relationships that may be encountered in real-world problems.
2.4. Diffusion Model
- Denoising Diffusion Probabilistic Models involve two processes [26]: a forward noising process and a reverse denoising process. In the forward process, noise is gradually added to the original data, with each step’s data depending on the previous step’s result until, at step T, the data become pure Gaussian noise. The reverse process removes the noise step by step to recover the original data.
- The core of Score-based Generative Models lies in the Stein score (also known as the score function) [29]. Given a probability density function p(x), its score function is defined as the gradient of the log-density, ∇_x log p(x). The Stein score is a function of the data x, rather than a function of the model parameters θ. It represents a vector field pointing in the direction of steepest ascent of the probability density function.
- Stochastic Differential-Equation-based Generative Models utilize stochastic differential equations for noise perturbation and sample generation, and the reverse (denoising) process requires an estimate of the score function of the noisy data distribution. The forward process perturbs the data into noise according to the stochastic differential equation (1): dx = f(x, t) dt + g(t) dw, where f(x, t) is the drift coefficient, g(t) is the diffusion coefficient, and w denotes a standard Wiener process.
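For intuition, the forward noising process of a DDPM admits a closed form, x_t = sqrt(ᾱ_t) x_0 + sqrt(1 − ᾱ_t) ε, which can be sketched as follows. The linear β schedule here is the common default from the DDPM literature, not a setting taken from this paper.

```python
import numpy as np

def forward_diffuse(x0, t, betas, rng):
    """Closed-form forward noising q(x_t | x_0):
    x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps,
    where alpha_bar_t is the cumulative product of (1 - beta_i)."""
    alpha_bar = np.cumprod(1.0 - betas)[t]
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps

# With a standard linear schedule, the signal coefficient sqrt(alpha_bar_t)
# decays toward zero, so x_T is (approximately) pure Gaussian noise.
betas = np.linspace(1e-4, 0.02, 1000)
```

The closed form means any noise level t can be sampled directly from x_0 during training, without simulating all the intermediate steps.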
3. Algorithm Implementation
3.1. Problem Definition
- Challenge 1: How can we coordinate diffusion networks?
- Challenge 2: How can we address the issue of training data for diffusion models?
- Challenge 3: How can we introduce offspring generated by diffusion models?
- Based on the cold-start characteristic of diffusion models, an optimized collaborative strategy is designed in this paper.
- Establish a mixed-memory training pool: optimize the population using the MOEA/D-AEE in the early stage and store good solutions in the pool as training data.
- Calculate the relevant information of the offspring and replace the original solutions in the population, making the generated offspring more applicable for addressing the random uncertainty issues of traditional sampling methods.
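The mixed-memory training pool described above can be sketched as a fixed-capacity archive of high-quality solutions. This is an illustrative design; the eviction rule and the scalarized fitness value are assumptions, not the authors' exact mechanism.

```python
import numpy as np

class MixedMemoryPool:
    """Fixed-capacity archive of high-quality solutions used as
    diffusion-model training data (an illustrative sketch)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.solutions = []   # decision vectors
        self.fitness = []     # scalarized quality (lower is better)

    def add(self, x, f):
        self.solutions.append(np.asarray(x, dtype=float))
        self.fitness.append(float(f))
        if len(self.solutions) > self.capacity:
            # Evict the worst solution so only high-quality samples remain.
            worst = int(np.argmax(self.fitness))
            self.solutions.pop(worst)
            self.fitness.pop(worst)

    def sample_batch(self, batch_size, rng):
        """Draw a training mini-batch for the diffusion model."""
        idx = rng.choice(len(self.solutions), size=batch_size, replace=True)
        return np.stack([self.solutions[i] for i in idx])
```

Because the pool only ever holds elite solutions, batches drawn from it bias the diffusion model toward the promising region of the decision space.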
3.2. DPG-SMOEA Framework
- Read the relevant information from the dataset and set the parameters of the problem, including the number of assets, the statistics of returns and risks, and other related information such as the covariance matrix. Also, set the initial parameters of the algorithm, such as the population size, number of neighbors, etc.
- Define neighbors, generate the initial population, and calculate the [return, risk] values and Pareto frontier extreme points for each individual.
- Instantiate the diffusion model.
- Iterate in a loop, where each iteration represents a complete search process, including selection, mutation, crossover, replacement, and other operations.
- Iterate through each individual in the population in each iteration. This step involves performing mutation and crossover operations on each individual to generate new offspring.
- Determine the mating pool for the offspring generated by the mutation operator.
- Decide the reproduction method for the offspring. For the first g generations, use polynomial mutation to generate offspring. After the g-th generation, use the saved better solutions in the population to train the diffusion model, and then use the diffusion model for inference to generate offspring. Calculate the returns, risks, reference points, and other information for the offspring.
- Update the extreme points of the Pareto frontier, calculate the reference points and weight vectors, determine whether to replace the parent nodes, and then update the population.
- Set counters, iterate through neighbors, update the population if the offspring is better than the parent node, and exit the loop when the iteration is complete or the counter reaches the upper limit.
- Iterate over the weights of all assets in the offspring, setting negative values to 0.
- Sum the asset weights to obtain the total s.
- If s is not equal to 0, rescale the weight vector so that its components sum to 1; if s equals 0, replace the offspring with a newly generated solution that satisfies the constraints.
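The repair steps above can be sketched as follows. The fallback random simplex draw is an assumption about how a "generated solution satisfying the constraints" might be produced; the paper does not specify it.

```python
import numpy as np

def repair_weights(w, rng):
    """Repair an offspring weight vector: clip negatives to zero, then
    rescale to sum to one; if every weight is zero, fall back to a
    random feasible solution (assumed fallback)."""
    w = np.clip(np.asarray(w, dtype=float), 0.0, None)
    s = w.sum()
    if s > 0:
        return w / s
    # Degenerate case: all weights clipped away; draw a random simplex point.
    u = rng.random(len(w))
    return u / u.sum()
```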
3.3. Synergy in Diffusion Models
4. Experiments and Analysis
4.1. Test Data and Evaluation Indexes
4.1.1. Test Data
4.1.2. Evaluation Indexes
4.2. Comparison Algorithm
4.3. Parameter Settings
4.4. Analysis of Results
4.4.1. Experiments on Synergistic Methods of Population Entropy
4.4.2. Collaborative Experiments Adapted to the Cold-Start Mechanism
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Appendix A. Algorithms
Appendix A.1. The DPG-SMOEA Framework
Algorithm 1: DPG-SMOEA Framework
Input: the maximum number of iterations T, the maximum number of offspring updates, and the collaborative-strategy threshold g. Initialize: T = 1500; the maximum number of offspring updates is 2.
Appendix A.2. Generating Offspring via the Diffusion Model
Algorithm 2: Generating Offspring via the Diffusion Model
Input: High-quality solutions from MOEA/D-AEE stored in the mixed optimization pool as training samples.
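The inference step of Algorithm 2, drawing new candidate solutions from the trained model, follows standard DDPM ancestral sampling, sketched below. `eps_model` stands in for the trained noise-prediction network; its interface is an assumption.

```python
import numpy as np

def ddpm_sample(eps_model, shape, betas, rng):
    """Reverse-process sampling sketch: start from Gaussian noise and
    iteratively denoise using a trained noise predictor eps_model(x, t)."""
    alphas = 1.0 - betas
    alpha_bar = np.cumprod(alphas)
    x = rng.standard_normal(shape)
    for t in range(len(betas) - 1, -1, -1):
        eps = eps_model(x, t)
        # Posterior mean of x_{t-1} given x_t and the predicted noise.
        mean = (x - betas[t] / np.sqrt(1.0 - alpha_bar[t]) * eps) / np.sqrt(alphas[t])
        noise = rng.standard_normal(shape) if t > 0 else 0.0
        x = mean + np.sqrt(betas[t]) * noise
    return x
```

In the DPG-SMOEA setting the sampled vector would then pass through the weight-repair step before entering the population.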
Appendix B. Tables
| Metric | Stat | MOEA/D-AEE | MOEA/D-DEM | MOEA/D-DE | MOEA/D-GA | NSGA-II | DPG-SMOEA |
|---|---|---|---|---|---|---|---|
| GD | Best | 4.02 × | 4.21 × | 3.49 × | 1.74 × | 9.06 × | 4.17 × |
| GD | Median | 5.91 × | 7.26 × | 7.98 × | 2.29 × 10−6 | 1.18 × | 6.41 × |
| GD | Std. | 1.13 × | 2.22 × | 1.18 × | 2.60 × | 1.34 × | 6.00 × |
| Spacing | Best | 1.65 × | 1.50 × | 8.51 × | 9.38 × | 3.94 × | 1.60 × |
| Spacing | Median | 2.05 × | 2.34 × | 1.80 × | 1.53 × 10−5 | 4.89 × | 2.23 × |
| Spacing | Std. | 6.25 × | 9.07 × | 6.76 × | 5.71 × | 4.18 × | 6.95 × |
| MaxSpread | Best | 9.10 × | 9.23 × | 9.00 × | 8.92 × | 9.06 × | 9.13 × |
| MaxSpread | Median | 8.96 × 10−3 | 8.89 × | 8.54 × | 8.25 × | 8.63 × | 8.89 × |
| MaxSpread | Std. | 9.91 × | 1.98 × | 9.93 × | 3.16 × | 2.18 × | 4.48 × |
| Delta | Best | 2.44 × | 2.53 × | 2.33 × | 2.61 × | 4.48 × | 2.46 × |
| Delta | Median | 2.60 × 10−1 | 2.87 × | 2.87 × | 2.80 × | 4.96 × | 2.77 × |
| Delta | Std. | 2.37 × | 3.99 × | 8.20 × | 1.40 × | 3.50 × | 5.72 × |
| IGD | Best | 2.86 × | 2.99 × | 3.15 × | 2.98 × | 3.92 × | 2.88 × |
| IGD | Median | 3.12 × 10−5 | 3.50 × | 6.03 × | 7.54 × | 5.01 × | 3.27 × |
| IGD | Std. | 2.29 × | 8.54 × | 2.44 × | 3.97 × | 1.55 × | 5.38 × |
| HV | Best | 1.18 × | 1.18 × | 1.18 × | 1.18 × | 1.18 × | 1.18 × |
| HV | Median | 1.18 × 10−2 | 1.18 × 10−2 | 1.18 × 10−2 | 1.18 × 10−2 | 1.18 × 10−2 | 1.18 × 10−2 |
| HV | Std. | 1.56 × | 3.36 × | 9.66 × | 6.49 × | 4.12 × | 1.67 × |
| Metric | Stat | MOEA/D-AEE | MOEA/D-DEM | MOEA/D-DE | MOEA/D-GA | NSGA-II | DPG-SMOEA |
|---|---|---|---|---|---|---|---|
| GD | Best | 6.29 × | 7.11 × | 7.32 × | 1.83 × | 5.37 × | 6.26 × |
| GD | Median | 8.07 × | 9.53 × | 1.65 × | 2.78 × 10−6 | 8.04 × | 8.77 × |
| GD | Std. | 9.03 × | 1.81 × | 9.27 × | 6.36 × | 1.99 × | 1.94 × |
| Spacing | Best | 2.59 × | 2.34 × | 1.64 × | 1.59 × | 2.27 × | 2.73 × |
| Spacing | Median | 3.38 × | 3.24 × | 2.98 × | 2.48 × 10−5 | 4.34 × | 3.45 × |
| Spacing | Std. | 6.04 × | 7.37 × | 6.61 × | 5.68 × | 6.95 × | 5.97 × |
| MaxSpread | Best | 8.11 × | 8.12 × | 8.36 × | 7.37 × | 7.83 × | 8.19 × |
| MaxSpread | Median | 7.78 × 10−3 | 7.71 × | 7.40 × | 6.04 × | 7.20 × | 7.68 × |
| MaxSpread | Std. | 2.12 × | 2.31 × | 7.26 × | 3.70 × | 5.20 × | 3.00 × |
| Delta | Best | 3.95 × | 4.07 × | 3.60 × | 4.54 × | 5.70 × | 3.91 × |
| Delta | Median | 4.14 × 10−1 | 4.49 × | 4.36 × | 5.81 × | 6.68 × | 4.29 × |
| Delta | Std. | 2.18 × | 3.79 × | 7.72 × | 3.65 × | 5.08 × | 3.26 × |
| IGD | Best | 3.41 × | 3.62 × | 4.40 × | 7.20 × | 4.14 × | 3.33 × |
| IGD | Median | 4.16 × 10−5 | 4.90 × | 9.48 × | 1.54 × | 6.55 × | 4.67 × |
| IGD | Std. | 1.42 × | 1.60 × | 9.39 × | 3.75 × | 3.34 × | 2.54 × |
| HV | Best | 1.87 × | 1.87 × | 1.87 × | 1.87 × | 1.87 × | 1.87 × |
| HV | Median | 1.87 × 10−2 | 1.87 × 10−2 | 1.87 × 10−2 | 1.75 × | 1.83 × | 1.87 × 10−2 |
| HV | Std. | 1.46 × | 3.08 × | 9.62 × | 4.47 × | 5.68 × | 4.79 × |
| Metric | Stat | MOEA/D-AEE | MOEA/D-DEM | MOEA/D-DE | MOEA/D-GA | NSGA-II | DPG-SMOEA |
|---|---|---|---|---|---|---|---|
| GD | Best | 4.02 × | 4.21 × | 3.49 × | 1.74 × | 9.06 × | 1.81 × |
| GD | Median | 5.91 × | 7.26 × | 7.98 × | 2.29 × 10−6 | 1.18 × | 5.26 × |
| GD | Std. | 1.13 × | 2.22 × | 1.18 × | 2.60 × | 1.34 × | 9.89 × |
| Spacing | Best | 1.65 × | 1.50 × | 8.51 × | 9.38 × | 3.94 × | 1.63 × |
| Spacing | Median | 2.05 × | 2.34 × | 1.80 × | 1.53 × 10−5 | 4.89 × | 1.95 × |
| Spacing | Std. | 6.25 × | 9.07 × | 6.76 × | 5.71 × | 4.18 × | 5.85 × |
| Max Spread | Best | 9.10 × | 9.23 × | 9.00 × | 8.92 × | 9.06 × | 9.16 × |
| Max Spread | Median | 8.96 × | 8.89 × | 8.54 × | 8.25 × | 8.63 × | 8.99 × 10−3 |
| Max Spread | Std. | 9.91 × | 1.98 × | 9.93 × | 3.16 × | 2.18 × | 7.85 × |
| Delta | Best | 2.44 × | 2.53 × | 2.33 × | 2.61 × | 4.48 × | 2.44 × |
| Delta | Median | 2.60 × | 2.87 × | 2.87 × | 2.80 × | 4.96 × | 2.55 × 10−1 |
| Delta | Std. | 2.37 × | 3.99 × | 8.20 × | 1.40 × | 3.50 × | 1.87 × |
| IGD | Best | 2.86 × | 2.99 × | 3.15 × | 2.98 × | 3.92 × | 2.80 × |
| IGD | Median | 3.12 × | 3.50 × | 6.03 × | 7.54 × | 5.01 × | 3.02 × 10−5 |
| IGD | Std. | 2.29 × | 8.54 × | 2.44 × | 3.97 × | 1.55 × | 1.59 × |
| HV | Best | 2.64 × | 2.64 × | 2.64 × | 2.64 × | 2.63 × | 2.64 × |
| HV | Median | 2.64 × 10−5 | 2.63 × | 2.63 × | 2.64 × | 2.63 × | 2.64 × 10−5 |
| HV | Std. | 1.22 × | 2.64 × | 2.21 × | 1.38 × | 1.31 × | 1.07 × |
| Metric | Stat | MOEA/D-AEE | MOEA/D-DEM | MOEA/D-DE | MOEA/D-GA | NSGA-II | DPG-SMOEA |
|---|---|---|---|---|---|---|---|
| GD | Best | 5.27 × | 6.32 × | 7.01 × | 2.84 × | 7.38 × | 4.54 × |
| GD | Median | 7.05 × | 9.58 × | 1.83 × | 5.05 × 10−6 | 9.25 × | 7.72 × |
| GD | Std. | 7.28 × | 2.41 × | 1.55 × | 1.60 × | 9.50 × | 8.27 × |
| Spacing | Best | 1.71 × | 1.38 × | 9.87 × | 1.14 × | 2.39 × | 1.69 × |
| Spacing | Median | 2.07 × | 2.01 × | 1.72 × 10−5 | 1.97 × | 2.99 × | 2.05 × |
| Spacing | Std. | 3.76 × | 4.54 × | 3.34 × | 5.10 × | 2.24 × | 5.04 × |
| Max Spread | Best | 5.96 × | 5.79 × | 5.74 × | 5.47 × | 5.67 × | 5.84 × |
| Max Spread | Median | 5.56 × 10−3 | 5.40 × | 5.15 × | 4.91 × | 5.45 × | 5.42 × |
| Max Spread | Std. | 1.59 × | 1.93 × | 5.08 × | 4.19 × | 1.69 × | 1.63 × |
| Delta | Best | 4.01 × | 4.25 × | 4.21 × | 4.27 × | 5.47 × | 4.01 × |
| Delta | Median | 4.33 × 10−1 | 4.72 × | 4.51 × | 5.05 × | 6.06 × | 4.53 × |
| Delta | Std. | 2.02 × | 3.06 × | 6.30 × | 6.88 × | 3.34 × | 3.58 × |
| IGD | Best | 2.30 × | 2.90 × | 4.26 × | 4.07 × | 3.22 × | 2.14 × |
| IGD | Median | 3.71 × 10−5 | 5.31 × | 9.09 × | 8.76 × | 4.74 × | 5.02 × |
| IGD | Std. | 1.14 × | 2.16 × | 1.47 × | 4.33 × | 1.38 × | 1.35 × |
| HV | Best | 1.37 × | 1.37 × | 1.37 × | 1.37 × | 1.37 × | 1.37 × |
| HV | Median | 1.37 × 10−5 | 1.37 × 10−5 | 1.36 × | 1.34 × | 1.37 × 10−5 | 1.37 × 10−5 |
| HV | Std. | 5.08 × | 1.64 × | 1.36 × | 6.19 × | 8.35 × | 6.01 × 10 |
| Metric | Stat | MOEA/D-AEE | MOEA/D-DEM | MOEA/D-DE | MOEA/D-GA | NSGA-II | DPG-SMOEA |
|---|---|---|---|---|---|---|---|
| GD | Best | 5.32 × | 5.10 × | 5.93 × | 9.95 × | 2.54 × | 4.51 × |
| GD | Median | 7.36 × | 8.06 × | 1.45 × | 3.32 × | 4.24 × 10−6 | 8.33 × |
| GD | Std. | 9.19 × | 1.66 × | 7.09 × | 2.08 × | 2.01 × | 1.34 × |
| Spacing | Best | 1.42 × | 1.28 × | 5.99 × | 0.00 × 10 | 1.05 × | 1.17 × |
| Spacing | Median | 1.79 × | 2.08 × | 1.15 × 10−5 | 1.92 × | 1.45 × | 1.87 × |
| Spacing | Std. | 4.87 × | 5.77 × | 4.53 × | 9.72 × | 1.87 × | 8.82 × |
| Max Spread | Best | 4.17 × | 4.23 × | 4.29 × | 2.63 × | 3.36 × | 4.11 × |
| Max Spread | Median | 3.93 × | 3.94 × 10−3 | 2.96 × | 2.20 × | 2.88 × | 3.80 × |
| Max Spread | Std. | 1.21 × | 5.39 × | 5.48 × | 4.65 × | 2.54 × | 1.67 × |
| Delta | Best | 3.87 × | 3.99 × | 3.17 × | 8.40 × | 6.09 × | 3.78 × |
| Delta | Median | 4.42 × 10−1 | 4.81 × | 5.58 × | 9.34 × | 6.81 × | 4.87 × |
| Delta | Std. | 6.09 × | 9.05 × | 1.00 × | 3.55 × | 2.94 × | 1.04 × |
| IGD | Best | 1.72 × | 1.90 × | 7.90 × | 1.77 × | 4.64 × | 1.63 × |
| IGD | Median | 2.47 × 10−5 | 2.73 × | 2.23 × | 2.41 × | 9.69 × | 3.51 × |
| IGD | Std. | 7.05 × | 4.09 × | 6.95 × | 5.29 × | 3.67 × | 1.59 × |
| HV | Best | 8.31 × | 8.29 × | 7.96 × | 7.87 × | 8.19 × | 8.32 × |
| HV | Median | 8.29 × 10−6 | 8.26 × | 7.23 × | 7.54 × | 7.94 × | 8.27 × |
| HV | Std. | 1.06 × | 9.52 × | 3.43 × | 1.20 × | 1.08 × | 1.75 × |
References
- Markowitz, H. Portfolio Selection. J. Financ. 1952, 7, 77–91.
- Metaxiotis, K.; Liagkouras, K. Multiobjective evolutionary algorithms for portfolio management: A comprehensive literature review. Expert Syst. Appl. 2012, 39, 11685–11698.
- Zhang, Q.; Li, H.; Maringer, D.; Tsang, E. MOEA/D with NBI-style Tchebycheff approach for portfolio management. In Proceedings of the IEEE Congress on Evolutionary Computation, Barcelona, Spain, 18–23 July 2010; pp. 1–8.
- Qian, W.; Liu, J.; Lin, Y.; Yang, L.; Zhang, J.; Xu, H.; Liao, M.; Chen, Y.; Chen, Y.; Liu, B. An improved MOEA/D algorithm for complex data analysis. Wirel. Commun. Mob. Comput. 2021, 2021, 6393638.
- Nichol, A.; Dhariwal, P. Improved denoising diffusion probabilistic models. In Proceedings of the International Conference on Machine Learning, Virtual, 18–24 July 2021; PMLR: Cambridge, MA, USA, 2021; pp. 8162–8171.
- Ho, J.; Jain, A.; Abbeel, P. Denoising diffusion probabilistic models. Adv. Neural Inf. Process. Syst. 2020, 33, 6840–6851.
- Song, J.; Meng, C.; Ermon, S. Denoising diffusion implicit models. arXiv 2020, arXiv:2010.02502.
- Ming, F.; Gong, W.; Wang, L.; Gao, L. A Constraint-Handling Technique for Decomposition-Based Constrained Many-Objective Evolutionary Algorithms. IEEE Trans. Syst. Man Cybern. Syst. 2023, 53, 7783–7793.
- Asafuddoula, M.; Ray, T.; Sarker, R. A Decomposition-Based Evolutionary Algorithm for Many Objective Optimization. IEEE Trans. Evol. Comput. 2015, 19, 445–460.
- Coello, C.A.C. Evolutionary multi-objective optimization: A historical view of the field. IEEE Comput. Intell. Mag. 2006, 1, 28–36.
- Zhu, H.; Chen, Q.; Ding, J.; Zhang, X.; Wang, H. Parameter-Adaptive Paired Offspring Generation for Constrained Large-Scale Multiobjective Optimization Algorithm. In Proceedings of the 2023 IEEE Symposium Series on Computational Intelligence (SSCI), Mexico City, Mexico, 5–8 December 2023; pp. 470–475.
- Zhang, S.; Yang, T.; Liang, J.; Yue, C. A Novel Adaptive Bandit-Based Selection Hyper-Heuristic for Multiobjective Optimization. IEEE Trans. Syst. Man Cybern. Syst. 2023, 53, 7693–7706.
- Deb, K.; Pratap, A.; Agarwal, S.; Meyarivan, T.A.M.T. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 2002, 6, 182–197.
- Deb, K.; Jain, H. An Evolutionary Many-Objective Optimization Algorithm Using Reference-Point-Based Nondominated Sorting Approach, Part I: Solving Problems With Box Constraints. IEEE Trans. Evol. Comput. 2014, 18, 577–601.
- Shui, Y.; Li, H.; Sun, J.; Zhang, Q. The Combination of MOEA/D and WOF for Solving High-Dimensional Expensive Multiobjective Optimization Problems. In Proceedings of the 2023 IEEE Congress on Evolutionary Computation (CEC), Chicago, IL, USA, 1–5 July 2023; pp. 1–8.
- He, L.; Shang, K.; Nan, Y.; Ishibuchi, H.; Srinivasan, D. Relation Between Objective Space Normalization and Weight Vector Scaling in Decomposition-Based Multiobjective Evolutionary Algorithms. IEEE Trans. Evol. Comput. 2023, 27, 1177–1191.
- Salih, A.; Moshaiov, A. Promoting Transfer of Robot Neuro-Motion-Controllers by Many-Objective Topology and Weight Evolution. IEEE Trans. Evol. Comput. 2023, 27, 385–395.
- Zheng, W.; Sun, J.; Zhang, Q.; Xu, Z. Continuous Encoding for Overlapping Community Detection in Attributed Network. IEEE Trans. Cybern. 2023, 53, 5469–5482.
- Trivedi, A.; Srinivasan, D.; Sanyal, K.; Ghosh, A. A Survey of Multiobjective Evolutionary Algorithms Based on Decomposition. IEEE Trans. Evol. Comput. 2017, 21, 440–462.
- Zhao, S.Z.; Suganthan, P.N.; Zhang, Q. Decomposition-Based Multiobjective Evolutionary Algorithm With an Ensemble of Neighborhood Sizes. IEEE Trans. Evol. Comput. 2012, 16, 442–446.
- Bienstock, D. Computational study of a family of mixed-integer quadratic programming problems. Math. Program. 1996, 74, 121–140.
- Shaw, D.X.; Liu, S.; Kopman, L. Lagrangian relaxation procedure for cardinality-constrained portfolio optimization. Optim. Methods Softw. 2008, 23, 411–420.
- Arnone, S.; Loraschi, A.; Tettamanzi, A. A genetic approach to portfolio selection. Neural Netw. World 1993, 3, 597–604.
- Anagnostopoulos, K.P.; Mamanis, G. The mean–variance cardinality constrained portfolio optimization problem: An experimental evaluation of five multiobjective evolutionary algorithms. Expert Syst. Appl. 2011, 38, 14208–14217.
- Lwin, K.; Qu, R.; Kendall, G. A learning-guided multi-objective evolutionary algorithm for constrained portfolio optimization. Appl. Soft Comput. 2014, 24, 757–772.
- Yang, L.; Zhang, Z.; Song, Y.; Hong, S.; Xu, R.; Zhao, Y.; Zhang, W.; Cui, B.; Yang, M.H. Diffusion models: A comprehensive survey of methods and applications. ACM Comput. Surv. 2023, 56, 1–39.
- Song, Y.; Ermon, S. Generative modeling by estimating gradients of the data distribution. Adv. Neural Inf. Process. Syst. 2019, 32, 1415–1428.
- Song, Y.; Durkan, C.; Murray, I.; Ermon, S. Maximum likelihood training of score-based diffusion models. Adv. Neural Inf. Process. Syst. 2021, 34, 1415–1428.
- Hyvärinen, A.; Dayan, P. Estimation of non-normalized statistical models by score matching. J. Mach. Learn. Res. 2005, 6, 695–709.
- Siddique, N.; Adeli, H. Computational Intelligence: Synergies of Fuzzy Logic, Neural Networks and Evolutionary Computing; John Wiley & Sons: Hoboken, NJ, USA, 2013.
- Yu, G.; Chai, T.; Luo, X. Two-level production plan decomposition based on a hybrid MOEA for mineral processing. IEEE Trans. Autom. Sci. Eng. 2012, 10, 1050–1071.
- Qian, W.; Xu, H.; Chen, H.; Yang, L.; Lin, Y.; Xu, R.; Yang, M.; Liao, M. A Synergistic MOEA Algorithm with GANs for Complex Data Analysis. Mathematics 2024, 12, 175.
- Orlova, E. Decision-Making Techniques for Credit Resource Management Using Machine Learning and Optimization. Information 2020, 11, 144.
- Zhang, J.; Liang, C.; Lu, Q. A novel small-population genetic algorithm based on adaptive mutation and population entropy sampling. In Proceedings of the 2008 7th World Congress on Intelligent Control and Automation, Chongqing, China, 25–27 June 2008; IEEE: Piscataway, NJ, USA, 2008.
- Artzner, P.; Delbaen, F.; Eber, J.-M.; Heath, D. Coherent measures of risk. Math. Financ. 1999, 9, 203–228.
| Dataset | Region | Dimensions |
|---|---|---|
| Hang Seng | Hong Kong | 31 |
| DAX 100 | Germany | 85 |
| FTSE 100 | U.K. | 89 |
| S&P 100 | U.S. | 98 |
| Nikkei | Japan | 225 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Yang, M.; Qian, W.; Yang, L.; Hou, X.; Yuan, X.; Dong, Z. A Synergistic Multi-Objective Evolutionary Algorithm with Diffusion Population Generation for Portfolio Problems. Mathematics 2024, 12, 1368. https://doi.org/10.3390/math12091368