Article

Predicting Entrepreneurial Intention of Students: Kernel Extreme Learning Machine with Boosted Crow Search Algorithm

1 Zhejiang College of Security Technology, Wenzhou 325000, China
2 The Section of Employment, Wenzhou Vocational College of Science and Technology, Wenzhou 325006, China
3 Department of Information Technology, Wenzhou Vocational College of Science and Technology, Wenzhou 325006, China
4 Department of Computer Science and Artificial Intelligence, Wenzhou University, Wenzhou 325035, China
5 The Higher Education Research Institute of Wenzhou University in China, Wenzhou University, Wenzhou 325035, China
* Authors to whom correspondence should be addressed.
Appl. Sci. 2022, 12(14), 6907; https://doi.org/10.3390/app12146907
Submission received: 2 June 2022 / Revised: 1 July 2022 / Accepted: 3 July 2022 / Published: 7 July 2022
(This article belongs to the Special Issue Soft Computing Application to Engineering Design)

Abstract

College students are the group with the most entrepreneurial vitality and potential. How to cultivate their entrepreneurial and innovative ability is one of the important and urgent issues facing current social development. This paper proposes a reliable, intelligent prediction model of entrepreneurial intention, providing theoretical support for guiding college students’ positive entrepreneurial intentions. The model mainly uses an improved crow search algorithm (CSA) to optimize the kernel extreme learning machine (KELM) model with feature selection (FS), namely CSA-KELM-FS, to study entrepreneurial intention. To obtain the best fitting model and key features, the gradient search rule, local escaping operator, and Levy flight mutation (GLL) mechanisms are introduced to enhance the CSA (GLLCSA), and FS is used to extract the key features. To verify the performance of the proposed GLLCSA, it is compared with eight other state-of-the-art methods. Further, the GLLCSA-KELM-FS model and five other machine learning methods have been used to predict the entrepreneurial intentions of 842 students from Wenzhou Vocational and Technical College in Zhejiang, China, over the past five years. The results show that the proposed model can correctly predict the students’ entrepreneurial intention with an accuracy rate of 93.2% and excellent stability. According to the prediction results of the proposed model, the key factors affecting students’ entrepreneurial intention are mainly the major studied, campus innovation and entrepreneurship practice experience, and positive personality. Therefore, the proposed GLLCSA-KELM-FS is expected to be an effective tool for predicting students’ entrepreneurial intentions.

1. Introduction

As the main front of college students’ entrepreneurship education, colleges and universities must further promote the reform of entrepreneurship education, which has become an important task for the reform and development of higher education at present and in the future. The country has provided broad development space and unlimited opportunities for college students to start their own businesses. However, the entrepreneurial rate of college students is not very high, and the success rate is even lower. Innovation and entrepreneurship education in colleges and universities is in full swing, but has it played its due role? What impact does college innovation and entrepreneurship education have on students’ entrepreneurial intentions? What is its impact on the entrepreneurial ability of college students? These are all topics worthy of study. At present, entrepreneurship education in Chinese colleges and universities is still in the exploratory stage; the training system for innovative and entrepreneurial talents is not yet perfect, and its effect is not significant. Using data on college students’ entrepreneurial intentions collected during talent training can help analyze the factors influencing entrepreneurial talent cultivation in colleges and universities, formulate a scientific entrepreneurial education system, accurately cultivate entrepreneurial talents, and ultimately help college students make reasonable career choices.
In recent years, many researchers have carried out related research on graduates’ entrepreneurial intentions. Salamzadeh et al. [1] analyzed data from 382 students at two institutions in Malaysia and found that entrepreneurial intention positively moderated the impact of university inputs on entrepreneurship. Laouiti et al. [2] conducted a fuzzy-set qualitative comparative analysis and, through a sample survey of 531 students in France, found a positive effect of gender on entrepreneurial intentions. Barba et al. [3] proposed a planned-behavior model for 337 college students at the University of Oviedo and concluded that college students’ high environmental awareness had an impact on entrepreneurial intentions. Duong et al. [4] conducted a study of 685 undergraduate students at various universities in Vietnam with structural equation modeling to show the impact of perceived regulatory support on social entrepreneurial intentions. Suratno et al. [5] conducted a questionnaire survey of 1000 students in Indonesia, and the results showed that family economic education, peer groups, economic literacy, and entrepreneurial intention were positively correlated. Ashraf et al. [6] used bounded rational planning behavior theory to study the entrepreneurial intentions of young and senior students in Romania, Malaysia, and Bangladesh, and the results showed that Facebook’s business factors had a potential impact on students’ entrepreneurial intentions. Liu et al. [7] studied the cultivation of entrepreneurial intention among tourism and hospitality majors from the perspective of social impact, providing some clues for improving the entrepreneurial level of tourism and hospitality students. Leung et al. [8] conducted a questionnaire survey of 182 college students, and the results showed the difference between the independent and joint influences of psychiatric symptoms on entrepreneurial intention. Iwu et al. [9] studied the entrepreneurial intention of students in a South African university and showed that the perceived competence of the lecturer team of entrepreneurial institutions was positively correlated with the entrepreneurial intentions of students. The above research shows that the entrepreneurial intentions of students in different countries are related to social culture, economic development, and personal ideology. Although the research methods differ, the main method is inseparable from the questionnaire survey: relevant data are collected by questionnaire and then analyzed with statistical methods. Of course, some scholars also build models from the collected data to carry out follow-up research.
Artificial intelligence has been a hot technology in recent years and is widely used in students’ career planning and entrepreneurial intention prediction. Wei et al. [10] proposed an effective, intelligent model based on an improved Harris hawks optimizer (HHO) and the kernel extreme learning machine to predict entrepreneurial intentions, and it performed better than traditional approaches. Zhu et al. [11] proposed an orthogonal learning (OL) strategy to optimize an integrated kernel extreme learning machine model to evaluate Sino-foreign cooperative education projects and gained the best results in the comparative experiments. Lin et al. [12] proposed an improved fuzzy k-nearest neighbors framework to predict, in advance, college students’ intentions regarding a master’s program; the results showed that the proposed model was more effective than other methods. Tu et al. [13] developed an adaptive support vector machine framework to predict the entrepreneurial intentions of college students in advance; the results demonstrated that the proposed method could be regarded as a promising tool with excellent predictive performance. Wei et al. [14] designed a support vector machine based on an improved grey wolf optimization to extract the factors that influence students’ final decisions to choose a particular major. Gao et al. [15] proposed a predictive model based on the improved slime mould algorithm and support vector machine for predicting graduate employment stability. Mishra et al. [16] used a variety of data mining techniques to predict the employment intention of Master of Computer Applications students, and the results showed that the C4.5 decision tree had an accuracy of 70.19%. Jin [17] used the improved C4.5 decision tree algorithm combined with a Bayesian model to mine data related to student employment and provided career guidance for decision-making on promoting the employment of college students. Bell [18] used principal component analysis, correlation analysis, and multilevel regression to analyze data from 1185 students to effectively predict entrepreneurial intentions across the university. Rahman et al. [19] used decision trees to predict students’ entrepreneurial intentions by collecting entrepreneurship education data from the University of Kuala Lumpur. Djordjevic et al. [20] used a classification decision tree model based on the QUEST (Quick, Unbiased, Efficient, Statistical Tree) algorithm to predict the entrepreneurial intentions of 5670 Serbian youths.
Advanced optimization algorithms have been applied as solution approaches in many different domains, such as online learning, scheduling, multi-objective optimization, transportation, medicine, data classification, and others. Zhao et al. [21] designed an online-learning-based evolutionary many-objective algorithm to solve benchmark problems. Pasha et al. [22] developed an integrated optimization method for tactical-level planning in liner shipping with a heterogeneous ship fleet and environmental considerations. Dulebenets et al. [23] proposed a multiobjective optimization model for emergency evacuation planning in geographical locations with vulnerable population groups. Dulebenets [24] developed an adaptive polyploid memetic algorithm for scheduling trucks at a cross-docking terminal. Rabbani et al. [25] used the NSGA-II and MOPSO algorithms for ambulance routing in disaster response, considering variable patient conditions. According to the “No Free Lunch” theorem [26], no single algorithm can solve all possible problems. Lu et al. [27] extended a new version of the CSA to estimate the parameters of a proton-exchange membrane fuel-cell model. Owing to its effectiveness on many problems, the CSA was used in this study. However, the classical CSA suffers from low convergence speed and easily becomes trapped in local optima when tackling real-world problems. Therefore, in this work, we introduce three mechanisms, namely the gradient search rule (GSR), local escaping operator (LEO), and Levy mutation mechanism (LM), into the CSA to improve its search capability; the proposed GLLCSA is then used to simultaneously optimize the parameters of the KELM classifier and the feature space of the data. The results show that the proposed GLLCSA-KELM-FS model can effectively predict the entrepreneurial intentions of the school’s students, with an accuracy rate of 93.2%. In addition, the experimental results show that the major studied, campus innovation and entrepreneurship practice experience, and a positive personality are key factors that influence the entrepreneurial intentions of the school’s students. Therefore, this research plays a key role in helping relevant departments guide students’ entrepreneurial intentions.
The main contributions of this study are as follows:
  • Incorporating the gradient search rule (GSR), local escaping operator (LEO), and Levy mutation mechanism (LM) into the CSA to improve its search capability.
  • The performance of GLLCSA is effectively verified through benchmark function experiments.
  • The GLLCSA-KELM-FS model is proposed to predict students’ entrepreneurial intentions.
  • Achieving an effective prediction of students’ entrepreneurial intentions and screening out the key features.
This paper is organized as follows. Section 2 briefly describes KELM and the CSA. Section 3 introduces GLLCSA. Section 4 presents the GLLCSA-KELM-FS model. Section 5 evaluates the performance of the proposed GLLCSA. Section 6 uses the GLLCSA-KELM-FS model to predict graduate entrepreneurial intentions. Section 7 draws conclusions and outlooks.

2. Background

2.1. Literature Review

According to the literature, the performance of a classifier is greatly affected by its internal parameters and by the features in the data. Metaheuristics are highly effective at solving this type of problem, as shown in many works [28,29,30,31,32], such as object tracking [33,34], the traveling salesman problem [35], gate resource allocation [36,37], multi-attribute decision-making [38,39], the design of power electronic circuits [40,41], fractional-order controllers [42], medical diagnoses [43,44], big data optimization problems [45], green supplier selection [46], economic emission dispatch problems [47], scheduling problems [48,49], and combination optimization problems [50]. This study proposes an enhanced crow search algorithm (CSA) [51] to simultaneously optimize the hyperparameters of the kernel extreme learning machine (KELM) and the feature space for predicting the entrepreneurial intention of college students. The CSA is a nature-inspired metaheuristic method proposed in 2016 that has resolved many complex optimization problems [51], and many researchers have attempted to boost it from different aspects [52,53,54,55,56]. For example, Adamu et al. [57] proposed a hybrid of particle swarm optimization and the CSA for feature selection, and the proposed model gained an accuracy of 89.67% on 15 datasets. Aliabadi et al. [58] designed an improved CSA to optimize a hybrid renewable energy system in radial distribution networks, and the results showed significant reductions in active losses and voltage deviations. Al-Thanoon et al. [59] adopted the CSA for big data classification, and the results showed a higher classification performance compared with other algorithms. Awadallah et al. [60] developed a cellular CSA with topological neighborhood shapes to resolve three real-world problems. Bakhshaei et al. [61] designed a boosted CSA based on a tournament selection strategy, and the results showed that the optimal determination of power exchange and the incentive rate could decrease operation costs. Chaudhuri et al. [62] developed a binary CSA with a time-varying flight length to solve the feature selection problem, and the results showed that the proposed feature selection technique performed better than other approaches. Geetha et al. [63] designed an enhanced CSA for an image forgery detection technique, and the results exhibited that the proposed classifier performs better than most algorithms. Guha et al. [64] used the CSA with chaotic mapping for fine-tuning a controller, and the efficacy of the controller in frequency regulation was validated. Gupta et al. [65] introduced a novel boosted CSA to predict Parkinson’s disease with an accuracy of 100%, helping patients obtain proper treatment. Hossain et al. [66] developed a hybrid of support vector regression and the CSA to handle the multi-objective optimization of microalgae-based wastewater treatment. Ke et al. [67] proposed an enhanced CSA to deal with energy optimization problems, and the results showed that the approach can obtain proper solutions with lower calculation times. Khattab et al. [68] developed a novel crow spiral-based search algorithm to solve a formulated filter design problem, and the gained results confirmed the success of the design. Kumar et al. [69] applied a hybrid CSA with an arithmetic crossover to two real-world engineering optimization problems and gained effective results. Li et al. [70] designed an improved CSA with an extreme learning machine model to effectively forecast short-term wind power.

2.2. Kernel Extreme Learning Machine (KELM)

The extreme learning machine (ELM) [71] is a class of machine learning methods based on feed-forward neural networks. A traditional ELM has a single hidden layer and is considered to offer a fast learning rate and good generalization compared with other shallow learning systems, such as the single-layer perceptron and the support vector machine (SVM). The ELM algorithm randomly generates the connection weights between the input layer and the hidden layer as well as the thresholds of the hidden-layer neurons. Therefore, during training, one only needs to set the number of hidden-layer neurons, after which the unique optimal solution can be obtained.
KELM was designed by adding a radial basis function (RBF) kernel to the ELM [72]. The RBF kernel maps samples into a high-dimensional feature space, which further helps solve nonlinear problems. In addition, the RBF kernel has few parameters: only the penalty factor C and the kernel parameter need to be considered. However, the choice of these hyperparameters has a certain impact on the model fit; therefore, it is necessary to select an appropriate hyperparameter optimization method according to the specific problem.
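To make the formulation concrete, the following minimal Python sketch implements a KELM with an RBF kernel along the lines described above. It uses the standard closed-form output weights $\beta = (I/C + \Omega)^{-1}T$, where $\Omega$ is the kernel matrix; the class name and default parameter values are illustrative assumptions, not the authors’ exact implementation.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    # Pairwise squared Euclidean distances -> RBF kernel matrix
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

class KELM:
    def __init__(self, C=1.0, gamma=0.1):
        self.C, self.gamma = C, gamma  # penalty factor and kernel parameter

    def fit(self, X, y):
        self.X = X
        n = X.shape[0]
        omega = rbf_kernel(X, X, self.gamma)
        # Closed-form output weights: beta = (I/C + Omega)^-1 @ y
        self.beta = np.linalg.solve(np.eye(n) / self.C + omega, y)
        return self

    def predict(self, X_new):
        return rbf_kernel(X_new, self.X, self.gamma) @ self.beta
```

For binary classification, the labels can be encoded as ±1 and the sign of the predicted output taken as the class.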

2.3. Crow Search Algorithm (CSA)

Recently, many metaheuristics have been proposed to solve global optimization problems, such as the slime mould algorithm (SMA) (https://aliasgharheidari.com/SMA.html, accessed on 1 April 2020) [73], weighted mean of vectors (INFO) (https://aliasgharheidari.com/INFO.html, accessed on 1 June 2022) [74], colony predation algorithm (CPA) [75], Harris hawks optimization (HHO) (https://aliasgharheidari.com/HHO.html, accessed on 1 August 2019) [76], Runge Kutta optimizer (RUN) (https://aliasgharheidari.com/RUN.html, accessed on 1 November 2021) [77], and the hunger games search (HGS) [78]. These optimizers have shown great potential in many fields, such as medicine [79,80,81], energy [82,83], finance [84,85], education [86], and engineering [87,88]. The CSA is one of these members; it simulates the following behaviors of crows: (a) crows live in flocks; (b) crows memorize the positions of their hiding places; (c) crows follow each other to commit thievery; (d) crows protect their caches from being pilfered [51].
Suppose there exists a $d$-dimensional search space, the number of crows (the flock size) is $N$, and the position of $crow_i$ at time (iteration) $iter$ is defined by a vector $x^{i,iter}$ ($i = 1, 2, \ldots, N$; $iter = 1, 2, \ldots, iter_{max}$), where $x^{i,iter} = [x_1^{i,iter}, x_2^{i,iter}, \ldots, x_d^{i,iter}]$ and $iter_{max}$ is the maximum number of iterations. As the iterations proceed, the hiding position of $crow_i$ is represented by $m^{i,iter}$: this is the best position it has obtained so far. In fact, each crow remembers, in its memory, what it considers to be the optimal position. The crow then moves through the search space looking for a better location for food.
Suppose that at iteration $iter$, $crow_j$ wants to visit its hiding place $m^{j,iter}$. At this iteration, $crow_i$ decides to follow $crow_j$ and approach $crow_j$’s hideout. In this case, two states may occur:
State 1: $crow_j$ does not know that $crow_i$ is following it. As a result, $crow_i$ will approach $crow_j$’s hiding place. In this case, the new position of $crow_i$ is obtained as follows:
$$x^{i,iter+1} = x^{i,iter} + r_i \times fl^{i,iter} \times (m^{j,iter} - x^{i,iter})$$
where $r_i$ is a random value in [0, 1], and $fl^{i,iter}$ is the flight length of $crow_i$ at iteration $iter$.
State 2: $crow_j$ knows that $crow_i$ is following it. Therefore, to keep its food from being stolen, $crow_j$ will deceive $crow_i$ by flying to another position.
In general, state 1 and state 2 can be expressed as:
$$x^{i,iter+1} = \begin{cases} x^{i,iter} + r_i \times fl^{i,iter} \times (m^{j,iter} - x^{i,iter}) & r_j \geq AP^{j,iter} \\ \text{a random position} & \text{otherwise} \end{cases}$$
Among them, $r_j$ is a random number uniformly distributed between 0 and 1, and $AP^{j,iter}$ represents the awareness probability of $crow_j$ at iteration $iter$.
A metaheuristic algorithm should supply a good equilibrium between exploration and exploitation. In the CSA, the awareness probability (AP) parameter controls this balance. By decreasing the awareness probability, the CSA tends to search in the local area around the current good solutions; therefore, a smaller AP value enhances the local search capability. Conversely, as the awareness probability increases, the ability to refine locally optimal solutions decreases and is replaced by an improved ability to search the global space.
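The classical CSA update of Equation (2) can be summarized in a short Python sketch; the objective function, bounds, and parameter values below (AP = 0.1, fl = 2) are illustrative assumptions rather than settings used later in this paper.

```python
import numpy as np

def csa(f, lb, ub, dim, N=30, iter_max=500, AP=0.1, fl=2.0):
    rng = np.random.default_rng(0)
    x = rng.uniform(lb, ub, (N, dim))      # crow positions
    mem = x.copy()                         # hiding places (memories)
    fit_mem = np.apply_along_axis(f, 1, mem)
    for _ in range(iter_max):
        for i in range(N):
            j = rng.integers(N)            # crow i follows a random crow j
            if rng.random() >= AP:         # state 1: crow j is unaware
                new = x[i] + rng.random() * fl * (mem[j] - x[i])
            else:                          # state 2: fly to a random position
                new = rng.uniform(lb, ub, dim)
            if np.all((new >= lb) & (new <= ub)):  # keep only feasible moves
                x[i] = new
                fx = f(new)
                if fx < fit_mem[i]:        # update memory if improved
                    mem[i], fit_mem[i] = new, fx
    best = np.argmin(fit_mem)
    return mem[best], fit_mem[best]

# Example: minimize the sphere function in five dimensions
best_x, best_f = csa(lambda v: np.sum(v**2), -10.0, 10.0, dim=5)
```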

3. Proposed GLLCSA

3.1. Gradient Search Rule and Local Escaping Operator and Levy Flight Operator (GLL)

In this part, we introduce the GLL mechanism, which aims to achieve better performance for the CSA and prevent it from falling into local optima (LO). In this paper, the GSR, LEO, and LM operators are used to improve the position-updating function of the CSA.
The GSR strategy has shown great capability in many tasks [89,90,91,92,93]. In this study, the GSR strategy is mixed into the position update of the CSA to enhance the global search capability, yielding a population-based search method with strong robustness. The GSR can be expressed as:
$$GSR = randn \times \rho_1 \times \frac{2 \Delta x \times x_n}{x_{worst} - x_{best} + \varepsilon}$$
where $randn$ is a normally distributed random number, $\Delta x$ is the increment value, and $\varepsilon$ is a small number in [0, 0.1]. $x_{best}$ and $x_{worst}$ represent the best and worst solutions obtained during the search, respectively. Equation (3) updates the current solution. In addition, to better enhance the overall effect of the algorithm, $\rho_1$ is introduced into the equation to modify the GSR, as detailed below. The purpose of the optimization algorithm is to search both locally and globally to achieve optimal results; therefore, the GSR is updated with adaptive coefficients. $\rho_1$ is regarded as the vital variable in the gradient-based optimizer (GBO) for achieving this purpose, and it can be expressed as:
$$\rho_1 = 2 \times rand \times \alpha - \alpha$$
$$\alpha = \left| \beta \times \sin\left( \frac{3\pi}{2} + \sin\left( \beta \times \frac{3\pi}{2} \right) \right) \right|$$
$$\beta = \beta_{min} + (\beta_{max} - \beta_{min}) \times \left( 1 - \left( \frac{m}{M} \right)^{3} \right)^{2}$$
where $\beta_{min}$ and $\beta_{max}$ are equal to 0.2 and 1.2, respectively; $m$ is the current iteration number; and $M$ is the total number of iterations.
The GSR mechanism enhances the ability of the CSA to search the global space. In Equation (3), $\Delta x$ is determined by the difference between the best solution ($x_{best}$) and a randomly selected position ($x_{r1}^i$) (see Equations (7)–(9)). Parameter $\delta$ is defined by Equation (9). To improve exploration, a random number ($rand$) is incorporated in Equation (9).
$$\Delta x = rand(1:N) \times |step|$$
$$step = \frac{(x_{best} - x_{r1}^i) + \delta}{2}$$
$$\delta = 2 \times rand \times \left( \left| \frac{x_{r1}^i + x_{r2}^i + x_{r3}^i + x_{r4}^i}{4} - x_n^i \right| \right)$$
where $rand(1:N)$ is an $N$-dimensional random vector; $r_1$, $r_2$, $r_3$, and $r_4$ ($r_1 \neq r_2 \neq r_3 \neq r_4 \neq n$) are distinct integers randomly chosen from 1 to $N$; and $step$ is the step size, determined by $x_{best}$ and $x_{r1}^i$.
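The following sketch shows how the GSR quantities of Equations (3)–(9) fit together for one solution, assuming the population is stored as an (N, dim) NumPy array; the function name and argument layout are illustrative.

```python
import numpy as np

def gsr_step(pop, n, x_best, x_worst, m, M,
             beta_min=0.2, beta_max=1.2, eps=0.05):
    N, dim = pop.shape
    rng = np.random.default_rng()
    # Adaptive coefficient rho1 (Equations (4)-(6))
    beta = beta_min + (beta_max - beta_min) * (1 - (m / M) ** 3) ** 2
    alpha = abs(beta * np.sin(3 * np.pi / 2 + np.sin(beta * 3 * np.pi / 2)))
    rho1 = 2 * rng.random() * alpha - alpha
    # Four distinct random indices, all different from n (Equation (9))
    r1, r2, r3, r4 = rng.choice([k for k in range(N) if k != n], 4, replace=False)
    delta = 2 * rng.random() * np.abs((pop[r1] + pop[r2] + pop[r3] + pop[r4]) / 4
                                      - pop[n])
    step = ((x_best - pop[r1]) + delta) / 2           # Equation (8)
    dx = rng.random(dim) * np.abs(step)               # Equation (7)
    # Gradient search rule (Equation (3))
    gsr = rng.standard_normal(dim) * rho1 * (2 * dx * pop[n]) \
          / (x_worst - x_best + eps)
    return gsr, rho1, dx
```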
On the other hand, the local escaping operator (LEO) is incorporated into the CSA to improve the search performance in the local space. The principle is to steer the current solution toward the best solution found so far in the local space, thus speeding up convergence. The LEO can be expressed as Equation (10):
$$LEO = rand \times \rho_2 \times (x_{best} - x_n)$$
where $rand$ is a random number in [0, 1], and $\rho_2$ is a random coefficient based on $\alpha$ that gives the vector different step sizes. $\rho_2$ is given by:
$$\rho_2 = 2 \times rand \times \alpha - \alpha$$
Finally, Equations (12) and (13) update the position of the current vector ($x_n^i$) based on the GSR and LEO mechanisms.
$$X1_n^i = x_n^i - GSR + LEO$$
$$X1_n^i = x_n^i - randn \times \rho_1 \times \frac{2 \Delta x \times x_n^i}{yp_n^i - yq_n^i + \varepsilon} + rand \times \rho_2 \times (x_{r1}^i - x_{r2}^i)$$
By replacing the current vector ($x_n^i$) with the position of the best vector ($x_{best}$) in Equation (13), the new vector ($X2_n^i$) can be generated as follows:
$$X2_n^i = x_{best} - randn \times \rho_1 \times \frac{2 \Delta x \times x_n^i}{yp_n^i - yq_n^i + \varepsilon} + rand \times \rho_2 \times (x_{r1}^i - x_{r2}^i)$$
where
$$yp^i = rand \times \left( \frac{z^{i+1} + x^i}{2} + rand \times \Delta x \right)$$
$$yq^i = rand \times \left( \frac{z^{i+1} + x^i}{2} - rand \times \Delta x \right)$$
The above position-update method searches along a direction in the local space. Equations (13) and (14) each have advantages and disadvantages in the search: Equation (13) performs better in the global space but worse in the local space, while Equation (14) is the opposite. Therefore, the CSA is based on Equations (13) and (14) to balance the exploration and exploitation capabilities. Thus, based on the positions $X1_n^i$ and $X2_n^i$, the vectors $x_n^{i+1}$ and $X3_n^i$ can be defined as:
$$x_n^{i+1} = r_a \times \left( r_b \times X1_n^i + (1 - r_b) \times X2_n^i \right) + (1 - r_a) \times X3_n^i$$
$$X3_n^i = X_n^i - \rho_1 \times \left( X2_n^i - X1_n^i \right)$$
where $r_a$ and $r_b$ are two random numbers in [0, 1].
Combining the above update with the original CSA rule in Equation (2) yields the overall position update:
$$x^{i+1} = \begin{cases} r_a \times (r_b \times X1 + (1 - r_b) \times X2) + (1 - r_a) \times X3 & r_j < AP \\ x^i + r_i \times fl^i \times (m^j - x^i) & r_j \geq AP \end{cases}$$
The LEO then generates a further candidate solution $x_{LEO}^i$:
$$x_{LEO}^i = x^i + f_1 \times (u_1 \times x_{best} - u_2 \times x_k^i) + f_2 \times \rho_1 \times \frac{u_3 \times (X2 - X1) + u_2 \times (x_{r1}^i - x_{r2}^i)}{2}$$
where $f_1$ is a uniform random number in [−1, 1]; $f_2$ is a random number drawn from the standard normal distribution; and $u_1$, $u_2$, $u_3$ are random numbers defined in Equations (21)–(23):
$$u_1 = \begin{cases} 2 \times rand & \text{if } \mu_1 < 0.5 \\ 1 & \text{otherwise} \end{cases}$$
$$u_2 = \begin{cases} rand & \text{if } \mu_1 < 0.5 \\ 1 & \text{otherwise} \end{cases}$$
$$u_3 = \begin{cases} rand & \text{if } \mu_1 < 0.5 \\ 1 & \text{otherwise} \end{cases}$$
where $rand$ is a random number in [0, 1], and $\mu_1$ is a random number in [0, 1]. The above equations can be further expressed as follows:
$$u_1 = L_1 \times 2 \times rand + (1 - L_1)$$
$$u_2 = L_1 \times rand + (1 - L_1)$$
$$u_3 = L_1 \times rand + (1 - L_1)$$
where $L_1$ is a binary parameter with a value of 0 or 1. If $\mu_1$ is less than 0.5, the value of $L_1$ is 1; otherwise, it is 0.
To determine the solution $x_k^i$ in Equation (20), the following scheme is suggested:
$$x_k^i = \begin{cases} x_{rand} & \text{if } \mu_2 < 0.5 \\ x_p^i & \text{otherwise} \end{cases}$$
$$x_{rand} = X_{min} + rand(0, 1) \times (X_{max} - X_{min})$$
where $x_{rand}$ is a new randomly generated solution, $x_p^i$ is a randomly selected solution from the population ($p \in \{1, 2, \ldots, N\}$), and $\mu_2$ is a random number in the range [0, 1]. Equation (27) can be simplified as:
$$x_k^i = L_2 \times x_p^i + (1 - L_2) \times x_{rand}$$
where $L_2$ is a binary parameter with a value of 0 or 1. If $\mu_2$ is less than 0.5, the value of $L_2$ is 1; otherwise, it is 0.
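Putting Equations (20)–(30) together, the LEO candidate generation can be sketched as follows; the helper name, population layout, and reuse of `X1`, `X2`, and `rho1` from the GSR step are illustrative assumptions.

```python
import numpy as np

def leo_candidate(pop, n, x_best, X1, X2, rho1, lb, ub):
    N, dim = pop.shape
    rng = np.random.default_rng()
    f1 = rng.uniform(-1, 1)            # uniform random number in [-1, 1]
    f2 = rng.standard_normal()         # standard-normal random number
    # Binary switch L1 and coefficients u1, u2, u3 (Equations (24)-(26))
    L1 = 1.0 if rng.random() < 0.5 else 0.0
    u1 = L1 * 2 * rng.random() + (1 - L1)
    u2 = L1 * rng.random() + (1 - L1)
    u3 = L1 * rng.random() + (1 - L1)
    # Binary switch L2 and the solution x_k (Equations (28)-(30))
    L2 = 1.0 if rng.random() < 0.5 else 0.0
    x_rand = lb + rng.random(dim) * (ub - lb)   # Equation (28)
    p = rng.integers(N)                         # random population member
    x_k = L2 * pop[p] + (1 - L2) * x_rand       # Equation (30)
    r1, r2 = rng.choice(N, 2, replace=False)
    # Local escaping candidate (Equation (20))
    return (pop[n] + f1 * (u1 * x_best - u2 * x_k)
            + f2 * rho1 * (u3 * (X2 - X1) + u2 * (pop[r1] - pop[r2])) / 2)
```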
Moreover, the Levy mutation mechanism (LM) is adopted to update $x^i$, as expressed in Equation (31).
$$x_{new}^i = x^i \times (1 + 0.5 \times L(\beta))$$
where $\beta$ is the stability index. The Levy random number can be described by the following formula:
$$Levy(\beta) \sim \phi \times \frac{\mu}{|v|^{1/\beta}}$$
where $\mu$ and $v$ follow standard normal distributions, $\Gamma$ is the standard Gamma function, $\beta = 1.5$, and $\phi$ is given by:
$$\phi = \left[ \frac{\Gamma(1 + \beta) \times \sin(\pi \beta / 2)}{\Gamma\left( \frac{1 + \beta}{2} \right) \times \beta \times 2^{\frac{\beta - 1}{2}}} \right]^{1/\beta}$$
where $L(\beta)$ is the random number drawn from the Levy distribution. The LM operator is likely to generate diverse offspring because of its heavy-tailed distribution; hence, it can help individuals escape local optima at little extra cost. The detailed implementation of GLLCSA is described below.
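Equations (32) and (33) correspond to Mantegna’s algorithm for generating Levy-distributed steps, so the LM operator can be sketched as follows, with $\beta = 1.5$ as stated above; the function names are illustrative.

```python
import numpy as np
from math import gamma, sin, pi

def levy(beta=1.5, size=1, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    # Mantegna's algorithm: phi from Equation (33), step = mu / |v|^(1/beta)
    phi = (gamma(1 + beta) * sin(pi * beta / 2)
           / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    mu = rng.standard_normal(size) * phi   # numerator ~ N(0, phi^2)
    v = rng.standard_normal(size)          # denominator ~ N(0, 1)
    return mu / np.abs(v) ** (1 / beta)    # heavy-tailed Levy step

def levy_mutation(x, beta=1.5):
    # Equation (31): perturb a solution with a Levy-distributed factor
    return x * (1 + 0.5 * levy(beta, size=x.shape[0]))
```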

3.2. GLLCSA

We combined all the above strategies; the pseudo-code for the whole GLLCSA framework is shown in Algorithm 1, and Figure 1 presents the flowchart of GLLCSA. In the proposed GLLCSA, almost all parameters are affected by the group size. However, the new tactics do not increase the complexity of the original algorithm too much. GLLCSA adopts a gradient search strategy that uses the Newton method to adjust the gradient descent rate, which coordinates the behavior of each individual and improves the reliability of the algorithm's search.
Algorithm 1: Pseudo-code of GLLCSA
Step 1. Initialization
  Randomly initialize the location of N-group crows in the search space
  Assess the location of crows
  Initialize the memory of crows
  Specify the best and worst solutions $x_{best}^m$ and $x_{worst}^m$
Step 2. Main loop
  while ($iter < iter_{max}$)
    for i = 1:N
      Randomly choose a crow to follow
      Define perception probability
      if $r_j \geq AP^{j,iter}$
        Calculate the position using Equation (1)
      else
        for d = 1:D
          if $FES/iter_{max}$
            $x_d$ = a random position in the search space
          else
            Select randomly $r_1 \neq r_2 \neq r_3 \neq r_4 \neq n$ in the range [1, N]
            Calculate the position using Equation (19)
          end if
        end for
      end if
      Local escaping operator
      Calculate the position using Equation (20)
      Update the positions $x_{best}^i$ and $x_{worst}^i$
      Create the new position $x_{new}$ using Equation (30)
      if $x_{new}$ is better than $x^i$
         $x^i = x_{new}$
      end if
    end for
    Inspect the feasibility of the new positions
    Assess the new location of the crows
    Update the memory of each crow
  end while
Step 3. Return g_best
According to Figure 1, the time complexity of GLLCSA relates to the following factors: initialization, fitness evaluation, individual location update, ranking, the gradient search mechanism, the local escaping operator, and Levy flight mutation. Since the time consumed by fitness evaluation is determined by the specific optimization problem, the computational complexity analysis centers on the other six aspects. Assuming a population of N agents, the computational complexity of initialization and the sorting mechanism is O(N) + O(N × log N). The computational complexity of the individual location update is O(G × N × D), where G is the maximum iteration number and D is the dimension. The complexity of the GSR, LEO, and LM update stages is also O(G × N × D). Therefore, the computational complexity of GLLCSA is O(N + N × log N) + O(G × D × (2N + 1)), which is of the same order as the original CSA.

4. Proposed GLLCSA-KELM-FS Model

In this section, the proposed GLLCSA-KELM-FS model is introduced in detail, as shown in Figure 2. First, the input data are preprocessed: the features are normalized, and feature selection eliminates redundant features. Ten-fold cross-validation was used to avoid overfitting. For the KELM technique, the input data are mapped into the hidden-layer space through the RBF kernel, which involves the two hyperparameters C and γ to be optimized together with the n features. The proposed GLLCSA was adopted to optimize these two hyperparameters. Specifically, the dataset was divided into 10 parts, with 9 parts selected as training data and 1 part used as test data, in turn. The nine training parts were further divided into five parts, four of which were selected, in turn, as training data and combined with GLLCSA to train the KELM model; the remaining part served as validation data for evaluating the trained KELM model and obtaining an optimal one. Finally, the originally reserved test part was used to evaluate the performance of the resulting KELM model.
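As an illustration of this scheme, the sketch below encodes one GLLCSA agent as the two KELM hyperparameters plus a binary feature mask and scores it with a simple inner five-fold split. The encoding ranges, helper names, and the assumption of ±1 class labels are illustrative; the `KELM` class is the one sketched in Section 2.2.

```python
import numpy as np

def decode(agent, n_features):
    # Agent values assumed in [0, 1]; map the first two to log-scaled C, gamma
    C = 10 ** (agent[0] * 10 - 5)            # e.g., C in [1e-5, 1e5]
    gamma = 10 ** (agent[1] * 10 - 5)        # kernel parameter on a log scale
    mask = agent[2:2 + n_features] > 0.5     # binary feature-selection mask
    return C, gamma, mask

def fitness(agent, X, y, inner_folds=5):
    # Returns the inner-CV classification error that GLLCSA minimizes
    C, gamma, mask = decode(agent, X.shape[1])
    if not mask.any():
        return 1.0                           # penalize empty feature subsets
    Xs = X[:, mask]
    idx = np.arange(len(y)) % inner_folds    # simple inner fold assignment
    errs = []
    for k in range(inner_folds):
        tr, te = idx != k, idx == k
        pred = KELM(C, gamma).fit(Xs[tr], y[tr]).predict(Xs[te])
        errs.append(np.mean(np.sign(pred) != y[te]))  # labels assumed in {-1, +1}
    return float(np.mean(errs))
```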

5. Test Experiments of Benchmark Function Sets

In the experimental part, GLLCSA was evaluated from a variety of aspects through a set of experiments on benchmarks and practical problems. To ensure the fairness of the experiments, we adopted the same environment and parameter settings within the same experiment. The population size and the maximum number of iterations were set to 30 and 500, respectively. Each algorithm was run independently on each function a specified number of times to reduce the influence of randomness. In this experiment, two main metrics were used to test and estimate the proposed algorithm’s performance: the average result (AVG) and the standard deviation (STD). In addition, the optimal result obtained on each test function is shown in bold.

5.1. Benchmark Functions

To compare the proposed algorithm and other algorithms, this experiment used 30 classical functions, including the unimodal functions, multimodal functions, hybrid functions, and composition functions. These 30 functions are all taken from CEC 2014 [94]. F1–F3 represent the unimodal functions, F4–F16 are simple multimodal functions, F17–F22 are hybrid functions, and F23–F30 are composition functions. Thirty different types of benchmarks allow a more comprehensive evaluation of the performance of the proposed algorithms.

5.2. Comparison with Classical Algorithms

In this section, we compare GLLCSA with BA [95], SCA [96], PSO, MFO, GWO, CESCA [12], OBLGWO [97], IGWO [86], CGPSO [98], CBA [99], and RCBA [100] on CEC 2014 to test its performance. Among these functions, the unimodal functions (F1–F3) test the local search ability of an algorithm, the multimodal functions (F4–F16) test its global search ability, the hybrid functions (F17–F22) test its combined global and local ability, and the composition functions (F23–F30) can be used to fully test the performance of the algorithm in all aspects. Some of the parameter settings for the different algorithms are shown in Table 1. The unimodal and multimodal functions have been used in a great deal of the literature, as shown in Table 2, Table 3 and Table 4. The parameter $dim$ expresses the dimension of the selected objective function, the parameter $Range$ gives the boundary of the function search space, and the parameter $F_{min}$ represents the optimal value of the function.
Moreover, the value of dim is set to 30, the population size is set to 30, and the maximum number of iterations is 1000. In addition, we performed 30 tests for each function.

5.3. Results on 30D Functions

The proposed GLLCSA displays the lowest average value on the 30 functions, and the detailed comparison results of GLLCSA and the other eleven peers can be seen in Table 5. In other words, the improved algorithm obtains better results than the other algorithms on the test functions. Moreover, the results on F2, F3, F5, F7, F8, F10, F14, F17, F18, F21, and F23–F29 bear out the ability of this method to obtain the highest-quality solutions. Compared with the other competitive algorithms, the statistical performance of GLLCSA is verified on the CEC 2014 benchmark functions. For further statistical comparison, the ranking of the 12 algorithms according to the ARV index is shown in Table 6. According to the ARV statistic, when searching for the minimum value of the function, GLLCSA is better than the other methods, followed by IGWO, CGPSO, OBLGWO, PSO, BA, GWO, RCBA, CBA, MFO, SCA, and CESCA. Table 7 gives the Wilcoxon signed-rank test [101], which indicates whether the difference between GLLCSA and the other algorithms is significant. As can be seen from the table, GLLCSA performs better than SCA and CESCA on all test functions. In conclusion, compared with SCA, CESCA, and the other nine algorithms, GLLCSA achieves the best results on these test functions. In addition, to display the significant advantages of GLLCSA, Figure 3 shows the convergence curves of GLLCSA, IGWO, CGPSO, OBLGWO, PSO, BA, GWO, RCBA, CBA, MFO, SCA, and CESCA on the 30 benchmarks. As shown in the figure, for problems such as F2 and F3, the convergence curves show that, compared with the other algorithms, GLLCSA improves both the convergence speed and the convergence accuracy over the original CSA. Furthermore, GLLCSA can reach the optimal solution when dealing with F5, F8, F10, F17, F18, F21, F27, F28, and F29. Since both F12 and F13 are multimodal functions, there are obviously many locally optimal solutions for these two functions. Although GLLCSA also falls into a local optimum on these, its convergence speed and accuracy remain the best among the compared algorithms. This proves that the GLL-based CSA has good exploration ability for avoiding local optima. Therefore, GLLCSA has clear advantages over the original CSA.

6. Predicting Entrepreneurial Intention of Students

6.1. Data Collection

The data for the study were derived from 842 graduates of Wenzhou Vocational and Technical College in Zhejiang, China, over five years. The dataset covers a total of ten features, including gender, political affiliation (PA), major, place of student’s source (PSS), family financial situation (FFS), practical experience in innovation and entrepreneurship on campus (PEIEC), training course of innovation and entrepreneurship (TCIE), grade point average (GPA), scholarship awards (SA), and proactive personality (PP). In the program, the above ten features are labeled F1–F10, in turn. A detailed description of the dataset can be found in reference.

6.2. Condition Configuration

This experiment was carried out in the same simulation environment, based on the Windows 10 system and MATLAB 2016a. The experimental setting is important in computational sciences, such as drug discovery [102,103], information retrieval services [104,105,106], network analysis [107,108], active surveillance [109], and disease prediction [110,111,112]. For the sake of fairness, this experiment used statistical measures, including the mean and standard deviation, over ten identical runs. Specifically, the average result (Avg) and standard deviation (Std) represent the average prediction result and the standard deviation of each model over the ten runs, respectively. To find the best model, we evaluated each model using 10-fold cross-validation, which has been adopted in many studies [43,44,113]. Moreover, the prediction results were analyzed using common metrics [114,115,116], including accuracy (ACC), sensitivity (Sens), specificity (Spec), and the Matthews correlation coefficient (MCC).
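For reference, the four metrics can be computed directly from the binary confusion matrix, as in the following illustrative helper (labels assumed to be 0/1):

```python
import numpy as np

def metrics(y_true, y_pred):
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    acc = (tp + tn) / (tp + tn + fp + fn)
    sens = tp / (tp + fn)                    # true positive rate
    spec = tn / (tn + fp)                    # true negative rate
    mcc = (tp * tn - fp * fn) / np.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return acc, sens, spec, mcc
```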

6.3. Experiment Results

In this section, we performed statistical analyses comparing the proposed GLLCSA-KELM-FS model with five other models: GLLCSA-KELM, CSA-KELM, KELM, RF, and FKNN. Table 8 shows the average results of the ten runs of GLLCSA-KELM-FS and the other five models, with the best results in bold. It can be seen that the accuracy of the GLLCSA-KELM-FS model in predicting graduates’ entrepreneurial intention is as high as 93.20%, while the accuracies of the other five models are 90.17%, 88.22%, 88.81%, 89.42%, and 87.29%, respectively. For the sensitivity metric, the result of KELM is 0.99% higher than that of GLLCSA-KELM-FS. Regarding the specificity and MCC metrics, the GLLCSA-KELM-FS model achieves the best results.
Table 9 shows the standard deviations over the ten runs for GLLCSA-KELM-FS and the other five models. The GLLCSA-KELM-FS model has the best stability on the ACC, sensitivity, and MCC metrics, while on the specificity index, the RF model is the most stable. For a better illustration, Figure 4 shows the histograms of the above Avg and Std values in detail. It can be seen that the accuracy, specificity, and MCC of the proposed GLLCSA-KELM-FS model are higher than those of the other models. In terms of stability, the specificity of the GLLCSA-KELM-FS model is only inferior to that of RF. The key features of graduates’ entrepreneurial intentions screened by FS in the experiment are shown in Figure 5, in which features F3, F6, and F10 occur 9, 8, and 8 times, respectively; these can be used as the key features for classifying and measuring students’ entrepreneurial intentions.
Table 10 shows the two-tailed t-test results of GLLCSA-KELM-FS against the other comparative methods, including the p-value, statistical t-value (t), and degrees of freedom (df). The p-values of GLLCSA-KELM-FS versus the other five methods are 0.0882, 0.0187, 0.0158, 0.0647, and 0.0200, respectively. At a significance level of 0.05, GLLCSA-KELM-FS performs significantly better than CSA-KELM, KELM, and FKNN.
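A sketch of such a two-tailed t-test over ten runs is shown below; the two accuracy arrays are synthetic placeholders standing in for per-run results, not the paper’s measured values.

```python
import numpy as np
from scipy import stats

# Synthetic per-run accuracies for the proposed model and one baseline
acc_proposed = np.array([0.935, 0.928, 0.931, 0.934, 0.930,
                         0.933, 0.929, 0.936, 0.932, 0.931])
acc_baseline = np.array([0.905, 0.898, 0.902, 0.900, 0.903,
                         0.899, 0.901, 0.904, 0.897, 0.906])

# Two-tailed independent-samples t-test (df = n1 + n2 - 2)
t, p = stats.ttest_ind(acc_proposed, acc_baseline)
print(f"t = {t:.3f}, p = {p:.4f}")  # significant at the 0.05 level if p < 0.05
```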

6.4. Discussion

According to the above research results, graduates’ entrepreneurial intention is affected by many factors. Among them, the major (F3), campus innovation and entrepreneurship practice experience (F6), and positive personality (F10) have the greatest impact. Usually, the majors chosen by students have a necessary connection to their subsequent entrepreneurial intentions. The practical experience of campus innovation and entrepreneurship also promotes students’ entrepreneurial preferences: students who have participated in entrepreneurial practice have significantly stronger entrepreneurial motivation than students without such experience. In addition, it was found that entrepreneurial talents have the characteristics of “endogenous growth”, and entrepreneurial cognition and student behavior are easily affected by a positive personality. A proactive personality, with its capacity for growth in new environments, also moderates students’ entrepreneurial intention: the stronger the proactive personality, the stronger the motivation for entrepreneurial behavior and the stronger the entrepreneurial practice ability. The difficulty of this paper mainly lies in the construction of the GLLCSA-KELM-FS model: first, the design of GLLCSA itself; second, filtering the key features out of a complex dataset; and finally, combining GLLCSA with KELM and FS, which was the key to solving these problems.
This paper also has some limitations. First, the sample data are limited; a sufficient amount of data can effectively avoid the underfitting phenomenon and improve the accuracy of model fitting. Second, the characteristics of graduates’ entrepreneurial intentions involved in this paper still need to be explored further. Finally, the generality of the fitted model needs to be verified further. Owing to its optimization potential, the proposed GLLCSA can also be applied to other complex tasks, such as recommender systems [117,118,119], location-based services [120,121], human motion capture [122], text clustering [123], kayak cycle phase segmentation [124], drug-disease association prediction [125], practical engineering problems [126,127], medical diagnosis [28,128], fault diagnosis [129], and solar cell parameter identification [130].

7. Conclusions and Future Works

This paper constructs a reliable student entrepreneurial intention prediction model, namely the GLLCSA-KELM-FS model. On the one hand, the model optimizes the two hyperparameters of KELM through GLLCSA, aiming to obtain the best fitting model. On the other hand, it uses FS to extract the key features that affect students’ entrepreneurial intentions. The main innovation of this paper is the introduction of the GLL mechanism, which effectively improves the performance of the CSA and obtains the best combination of KELM parameters. The benchmark function comparison experiments effectively verified the performance of GLLCSA. Further, GLLCSA-KELM-FS was used to predict the entrepreneurial intentions of 842 students from Wenzhou Vocational and Technical College in Zhejiang, China, over the past five years. Compared with the five other machine learning methods, the GLLCSA-KELM-FS model can correctly predict students’ entrepreneurial intentions with an accuracy of 93.2%. In addition, the key factors affecting the students’ entrepreneurial intention are the major studied, campus innovation and entrepreneurship practice experience, and positive personality. This research can help colleges and universities investigate how to cultivate students’ entrepreneurial ability more scientifically through the relevant factors, so as to help them position their careers more concretely and rationally. Moreover, it can be used to help cultivate entrepreneurial talents so that they can make more conscious and focused career decisions.
In the follow-up research, the generality of the proposed GLLCSA-KELM-FS will be further improved, and the employment intention of more college students will be predicted. Furthermore, it can also be used to solve other problems, such as disease diagnoses and financial risk predictions [131,132]. Further, the GLLCSA method can also be used to optimize the hyperparameters of other models and solve more complex optimization problems [133,134].

Author Contributions

Conceptualization, L.Z., Y.F., Y.W., Z.C., H.C. and C.X.; Methodology, H.C., C.X., L.Z. and Y.F.; software, L.Z. and Y.F.; validation, H.C., C.X., Y.W., Z.C., Y.F. and L.Z.; formal analysis, L.Z., Y.F., Z.C. and Y.W.; investigation, L.Z., Y.F.; resources, H.C., C.X., Z.C. and Y.W.; data curation, L.Z. and Y.F.; writing—original draft preparation, L.Z. and Y.F.; writing—review and editing, H.C., C.X., L.Z., Y.F., Z.C. and Y.W.; visualization, L.Z., Y.F., Z.C. and Y.W.; supervision, L.Z., Y.F., Z.C. and Y.W.; project administration, L.Z. and Y.F.; funding acquisition, H.C. and C.X. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The data involved in this study can be provided upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Salamzadeh, Y.; Sangosanya, T.A.; Salamzadeh, A.; Braga, V. Entrepreneurial universities and social capital: The moderating role of entrepreneurial intention in the Malaysian context. Int. J. Manag. Educ. 2022, 20, 100609. [Google Scholar] [CrossRef]
  2. Laouiti, R.; Haddoud, M.Y.; Nakara, W.A.; Onjewu, A.-K.E. A gender-based approach to the influence of personality traits on entrepreneurial intention. J. Bus. Res. 2022, 142, 819–829. [Google Scholar] [CrossRef]
  3. Barba-Sánchez, V.; Mitre-Aranda, M.; del Brío-González, J. The entrepreneurial intention of university students: An environmental perspective. Eur. Res. Manag. Bus. Econ. 2022, 28, 100184. [Google Scholar] [CrossRef]
  4. Duong, Q.H.; Nguyen, T.B.N. The impact of perceived regulatory support on social entrepreneurial intention: A survey dataset in Vietnam. Data Brief 2021, 37, 107233. [Google Scholar] [CrossRef]
  5. Suratno; Narmaditya, B.S.; Wibowo, A. Family economic education, peer groups and students’ entrepreneurial intention: The mediating role of economic literacy. Heliyon 2021, 7, e06692. [Google Scholar]
  6. Ashraf, M.A.; Alam, M.M.D.; Alexa, L. Making decision with an alternative mind-set: Predicting entrepreneurial intention toward f-commerce in a cross-country context. J. Retail. Consum. Serv. 2021, 60, 102475. [Google Scholar] [CrossRef]
  7. Liu, X.; Zhao, W.W. Family education? Unpacking parental factors for tourism and hospitality students’ entrepreneurial intention. J. Hosp. Leis. Sport Tour. Educ. 2020, 29, 100284. [Google Scholar] [CrossRef]
  8. Leung, Y.K.; Franken, I.; Thurik, A. Psychiatric symptoms and entrepreneurial intention: The role of the behavioral activation system. J. Bus. Ventur. Insights 2019, 13, e00153. [Google Scholar] [CrossRef]
  9. Iwu, C.G.; Opute, P.A.; Nchu, R.; Eresia-Eke, C.; Tengeh, R.K.; Jaiyeoba, O.; Aliyu, O.A. Entrepreneurship education, curriculum and lecturer-competency as antecedents of student entrepreneurial intention. Int. J. Manag. Educ. 2019, 19, 100295. [Google Scholar] [CrossRef]
  10. Wei, Y.; Lv, H.; Chen, M.; Wang, M.; Heidari, A.A.; Chen, H.; Li, C. Predicting Entrepreneurial Intention of Students: An Extreme Learning Machine with Gaussian Barebone Harris Hawks Optimizer. IEEE Access 2020, 8, 76841–76855. [Google Scholar] [CrossRef]
  11. Zhu, W.; Ma, C.; Zhao, X.; Wang, M.; Heidari, A.A.; Chen, H.; Li, C. Evaluation of Sino Foreign Cooperative Education Project Using Orthogonal Sine Cosine Optimized Kernel Extreme Learning Machine. IEEE Access 2020, 8, 61107–61123. [Google Scholar] [CrossRef]
  12. Lin, A.; Wu, Q.; Heidari, A.A.; Xu, Y.; Chen, H.; Geng, W.; Li, Y.; Li, C. Predicting Intentions of Students for Master Programs Using a Chaos-Induced Sine Cosine-Based Fuzzy K-Nearest Neighbor Classifier. IEEE Access 2019, 7, 67235–67248. [Google Scholar] [CrossRef]
  13. Tu, J.; Lin, A.; Chen, H.; Li, Y.; Li, C. Predict the Entrepreneurial Intention of Fresh Graduate Students Based on an Adaptive Support Vector Machine Framework. Math. Probl. Eng. 2019, 2019, 2039872. [Google Scholar] [CrossRef] [Green Version]
  14. Wei, Y.; Ni, N.; Liu, D.; Chen, H.; Wang, M.; Li, Q.; Cui, X.; Ye, H. An Improved Grey Wolf Optimization Strategy Enhanced SVM and Its Application in Predicting the Second Major. Math. Probl. Eng. 2017, 2017, 9316713. [Google Scholar] [CrossRef] [Green Version]
  15. Gao, H.; Liang, G.; Chen, H. Multi-Population Enhanced Slime Mould Algorithm and with Application to Postgraduate Employment Stability Prediction. Electronics 2022, 11, 209. [Google Scholar] [CrossRef]
  16. Mishra, T.; Kumar, D.; Gupta, S. Students’ employability prediction model through data mining. Int. J. Appl. Eng. Res. 2016, 11, 2275–2282. [Google Scholar]
  17. Jin, Y. Research and Application of the Employment of College Students Based on Bayes Decision-Tree Algorithm. Master’s Thesis, Hefei University, Hefei, China, 2010. [Google Scholar]
  18. Bell, R. Predicting entrepreneurial intention across the university. Educ. Train. 2019, 61, 815–831. [Google Scholar] [CrossRef]
  19. Rahman, F.A.; Yahya, N.; Abdullah, A.M. A Decision Tree Approach for Predicting Students Entrepreneurial Intention. Sindh Univ. Res. J. 2017, 48, 45–50. [Google Scholar]
  20. Djordjevic, D.; Cockalo, D.; Bogetic, S.; Bakator, M. Predicting Entrepreneurial Intentions among the Youth in Serbia with a Classification Decision Tree Model with the QUEST Algorithm. Mathematics 2021, 9, 1487. [Google Scholar] [CrossRef]
  21. Zhao, H.; Zhang, C. An online-learning-based evolutionary many-objective algorithm. Inf. Sci. 2019, 509, 1–21. [Google Scholar] [CrossRef]
  22. Pasha, J.; Dulebenets, M.A.; Fathollahi-Fard, A.M.; Tian, G.; Lau, Y.-Y.; Singh, P.; Liang, B. An integrated optimization method for tactical-level planning in liner shipping with heterogeneous ship fleet and environmental considerations. Adv. Eng. Inform. 2021, 48, 101299. [Google Scholar] [CrossRef]
  23. Dulebenets, M.A.; Pasha, J.; Kavoosi, M.; Abioye, O.F.; Ozguven, E.E.; Moses, R.; Boot, W.R.; Sando, T. Multiobjective Optimization Model for Emergency Evacuation Planning in Geographical Locations with Vulnerable Population Groups. J. Manag. Eng. 2020, 36, 1–17. [Google Scholar] [CrossRef]
  24. Dulebenets, M.A. An Adaptive Polyploid Memetic Algorithm for scheduling trucks at a cross-docking terminal. Inf. Sci. 2021, 565, 390–421. [Google Scholar] [CrossRef]
  25. Rabbani, M.; Oladzad-Abbasabady, N.; Akbarian-Saravi, N. Ambulance routing in disaster response considering variable patient condition: NSGA-II and MOPSO algorithms. J. Ind. Manag. Optim. 2022, 18, 1035–1062. [Google Scholar] [CrossRef]
  26. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef] [Green Version]
  27. Lu, X.; Kanghong, D.; Guo, L.; Wang, P.; Yildizbasi, A. Optimal estimation of the Proton Exchange Membrane Fuel Cell model parameters based on extended version of Crow Search Algorithm. J. Clean. Prod. 2020, 272, 122640. [Google Scholar] [CrossRef]
  28. Li, Q.; Chen, H.; Huang, H.; Zhao, X.; Cai, Z.-N.; Tong, C.; Liu, W.; Tian, X. An Enhanced Grey Wolf Optimization Based Feature Selection Wrapped Kernel Extreme Learning Machine for Medical Diagnosis. Comput. Math. Methods Med. 2017, 2017, 1–15. [Google Scholar] [CrossRef]
  29. Too, J.; Liang, G.; Chen, H. Memory-based Harris hawk optimization with learning agents: A feature selection approach. Eng. Comput. 2021, 2021, 1–22. [Google Scholar] [CrossRef]
Figure 1. Flowchart of GLLCSA.
Figure 2. Schematic diagram of GLLCSA-KELM-FS.
Figure 3. Convergence curves of 9 selected benchmark functions.
Figure 4. The Avg and Std results of involved methods on four indicators.
Figure 5. Frequency of the features selected by GLLCSA-KELM-FS.
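Figure 5 simply counts, for each candidate feature, the number of independent runs in which the wrapper kept it. A minimal tally sketch in Python follows; the run count, feature count, and random masks are placeholders for illustration, not the paper's data:

```python
import numpy as np

# Placeholder binary selection masks: one row per independent run,
# 1 = the feature was kept by the wrapper in that run.
rng = np.random.default_rng(0)
masks = rng.integers(0, 2, size=(10, 15))  # assumed: 10 runs, 15 features

frequency = masks.sum(axis=0)  # selection count per feature across runs
for idx, count in enumerate(frequency, start=1):
    print(f"feature F{idx}: selected in {count}/10 runs")
```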
Table 1. Parameters for involved methods.

| Method | Parameters |
|---|---|
| RCBA | $Q_{min} = 0$; $Q_{max} = 2$ |
| CBA | $Q_{min} = 0$; $Q_{max} = 2$ |
| CGPSO | $w = 1$; $c_1 = 2$; $c_2 = 2$ |
| IGWO | $\beta = 10$; $\omega = 15$; $a \in [0, 2]$ |
| OBLGWO | $a \in [0, 2]$; $a_2 \in [-1, -2]$; $b = 1$ |
| CESCA | $a = 2$; $r_3$ = chaotic sequence |
| GWO | $a \in [0, 2]$ |
| MFO | $b = 1$; $t \in [-1, 1]$; $a \in [-1, -2]$ |
| PSO | $w = 1$; $c_1 = 2$; $c_2 = 2$ |
| SCA | $a = 2$ |
| BA | $A = 0.5$; $r = 0.5$ |
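To make the Table 1 settings concrete, the PSO row ($w = 1$, $c_1 = c_2 = 2$) plugs into the standard velocity and position updates. A minimal sketch, with the search bounds, dimension, and seed chosen only for illustration:

```python
import numpy as np

w, c1, c2 = 1.0, 2.0, 2.0                 # PSO settings from Table 1
rng = np.random.default_rng(0)

dim = 30
x = rng.uniform(-100.0, 100.0, dim)       # current position
v = np.zeros(dim)                         # current velocity
pbest = x.copy()                          # personal best (placeholder)
gbest = rng.uniform(-100.0, 100.0, dim)   # swarm best (placeholder)

# Standard PSO update: inertia term + cognitive pull + social pull.
r1, r2 = rng.random(dim), rng.random(dim)
v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
x = x + v
print(x[:3])
```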
Table 2. Unimodal benchmark functions.

| Function | Dim | Range | $f_{min}$ |
|---|---|---|---|
| $f_1(x) = \sum_{i=1}^{n} x_i^2$ | 30 | [−100, 100] | 0 |
| $f_2(x) = \sum_{i=1}^{n} \lvert x_i \rvert + \prod_{i=1}^{n} \lvert x_i \rvert$ | 30 | [−10, 10] | 0 |
| $f_3(x) = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_j \right)^2$ | 30 | [−100, 100] | 0 |
| $f_4(x) = \max_i \{ \lvert x_i \rvert, 1 \le i \le n \}$ | 30 | [−100, 100] | 0 |
| $f_5(x) = \sum_{i=1}^{n-1} \left[ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right]$ | 30 | [−30, 30] | 0 |
| $f_6(x) = \sum_{i=1}^{n} ([x_i + 0.5])^2$ | 30 | [−100, 100] | 0 |
| $f_7(x) = \sum_{i=1}^{n} i x_i^4 + \mathrm{random}[0, 1)$ | 30 | [−1.28, 1.28] | 0 |
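A few of these unimodal functions implemented directly, as a sanity-check sketch in NumPy (our choice of language, not code from the paper):

```python
import numpy as np

def f1(x):
    # f1: sphere function, minimum 0 at the origin
    return np.sum(x ** 2)

def f5(x):
    # f5: Rosenbrock function, minimum 0 at x = (1, ..., 1)
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1.0) ** 2)

def f7(x, rng=np.random.default_rng(0)):
    # f7: quartic function with additive uniform noise on [0, 1)
    i = np.arange(1, x.size + 1)
    return np.sum(i * x ** 4) + rng.uniform(0.0, 1.0)

print(f1(np.zeros(30)), f5(np.ones(30)))  # both evaluate to 0.0 at their optima
```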
Table 3. Multimodal benchmark functions.

| Function | Dim | Range | $f_{min}$ |
|---|---|---|---|
| $f_8(x) = \sum_{i=1}^{n} -x_i \sin\left(\sqrt{\lvert x_i \rvert}\right)$ | 30 | [−500, 500] | −418.9829 × n |
| $f_9(x) = \sum_{i=1}^{n} \left[ x_i^2 - 10 \cos(2 \pi x_i) + 10 \right]$ | 30 | [−5.12, 5.12] | 0 |
| $f_{10}(x) = -20 \exp\left( -0.2 \sqrt{\frac{1}{n} \sum_{i=1}^{n} x_i^2} \right) - \exp\left( \frac{1}{n} \sum_{i=1}^{n} \cos(2 \pi x_i) \right) + 20 + e$ | 30 | [−32, 32] | 0 |
| $f_{11}(x) = \frac{1}{4000} \sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\left( \frac{x_i}{\sqrt{i}} \right) + 1$ | 30 | [−600, 600] | 0 |
| $f_{12}(x) = \frac{\pi}{n} \left\{ 10 \sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 \left[ 1 + 10 \sin^2(\pi y_{i+1}) \right] + (y_n - 1)^2 \right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, with $y_i = 1 + \frac{x_i + 1}{4}$ | 30 | [−50, 50] | 0 |
| $f_{13}(x) = 0.1 \left\{ \sin^2(3 \pi x_1) + \sum_{i=1}^{n-1} (x_i - 1)^2 \left[ 1 + \sin^2(3 \pi x_{i+1}) \right] + (x_n - 1)^2 \left[ 1 + \sin^2(2 \pi x_n) \right] \right\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4)$ | 30 | [−50, 50] | 0 |

Here $u(x_i, a, k, m) = k (x_i - a)^m$ for $x_i > a$; $0$ for $-a \le x_i \le a$; and $k (-x_i - a)^m$ for $x_i < -a$.
Table 4. Simple multimodal functions.

| Function | Dim | Range | $f_{min}$ |
|---|---|---|---|
| $f_{14}(x) = \left( \frac{1}{500} + \sum_{j=1}^{25} \frac{1}{j + \sum_{i=1}^{2} (x_i - a_{ij})^6} \right)^{-1}$ | 2 | [−65, 65] | 1 |
| $f_{15}(x) = \sum_{i=1}^{11} \left[ a_i - \frac{x_1 (b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4} \right]^2$ | 4 | [−5, 5] | 0.00030 |
| $f_{16}(x) = 4 x_1^2 - 2.1 x_1^4 + \frac{1}{3} x_1^6 + x_1 x_2 - 4 x_2^2 + 4 x_2^4$ | 2 | [−5, 5] | −1.0316 |
| $f_{17}(x) = \left( x_2 - \frac{5.1}{4 \pi^2} x_1^2 + \frac{5}{\pi} x_1 - 6 \right)^2 + 10 \left( 1 - \frac{1}{8 \pi} \right) \cos x_1 + 10$ | 2 | [−5, 5] | 0.398 |
| $f_{18}(x) = \left[ 1 + (x_1 + x_2 + 1)^2 (19 - 14 x_1 + 3 x_1^2 - 14 x_2 + 6 x_1 x_2 + 3 x_2^2) \right] \times \left[ 30 + (2 x_1 - 3 x_2)^2 (18 - 32 x_1 + 12 x_1^2 + 48 x_2 - 36 x_1 x_2 + 27 x_2^2) \right]$ | 2 | [−2, 2] | 3 |
| $f_{19}(x) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{3} a_{ij} (x_j - p_{ij})^2 \right)$ | 3 | [1, 3] | −3.86 |
| $f_{20}(x) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{6} a_{ij} (x_j - p_{ij})^2 \right)$ | 6 | [0, 1] | −3.32 |
| $f_{21}(x) = -\sum_{i=1}^{5} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1}$ | 4 | [0, 10] | −10.1532 |
| $f_{22}(x) = -\sum_{i=1}^{7} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1}$ | 4 | [0, 10] | −10.4028 |
| $f_{23}(x) = -\sum_{i=1}^{10} \left[ (X - a_i)(X - a_i)^T + c_i \right]^{-1}$ | 4 | [0, 10] | −10.5363 |
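Because these fixed-dimension functions are fully specified, their listed optima can be verified numerically; for example, the six-hump camel-back function $f_{16}$. A quick check sketch, with the known minimizer hard-coded:

```python
def f16(x1, x2):
    # Six-hump camel-back function as listed in Table 4
    return 4*x1**2 - 2.1*x1**4 + x1**6/3 + x1*x2 - 4*x2**2 + 4*x2**4

# One of the two symmetric global minimizers known from the literature
print(round(f16(0.0898, -0.7126), 4))  # prints -1.0316, matching the table
```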
Table 5. Comparison results of the GLLCSA and the other eleven peers.

| Method | F1 AVG | F1 STD | F2 AVG | F2 STD | F3 AVG | F3 STD |
|---|---|---|---|---|---|---|
| GLLCSA | 1.238849E+06 | 6.536334E+05 | 8.409119E+03 | 7.920252E+03 | 3.205022E+02 | 1.699287E+01 |
| RCBA | 1.322760E+06 | 4.175292E+05 | 2.985654E+04 | 1.333570E+04 | 3.281185E+02 | 1.067689E+01 |
| CBA | 4.274593E+06 | 1.314117E+06 | 9.365674E+03 | 8.092198E+03 | 4.450858E+03 | 5.531420E+03 |
| CGPSO | 9.372242E+06 | 2.632222E+06 | 1.568360E+08 | 2.017935E+07 | 2.220444E+03 | 5.134707E+02 |
| IGWO | 1.756705E+07 | 7.284733E+06 | 2.995857E+06 | 1.823980E+06 | 6.274603E+03 | 2.779490E+03 |
| OBLGWO | 1.631687E+07 | 7.688354E+06 | 1.592044E+07 | 1.091297E+07 | 9.317280E+03 | 2.669594E+03 |
| CESCA | 1.284071E+09 | 1.868461E+08 | 7.707841E+10 | 3.975591E+09 | 1.047442E+05 | 1.330180E+04 |
| GWO | 7.610796E+07 | 4.502244E+07 | 2.038592E+09 | 2.085598E+09 | 2.983399E+04 | 9.196130E+03 |
| MFO | 9.988797E+07 | 1.240090E+08 | 1.467643E+10 | 1.151737E+10 | 1.061701E+05 | 5.512228E+04 |
| PSO | 8.717867E+06 | 2.411178E+06 | 1.469586E+08 | 1.591299E+07 | 9.769192E+02 | 1.039244E+02 |
| SCA | 2.405193E+08 | 7.441417E+07 | 1.657061E+10 | 3.648448E+09 | 3.778071E+04 | 5.992623E+03 |
| BA | 8.248853E+05 | 3.347618E+05 | 5.541545E+05 | 2.912975E+05 | 5.419018E+02 | 3.886761E+02 |

| Method | F4 AVG | F4 STD | F5 AVG | F5 STD | F6 AVG | F6 STD |
|---|---|---|---|---|---|---|
| GLLCSA | 5.169091E+02 | 3.806454E+01 | 5.199997E+02 | 4.725102E-04 | 6.270897E+02 | 3.332504E+00 |
| RCBA | 4.776711E+02 | 2.908743E+01 | 5.200835E+02 | 7.971845E-02 | 6.378213E+02 | 3.243160E+00 |
| CBA | 4.993125E+02 | 2.933371E+01 | 5.201765E+02 | 1.778137E-01 | 6.427952E+02 | 2.813514E+00 |
| CGPSO | 4.834582E+02 | 5.395728E+01 | 5.209736E+02 | 3.816708E-02 | 6.238705E+02 | 2.721322E+00 |
| IGWO | 5.304485E+02 | 2.917210E+01 | 5.205215E+02 | 1.173440E-01 | 6.189358E+02 | 2.456891E+00 |
| OBLGWO | 5.487494E+02 | 3.820381E+01 | 5.209499E+02 | 6.195169E-02 | 6.197624E+02 | 4.844151E+00 |
| CESCA | 1.213248E+04 | 1.380450E+03 | 5.210273E+02 | 4.423163E-02 | 6.422656E+02 | 9.575703E-01 |
| GWO | 6.626774E+02 | 1.336610E+02 | 5.209206E+02 | 1.472290E-01 | 6.141932E+02 | 3.014494E+00 |
| MFO | 1.363746E+03 | 9.310686E+02 | 5.203519E+02 | 1.788243E-01 | 6.235336E+02 | 4.669785E+00 |
| PSO | 4.654370E+02 | 3.324394E+01 | 5.209432E+02 | 4.365825E-02 | 6.223294E+02 | 2.925522E+00 |
| SCA | 1.412809E+03 | 2.817093E+02 | 5.209057E+02 | 6.801991E-02 | 6.346151E+02 | 2.021926E+00 |
| BA | 4.259904E+02 | 3.313659E+01 | 5.209642E+02 | 4.520320E-02 | 6.344476E+02 | 3.482229E+00 |

| Method | F7 AVG | F7 STD | F8 AVG | F8 STD | F9 AVG | F9 STD |
|---|---|---|---|---|---|---|
| GLLCSA | 7.000141E+02 | 1.328128E-02 | 8.659866E+02 | 1.797264E+01 | 1.042875E+03 | 2.441964E+01 |
| RCBA | 7.000608E+02 | 1.958573E-02 | 1.020876E+03 | 5.029168E+01 | 1.176788E+03 | 6.280943E+01 |
| CBA | 7.000146E+02 | 1.813612E-02 | 1.006982E+03 | 4.961612E+01 | 1.148943E+03 | 6.143216E+01 |
| CGPSO | 7.023717E+02 | 1.416684E-01 | 9.911781E+02 | 1.515705E+01 | 1.119379E+03 | 3.173188E+01 |
| IGWO | 7.009874E+02 | 7.254470E-02 | 8.821336E+02 | 1.573112E+01 | 1.012470E+03 | 1.942523E+01 |
| OBLGWO | 7.012162E+02 | 1.091624E-01 | 9.177031E+02 | 3.428652E+01 | 1.070271E+03 | 4.491907E+01 |
| CESCA | 1.408830E+03 | 4.842119E+01 | 1.215613E+03 | 1.626564E+01 | 1.298773E+03 | 2.019193E+01 |
| GWO | 7.226746E+02 | 2.021748E+01 | 8.857211E+02 | 1.855631E+01 | 1.004773E+03 | 3.139114E+01 |
| MFO | 8.136821E+02 | 6.471887E+01 | 9.413879E+02 | 4.323163E+01 | 1.125361E+03 | 5.746786E+01 |
| PSO | 7.022715E+02 | 1.409068E-01 | 9.722301E+02 | 1.766974E+01 | 1.120378E+03 | 2.327605E+01 |
| SCA | 8.255920E+02 | 2.778636E+01 | 1.046443E+03 | 1.799333E+01 | 1.173766E+03 | 1.806722E+01 |
| BA | 7.006022E+02 | 1.979492E-01 | 1.008752E+03 | 5.409854E+01 | 1.168861E+03 | 6.375010E+01 |

| Method | F10 AVG | F10 STD | F11 AVG | F11 STD | F12 AVG | F12 STD |
|---|---|---|---|---|---|---|
| GLLCSA | 2.264201E+03 | 4.533532E+02 | 4.687098E+03 | 5.546305E+02 | 1.200648E+03 | 2.673387E-01 |
| RCBA | 5.480741E+03 | 6.379767E+02 | 5.750590E+03 | 6.862040E+02 | 1.200654E+03 | 2.777282E-01 |
| CBA | 5.401931E+03 | 5.701332E+02 | 5.919613E+03 | 7.504374E+02 | 1.201058E+03 | 4.862861E-01 |
| CGPSO | 5.373383E+03 | 6.931602E+02 | 5.913799E+03 | 5.650266E+02 | 1.202491E+03 | 3.331429E-01 |
| IGWO | 3.657471E+03 | 5.814534E+02 | 4.484245E+03 | 6.286188E+02 | 1.200666E+03 | 3.219966E-01 |
| OBLGWO | 4.092823E+03 | 8.204995E+02 | 4.991144E+03 | 7.370112E+02 | 1.202322E+03 | 7.309448E-01 |
| CESCA | 8.924540E+03 | 3.024366E+02 | 9.111373E+03 | 3.446813E+02 | 1.203574E+03 | 3.191205E-01 |
| GWO | 3.274683E+03 | 4.986018E+02 | 3.847539E+03 | 6.349137E+02 | 1.201314E+03 | 1.120997E+00 |
| MFO | 4.477430E+03 | 7.369407E+02 | 5.435942E+03 | 7.729281E+02 | 1.200503E+03 | 2.801502E-01 |
| PSO | 5.182674E+03 | 6.313629E+02 | 5.960069E+03 | 5.555818E+02 | 1.202506E+03 | 2.160367E-01 |
| SCA | 6.980019E+03 | 5.441916E+02 | 8.084621E+03 | 3.387779E+02 | 1.202449E+03 | 2.569528E-01 |
| BA | 5.392300E+03 | 8.934519E+02 | 5.657819E+03 | 7.223520E+02 | 1.201083E+03 | 3.178071E-01 |

| Method | F13 AVG | F13 STD | F14 AVG | F14 STD | F15 AVG | F15 STD |
|---|---|---|---|---|---|---|
| GLLCSA | 1.300472E+03 | 9.313515E-02 | 1.400288E+03 | 9.356412E-02 | 1.517640E+03 | 5.369255E+00 |
| RCBA | 1.300516E+03 | 1.258235E-01 | 1.400301E+03 | 9.462535E-02 | 1.535055E+03 | 7.526856E+00 |
| CBA | 1.300500E+03 | 1.300693E-01 | 1.400337E+03 | 1.336849E-01 | 1.565395E+03 | 1.637156E+01 |
| CGPSO | 1.300397E+03 | 8.786822E-02 | 1.400265E+03 | 1.205785E-01 | 1.517596E+03 | 1.308620E+00 |
| IGWO | 1.300567E+03 | 9.829856E-02 | 1.400408E+03 | 2.718569E-01 | 1.517348E+03 | 4.795016E+00 |
| OBLGWO | 1.300547E+03 | 9.772622E-02 | 1.400399E+03 | 1.833313E-01 | 1.514798E+03 | 4.483673E+00 |
| CESCA | 1.308031E+03 | 4.006350E-01 | 1.651215E+03 | 2.028249E+01 | 4.185567E+05 | 1.487072E+05 |
| GWO | 1.300518E+03 | 4.894826E-01 | 1.402728E+03 | 4.920171E+00 | 1.623933E+03 | 3.774340E+02 |
| MFO | 1.302118E+03 | 1.252521E+00 | 1.428573E+03 | 2.037037E+01 | 1.684422E+05 | 3.135423E+05 |
| PSO | 1.300356E+03 | 6.946997E-02 | 1.400291E+03 | 1.227007E-01 | 1.516541E+03 | 1.248870E+00 |
| SCA | 1.302885E+03 | 3.565020E-01 | 1.443320E+03 | 8.453708E+00 | 4.119474E+03 | 2.407495E+03 |
| BA | 1.300512E+03 | 1.304709E-01 | 1.400322E+03 | 1.697501E-01 | 1.527377E+03 | 4.307567E+00 |

| Method | F16 AVG | F16 STD | F17 AVG | F17 STD | F18 AVG | F18 STD |
|---|---|---|---|---|---|---|
| GLLCSA | 1.612099E+03 | 3.725020E-01 | 2.433739E+04 | 1.713144E+04 | 3.336244E+03 | 1.560479E+03 |
| RCBA | 1.613338E+03 | 4.361413E-01 | 1.412722E+05 | 8.486928E+04 | 8.386815E+03 | 8.935484E+03 |
| CBA | 1.613470E+03 | 3.302232E-01 | 2.864083E+05 | 1.670469E+05 | 6.016752E+03 | 3.769997E+03 |
| CGPSO | 1.611763E+03 | 5.165442E-01 | 3.211343E+05 | 1.769758E+05 | 2.393177E+06 | 7.877471E+05 |
| IGWO | 1.611779E+03 | 6.182681E-01 | 9.053915E+05 | 5.364536E+05 | 1.934529E+04 | 2.330403E+04 |
| OBLGWO | 1.611928E+03 | 5.049315E-01 | 1.459806E+06 | 1.005176E+06 | 5.569811E+04 | 7.941705E+04 |
| CESCA | 1.613590E+03 | 1.581820E-01 | 8.364791E+07 | 2.638806E+07 | 4.593663E+09 | 1.237270E+09 |
| GWO | 1.610966E+03 | 4.998869E-01 | 1.209797E+06 | 1.614760E+06 | 1.289388E+07 | 2.785751E+07 |
| MFO | 1.612805E+03 | 5.643754E-01 | 3.956083E+06 | 7.375744E+06 | 8.815251E+06 | 4.711840E+07 |
| PSO | 1.612148E+03 | 5.400342E-01 | 2.987927E+05 | 2.161928E+05 | 2.046221E+06 | 5.303946E+05 |
| SCA | 1.612776E+03 | 2.943395E-01 | 5.762196E+06 | 2.784132E+06 | 1.676141E+08 | 8.080917E+07 |
| BA | 1.613249E+03 | 2.844609E-01 | 8.508355E+04 | 5.457347E+04 | 9.682249E+04 | 3.786119E+04 |

| Method | F19 AVG | F19 STD | F20 AVG | F20 STD | F21 AVG | F21 STD |
|---|---|---|---|---|---|---|
| GLLCSA | 1.936009E+03 | 4.355061E+01 | 2.350262E+03 | 1.405669E+02 | 1.211144E+04 | 6.294113E+03 |
| RCBA | 1.935123E+03 | 3.611263E+01 | 2.465127E+03 | 1.176717E+02 | 8.261323E+04 | 3.897679E+04 |
| CBA | 1.942633E+03 | 4.215181E+01 | 3.049421E+03 | 1.366173E+03 | 1.244635E+05 | 6.750097E+04 |
| CGPSO | 1.917583E+03 | 2.272685E+00 | 2.469900E+03 | 1.300228E+02 | 1.409690E+05 | 9.042172E+04 |
| IGWO | 1.917540E+03 | 1.321243E+01 | 3.492227E+03 | 1.902589E+03 | 3.034434E+05 | 2.487823E+05 |
| OBLGWO | 1.927023E+03 | 4.013226E+01 | 6.189628E+03 | 3.018964E+03 | 5.345087E+05 | 4.128952E+05 |
| CESCA | 2.256105E+03 | 4.942950E+01 | 3.595022E+05 | 1.398009E+05 | 4.080719E+07 | 1.524933E+07 |
| GWO | 1.949393E+03 | 3.049284E+01 | 1.404652E+04 | 6.998666E+03 | 5.552885E+05 | 9.333324E+05 |
| MFO | 1.987788E+03 | 8.918167E+01 | 5.535291E+04 | 3.913887E+04 | 8.545256E+05 | 9.825313E+05 |
| PSO | 1.917247E+03 | 2.179412E+00 | 2.326747E+03 | 5.786955E+01 | 1.019012E+05 | 5.885849E+04 |
| SCA | 1.983328E+03 | 2.776081E+01 | 1.504831E+04 | 4.617492E+03 | 1.421241E+06 | 7.574494E+05 |
| BA | 1.926877E+03 | 2.439285E+01 | 2.456193E+03 | 1.454076E+02 | 6.724830E+04 | 2.886316E+04 |

| Method | F22 AVG | F22 STD | F23 AVG | F23 STD | F24 AVG | F24 STD |
|---|---|---|---|---|---|---|
| GLLCSA | 2.760574E+03 | 2.228345E+02 | 2.500000E+03 | 7.709365E-10 | 2.600000E+03 | 3.588642E-07 |
| RCBA | 3.350704E+03 | 3.621294E+02 | 2.615252E+03 | 5.521060E-03 | 2.682070E+03 | 3.811663E+01 |
| CBA | 3.388456E+03 | 2.979146E+02 | 2.615852E+03 | 2.821484E-01 | 2.680590E+03 | 3.078898E+01 |
| CGPSO | 2.858516E+03 | 2.290369E+02 | 2.500003E+03 | 3.329995E-03 | 2.600025E+03 | 1.197872E-02 |
| IGWO | 2.568791E+03 | 1.619749E+02 | 2.620536E+03 | 2.940441E+00 | 2.600006E+03 | 6.146081E-03 |
| OBLGWO | 2.786901E+03 | 2.091463E+02 | 2.606628E+03 | 3.617981E+01 | 2.601120E+03 | 6.132652E+00 |
| CESCA | 5.696038E+03 | 1.286220E+03 | 3.136433E+03 | 1.242696E+02 | 2.660849E+03 | 2.318336E+01 |
| GWO | 2.585365E+03 | 1.821056E+02 | 2.635366E+03 | 1.134288E+01 | 2.600002E+03 | 1.015076E-03 |
| MFO | 3.067188E+03 | 2.667965E+02 | 2.665304E+03 | 4.109527E+01 | 2.676468E+03 | 2.593551E+01 |
| PSO | 2.872826E+03 | 1.962133E+02 | 2.615934E+03 | 5.329976E-01 | 2.626891E+03 | 5.838959E+00 |
| SCA | 2.992704E+03 | 1.455672E+02 | 2.664581E+03 | 1.270815E+01 | 2.600096E+03 | 1.020093E-01 |
| BA | 3.453378E+03 | 3.118798E+02 | 2.615248E+03 | 2.521083E-03 | 2.670639E+03 | 3.944067E+01 |

| Method | F25 AVG | F25 STD | F26 AVG | F26 STD | F27 AVG | F27 STD |
|---|---|---|---|---|---|---|
| GLLCSA | 2.700000E+03 | 5.744707E-12 | 2.700426E+03 | 1.071929E-01 | 2.900000E+03 | 2.240433E-10 |
| RCBA | 2.735860E+03 | 2.147012E+01 | 2.751366E+03 | 1.065771E+02 | 3.902577E+03 | 4.983783E+02 |
| CBA | 2.727914E+03 | 1.170224E+01 | 2.710743E+03 | 5.599137E+01 | 3.975665E+03 | 4.477796E+02 |
| CGPSO | 2.700000E+03 | 1.890297E-05 | 2.793359E+03 | 2.527157E+01 | 3.038354E+03 | 2.739283E+02 |
| IGWO | 2.709510E+03 | 2.831350E+00 | 2.700683E+03 | 1.629874E-01 | 3.108218E+03 | 2.481974E+00 |
| OBLGWO | 2.700000E+03 | 0.000000E+00 | 2.700556E+03 | 1.453861E-01 | 3.009723E+03 | 2.542824E+02 |
| CESCA | 2.719608E+03 | 8.971841E+00 | 2.712316E+03 | 1.442380E+00 | 4.032420E+03 | 1.596939E+02 |
| GWO | 2.708322E+03 | 5.306966E+00 | 2.773555E+03 | 4.461346E+01 | 3.342896E+03 | 1.415047E+02 |
| MFO | 2.717403E+03 | 8.811953E+00 | 2.702266E+03 | 1.020229E+00 | 3.640968E+03 | 2.082932E+02 |
| PSO | 2.711836E+03 | 5.348456E+00 | 2.770458E+03 | 4.667645E+01 | 3.467269E+03 | 2.969203E+02 |
| SCA | 2.724541E+03 | 7.587825E+00 | 2.702440E+03 | 5.636831E-01 | 3.538978E+03 | 3.131578E+02 |
| BA | 2.729845E+03 | 1.327662E+01 | 2.700501E+03 | 1.446831E-01 | 3.893218E+03 | 4.118836E+02 |

| Method | F28 AVG | F28 STD | F29 AVG | F29 STD | F30 AVG | F30 STD |
|---|---|---|---|---|---|---|
| GLLCSA | 3.000000E+03 | 1.767349E-10 | 4.193382E+03 | 1.228891E+03 | 1.194432E+04 | 1.085100E+04 |
| RCBA | 5.621571E+03 | 9.888195E+02 | 1.177843E+07 | 1.281848E+07 | 2.326497E+04 | 6.945118E+04 |
| CBA | 5.447547E+03 | 6.556685E+02 | 3.404018E+07 | 3.303410E+07 | 1.352624E+04 | 8.715829E+03 |
| CGPSO | 3.135128E+03 | 7.400602E+02 | 2.924359E+04 | 1.324794E+05 | 9.272246E+03 | 9.042430E+03 |
| IGWO | 3.835163E+03 | 2.206870E+02 | 1.038071E+06 | 4.039714E+06 | 2.714712E+04 | 1.318391E+04 |
| OBLGWO | 3.528511E+03 | 5.198397E+02 | 4.052395E+06 | 4.378362E+06 | 1.957810E+04 | 1.178368E+04 |
| CESCA | 5.350238E+03 | 2.666407E+02 | 1.839221E+07 | 2.762806E+06 | 1.493175E+06 | 3.109234E+05 |
| GWO | 3.863889E+03 | 1.781469E+02 | 1.755562E+06 | 4.039784E+06 | 4.526500E+04 | 2.666792E+04 |
| MFO | 3.899794E+03 | 1.681019E+02 | 2.847450E+06 | 3.878730E+06 | 5.845920E+04 | 4.496513E+04 |
| PSO | 6.991791E+03 | 9.394121E+02 | 8.591454E+04 | 1.586345E+05 | 1.367568E+04 | 5.707805E+03 |
| SCA | 4.765201E+03 | 2.990578E+02 | 1.009283E+07 | 4.606232E+06 | 2.364850E+05 | 9.906369E+04 |
| BA | 5.195068E+03 | 7.406253E+02 | 4.474452E+07 | 4.143041E+07 | 1.646326E+04 | 2.423418E+04 |
Table 6. Comparison results of ranking between GLLCSA and other eleven peers.

| Method | Ranking | ARV |
|---|---|---|
| GLLCSA | 1 | 2.300000E+00 |
| RCBA | 8 | 6.566667E+00 |
| CBA | 9 | 6.866667E+00 |
| CGPSO | 3 | 5.200000E+00 |
| IGWO | 2 | 4.766667E+00 |
| OBLGWO | 4 | 5.366667E+00 |
| CESCA | 12 | 1.136667E+01 |
| GWO | 7 | 6.333333E+00 |
| MFO | 10 | 8.166667E+00 |
| PSO | 5 | 5.566667E+00 |
| SCA | 11 | 9.433333E+00 |
| BA | 6 | 6.066667E+00 |
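The ARV column is the average rank value: each algorithm is ranked per function by its AVG error (1 = best), and those ranks are averaged over the 30 functions. A sketch of that computation over a random placeholder matrix standing in for the AVG values of Table 5:

```python
import numpy as np
from scipy.stats import rankdata

# Placeholder AVG matrix: rows = 30 functions, columns = 12 algorithms.
rng = np.random.default_rng(1)
avg = rng.random((30, 12))

ranks = rankdata(avg, axis=1)  # rank per function; ties share the mean rank
arv = ranks.mean(axis=0)       # average rank value per algorithm
print(arv)                     # lower ARV = better overall ranking
```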
Table 7. Wilcoxon test results of GLLCSA and other peers.

| Function | RCBA | CBA | CGPSO | IGWO | OBLGWO | CESCA | GWO | MFO | PSO | SCA | BA |
|---|---|---|---|---|---|---|---|---|---|---|---|
| F1 | 6.73E-01 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 2.85E-02 |
| F2 | 1.36E-05 | 4.41E-01 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 |
| F3 | 8.22E-03 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 2.84E-05 |
| F4 | 8.94E-04 | 1.65E-01 | 1.32E-02 | 1.25E-01 | 1.29E-03 | 1.73E-06 | 3.88E-06 | 3.52E-06 | 5.31E-05 | 1.73E-06 | 1.92E-06 |
| F5 | 1.73E-06 | 1.92E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 |
| F6 | 1.73E-06 | 1.73E-06 | 6.64E-04 | 1.73E-06 | 8.47E-06 | 1.73E-06 | 1.73E-06 | 8.31E-04 | 1.49E-05 | 1.92E-06 | 5.75E-06 |
| F7 | 1.73E-06 | 4.91E-01 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 |
| F8 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 3.88E-04 | 2.60E-06 | 1.73E-06 | 2.26E-03 | 3.18E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 |
| F9 | 1.73E-06 | 2.88E-06 | 1.92E-06 | 9.71E-05 | 6.42E-03 | 1.73E-06 | 8.92E-05 | 3.52E-06 | 1.73E-06 | 1.73E-06 | 1.92E-06 |
| F10 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 2.60E-06 | 2.13E-06 | 1.73E-06 | 1.80E-05 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 |
| F11 | 2.37E-05 | 1.24E-05 | 1.73E-06 | 2.13E-01 | 1.71E-01 | 1.73E-06 | 5.79E-05 | 8.31E-04 | 1.73E-06 | 1.73E-06 | 5.79E-05 |
| F12 | 8.94E-01 | 5.29E-04 | 1.73E-06 | 4.65E-01 | 2.35E-06 | 1.73E-06 | 9.27E-03 | 2.18E-02 | 1.73E-06 | 1.73E-06 | 2.84E-05 |
| F13 | 1.41E-01 | 4.53E-01 | 7.73E-03 | 9.63E-04 | 1.32E-02 | 1.73E-06 | 1.25E-01 | 1.73E-06 | 2.22E-04 | 1.73E-06 | 2.89E-01 |
| F14 | 4.53E-01 | 9.37E-02 | 3.00E-02 | 1.59E-01 | 9.63E-04 | 1.73E-06 | 3.61E-03 | 1.73E-06 | 7.97E-01 | 1.73E-06 | 6.73E-01 |
| F15 | 1.73E-06 | 1.73E-06 | 8.61E-01 | 7.19E-01 | 1.47E-01 | 1.73E-06 | 2.41E-03 | 1.73E-06 | 4.28E-01 | 1.73E-06 | 2.88E-06 |
| F16 | 1.73E-06 | 1.92E-06 | 1.48E-02 | 2.85E-02 | 1.65E-01 | 1.73E-06 | 1.73E-06 | 8.92E-05 | 7.04E-01 | 2.60E-06 | 1.73E-06 |
| F17 | 1.92E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 4.86E-05 |
| F18 | 1.32E-02 | 1.20E-03 | 1.73E-06 | 2.35E-06 | 1.73E-06 | 1.73E-06 | 1.36E-05 | 3.88E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 |
| F19 | 3.39E-01 | 1.78E-01 | 9.10E-01 | 7.19E-02 | 1.11E-02 | 1.73E-06 | 4.49E-02 | 2.41E-03 | 6.58E-01 | 6.16E-04 | 8.13E-01 |
| F20 | 5.29E-04 | 5.22E-06 | 5.32E-03 | 2.60E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 7.66E-01 | 1.73E-06 | 1.48E-02 |
| F21 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.92E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 |
| F22 | 5.22E-06 | 1.92E-06 | 9.78E-02 | 2.96E-03 | 4.78E-01 | 1.73E-06 | 3.85E-03 | 3.72E-05 | 4.95E-02 | 3.59E-04 | 1.73E-06 |
| F23 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 3.18E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 |
| F24 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 3.11E-05 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 |
| F25 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 2.56E-06 | 6.38E-06 | 1.73E-06 | 4.17E-05 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 |
| F26 | 1.83E-03 | 4.68E-03 | 2.13E-06 | 3.18E-06 | 4.90E-04 | 1.73E-06 | 3.18E-06 | 1.73E-06 | 6.89E-05 | 1.73E-06 | 9.37E-02 |
| F27 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 5.70E-02 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 |
| F28 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 8.73E-03 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 |
| F29 | 1.73E-06 | 1.73E-06 | 1.20E-03 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 1.73E-06 | 2.99E-01 | 1.73E-06 | 1.73E-06 |
| F30 | 2.54E-01 | 4.72E-02 | 1.41E-01 | 1.48E-04 | 2.11E-03 | 1.73E-06 | 5.75E-06 | 5.22E-06 | 6.27E-02 | 1.73E-06 | 3.39E-01 |
| +/−/= | 23/1/6 | 24/0/6 | 21/5/4 | 20/4/6 | 22/3/5 | 30/0/0 | 24/5/1 | 28/2/0 | 20/3/7 | 30/0/0 | 23/2/5 |
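Each cell above is a pairwise Wilcoxon signed-rank p-value judged at the 0.05 level, and the final row tallies significant wins, losses, and ties for GLLCSA. A sketch of one such comparison with placeholder per-run scores (SciPy's default two-sided test, assuming minimization):

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(2)
gllcsa = rng.normal(520.0, 0.5, size=30)  # placeholder scores over 30 runs
rival = rng.normal(521.0, 0.5, size=30)   # placeholder scores of one peer

stat, p = wilcoxon(gllcsa, rival)         # paired test, two-sided by default
if p >= 0.05:
    verdict = "="                         # no significant difference
else:
    verdict = "+" if gllcsa.mean() < rival.mean() else "-"
print(f"p = {p:.2e}, verdict: {verdict}")
```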
Table 8. The Avg results of four indicators for GLLCSA-KELM-FS and the other five models.

| Models | ACC | Sensitivity | Specificity | MCC |
|---|---|---|---|---|
| GLLCSA-KELM-FS | 93.20% | 92.93% | 91.00% | 85.19% |
| GLLCSA-KELM | 90.17% | 91.66% | 88.94% | 80.51% |
| CSA-KELM | 88.22% | 92.08% | 83.87% | 76.69% |
| KELM | 88.81% | 93.92% | 83.65% | 77.99% |
| RF | 89.42% | 93.60% | 85.02% | 78.98% |
| FKNN | 87.29% | 91.71% | 84.26% | 75.52% |
Table 9. The Std results of four indicators for GLLCSA-KELM-FS and the other five models.

| Models | ACC | Sensitivity | Specificity | MCC |
|---|---|---|---|---|
| GLLCSA-KELM-FS | 2.18E-02 | 2.26E-02 | 6.30E-02 | 5.11E-02 |
| GLLCSA-KELM | 3.07E-02 | 4.33E-02 | 5.81E-02 | 6.32E-02 |
| CSA-KELM | 4.35E-02 | 5.82E-02 | 9.13E-02 | 8.46E-02 |
| KELM | 3.25E-02 | 3.14E-02 | 6.53E-02 | 6.03E-02 |
| RF | 3.73E-02 | 3.33E-02 | 5.56E-02 | 7.34E-02 |
| FKNN | 5.49E-02 | 5.20E-02 | 1.00E-01 | 1.02E-01 |
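The four indicators in Tables 8 and 9 follow the usual confusion-matrix definitions: accuracy, sensitivity, specificity, and the Matthews correlation coefficient (MCC). A small sketch with hypothetical counts, not the study's actual confusion matrix:

```python
import math

def classification_metrics(tp, tn, fp, fn):
    acc = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return acc, sensitivity, specificity, mcc

# Hypothetical counts for one cross-validation fold
print(classification_metrics(tp=46, tn=32, fp=3, fn=3))
```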
Table 10. The t-test results of the ACC of GLLCSA-KELM-FS and the other methods.

| | vs. GLLCSA-KELM | vs. CSA-KELM | vs. KELM | vs. RF | vs. FKNN |
|---|---|---|---|---|---|
| One- or two-tailed p-value | Two-tailed | Two-tailed | Two-tailed | Two-tailed | Two-tailed |
| t | 1.803 | 2.584 | 2.665 | 1.968 | 2.552 |
| df | 18 | 18 | 18 | 18 | 18 |
| p-value | 0.0882 | 0.0187 | 0.0158 | 0.0647 | 0.0200 |
| Significantly different (p < 0.05) | No | Yes | Yes | No | Yes |
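With df = 18, each column is consistent with a two-sample t-test over two sets of ten fold-level accuracies (10 + 10 − 2 = 18). A sketch with placeholder accuracy vectors drawn to roughly match the reported means and standard deviations:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(3)
acc_a = rng.normal(0.932, 0.022, size=10)  # placeholder 10-fold ACC, GLLCSA-KELM-FS
acc_b = rng.normal(0.882, 0.044, size=10)  # placeholder 10-fold ACC, CSA-KELM

t, p = ttest_ind(acc_a, acc_b)             # df = 10 + 10 - 2 = 18
print(f"t = {t:.3f}, p = {p:.4f}, significant: {p < 0.05}")
```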