6.2.1. Machine Learning

Tang et al. [174] introduced an MTEC algorithm that trains multiple extreme learning machines with different numbers of hidden neurons for classification problems. The proposed method achieved better-quality solutions even when some hidden neurons and connections were removed. Feature selection is an important data-preprocessing technique for reducing dimensionality in data mining and machine learning. Zhang et al. [89] proposed an ensemble classification framework based on evolutionary feature-subspace generation, which formulated the search for the most suitable feature subspaces as an MTO problem and solved it with an MTEC optimizer. Recently, MFPSO was also applied to high-dimensional classification [173]: two related tasks were constructed, one over a promising feature subset and one over the entire feature set. The MTO paradigm also fits multi-class classification naturally, treating each binary classification subproblem as an optimization task solved within a budget of function evaluations. In that framework, several knowledge transfer strategies (segment-based transfer, DE-based transfer, and feature transfer) enable interaction among the populations of the separate binary tasks [172].
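The two-task feature-selection idea can be illustrated with a minimal hill-climbing sketch. Everything here is an illustrative assumption rather than the operators of [173]: the fitness function is a toy stand-in for classifier accuracy, and `RELEVANT`, `PROMISING`, and the transfer rate are hypothetical. The "feature transfer" step simply seeds a child with the sibling task's best mask, projected onto the receiving task's search space.

```python
import random

random.seed(0)

N_FEATURES = 20
RELEVANT = set(range(5))       # hypothetical ground-truth informative features
PROMISING = list(range(10))    # hypothetical pre-screened "promising" subset

def fitness(mask):
    # Toy surrogate for classifier accuracy: reward covering the relevant
    # features while penalising subset size.
    chosen = {i for i, bit in enumerate(mask) if bit}
    return len(chosen & RELEVANT) - 0.1 * len(chosen)

def random_mask(indices):
    mask = [0] * N_FEATURES
    for i in indices:
        mask[i] = random.randint(0, 1)
    return mask

def mutate(mask, indices):
    child = mask[:]
    child[random.choice(indices)] ^= 1   # flip one bit within the task's space
    return child

def evolve(generations=200, transfer_rate=0.3):
    # Task "full" searches all features; task "promising" only the subset.
    tasks = {"full": list(range(N_FEATURES)), "promising": PROMISING}
    best = {name: random_mask(idx) for name, idx in tasks.items()}
    for _ in range(generations):
        for name, idx in tasks.items():
            if random.random() < transfer_rate:
                # Feature transfer: seed the child with the sibling task's
                # best mask, projected onto this task's search space.
                other = best["promising" if name == "full" else "full"]
                seed = [bit if i in idx else 0 for i, bit in enumerate(other)]
                child = mutate(seed, idx)
            else:
                child = mutate(best[name], idx)
            if fitness(child) >= fitness(best[name]):
                best[name] = child
    return best

best = evolve()
```

The transfer is useful here because good masks found on the small promising subspace are immediately valid (after projection) in the full space, giving the harder task a head start.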

Training a deep neural network (DNN) with a sophisticated architecture and a massive number of parameters amounts to solving a highly complex non-convex optimization task. Zhang et al. [170] proposed a novel DNN training framework that formulates multiple related training tasks via a sampling method and solves them simultaneously with an MTEC algorithm. During training, intermediate knowledge is identified and shared across all tasks to aid their learning. Recently, Martinez et al. [171] also presented an MTEC framework to simultaneously optimize multiple deep Q-learning (DQL) models.
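The sampling-plus-transfer scheme can be sketched, under heavy simplification, with a tiny evolutionary strategy: two "related training tasks" are built by subsampling one dataset, each task evolves its own population of model parameters, and transfer occasionally injects the sibling task's champion. The two-parameter linear model, population sizes, and transfer probability are all illustrative assumptions, not the setup of [170].

```python
import random

random.seed(1)

# Toy data: y = 2x + 1 plus noise. Each task trains on a different
# subsample, a stand-in for sampling-based task construction.
data = [(i / 10, 2 * (i / 10) + 1 + random.gauss(0, 0.1)) for i in range(50)]
task_data = [data[0::2], data[1::2]]     # two overlapping related tasks

def loss(params, samples):
    # Mean squared error of the linear model y = w*x + b.
    w, b = params
    return sum((w * x + b - y) ** 2 for x, y in samples) / len(samples)

def perturb(params, sigma=0.1):
    # Gaussian mutation of the parameter vector.
    return tuple(p + random.gauss(0, sigma) for p in params)

pops = [[(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(10)]
        for _ in task_data]

for gen in range(300):
    for t, samples in enumerate(task_data):
        pops[t].sort(key=lambda p: loss(p, samples))
    for t, samples in enumerate(task_data):
        elite = pops[t][:5]
        if random.random() < 0.2:
            # Knowledge transfer: inject the sibling task's best individual.
            elite = [pops[1 - t][0]] + elite[:4]
        pops[t] = elite + [perturb(random.choice(elite)) for _ in range(5)]

best_params = min(pops[0], key=lambda p: loss(p, task_data[0]))
```

Because both tasks sample the same underlying function, a champion migrated from one task is usually near-optimal for the other, which is the intuition behind sharing intermediate knowledge across related training tasks.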

By jointly identifying the overlaps between communities and active modules, Chen et al. [73] revealed complex and dynamic mechanisms of high-level biological phenomena that cannot be uncovered by identifying them separately. This MTO problem contains two tasks: identification of active modules and division of the network into structural communities.

Optimization of a fuzzy system tunes its parameters and/or structure. Zhang et al. [72] presented a general framework, the multi-task genetic fuzzy system (MTGFS), to solve this problem effectively. To better search across multiple optimization tasks, an efficient assortative mating method (a chromosome-based shuffling strategy and a shuffling-based cross-task bias estimation) was designed to exploit the special structure of the membership functions.
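A generic illustration of why shuffling helps cross-task mating: if each chromosome encodes a set of triangular membership functions, naively crossing two parents from different tasks may mix fuzzy sets that describe different regions of the input space. Sorting each chromosome by membership-function centre before crossover aligns corresponding fuzzy sets. This is a hypothetical stand-in for, not a reproduction of, the shuffling strategy of [72].

```python
import random

random.seed(2)

def shuffle_align(chrom):
    # Sort triangular membership functions (left, centre, right) by their
    # centre, so corresponding fuzzy sets line up across parents.
    return sorted(chrom, key=lambda mf: mf[1])

def cross_task_mate(parent_a, parent_b):
    a, b = shuffle_align(parent_a), shuffle_align(parent_b)
    # Uniform crossover over the aligned membership functions.
    return [random.choice(pair) for pair in zip(a, b)]

# Two parents from different tasks; each gene is one triangular MF.
p1 = [(0.0, 0.2, 0.4), (0.5, 0.7, 0.9), (0.2, 0.5, 0.8)]
p2 = [(0.1, 0.6, 0.9), (0.0, 0.1, 0.3), (0.3, 0.4, 0.6)]
child = cross_task_mate(p1, p2)
```

After alignment, the i-th gene of the child always combines the i-th smallest fuzzy set of each parent, so the child's membership functions remain ordered across the input domain.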

Shen et al. [169] proposed a novel multi-objective MTEC algorithm for learning multiple large-scale fuzzy cognitive maps (FCMs) simultaneously. Each task is treated as a bi-objective problem involving both the difference between the real and learned time series and the sparsity of the overall structure.
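The bi-objective evaluation that such an optimizer would minimise per task can be sketched as follows. The sigmoid transfer function and squared-error fit are common FCM-learning choices, and counting non-zero links as the sparsity objective is an illustrative simplification, not necessarily the exact formulation in [169].

```python
import math

def step(state, W):
    # One FCM update: a_i(t+1) = sigmoid(sum_j W[j][i] * a_j(t)).
    n = len(state)
    return [1 / (1 + math.exp(-sum(W[j][i] * state[j] for j in range(n))))
            for i in range(n)]

def objectives(W, series):
    # Objective 1: data error between observed and simulated activations.
    err = 0.0
    for t in range(len(series) - 1):
        pred = step(series[t], W)
        err += sum((p - o) ** 2 for p, o in zip(pred, series[t + 1]))
    # Objective 2: density of the map (number of non-zero causal links);
    # minimising it drives the learned structure toward sparsity.
    nnz = sum(1 for row in W for w in row if abs(w) > 1e-9)
    return err, nnz

# Toy observed time series over two concepts, and two candidate maps.
series = [[0.2, 0.8], [0.6, 0.5], [0.55, 0.52]]
dense = [[1.0, 0.5], [0.5, 1.0]]
sparse = [[1.0, 0.0], [0.0, 1.0]]
```

A multi-objective optimizer keeps a Pareto front over `(err, nnz)` per FCM, trading reconstruction accuracy against structural sparsity rather than collapsing the two objectives into one scalar.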
