

Unconventional Methods for Particle Swarm Optimization

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: closed (31 January 2020) | Viewed by 17713

Special Issue Editor


Prof. Dr. Leonardo Vanneschi
Guest Editor
NOVA Information Management School (NOVA IMS), Universidade Nova de Lisboa, Campus de Campolide, 1070-312 Lisboa, Portugal
Interests: machine learning; genetic programming; particle swarm optimization

Special Issue Information

Dear Colleagues,

Particle swarm optimization (PSO) is a population-based optimization metaheuristic inspired by the collective dynamics of groups of animals, such as insects, birds, and fish. Recent research has highlighted the potential of the approach and its ample room for improvement. By “unconventional methods for PSO” we mean modifications of the standard PSO that aim to improve its performance or endow it with particular properties: for instance, new methods for choosing the inertia weight, constriction factor, and cognitive and social coefficients; parallelizing PSO in different ways; defining hybrid algorithms in which PSO is integrated with other metaheuristic optimization methods; entropy-based PSO; and so on. The study of unconventional methods for PSO is a lively and active research field, and the objective of this Special Issue is to collect contributions in this recent and exciting area, with a particular focus on entropic, information-theoretic, or probability-theoretic techniques.
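
For readers unfamiliar with the baseline that these unconventional variants modify, a minimal global-best PSO with inertia weight w and cognitive/social coefficients c1 and c2 can be sketched as follows. This is only an illustration, not a method from this Special Issue; the coefficient values are common textbook defaults, assumed here for concreteness.

```python
import numpy as np

def pso(fitness, dim, n_particles=30, iters=200,
        w=0.729, c1=1.49445, c2=1.49445, bounds=(-5.0, 5.0)):
    """Minimal global-best PSO (minimization) with inertia weight w,
    cognitive coefficient c1 and social coefficient c2."""
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))                   # velocities
    pbest = x.copy()                                   # personal bests
    pbest_f = np.apply_along_axis(fitness, 1, x)
    gbest = pbest[np.argmin(pbest_f)].copy()           # global best

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.apply_along_axis(fitness, 1, x)
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()

# Example: minimize the sphere function in 10 dimensions.
best_x, best_f = pso(lambda z: float(np.sum(z ** 2)), dim=10)
```

Unconventional variants typically intervene exactly in this loop, for example by making w, c1, and c2 adaptive, entropy-driven, or heterogeneous across particles, by parallelizing the evaluation step, or by hybridizing the update with other metaheuristics.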

Prof. Dr. Leonardo Vanneschi
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Entropy-based PSO
  • Information Theory for PSO
  • Probability Theory for PSO
  • Theoretically motivated hybrid PSO systems
  • Theoretically motivated parallelizations of PSO
  • Theoretically motivated niching
  • New acceleration strategies
  • Automatic static and/or dynamic parameter setting
  • Improvements and/or specializations of particle movements
  • PSO for the optimization/improvement of machine learning methods
  • Real-life applications using theoretically motivated unconventional PSO systems

Published Papers (5 papers)


Research

15 pages, 511 KiB  
Article
Particle Swarm Contour Search Algorithm
by Dominik Weikert, Sebastian Mai and Sanaz Mostaghim
Entropy 2020, 22(4), 407; https://doi.org/10.3390/e22040407 - 02 Apr 2020
Cited by 6 | Viewed by 2560
Abstract
In this article, we present a new algorithm called Particle Swarm Contour Search (PSCS)—a Particle Swarm Optimisation inspired algorithm to find object contours in 2D environments. Currently, most contour-finding algorithms are based on image processing and require a complete overview of the search space in which the contour is to be found. However, for real-world applications this would require complete knowledge of the search space, which may not always be feasible. The proposed algorithm removes this requirement and relies only on the local information of the particles to accurately identify a contour. Particles search for the contour of an object and then traverse along it using their knowledge of positions inside and outside the object. Our experiments show that the proposed PSCS algorithm delivers results comparable to the state of the art.
(This article belongs to the Special Issue Unconventional Methods for Particle Swarm Optimization)
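
As a rough illustration of the inside/outside idea described in the abstract (and not the authors' PSCS algorithm), the sketch below assumes an implicit indicator function inside(p) for the object; each particle walks until its inside/outside status flips and then bisects between its last two samples to localize a point on the contour.

```python
import numpy as np

def contour_points(inside, starts, steps=200, step_size=0.05, seed=1):
    """Toy boundary localization: each particle walks randomly until its
    inside/outside status flips, then bisects between the last two samples
    to pin down a contour point. An illustration only, not PSCS itself."""
    rng = np.random.default_rng(seed)
    points = []
    for p in np.asarray(starts, dtype=float):
        prev, prev_in = p.copy(), inside(p)
        for _ in range(steps):
            p = p + step_size * rng.standard_normal(2)   # random local move
            if inside(p) != prev_in:                     # the boundary was crossed
                a, b = prev, p
                for _ in range(30):                      # bisection refinement
                    m = (a + b) / 2
                    a, b = (m, b) if inside(m) == prev_in else (a, m)
                points.append((a + b) / 2)
                break
            prev, prev_in = p.copy(), inside(p)
    return np.array(points)

# Example: localize points on the contour of the unit circle.
starts = np.random.default_rng(0).uniform(-1.5, 1.5, size=(20, 2))
pts = contour_points(lambda q: float(np.hypot(q[0], q[1])) <= 1.0, starts)
```

The published algorithm additionally has particles traverse along the contour and share information swarm-style; the sketch only shows how local in/out observations suffice to find boundary points without a global view of the search space.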

17 pages, 9676 KiB  
Article
Surfing on Fitness Landscapes: A Boost on Optimization by Fourier Surrogate Modeling
by Luca Manzoni, Daniele M. Papetti, Paolo Cazzaniga, Simone Spolaor, Giancarlo Mauri, Daniela Besozzi and Marco S. Nobile
Entropy 2020, 22(3), 285; https://doi.org/10.3390/e22030285 - 29 Feb 2020
Cited by 15 | Viewed by 3995
Abstract
Surfing in rough waters is not always as fun as riding the “big one”. Similarly, in optimization problems, fitness landscapes with a huge number of local optima make the search for the global optimum a hard and generally annoying game. Computational Intelligence optimization metaheuristics use a set of individuals that “surf” across the fitness landscape, sharing and exploiting pieces of information about local fitness values in a joint effort to find the global optimum. In this context, we designed surF, a novel surrogate modeling technique that leverages the discrete Fourier transform to generate a smoother, and possibly easier to explore, fitness landscape. The rationale behind this idea is that filtering out the high frequencies of the fitness function and keeping only its partial information (i.e., the low frequencies) can actually be beneficial to the optimization process. We prove our theory by combining surF with a settings-free variant of Particle Swarm Optimization (PSO) based on Fuzzy Logic, called Fuzzy Self-Tuning PSO. Specifically, we introduce a new algorithm, named F3ST-PSO, which performs a preliminary exploration on the surrogate model followed by a second optimization using the actual fitness function. We show that F3ST-PSO can lead to improved performance, notably while using the same budget of fitness evaluations.
(This article belongs to the Special Issue Unconventional Methods for Particle Swarm Optimization)
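
The low-pass filtering idea in the abstract can be illustrated with a small one-dimensional sketch (a toy version, not the authors' surF implementation): sample the fitness on a grid, zero the high-frequency Fourier coefficients, and use the inverse transform as a smoothed surrogate. The grid size n_samples and the cutoff keep are assumed parameters for illustration.

```python
import numpy as np

def fourier_surrogate(fitness, lo, hi, n_samples=256, keep=8):
    """Toy 1D low-pass surrogate: keep only the `keep` lowest-frequency
    Fourier coefficients of the sampled fitness and reconstruct a smoothed
    landscape on the same grid."""
    xs = np.linspace(lo, hi, n_samples, endpoint=False)
    f = np.array([fitness(x) for x in xs])
    spectrum = np.fft.rfft(f)
    spectrum[keep:] = 0.0                      # drop high-frequency content
    smooth = np.fft.irfft(spectrum, n=n_samples)
    return xs, smooth

# Example: a rugged function whose smoothed surrogate exposes the global trend.
rugged = lambda x: (x - 1.0) ** 2 + 0.3 * np.sin(40 * x)
xs, smooth = fourier_surrogate(rugged, lo=-3.0, hi=3.0)
print("surrogate minimum near x =", xs[int(np.argmin(smooth))])
```

A preliminary swarm run on the smoothed surrogate, followed by refinement on the true fitness function, is the spirit of the two-stage approach described in the abstract.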

21 pages, 544 KiB  
Article
A Self-Adaptive Discrete PSO Algorithm with Heterogeneous Parameter Values for Dynamic TSP
by Łukasz Strąk, Rafał Skinderowicz, Urszula Boryczka and Arkadiusz Nowakowski
Entropy 2019, 21(8), 738; https://doi.org/10.3390/e21080738 - 27 Jul 2019
Cited by 18 | Viewed by 3855
Abstract
This paper presents a discrete particle swarm optimization (DPSO) algorithm with heterogeneous (non-uniform) parameter values for solving the dynamic traveling salesman problem (DTSP). The DTSP can be modeled as a sequence of static sub-problems, each of which is an instance of the TSP. In the proposed DPSO algorithm, the information gathered while solving a sub-problem is retained in the form of a pheromone matrix and used by the algorithm while solving the next sub-problem. We present a method for automatically setting the values of the key DPSO parameters (except for the parameters directly related to the computation time and size of a problem). We show that the diversity of parameter values has a positive effect on the quality of the generated results. Furthermore, the population in the proposed algorithm has a higher level of entropy. We compare the performance of the proposed heterogeneous DPSO with two ant colony optimization (ACO) algorithms. The proposed algorithm outperforms the base DPSO and is competitive with the ACO algorithms.
(This article belongs to the Special Issue Unconventional Methods for Particle Swarm Optimization)
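
The abstract's idea of carrying information across DTSP sub-problems through a pheromone matrix can be sketched roughly as follows. This is a simplified illustration, not the authors' DPSO: tours here are built greedily from pheromone and distance, and the evaporation rate rho, deposit amount, and bias exponent beta are assumed parameters.

```python
import numpy as np

def build_tour(dist, pher, beta=2.0, rng=None):
    """Greedy tour construction biased by pheromone and inverse distance."""
    rng = rng or np.random.default_rng(0)
    n = len(dist)
    tour, unvisited = [0], set(range(1, n))
    while unvisited:
        i = tour[-1]
        cand = np.array(sorted(unvisited))
        weights = pher[i, cand] * (1.0 / dist[i, cand]) ** beta
        tour.append(int(rng.choice(cand, p=weights / weights.sum())))
        unvisited.remove(tour[-1])
    return tour

def solve_dtsp(dist_sequence, rho=0.1, deposit=1.0):
    """Solve a sequence of TSP instances (one per DTSP sub-problem),
    retaining the pheromone matrix between consecutive sub-problems
    instead of restarting from scratch."""
    n = len(dist_sequence[0])
    pher = np.ones((n, n))                       # memory shared across sub-problems
    best_tours = []
    for dist in dist_sequence:                   # each entry: one distance matrix
        tour = build_tour(dist, pher)
        pher *= (1.0 - rho)                      # evaporation
        for a, b in zip(tour, tour[1:] + tour[:1]):
            pher[a, b] += deposit                # reinforce edges of the found tour
        best_tours.append(tour)
    return best_tours
```

In the paper, a discrete PSO population plays the role of the construction step shown here; the point of the sketch is only how the pheromone matrix survives the transition between sub-problems.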

15 pages, 897 KiB  
Article
User-Oriented Summaries Using a PSO Based Scoring Optimization Method
by Augusto Villa-Monte, Laura Lanzarini, Aurelio F. Bariviera and José A. Olivas
Entropy 2019, 21(6), 617; https://doi.org/10.3390/e21060617 - 22 Jun 2019
Cited by 7 | Viewed by 2871
Abstract
Automatic text summarization tools have a great impact on many fields, such as medicine, law, and scientific research in general. As information overload increases, automatic summaries allow handling the growing volume of documents, usually by assigning weights to the extracted phrases based on their significance in the expected summary. Obtaining the main contents of any given document in less time than it would take to do so manually remains an issue of interest. In this article, a new method is presented that automatically generates extractive summaries from documents by adequately weighting sentence-scoring features using Particle Swarm Optimization. The key feature of the proposed method is the identification of those features that are closest to the criterion used by the individual when summarizing. The proposed method combines a binary representation and a continuous one, using an original variation of the technique developed by the authors of this paper. Our paper shows that using user-labeled information in the training set helps to find better metrics and weights. The empirical results yield an improved accuracy compared to previous methods used in this field.
(This article belongs to the Special Issue Unconventional Methods for Particle Swarm Optimization)
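
A rough sketch of the general idea — learning feature weights for sentence scoring with a continuous PSO, fitted against user-selected sentences — might look as follows. This is an illustration only; the feature set, the overlap-based fitness, and the swarm parameters are all assumed here, not taken from the paper.

```python
import numpy as np

def summarize(feature_matrix, weights, k=3):
    """Score sentences as a weighted sum of their features; return top-k indices."""
    scores = feature_matrix @ weights
    return set(np.argsort(scores)[::-1][:k])

def fit_weights(feature_matrix, user_selected, k=3, n_particles=20, iters=100):
    """Toy continuous PSO over feature weights; fitness is the overlap between
    the top-k sentences under the learned weights and the user's selection."""
    rng = np.random.default_rng(0)
    dim = feature_matrix.shape[1]
    x = rng.uniform(-1, 1, (n_particles, dim))
    v = np.zeros_like(x)
    fit = lambda w: len(summarize(feature_matrix, w, k) & user_selected)
    pbest, pbest_f = x.copy(), np.array([fit(w) for w in x])
    gbest = pbest[np.argmax(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.729 * v + 1.49 * r1 * (pbest - x) + 1.49 * r2 * (gbest - x)
        x = x + v
        f = np.array([fit(w) for w in x])
        better = f > pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmax(pbest_f)].copy()
    return gbest

# Example: 8 sentences described by 4 features; the user marked sentences 1, 4, and 6.
features = np.random.default_rng(1).random((8, 4))
w = fit_weights(features, user_selected={1, 4, 6})
print(summarize(features, w))
```

The published method additionally mixes binary and continuous representations; the sketch only shows how user-labeled sentences can drive the fitness that PSO optimizes.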

23 pages, 1570 KiB  
Article
Competitive Particle Swarm Optimization for Multi-Category Text Feature Selection
by Jaesung Lee, Jaegyun Park, Hae-Cheon Kim and Dae-Won Kim
Entropy 2019, 21(6), 602; https://doi.org/10.3390/e21060602 - 18 Jun 2019
Cited by 7 | Viewed by 3803
Abstract
Multi-label feature selection is an important task for text categorization, because it enables learning algorithms to focus on essential features that foreshadow relevant categories, thereby improving the accuracy of text categorization. Recent studies have considered the hybridization of evolutionary feature wrappers and filters to enhance the evolutionary search process. However, the relative effectiveness of the feature subset searches performed by the evolutionary and feature filter operators has not been considered, which results in degenerate final feature subsets. In this paper, we propose a novel hybridization approach based on competition between the operators. This enables the proposed algorithm to apply each operator selectively and modify the feature subset according to its relative effectiveness, unlike conventional methods. The experimental results on 16 text datasets verify that the proposed method is superior to conventional methods.
(This article belongs to the Special Issue Unconventional Methods for Particle Swarm Optimization)
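
The competition idea — letting a swarm-style operator and a filter-ranking operator "bid" for each update based on how much each has recently improved the feature subset — can be sketched schematically as below. This is a toy binary selection loop under assumed fitness and filter-scoring functions, not the authors' algorithm.

```python
import numpy as np

def competitive_selection(fitness, filter_scores, n_features, iters=100, seed=0):
    """Toy competition between two operators on a binary feature mask:
    a random bit-flip 'swarm' move vs. a filter move that switches on the
    highest-ranked unselected feature. The operator with the larger recent
    improvement credit is applied at each step."""
    rng = np.random.default_rng(seed)
    mask = rng.random(n_features) < 0.5
    best_f = fitness(mask)
    credit = {"swarm": 1.0, "filter": 1.0}       # running improvement estimates

    for _ in range(iters):
        op = "swarm" if credit["swarm"] >= credit["filter"] else "filter"
        cand = mask.copy()
        if op == "swarm":
            cand[rng.integers(n_features)] ^= True               # flip one random bit
        else:
            off = np.where(~cand)[0]
            if len(off):
                cand[off[np.argmax(filter_scores[off])]] = True  # add best-ranked feature
        gain = fitness(cand) - best_f
        credit[op] = 0.9 * credit[op] + 0.1 * max(gain, 0.0)     # update operator credit
        if gain > 0:
            mask, best_f = cand, best_f + gain
    return mask
```

In the paper, the "swarm" operator is a multi-label PSO update and the credit mechanism is more elaborate; the sketch only conveys the selective, effectiveness-driven application of competing operators.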
