Discrete Optimization Theory, Algorithms, and Applications

A special issue of Algorithms (ISSN 1999-4893).

Deadline for manuscript submissions: closed (1 February 2022) | Viewed by 14,363

Special Issue Editor


Prof. Dr. Ruo-Wei Hung
Guest Editor
Department of Computer Science and Information Engineering, Chaoyang University of Technology, Wufeng 413310, Taichung, Taiwan
Interests: graph algorithms; Linux systems; smartphone applications; IoT security

Special Issue Information

Dear colleagues,

Discrete optimization formulates an optimization problem so that some or all of its variables are required to belong to a discrete set. This Special Issue will publish research papers on the mathematical, computational, and applied aspects of all areas of discrete and combinatorial optimization.
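
To make the definition concrete, the short sketch below enumerates a 0/1 knapsack instance, a textbook discrete optimization problem in which every decision variable is confined to the set {0, 1}. It is an editorial illustration (exhaustive search, exponential time), not an algorithm taken from any paper in this issue.

    # 0/1 knapsack by brute force: every variable x_i lies in the discrete set {0, 1}.
    from itertools import product

    def knapsack_bruteforce(values, weights, capacity):
        """Return (best_value, best_choice) over all 0/1 assignments."""
        n = len(values)
        best_value, best_choice = 0, (0,) * n
        for choice in product((0, 1), repeat=n):  # all 2^n discrete assignments
            weight = sum(w * x for w, x in zip(weights, choice))
            if weight <= capacity:
                value = sum(v * x for v, x in zip(values, choice))
                if value > best_value:
                    best_value, best_choice = value, choice
        return best_value, best_choice

    print(knapsack_bruteforce([60, 100, 120], [10, 20, 30], 50))  # -> (220, (0, 1, 1))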

We invite you to submit high-quality papers to this Special Issue on “Discrete Optimization Theory, Algorithms, and Applications”, with subjects covering the whole range from theory to applications. Topics of interest include, but are not limited to, the following:

  • Graph algorithms and theory
  • Computation theory
  • Discrete applied mathematics
  • Approximation algorithms
  • Interconnection networks
  • Time complexity analysis
  • Theoretical computer science
  • IoT-related analysis and security
  • Big data model and analysis

Prof. Dr. Ruo-Wei Hung
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Algorithms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Graph algorithms and theory
  • Computation theory
  • Discrete applied mathematics
  • Approximation algorithms
  • Interconnection networks
  • Time complexity analysis
  • Theoretical computer science
  • IoT-related analysis and security
  • Big data model and analysis

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (3 papers)


Research

23 pages, 1219 KiB  
Article
Finding Hamiltonian and Longest (s,t)-Paths of C-Shaped Supergrid Graphs in Linear Time
by Fatemeh Keshavarz-Kohjerdi and Ruo-Wei Hung
Algorithms 2022, 15(2), 61; https://doi.org/10.3390/a15020061 - 13 Feb 2022
Cited by 8 | Viewed by 2164
Abstract
A graph is called Hamiltonian connected if it contains a Hamiltonian path between any two distinct vertices. In the past, we proved the Hamiltonian path and cycle problems for general supergrid graphs to be NP-complete; however, they remain open for solid supergrid graphs. In this paper, we first verify the Hamiltonian cycle property of C-shaped supergrid graphs, which are a special case of solid supergrid graphs. Next, we show that C-shaped supergrid graphs are Hamiltonian connected except under a few conditions. For the conditions excluded from Hamiltonian connectivity, we compute the longest paths. We then design a linear-time algorithm to solve the longest path problem in these graphs. The Hamiltonian connectivity of C-shaped supergrid graphs can be applied to compute the optimal stitching trace of computer embroidery machines and to construct the minimum printing trace of 3D printers when a C-like component is being printed.
(This article belongs to the Special Issue Discrete Optimization Theory, Algorithms, and Applications)
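
For readers new to the problem, the brute-force sketch below searches for a Hamiltonian (s,t)-path by backtracking. It assumes king-move adjacency for supergrid vertices (two integer points are adjacent when their coordinates differ by at most 1 in each dimension); it illustrates only the problem statement, runs in exponential time, and is not the linear-time algorithm of the paper.

    # Backtracking search for a Hamiltonian (s,t)-path in a point set with
    # (assumed) supergrid adjacency: coordinates differ by at most 1 in each axis.
    def hamiltonian_st_path(vertices, s, t):
        def adjacent(u, v):
            return u != v and abs(u[0] - v[0]) <= 1 and abs(u[1] - v[1]) <= 1

        def extend(path, remaining):
            if not remaining:                           # all vertices used:
                return path if path[-1] == t else None  # valid only if we end at t
            for v in list(remaining):
                if adjacent(path[-1], v):
                    result = extend(path + [v], remaining - {v})
                    if result:
                        return result
            return None

        return extend([s], set(vertices) - {s})

    # A 2x3 block of grid points has a Hamiltonian path from (0,0) to (2,1).
    block = [(x, y) for x in range(3) for y in range(2)]
    print(hamiltonian_st_path(block, (0, 0), (2, 1)))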

26 pages, 482 KiB  
Article
k-Center Clustering with Outliers in Sliding Windows
by Paolo Pellizzoni, Andrea Pietracaprina and Geppino Pucci
Algorithms 2022, 15(2), 52; https://doi.org/10.3390/a15020052 - 31 Jan 2022
Cited by 5 | Viewed by 3411
Abstract
Metric k-center clustering is a fundamental unsupervised learning primitive. Although widely used, this primitive is heavily affected by noise in the data, so a more sensible variant seeks the best solution that disregards a given number z of points of the dataset, which are called outliers. We provide efficient algorithms for this important variant in the streaming model under the sliding window setting, where, at each time step, the dataset to be clustered is the window W of the most recent data items. For general metric spaces, our algorithms achieve O(1) approximation and, remarkably, require a working memory linear in k+z and only logarithmic in |W|. For spaces of bounded doubling dimension, the approximation can be made arbitrarily close to 3. For these latter spaces, we show, as a by-product, how to estimate the effective diameter of the window W, which is a measure of the spread of the window points that disregards a given fraction of noisy distances. We also provide experimental evidence of the practical viability of the improved clustering and diameter estimation algorithms.
(This article belongs to the Special Issue Discrete Optimization Theory, Algorithms, and Applications)
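
As background, the classical farthest-first traversal of Gonzalez gives a 2-approximation for offline metric k-center without outliers; the sketch below is shown only to fix ideas and does not implement the paper's sliding-window, outlier-aware algorithms.

    # Gonzalez farthest-first traversal: a 2-approximation for offline k-center.
    import math

    def kcenter_gonzalez(points, k):
        centers = [points[0]]
        # distance from each point to its nearest chosen center, kept up to date
        d = [math.dist(p, centers[0]) for p in points]
        for _ in range(1, k):
            i = max(range(len(points)), key=lambda j: d[j])  # farthest point
            centers.append(points[i])
            d = [min(d[j], math.dist(points[j], points[i])) for j in range(len(points))]
        return centers, max(d)  # chosen centers and resulting clustering radius

    pts = [(0, 0), (0.1, 0.2), (5, 5), (5.1, 4.9), (10, 0)]
    print(kcenter_gonzalez(pts, 3))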

17 pages, 583 KiB  
Article
Using Decision Trees and Random Forest Algorithms to Predict and Determine Factors Contributing to First-Year University Students’ Learning Performance
by Thao-Trang Huynh-Cam, Long-Sheng Chen and Huynh Le
Algorithms 2021, 14(11), 318; https://doi.org/10.3390/a14110318 - 30 Oct 2021
Cited by 29 | Viewed by 7651
Abstract
First-year students’ learning performance has received much attention in educational practice and theory. Previous works built prediction models from variables that can only be obtained during the course or over the progress of the semester through questionnaire surveys and interviews. These models cannot provide sufficiently timely support for students whose poor performance is caused by economic factors. Therefore, other variables are needed that allow prediction results to be reached earlier. This study uses family background variables, which can be obtained before the semester starts, to build learning performance prediction models for freshmen using random forest (RF), C5.0, CART, and multilayer perceptron (MLP) algorithms. A real sample of 2407 freshmen enrolled in 12 departments of a Taiwan vocational university was employed. The experimental results showed that CART outperforms C5.0, RF, and MLP. The most important features were mother’s occupation, department, father’s occupation, main source of living expenses, and admission status. The extracted knowledge rules are expected to serve as indicators for students’ early performance prediction, so that strategic intervention can be planned before students begin the semester.
(This article belongs to the Special Issue Discrete Optimization Theory, Algorithms, and Applications)
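
As a rough illustration of the CART step, the sketch below trains scikit-learn's DecisionTreeClassifier (an implementation of CART) on a toy, integer-encoded table; the feature names echo the paper's family-background variables, but the records and encodings are hypothetical stand-ins, not the study's data.

    # Toy CART model over hypothetical, integer-encoded family-background features.
    from sklearn.tree import DecisionTreeClassifier, export_text

    features = ["mother_occ", "department", "father_occ", "expense_source", "admission"]
    X = [[0, 2, 1, 0, 1],
         [1, 0, 0, 1, 0],
         [2, 1, 2, 0, 1],
         [0, 2, 1, 2, 0],
         [1, 1, 0, 1, 1],
         [2, 0, 2, 2, 0]]
    y = [1, 0, 1, 0, 1, 0]  # 1 = at-risk learning performance (toy labels)

    clf = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=0)
    clf.fit(X, y)
    print(export_text(clf, feature_names=features))  # human-readable rules
    print(clf.predict([[0, 2, 1, 0, 1]]))            # classify a new freshman record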
