Surveys in Algorithm Analysis and Complexity Theory

A special issue of Algorithms (ISSN 1999-4893). This special issue belongs to the section "Analysis of Algorithms and Complexity Theory".

Deadline for manuscript submissions: closed (31 October 2022) | Viewed by 14089

Special Issue Editor


Dr. Jesper Jansson
Guest Editor
Department of Communications and Computer Engineering, Graduate School of Informatics, Kyoto University, Yoshida-Honmachi, Sakyo-ku, Kyoto 606-8501, Japan
Interests: graph algorithms; bioinformatics; computational complexity; data structures

Special Issue Information

Dear Colleagues,

This is a Special Issue of Algorithms consisting of surveys in theoretical computer science. We invite original articles summarizing recent breakthroughs and/or describing the state of the art in any currently active research area related to algorithms, data structures, or computational complexity. Articles should be well-structured, and each article should focus on a clearly defined topic. In addition, sufficient background, definitions, and figures should be provided to ensure that the text will be accessible to anyone interested in theoretical computer science. Implementation-based surveys that compare the practical performance of various algorithms for a particular computational problem are also invited.

We hope that the surveys published in this Special Issue will be useful for other researchers and become highly cited in the near future.

Dr. Jesper Jansson
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website and proceeding to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers are published continuously in the journal (as soon as accepted) and are listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Algorithms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • algorithm analysis
  • modern data structures
  • computational complexity
  • fixed-parameter tractability
  • approximation algorithms
  • lower bounds
  • bioinformatics algorithms
  • computational geometry
  • parallel and distributed computing
  • quantum computing


Published Papers (5 papers)


Editorial

Jump to: Research

2 pages, 170 KiB  
Editorial
Editorial: Surveys in Algorithm Analysis and Complexity Theory (Special Issue)
by Jesper Jansson
Algorithms 2023, 16(4), 188; https://doi.org/10.3390/a16040188 - 30 Mar 2023
Viewed by 979
Abstract
This is a Special Issue of the open-access journal Algorithms consisting of surveys in theoretical computer science [...]
(This article belongs to the Special Issue Surveys in Algorithm Analysis and Complexity Theory)

Research

Jump to: Editorial

28 pages, 489 KiB  
Article
Expansion Lemma—Variations and Applications to Polynomial-Time Preprocessing
by Ashwin Jacob, Diptapriyo Majumdar and Venkatesh Raman
Algorithms 2023, 16(3), 144; https://doi.org/10.3390/a16030144 - 06 Mar 2023
Cited by 1 | Viewed by 1538
Abstract
In parameterized complexity, it is well known that a parameterized problem is fixed-parameter tractable if and only if it has a kernel: an instance equivalent to the input instance whose size is a function of the parameter alone. The size of the kernel can be exponential or worse, motivating the search for fixed-parameter tractable problems with polynomial-sized kernels. Developments in the machinery for proving lower bounds on kernel sizes have led researchers to ask what the asymptotically optimal kernel sizes are for fixed-parameter tractable problems. In this article, we survey a tool called the expansion lemma that helps reduce the size of the kernel. It originated in the form of crown decomposition, used to obtain a linear kernel for the Vertex Cover problem; the lemma itself was identified as the tool behind the optimal O(k²) kernel for the undirected Feedback Vertex Set problem. Since then, several variations and extensions of the tool have been discovered, and we survey them along with their applications.
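The crown-decomposition example mentioned in the abstract can be illustrated with an even simpler classical preprocessing rule for Vertex Cover (Buss's rule). The sketch below is not the expansion lemma itself, only a minimal example of what a kernelization looks like; the function name and bookkeeping are illustrative assumptions:

```python
# Sketch of Buss's kernelization for Vertex Cover (illustrative only; the
# survey's expansion lemma yields stronger kernels for other problems).
# Rules: a vertex of degree greater than the remaining budget must be in
# any small cover, and isolated vertices can be dropped.  If more than
# (remaining budget)^2 edges survive, no small cover exists.

def buss_kernel(edges, k):
    """Return (kernel_edges, remaining_k, forced_vertices),
    or None if no vertex cover of size <= k exists."""
    edges = {frozenset(e) for e in edges if len(set(e)) == 2}
    forced = set()
    changed = True
    while changed:
        changed = False
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        for v, d in deg.items():
            if d > k - len(forced):
                forced.add(v)                       # v must be in the cover
                edges = {e for e in edges if v not in e}
                changed = True
                break
    budget = k - len(forced)
    if len(forced) > k or len(edges) > budget ** 2:
        return None          # too many edges remain: reject the instance
    return edges, budget, forced
```

For example, a star with five leaves and k = 2 forces the center into the cover and leaves an empty kernel, while a triangle with k = 0 is correctly rejected.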

21 pages, 498 KiB  
Article
Key Concepts, Weakness and Benchmark on Hash Table Data Structures
by Santiago Tapia-Fernández, Daniel García-García and Pablo García-Hernandez
Algorithms 2022, 15(3), 100; https://doi.org/10.3390/a15030100 - 21 Mar 2022
Cited by 2 | Viewed by 3164
Abstract
Most computer programs or applications need fast data structures. The performance of a data structure is necessarily influenced by the complexity of its common operations; thus, any data structure that exhibits a theoretical complexity of amortized constant time in several of its main operations should draw a lot of attention. Such is the case of the family of data structures known as hash tables. However, what is the real efficiency of these hash tables? That is an interesting question with no simple answer, and there are several issues to consider. Of course, there is not a unique hash table; in fact, there are several sub-groups of hash tables, and, moreover, not all programming languages use the same variety of hash table in their default implementation, nor do they share the same interface. Nevertheless, all hash tables have a common issue: they must resolve hash collisions. This is a potential weakness, and it also induces a classification of hash tables according to their collision-resolution strategy. In this paper, some key concepts about hash tables are exposed, and definitions of those concepts are reviewed and clarified, especially in order to study the characteristics of the main strategies for implementing hash tables and how they deal with hash collisions. Benchmark cases are then designed and presented to assess the performance of hash tables. The cases are designed to be randomized, self-testing, and representative of real use cases, and to expose and analyze the impact of different factors on performance across different hash tables and programming languages. All cases have been programmed in C++, Java, and Python and analyzed in terms of interfaces and efficiency (time and memory). The benchmark yields important results about the performance of these structures and its (lack of) relationship with complexity analysis.
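As a minimal illustration of one of the collision-resolution strategies the paper classifies (separate chaining), here is a toy Python sketch; it is not the paper's benchmark code, and the class name and load-factor threshold are illustrative assumptions:

```python
class ChainedHashTable:
    """Toy hash table using separate chaining: each bucket is a list of
    (key, value) pairs, so colliding keys coexist in the same bucket."""

    def __init__(self, capacity=8):
        self.buckets = [[] for _ in range(capacity)]
        self.size = 0

    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)           # update existing key
                return
        bucket.append((key, value))
        self.size += 1
        if self.size > 0.75 * len(self.buckets):   # keep the load factor low
            self._resize()

    def get(self, key, default=None):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return default

    def _resize(self):
        """Double the capacity and redistribute all stored pairs."""
        old = [pair for bucket in self.buckets for pair in bucket]
        self.buckets = [[] for _ in range(2 * len(self.buckets))]
        for k, v in old:
            self._bucket(k).append((k, v))
```

Chaining keeps insertion simple at the cost of pointer-chasing through bucket lists; open-addressing schemes instead probe alternative slots within the same array, which is one of the trade-offs the paper's benchmarks examine.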

29 pages, 448 KiB  
Article
Searching Monotone Arrays: A Survey
by Márcia R. Cappelle, Les R. Foulds and Humberto J. Longo
Algorithms 2022, 15(1), 10; https://doi.org/10.3390/a15010010 - 26 Dec 2021
Cited by 1 | Viewed by 2595
Abstract
Given a monotone ordered multi-dimensional real array A and a real value k, an important question in computation is to establish whether k is a member of A by sequentially searching A, comparing k with some of its entries. This search problem and its known results are surveyed, including the case when the dimensions of A are not necessarily of equal size. Worst-case search algorithms for various types of arrays of finite dimension and sizes are reported; each has order strictly less than the product of the sizes of the array. Current challenges and open problems in the area are also presented.
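For the two-dimensional case, the classical "saddleback" search illustrates the sub-product bound from the abstract: an m × n matrix with sorted rows and columns can be searched in O(m + n) comparisons, strictly less than the product m·n. A sketch (the function name is mine):

```python
def saddleback_search(A, k):
    """Search a matrix whose rows and columns are sorted in nondecreasing
    order.  Starting from the top-right corner, each comparison discards
    a whole row or a whole column, so at most m + n - 1 comparisons are
    made.  Returns the (row, col) position of k, or None if absent."""
    if not A or not A[0]:
        return None
    i, j = 0, len(A[0]) - 1          # start at the top-right corner
    while i < len(A) and j >= 0:
        if A[i][j] == k:
            return (i, j)
        if A[i][j] > k:
            j -= 1                   # k cannot be in this column
        else:
            i += 1                   # k cannot be in this row
    return None
```

Each step moves one row down or one column left, so the walk visits at most m + n − 1 cells; the survey covers generalizations of this idea to higher dimensions and unequal sizes.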

10 pages, 265 KiB  
Article
A Meeting Point of Probability, Graphs, and Algorithms: The Lovász Local Lemma and Related Results—A Survey
by András Faragó
Algorithms 2021, 14(12), 355; https://doi.org/10.3390/a14120355 - 08 Dec 2021
Cited by 2 | Viewed by 2922
Abstract
A classic and fundamental result, known as the Lovász Local Lemma, is a gem in the probabilistic method of combinatorics. At a high level, its core message can be described by the claim that weakly dependent events behave similarly to independent ones. A fascinating feature of this result is that even though it is a purely probabilistic statement, it provides a valuable and versatile tool for proving completely deterministic theorems. The Lovász Local Lemma has found many applications; despite being originally published in 1975, it still attracts active novel research. In this survey paper, we review various forms of the Lemma, as well as some related results and applications.
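The "active novel research" the abstract mentions includes the constructive side of the Lemma: the Moser–Tardos resampling algorithm, which repeatedly re-randomizes the variables of some violated bad event. A toy sketch for SAT follows (the function name and round cap are illustrative assumptions; in the Lemma's regime, where each clause shares variables with few others, the expected number of resamplings is linear):

```python
import random

def moser_tardos_sat(clauses, n_vars, seed=0, max_rounds=100000):
    """Moser-Tardos resampling for SAT: while some clause is unsatisfied,
    re-randomize just that clause's variables.  Clauses are lists of
    nonzero ints; literal v means variable |v| is True, -v means False.
    Returns a satisfying assignment (list of bools, 0-indexed by
    variable - 1), or None if the round cap is exhausted."""
    rng = random.Random(seed)
    assign = [rng.random() < 0.5 for _ in range(n_vars + 1)]  # 1-indexed

    def violated(clause):
        return all(assign[abs(l)] != (l > 0) for l in clause)

    for _ in range(max_rounds):
        bad = next((c for c in clauses if violated(c)), None)
        if bad is None:
            return assign[1:]            # every clause is satisfied
        for l in bad:                    # resample only this clause's vars
            assign[abs(l)] = rng.random() < 0.5
    return None
```

The striking point, which the survey discusses, is that this purely local repair provably terminates fast whenever the Lemma's dependency condition holds, turning a non-constructive existence proof into an efficient randomized algorithm.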