Complexity, Entropy and the Physics of Information II

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Complexity".

Deadline for manuscript submissions: 28 February 2025

Special Issue Editors


Prof. Dr. Yi-Cheng Zhang
Guest Editor
Department of Physics, University of Fribourg, CH-1700 Fribourg, Switzerland
Interests: statistical physics; information networks; information economy; complex networks

Dr. Shimin Cai
Guest Editor
School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu 610054, China
Interests: complexity science; time series analysis; complex networks; data mining; artificial intelligence

Special Issue Information

Dear Colleagues,

An information system is an evolving complex system studied by a large interdisciplinary community of computer scientists, mathematicians, economists, management scientists, and especially physicists. The concepts of complexity, entropy, and the physics of information are interconnected and play crucial roles in understanding information systems across diverse fields, including social science, biology, engineering, and management. Complexity generally refers to the intricate, interconnected, or detailed nature of a system. In fields such as science, mathematics, and philosophy, it is a multidimensional concept involving factors such as the number of components, their interactions, and the difficulty of understanding or predicting system behavior; it relates to the physics of information in that information theory provides tools to quantify and analyze the complexity of systems. Entropy measures the amount of disorder or randomness in a physical system and, in information theory, the uncertainty or unpredictability associated with a set of data; it is often used to quantify the amount of information, or surprise, associated with the outcomes of a random variable. The physics of information thus merges principles from physics and information theory to understand how information is represented, transmitted, and processed in diverse information systems.
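
For concreteness, the uncertainty invoked here is the standard Shannon entropy: for a discrete random variable X with outcome probabilities p(x),

H(X) = -\sum_x p(x) \log_2 p(x),

which is zero for a deterministic outcome and maximal (log_2 n for n equally likely outcomes) for a uniform distribution.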

Meanwhile, with the rapid development of artificial intelligence (AI) techniques, complexity and entropy have come to influence the design, performance, and understanding of AI systems. Controlling model complexity is crucial for developing models that generalize well to unseen data, while entropy, as a measure of uncertainty, helps quantify the reliability of AI predictions. Balancing the two is an ongoing challenge, and a variety of techniques are employed to strike an optimal balance for effective and reliable AI systems.
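
As an illustration of the latter point, a common practice is to score a classifier's confidence by the entropy of its predicted class distribution. The following is a minimal sketch, assuming only NumPy, not a method drawn from this Special Issue:

```python
import numpy as np

def predictive_entropy(probs) -> float:
    """Shannon entropy (in bits) of a predicted class distribution.

    High entropy means the model is uncertain; zero entropy means
    it puts all its probability mass on a single class.
    """
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]  # zero-probability classes contribute nothing; avoids log(0)
    return float(-(p * np.log2(p)).sum())

# A confident prediction vs. a maximally uncertain one over three classes:
print(predictive_entropy([0.98, 0.01, 0.01]))  # ~0.16 bits
print(predictive_entropy([1/3, 1/3, 1/3]))     # ~1.58 bits (maximal)
```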

This Special Issue focuses on recent advances in the theories and methods in complexity science and statistical physics and their applications in understanding and analyzing information and AI systems in various scientific disciplines encompassing computer science, physics, biomedicine, management, economics, and more.

Prof. Dr. Yi-Cheng Zhang
Dr. Shimin Cai
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information systems
  • neural networks 
  • information theory
  • complexity theory
  • statistical physics
  • complex networks
  • complexity and entropy in AI
  • data-driven modelling
  • machine learning and deep learning
  • time-series analysis

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

12 pages, 274 KiB  
Article
Building Test Batteries Based on Analyzing Random Number Generator Tests within the Framework of Algorithmic Information Theory
by Boris Ryabko
Entropy 2024, 26(6), 513; https://doi.org/10.3390/e26060513 - 14 Jun 2024
Abstract
The problem of testing random number generators is considered and a new method for comparing the power of different statistical tests is proposed. It is based on the definitions of random sequences developed in the framework of algorithmic information theory and allows comparing the power of different tests in some cases when the available methods of mathematical statistics do not distinguish between tests. In particular, it is shown that tests based on data compression methods using dictionaries should be included in test batteries.
(This article belongs to the Special Issue Complexity, Entropy and the Physics of Information II)
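
To make the idea of a compression-based randomness test concrete, here is a minimal sketch. It is not the paper's construction; zlib's dictionary-based DEFLATE merely stands in for the dictionary compressors discussed. A sequence that a compressor shrinks well below its original length has exploitable regularity and should be rejected as random:

```python
import os
import zlib

def looks_nonrandom(data: bytes, margin: float = 0.95) -> bool:
    """Crude compression-based randomness check.

    Random bytes are incompressible on average, so if a dictionary-based
    compressor (zlib/DEFLATE here) shrinks the input well below its
    original size, the sequence has exploitable structure and the
    randomness hypothesis is suspect.
    """
    return len(zlib.compress(data, level=9)) < margin * len(data)

# A periodic sequence compresses well; OS-provided randomness should not.
print(looks_nonrandom(b"01" * 5000))       # True: strong regularity
print(looks_nonrandom(os.urandom(10000)))  # False (with high probability)
```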
17 pages, 531 KiB  
Article
HEM: An Improved Parametric Link Prediction Algorithm Based on Hybrid Network Evolution Mechanism
by Dejing Ke and Jiansu Pu
Entropy 2023, 25(10), 1416; https://doi.org/10.3390/e25101416 - 5 Oct 2023
Abstract
Link prediction plays an important role in the research of complex networks. Its task is to predict missing links or possible new links in the future via existing information in the network. In recent years, many powerful link prediction algorithms have emerged, which have good results in prediction accuracy and interpretability. However, the existing research still cannot clearly point out the relationship between the characteristics of the network and the mechanism of link generation, and the predictability of complex networks with different features remains to be further analyzed. In view of this, this article proposes the corresponding link prediction indexes Reg, DFPA and LW on a regular network, scale-free network and small-world network, respectively, and studies their prediction properties on these three network models. At the same time, we propose a parametric hybrid index HEM and compare the prediction accuracies of HEM and many similarity-based indexes on real-world networks. The experimental results show that HEM performs better than the other similarity-based indexes. In addition, we study the factors that play a major role in the prediction of HEM and analyze their relationship with the characteristics of real-world networks. The results show that the predictive properties of factors are closely related to the features of networks.
(This article belongs to the Special Issue Complexity, Entropy and the Physics of Information II)
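
For readers unfamiliar with similarity-based link prediction indexes, the following minimal sketch illustrates the basic idea of scoring unconnected node pairs by structural similarity. It shows a generic common-neighbors score, not the HEM, Reg, DFPA, or LW indexes proposed in the paper:

```python
from itertools import combinations

def common_neighbors_scores(adj: dict[str, set[str]]) -> dict[tuple[str, str], int]:
    """Score every non-adjacent node pair by its number of common
    neighbors, the simplest similarity-based link prediction index.
    Higher scores suggest a missing or likely future link.
    """
    scores = {}
    for u, v in combinations(adj, 2):
        if v not in adj[u]:  # only score pairs without an existing link
            scores[(u, v)] = len(adj[u] & adj[v])
    return scores

# Toy undirected network: a triangle a-b-c plus a pendant node d on c.
adj = {
    "a": {"b", "c"},
    "b": {"a", "c"},
    "c": {"a", "b", "d"},
    "d": {"c"},
}
print(common_neighbors_scores(adj))  # {('a', 'd'): 1, ('b', 'd'): 1}
```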
