Semantic Information Theory

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 30 April 2025 | Viewed by 983

Special Issue Editors


Prof. Dr. Meixia Tao
Guest Editor
Department of Electronic Engineering, Shanghai Jiao Tong University, Shanghai 200240, China
Interests: edge learning; semantic communication; integrated communication-computing-sensing; MIMO beamforming

Prof. Dr. Kai Niu
Guest Editor
The Key Laboratory of Universal Wireless Communications, Beijing University of Posts and Telecommunications, Beijing, China
Interests: information theory and channel coding; signal processing based on machine learning

Dr. Youlong Wu
Guest Editor
School of Information Science and Technology, ShanghaiTech University, Shanghai, China
Interests: coded caching; distributed computing; federated learning; joint source–channel coding; communication reliability

Special Issue Information

Dear Colleagues,

Information theory laid the foundation for modern communication systems by quantifying information and establishing fundamental limits for reliable data transmission over noisy channels. However, it primarily focuses on the syntactic aspects of data transmission without considering semantic meaning. Semantic communication, in contrast, operates at the semantic and effectiveness levels and aims to extract and convey the information needed for the receiver to accomplish a goal with the desired effectiveness. This paradigm shift is envisioned to enhance efficiency and timeliness in a variety of emerging applications, such as autonomous driving, remote robotics, the metaverse, drone inspection, and more. Owing to its exceptional advantages in communication efficiency and its compatibility with emerging AI applications, semantic communication has garnered significant attention.

Despite the notable progress made in AI-empowered semantic communications, a unified theoretical foundation and operational analytical tools for semantic communications remain elusive. The relationship between traditional information theory and semantic communication has not yet been established; specifically, the extent to which information theory can address the challenges of semantic communication remains to be explored. This Special Issue aims to bridge this gap by fostering the convergence of information theory and semantic communications. Prospective authors are invited to submit original manuscripts on topics including, but not limited to, the following:

  • Novel theoretical frameworks for semantic communications.
  • Source and channel coding with semantic aspects.
  • Rate-distortion-perception trade-off.
  • Application of information theory for analysis and optimization of semantic communication systems.
  • End-to-end design of semantic communications.
  • Resource management for semantic communications.
  • Privacy and security issues in semantic communications.
  • Analysis and design of multi-user semantic communications.

Prof. Dr. Meixia Tao
Prof. Dr. Kai Niu
Dr. Youlong Wu
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • semantic metric
  • information bottleneck
  • rate-distortion-perception trade-off
  • remote source coding
  • joint source–channel coding
  • task-oriented communications
  • AI-based communications
  • end-to-end communications
  • multi-modality communications

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (3 papers)


Research

30 pages, 1467 KiB  
Article
Rate–Distortion–Perception Trade-Off in Information Theory, Generative Models, and Intelligent Communications
by Xueyan Niu, Bo Bai, Nian Guo, Weixi Zhang and Wei Han
Entropy 2025, 27(4), 373; https://doi.org/10.3390/e27040373 - 31 Mar 2025
Viewed by 46
Abstract
Traditional rate–distortion (RD) theory examines the trade-off between the average length of the compressed representation of a source and the additive distortions of its reconstruction. The rate–distortion–perception (RDP) framework, which integrates the perceptual dimension into the RD paradigm, has garnered significant attention due to recent advancements in machine learning, where perceptual fidelity is assessed by the divergence between input and reconstruction distributions. In communication systems where downstream tasks involve generative modeling, high perceptual fidelity is essential, despite distortion constraints. However, while zero distortion implies perfect realism, the converse is not true, highlighting an imbalance in the significance of distortion and perceptual constraints. This article clarifies that incorporating perceptual constraints does not decrease the necessary rate; instead, under certain conditions, additional rate is required, even with the aid of common and private randomness, which are key elements in generative models. Consequently, we project an increase in expected traffic in intelligent communication networks with the consideration of perceptual quality. Nevertheless, a modest increase in rate can enable generative models to significantly enhance the perceptual quality of reconstructions. By exploring the synergies between generative modeling and communication through the lens of information-theoretic results, this article demonstrates the benefits of intelligent communication systems and advocates for the application of the RDP framework in advancing compression and semantic communication research. Full article
(This article belongs to the Special Issue Semantic Information Theory)
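The distortion–realism imbalance described in the abstract can be made concrete with a toy one-bit Gaussian example (an illustrative sketch of the general phenomenon, not code from the paper): at the same rate, an MMSE decoder minimizes distortion but outputs a two-point distribution, while a decoder that uses private randomness reproduces the source distribution exactly at the cost of roughly doubled distortion.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)  # unit-Gaussian source

# One-bit compression: transmit only sign(x).
sign = np.sign(x)

# (a) MMSE decoder: conditional mean E[X | sign(X)] = ±sqrt(2/pi).
x_mmse = sign * np.sqrt(2 / np.pi)

# (b) Perception-oriented decoder: use private randomness to sample from the
# conditional law of X given its sign (a half-normal), so the output is
# again distributed as N(0, 1) -- perfect realism.
x_gen = sign * np.abs(rng.standard_normal(x.size))

mse = lambda xr: np.mean((x - xr) ** 2)
print(f"MMSE decoder:       distortion {mse(x_mmse):.3f}, output variance {x_mmse.var():.3f}")
print(f"Generative decoder: distortion {mse(x_gen):.3f},  output variance {x_gen.var():.3f}")
```

The MMSE decoder achieves distortion 1 − 2/π ≈ 0.36 but its output variance is only 2/π; the realism-preserving decoder pays distortion 2 − 4/π ≈ 0.73, about double, while matching the source distribution exactly.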

23 pages, 1113 KiB  
Article
Feature-Driven Semantic Communication for Efficient Image Transmission
by Ji Zhang, Ying Zhang, Baofeng Ji, Anmin Chen, Aoxue Liu and Hengzhou Xu
Entropy 2025, 27(4), 369; https://doi.org/10.3390/e27040369 - 31 Mar 2025
Viewed by 56
Abstract
Semantic communication is an emerging approach that enhances transmission efficiency by conveying the semantic content of information more effectively. It has garnered significant attention in recent years. However, existing semantic communication systems for image transmission typically adopt direct transmission of features or uniformly compress features before transmission. They have not yet considered the differential impact of features on image recovery at the receiver end and the issue of bandwidth limitations during actual transmission. This paper shows that non-uniform processing of features leads to better image recovery under bandwidth constraints compared to uniform processing. Based on this, we propose a semantic communication system for image transmission, which introduces non-uniform quantization techniques. In the feature transmission stage, the system performs varying levels of quantization based on the differences in feature performance at the receiver, thereby reducing the bandwidth requirement. Inspired by quantitative quantization techniques, we design a non-uniform quantization algorithm capable of dynamic bit allocation. This algorithm, under bandwidth constraints, dynamically adjusts the quantization precision of features based on their contribution to the completion of tasks at the receiver end, ensuring the quality and accuracy of the transmitted data even under limited bandwidth conditions. Experimental results show that the proposed system reduces bandwidth usage while ensuring image reconstruction quality. Full article
(This article belongs to the Special Issue Semantic Information Theory)
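The dynamic bit-allocation idea in this abstract can be sketched in a few lines: under a total bit budget, greedily grant bits to the features whose quantization-error reduction matters most for the downstream task. The importance scores, budget, and the error-halving-per-bit model below are our own illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def allocate_bits(importance, total_bits, max_bits=8):
    """Greedily assign one bit at a time to the feature where it is expected
    to help most -- an importance-weighted water-filling heuristic."""
    bits = np.zeros(len(importance), dtype=int)
    for _ in range(total_bits):
        # Marginal-benefit proxy: each extra bit cuts a feature's
        # quantization error variance by 4x, weighted by importance.
        gain = importance * 2.0 ** (-2.0 * bits)
        gain[bits >= max_bits] = -np.inf
        bits[np.argmax(gain)] += 1
    return bits

def quantize(f, n_bits):
    """Uniform scalar quantization of feature vector f to n_bits."""
    if n_bits == 0:
        return np.zeros_like(f)
    lo, hi = f.min(), f.max()
    step = (hi - lo) / max(2 ** n_bits - 1, 1)
    return lo + np.round((f - lo) / step) * step

rng = np.random.default_rng(1)
features = [rng.standard_normal(64) * s for s in (4.0, 1.0, 0.1)]
importance = np.array([0.7, 0.25, 0.05])  # e.g. task-sensitivity scores
bits = allocate_bits(importance, total_bits=12)
print("bits per feature:", bits)          # non-uniform split of the budget
recon = [quantize(f, b) for f, b in zip(features, bits)]
```

With these scores the 12-bit budget splits unevenly across the three features, giving the most task-relevant feature the finest quantization.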

10 pages, 273 KiB  
Article
Broadcast Channel Cooperative Gain: An Operational Interpretation of Partial Information Decomposition
by Chao Tian and Shlomo Shamai (Shitz)
Entropy 2025, 27(3), 310; https://doi.org/10.3390/e27030310 - 15 Mar 2025
Viewed by 245
Abstract
Partial information decomposition has recently found applications in biological signal processing and machine learning. Despite its impacts, the decomposition was introduced through an informal and heuristic route, and its exact operational meaning is unclear. In this work, we fill this gap by connecting partial information decomposition to the capacity of the broadcast channel, which has been well studied in the information theory literature. We show that the synergistic information in the decomposition can be rigorously interpreted as the cooperative gain, or a lower bound of this gain, on the corresponding broadcast channel. This interpretation can help practitioners to better explain and expand the applications of the partial information decomposition technique. Full article
(This article belongs to the Special Issue Semantic Information Theory)
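The synergistic information this paper interprets operationally can be seen in the textbook XOR example (our illustration, not taken from the article): neither input alone reveals anything about Y = X1 ⊕ X2, yet together they determine it, so the entire bit of I(X1, X2; Y) is synergy.

```python
import numpy as np
from itertools import product

def mutual_info(joint):
    """I(A;B) in bits from a joint pmf array joint[a, b]."""
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / (pa * pb)[mask])))

# Y = X1 XOR X2, with X1 and X2 independent uniform bits.
p = np.zeros((2, 2, 2))  # p[x1, x2, y]
for x1, x2 in product((0, 1), repeat=2):
    p[x1, x2, x1 ^ x2] = 0.25

i_x1_y = mutual_info(p.sum(axis=1))     # I(X1; Y) = 0
i_x2_y = mutual_info(p.sum(axis=0))     # I(X2; Y) = 0
i_joint = mutual_info(p.reshape(4, 2))  # I(X1, X2; Y) = 1 bit
print(i_x1_y, i_x2_y, i_joint)
```

Since both single-input informations vanish while the joint information is one bit, any partial information decomposition must attribute that bit entirely to synergy.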
