Distributed Signal Processing for Coding and Information Theory

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (15 June 2021) | Viewed by 11338

Special Issue Editor


Dr. Derya Malak
Guest Editor
Department of Electrical, Computer, and Systems Engineering, Rensselaer Polytechnic Institute, Troy, NY 12180, USA
Interests: communications; networks; information theory; point processes

Special Issue Information

Dear Colleagues,

With the ever-growing demand for connectivity in wireless networks, limitations such as topological constraints and storage capabilities challenge the processing of large volumes of data. The lack of a unified theory for deciding how to distribute various computation tasks in networks further hinders our understanding of the fundamental limits of processing. Motivated by these challenges, the goal of this Special Issue is to bring together low-complexity distributed signal processing algorithms and techniques from information theory in order to enable efficient computation of complex tasks in networks by parallelizing the processing. Key directions include devising distributed coding or compression techniques that exploit the tradeoffs between the communication and computation complexities of general tasks, understanding the computation rate region for general networks, and exploiting the flexibility of the topology under different rate-distortion requirements.

The topics of the Special Issue include (but are not restricted to) distributed quantization, compressive sensing, group testing, principal inertia components and their possible use cases in extracting representations in the digital domain, and distributed computation in networks. We believe that the contributions from distinguished researchers in related fields will advance the understanding of the joint role of information theory and distributed processing in networks.

Dr. Derya Malak
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Function compression
  • Computation
  • Distributed source coding
  • Networks
  • Rate distortion

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (4 papers)


Research

16 pages, 484 KiB  
Article
Source Symbol Purging-Based Distributed Conditional Arithmetic Coding
by Jingjian Li, Wei Wang, Hong Mo, Mengting Zhao and Jianhua Chen
Entropy 2021, 23(8), 983; https://doi.org/10.3390/e23080983 - 30 Jul 2021
Cited by 2 | Viewed by 1815
Abstract
A distributed arithmetic coding algorithm based on source symbol purging and using the context model is proposed to solve the asymmetric Slepian–Wolf problem. The proposed scheme makes better use of both the correlation between adjacent symbols in the source sequence and the correlation between the corresponding symbols of the source and the side information sequences to improve the coding performance of the source. Since the encoder purges a part of the symbols from the source sequence, a shorter codeword length can be obtained. The purged symbols are still used as the context of the subsequent symbols to be encoded. An improved calculation method for the posterior probability is also proposed based on the purging feature, so that the decoder can utilize the correlation within the source sequence to improve the decoding performance. In addition, the scheme achieves better error performance at the decoder by adding a forbidden symbol in the encoding process. Simulation results show that the encoding complexity and the minimum code rate required for lossless decoding are lower than those of traditional distributed arithmetic coding. When the internal correlation of the source is strong, the proposed scheme exhibits better decoding performance than other distributed source coding (DSC) schemes at the same code rate.
(This article belongs to the Special Issue Distributed Signal Processing for Coding and Information Theory)
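To illustrate the forbidden-symbol idea described above, the following sketch (a simplified illustration under our own assumptions, not the authors' purging-based coder) runs an interval-based arithmetic encoder for a binary source in which a probability mass EPS is reserved and never encoded into; a decoder that lands in that region knows a transmission error has occurred, at a cost of roughly -log2(1 - EPS) extra bits per symbol. The symbol probability P1 and the value of EPS are arbitrary choices for the example.

    # Sketch: arithmetic encoding with a forbidden symbol for error detection.
    # A toy single-source illustration, not the paper's distributed scheme.
    import math

    EPS = 0.05   # probability mass reserved for the forbidden symbol (assumed)
    P1  = 0.3    # probability of symbol "1" (assumed)

    def encode_interval(bits, p1=P1, eps=EPS):
        """Return the final coding interval (low, high) for the bit string."""
        low, high = 0.0, 1.0
        for b in bits:
            width = (high - low) * (1.0 - eps)   # shrink by the forbidden mass
            if b == 0:
                high = low + width * (1.0 - p1)
            else:
                low, high = low + width * (1.0 - p1), low + width
        return low, high

    bits = [0, 1, 0, 0, 1, 0, 0, 0]
    low, high = encode_interval(bits)
    code_len = math.ceil(-math.log2(high - low))
    ideal = sum(-math.log2(1 - P1) if b == 0 else -math.log2(P1) for b in bits)
    print(f"code length with forbidden symbol: {code_len} bits")
    print(f"ideal length without it: {ideal:.2f} bits "
          f"(overhead ~ {-len(bits) * math.log2(1 - EPS):.2f} bits)")

The overhead term shows the tradeoff: a larger EPS lets the decoder detect errors sooner but lengthens every codeword.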

21 pages, 9710 KiB  
Article
Parallel Mixed Image Encryption and Extraction Algorithm Based on Compressed Sensing
by Jiayin Yu, Chao Li, Xiaomeng Song, Shiyu Guo and Erfu Wang
Entropy 2021, 23(3), 278; https://doi.org/10.3390/e23030278 - 25 Feb 2021
Cited by 17 | Viewed by 2260
Abstract
In practical image processing, we often encounter mixed images that contain multiple valid messages. Such images not only need to be transmitted securely, but also need to be separated effectively at the receiving end. This paper designs a secure and efficient encryption and separation algorithm for this kind of mixed image. Since chaotic systems are sensitive to initial conditions and pseudo-random, a chaotic matrix is introduced into the compressed sensing framework. By using a sequence signal to adjust the chaotic system, the key space can be greatly expanded. The algorithm partitions the data into blocks and transmits them in parallel; this realizes efficient computation of complex tasks in the image encryption system and improves the data processing speed. In the decryption part, the algorithm not only restores the images, but also separates them effectively through the improved restoration algorithm.
(This article belongs to the Special Issue Distributed Signal Processing for Coding and Information Theory)
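A minimal sketch of the general mechanism (the map, block size, and compression ratio below are our own assumptions, not the paper's exact scheme): a logistic-map chaotic sequence seeded by a secret key generates the compressed-sensing measurement matrix, and each image block is measured independently, so blocks can be encrypted in parallel and cannot be recovered without the key.

    # Sketch: key-driven chaotic measurement matrix for per-block compressed
    # sensing.  The logistic map, block size, and 2:1 ratio are assumptions.
    import numpy as np

    def logistic_sequence(x0, n, mu=3.99):
        """Chaotic logistic map x_{k+1} = mu * x_k * (1 - x_k)."""
        seq = np.empty(n)
        x = x0
        for k in range(n):
            x = mu * x * (1.0 - x)
            seq[k] = x
        return seq

    def measurement_matrix(key, m, n):
        """Build an m x n matrix from the chaotic sequence (key = initial state)."""
        phi = logistic_sequence(key, m * n).reshape(m, n)
        return (phi - 0.5) / np.sqrt(m)      # zero-mean, roughly normalized

    def encrypt_block(block, key, m):
        """Compress-and-encrypt one flattened image block."""
        phi = measurement_matrix(key, m, block.size)
        return phi @ block.astype(float)

    rng = np.random.default_rng(0)
    block = rng.integers(0, 256, size=64)                 # toy 8x8 block, flattened
    cipher = encrypt_block(block, key=0.123456, m=32)     # 2:1 compression
    print(cipher[:4])

The receiver would regenerate the same matrix from the shared key and run a standard compressed-sensing recovery algorithm to restore each block.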

20 pages, 671 KiB  
Article
Improving the Accuracy of the Fast Inverse Square Root by Modifying Newton–Raphson Corrections
by Cezary J. Walczyk, Leonid V. Moroz and Jan L. Cieśliński
Entropy 2021, 23(1), 86; https://doi.org/10.3390/e23010086 - 9 Jan 2021
Cited by 11 | Viewed by 3858
Abstract
Direct computation of functions using low-complexity algorithms can be applied both under hardware constraints and in systems where storage capacity is a challenge for processing a large volume of data. We present improved algorithms for fast calculation of the inverse square root function for single-precision and double-precision floating-point numbers. Higher precision is also discussed. Our approach consists of minimizing the maximal error by finding optimal magic constants and modifying the Newton–Raphson coefficients. The obtained algorithms are much more accurate than the original fast inverse square root algorithm and have similarly low computational cost.
(This article belongs to the Special Issue Distributed Signal Processing for Coding and Information Theory)
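For context, the baseline that this paper improves on is the well-known fast inverse square root: a bit-level initial guess obtained with the magic constant 0x5F3759DF, followed by one Newton–Raphson correction with coefficients 1.5 and 0.5. The sketch below reproduces that classic baseline in Python; the paper's optimized constants and modified coefficients are not reproduced here.

    # Classic fast inverse square root (single precision), shown as the baseline.
    import struct

    def fast_inv_sqrt(x: float) -> float:
        # Reinterpret the float's bits as a 32-bit integer.
        i = struct.unpack('<I', struct.pack('<f', x))[0]
        i = 0x5F3759DF - (i >> 1)            # magic-constant initial guess
        y = struct.unpack('<f', struct.pack('<I', i))[0]
        return y * (1.5 - 0.5 * x * y * y)   # one Newton-Raphson correction

    for x in (0.25, 2.0, 10.0):
        approx, exact = fast_inv_sqrt(x), x ** -0.5
        print(f"x={x}: approx={approx:.6f} exact={exact:.6f} "
              f"rel.err={(approx - exact) / exact:.2e}")

The paper's contribution is to retune both the magic constant and the 1.5/0.5 coefficients so that the maximal relative error after the same number of corrections is reduced.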

14 pages, 5041 KiB  
Article
Side Information Generation Scheme Based on Coefficient Matrix Improvement Model in Transform Domain Distributed Video Coding
by Wei Wang and Jianhua Chen
Entropy 2020, 22(12), 1427; https://doi.org/10.3390/e22121427 - 17 Dec 2020
Cited by 5 | Viewed by 2533
Abstract
In order to effectively improve the quality of side information in distributed video coding, we propose a side information generation scheme based on a coefficient matrix improvement model. The discrete cosine transform coefficient bands of the Wyner–Ziv frame at the encoder side are divided into entropy coding coefficient bands and distributed video coding coefficient bands; the coefficients of the entropy coding coefficient bands are then sampled, yielding sampled and unsampled coefficients. The sampled coefficients are compressed losslessly with an adaptive arithmetic encoder. For the unsampled coefficients and the coefficients of the distributed video coding coefficient bands, a low-density parity-check accumulate encoder is used to calculate the parity bits, which are stored in the buffer and transmitted in small amounts upon decoder request. At the decoder side, the optical flow method is used to generate the initial side information, which is then improved according to the sampled coefficients by using the coefficient matrix improvement model. The experimental results demonstrate that the proposed scheme effectively improves the quality of the generated side information by about 0.2–0.4 dB, thereby improving the overall performance of the distributed video coding system.
(This article belongs to the Special Issue Distributed Signal Processing for Coding and Information Theory)
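A minimal sketch of the decoder-side idea as we read it (the block size, transform normalization, and sampling pattern are our own assumptions, not the paper's exact coefficient matrix improvement model): the DCT of an optical-flow-generated side-information block is improved by overwriting the positions of the losslessly coded (sampled) coefficients with their exact values before inverse transforming.

    # Sketch: improve a side-information block by trusting losslessly
    # transmitted DCT coefficients.  Uses NumPy/SciPy; all sizes are assumed.
    import numpy as np
    from scipy.fft import dctn, idctn

    def improve_side_info(si_block, exact_coeffs, mask):
        """mask is True where the coefficient was transmitted losslessly."""
        c = dctn(si_block, norm='ortho')
        c[mask] = exact_coeffs[mask]         # replace with exact sampled values
        return idctn(c, norm='ortho')

    rng = np.random.default_rng(1)
    wz_block = rng.normal(128, 20, (8, 8))               # toy "true" WZ block
    si_block = wz_block + rng.normal(0, 5, (8, 8))       # noisy side information
    mask = np.zeros((8, 8), dtype=bool)
    mask[:2, :2] = True                                  # assume low bands sampled
    improved = improve_side_info(si_block, dctn(wz_block, norm='ortho'), mask)
    print("MSE before:", np.mean((si_block - wz_block) ** 2))
    print("MSE after: ", np.mean((improved - wz_block) ** 2))

Because the orthonormal DCT preserves energy, replacing the sampled coefficients with their exact values can only reduce the block's mean squared error.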
