Editorial

Information Theory Applications in Signal Processing

by Sergio Cruces 1,*, Rubén Martín-Clemente 1,* and Wojciech Samek 2,*

1 Departamento de Teoría de la Señal y Comunicaciones, Universidad de Sevilla, Camino de los Descubrimientos s/n, 41092 Seville, Spain
2 Fraunhofer Heinrich Hertz Institute HHI, 10587 Berlin, Germany
* Authors to whom correspondence should be addressed.
Entropy 2019, 21(7), 653; https://doi.org/10.3390/e21070653
Submission received: 18 June 2019 / Accepted: 1 July 2019 / Published: 3 July 2019
(This article belongs to the Special Issue Information Theory Applications in Signal Processing)
The birth of Information Theory, marked by the pioneering work of Claude Shannon and his celebrated paper "A Mathematical Theory of Communication" [1], was a milestone that fuelled the subsequent development of modern communications. Since its origins, this discipline has progressively expanded its influence on statistical signal processing, a closely related research field.
Information Theory has contributed to the statistical foundations and to the clarification of key concepts and underlying limits, not only in communications [2,3], but also in several other signal processing areas, such as time series analysis [4], estimation theory [5,6], detection theory [7], machine learning [8], statistical modeling [9], image and multimedia processing [10], speech and audio processing [11,12], neural networks [13] and deep learning systems [14,15]. All of these are dynamic and fast-growing research fields that must cope with increasingly complex scenarios and novel applications. Hence, as stated in the invitation to the Special Issue: "… there is a need for specific information theoretic criteria and algorithms that work in each of the considered situations and attain a set of desired goals, for instance, an enhancement in the interpretability of the solutions, improvements in performance, robustness with respect to the model uncertainties and possible data perturbations, a reliable convergence for the algorithms and any other kind of theoretical guarantees".
In this context, the main focus of this Special Issue is on recent developments in information theory applications for advanced methods in signal processing. The selected collection comprises seventeen original contributions that span a wide range of signal processing topics. We briefly summarize their scope and contents below.
Five of the articles [16,17,18,19,20] are devoted to the analysis of the latent components of the observations. In [16], the authors propose a nonlinear estimate of the partial correlation coefficients, motivated by its potential applications in graph signal processing. These estimates, which assume an underlying non-Gaussian density model based on a mixture of mutually independent components, better capture the connectivity of the graph in such scenarios.
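As background for the approach in [16], the classical linear partial correlation matrix of a set of jointly observed variables can be read off from the precision (inverse covariance) matrix. The sketch below is our own illustration of that linear baseline, not the nonlinear non-Gaussian estimator of [16]:

```python
import numpy as np

def partial_correlations(X):
    """Linear partial correlation matrix from data X (samples x variables).

    Entry [i, j] is the correlation between variables i and j after removing
    the linear effect of all remaining variables; it is obtained from the
    precision (inverse covariance) matrix P as -P[i, j] / sqrt(P[i, i] P[j, j]).
    """
    precision = np.linalg.inv(np.cov(X, rowvar=False))
    d = np.sqrt(np.diag(precision))
    pcorr = -precision / np.outer(d, d)
    np.fill_diagonal(pcorr, 1.0)
    return pcorr
```

In a graph signal processing context, thresholding the off-diagonal entries of this matrix gives a simple estimate of the graph connectivity.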
The works in [17,18] present unsupervised learning criteria based on the family of Alpha–Beta divergences and validate them through both synthetic and real experiments. These generalized dissimilarity measures are governed by two real parameters, α and β, which smoothly connect several classical divergences (see [21]). In [17], Sarmiento et al. derive closed-form expressions for the αβ-centroids and propose an original αβ-k-means algorithm for centroid-based clustering. Delmaire et al. propose in [18] several informed and weighted αβ-non-negative matrix factorization methods. Their contribution reveals that combining the factorization criterion with a set of structural constraints (derived from information available in the chemical source apportionment problem) enhances the robustness of the decomposition against contamination by noise and outliers.
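For the generic case in which α, β and α + β are all nonzero, the Alpha–Beta divergence of [21] between nonnegative arrays admits a closed form; the singular parameter choices are defined by continuity. A minimal sketch (for α = β = 1 it reduces to half the squared Euclidean distance):

```python
import numpy as np

def ab_divergence(p, q, alpha, beta):
    """Alpha-Beta divergence [21] between nonnegative arrays p and q.

    Valid in the generic case where alpha, beta and alpha + beta are all
    nonzero; the remaining cases are obtained as limits in [21].
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    s = alpha + beta
    terms = (p ** alpha * q ** beta
             - alpha / s * p ** s
             - beta / s * q ** s)
    return -np.sum(terms) / (alpha * beta)
```

Tuning (α, β) trades off robustness to outliers against sensitivity to small entries, which is what the clustering and factorization criteria of [17,18] exploit.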
In [19], Pinchas studies the problem of finding closed-form approximate expressions for the conditional expectation in the blind adaptive deconvolution problem. She proposes an estimate that improves on the one obtained via the maximum entropy density approximation technique and that remains reliable at low signal-to-noise ratios. In [20], Wu et al. combine compressed sensing with information-based techniques for monitoring the health of a machinery process and predicting its remaining useful life.
Within machine learning, ensemble learning techniques integrate multiple algorithms to obtain better or more robust predictive performance. In [22], Vigneron et al. analyze the optimal combination rule for an ensemble of weak rank classifiers. Their aim is to obtain an improved joint decision and to propose a suitable information measure for quantifying the classifiers' degree of consensus.
Data and image analysis are also important research areas in which information theory finds practical application on an everyday basis. In the context of image and video processing, Szczęsna [23] puts forward a new form of approximate entropy, specifically designed for quaternion time series, and applies it to the analysis of human gait kinematic data, where quaternions represent rotations. Zhou et al. [24] present a dictionary learning algorithm for medical image fusion, based on brightness and detail clustering and inspired by the characteristics of the human visual system. In [25], Ballesteros et al. propose variable-length codes, based on the Collatz conjecture, for transforming images into non-intelligible audio, thereby providing privacy to image transmissions through an encryption scheme. Finally, in the field of acoustic signal processing, Shen et al. [26] develop an auditory-inspired convolutional neural network that simulates the processing of the human auditory system to discriminate ships of different classes from raw hydrophone data.
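To fix ideas, the classical scalar approximate entropy ApEn(m, r) that [23] generalizes measures the (ir)regularity of a series by comparing how often length-m and length-(m+1) templates repeat within a tolerance r; the quaternion variant of [23] follows the same template, with the scalar distance replaced by a distance between quaternions. A minimal sketch of the classical version:

```python
import numpy as np

def approximate_entropy(x, m=2, r=0.2):
    """Classical approximate entropy ApEn(m, r) of a 1-D series x.

    Regular (predictable) series give values near 0; irregular series give
    larger values. Self-matches are included, as in the standard definition.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)

    def phi(m):
        # All overlapping length-m template vectors.
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]),
                      axis=2)
        # Fraction of templates within tolerance r of each template.
        counts = np.mean(dist <= r, axis=1)
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)
```

A common choice is r equal to about 0.2 times the standard deviation of the series; a constant series yields ApEn exactly 0, while noise yields a strictly positive value.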
The remaining articles of this Special Issue [27,28,29,30,31,32,33] address topics within the area of communications. On the one hand, the achievable rate region of a dual-hop multiple-access relay network under linear beamforming is characterized in [27]. On the other hand, the authors of [28] study the sum-rate of multi-user MIMO systems over correlated Rayleigh fading channels in the specific situation of multi-cell pilot contamination. Among the results of this contribution are a lower bound on the sum-rate, an approximation of its outage probability, and its asymptotic performance for a sufficiently large number of antennas at the base station.
Turning to the field of coding theory, Zhang et al. [29] develop several novel types of maximum-distance separable self-dual codes over finite fields of odd characteristic, based on a generalization of Reed–Solomon codes. In [30], Wang et al. analyze the Turbo Decoding Message Passing (TDMP) algorithm from the perspective of density evolution and Gaussian approximation, thus providing new theoretical foundations for this method. In addition, based on a certain normalized factor, the authors propose a simplified TDMP algorithm with improved performance.
Another exciting area of research covered in this Special Issue is radar signal processing. Two papers address the design of the waveforms transmitted by the radar: in [31], Wang et al. propose a technique that maximizes the signal-to-interference-plus-noise ratio (SINR) for known- and random-target models, as well as the mutual information (MI) between the radar echoes and the random-target spectrum responses. In [32], Hao et al. present a cognitive waveform design method subject to a peak-to-average power ratio constraint. Notably, these authors develop a minorization–maximization technique to reduce the computational burden of the overall procedure.
Finally, bringing this brief outline to an end, we would also like to mention block ciphering, an important subarea of cryptography, which is addressed by Wang and Ding [33], who present a novel algorithm that combines chaotic theory and Feistel block ciphers for securing the transmission of texts.

Author Contributions

S.C. and R.M.-C. collaborated in writing this editorial. All authors have read and approved the final manuscript.

Funding

This research was funded by the Ministerio de Economía, Industria y Competitividad (MINECO) of the Government of Spain, grant numbers TEC2017-82807-P and TEC2014-53103-P, and by the European Regional Development Fund (ERDF) of the European Commission.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef] [Green Version]
  2. Cover, T.M.; Thomas, J.A. Elements of Information Theory (Wiley Series in Telecommunications and Signal Processing); Wiley-Interscience: New York, NY, USA, 2006. [Google Scholar]
  3. Verdú, S.; McLaughlin, S.W. (Eds.) Information Theory: 50 Years of Discovery; IEEE Press: Piscataway, NJ, USA, 2000. [Google Scholar]
  4. Scharf, L.L. Statistical Signal Processing: Detection, Estimation, and Time Series Analysis; Addison Wesley: Boston, MA, USA, 1991. [Google Scholar]
  5. Verdú, S. The interplay between estimation theory and information theory. In Proceedings of the IEEE 6th Workshop on Signal Processing Advances in Wireless Communications, New York, NY, USA, 5–8 June 2005; pp. xxiv–xxv. [Google Scholar]
  6. Kay, S.M. Fundamentals of Statistical Signal Processing: Estimation Theory; Prentice-Hall Inc.: Upper Saddle River, NJ, USA, 1993. [Google Scholar]
  7. Kay, S.M. Fundamentals of Statistical Signal Processing: Detection Theory; Prentice-Hall Inc.: Upper Saddle River, NJ, USA, 1993. [Google Scholar]
  8. MacKay, D.J.C. Information Theory, Inference & Learning Algorithms; Cambridge University Press: New York, NY, USA, 2002. [Google Scholar]
  9. Rissanen, J. Information and Complexity in Statistical Modeling, 1st ed.; Springer Publishing Company Incorporated: Berlin, Germany, 2007. [Google Scholar]
  10. Olshausen, B.; Field, D. Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature 1996, 381, 607–609. [Google Scholar] [CrossRef] [PubMed]
  11. Smith, E.; Lewicki, M. Efficient auditory coding. Nature 2006, 439, 978–982. [Google Scholar] [CrossRef] [PubMed]
  12. Stilp, C.E.; Kluender, K.R. Cochlea-scaled entropy, not consonants, vowels, or time, best predicts speech intelligibility. Proc. Natl. Acad. Sci. USA 2010, 107, 12387–12392. [Google Scholar] [CrossRef] [Green Version]
  13. Amari, S.I. Information Geometry and Its Applications, 1st ed.; Springer Publishing Company Incorporated: Berlin, Germany, 2016. [Google Scholar]
  14. Tishby, N.; Zaslavsky, N. Deep learning and the information bottleneck principle. In Proceedings of the 2015 IEEE Information Theory Workshop (ITW), Jerusalem, Israel, 26 April–1 May 2015; pp. 1–5. [Google Scholar]
  15. Barron, A.R.; Klusowski, J.M. Complexity, Statistical Risk, and Metric Entropy of Deep Nets Using Total Path Variation. arXiv 2019, arXiv:1902.00800. [Google Scholar]
  16. Belda, J.; Vergara, L.; Safont, G.; Salazar, A. Computing the Partial Correlation of ICA Models for Non-Gaussian Graph Signal Processing. Entropy 2018, 21, 22. [Google Scholar] [CrossRef]
  17. Sarmiento, A.; Fondón, I.; Durán-Díaz, I.; Cruces, S. Centroid-Based Clustering with αβ-Divergences. Entropy 2019, 21, 196. [Google Scholar] [CrossRef]
  18. Delmaire, G.; Omidvar, M.; Puigt, M.; Ledoux, F.; Limem, A.; Roussel, G.; Courcot, D. Informed Weighted Non-Negative Matrix Factorization Using αβ-Divergence Applied to Source Apportionment. Entropy 2019, 21, 253. [Google Scholar] [CrossRef]
  19. Pinchas, M. A New Efficient Expression for the Conditional Expectation of the Blind Adaptive Deconvolution Problem Valid for the Entire Range of Signal-to-Noise Ratio. Entropy 2019, 21, 72. [Google Scholar] [CrossRef]
  20. Wu, B.; Gao, Y.; Feng, S.; Chanwimalueang, T. Sparse Optimistic Based on Lasso-LSQR and Minimum Entropy De-Convolution with FARIMA for the Remaining Useful Life Prediction of Machinery. Entropy 2018, 20, 747. [Google Scholar] [CrossRef]
  21. Cichocki, A.; Cruces, S.; Amari, S.i. Generalized Alpha-Beta Divergences and Their Application to Robust Nonnegative Matrix Factorization. Entropy 2011, 13, 134–170. [Google Scholar] [CrossRef]
  22. Vigneron, V.; Maaref, H. M-ary Rank Classifier Combination: A Binary Linear Programming Problem. Entropy 2019, 21, 440. [Google Scholar] [CrossRef]
  23. Szczęsna, A. Quaternion Entropy for Analysis of Gait Data. Entropy 2019, 21, 79. [Google Scholar] [CrossRef]
  24. Zhou, F.; Li, X.; Zhou, M.; Chen, Y.; Tan, H. A New Dictionary Construction Based Multimodal Medical Image Fusion Framework. Entropy 2019, 21, 267. [Google Scholar] [CrossRef]
  25. Ballesteros, D.M.; Peña, J.; Renza, D. A Novel Image Encryption Scheme Based on Collatz Conjecture. Entropy 2018, 20, 901. [Google Scholar] [CrossRef]
  26. Shen, S.; Yang, H.; Li, J.; Xu, G.; Sheng, M. Auditory Inspired Convolutional Neural Networks for Ship Type Classification with Raw Hydrophone Data. Entropy 2018, 20, 990. [Google Scholar] [CrossRef]
  27. Feng, G.; Guo, W.; Liu, B. Achievable Rate Region under Linear Beamforming for Dual-Hop Multiple-Access Relay Network. Entropy 2018, 20, 547. [Google Scholar] [CrossRef]
  28. Wang, M.; Wang, D. Sum-Rate of Multi-User MIMO Systems with Multi-Cell Pilot Contamination in Correlated Rayleigh Fading Channel. Entropy 2019, 21, 573. [Google Scholar] [CrossRef]
  29. Zhang, A.; Ji, Z. New Construction of Maximum Distance Separable (MDS) Self-Dual Codes over Finite Fields. Entropy 2019, 21, 101. [Google Scholar] [CrossRef]
  30. Wang, X.; Chang, H.; Li, J.; Cao, W.; Shan, L. Analysis of TDMP Algorithm of LDPC Codes Based on Density Evolution and Gaussian Approximation. Entropy 2019, 21, 457. [Google Scholar] [CrossRef]
  31. Wang, B.; Chen, X.; Xin, F.; Song, X. SINR- and MI-Based Maximin Robust Waveform Design. Entropy 2019, 21, 33. [Google Scholar] [CrossRef]
  32. Hao, T.; Cui, C.; Gong, Y. Efficient Low-PAR Waveform Design Method for Extended Target Estimation Based on Information Theory in Cognitive Radar. Entropy 2019, 21, 261. [Google Scholar] [CrossRef]
  33. Wang, J.; Ding, Q. Dynamic Rounds Chaotic Block Cipher Based on Keyword Abstract Extraction. Entropy 2018, 20, 693. [Google Scholar] [CrossRef]
