Article

Channel Capacity of Coding System on Tsallis Entropy and q-Statistics

by
Tatsuaki Tsuruyama
Department of Pathology, Kyoto University, Graduate School of Medicine, Yoshida-Konoe-cho, Sakyo-ku, Kyoto 606-8315, Japan
Entropy 2017, 19(12), 682; https://doi.org/10.3390/e19120682
Submission received: 12 August 2017 / Revised: 4 December 2017 / Accepted: 8 December 2017 / Published: 12 December 2017
(This article belongs to the Section Information Theory, Probability and Statistics)

Abstract

Information science has developed rapidly, and its applications now reach many fields. In this paper, we examine a coding system for the transmission of messages within the framework of Tsallis entropy and formulate the channel capacity by maximizing the Tsallis entropy under a given code-length constraint. As a result, we obtain a simple relational expression between code length and code appearance probability and, in addition, a generalized formula for the channel capacity based on Tsallis q-statistics. This theoretical framework may contribute to data-processing techniques and other applications.

1. Introduction

Information theory has developed greatly in recent years and has found applications in a broad range of research fields [1]. Shannon founded the entropy-based theory of information [2], and entropy generalizations have since been studied extensively. The theory of Tsallis statistics originated in the 1980s, with the principle of entropy maximization serving as a means of extending standard statistical theory. It builds on the fact that Boltzmann–Gibbs statistical mechanics can be reconstructed from the entropy maximization principle, a development that began in 1957 with the work of Jaynes [3,4]; the framework has been developed primarily to extend statistical mechanics. Tsallis entropy was introduced in 1988 by Constantino Tsallis as a basis for generalizing standard statistical mechanics through q-statistics [5]. Tsallis distributions can be derived by optimizing the Tsallis entropy; for example, the q-Gaussian is one of the probability distributions that arise from this maximization.
In this short paper, we define the channel capacity of a coding system for the transmission of messages on the basis of Tsallis entropy, with the aim of understanding Tsallis entropy theory from the viewpoint of coding theory and of maximizing the entropy associated with the number of signal events. For this purpose, we introduce coding symbols, their appearance probabilities, and the message duration. Through this development, we reconsider the significance of Tsallis entropy in coding theory. With respect to source coding in particular, the objective is to determine the relational expression obtained when the entropy maximization principle is applied to the relationship between code length and code appearance probability, which is then used to calculate the channel capacity of the coding system. A theory relating code length to code appearance probability was developed by Brillouin, who obtained a simple formula by maximizing the information of the coding system [6,7,8]. Progress has also been made on mutual information [7]; however, the relational expression between source code length and code appearance probability still needs improvement within the theory.

2. Source Coding for Tsallis Entropy Formulation

Consider all possible distinct messages corresponding to all possible combinations of the code symbols Aj, where each Aj has code length τj. For instance, a message described using n types of code symbols Aj (1 ≤ j ≤ n) may read as follows:
A_1 A_3 A_2 A_3 A_1 A_4 A_3 A_5 A_3 A_3 A_3    (1)
Our aim is to identify the coding scheme that maximizes the total information transmitted within a given duration. The messages, which consist of symbols Aj appearing Nj times (1 ≤ j ≤ n), correspond to all possible combinations of the symbols Aj. In message (1), for example, N1 = 2, N2 = 1, N3 = 6, and n = 5. Here, we consider Ψ, the total number of such distinct messages obtainable from the n symbols, assuming no restrictions, constraints, or correlations in the use of the symbols. The information I carried by such messages is
I = K \log \Psi    (2)
Here, K is an arbitrary constant. If we use entropy units, we take K = k_B, Boltzmann's constant; in information science, on the other hand, K = \log_2 e, so that I is measured in bits. Shannon defines the channel capacity as follows [2]:
C = \max \lim_{\tau \to \infty} \frac{K \log \Psi}{\tau} = \max \lim_{\tau \to \infty} \frac{I}{\tau}    (3)
Here, Ψ is regarded as a function of τ, the total duration of the message. The channel capacity is expressed in bits per second when the total duration is measured in seconds. We define the total number of code symbols N in a given message as:
N = \sum_{j=1}^{n} N_j    (4)
For example, N = 11 in message (1); thus, N varies from one message to another. Next, let p_j denote the appearance probability of the symbol A_j in a message consisting of N symbols in total:
p_j \equiv \frac{N_j}{N}    (5)
and
\sum_{j=1}^{n} p_j = 1    (6)
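As a minimal illustration of (4)–(6), the short Python sketch below counts the symbols of the example message (1) and checks the normalization of the appearance probabilities; the message list simply transcribes (1).

```python
from collections import Counter

# Example message (1): two A1, one A2, six A3, one A4, one A5.
message = ["A1", "A3", "A2", "A3", "A1", "A4", "A3", "A5", "A3", "A3", "A3"]

counts = Counter(message)                        # N_j for each symbol A_j
N = sum(counts.values())                         # Eq. (4): N = 11
p = {sym: Nj / N for sym, Nj in counts.items()}  # Eq. (5): p_j = N_j / N

print(N)                  # 11
print(p)                  # e.g., p_{A1} = 2/11, p_{A3} = 6/11
print(sum(p.values()))    # Eq. (6): 1.0
```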
Using (5), and noting that Ψ is the multinomial number of arrangements N!/(N_1! \cdots N_n!) so that Stirling's approximation applies for long messages, we can rewrite (3) as follows:
C = \max \lim_{\tau \to \infty} \left( -K N \sum_{j=1}^{n} p_j \log p_j \Big/ \tau \right) = \max \lim_{\tau \to \infty} \left( K S / \tau \right)    (7)
Here, S represents the Shannon entropy.
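The step from (2) and (3) to (7) can be checked numerically. The sketch below, with illustrative symbol counts of our own choosing, compares the exact log Ψ of the multinomial coefficient with the Shannon form appearing in (7); the two agree up to corrections that become negligible relative to N as the message grows.

```python
from math import lgamma, log

# Hypothetical symbol counts N_j for a long message (illustrative values only).
counts = [200, 100, 600, 100, 100]
N = sum(counts)
p = [Nj / N for Nj in counts]                               # Eq. (5)

# log of the multinomial coefficient N!/(N_1!...N_n!), via log-gamma.
log_psi = lgamma(N + 1) - sum(lgamma(Nj + 1) for Nj in counts)

# Shannon form appearing in Eq. (7).
shannon_form = -N * sum(pj * log(pj) for pj in p)

print(log_psi, shannon_form)   # close for large N (Stirling's approximation)
```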
In this study, we investigate the channel capacity when the entropy is given by the Tsallis entropy. To this end, we introduce the q-duration of the message as follows:

\tau_q = N \sum_{j=1}^{n} \phi_j \tau_j    (8)
Here, τ_j denotes the jth code length, and φ_j is the escort probability:
\phi_j \equiv \frac{p_j^q}{c_q}    (9)
Indeed,
\sum_{j=1}^{n} \phi_j = \sum_{j=1}^{n} \frac{p_j^q}{c_q} = 1    (10)
For simplicity, we use the standard notation of Tsallis statistics:
c_q \equiv \sum_{j=1}^{n} p_j^q    (11)
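The following Python sketch evaluates c_q, the escort probabilities φ_j, and the q-duration τ_q of (8) for an assumed probability vector, set of code lengths, and symbol count; all values are illustrative.

```python
# Escort probabilities and q-duration, Eqs. (8)-(11); all values are illustrative.
q = 1.5
p = [0.4, 0.3, 0.2, 0.1]          # appearance probabilities p_j
tau = [1.0, 2.0, 3.0, 4.0]        # code lengths tau_j (arbitrary units)
N = 100                           # total number of symbols in the message

c_q = sum(pj**q for pj in p)                            # Eq. (11)
phi = [pj**q / c_q for pj in p]                         # Eq. (9)
tau_q = N * sum(ph * tj for ph, tj in zip(phi, tau))    # Eq. (8)

print(sum(phi))       # Eq. (10): 1.0
print(c_q, tau_q)
```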
The Tsallis entropy is given by [6] (http://tsallis.cat.cbpf.br/TEMUCO.pdf):
S_q = N \sum_{j=1}^{n} \frac{p_j - p_j^q}{q - 1}    (12)
The theory of Tsallis statistics is based on this generalized form of entropy S_q (q ∈ ℝ); when q → 1, it recovers the Shannon entropy:
S = -N \sum_{j=1}^{n} p_j \log p_j    (13)
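As a quick numerical check of this q → 1 limit, the sketch below evaluates (12) for values of q approaching 1 and compares the result with (13); the probability vector and symbol count are illustrative.

```python
from math import log

# Tsallis entropy S_q, Eq. (12), approaches the Shannon entropy S, Eq. (13), as q -> 1.
p = [0.4, 0.3, 0.2, 0.1]
N = 100

def tsallis_entropy(q, p, N):
    return N * sum((pj - pj**q) / (q - 1) for pj in p)   # Eq. (12)

shannon_entropy = -N * sum(pj * log(pj) for pj in p)      # Eq. (13)

for q in (2.0, 1.5, 1.1, 1.01, 1.001):
    print(q, tsallis_entropy(q, p, N), shannon_entropy)
```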
We aim to maximize the Tsallis entropy S_q in (12) [8] instead of the Shannon entropy. To this end, we introduce a function G with undetermined multipliers β and γ, in reference to (8), (9), and (10):
G(p_1, p_2, \ldots, p_n; N) \equiv S_q - \beta \sum_{j=1}^{n} \phi_j - \gamma N \sum_{j=1}^{n} \phi_j \tau_j    (14)
Then
\frac{\partial}{\partial p_j} G(p_1, p_2, \ldots, p_n; N) = N \frac{1 - q p_j^{q-1}}{q-1} - (\beta + \gamma N \tau_j) \frac{q p_j^{q-1} (c_q - p_j^q)}{c_q^2}    (15)
\frac{\partial}{\partial N} G(p_1, p_2, \ldots, p_n; N) = \sum_{j=1}^{n} \frac{p_j - p_j^q}{q-1} - \sum_{j=1}^{n} \gamma \frac{p_j^q}{c_q} \tau_j    (16)
In calculating (15), we used
\frac{\partial \phi_j}{\partial p_j} = \frac{q p_j^{q-1} (c_q - p_j^q)}{c_q^2}    (17)
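The derivative (17) can be verified symbolically; the sketch below does so for n = 3 symbols using the sympy library (the use of sympy and the choice n = 3 are assumptions made only for illustration).

```python
import sympy as sp

# Symbolic check of Eq. (17) for n = 3 symbols.
q = sp.symbols('q', positive=True)
p1, p2, p3 = sp.symbols('p1 p2 p3', positive=True)

c_q = p1**q + p2**q + p3**q                       # Eq. (11)
phi1 = p1**q / c_q                                # escort probability, Eq. (9)

lhs = sp.diff(phi1, p1)
rhs = q * p1**(q - 1) * (c_q - p1**q) / c_q**2    # Eq. (17)

print(sp.simplify(lhs - rhs))    # 0
```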
To maximize G(p_1, p_2, \ldots, p_n; N), we set the right-hand sides of (15) and (16) equal to zero, which gives:
(\beta + \gamma N \tau_j) \frac{q p_j^{q-1} (c_q - p_j^q)}{c_q^2} = N \frac{1 - q p_j^{q-1}}{q-1}    (18)
\gamma \frac{p_j^q}{c_q} \tau_j = \frac{p_j - p_j^q}{q-1}    (19)
Solving the above equations for the undetermined coefficients β and γ, we obtain:
\beta = \frac{N c_q \left[ q p_j^q (p_j - p_j^q) - (q-1) c_q p_j \right]}{q (q-1) p_j^q (c_q - p_j^q)}    (20)
and
\gamma_q \equiv \frac{\gamma}{c_q} = \frac{p_j^{1-q} - 1}{(q-1)\,\tau_j}    (21)
Rewriting (21), using the q-logarithm function,
\log_q x \equiv \frac{1 - x^{1-q}}{q-1}    (22)
and we obtain from (8), (21) and (22):
\log_q p_j = -\gamma_q \tau_j    (23)
Equation (23) implies that the most probable code symbols must be short, while improbable symbols may be long. Indeed, as q approaches 1, (23) reduces to the logarithmic relation of Brillouin's work, written with another constant γ′:
\log p_j = -\gamma' \tau_j    (24)
Thus, the appearance probability of a symbol can be related to its code length through (23), in analogy with the Shannon-entropy coding relation (24) [9]. This result is explicitly a natural extension of Brillouin's theory of the relationship between coding and Shannon entropy to the framework of Tsallis entropy.
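To illustrate (23) and its q → 1 limit (24), the sketch below computes the code lengths τ_j = −log_q(p_j)/γ_q implied by (23) for an assumed probability vector and rate constant γ_q, and shows that they approach the lengths −log(p_j)/γ′ of (24) as q → 1; all parameter values are illustrative.

```python
from math import log

def log_q(x, q):
    """q-logarithm of Eq. (22); reduces to log(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return log(x)
    return (1.0 - x**(1.0 - q)) / (q - 1.0)

# Illustrative appearance probabilities and rate constant.
p = [0.4, 0.3, 0.2, 0.1]
gamma_q = 1.0

for q in (1.5, 1.1, 1.000001):
    # Eq. (23): log_q p_j = -gamma_q * tau_j  =>  tau_j = -log_q(p_j) / gamma_q
    tau = [-log_q(pj, q) / gamma_q for pj in p]
    print(q, [round(t, 4) for t in tau])

# As q -> 1 the lengths approach tau_j = -log(p_j)/gamma', Eq. (24).
```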

3. Channel Capacity and Tsallis Entropy

As shown in (23), γq is equivalent to the Tsallis average entropy production rate σq during the transmission of the message:
\gamma_q = -\frac{\log_q p_j}{\tau_j} \equiv \sigma_q    (25)
Our definition now yields the q-channel capacity Cq, in reference to (3), (7), (8), and (21) as follows:
C_q \equiv \lim_{\tau_q \to \infty} \frac{-K N \sum_{j=1}^{n} p_j^q \log_q p_j}{\tau_q} = \lim_{\tau_q \to \infty} \frac{K N \sigma_q \sum_{j=1}^{n} p_j^q \tau_j}{\tau_q} = K_q \sigma_q    (26)
with
K_q \equiv K c_q    (27)
Here, K is an arbitrary constant. The channel capacity therefore has the same dimension as the entropy production rate and is proportional to the Tsallis average entropy production rate. Thus, the above result is explicitly a natural extension of the channel capacity to Tsallis entropy.
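A final numerical sketch checks the identity C_q = K_q σ_q of (26) and (27): when the code lengths are chosen according to (23) with γ_q = σ_q, the ratio K S_q/τ_q coincides with K c_q σ_q. The probability vector, symbol count, and value of σ_q below are illustrative.

```python
# Numerical check of Eqs. (26)-(27): C_q = K_q * sigma_q; all values are illustrative.
q = 1.5
K = 1.0
sigma_q = 2.0                      # assumed Tsallis average entropy production rate
p = [0.4, 0.3, 0.2, 0.1]
N = 1000

def log_q(x, q):
    return (1.0 - x**(1.0 - q)) / (q - 1.0)                    # Eq. (22)

c_q = sum(pj**q for pj in p)                                   # Eq. (11)
tau = [-log_q(pj, q) / sigma_q for pj in p]                    # code lengths from Eq. (23)
tau_q = N * sum((pj**q / c_q) * tj for pj, tj in zip(p, tau))  # Eq. (8)
S_q = N * sum((pj - pj**q) / (q - 1) for pj in p)              # Eq. (12)

C_q = K * S_q / tau_q          # first expression in Eq. (26), since S_q = -N*sum(p_j^q * log_q p_j)
print(C_q, K * c_q * sigma_q)  # both equal K_q * sigma_q, Eq. (27)
```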

4. Conclusions

In this short article, we derived an important relation between code length and appearance probability, Equation (23), together with the generalized channel capacity of Equation (26). In the same way that Shannon entropy was extended to Tsallis entropy, the source-coding theory that Brillouin based on the former [9,10] was here generalized to a theory based on Tsallis entropy [5,7,11]. On the other hand, it remains to be discussed how the q-duration in Equation (8) should be interpreted. The duration of a given message event is generally shorter than the actual message duration; however, when the appearance probability distribution obeys a q-Gaussian, the q-duration is an indicator available for analyzing the coding background in place of the actual duration. From this perspective, we will further investigate the definition of the q-duration and the limitations of our calculation in future work.
This theoretical generalization of source coding can be applied to data management in the areas of data communication, processing, and conversion, particularly in the development of imaging applications. Further investigation is needed with regard to the tractability of Tsallis entropy and q-statistics in evaluating actual experimental data. We have applied related medical imaging techniques following previous reports [12,13]; in future work, we intend to investigate systems in which the entropy is measurable and/or definable.

Acknowledgments

This work was supported by a Grant-in-Aid for Scientific Research on Innovative Areas from the Ministry of Education, Culture, Sports, Science, and Technology (MEXT) of Japan (Synergy of Fluctuation and Structure: Quest for Universal Laws in Non-Equilibrium Systems, P2013-201). We thank Kenichi Yoshikawa of Doshisha University for his advice.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Daróczy, Z. Generalized information functions. Inf. Control 1970, 16, 36–51.
  2. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
  3. Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. 1957, 106, 620–630.
  4. Jaynes, E.T. Information theory and statistical mechanics II. Phys. Rev. 1957, 108, 171–190.
  5. Tsallis, C. Possible generalization of Boltzmann–Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487.
  6. Rosso, O.A.; Martin, M.T.; Larrondo, H.A.; Kowalski, A.; Plastino, A. Generalized statistical complexity: A new tool for dynamical systems. In Concepts and Recent Advances in Generalized Information Measures and Statistics; Kowalski, A.M., Rossignoli, R.D., Curado, E.M.F., Eds.; Bentham Science Publishers: Emirate of Sharjah, United Arab Emirates, 2013.
  7. Tsallis, C. Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World; Springer: New York, NY, USA, 2009; pp. 329–334.
  8. Livadiotis, G. Entropy maximization (Part 1: Theory and Formalism). In Kappa Distributions: Theory and Applications in Plasmas; Elsevier: Amsterdam, The Netherlands; Cambridge, UK; Atlanta, GA, USA, 2017; pp. 22–27.
  9. Brillouin, L. Principle of coding. In Science and Information Theory, 2nd ed.; Dover Publications: Mineola, NY, USA, 2013; Chapter 4, pp. 28–50.
  10. Brillouin, L. The analysis of coding. In Science and Information Theory, 2nd ed.; Dover Publications: Mineola, NY, USA, 2013; Chapter 8, pp. 78–113.
  11. Angulo, J.M.; Esquivel, F.J. Multifractal dimensional dependence assessment based on Tsallis mutual information. Entropy 2015, 17, 5382–5401.
  12. Hamza, A.B. An information-theoretic method for multimodality medical image registration. Expert Syst. Appl. 2012, 39, 5548–5556.
  13. Tarmissi, K. Information-theoretic hashing of 3D objects using spectral graph theory. Expert Syst. Appl. 2009, 36, 9409–9414.
