Abstract
With the help of Tsallis residual entropy, we introduce Tsallis quantile entropy order between two random variables. We give necessary and sufficient conditions, study closure and reversed closure properties under parallel and series operations and show that this order is preserved in the proportional hazard rate model, proportional reversed hazard rate model, proportional odds model and record values model.
Keywords:
Tsallis entropy; Tsallis quantile entropy; Tsallis residual entropy; Tsallis quantile entropy order MSC:
60E15; 60K10; 62B10; 62N05; 90B25; 94A17
1. Introduction
The concept of entropy, defined mathematically by Shannon in [], measures the uncertainty of a physical system and has applications in many scientific and technological areas, such as physics, probability theory, statistics, communication theory and economics. The notion originated in thermodynamics and statistical mechanics. In information theory, a data communication system has three elements: a source of data, a communication channel and a receiver. Based on the signal received through the channel, Shannon studied how to identify what sort of data had been generated, and many methods for encoding, compressing and transmitting messages were developed. Shannon’s source coding theorem, also known as Shannon’s first theorem, establishes the limits of error-free encoding; this result is generalized to noisy channels by Shannon’s noisy-channel coding theorem. In recent decades, Shannon entropy has been intensively studied and many generalizations have appeared (Tsallis entropy, Rényi entropy, Varma entropy, Kaniadakis entropy, relative entropy, weighted entropy, cumulative entropy, etc.).
In [], Tsallis replaced the classical logarithm appearing in Shannon entropy with a generalized logarithm, defining, in this way, what we call today Tsallis entropy. This new entropy has many applications, especially in physics, and, more precisely: superstatistics (see []), spectral statistics (see []), earthquakes (see [,,]), stock exchanges (see [,]), plasma (see []), income distribution (see []), non-coding human DNA (see []), the internet (see []), and statistical mechanics (see [,]). For more information about Tsallis entropy, we recommend reading [].
Among the applications of other entropies (Rényi entropy, Varma entropy, Kaniadakis entropy, relative entropy, weighted entropy, etc.), we can list the following: Markov chains (see [,,]), model selection (see [,]), combinatorics (see [,]), finance (see [,,]), Lie symmetries (see [,]), and machine learning (see [,]).
There are several papers in which the authors compare random variables from the point of view of residual entropies: for Shannon residual entropy, see [,,]; for Rényi residual entropy, see [,]; for Varma residual entropy, see []; and for Awad-Varma residual entropy, see []. Other orders between random variables can be found in [,,,,,,,].
Rao et al. [] introduced an alternative measure to Shannon entropy, known as the cumulative residual entropy (CRE), by considering the survival function instead of the probability density function. Because the survival function is more regular than the probability density function, CRE is considered to be more stable and possesses more convenient mathematical properties. Moreover, the distribution function exists even when the probability density function does not (see, e.g., the generalized lambda, power-Pareto and Govindarajulu distributions). Sati and Gupta [] introduced a cumulative residual Tsallis entropy and extended it to its dynamic form based on the residual lifetime. Rajesh and Sunoj [] introduced an alternative form of the cumulative residual Tsallis entropy and proved some results with applications in reliability. Toomaj and Atabay [] elaborated some further consequences of the alternative cumulative residual Tsallis entropy introduced by Rajesh and Sunoj [], including stochastic ordering, expressions and bounds, and proposed a normalized version of the cumulative residual Tsallis entropy, which can be used as a dispersion measure in place of the coefficient of variation. Kumar [] obtained characterization results based on the dynamic cumulative residual Tsallis entropy. In many realistic situations, uncertainty is not necessarily related to the future and can refer to the past as well. For instance, if at time t a system which is observed only at certain preassigned inspection times is found to be down, then the uncertainty of the system life relies on the past, i.e., on the instant in the past at which it failed. A wide variety of research is available on entropy measures and their applications to past lifetime. For more detail, one can refer to Di Crescenzo and Longobardi [], Di Crescenzo and Longobardi [], Sachlas and Papaioannou [] and Di Crescenzo and Toomaj [].
Also, a study of the cumulative Tsallis entropy for past lifetime is available in the works of Nair et al. [], Calì et al. [], Khammar and Jahanshahi [], Sunoj et al. [] and Alomani and Kayid []. Baratpour and Khammar [] studied the Tsallis entropy of order statistics. The quantile-based approach has some advantages: it provides an alternative methodology for deriving the cumulative Tsallis entropy in past lifetime, and it extends the domain of application of the cumulative Tsallis entropy in past lifetime to many flexible quantile functions which serve as useful lifetime models yet possess no closed-form distribution function.
The paper is organized as follows. After this Introduction section, in Section 2, Background and Notations, we present the main notions and notations used throughout the article. In Section 3, Fundamental Results, we present the main theorem (Theorem 1), which is used in all of our results. In this section, we also prove that the dispersive order and the convex transform order apply to the Tsallis quantile entropy order. In Section 4, Closure and Reversed Closure Properties, we show the closure and reversed closure properties of Tsallis quantile entropy order under parallel and series operations. In the last four sections, we show the preservation of Tsallis quantile entropy in some stochastic models: the proportional hazard rate model (Section 5—Preservation of Tsallis Quantile Entropy Order in the Proportional Hazard Rate Model), the proportional reversed hazard rate model (Section 6—Preservation of Tsallis Quantile Entropy Order in the Proportional Reversed Hazard Rate Model), the proportional odds model (Section 7—Preservation of Tsallis Quantile Entropy Order in the Proportional Odds Model) and the proportional record values model (Section 8—Preservation of Tsallis Quantile Entropy Order in the Record Values Model).
2. Background and Notations
Throughout this paper, we assume that all expectations are finite and all ratios and powers are well defined. For information on notions of probability theory, we recommend [].
We consider X a non-negative random variable with an absolutely continuous cumulative distribution function F, a survival function F̄ = 1 − F and a probability density function f (X represents a living thing or the lifetime of a device).
Shannon entropy of X is given by
where “log” is the natural logarithm function and Z is a non-negative random variable distributed identically to X.
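The display formula appears to have been lost in extraction; assuming the paper uses the standard definition, consistent with the surrounding prose, it presumably reads

```latex
H(X) = -\int_0^{\infty} f(x)\,\log f(x)\,dx = \mathbb{E}\left[-\log f(Z)\right],
```

where the expectation form matches the role of Z described above.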
Let . Tsallis logarithm is given via
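The stripped display is presumably the standard Tsallis (q-)logarithm,

```latex
\log_q(x) = \frac{x^{1-q}-1}{1-q}, \qquad x > 0,\ q > 0,\ q \neq 1,
```

which recovers the natural logarithm in the limit q → 1.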
From this point onward, we assume that .
Tsallis entropy of X is defined by
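Assuming the standard definition via the Tsallis logarithm (a reconstruction, since the display was stripped), the Tsallis entropy of X is presumably

```latex
T_q(X) = \mathbb{E}\left[\log_q \frac{1}{f(Z)}\right]
       = \frac{1}{q-1}\left(1-\int_0^{\infty} f^{\,q}(x)\,dx\right).
```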
In this paper, we work with Tsallis residual entropy, defined via
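The display defining the Tsallis residual entropy was stripped; the standard residual (dynamic) form found in the literature, which the paper presumably uses, is

```latex
T_q(X;t) = \frac{1}{q-1}\left(1-\int_t^{\infty}\left(\frac{f(x)}{\bar F(t)}\right)^{\!q} dx\right), \qquad t \ge 0,
```

the uncertainty of the remaining lifetime of X given survival up to time t.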
We recall that the quantile function of X is given by Q(u) = F^{-1}(u) = inf{x ≥ 0 : F(x) ≥ u}, u ∈ (0,1).
We have F(Q(u)) = u for any u ∈ (0,1). Differentiating both sides of this equality with respect to u, we obtain f(Q(u)) Q′(u) = 1 for any u ∈ (0,1). With the notation q(u) = Q′(u) (the quantile density function) for any u ∈ (0,1), it follows that f(Q(u)) = 1/q(u) for any u ∈ (0,1).
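The quantile identity f(Q(u)) = 1/q(u) derived above can be verified numerically. The sketch below uses the exponential distribution as an illustrative example (our choice, not one from the paper), where Q and the quantile density q are available in closed form.

```python
import math

# Numerical check of f(Q(u)) = 1 / q(u), where Q = F^{-1} is the quantile
# function and q = Q' the quantile density, for an Exp(lam) lifetime.
lam = 2.0

def Q(u):          # quantile function: Q(u) = -log(1 - u) / lam
    return -math.log(1.0 - u) / lam

def q_density(u):  # quantile density: q(u) = Q'(u) = 1 / (lam * (1 - u))
    return 1.0 / (lam * (1.0 - u))

def f(x):          # probability density: f(x) = lam * exp(-lam * x)
    return lam * math.exp(-lam * x)

for u in (0.1, 0.5, 0.9):
    assert abs(f(Q(u)) - 1.0 / q_density(u)) < 1e-12
```

For the exponential, both sides reduce to λ(1 − u), so the identity holds exactly up to floating-point error.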
Let for any .
For any , we obtain
where U is a random variable uniformly distributed on .
In this paper, we are concerned about comparing two absolutely continuous non-negative random variables from the point of view of Tsallis residual entropy. More precisely, if X and Y are absolutely continuous non-negative random variables, we compare and for any .
In the proofs, we will make use of the lemma below.
Lemma 1
(see []). Let an increasing function and such that
Then
3. Fundamental Results
Definition 1.
We say that X is smaller than Y in Tsallis quantile entropy order (and denote by ) if for any .
In the last several years, stochastic orders and inequalities have been used intensively in many areas of probability and statistics, such as reliability theory, queuing theory, survival analysis, biology, economics, insurance, actuarial science, operations research and management science. The simplest way of comparing two distribution functions is to compare the associated means. Because this comparison is based on only two numbers (the means), it is sometimes not very informative; moreover, it is possible for the means not to exist. In many applications, we have more detailed information concerning the comparison of two distribution functions than just the two means. If we compare two distribution functions with the same mean (or that are centered about the same value), we can compare the dispersion of these distributions. The simplest way of doing this is to compare the associated standard deviations. But, again, the comparison depends on only two numbers (the standard deviations), which are at times not very informative; as mentioned above, it is also possible for the standard deviations not to exist. The concept of stochastic orders plays a major role in the theory and practice of statistics. It generally refers to a set of relations that may hold between a pair of distributions of random variables. In reliability, the stochastic orders which compare life distributions based on different characteristics are used to study aging properties, to develop bounds on reliability functions, to compare the performance of policies and systems and to derive new inference procedures. Many such orders are defined in terms of concepts based on distribution functions.
The theorem below is the main result of this paper.
Theorem 1.
The following assertions are equivalent:
- 1.
- .
- 2.
- for any .
Proof.
From Definition 1, if and only if
If we take in the preceding inequality, the following equivalences are valid for any :
In order to obtain the conclusion, it is sufficient to denote . □
Definition 2
(see []). We say that:
- 1.
- X is smaller than Y in the dispersive order (and write ) if
- 2.
- X is smaller than Y in the convex transform order (and write ) if the function
The dispersive order is a basic concept for comparing spread among probability distributions, with applications to order statistics, spacings and convolutions of independent random variables. The convex transform order is used to make precise comparisons between the skewness of probability distributions on the real line. From the point of view of the aging interpretation, this order can be seen as identifying aging rates in a way that also works when lifetimes do not start simultaneously (for more details concerning these two orders, the reader can consult []).
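As a concrete illustration of the dispersive order (our own example, not from the paper): for exponential lifetimes, a larger rate yields less spread, so one expects Exp(2) to be smaller than Exp(1) in the dispersive order, i.e., quantile differences of the rate-1 variable dominate those of the rate-2 variable.

```python
import math

# Dispersive-order check on a grid: X ~ Exp(2) vs Y ~ Exp(1).
# X <=_disp Y means Q_Y(v) - Q_Y(u) >= Q_X(v) - Q_X(u) for all u <= v.
def Q(u, lam):  # exponential quantile function
    return -math.log(1.0 - u) / lam

lam_x, lam_y = 2.0, 1.0
grid = [i / 100 for i in range(1, 100)]
for i, u in enumerate(grid):
    for v in grid[i:]:
        assert Q(v, lam_y) - Q(u, lam_y) >= Q(v, lam_x) - Q(u, lam_x) - 1e-12
```

Here the inequality is immediate analytically, since both quantile differences equal log((1 − u)/(1 − v)) scaled by 1/λ.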
Theorem 2.
If , then .
Proof.
Assume that . Then for any , hence for any , and the conclusion follows from Theorem 1. □
Theorem 3.
If and , then .
Proof.
Assume that . Then the function is non-negative and increasing; hence
With Theorem 1, we obtain the conclusion. □
4. Closure and Reversed Closure Properties
We consider and to be independent and identically distributed (i.i.d.) copies of X and Y, respectively, and
Theorem 4.
If , then .
Proof.
Because , we can determine with Theorem 1 that
It can be seen that, for any
and
Then
Because the function
it follows, via inequality (1) and Lemma 1, that
The relationship follows from Theorem 1. □
Theorem 5.
If , then .
Proof.
Because , we have, by Theorem 1, that
We can see that, for any
and
Then
By applying Theorem 1, we obtain that . □
The natural step is to generalize the preceding two theorems from a finite number n to a random variable N.
We consider and as sequences of independent and identically distributed copies of X and Y, respectively. Let N be a positive integer random variable with the probability mass function , and such that N is independent of and . Take
and
Theorem 6.
If , then .
Proof.
Because , we can determine by Theorem 1 that
One can see that, for any
and
It was proven in [] that
Hence, for any
and
Then
The conclusion thus follows from Theorem 1. □
Theorem 7.
If , then .
Proof.
Because , we can determine by Theorem 1 that
We can see that, for any ,
and
It was proven in [] that
Hence, for any
and
Then
By Theorem 1, we conclude that . □
5. Preservation of Tsallis Quantile Entropy Order in the Proportional Hazard Rate Model
We consider the following proportional hazard rate model (see []), namely for any , for which we take and as two absolutely continuous non-negative random variables with the survival functions and , respectively.
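The stripped display is presumably the standard proportional hazard rate relation, in which the baseline survival function is raised to the power θ. As a quick sanity check (our own illustrative example, with an exponential baseline):

```python
import math

# In the PHR model the survival function satisfies SF_theta(x) = SF(x)**theta.
# For an Exp(lam) baseline, (e^{-lam x})^theta = e^{-theta lam x}:
# the PHR transform of an exponential is again exponential, rate theta*lam.
lam, theta = 1.5, 3.0
for x in (0.1, 1.0, 2.5):
    sf_baseline = math.exp(-lam * x)
    assert abs(sf_baseline ** theta - math.exp(-theta * lam * x)) < 1e-12
```

This closure of the exponential family under the PHR transform is the usual motivating example for the model.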
Theorem 8.
- 1.
- If and , then .
- 2.
- If and , then .
Proof.
For any , we can obtain:
and
Then:
- 1.
- If and , then the function
Using Lemma 1, we can determine that .
- 2.
- If and , then the function
Using Lemma 1, we can determine that .
□
6. Preservation of Tsallis Quantile Entropy Order in the Proportional Reversed Hazard Rate Model
We consider the following proportional reversed hazard rate model (see []), namely for any , for which we take and as two absolutely continuous non-negative random variables with the distribution functions and , respectively.
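The stripped display is presumably the standard proportional reversed hazard rate relation (a reconstruction under that assumption):

```latex
F_{X_{\theta}}(x) = \left(F_X(x)\right)^{\theta}, \qquad x \ge 0,\ \theta > 0,
```

i.e., the distribution function, rather than the survival function, is raised to the power θ.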
Theorem 9.
- 1.
- If and , then .
- 2.
- If and , then .
Proof.
We can determine for any :
and
Then:
- 1.
- If and , then the function
Using Lemma 1, we can determine that .
- 2.
- If and , then the function
Using Lemma 1, we can determine that .
□
7. Preservation of Tsallis Quantile Entropy Order in the Proportional Odds Model
We work with the following proportional odds model (see []), namely for any , for which we take the proportional odds random variables and , defined by the survival functions and , respectively, for any .
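The stripped display is presumably the proportional odds (Marshall–Olkin type) survival relation; under that assumption it would read

```latex
\bar F_{X_{\theta}}(x) = \frac{\theta\,\bar F_X(x)}{1-(1-\theta)\,\bar F_X(x)}, \qquad x \ge 0,\ \theta > 0,
```

so that the odds of survival of the transformed variable are proportional (with factor θ) to those of the baseline.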
Theorem 10.
- 1.
- If and , then .
- 2.
- If and , then .
Proof.
For any we have
and
Then
- 1.
- Assume that and . Then
Hence, by Lemma 1, we obtain .
- 2.
- Assume that and . Then
Hence, by Lemma 1, we obtain .
□
8. Preservation of Tsallis Quantile Entropy Order in the Record Values Model
Let and be sequences of i.i.d. random variables from the random variables X and Y, respectively, with survival functions and , respectively, and density functions and , respectively. We consider the nth record times and , respectively, defined via and for any and and , respectively.
We denote and , respectively, and call them the nth record values (see []).
For any , we can obtain
and
where is the survival function of a Gamma random variable with the shape parameter n and the scale parameter 1, is the cumulative failure rate function of X and is the cumulative failure rate function of Y.
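The relation above, P(nth record > x) = Γ̄ₙ(Λ_X(x)) with Γ̄ₙ the Gamma(n, 1) survival function and Λ_X the cumulative failure (hazard) rate, can be illustrated by simulation. The sketch below is our own example with X ~ Exp(1), where Λ_X(x) = x and, by the well-known Rényi representation, the nth record value is a sum of n i.i.d. Exp(1) increments.

```python
import math
import random

# Monte Carlo check of P(R_n > x) = GammaBar_n(Lambda_X(x)) for X ~ Exp(1),
# where Lambda_X(x) = x and GammaBar_n is the Gamma(n, 1) survival function.
random.seed(0)
n, x, trials = 3, 2.0, 50000

# Renyi representation: the nth record of an Exp(1) stream is a sum of
# n i.i.d. Exp(1) increments.
empirical = sum(
    sum(random.expovariate(1.0) for _ in range(n)) > x for _ in range(trials)
) / trials

# Gamma(n, 1) survival at x in Erlang (Poisson-sum) form:
analytic = math.exp(-x) * sum(x ** k / math.factorial(k) for k in range(n))
assert abs(empirical - analytic) < 0.02
```

With n = 3 and x = 2, the analytic value is 5e⁻² ≈ 0.677, and the Monte Carlo estimate agrees within sampling error.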
Theorem 11.
Let .
- 1.
- If , then .
- 2.
- If and , then .
Proof.
- 1.
- If , then
We have, for any ,
- 2.
- If , then
□
9. Conclusions
We introduced the Tsallis quantile entropy order between two random variables, found necessary and sufficient conditions for it and proved closure and reversed closure properties of this order under parallel and series operations. We also showed that the Tsallis quantile entropy order is preserved in some stochastic models, such as the proportional hazard rate model, the proportional reversed hazard rate model, the proportional odds model and the record values model. In this way, we generalize results from other papers by working with Tsallis residual entropy instead of Shannon residual entropy (used in [,,]), Rényi residual entropy (used in [,]), Varma residual entropy (used in []) and Awad-Varma residual entropy (used in []).
Author Contributions
Conceptualization, R.-C.S. and V.P.; methodology, R.-C.S. and V.P.; software, R.-C.S. and V.P.; validation, R.-C.S. and V.P.; formal analysis, R.-C.S. and V.P.; investigation, R.-C.S. and V.P.; writing—original draft, R.-C.S. and V.P.; writing—review and editing, R.-C.S. and V.P.; visualization, R.-C.S. and V.P.; supervision, R.-C.S. and V.P.; project administration, R.-C.S. and V.P. All authors contributed equally to this work. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Data Availability Statement
No new data were created or analyzed in this study. Data sharing is not applicable to this article.
Acknowledgments
The authors are very much indebted to the anonymous referees and to the editors for their most valuable comments and suggestions which improved the quality of the paper.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Shannon, C. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
- Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. J. Stat. Phys. 1988, 52, 479–487. [Google Scholar] [CrossRef]
- Beck, C.; Cohen, E.G.D. Superstatistics. Phys. A 2003, 322, 267–275. [Google Scholar] [CrossRef]
- Tsekouras, G.A.; Tsallis, C. Generalized entropy arising from a distribution of q indices. Phys. Rev. E 2005, 71, 046144. [Google Scholar] [CrossRef] [PubMed]
- Abe, S.; Suzuki, N. Law for the distance between successive earthquakes. J. Geophys. Res. 2003, 108, 2113. [Google Scholar] [CrossRef]
- Darooneh, A.H.; Dadashinia, C. Analysis of the spatial and temporal distributions between successive earthquakes: Nonextensive statistical mechanics viewpoint. Phys. A 2008, 387, 3647–3654. [Google Scholar] [CrossRef]
- Hasumi, T. Hypocenter interval statistics between successive earthquakes in the two-dimensional Burridge-Knopoff model. Phys. A 2009, 388, 477–482. [Google Scholar] [CrossRef]
- Jiang, Z.Q.; Chen, W.; Zhou, W.X. Scaling in the distribution of intertrade durations of Chinese stocks. Phys. A 2008, 387, 5818–5825. [Google Scholar] [CrossRef]
- Kaizoji, T. An interacting-agent model of financial markets from the viewpoint of nonextensive statistical mechanics. Phys. A 2006, 370, 109–113. [Google Scholar] [CrossRef]
- Lima, J.; Silva, R., Jr.; Santos, J. Plasma oscillations and nonextensive statistics. Phys. Rev. E 2000, 61, 3260. [Google Scholar] [CrossRef]
- Soares, A.D.; Moura, N.J., Jr.; Ribeiro, M.B. Tsallis statistics in the income distribution of Brazil. Chaos Solitons Fractals 2016, 88, 158–171. [Google Scholar] [CrossRef]
- Oikonomou, N.; Provata, A.; Tirnakli, U. Nonextensive statistical approach to non-coding human DNA. Phys. A 2008, 387, 2653–2659. [Google Scholar] [CrossRef]
- Abe, S.; Suzuki, N. Itineration of the Internet over nonequilibrium stationary states in Tsallis statistics. Phys. Rev. E 2003, 67, 016106. [Google Scholar] [CrossRef]
- Preda, V.; Dedu, S.; Sheraz, M. New measure selection for Hunt-Devolder semi-Markov regime switching interest rate models. Phys. A 2014, 407, 350–359. [Google Scholar] [CrossRef]
- Tsallis, C. Introduction to Nonextensive Statistical Mechanics; Springer: Berlin/Heidelberg, Germany, 2009. [Google Scholar]
- Barbu, V.S.; Karagrigoriou, A.; Preda, V. Entropy, divergence rates and weighted divergence rates for Markov chains. I: The alpha-gamma and beta-gamma case. Proc. Rom. Acad. Ser. A Math. Phys. Tech. Sci. Inf. Sci. 2017, 18, 293–301. [Google Scholar]
- Barbu, V.S.; Karagrigoriou, A.; Preda, V. Entropy and divergence rates for Markov chains. II: The weighted case. Proc. Rom. Acad. Ser. A Math. Phys. Tech. Sci. Inf. Sci. 2018, 19, 3–10. [Google Scholar]
- Barbu, V.S.; Karagrigoriou, A.; Preda, V. Entropy and divergence rates for Markov chains. III: The Cressie and Read case and applications. Proc. Rom. Acad. Ser. A Math. Phys. Tech. Sci. Inf. Sci. 2018, 19, 413–421. [Google Scholar]
- Toma, A. Model selection criteria using divergences. Entropy 2014, 16, 2686–2698. [Google Scholar] [CrossRef]
- Toma, A.; Karagrigoriou, A.; Trentou, P. Robust model selection criteria based on pseudodistances. Entropy 2020, 22, 304. [Google Scholar] [CrossRef] [PubMed]
- Raşa, I. Convexity properties of some entropies. Results Math. 2018, 73, 105. [Google Scholar] [CrossRef]
- Raşa, I. Convexity properties of some entropies. II. Results Math. 2019, 74, 154. [Google Scholar] [CrossRef]
- Preda, V.; Dedu, S.; Iatan, I.; Dănilă Cernat, I.; Sheraz, M. Tsallis entropy for loss models and survival models involving truncated and censored random variables. Entropy 2022, 24, 1654. [Google Scholar] [CrossRef]
- Trivellato, B. The minimal k-entropy martingale measure. Int. J. Theor. Appl. Financ. 2012, 15, 1250038. [Google Scholar] [CrossRef]
- Trivellato, B. Deformed exponentials and applications to finance. Entropy 2013, 15, 3471–3489. [Google Scholar] [CrossRef]
- Hirică, I.-E.; Pripoae, C.-L.; Pripoae, G.-T.; Preda, V. Lie symmetries of the nonlinear Fokker-Planck equation based on weighted Kaniadakis entropy. Mathematics 2022, 10, 2776. [Google Scholar] [CrossRef]
- Pripoae, C.-L.; Hirică, I.-E.; Pripoae, G.-T.; Preda, V. Lie symmetries of the nonlinear Fokker-Planck equation based on weighted Tsallis entropy. Carpathian J. Math. 2022, 38, 597–617. [Google Scholar] [CrossRef]
- Iatan, I.; Dragan, M.; Dedu, S.; Preda, V. Using probabilistic models for data compression. Mathematics 2022, 10, 3847. [Google Scholar] [CrossRef]
- Wang, X.; Li, Y.; Qiao, Q.; Tavares, A.; Liang, Y. Water quality prediction based on machine learning and comprehensive weighting methods. Entropy 2023, 25, 1186. [Google Scholar] [CrossRef] [PubMed]
- Ebrahimi, N. How to measure uncertainty in the residual lifetime distribution. Sankhyā A 1996, 58, 48–56. [Google Scholar]
- Ebrahimi, N.; Pellerey, F. New partial ordering of survival functions based on the notion of uncertainty. J. Appl. Probab. 1995, 32, 202–211. [Google Scholar] [CrossRef]
- Sunoj, S.M.; Sankaran, P.G. Quantile based entropy function. Statist. Probab. Lett. 2012, 82, 1049–1053. [Google Scholar] [CrossRef]
- Nanda, A.K.; Sankaran, P.G.; Sunoj, S.M. Rényi’s residual entropy: A quantile approach. Statist. Probab. Lett. 2014, 85, 114–121. [Google Scholar] [CrossRef]
- Yan, L.; Kang, D.-T. Some new results on the Rényi quantile entropy ordering. Stat. Methodol. 2016, 33, 55–70. [Google Scholar] [CrossRef]
- Sfetcu, S.-C. Varma quantile entropy order. Analele Ştiinţifice Univ. Ovidius Constanţa 2021, 29, 249–264. [Google Scholar] [CrossRef]
- Sfetcu, R.-C.; Sfetcu, S.-C.; Preda, V. Ordering Awad-Varma entropy and applications to some stochastic models. Mathematics 2021, 9, 280. [Google Scholar] [CrossRef]
- Furuichi, S.; Minculete, N.; Mitroi, F.-C. Some inequalities on generalized entropies. J. Inequal. Appl. 2012, 2012, 226. [Google Scholar] [CrossRef]
- Furuichi, S.; Minculete, N. Refined Young inequality and its application to divergences. Entropy 2021, 23, 514. [Google Scholar] [CrossRef]
- Răducan, A.M.; Rădulescu, C.Z.; Rădulescu, M.; Zbăganu, G. On the probability of finding extremes in a random set. Mathematics 2022, 10, 1623. [Google Scholar] [CrossRef]
- Rădulescu, M.; Rădulescu, C.Z.; Zbăganu, G. Conditions for the existence of absolutely optimal portfolios. Mathematics 2021, 9, 2032. [Google Scholar] [CrossRef]
- Băncescu, I. Some classes of statistical distributions. Properties and applications. Analele Ştiinţifice Univ. Ovidius Constanţa 2018, 26, 43–68. [Google Scholar] [CrossRef]
- Catană, L.-I.; Răducan, A. Stochastic order for a multivariate uniform distributions family. Mathematics 2020, 8, 1410. [Google Scholar] [CrossRef]
- Catană, L.-I. Stochastic orders for a multivariate Pareto distribution. Analele Ştiinţifice Univ. Ovidius Constanţa 2021, 29, 53–69. [Google Scholar] [CrossRef]
- Suter, F.; Cernat, I.; Dragan, M. Some information measures properties of the GOS-concomitants from the FGM family. Entropy 2022, 24, 1361. [Google Scholar] [CrossRef] [PubMed]
- Rao, M.; Chen, Y.; Vemuri, B.C.; Wang, F. Cumulative residual entropy: A new measure of information. IEEE Trans. Inf. Theory 2004, 50, 1220–1228. [Google Scholar] [CrossRef]
- Sati, M.M.; Gupta, N. Some characterization results on dynamic cumulative residual Tsallis entropy. J. Probab. Stat. 2015, 8, 694203. [Google Scholar] [CrossRef]
- Rajesh, G.; Sunoj, S.M. Some properties of cumulative Tsallis entropy of order α. Stat. Pap. 2019, 60, 933–943. [Google Scholar] [CrossRef]
- Toomaj, A.; Atabay, H.A. Some new findings on the cumulative residual Tsallis entropy. J. Comput. Appl. Math. 2022, 400, 113669. [Google Scholar] [CrossRef]
- Kumar, V. Characterization results based on dynamic Tsallis cumulative residual entropy. Commun. Stat. Theory Methods 2017, 46, 8343–8354. [Google Scholar] [CrossRef]
- Di Crescenzo, A.; Longobardi, M. Entropy-based measure of uncertainty in past lifetime distributions. J. Appl. Probab. 2002, 39, 434–440. [Google Scholar] [CrossRef]
- Di Crescenzo, A.; Longobardi, M. On cumulative entropies. J. Stat. Plan. Inference 2009, 139, 4072–4087. [Google Scholar] [CrossRef]
- Sachlas, A.; Papaioannou, T. Residual and past entropy in actuarial science and survival models. Methodol. Comput. Appl. Probab. 2014, 16, 79–99. [Google Scholar] [CrossRef]
- Di Crescenzo, A.; Toomaj, A. Extension of the past lifetime and its connection to the cumulative entropy. J. Appl. Probab. 2015, 52, 1156–1174. [Google Scholar] [CrossRef]
- Nair, N.U.; Sankaran, P.G.; Balakrishnan, N. Quantile-Based Reliability Analysis; Springer: New York, NY, USA, 2013. [Google Scholar]
- Calì, C.; Longobardi, M.; Ahmadi, J. Some properties of cumulative Tsallis entropy. Phys. A 2017, 486, 1012–1021. [Google Scholar] [CrossRef]
- Khammar, A.H.; Jahanshahi, S.M.A. On weighted cumulative residual Tsallis entropy and its dynamic version. Phys. A 2018, 491, 678–692. [Google Scholar] [CrossRef]
- Sunoj, S.M.; Krishnan, A.S.; Sankaran, P.G. A quantile-based study of cumulative residual Tsallis entropy measures. Phys. A 2018, 494, 410–421. [Google Scholar] [CrossRef]
- Alomani, G.; Kayid, M. Further properties of Tsallis entropy and its application. Entropy 2023, 25, 199. [Google Scholar] [CrossRef]
- Baratpour, S.; Khammar, A.H. Results on Tsallis entropy of order statistics and record values. Istat. J. Turk. Stat. Assoc. 2016, 8, 60–73. [Google Scholar]
- Athreya, K.; Lahiri, S. Measure Theory and Probability Theory; Springer Science+Business Media, LLC.: New York, NY, USA, 2006. [Google Scholar]
- Shaked, M.; Shanthikumar, J.G. Stochastic Orders; Springer Science+Business Media, LLC.: New York, NY, USA, 2007. [Google Scholar]
- Navarro, J.; del Aguila, Y.; Asadi, M. Some new results on the cumulative residual entropy. J. Statist. Plann. Inference 2010, 140, 310–322. [Google Scholar] [CrossRef]
- Arnold, B.C.; Balakrishnan, N.; Nagaraja, H.N. Records; John Wiley & Sons: New York, NY, USA, 1998. [Google Scholar]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).