*2.2. Renyi Entropy*

Alfred Renyi introduced the generalization of Shannon entropy that maintains the additivity property [29]. For a finite set of $k$ probabilities $p_i$ with $i = 1, \cdots, k$, the Renyi entropy of degree $M$ is defined as

$$S^{\mathrm{R}}_M = \frac{1}{1 - M} \log \sum_i \left( p_i \right)^M, \tag{2}$$

with positive entropy order $M > 0$. The symbol $S^{\mathrm{R}}_M$ indicates that this is the original definition of Renyi entropy, to distinguish it from the simplified definition $S_M$ that we use in this paper. The constant prefactor $1/(1 - M)$ in Equation (2) has certain advantages. One advantage is that it compactifies the definitions of some other entropies in terms of Equation (2); i.e., the analytical continuation of Renyi entropy in the limit of $M$ approaching 1 ($\infty$) defines the Shannon (min) entropy. Another advantage of the prefactor is that it allows the quantity to be interpreted as a number of bits (we thank one of the referees for pointing out these remarks).
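The two limits mentioned above can be illustrated with a short numerical sketch (the three-outcome distribution below is a hypothetical example): $S^{\mathrm{R}}_M$ approaches the Shannon entropy as $M \to 1$ and the min entropy $-\ln \max_i p_i$ as $M \to \infty$.

```python
import math

def renyi_entropy(p, M):
    """Original Renyi entropy, Equation (2): log(sum_i p_i^M) / (1 - M)."""
    return math.log(sum(pi**M for pi in p)) / (1.0 - M)

def shannon_entropy(p):
    """Shannon entropy, -sum_i p_i ln p_i (natural logarithm)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]                # hypothetical distribution
print(renyi_entropy(p, 1.0001))    # close to the Shannon entropy
print(shannon_entropy(p))
print(renyi_entropy(p, 200.0))     # close to the min entropy -ln(max_i p_i)
print(-math.log(max(p)))
```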

Here, we present a simplified version of the definition. The logic behind this simplification is that evaluating the limits requires L'Hopital's rule; i.e., $S_{\text{Shannon},\min} = \lim_{M \to 1,\infty} S^{\mathrm{R}}_M = -\lim_{M \to 1,\infty} d\big(\log \sum_i (p_i)^M\big)/dM$. We define a rescaled Renyi entropy, which differs from the original definition by a prefactor $1/(M - 1)$:

$$S_M = -\log \sum_i \left( p_i \right)^M. \tag{3}$$
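As a quick consistency check of the rescaling, the sketch below (with a hypothetical distribution and order) verifies that Equation (3) equals $(M - 1)$ times Equation (2):

```python
import math

def renyi_original(p, M):
    """Equation (2): S^R_M = log(sum_i p_i^M) / (1 - M)."""
    return math.log(sum(pi**M for pi in p)) / (1.0 - M)

def renyi_rescaled(p, M):
    """Equation (3): S_M = -log(sum_i p_i^M)."""
    return -math.log(sum(pi**M for pi in p))

p, M = [0.5, 0.3, 0.2], 2.0   # hypothetical distribution and entropy order
print(renyi_rescaled(p, M))
print((M - 1) * renyi_original(p, M))   # the two printed values coincide
```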

The reason to define the simplified formula is that evaluating the entropy itself is beyond the scope of this paper. Instead, we need to find the time derivative of the entropy (i.e., the entropy flow). Due to the presence of the logarithm in Equation (3), any constant prefactor in the definition of the entropy cancels between the numerator and denominator of the entropy flow. The only caveat is that we must keep in mind that the Shannon entropy is reproduced by taking $dS_M/dM$ in the limit of $M \to 1$. In fact, given that $dx^M/dM = d \exp(M \ln x)/dM = x^M \ln x$, one can write

$$\lim_{M \to 1} \frac{dS_M}{dM} = -\lim_{M \to 1} \frac{\sum_i \left(p_i\right)^M \ln p_i}{\sum_i \left(p_i\right)^M} = -\sum_i p_i \ln p_i = S_{\text{Shannon}}.\tag{4}$$
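Equation (4) can also be verified numerically, approximating $dS_M/dM$ at $M = 1$ by a central finite difference; the sketch below uses a hypothetical distribution:

```python
import math

def s_rescaled(p, M):
    """Rescaled Renyi entropy, Equation (3): S_M = -ln(sum_i p_i^M)."""
    return -math.log(sum(pi**M for pi in p))

def dS_dM(p, M, h=1e-6):
    """Central finite difference approximating dS_M/dM."""
    return (s_rescaled(p, M + h) - s_rescaled(p, M - h)) / (2.0 * h)

p = [0.5, 0.3, 0.2]   # hypothetical distribution
print(dS_dM(p, 1.0))                              # derivative at M = 1
print(-sum(pi * math.log(pi) for pi in p))        # Shannon entropy; the two agree
```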

In the rest of the paper, we use the simplified definition. However, given that the difference between the two definitions is only a constant prefactor, the reader may use either definition, subject to the discussion above.

In a point contact, given that the Renyi entropy is additive for independent attempts, the total Renyi entropy after $N$ uncorrelated attempts will be $S_M = -N \log\left(p^M + (1 - p)^M\right)$. In a classical heat reservoir, the Renyi entropy is closely related to the free energy. Consider a bath at temperature $T$ with a large number of energy states with energies $\epsilon_i$. The corresponding Gibbs probabilities are $p_i = \exp(-\epsilon_i/T)/Z(T)$, where $Z(T) \equiv \sum_i \exp(-\epsilon_i/T)$ is the corresponding partition function. The Renyi entropy of the heat bath is $S_M = -\ln\left(\sum_i \exp(-M \epsilon_i/T)\right) + M \ln Z(T)$. The free energy is $F(T) = -T \ln Z(T)$, which is related to the Renyi entropy as $S_M = (M/T)\left(F(T/M) - F(T)\right)$, i.e., it is proportional to the free energy difference at temperatures $T/M$ and $T$.
