*2.1. Classical Entropy*

Many systems in classical physics carry entropy. Among the most studied are charge transport at a point contact [22,23], energy transport in heat engines [24], and a gravitational hypersurface falling into a black hole [25–28]. For simplicity of the discussion, let us review classical entropy by means of the example of charge transport through a point contact. Consider for this purpose two large conductor plates connected at a single point, the so-called 'point contact system'. This classical point contact either transmits a charged particle with probability *p* or blocks the transmission with probability 1 − *p*. Let *N* transmission attempts take place. For *N* ≫ 1, it is most likely that in *pN* out of the *N* attempts the particles are successfully transferred and in the remaining (1 − *p*)*N* attempts they are not. For unmarked (indistinguishable) particles the order of events does not matter; therefore, the number of configurations with *pN* transfers out of *N* attempts is

$$\mathcal{N} = \binom{N}{pN} \approx \frac{N^N}{(pN)^{pN} \left[ (1-p)N \right]^{(1-p)N}} = \frac{1}{p^{pN} \left( 1-p \right)^{(1-p)N}}.\tag{1}$$

This number grows exponentially with *N*. In order to obtain a quantity that grows only linearly with *N*, we take its logarithm. This defines the so-called Shannon entropy, i.e., $S_{\text{Shannon}} = \log_2 \mathcal{N} = -N\left[p \log_2 p + (1-p)\log_2(1-p)\right]$.
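As a numerical check (not part of the original derivation), the exact $\log_2$ of the binomial coefficient in Eq. (1) can be compared with the Shannon entropy; the function names below are illustrative. A minimal Python sketch:

```python
import math

def log2_microstates(N, p):
    # Exact log2 of the number of configurations C(N, pN) from Eq. (1).
    # math.log2 handles arbitrarily large Python integers without overflow.
    k = round(p * N)
    return math.log2(math.comb(N, k))

def shannon_entropy(N, p):
    # Stirling-approximated result: -N [p log2 p + (1-p) log2 (1-p)]
    return -N * (p * math.log2(p) + (1 - p) * math.log2(1 - p))

N, p = 10_000, 0.3
print(log2_microstates(N, p), shannon_entropy(N, p))
```

The two values agree to better than a percent already for *N* = 10 000, and the relative difference (due to the subleading terms dropped in Stirling's approximation) shrinks further as *N* grows.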

The linear dependence of the Shannon entropy on the number of attempts *N* reflects its additivity. The definition of entropy can be generalized to extended geometries such as a (*k* + 1)-terminal contact that connects one reservoir to *k* others. In this case, transmission from the reservoir to the *n*-th of the other *k* reservoirs occurs with probability $p_n$, and the entropy generalizes to $S_{\text{Shannon}} = -N \sum_{n=1}^{k} p_n \log_2 p_n$. This entropy may vary in time. One possible reason for such variation is time-dependent probabilities $p_n(t)$. Another possibility is the presence of a bias in controlling the system: for example, suppose that after one successful transfer the transmission is reduced or closed for a rather long time before it opens again for another attempt. The entropy of such a system then depends on whether or not a successful transfer has taken place in the past.

In this paper, what we call entropy production refers to the time variation of the partial entropy associated with a part of a closed system. Moreover, as stated in the Introduction, we are only interested in the time variation of entropy in thermodynamic systems such as heat baths; our focus is therefore on thermodynamic entropies and their time evolution, namely 'entropy production'. Although in this section we discuss the Shannon entropy $S_{\text{Shannon}}$, which is measured in bits, in the rest of the paper we study the von Neumann thermodynamic entropy, measured in units of Joule per Kelvin. The two are related by the Boltzmann constant $k_B$, i.e., $S_{\text{Thermodynamic}} = k_B S_{\text{Shannon}}$. Without loss of generality, we use the convention $k_B = 1$, although the reader should keep in mind that, in this paper, we are interested in changes in thermodynamic entropy flow as the result of energy exchange processes.
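The unit conversion between the two entropies can be sketched as follows. Note one assumption not spelled out in the text: if the Shannon entropy is measured in bits (base-2 logarithm) while the thermodynamic entropy uses the natural logarithm, an extra factor of ln 2 enters the conversion.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact in the 2019 SI)

def thermodynamic_entropy(shannon_bits):
    # Assumption: S_thermo = k_B * ln(2) * S_Shannon when S_Shannon is in bits,
    # since the thermodynamic definition uses the natural logarithm.
    return K_B * math.log(2) * shannon_bits

print(thermodynamic_entropy(1.0))  # thermodynamic entropy of one unbiased bit, in J/K
```

With the convention $k_B = 1$ adopted in the paper, this prefactor is dropped and entropies are quoted as dimensionless numbers.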
