Article

Quantum Dynamical Entropies and Gács Algorithmic Entropy

Fabio Benatti 1,2
1 Department of Physics, University of Trieste, Strada Costiera 11, I-34151 Trieste, Italy
2 INFN, Trieste, Strada Costiera 11, I-34151 Trieste, Italy
Entropy 2012, 14(7), 1259-1273; https://doi.org/10.3390/e14071259
Submission received: 13 April 2012 / Revised: 8 June 2012 / Accepted: 3 July 2012 / Published: 12 July 2012
(This article belongs to the Special Issue Concepts of Entropy and Their Applications)

Abstract

Several quantum dynamical entropies have been proposed that extend the classical Kolmogorov–Sinai (dynamical) entropy. The same scenario appears in relation to the extension of algorithmic complexity theory to the quantum realm. A theorem of Brudno establishes that the complexity per unit time step along typical trajectories of a classical ergodic system equals the KS-entropy. In the following, we establish a similar relation between the Connes–Narnhofer–Thirring quantum dynamical entropy for the shift on quantum spin chains and the Gács algorithmic entropy. We further provide, for the same system, a weaker linkage between the latter algorithmic complexity and a different quantum dynamical entropy proposed by Alicki and Fannes.

1. Introduction

Several proposals have been put forward with the aim of extending the dynamical entropy of Kolmogorov and Sinai (KS-entropy) [1] to the quantum realm [2,3,4,5,6]. The KS-entropy is a dynamical invariant and a powerful indicator of the randomness of a classical dynamical system as it accounts for the predictability of the future based on the knowledge of the past. Roughly speaking, the KS-entropy measures our knowledge about the next step along a typical phase-space trajectory provided one knows the trajectory’s past. Since in quantum mechanics there are neither phase-space nor trajectories and, moreover, observations perturb the observed system, many different non-commutative extensions can be envisaged, all of them reducing to the KS-invariant in the case of classical, that is, commutative systems.
While the KS-entropy is related to the rate at which information is produced by the dynamics with respect to an equilibrium state, algorithmic complexity theory was developed by Kolmogorov, Solomonoff and Chaitin [7] in order to qualify and quantify the randomness of individual objects, say binary strings, independently of the statistical ensemble of which they may be part, that is, independently of any a-priori probability distribution. In fact, probability theory cannot sort out regular from random binary strings; for instance, in the case of a fair coin tossing, all strings of $N$ zeroes and ones have the same probability $2^{-N}$. Instead, algorithmic complexity measures their randomness by the difficulty of describing them by means of programs that, run on a Universal Turing Machine (UTM), reproduce the target string.
The connection between the KS-entropy and classical algorithmic complexity was established by a theorem of Brudno [8,9], who proved that for ergodic classical systems the algorithmic complexity per unit step of typical trajectories equals the KS-entropy.
The quantum extensions of the KS-entropy in [2,3,4,5,6] were formulated before quantum information and computation theory revolutionized our views concerning the transmission and manipulation of information [10]. The mere possibility of having at one's disposal quantum computers that might outperform classical ones made it necessary to confront the notion of algorithmic complexity with the new avenues opened up by non-commutativity. The development of quantum information raised the question of what impact the existence of quantum computers may have on the algorithmic complexity of classical and quantum states [11]; in other words, can one make do with classical algorithmic complexity, or is it necessary to formulate a quantum algorithmic complexity that generalizes the classical notion? If one pursues the second objective, the same scenario appears as for the quantum extension of the KS-entropy: several inequivalent and more or less related quantities have been put forward [12,13,14,15].
In the very special case of the space-translations over quantum spin chains, a quantum analogue of Brudno’s relation was proved to exist [16,17] between Connes–Narnhofer–Thirring [2] (CNT-) entropy and Berthiaume–van Dam–Laplante algorithmic complexity [13]; in the following we shall instead consider the Gács algorithmic entropy [14] and show how it is related to the CNT-entropy and the entropy of Alicki and Fannes (AF-entropy).
The paper is organized as follows: in Section 2, we briefly survey certain basic aspects of quantum mechanics and quantum information, in particular the notion of quantum spin chain and its von Neumann entropy rate. In Section 3, we first reformulate the notion of classical stationary source in a way where it appears as a particular realization of a quantum one; then, we summarize the fundamentals of algorithmic complexity theory and its relation to Shannon entropy and algorithmic probability. In Section 4, we introduce the AF-entropy, compute it for a binary quantum spin chain, explain the difference between the latter and the CNT-entropy and relate it to the measurement processes. In Section 5, we introduce Gács algorithmic entropy and establish a relation between it and the CNT and AF-entropies. Finally, in the conclusions we comment on the difficulties and the necessary steps to arrive at a full quantum Brudno theorem.

2. Qubits: Fundamentals

The generalization of a classical bit is a qubit, namely any quantum degree of freedom that can be described by a two-dimensional Hilbert space, such as a spin-1/2 particle, a photon polarization or a two-level atom. A qubit vector state is thus an element of a two-dimensional Hilbert space:

$$\mathbb{H} = \mathbb{C}^2 \ni |\psi\rangle = \begin{pmatrix} x \\ y \end{pmatrix} = x\begin{pmatrix}1\\0\end{pmatrix} + y\begin{pmatrix}0\\1\end{pmatrix}, \qquad |x|^2 + |y|^2 = 1 \tag{1}$$
Classical bits can then be identified with the elements of the orthonormal basis $\binom{1}{0}$, $\binom{0}{1}$, while generic qubit (vector) states are as in (1). Classical binary strings of length $n$, $i^{(n)} = i_1 i_2 \cdots i_n$, will then correspond to the tensor product states $|i^{(n)}\rangle = \bigotimes_{\ell=1}^{n}|i_\ell\rangle$ in the Hilbert space $(\mathbb{C}^2)^{\otimes n} = \mathbb{C}^{2^n}$. Though these states are easily associated to familiar concepts from classical information theory, generic linear combinations of these elementary states escape such an association. Even more, Hilbert space vectors are described by pure state projectors $P_\psi = |\psi\rangle\langle\psi|$, and generic quantum states by mixed states or density matrices, $\rho = \sum_j \lambda_j P_{\psi_j}$. These are convex combinations of projections and are (non-uniquely) associated with statistical ensembles of pure states $P_{\psi_j}$ mixed with weights $\lambda_j \geq 0$, $\sum_j \lambda_j = 1$.
An important indicator of the mixedness of a state $\rho$ is its von Neumann entropy:

$$S(\rho) = -\mathrm{Tr}\,\rho\log\rho = -\sum_{i=1}^{2} r_i\log r_i \tag{2}$$

where the $r_i$ are the eigenvalues of $\rho$, $\rho|r_i\rangle = r_i|r_i\rangle$, with orthonormal eigenvectors, $\langle r_i|r_j\rangle = \delta_{ij}$. In the following, $\log$ will denote the logarithm in base 2: $\log 2 = 1$.
Any quantum mechanical process can be schematized in three steps: a preparation process, a time-evolution and a measurement process. Differently from the time-evolution, preparation and measurement are always irreversible processes. The latter is the more relevant of the two as it corresponds to reading the system, an irreversible quantum process that collapses the system states from pure ones into mixtures. These processes, also known as wave-packet reductions, correspond to the following completely positive maps [17,18]

$$P_\psi \mapsto \mathbb{M}[P_\psi] = \sum_i X_i\,|\psi\rangle\langle\psi|\,X_i^\dagger, \qquad \sum_i X_i^\dagger X_i = \mathbb{1} \tag{3}$$

The set $X = \{X_i\}_{i\in I}$ of bounded operators on the system Hilbert space is known as a POVM, and the corresponding processes in general increase the entropy: $S(\mathbb{M}[P_\psi]) \geq S(P_\psi) = 0$.
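As a concrete numerical illustration of (2) and (3) — a sketch added here, not part of the original text, assuming NumPy is available and with illustrative helper names — the projective POVM $\{|0\rangle\langle 0|,\,|1\rangle\langle 1|\}$ is applied to a pure qubit state and the von Neumann entropy is seen to increase from zero:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho, cf. (2)."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]            # discard numerically zero eigenvalues
    return float(-np.sum(evals * np.log2(evals)))

def wave_packet_reduction(rho, kraus_ops):
    """M[rho] = sum_i X_i rho X_i^dagger with sum_i X_i^dagger X_i = 1, cf. (3)."""
    return sum(X @ rho @ X.conj().T for X in kraus_ops)

# pure qubit state |psi> = (|0> + |1>)/sqrt(2) and its projector P_psi
psi = np.array([1.0, 1.0]) / np.sqrt(2)
P_psi = np.outer(psi, psi.conj())

# projective POVM onto the computational basis: X_0 = |0><0|, X_1 = |1><1|
povm = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]

print(von_neumann_entropy(P_psi))                               # 0.0 (pure state)
print(von_neumann_entropy(wave_packet_reduction(P_psi, povm)))  # 1.0, entropy has increased
```

For this balanced superposition the reduced state is maximally mixed, so the entropy grows from 0 to 1 bit, the maximum for a single qubit.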

Quantum Information Sources

From the algebraic point of view, a qubit is described by the algebra $M_2(\mathbb{C})$ of $2\times 2$ complex matrices acting on $\mathbb{C}^2$; whence $n$ qubits are described by the tensor product algebra

$$M_{[1,n]} = M_2(\mathbb{C})\otimes M_2(\mathbb{C})\otimes\cdots\otimes M_2(\mathbb{C}) = M_2(\mathbb{C})^{\otimes n} = M_{2^n}(\mathbb{C}) \tag{4}$$
also called the local spin algebra. It proves convenient to picture it as describing $n$ qubits located at the sites $\ell = 0$ to $\ell = n-1$ of a one-dimensional lattice. Of course, one can locate $n$ qubits along any segment $[-p, -p+n-1]$; letting $n = 2p$, the family of local spin algebras $M_{[-p,p-1]}$ can be completed, via an inductive limit, to an infinite-dimensional $C^*$ algebra [18,19], or two-sided quantum spin chain,

$$\mathcal{M}_2 := \lim_{p\to\infty} M_{[-p,p-1]} \tag{5}$$

In such a context the qubit at site 0 is described by observables of the form $\mathbb{1}_{-1]}\otimes A\otimes\mathbb{1}_{[1}$, where $A\in M_2(\mathbb{C})$, $\mathbb{1}_{-1]}$ stands for the infinite tensor product of identity matrices up to site $-1$, and $\mathbb{1}_{[1}$ for the infinite tensor product of identity matrices from site 1 onwards.
The simplest dynamics on such a quantum spin chain is given by the right shift
$$\Theta\big[M_{[-p,p-1]}\big] = M_{[-p+1,p]}, \qquad \Theta\big[\mathbb{1}_{-p-1]}\otimes A_{[-p,p-1]}\otimes\mathbb{1}_{[p}\big] = \mathbb{1}_{-p]}\otimes A_{[-p,p-1]}\otimes\mathbb{1}_{[p+1} \tag{6}$$
Any probability distribution over a classical source is assigned by fixing the probability measures of the local configurations; in the same vein, a global state on the quantum spin chain is determined by local density matrices on $M_{[-p,p-1]}$. Translation-invariant states on $\mathcal{M}_2$ are then equivalent to a family of density matrices $\rho^{(n)}$ defining the quantum statistics of all local algebras $M_{[-p,-p+n-1]}$, independently of $p$; these density matrices determine a global, shift-invariant state $\omega$ on $\mathcal{M}_2$, in the sense that

$$\omega\big(A_{[0,n-1]}\big) = \mathrm{Tr}\big(\rho^{(n)} A_{[0,n-1]}\big) = \omega\big(\Theta^p[A_{[0,n-1]}]\big)$$

for all $A_{[0,n-1]}\in M_{[0,n-1]}$ and $p\geq 0$.
To such a global state one associates the von Neumann entropy rate

$$s(\omega) = \lim_{n\to+\infty}\frac{1}{n}S\big(\rho^{(n)}\big) = -\lim_{n\to+\infty}\frac{1}{n}\mathrm{Tr}\,\rho^{(n)}\log\rho^{(n)} \tag{7}$$
that corresponds to the entropy rate of classical stationary sources [20].
The entropy per unit step s ( ω ) is also the quantum dynamical entropy of the shift as proposed by Connes, Narnhofer and Thirring in [2], but differs, as we shall see, from the quantum dynamical entropy of Alicki and Fannes [3].
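For a translation-invariant product state, the limit in (7) is already attained at finite $n$; the following small numerical sketch (an addition for illustration only, assuming NumPy) checks that $S(\rho^{(n)})/n$ stays equal to $S(\rho)$ for $\rho^{(n)} = \rho^{\otimes n}$:

```python
import numpy as np
from functools import reduce

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

rho = np.diag([0.8, 0.2])                    # single-site density matrix

for n in range(1, 6):
    rho_n = reduce(np.kron, [rho] * n)       # local state rho^(n) = rho ⊗ ... ⊗ rho
    print(n, von_neumann_entropy(rho_n) / n) # constant: s(omega) = S(rho) ≈ 0.7219 bits per site
```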

3. Classical Algorithmic Complexity and Entropy

In the light of the previous section, classical information sources can be accounted for as commutative spin chains, consisting of spins at each site that can be either up, $|i\rangle$ with $i=0$, or down, $|i\rangle$ with $i=1$, along a fixed direction in space. Namely, at each site of the lattice one locates a diagonal $2\times 2$ matrix algebra, and diagonal $p$-site algebras are equipped with diagonal $2^p$-dimensional density matrices. Finite binary strings $i^{(n)}\in\Omega_2^n$ correspond to projectors that are tensor products of single bit projectors:

$$|i^{(n)}\rangle\langle i^{(n)}| = |i_1\rangle\langle i_1|\otimes|i_2\rangle\langle i_2|\otimes\cdots\otimes|i_n\rangle\langle i_n| \tag{8}$$

local probability distributions $\mu^{(n)} = \{\mu(i^{(n)})\}$ to diagonal density matrices

$$\rho_\mu^{(n)} = \sum_{i^{(n)}}\mu(i^{(n)})\,|i^{(n)}\rangle\langle i^{(n)}| \tag{9}$$

whose von Neumann entropy equals the Shannon entropy of $\mu^{(n)}$,

$$S\big(\rho_\mu^{(n)}\big) = -\sum_{i^{(n)}}\mu(i^{(n)})\log\mu(i^{(n)}) = H\big(\mu^{(n)}\big) \tag{10}$$
If the local probability distributions give rise to a translation-invariant global probability measure $\mu$, then the stationary source is characterized by the (Shannon) entropy rate

$$h(\mu) = \lim_{n\to+\infty}\frac{1}{n}H\big(\mu^{(n)}\big) \tag{11}$$
The entropy rate of a classical stationary source is just the simplest example of KS-entropy, here the dynamics corresponding to the shift along the words emitted by the source, or along the diagonal spin chain in the quantum-like approach proposed above. In general, for a dynamical system equipped with an equilibrium state, one reduces to a suitable shift over a symbolic model by means of a finite partition of the system phase-space [1].
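The correspondence (8)–(10) can be made concrete with a few lines of code (again an illustrative addition, not part of the paper; the distribution below is an arbitrary example): a probability distribution over two-bit strings is embedded as the diagonal density matrix (9), whose von Neumann entropy reproduces the Shannon entropy (10).

```python
import numpy as np

# a correlated probability distribution mu^(2) over the two-bit strings 00, 01, 10, 11
mu2 = {'00': 0.4, '01': 0.1, '10': 0.1, '11': 0.4}

def shannon_entropy(mu):
    """H(mu) = -sum_x mu(x) log2 mu(x)."""
    p = np.array([q for q in mu.values() if q > 0])
    return float(-np.sum(p * np.log2(p)))

def diagonal_density_matrix(mu):
    """rho_mu = sum_x mu(x) |x><x| with |x> computational basis vectors, cf. (9)."""
    return np.diag([prob for _, prob in sorted(mu.items())])

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

print(shannon_entropy(mu2))                               # ≈ 1.722 bits
print(von_neumann_entropy(diagonal_density_matrix(mu2)))  # same value, as in (10)
```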

3.1. Algorithmic Complexity

In algorithmic complexity theory as developed by Kolmogorov, Solomonoff and Chaitin, one attributes to a binary string $i^{(n)}\in\Omega_2^n$ of length $n$ a complexity $K(i^{(n)})$ measured by the length of any shortest program $p^*$, that is, of another binary string of length $\ell(p^*)$ that, when read by a prefix Universal Turing Machine (UTM) $\mathcal{U}$, outputs $i^{(n)}$, $\mathcal{U}[p^*] = i^{(n)}$:

$$K\big(i^{(n)}\big) = \min\big\{\ell(p)\,:\,\mathcal{U}[p] = i^{(n)}\big\} \tag{12}$$

The prefix property means that if $\mathcal{U}$ halts on a program $p$, it does not continue to read if another program $q$ is appended to $p$; in other words, no halting program can be the prefix of another halting program.
Suppose one has an (algorithmically) computable probability distribution $\mu^{(n)} = \{\mu(i^{(n)})\}$ on the statistical ensemble of strings of length $n$, $\Omega_2^n = \{0,1\}^n$. Then, the Shannon entropy is essentially the average Kolmogorov complexity; specifically,

$$\Big|\,H\big(\mu^{(n)}\big) - \sum_{i^{(n)}\in\Omega_2^n}\mu(i^{(n)})\,K(i^{(n)})\,\Big| \leq K\big(\mu^{(n)}\big) \tag{13}$$
where K ( μ ( n ) ) is the complexity of the computable probability distribution [7]. Brudno [9] proved that, for ergodic sources, the difference disappears if one considers the rates. Actually, Brudno proved much more as his theorem establishes a relation between the entropy rate and the algorithmic information per unit step of almost all trajectories with respect to the ergodic measure of the source:
$$h(\mu) = \lim_{n\to+\infty}\frac{1}{n}H\big(\mu^{(n)}\big) = k(\boldsymbol{i}) := \lim_{n\to+\infty}\frac{1}{n}K\big(i^{(n)}\big) \tag{14}$$

for almost all sequences $\boldsymbol{i}$ with respect to the measure $\mu$, that is, the set of sequences where the equality fails has measure 0.
As already noticed, the Shannon entropy rate h ( μ ) is the Kolmogorov–Sinai entropy associated to the shift dynamics; therefore, the source entropy rate can also be interpreted as the maximal information provided by the shift dynamics per unit step of time. Indeed, Brudno’s theorem shows that the equality holds between the KS-entropy, that is the dynamical entropy, and the algorithmic complexity rate along almost all trajectories of a generic ergodic classical dynamical system.
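Kolmogorov complexity is not computable, so Brudno's relation (14) cannot be tested directly; as a rough illustration only (an addition to the original text), one can replace $K(i^{(n)})$ by the length of a losslessly compressed encoding, which upper-bounds the complexity up to an additive constant. For an i.i.d. Bernoulli source the compressed length per bit then hovers slightly above the entropy rate $h(\mu) = H(p)$:

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)
p = 0.1                                              # Bernoulli(p) source
h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)       # Shannon entropy rate H(p) ≈ 0.469 bits

for n in (10_000, 100_000, 1_000_000):
    bits = rng.random(n) < p                         # one "typical" string of length n
    packed = np.packbits(bits).tobytes()             # 8 source bits per byte
    compressed_bits = 8 * len(zlib.compress(packed, 9))
    # compressed length per source bit: an upper bound that roughly tracks H(p)
    print(n, round(compressed_bits / n, 3), round(h, 3))
```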

3.2. Universal Probability

With prefix UTMs one attributes a universal probability [7] $\mathbb{P}(i^{(n)})$ to all strings $i^{(n)}\in\Omega_2^n$:

$$\mathbb{P}\big(i^{(n)}\big) := \sum_{p\,:\,\mathcal{U}[p]=i^{(n)}} 2^{-\ell(p)}, \qquad \mathcal{U}\ \text{a prefix UTM} \tag{15}$$

Notice that $\sum_{i^{(n)}}\mathbb{P}(i^{(n)}) \leq 1$ because the prefix property implies Kraft's inequality [20]. The map $\mathbb{P}$ defines a semi-computable semi-measure on $\Omega_2^n$: indeed, it is not normalized and can be approximated only from below, because we cannot be sure that $\mathcal{U}$ will ever halt on a given program (Halting Problem) [7]. Furthermore, for all other semi-computable semi-measures $\mu$ there exists a constant $C(\mu) > 0$, depending only on $\mu$ and not on the input string $i^{(n)}$, such that

$$C(\mu)\,\mu\big(i^{(n)}\big) \leq \mathbb{P}\big(i^{(n)}\big) \tag{16}$$
This relation exhibits the universality of the algorithmic probability. Most important of all is the connection between universal probability and algorithmic complexity:
$$\big|\,K\big(i^{(n)}\big) + \log\mathbb{P}\big(i^{(n)}\big)\,\big| \leq C \tag{17}$$

where $C$ is a constant independent of $i^{(n)}$.
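The reason why the sum in (15) cannot exceed 1 is Kraft's inequality: the halting programs of a prefix UTM form a prefix-free set of binary words. A two-function sketch (an illustrative addition; the program set is a toy example) checks both properties for a small prefix-free set:

```python
def is_prefix_free(words):
    """True if no word in the set is a proper prefix of another (the prefix property)."""
    return not any(a != b and b.startswith(a) for a in words for b in words)

def kraft_sum(words):
    """sum_p 2^{-len(p)}; Kraft's inequality bounds this by 1 for prefix-free sets."""
    return sum(2.0 ** -len(w) for w in words)

programs = {'0', '10', '110', '1110'}                 # toy prefix-free set of "programs"
print(is_prefix_free(programs), kraft_sum(programs))  # True 0.9375  (indeed <= 1)
```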

4. Alicki–Fannes Entropy

In a snapshot, the previous sections indicate that the complexity of the strings emitted by ergodic classical information sources is essentially captured by the Shannon entropy rate of the source; indeed, the ensemble point of view (Shannon entropy rate) and the individual point of view (algorithmic complexity) give the same complexity per unit step (Brudno’s theorem).
Moving from classical to quantum sources, the obvious question is: what measures the complexity of quantum information sources? A first answer is the von Neumann entropy rate; however, this quantifier is sensitive neither to the specific structure of the statistical ensemble nor to the objects (quantum states) that compose it. Furthermore, in the spirit behind the construction of the KS-entropy, predictions have to be made once some information has been gathered. However, unlike for classical systems, gathering information on a time-evolving quantum system unavoidably perturbs the state of the system. How, then, can measurement processes (3) be incorporated into the definition of the complexity of the shift along quantum spin chains? The answer is given by the Alicki–Fannes entropy [3,18], whose main ideas we now briefly summarize for the specific case of quantum spin chains.
We shall identify a quantum spin chain with the triple $(\mathcal{M}_2, \omega, \Theta)$ and denote as a partition of unit any finite collection

$$\mathcal{X} = \{X_i\}_{i=1}^{m}, \qquad \sum_{i=1}^{m}X_i^\dagger X_i = \mathbb{1}, \qquad X_i\in\mathcal{M}_2 \tag{18}$$

of local operators $X_i$ belonging to some local algebra $M_{[-p,p-1]}$. Notice that, by means of $\omega$ and $\mathcal{X}$, we can define an $m\times m$ density matrix $\rho[\mathcal{X}]$ with entries

$$\rho_{ij}[\mathcal{X}] = \omega\big(X_j^\dagger X_i\big) = \mathrm{Tr}\big(\rho^{(2p)}\,X_j^\dagger X_i\big) \tag{19}$$
and von Neumann entropy
$$S\big(\rho[\mathcal{X}]\big) = -\mathrm{Tr}\,\rho[\mathcal{X}]\log\rho[\mathcal{X}] \tag{20}$$
In order to introduce the dynamics into the game, we let the partitions of unit transform under the action of the dynamical map Θ in (6),
$$\Theta[\mathcal{X}] = \{\Theta[X_i]\}_{i=1}^{m} \tag{21}$$
Then, we refine the partitions of unit over the time steps from 0 to $n-1$ as follows

$$\mathcal{X}^{(n)} = \{X_{i^{(n)}}\}, \qquad X_{i^{(n)}} = \Theta^{(n-1)}\big[X_{i_{n-1}}\big]\cdots\Theta\big[X_{i_1}\big]\,X_{i_0}, \qquad i^{(n)} = i_0 i_1\cdots i_{n-1} \tag{22}$$

with $i^{(n)}$ an $m$-ary string in $\Omega_m^{(n)}$. The refined sets $\mathcal{X}^{(n)}$ are still partitions of unit according to (18); one can thus associate to them density matrices $\rho[\mathcal{X}^{(n)}]$ acting on $\mathbb{C}^{m^n}$, with entries as in (19) and von Neumann entropies $S[\mathcal{X}^{(n)}]$ as in (20).
Notice the analogy with using a partition of phase-space to associate a symbolic model to a classical dynamical system; here, the choice of a partition of unit $\mathcal{X}$ allows one to describe the quantum triple $(\mathcal{M}_2, \omega, \Theta)$ via a quantum symbolic model consisting of the family of density matrices $\rho[\mathcal{X}^{(n)}]$. The corresponding entropy rate is defined by

$$h_\omega^{AF}(\Theta, \mathcal{X}) = \limsup_{n}\frac{1}{n}S\big[\mathcal{X}^{(n)}\big] \tag{23}$$
The Alicki–Fannes entropy of $(\mathcal{M}_2, \omega, \Theta)$ is then defined by

$$h_\omega^{AF}(\Theta) = \sup_{\mathcal{X}} h_\omega^{AF}(\Theta, \mathcal{X}) \tag{24}$$

as the supremum over all possible finite partitions of unit $\mathcal{X}$ by local operators of the quantum spin chain.
Remark 1. The lim sup in (23) has to be used because the sequence of density matrices $\rho[\mathcal{X}^{(n)}]$ is not stationary [3,18]. In fact, while consistency holds, since tracing $\rho[\mathcal{X}^{(n)}]$ over the $n$-th factor yields the density matrix corresponding to the first $n-1$ factors, $\mathrm{Tr}_n\,\rho[\mathcal{X}^{(n)}] = \rho[\mathcal{X}^{(n-1)}]$, stationarity does not: in general, $\mathrm{Tr}_1\,\rho[\mathcal{X}^{(n)}] \neq \rho[\mathcal{X}^{(n-1)}]$. Hence, the density matrix for the local algebra $M_{[p,q]}$ corresponding to sites from $p$ to $q$ in principle depends on the starting factor $M_2(\mathbb{C})$ at site $p$.
As a concrete example, consider a set of 4 matrix units $U_{ij}\in M_2(\mathbb{C})$ such that $U_{ij}^\dagger = U_{ji}$, $U_{ij}U_{k\ell} = \delta_{jk}U_{i\ell}$ and $\sum_{i=1}^{2}U_{ii} = \mathbb{1}$. Dividing them by $\sqrt{2}$ one gets a partition of unit

$$\mathcal{U} = \Big\{\frac{U_{ij}}{\sqrt{2}}\Big\}_{i,j=1,2} \subset M_2(\mathbb{C})$$

the simplest choice being

$$U_{11} = \begin{pmatrix}1&0\\0&0\end{pmatrix}, \qquad U_{22} = \begin{pmatrix}0&0\\0&1\end{pmatrix}, \qquad U_{12} = U_{21}^\dagger = \begin{pmatrix}0&1\\0&0\end{pmatrix}$$
The refined partition that results after $n$ applications of the right shift is

$$\mathcal{U}^{(n)} = \Big\{\frac{U_{i^{(n)}j^{(n)}}}{2^{n/2}}\Big\}, \qquad U_{i^{(n)}j^{(n)}} = U_{i_0 j_0}\otimes U_{i_1 j_1}\otimes\cdots\otimes U_{i_{n-1}j_{n-1}} \in M_{2^n}(\mathbb{C}) = M_{[0,n-1]} \tag{25}$$
The associated density matrices $\rho[\mathcal{U}^{(n)}]\in M_{4^n}(\mathbb{C})$ have entries and von Neumann entropy given by

$$\frac{1}{2^n}\mathrm{Tr}\Big(\rho^{(n)}\,U_{i^{(n)}j^{(n)}}^\dagger\,U_{k^{(n)}\ell^{(n)}}\Big) = \frac{1}{2^n}\mathrm{Tr}\Big(\rho^{(n)}\,U_{j_0 i_0}U_{k_0\ell_0}\otimes U_{j_1 i_1}U_{k_1\ell_1}\otimes\cdots\Big) = \frac{1}{2^n}\mathrm{Tr}\Big(\rho^{(n)}\,\delta_{i_0 k_0}U_{j_0\ell_0}\otimes\delta_{i_1 k_1}U_{j_1\ell_1}\otimes\cdots\Big) = \frac{1}{2^n}\,\delta_{i^{(n)}k^{(n)}}\,\rho^{(n)}_{\ell^{(n)}j^{(n)}} \tag{26}$$

$$S\big(\rho[\mathcal{U}^{(n)}]\big) = S\big(\rho^{(n)}\big) + n \tag{27}$$
The last equality in (26) comes from the fact that the $\mathrm{Tr}\big(\rho^{(n)}U_{i^{(n)}j^{(n)}}\big)$ are the matrix elements of $\rho^{(n)}$ with respect to the orthonormal basis defined by the choice of matrix units. Then, entropy rate and Alicki–Fannes entropy are readily computed to be [3,18]

$$h_\omega^{AF}(\Theta) = h_\omega^{AF}(\Theta, \mathcal{U}) = \limsup_{n}\frac{1}{n}S\big(\rho[\mathcal{U}^{(n)}]\big) = s(\omega) + 1 \tag{28}$$
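Equations (26)–(28) can be checked numerically for small $n$. The sketch below (added here for illustration, assuming NumPy; the single-site state $\rho$ is an arbitrary choice) builds $\rho[\mathcal{U}^{(n)}]$ entry by entry from (19) with the refined matrix-unit partition (25) applied to the product state $\rho^{(n)} = \rho^{\otimes n}$, and verifies that its entropy equals $S(\rho^{(n)}) + n$:

```python
import numpy as np
from itertools import product
from functools import reduce

# matrix units U_ij = |i><j| in M_2(C); the single-site partition of unit is {U_ij / sqrt(2)}
U = {(i, j): np.outer(np.eye(2)[i], np.eye(2)[j]) for i in range(2) for j in range(2)}

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def af_density_matrix(rho_n, n):
    """rho[U^(n)]: entries Tr(rho^(n) X_b^dag X_a) with X_a = U_{i(n)j(n)} / 2^(n/2), cf. (19), (25)."""
    labels = list(product(range(2), repeat=2 * n))        # a = (i_0, j_0, ..., i_{n-1}, j_{n-1})
    ops = [reduce(np.kron, [U[(a[2 * t], a[2 * t + 1])] for t in range(n)]) / 2 ** (n / 2)
           for a in labels]
    return np.array([[np.trace(rho_n @ Xb.conj().T @ Xa) for Xb in ops] for Xa in ops])

rho = np.diag([0.8, 0.2])                                  # single-site state
for n in (1, 2):
    rho_n = reduce(np.kron, [rho] * n)                     # product state rho^(n)
    S_af = von_neumann_entropy(af_density_matrix(rho_n, n))
    print(n, S_af, von_neumann_entropy(rho_n) + n)         # equal: S(rho[U^(n)]) = S(rho^(n)) + n
```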
Remark 2. With respect to the AF-quantum dynamical entropy, the quantum dynamical entropy introduced by Connes, Narnhofer and Thirring [2] has a more complicated construction essentially due to the fact that it is based on a classical symbolic modeling with superimposed quantum corrections [17,21]. Consider a quantum spin chain endowed with a translation invariant state with sufficiently fast decaying correlations, the simplest case being the tensor product of a same density matrix ρ, that is
$$\omega\big(A_1\otimes A_2\otimes\cdots\otimes A_n\big) = \prod_{\ell=1}^{n}\mathrm{Tr}\big(\rho A_\ell\big) \tag{29}$$
In such cases, the CNT-entropy of the shift equals the von Neumann entropy rate (7), that is
$$h_\omega^{CNT}(\Theta) = s(\omega)\qquad \big(= -\mathrm{Tr}\,\rho\log\rho\ \text{in the product state case}\big) \tag{30}$$
One then notices that, for the shift on quantum spin chains, the AF-entropy exceeds the CNT-entropy by $1 = \log 2$. The origin of the difference lies in the fact that the AF-entropy accounts for the disturbances brought about by describing the quantum dynamics via a quantum symbolic model. Does this additional term have an interpretation in terms of quantum algorithmic complexity and, if so, in terms of which of the many proposals available on the market? In the following we shall seek answers to these questions. A preliminary step is to clarify that the extra term $1 = \log 2$ is a consequence of having introduced the measurement process through the partitions of unit.

AF-Entropy: Operational Interpretation

As we have seen in the second section, in quantum mechanics generic measurement processes on a system in a state ρ lead to a modification of the state as in (3). Suppose
$$M_{2^n}(\mathbb{C}) \ni \rho^{(n)} = \sum_i r_i^{(n)}\,|r_i^{(n)}\rangle\langle r_i^{(n)}| \tag{31}$$

is the spectral decomposition of a local state for $n$ qubits described by the local algebra $M_{[0,n-1]}$; any such mixed state can be purified, that is transformed into a projector, by coupling $M_{[0,n-1]}$ to itself and by doubling $\rho^{(n)}$ into

$$\mathbb{C}^{4^n} \ni |\rho^{(n)}\rangle = \sum_i \sqrt{r_i^{(n)}}\;|r_i^{(n)}\rangle\otimes|r_i^{(n)}\rangle \tag{32}$$

Given the refined partition of unit $\mathcal{U}^{(n)}$ in (25), one further amplifies the Hilbert space from $\mathbb{C}^{4^n}$ to $\mathbb{C}^{4^n}\otimes\mathbb{C}^{4^n}$ and constructs the following vector state

$$\mathbb{C}^{4^n}\otimes\mathbb{C}^{4^n} \ni |\Psi[\mathcal{U}^{(n)}]\rangle = \sum_i \sum_{(k^{(n)}\ell^{(n)})} \sqrt{r_i^{(n)}}\,\Big(\frac{U_{k^{(n)}\ell^{(n)}}}{2^{n/2}}\otimes\mathbb{1}\Big)\big(|r_i^{(n)}\rangle\otimes|r_i^{(n)}\rangle\big)\otimes|k^{(n)}\ell^{(n)}\rangle \tag{33}$$

where the vectors $|k^{(n)}\ell^{(n)}\rangle$, indexed by pairs of binary strings in $\Omega_2^{(n)}$, form an auxiliary orthonormal basis in $\mathbb{C}^{4^n}$ of cardinality $2^n\times 2^n$.
One thus sees that $|\Psi[\mathcal{U}^{(n)}]\rangle$ is the vector state of a three-partite system consisting of the $n$ qubits, system $I$, a copy of the latter, system $II$, and a copy of the first two, system $III$. From the projection $P = |\Psi[\mathcal{U}^{(n)}]\rangle\langle\Psi[\mathcal{U}^{(n)}]|$, by tracing over the first two systems, respectively over the last one, one obtains the following marginal states on $M_{[0,n-1]}\otimes M_{[0,n-1]}$,

$$\mathrm{Tr}_{I,II}(P) = \rho\big[\mathcal{U}^{(n)}\big] \tag{34}$$

$$\mathrm{Tr}_{III}(P) = \sum_{(k^{(n)}\ell^{(n)})}\Big(\frac{U_{k^{(n)}\ell^{(n)}}}{2^{n/2}}\otimes\mathbb{1}\Big)\,|\rho^{(n)}\rangle\langle\rho^{(n)}|\,\Big(\frac{U_{k^{(n)}\ell^{(n)}}^\dagger}{2^{n/2}}\otimes\mathbb{1}\Big) = R\big[\mathcal{U}^{(n)}\big] \tag{35}$$
Since they are marginal density matrices of a pure state, they have the same von Neumann entropy

$$S\big(\rho[\mathcal{U}^{(n)}]\big) = S\big(R[\mathcal{U}^{(n)}]\big) = S\big(\rho^{(n)}\big) + n \tag{36}$$
Hence, the entropy associated to $\omega$ and to the partition of unit $\mathcal{U}^{(n)}$, that is, the entropy of $\rho[\mathcal{U}^{(n)}]$, is also the entropy of the state $R[\mathcal{U}^{(n)}]$ which results from the action of the POVM $\big\{\frac{U_{k^{(n)}\ell^{(n)}}}{2^{n/2}}\otimes\mathbb{1}\big\}$ on the purified state $|\rho^{(n)}\rangle\langle\rho^{(n)}|$.
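The three-partite construction (31)–(36) can also be verified numerically for $n = 1$ (an illustrative addition under the same assumptions as above; the chosen $\rho$ is arbitrary and already diagonal, so its eigenbasis is the computational one):

```python
import numpy as np
from itertools import product

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

r = np.array([0.8, 0.2])                       # eigenvalues of the single-site state rho^(1)
basis = np.eye(2)

# purification |rho> = sum_i sqrt(r_i) |r_i> ⊗ |r_i> in C^4, cf. (32)
purified = sum(np.sqrt(r[i]) * np.kron(basis[i], basis[i]) for i in range(2))

# matrix units U_kl = |k><l|; POVM elements (U_kl / sqrt(2)) ⊗ 1 on C^4, cf. (25), (35)
U = {(k, l): np.outer(basis[k], basis[l]) for k in range(2) for l in range(2)}
aux = np.eye(4)                                # auxiliary orthonormal basis |kl>, cf. (33)

# |Psi[U^(1)]> = sum_(k,l) ((U_kl/sqrt(2)) ⊗ 1)|rho> ⊗ |kl>  in C^4 ⊗ C^4, cf. (33)
Psi = sum(np.kron(np.kron(U[(k, l)] / np.sqrt(2), np.eye(2)) @ purified, aux[2 * k + l])
          for k, l in product(range(2), repeat=2))
P = np.outer(Psi, Psi.conj())                  # pure state of the three-partite system I, II, III

P4 = P.reshape(4, 4, 4, 4)                     # indices: (I&II, III, I&II', III')
rho_U = np.einsum('abac->bc', P4)              # Tr_{I,II}(P) = rho[U^(1)], cf. (34)
R_U = np.einsum('abcb->ac', P4)                # Tr_{III}(P)  = R[U^(1)],  cf. (35)

S_rho = von_neumann_entropy(np.diag(r))
# all three numbers coincide, illustrating (36): S(rho[U^(1)]) = S(R[U^(1)]) = S(rho^(1)) + 1
print(von_neumann_entropy(rho_U), von_neumann_entropy(R_U), S_rho + 1)
```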

5. Quantum Dynamical Entropies and Gács Complexity

In order to extend classical algorithmic complexity theory to the quantum setting, P. Gács [14] started from the notion of universal probability and its relation to algorithmic complexity embodied by (17). He introduced the notion of a universal semi-density matrix; it goes as follows. First, one defines the elementary vectors as those $n$-qubit vector states $|\Psi\rangle\in\mathbb{C}^{2^n}$ that have a representation along a fixed orthonormal basis $\{|\varphi_j\rangle\}_j$ in terms of computable coefficients that are algebraic numbers,

$$|\Psi\rangle = \sum_j C_j^\Psi\,|\varphi_j\rangle, \qquad C_j^\Psi\ \text{algebraic numbers}$$
These elementary vectors can thus be encoded by finite binary strings,
$$|\Psi\rangle \longleftrightarrow i_\Psi \in \Omega_2^p, \qquad p < +\infty$$

and characterized by algorithmic complexities $K(i_\Psi)$ and universal probabilities $\mathbb{P}(i_\Psi)$. Then, one proceeds to construct the convex combination of all elementary projectors weighted by their universal probabilities:

$$\mathcal{D} = \sum_{\Psi\ \mathrm{elementary}} \mathbb{P}(i_\Psi)\,|\Psi\rangle\langle\Psi|, \qquad \mathrm{Tr}\,\mathcal{D} = \sum_{\Psi\ \mathrm{elementary}} \mathbb{P}(i_\Psi) \leq 1 \tag{37}$$
The latter inequality states that $\mathcal{D}$ is a semi-density matrix, for it is not normalized; further, exactly as for the classical universal probability, $\mathcal{D}$ is universal among lower semi-computable semi-density matrices, those whose entries with respect to the fixed orthonormal basis can be approximated as closely as one likes from below by computable complex numbers. Namely, given a lower semi-computable semi-density matrix $\rho\in M_{[0,n-1]}$, there exists a constant $C(\rho) > 0$ such that

$$C(\rho)\,\rho \leq \mathcal{D}, \qquad C(\rho) = 2^{-K(\rho)} \tag{38}$$
where the constant depends only on the classical algorithmic complexity of (the entries of) ρ.
It is suggestive then to introduce the quantum algorithmic complexity as an operator,

$$\kappa_q = -\log\mathcal{D} \tag{39}$$

and to define the complexity of $|\psi\rangle\in\mathbb{C}^{2^n}$, called algorithmic entropy in [14] and referred to as Gács complexity in this paper, as the mean value of the operator complexity with respect to such a state:

$$\mathrm{GC}(\psi) := \langle\psi|\,\kappa_q\,|\psi\rangle \tag{40}$$
A remarkable and simple property that follows from such a definition is that any computable density matrix $\rho$ has a von Neumann entropy that coincides, up to the classical complexity $K(\rho)$ of $\rho$ itself, with the mean value of the operator complexity with respect to $\rho$:

$$S(\rho) \leq S(\rho) - \log\mathrm{Tr}\,\mathcal{D} \leq \mathrm{Tr}\big(\rho\,\kappa_q\big) \leq S(\rho) - \log C(\rho) = S(\rho) + K(\rho) \tag{41}$$
The first inequality follows from $\mathrm{Tr}\,\mathcal{D} \leq 1$; the second one is a consequence of the positivity of the relative entropy

$$S(\rho_1, \rho_2) = \mathrm{Tr}\,\rho_1\big(\log\rho_1 - \log\rho_2\big) \geq 0$$

of any two density matrices $\rho_{1,2}$ when one chooses $\rho_1 = \rho$ and [see (37)] $\rho_2 = \mathcal{D}/\mathrm{Tr}\,\mathcal{D}$. Finally, the third inequality comes from the universality of $\mathcal{D}$, that is, from (38).
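For the reader's convenience, the three steps can be spelled out explicitly (a reconstruction of the argument just described, using only $\mathrm{Tr}\,\mathcal{D}\leq 1$, the universality relation (38) and the operator monotonicity of the logarithm, with $\log\rho$ understood on the support of $\rho$):

$$0 \leq S\Big(\rho,\frac{\mathcal{D}}{\mathrm{Tr}\,\mathcal{D}}\Big) = \mathrm{Tr}\,\rho\log\rho - \mathrm{Tr}\,\rho\log\mathcal{D} + \log\mathrm{Tr}\,\mathcal{D} \;\Longrightarrow\; \mathrm{Tr}\big(\rho\,\kappa_q\big) = -\mathrm{Tr}\,\rho\log\mathcal{D} \geq S(\rho) - \log\mathrm{Tr}\,\mathcal{D} \geq S(\rho)$$

$$C(\rho)\,\rho \leq \mathcal{D} \;\Longrightarrow\; \log\mathcal{D} \geq \log C(\rho)\,\mathbb{1} + \log\rho \;\Longrightarrow\; \mathrm{Tr}\big(\rho\,\kappa_q\big) \leq -\log C(\rho) + S(\rho) = S(\rho) + K(\rho)$$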
In the following, we shall consider states of a varying number $n$ of qubits: the number $n$ will always be understood as an implicit parameter in all occurrences of the Gács complexity.

5.1. Gács Complexity and CNT-Entropy for Quantum Spin Chains

Consider a family of computable local states $\rho^{(n)}$ on $M_{[0,n-1]}$; the universal semi-density matrix $\mathcal{D}_n\in M_{[0,n-1]}$ is a $2^n\times 2^n$ matrix corresponding to the local sub-algebras; it gives rise to a Gács operator complexity $\kappa_q^{(n)} = -\log\mathcal{D}_n$ for which

$$S\big(\rho^{(n)}\big) \leq \mathrm{Tr}\big(\rho^{(n)}\kappa_q^{(n)}\big) \leq S\big(\rho^{(n)}\big) + K\big(\rho^{(n)}\big) \tag{42}$$
Therefore, if $K(\rho^{(n)})/n \to 0$, a condition holding for computable local states of sufficient regularity, as for instance for the totally uncorrelated states $\rho^{(n)} = \rho^{\otimes n}$, which are specified by $\rho$ and $n$ alone so that $K(\rho^{(n)})$ grows at most logarithmically in $n$, then

$$\frac{S(\rho^{(n)})}{n} \leq \frac{\mathrm{Tr}\big(\rho^{(n)}\kappa_q^{(n)}\big)}{n} \leq \frac{S(\rho^{(n)})}{n} + \frac{K(\rho^{(n)})}{n} \;\Longrightarrow\; \lim_{n\to\infty}\frac{\mathrm{Tr}\big(\rho^{(n)}\kappa_q^{(n)}\big)}{n} = s(\omega) \tag{43}$$
This equality establishes a relation between the von Neumann entropy rate of a quantum spin chain and the average Gács algorithmic entropy rate relative to the local density matrices giving rise to the global translation invariant state $\omega$. However, comparing the above relation with Brudno's relation (14), one sees that the latter is a stronger pointwise relation. In the quantum setting, one would like to show that, for sufficiently large $n$, $\mathrm{GC}(\psi)$ is close to $n\,s(\omega)$ for all states $\psi$ that are, in some precise sense to be defined, typical for the state $\omega$.
Remark 3. A quantum Brudno relation was proved in [16] for an ergodic quantum spin chain and the quantum complexity of a quantum state $\psi$, $\mathrm{QC}^\delta(\psi)$, introduced in [13,16] and defined by

$$\mathrm{QC}^\delta(\psi) = \min\Big\{\ell(\sigma)\ \Big|\ \big\|\,|\psi\rangle\langle\psi| - \mathfrak{M}(\sigma)\,\big\|_{tr} \leq \delta\Big\} \tag{44}$$
In the above expression,
  • $\sigma$ is any density matrix acting on the Hilbert space $\mathcal{H}_{\{0,1\}^*} = \lim_{n\to+\infty}\bigoplus_{k=0}^{n}\mathcal{H}_k$, where $\mathcal{H}_k = (\mathbb{C}^2)^{\otimes k}$ is the Hilbert space of $k$ qubits: $\mathcal{H}_{\{0,1\}^*}$ accommodates classical binary strings of arbitrary length as quantum states and makes it meaningful to refer to generic density matrices acting on it as quantum programs. Furthermore, these density matrices constitute a Banach space $\mathcal{T}_1^+(\mathcal{H}_{\{0,1\}^*})$ under the so-called trace-distance $\|X\|_{tr} = \frac{1}{2}\mathrm{Tr}\sqrt{X^\dagger X}$.
  • $\ell(\sigma)$ is the length of quantum programs and is defined by
    $$\ell(\sigma) = \min\big\{n\in\mathbb{N}\ \big|\ \sigma\in\mathcal{T}_1^+(\mathcal{H}_n)\big\}$$
    with $\mathcal{H}_n = \bigoplus_{k=0}^{n}\mathcal{H}_k$.
  • $\mathfrak{M}$ is a quantum operation, that is, a completely positive map from $\mathcal{T}_1^+(\mathcal{H}_{\{0,1\}^*})$ into itself: these maps are interpreted as Quantum Turing Machines (see [22] for a detailed mathematical characterization and a discussion of the halting time in the context of the quantum superposition principle).
The quantum Brudno relation involving the quantum complexity $\mathrm{QC}^\delta$ has the following form: for any given $\delta > 0$, there exists a sequence of projections $P_n(\delta)$ that are typical with respect to $\omega$, namely such that $\lim_{n\to+\infty}\omega(P_n(\delta)) = 1$. Furthermore, choosing $n$ large enough, one has

$$\frac{1}{n}\mathrm{QC}^\delta(\psi) \in \Big[\,s(\omega) - \delta(2+\delta)\,s(\omega)\,,\ s(\omega) + \delta\,\Big] \tag{45}$$

for all vector states $|\psi\rangle$ such that $P_n(\delta)|\psi\rangle = |\psi\rangle$.
We now prove a similar relation for the Gács complexity. The key is the following result in [14]: for any $n$-qubit state $\psi\in(\mathbb{C}^2)^{\otimes n}$ and $\delta < 1/2$, if $\mathrm{QC}^\delta(\psi) \leq k$, then

$$\mathrm{GC}(\psi) \leq k + K(k) + 2n\delta + C \tag{46}$$
where C is a constant independent of ψ.
Let $k = \lceil\mathrm{QC}^\delta(\psi)\rceil$ be the smallest integer larger than $\mathrm{QC}^\delta(\psi)$ and choose $N_0\in\mathbb{N}$ such that

$$\frac{1}{n}\mathrm{GC}(\psi) - s(\omega) \leq \frac{1}{n}\mathrm{QC}^\delta(\psi) - s(\omega) + 3\delta \qquad \forall\,n\geq N_0 \tag{47}$$
In turn, for all $\psi$ in one of the typical subspaces projected out by $P_n(\delta)$ as specified in the previous remark, the relation (46) for $\mathrm{QC}^\delta(\psi)$ gives $\frac{1}{n}\mathrm{GC}(\psi) - s(\omega) \leq 4\delta$. In order to show that, for $n$ large enough,

$$\Big|\,\frac{1}{n}\mathrm{GC}(\psi) - s(\omega)\,\Big| \leq 4\delta \tag{48}$$

and hence a full quantum Brudno relation for the Gács complexity, we argue by contradiction.
Suppose that for any $N_0$ one can find $n\geq N_0$ such that $\frac{1}{n}\mathrm{GC}(\psi) < s(\omega) - 4\delta$. The convexity of the function $-\log x$ yields [14] $\mathrm{GC}(\psi) \geq -\log\langle\psi|\mathcal{D}_n|\psi\rangle$. It thus follows that $\langle\psi|\mathcal{D}_n|\psi\rangle > 2^{-n(s(\omega)-4\delta)}$, whence, by taking the trace of $\mathcal{D}_n$ with respect to the typical projection $P_n(\delta)$,

$$1 \geq \mathrm{Tr}\,\mathcal{D}_n \geq \mathrm{Tr}\big(P_n(\delta)\,\mathcal{D}_n\big) > \mathrm{Tr}\big(P_n(\delta)\big)\,2^{-n(s(\omega)-4\delta)} \tag{49}$$
The last step in the proof is based on the following result [23]: for an ergodic quantum spin chain with von Neumann entropy rate $s(\omega)$,

$$\lim_{n\to+\infty}\frac{1}{n}\beta_{\epsilon,n}(\omega) = s(\omega), \qquad \beta_{\epsilon,n}(\omega) = \min\Big\{\log\mathrm{Tr}(P)\ \Big|\ P = P^\dagger = P^2 \in M_{[0,n-1]},\ \omega(P) \geq 1-\epsilon\Big\} \tag{50}$$
for any $1 > \epsilon > 0$. Roughly speaking, the dimension of typical subspaces grows like $2^{n s(\omega)}$. Thus, by choosing $n$ large enough one can make $\mathrm{Tr}(P_n(\delta)) > 2^{n(s(\omega)-3\delta)}$, hence obtaining a contradiction from inequality (49):

$$1 \geq \mathrm{Tr}\big(P_n(\delta)\,\mathcal{D}_n\big) > 2^{n(s(\omega)-3\delta)}\,2^{-n(s(\omega)-4\delta)} = 2^{n\delta} \tag{51}$$

5.2. Gács Complexity and AF-Entropy for Quantum Spin Chains

We now consider the Gács algorithmic entropy in relation to the AF-entropy which, as we have seen, amounts to a quantum dynamical entropy that takes the measurement processes into account, the latter providing quantum symbolic models.
The partition of unit $\mathcal{U}^{(n)}\subset M_{[0,n-1]}$ is computable (all matrix elements are computable algebraic numbers) with respect to the fixed standard orthonormal basis. Given the computable local states $\rho^{(n)}$, construct the auxiliary states $R[\mathcal{U}^{(n)}]\in M_{[0,n-1]}\otimes M_{[0,n-1]}$ and the universal semi-density matrix $\tilde{\mathcal{D}}_n\in M_{[0,n-1]}\otimes M_{[0,n-1]}$. Notice that, unlike $\mathcal{D}_n$, this matrix acts on the Hilbert space $\mathbb{C}^{4^n}$. The corresponding operator complexity $\tilde{\kappa}_q^{(n)} = -\log\tilde{\mathcal{D}}_n$ is such that

$$S\big(R[\mathcal{U}^{(n)}]\big) \leq \mathrm{Tr}\big(R[\mathcal{U}^{(n)}]\,\tilde{\kappa}_q^{(n)}\big) \leq S\big(R[\mathcal{U}^{(n)}]\big) + K\big(R[\mathcal{U}^{(n)}]\big) \tag{52}$$
Thus, if $K(R[\mathcal{U}^{(n)}])/n \to 0$, a condition also holding for sufficiently regular $\rho^{(n)}$ in view of the structure of the partitions of unit $\mathcal{U}^{(n)}$, one gets

$$\limsup_{n}\frac{1}{n}\mathrm{Tr}\big(R[\mathcal{U}^{(n)}]\,\tilde{\kappa}_q^{(n)}\big) = s(\omega) + 1 \tag{53}$$

Though this is not a quantum Brudno relation as in (48), it establishes a connection, not available so far, between the AF-entropy and the Gács complexity. For a point-wise relation between $n\,(s(\omega)+1)$ and $\langle\tilde\psi|\,\tilde{\kappa}_q^{(n)}\,|\tilde\psi\rangle$, one would need to define the notion of typicality of projectors in $M_{[0,n-1]}\otimes M_{[0,n-1]}$ with respect to the sequence of auxiliary states $R[\mathcal{U}^{(n)}]$ and to prove an estimate of their dimension along the lines of (50).

6. Conclusions

In classical information theory and, more generally, for classical dynamical systems, Brudno's theorem establishes a relation between the dynamical information rate of ergodic time-evolutions and the algorithmic complexity per unit time step of almost all trajectories. In quantum information theory, classical sources are replaced by quantum spin chains, which represent an arena for putting to the test the various non-commutative extensions of both the Kolmogorov–Sinai dynamical entropy and the Kolmogorov–Solomonoff–Chaitin algorithmic complexity, as well as their possible relations. There is a relation between the von Neumann entropy rate of quantum spin chains and the Berthiaume–van Dam–Laplante quantum algorithmic complexity rate for every pure state in typical subspaces with respect to ergodic shift-invariant states over the chain. As the von Neumann entropy rate coincides with the quantum dynamical entropy of the shift along the quantum spin chain as defined by Connes, Narnhofer and Thirring, the above relation can rightfully be considered a quantum Brudno relation. No similar relations have so far been proved to exist among any of the other existing proposals. In this paper, in the case of quantum sources, we have proved a quantum Brudno relation between the quantum dynamical entropy proposed by Connes, Narnhofer and Thirring and the rate of the algorithmic entropy of Gács. In the case of the Alicki–Fannes entropy, a first step in establishing a link between the latter and the Gács complexity rate has been provided. For a full quantum Brudno relation new tools have to be developed, as the quantum symbolic models behind the AF-construction are not even stationary and the techniques of ergodic quantum spin chains cannot be used directly. Furthermore, should one succeed in proving a relation between the AF-entropy and the Gács algorithmic entropy rate for pure states in typical subspaces of quantum spin chains, the next step, also in the case of the CNT-entropy, would be to move on to more general ergodic quantum dynamical systems and face the problem of a full quantum Brudno theorem for non-trivial dynamics.

Acknowledgement

The author wishes to thank one of the referees for valuable comments on the manuscript.

References

  1. Cornfeld, I.P.; Fomin, S.V.; Sinai, Y.G. Ergodic Theory; Springer: New York, NY, USA, 1982.
  2. Connes, A.; Narnhofer, H.; Thirring, W. Dynamical entropy of C*-algebras and von Neumann algebras. Commun. Math. Phys. 1986, 112, 691–719.
  3. Alicki, R.; Fannes, M. Defining quantum dynamical entropy. Lett. Math. Phys. 1994, 32, 75–82.
  4. Słomczyński, W.; Życzkowski, K. Quantum chaos: An entropy approach. J. Math. Phys. 1994, 35, 5674–5701.
  5. Voiculescu, D. Dynamical approximation entropies and topological entropy in operator algebras. Commun. Math. Phys. 1995, 170, 249–281.
  6. Accardi, L.; Ohya, M.; Watanabe, N. Dynamical entropy through quantum Markov chains. Open Sys. Inf. Dyn. 1997, 4, 71–87.
  7. Li, M.; Vitányi, P. An Introduction to Kolmogorov Complexity and Its Applications; Springer: New York, NY, USA, 1997.
  8. Alekseev, V.M.; Yakobson, M.V. Symbolic dynamics and hyperbolic dynamic systems. Phys. Rep. 1981, 75, 290–325.
  9. Brudno, A.A. The complexity of the trajectories of a dynamical system. Trans. Moscow Math. Soc. 1983, 2, 127–151.
  10. Nielsen, M.A.; Chuang, I.L. Quantum Computation and Quantum Information; Cambridge University Press: Cambridge, UK, 2000.
  11. It is important to distinguish algorithmic complexity from computational complexity, the zoo of whose classes has been greatly enriched by the birth of quantum information.
  12. Vitányi, P. Quantum Kolmogorov complexity based on classical descriptions. IEEE Trans. Inform. Theor. 2001, 47, 2464–2479.
  13. Berthiaume, A.; van Dam, W.; Laplante, S. Quantum Kolmogorov complexity. J. Comput. System Sci. 2001, 63, 201–221.
  14. Gács, P. Quantum algorithmic entropy. J. Phys. A 2001, 34, 1–22.
  15. Mora, C.; Briegel, H.J. Algorithmic complexity and entanglement of quantum states. Phys. Rev. Lett. 2005, 95, 200503.
  16. Benatti, F.; Krüger, T.; Müller, M.; Siegmund-Schultze, R.; Szkola, A. Entropy and quantum Kolmogorov complexity: A quantum Brudno's theorem. Commun. Math. Phys. 2006, 265, 437–461.
  17. Benatti, F. Dynamics, Information and Complexity in Quantum Systems; Theoretical and Mathematical Physics; Springer: Berlin, Germany, 2009.
  18. Alicki, R.; Fannes, M. Quantum Dynamical Systems; Oxford University Press: Oxford, UK, 2001.
  19. Bratteli, O.; Robinson, D.W. Operator Algebras and Quantum Statistical Mechanics II: Equilibrium States. Models in Quantum Statistical Mechanics; Springer: Berlin, Germany, 2003.
  20. Cover, T.M.; Thomas, J.A. Elements of Information Theory; John Wiley & Sons: New York, NY, USA, 1991.
  21. Sauvageot, J.-L.; Thouvenot, J.-P. Une nouvelle définition de l'entropie dynamique des systèmes non commutatifs [A new definition of the dynamical entropy of non-commutative systems] (in French). Commun. Math. Phys. 1992, 145, 411–423.
  22. Müller, M. Strongly universal quantum Turing machines and invariance of Kolmogorov complexity. IEEE Trans. Inform. Theor. 2008, 54, 763–780.
  23. Bjelakovic, I.; Krüger, T.; Siegmund-Schultze, R.; Szkola, A. The Shannon–McMillan theorem for ergodic quantum lattice systems. Invent. Math. 2004, 155, 203–222.
