1. Introduction
The theoretical foundation supporting today’s information-oriented society is information theory, founded by Shannon [1] about 60 years ago. This theory treats the efficiency of information transmission by means of a measure of complexity, namely the entropy, in a commutative system of signal spaces. Information theory rests on a mathematically formulated entropy theory. Before Shannon’s work, entropy was first introduced in thermodynamics by Clausius and in statistical mechanics by Boltzmann; these entropies are criteria characterizing properties of physical systems. Shannon’s construction of entropy uses discrete probability theory, based on the idea that “information obtained from a system with large vagueness is highly profitable”, and he introduced (1) the entropy, measuring the amount of information in the state of a system, and (2) the mutual entropy (information), representing the amount of information correctly transmitted from the initial system to the final system through a channel. This entropy theory, combined with the development of probability theory due to Kolmogorov, gives a mathematical foundation of classical information theory, with the relative entropy of two states by Kullback-Leibler [2] and the mutual entropy by Gelfand-Kolmogorov-Yaglom [3,4] on continuous probability spaces. In addition, the channel of discrete systems, given by a transition probability, was generalized to integral kernel theory. The channel of continuous systems is expressed as a state change on a commutative probability space by means of the averaging operator introduced by Umegaki, and it is extended to the quantum channel describing a state change in noncommutative systems [5]. Since present-day optical communication uses laser signals, it is necessary to construct a new information theory dealing with such quantum quantities in order to discuss rigorously the efficiency of information transmission in optical communication processes. This is quantum information theory, which extends the important measures formulated by Shannon and others, such as the entropy, the relative entropy and the mutual entropy, to quantum systems. The study of entropy in quantum systems was begun by von Neumann [6] in 1932; the quantum relative entropy was introduced by Umegaki [7] and extended to general quantum systems by Araki [8,9], Uhlmann [10] and Donald [11]. In quantum information theory, one of the important subjects is to examine how much information is correctly carried through a channel, so it is necessary to extend the mutual entropy of classical systems to quantum systems.
In classical systems, the mutual entropy is defined by means of the joint probability distribution between the input and output systems. However, a joint probability does not generally exist in quantum systems (see [12]). The compound state devised in [13,14] gives a key to solving this problem. It is defined through the Schatten decomposition [15] (one-dimensional orthogonal decomposition) of the input state and the quantum channel. Ohya introduced the quantum mutual entropy based on the compound state in 1983 [13,16]. Since it satisfies Shannon’s inequalities, it describes the amount of information correctly transmitted from the input system through a quantum channel. Using fundamental entropies such as the von Neumann entropy and the Ohya mutual entropy, the complete quantum version of Shannon’s information theory was formulated.
The quantum entropy for a density operator was defined by von Neumann [6] about 20 years before the Shannon entropy appeared. The properties of this entropy are summarized in [17]. The main properties of the quantum relative entropy are taken from the articles [8,9,10,11,17,18,19,20]. A quantum mutual entropy was introduced by Holevo, Levitin and Ingarden [21,22] for classical input and output passing through a possibly quantum channel. The completely quantum mechanical mutual entropy was defined by Ohya [13], and its generalization to C*-algebras was done in [23]. Applications of the mutual entropy have been studied in various fields [16,24,25,26,27,28,29]. Applications of channels were given in [13,16,30,31,32,33].
Concerning quantum communication, the following studies have been done. The characterization of quantum communication or stochastic processes was discussed, and beam splitting was rigorously studied, by Fichtner, Freudenberg and Liebscher [34,35]. The transition expectation was introduced by Accardi [36] to study quantum Markov processes [37]. The noisy optical channel was discussed in [28]. In quantum optics, a linear amplifier has been discussed by several authors [38,39], and its rigorous expression given here is from [30]. The channel capacities are discussed here based on the papers [40,41]. The bound of the capacity was first studied by Holevo [42] and then by many others [39,41,43].
The entangled state is an important concept in quantum theory; it has been studied recently by several authors, and a rigorous mathematical study was given in [44,45].
Let us comment on general entropies of states in C*-dynamical systems. The C*-entropy was introduced in [23] and its properties are discussed in [25,26,46]. The relative entropy for two general states was introduced by Araki [8,9] in von Neumann algebras and by Uhlmann [10] in *-algebras. The mutual entropy in C*-algebras was introduced by Ohya [16]. Further references on quantum entropy are thoroughly discussed in the book [17].
The classical dynamical (or Kolmogorov-Sinai) entropy [47] for a measure-preserving transformation T was defined on a message space through finite partitions of the measurable space.
The classical coding theorems of Shannon are important tools for analysing communication processes; they are formulated in terms of the mean dynamical entropy and the mean dynamical mutual entropy. The mean dynamical entropy represents the amount of information per letter of a signal sequence sent from an input source, and the mean dynamical mutual entropy represents the amount of information per letter of the signal received in an output system.
The quantum dynamical entropy (QDE) was studied by Connes-Størmer [48], Emch [49], Connes-Narnhofer-Thirring [50], Alicki-Fannes [51], and others [52,53,54,55,56]. Their dynamical entropies were defined on spaces of observables. Recently, the quantum dynamical entropy and the quantum dynamical mutual entropy were studied by the present authors [16,29]: (1) the dynamical entropy is defined on state spaces through the complexity of Information Dynamics [57]; (2) it is defined through the quantum Markov chain (QMC) in [58]; (3) the dynamical entropy for completely positive (CP) maps was introduced in [59].
In this review paper, we give an overview of the entropy theory mentioned above. The details are discussed in the books [17,60].
2. Setting of Quantum Systems
We first summarize the mathematical descriptions of both classical and quantum systems.
(1) Classical System: Let be the set of all real random variables on a probability measure space and be the set of all probability measures on a measurable space. A random variable and a probability measure represent an observable and a state of a classical system, respectively. The expectation value of an observable with respect to a state is given by
(2) Usual Quantum Systems: We denote the set of all bounded linear operators on a Hilbert space by , and the set of all density operators on by . A Hermitian operator and a density operator represent an observable and a state of a usual quantum system, respectively. The expectation value of an observable with respect to a state is obtained by
(3) General Quantum System: More generally, let be a C*-algebra (i.e., a complex normed algebra with an involution * satisfying ‖A*A‖ = ‖A‖², complete w.r.t. its norm) and be the set of all states on it (i.e., positive continuous linear functionals φ with φ(I) = 1 when the unit I belongs to the algebra).
If Λ is a unital map from a C*-algebra to another C*-algebra, then its dual map Λ* is called a channel; that is, the dual is determined by the duality tr (Λ*ρ)A = tr ρΛ(A) in the usual quantum case. Remark that the algebra will sometimes be denoted differently below.
Such an algebraic approach contains both classical and quantum theories. The descriptions of a classical dynamical system (CDS), a usual quantum dynamical system (QDS) and a general quantum dynamical system (GQDS) are given in the following table:
Table 1.1. Descriptions of CDS, QDS and GQDS.
| | CDS | QDS | GQDS |
| observable | real r.v. f on the probability space | Hermitian operator A on the Hilbert space (self-adjoint operator) | self-adjoint element A in the C*-algebra |
| state | probability measure μ | density operator ρ on the Hilbert space | positive linear functional φ with φ(I) = 1 |
| expectation | ∫ f dμ | tr ρA | φ(A) |
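As a small illustration of the table, the sketch below (plain Python with NumPy; all names and numerical values are our own, not taken from the paper) computes the expectation of an observable in a state in both the classical and the usual quantum description.

```python
import numpy as np

# Classical system (CDS): a real random variable f on a finite probability space.
omega = np.array([0.0, 1.0, 2.0])        # sample points of Omega
mu = np.array([0.5, 0.3, 0.2])           # a probability measure = a classical state
f = omega ** 2                           # a real random variable = a classical observable
classical_expectation = float(np.dot(f, mu))       # integral of f d(mu)

# Usual quantum system (QDS): a Hermitian operator A and a density operator rho on C^2.
A = np.array([[1.0, 0.5], [0.5, -1.0]])             # observable (Hermitian)
rho = np.array([[0.7, 0.2], [0.2, 0.3]])            # state (positive, unit trace)
quantum_expectation = float(np.trace(rho @ A).real)  # tr(rho A)

print(classical_expectation, quantum_expectation)
```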
3. Communication Processes
We discuss the quantum communication processes in this section.
Let
be the infinite direct product of the alphabet
called a message space. A coding is a one-to-one map Ξ from
to some space
X which is called the coded space. This space
X may be a classical object or a quantum object. For a quantum system,
X may be a space of quantum observables on a Hilbert space
, then the coded input system is described by
. The coded output space is denoted by
and the decoded output space
is made from other alphabets. A transmission (map) Γ from
X to
(actually its dual map as discussed below) is called
a channel, which reflects the property of a physical device. With a decoding
the whole information transmission process is written as
That is, a message is coded to and it is sent to the output system through a channel Γ, then the output coded message becomes and it is decoded to at a receiver.
Then the occurrence probability of each message in the sequence
of
N messages is denoted by
which is a state in a classical system. If Ξ is a quantum coding, then
is a quantum object (state) such as a coherent state. Here we consider such a quantum coding, that is,
is a quantum state, and we denote
by
Thus the coded state for the sequence
is written as
This state is transmitted through the dual map of Γ which is called a
channel in the sequel. This channel (the dual of
is expressed by a completely positive mapping
in the sense of Chapter 5 , from the state space of
X to that of
, hence the output coded quantum state
is
Since the information transmission process can be understood as a process of state (probability) change, when Ω and
are classical and
X and
are quantum, the process is written as
where
(resp.
) is the channel corresponding to the coding Ξ (resp.
) and
(resp.
) is the set of all density operators (states) on
(resp.
).
We have to be careful in studying the objects in the above transmission process; namely, we have to make clear which object we are going to study. For instance, if we want to know the information content of a quantum state transmitted through a quantum channel Γ (or its dual), then we have to take X so as to describe a quantum system, such as a Hilbert space, and we need to start the study from a quantum state in the quantum space, not from a classical state associated with messages. We have a similar situation when we treat state change (computation) in a quantum computer.
4. Quantum Entropy for Density Operators
The entropy of a quantum state was introduced by von Neumann. The entropy of a state ρ is defined by

S(ρ) = −tr ρ log ρ.

For a state ρ there exists a unique spectral decomposition

ρ = Σ_k λ_k P_k,

where λ_k is an eigenvalue of ρ and P_k is the associated projection for each λ_k. The projection P_k is not one-dimensional when λ_k is degenerate, so the spectral decomposition can be further decomposed into one-dimensional projections. Such a decomposition is called a Schatten decomposition, namely

ρ = Σ_k λ_k E_k,

where E_k is a one-dimensional projection associated with λ_k and a degenerate eigenvalue is repeated dim P_k times; for instance, if the eigenvalue λ_1 has degeneracy 3, then λ_1 = λ_2 = λ_3. To simplify notation we write the Schatten decomposition as above, where the numbers {λ_k} form a probability distribution. This Schatten decomposition is not unique unless every eigenvalue is non-degenerate. The entropy (von Neumann entropy) S(ρ) of a state ρ then equals the Shannon entropy of the probability distribution {λ_k}:

S(ρ) = −Σ_k λ_k log λ_k.

Therefore the von Neumann entropy contains the Shannon entropy as a special case.
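To make this statement concrete, here is a minimal numerical check (our own illustration, not from the paper): the von Neumann entropy computed from a density operator coincides with the Shannon entropy of its Schatten (eigenvalue) distribution.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho log rho), computed from the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]          # 0 log 0 = 0 convention
    return float(-np.sum(eigvals * np.log(eigvals)))

def shannon_entropy(p):
    """Shannon entropy of a probability distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

rho = np.array([[0.6, 0.2], [0.2, 0.4]])        # a qubit state
lam = np.linalg.eigvalsh(rho)                   # the Schatten distribution {lambda_k}
print(von_neumann_entropy(rho), shannon_entropy(lam))  # the two values agree
```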
Let us summarize the fundamental properties of the entropy S(ρ).
Theorem 1 For any density operator ρ, the following hold:
- (1) Positivity: S(ρ) ≥ 0.
- (2) Symmetry: Let ρ′ = UρU* for a unitary operator U. Then S(ρ′) = S(ρ).
- (3) Concavity: S(λρ₁ + (1−λ)ρ₂) ≥ λS(ρ₁) + (1−λ)S(ρ₂) for any states ρ₁, ρ₂ and any λ ∈ [0,1].
- (4) Additivity: S(ρ₁ ⊗ ρ₂) = S(ρ₁) + S(ρ₂) for any ρ₁, ρ₂.
- (5) Subadditivity: For the reduced states ρ₁, ρ₂ of a state ρ on a tensor product Hilbert space, S(ρ) ≤ S(ρ₁) + S(ρ₂).
- (6) Lower semicontinuity: If ρₙ → ρ as n → ∞, then S(ρ) ≤ lim infₙ S(ρₙ).
- (7) Continuity: Let ρₙ be states satisfying the following conditions: (i) ρₙ → ρ weakly as n → ∞, (ii) ρₙ ≤ A for some compact operator A, and (iii) −Σ_k a_k log a_k < +∞ for the eigenvalues {a_k} of A. Then S(ρₙ) → S(ρ).
- (8) Strong subadditivity: Let ρ be a state on a triple tensor product Hilbert space H₁ ⊗ H₂ ⊗ H₃ and denote the reduced states on H₁ ⊗ H₂, H₂ ⊗ H₃ and H_k by ρ₁₂, ρ₂₃ and ρ_k. Then S(ρ) + S(ρ₂) ≤ S(ρ₁₂) + S(ρ₂₃) and S(ρ₁) + S(ρ₃) ≤ S(ρ₁₂) + S(ρ₂₃).
- (9) Entropy increase: (i) If the Hilbert space is finite dimensional and the channel Λ* is unital, i.e., Λ*I = I, then S(Λ*ρ) ≥ S(ρ). (ii) For an arbitrary Hilbert space, the same conclusion S(Λ*ρ) ≥ S(ρ) holds when the dual map Λ of the channel preserves the trace.
In order to prove this theorem, we need the following lemma.
Lemma 2 Let f be a convex function on a proper domain, and let A and B be self-adjoint operators with spectra in that domain. Then
- (1) Klein’s inequality: tr[f(A) − f(B) − (A − B)f′(B)] ≥ 0.
- (2) Peierls’ inequality: Σ_k f(⟨x_k, A x_k⟩) ≤ tr f(A) for any CONS {x_k} in the Hilbert space. (Remark: equality holds when every x_k is an eigenvector of A.)
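A quick numerical sanity check of Klein’s inequality for the convex function f(x) = x log x (a sketch of our own; the operators below are randomly generated positive matrices, not objects from the paper):

```python
import numpy as np

def mat_func(A, f):
    """Apply a scalar function f to a Hermitian matrix A via its spectral decomposition."""
    w, V = np.linalg.eigh(A)
    return (V * f(w)) @ V.conj().T

rng = np.random.default_rng(0)

def random_positive(n):
    X = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return X @ X.conj().T + 1e-3 * np.eye(n)    # strictly positive, so log is defined

f = lambda x: x * np.log(x)                     # convex on (0, infinity)
fprime = lambda x: np.log(x) + 1.0

for _ in range(5):
    A, B = random_positive(4), random_positive(4)
    lhs = np.trace(mat_func(A, f) - mat_func(B, f) - (A - B) @ mat_func(B, fprime))
    print(lhs.real >= -1e-9)                    # Klein's inequality: always True
```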
5. Relative Entropy for Density Operators
For two states ρ and σ, the relative entropy was first defined by Umegaki as

S(ρ, σ) = tr ρ(log ρ − log σ)

when the support of ρ is contained in that of σ, and S(ρ, σ) = +∞ otherwise. The main properties of the relative entropy are summarized as follows:
Theorem 3 The relative entropy satisfies the following properties:
- (1) Positivity: S(ρ, σ) ≥ 0, and S(ρ, σ) = 0 iff ρ = σ.
- (2) Joint convexity: S(λρ₁ + (1−λ)ρ₂, λσ₁ + (1−λ)σ₂) ≤ λS(ρ₁, σ₁) + (1−λ)S(ρ₂, σ₂) for any λ ∈ [0,1].
- (3) Additivity: S(ρ₁ ⊗ ρ₂, σ₁ ⊗ σ₂) = S(ρ₁, σ₁) + S(ρ₂, σ₂).
- (4) Lower semicontinuity: If ρₙ → ρ and σₙ → σ, then S(ρ, σ) ≤ lim infₙ S(ρₙ, σₙ). Moreover, if there exists a positive number λ satisfying ρₙ ≤ λσₙ, then S(ρₙ, σₙ) → S(ρ, σ).
- (5) Monotonicity: For a channel Λ* from one state space to another, S(Λ*ρ, Λ*σ) ≤ S(ρ, σ).
- (6) Lower bound: ‖ρ − σ‖²/4 ≤ S(ρ, σ).
- (7) Invariance under unitary mappings: S(UρU*, UσU*) = S(ρ, σ), where U is a unitary operator.
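A minimal sketch (our own, for finite dimensional density matrices) of Umegaki’s relative entropy, together with numerical checks of positivity and of invariance under a unitary conjugation:

```python
import numpy as np

def matrix_log(A):
    """Logarithm of a strictly positive Hermitian matrix via spectral decomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.log(w)) @ V.conj().T

def relative_entropy(rho, sigma):
    """Umegaki relative entropy S(rho, sigma) = tr rho (log rho - log sigma)."""
    return float(np.trace(rho @ (matrix_log(rho) - matrix_log(sigma))).real)

rho = np.array([[0.7, 0.1], [0.1, 0.3]])
sigma = np.array([[0.5, 0.0], [0.0, 0.5]])

print(relative_entropy(rho, sigma) >= 0.0)        # positivity
theta = 0.4
U = np.array([[np.cos(theta), -np.sin(theta)],    # a unitary (real rotation)
              [np.sin(theta),  np.cos(theta)]])
print(np.isclose(relative_entropy(U @ rho @ U.T, U @ sigma @ U.T),
                 relative_entropy(rho, sigma)))   # unitary invariance
```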
Let us extend the relative entropy to two positive operators instead of two states. If A and B are two positive Hermitian operators (not necessarily states, i.e., not necessarily with unit traces), then we set

S(A, B) = tr A(log A − log B).
The following Bogoliubov inequality holds [41]:

S(A, B) ≥ tr A (log tr A − log tr B).

This inequality gives us the upper bound of the channel capacity [41].
6. Channel and Lifting
The concept of a channel plays an important role in the mathematical description of quantum communication. The attenuation channel is one of the most important models discussed in quantum optical communication [13]. Moreover, there exists a special channel called a "lifting", which is useful for characterizing quantum communication or stochastic processes. Here we briefly review the definitions and fundamental properties of quantum channels and liftings [17,28,29,60].
6.1. Quantum Channels
A general quantum system, containing all systems, discrete or continuous, classical or quantum, is described by a C*-algebra or a von Neumann algebra, so we discuss the channeling transformation in a C*-algebraic context. However, it is enough for readers who are not familiar with C*-algebras to imagine a usual quantum system; for instance, regard the algebras appearing below as algebras of bounded operators on a Hilbert space and the corresponding state spaces as sets of density operators. Let the input and output algebras be C*-algebras, and let the corresponding state spaces be the sets of all states on them.
A channel is a mapping from to . There exist channels with various properties.
Definition 5 Let be an input system and be an output system. Take any . - (1)
is linear if holds for any .
- (2)
is completely positive (CP) if
is linear and its dual
satisfies
for any
and any
,
.
- (3)
is Schwarz type if and .
- (4)
is stationary if for any .
(Here and are groups of automorphisms of the algebra and respectively.)
- (5)
is ergodic if is stationary and .
(Here is the set of extreme points of the set of all stationary states .)
- (6)
is orthogonal if any two orthogonal states (denoted by ) implies .
- (7)
is deterministic if is orthogonal and bijective.
- (8)
For a subset of , is chaotic for if for any .
- (9)
is chaotic if is chaotic for .
- (10)
Stinespring-Sudarshan-Kraus representation: a completely positive channel Λ* can be represented as Λ*ρ = Σ_i A_i ρ A_i*, where the A_i are bounded operators on H satisfying Σ_i A_i* A_i = I.
Most channels appearing in physical processes are CP channels. Examples of such channels are the following; take a density operator ρ as an input state.
(1) Unitary evolution: Let H be the Hamiltonian of a system. Then Λ*ρ = U_t ρ U_t*, where U_t = exp(−itH).
(2) Semigroup evolution: Let
be a one-parameter semigroup on
.
(3) Quantum measurement: If a measuring apparatus is prepared by a positive operator-valued measure, then the state ρ changes to a state Λ*ρ after this measurement.
(4) Reduction: If a system interacts with an external system described by another Hilbert space, and the initial states of the system and the external system are ρ and σ, respectively, then the combined state θ_t of the two systems at time t after the interaction is given by

θ_t = U_t (ρ ⊗ σ) U_t*,

where U_t = exp(−itH) with the total Hamiltonian H of the two systems. A channel is then obtained by taking the partial trace w.r.t. the external system:

Λ*ρ = tr₂ θ_t = tr₂ U_t (ρ ⊗ σ) U_t*.
(5) Optical communication processes: A quantum communication process is described by the following scheme [13]. The maps above combine the input state with a noise state ν coming from outside the system, send the combined state through the transmitting device, and trace out the noise system; the map describing the device is a certain channel determined by its physical properties. Hence the channel for the above process is given as the composition of these maps.
(6) Attenuation process: Based on the construction of the optical communication process in (5), the attenuation channel is defined as follows [13]: take the vacuum state as the noise state and take the linear map V given by V(|θ⟩ ⊗ |0⟩) = |αθ⟩ ⊗ |βθ⟩ with |α|² + |β|² = 1, where |θ⟩ is a coherent vector. Then the output state of the attenuation channel Λ* is obtained by Λ*ρ = tr₂ V(ρ ⊗ |0⟩⟨0|)V*. The number η = |α|² (0 ≤ η ≤ 1) is called the transmission rate of the attenuation channel Λ*. In particular, for a coherent input state ρ = |θ⟩⟨θ|, one has Λ*(|θ⟩⟨θ|) = |αθ⟩⟨αθ|. The map V is called a beam splitting operator.
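The attenuation channel can be checked numerically on a truncated Fock space. The sketch below (our own construction, using the standard pure-loss Kraus operators, which are not spelled out in the text) applies the channel with transmission rate η to a coherent input and compares the output with the coherent state of amplitude √η θ.

```python
import numpy as np
from math import comb, factorial, exp, sqrt

N = 30          # Fock-space truncation (assumed large enough for |theta| ~ 1)
eta = 0.6       # transmission rate of the attenuation channel
theta = 1.0     # amplitude of the coherent input state

def coherent_state(alpha, dim):
    """Coherent vector |alpha> in the truncated Fock basis."""
    v = np.array([alpha**n / sqrt(factorial(n)) for n in range(dim)], dtype=complex)
    return v * exp(-abs(alpha)**2 / 2.0)

def loss_kraus(k, eta, dim):
    """Kraus operator A_k of the pure-loss (attenuation) channel: k photons lost."""
    A = np.zeros((dim, dim), dtype=complex)
    for n in range(k, dim):
        A[n - k, n] = sqrt(comb(n, k)) * eta**((n - k) / 2.0) * (1.0 - eta)**(k / 2.0)
    return A

psi = coherent_state(theta, N)
rho_in = np.outer(psi, psi.conj())
rho_out = sum(A @ rho_in @ A.conj().T for A in (loss_kraus(k, eta, N) for k in range(N)))

phi = coherent_state(sqrt(eta) * theta, N)          # expected output: |sqrt(eta) theta>
fidelity = float((phi.conj() @ rho_out @ phi).real)
print(np.trace(rho_out).real, fidelity)             # both close to 1 (up to truncation)
```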
(7) Noisy optical channel: Based on (5), the noisy optical channel is defined as follows [
28]: Take a noise state
,
photon number state of
and a linear mapping
as
with
where
Then the output state of the noisy optical channel
is defined by
for the input state
. In particular, for a coherent input state
and a coherent noise state
, we obtain
which is called a generalized beam splitting operator.
6.2. Liftings
There exists a special channel called a "lifting", and it is a useful concept for characterizing quantum communication or stochastic processes. It can also serve as a mathematical tool to describe processes in quantum algorithms, so we explain its foundations here.
Definition 6 Let be C*-algebras and let be a fixed C*-tensor product of and . A lifting from to is a weak *-continuous map If is affine and its dual is a completely positive map, we call it a linear lifting; if it maps pure states into pure states, we call it pure.
The algebra
can be that of the output, namely,
above. Note that to every lifting from
to
we can associate two channels: one from
to
, defined by
another from
to
, defined by
In general, a state
such that
is called a compound state of the states
and
. In classical probability theory, the term coupling between the two marginal states is also used.
The following problem is important in several applications: Given a state
and a channel
, find a standard lifting
such that
is a compound state of
and
. Several particular solutions of this problem have been proposed by Ohya [13,14] and by Cecchini and Petz [63]; however, an explicit description of all possible solutions to this problem is still missing.
Definition 7 A lifting from to is called nondemolition for a state if is invariant for , i.e., if for all . The idea of this definition is that the interaction with system 2 does not alter the state of system 1.
Definition 8 Let be C*-algebras and let be a fixed C*-tensor product of and . A transition expectation from to is a completely positive linear map satisfying . An input signal is transmitted and received by an apparatus which produces an output signal. Here (resp. ) is interpreted as the algebra of observables of the input (resp. output) signal, and the transition expectation describes the interaction between the input signal and the receiver as well as the preparation of the receiver. If is the state of the input signal, then the state is the state of the (observed) output signal. Therefore, in the reduction dynamics discussed before, the correspondence from a state ρ to the interacting state gives us a time-dependent lifting.
Another important lifting related to this signal transmission is the one arising from the quantum communication process discussed above. In several important applications, the state of the system before the interaction (preparation, input signal) is not known, and one would like to know this state knowing only the state of the apparatus after the interaction (output signal). From a mathematical point of view this problem is not well posed, since the map is usually not invertible. The best one can do in such cases is to acquire control over the description of those input states which have the same image under the map, and then choose among them according to some statistical criterion.
In the following we rewrite some communication processes by using liftings.
Example 9 (1): Isometric lifting. Let be an isometry. Then the map is a transition expectation in the sense of Accardi, and the associated lifting maps a density matrix on into on . Liftings of this type are called isometric. Every isometric lifting is a pure lifting, and such liftings are applied to some quantum algorithms such as Shor’s. These extend linearly to an isometry, and their isometric liftings are neither of convex product type nor of nondemolition type.
Example 10 (2): The compound lifting. Let be a channel. For any state in the closed convex hull of the extremal states, fix a decomposition of it as a convex combination of extremal states, where μ is a Borel measure supported on the extremal states, and define the lifting accordingly. Then this is a lifting, nonlinear even if the channel is linear, and it is of nondemolition type. The most general lifting mapping into the closed convex hull of the extremal product states is essentially of this type. This nonlinear nondemolition lifting was first discussed by Ohya in order to define the compound state and the mutual entropy, as explained before. However, the above is slightly more general because we weaken the condition that μ be concentrated on the extremal states. Therefore, once a channel is given, a lifting of convex product type can be constructed from it. For example, the von Neumann quantum measurement process is written, in the terminology of liftings, as follows: Having measured a compact observable
(spectral decomposition with
) in a state
ρ, the state after this measurement will be
and a lifting
, of convex product type, associated to this channel
and to a fixed decomposition of
ρ as
ρ (
) is given by :
Before closing this section, we reconsider the noisy channel, the attenuation channel and the amplifier process (lifting) in optical communication.
Example 11 (3) : The attenuation (or beam splitting) lifting.
It is the particular isometric lifting characterized by the expression V|θ⟩ = |αθ⟩ ⊗ |βθ⟩, where |θ⟩ is the normalized coherent vector parametrized by θ and α, β are such that |α|² + |β|² = 1. Notice that this lifting maps coherent states into products of coherent states. So it maps the simplex of the so-called classical states (i.e., the convex combinations of coherent vectors) into itself. Restricted to these states it is of convex product type as explained above, but it is not of convex product type on the set of all states.
Denoting, for
the coherent state on
namely,
then for any
so that this lifting is not nondemolition. These equations mean that, by the effect of the interaction, a coherent signal (beam)
splits into two signals (beams) which are still coherent but of lower intensity; the total intensity (energy) is preserved by the transformation.
Finally we mention two important beam splittings which are used to discuss quantum gates and quantum teleportation [64,65].
(1) Superposed beam splitting:
(2) Beam splitting with two inputs and two outputs: Let
and
be two input coherent vectors. Then
Example 12 (4) Amplifier channel: To compensate for the loss, we need to amplify the signal (photons). In quantum optics, a linear amplifier is usually expressed by means of the annihilation operators a and b on the two Hilbert spaces, respectively: c = √G a ⊗ I + √(G − 1) I ⊗ b*, where G (≥ 1) is a constant and c satisfies the CCR (i.e., [c, c*] = I) on the tensor product space. This expression is not convenient for computing quantities such as entropy. The lifting expression of the amplifier is suited to such use, and it is given as follows: Let
with
and
be the eigenvector of
c :
. For two coherent vectors
on
and
on
,
can be written by the squeezing expression :
and the lifting is defined by an isometry
such that
The channel of the amplifier is
7. Quantum Mutual Entropy
The quantum relative entropy was introduced by Umegaki and generalized by Araki and Uhlmann. A quantum analogue of Shannon’s mutual entropy was then considered by Levitin, Holevo and Ingarden for classical input and output passing through a possibly quantum channel, in which case, as discussed below, Shannon theory essentially applies. We therefore call such a quantum mutual entropy the semi-quantum mutual entropy in the sequel. The fully quantum mutual entropy, namely that for quantum input and quantum output with a quantum channel, was introduced by Ohya; it is called the quantum mutual entropy. It can be generalized to a general quantum system described by a C*-algebra.
The quantum mutual entropy clearly contains the semi-quantum mutual entropy, as shown below. We mainly discuss the quantum mutual entropy in the usual quantum system described by a Hilbert space; its generalization to C*-systems will be explained briefly, for future use (e.g., relativistic quantum information), in Section 9. Note that the general mutual entropy contains all other cases, including the measure-theoretic definition of Gelfand and Yaglom.
Let be a Hilbert space for an input space, and an output space is described by another Hilbert space , often one takes . A channel from the input system to the output system is a mapping from to .
An input state ρ is sent to the output system through a channel, so that the output state is the image of ρ under the channel. It is then important to investigate how much information of ρ is correctly sent to the output state. This amount of information transmitted from input to output is expressed by the mutual entropy (or mutual information).
The quantum mutual entropy was introduced on the basis of the von Neumann entropy for purely quantum communication processes. The mutual entropy depends on an input state ρ and a channel Λ*, so it is denoted by I(ρ; Λ*), and it should satisfy the following conditions:
(1) The quantum mutual entropy is well-matched to the von Neumann entropy; that is, if the channel is trivial, i.e., the identity map, then the mutual entropy equals the von Neumann entropy: I(ρ; id) = S(ρ).
(2) When the system is classical, the quantum mutual entropy reduces to the classical one.
(3) Shannon-type fundamental inequalities hold.
In order to define the quantum mutual entropy, we need the quantum relative entropy and the joint state (called the "compound state" in the sequel) describing the correlation between an input state ρ and the output state through a channel Λ*. A finite partition of Ω in the classical case corresponds to an orthogonal decomposition {E_k} of the identity operator I of the Hilbert space in the quantum case, because the set of all orthogonal projections is considered to form an event system in a quantum system. It is known that the corresponding equality holds, with the supremum attained when {E_k} is a Schatten decomposition of ρ. Therefore the Schatten decomposition is used to define the compound state and the quantum mutual entropy.
The compound state (corresponding to the joint state in classical systems) of ρ and Λ*ρ was introduced by Ohya in 1983. It is given by

σ_E = Σ_k λ_k E_k ⊗ Λ*E_k,

where E stands for a Schatten decomposition {λ_k, E_k} of ρ, so that the compound state depends on how we decompose the state ρ into basic states (elementary events), in other words, on how we view the input state. It is easy to see that tr σ_E = 1. Applying the relative entropy to the two compound states σ_E and σ_0 = ρ ⊗ Λ*ρ (the former includes a certain correlation of input and output and the latter does not), we can define Ohya's quantum mutual entropy (information) as

I(ρ; Λ*) = sup_E S(σ_E, σ_0),

where the supremum is taken over all Schatten decompositions of ρ, because this decomposition is not unique unless every eigenvalue of ρ is non-degenerate. Some computations reduce it to the following form for a linear channel:

I(ρ; Λ*) = sup { Σ_k λ_k S(Λ*E_k, Λ*ρ) ; {λ_k, E_k} a Schatten decomposition of ρ }.
It is easy to see that the quantum mutual entropy satisfies all conditions (1)∼(3) mentioned above.
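For a finite dimensional channel the expression above can be evaluated directly. The sketch below is our own illustration (the depolarizing-type channel is just a convenient example, not one singled out by the paper); it computes Σ_k λ_k S(Λ*E_k, Λ*ρ) from the Schatten decomposition of ρ, assuming the eigenvalues are non-degenerate so that the decomposition is unique.

```python
import numpy as np

def matrix_log(A):
    w, V = np.linalg.eigh(A)
    w = np.clip(w, 1e-12, None)                  # avoid log 0
    return (V * np.log(w)) @ V.conj().T

def relative_entropy(rho, sigma):
    """Umegaki relative entropy S(rho, sigma)."""
    return float(np.trace(rho @ (matrix_log(rho) - matrix_log(sigma))).real)

def depolarizing(rho, p):
    """A simple channel: keep rho with probability 1-p, replace it by the tracial state with p."""
    d = rho.shape[0]
    return (1.0 - p) * rho + p * np.eye(d) / d

def ohya_mutual_entropy(rho, channel):
    """I(rho; channel) = sum_k lambda_k S(channel(E_k), channel(rho)),
    using the (assumed non-degenerate) Schatten decomposition of rho."""
    lam, V = np.linalg.eigh(rho)
    out_rho = channel(rho)
    total = 0.0
    for k in range(len(lam)):
        if lam[k] > 1e-12:
            E_k = np.outer(V[:, k], V[:, k].conj())       # one-dimensional projection
            total += lam[k] * relative_entropy(channel(E_k), out_rho)
    return total

rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)
chan = lambda r: depolarizing(r, 0.3)
print(ohya_mutual_entropy(rho, chan))
```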
When the input system is classical, an input state ρ is given by a probability distribution or a probability measure, and in either case the Schatten decomposition of ρ is unique; for the case of a probability distribution {λ_k} it is ρ = Σ_k λ_k δ_k, where δ_k is the delta measure concentrated on the k-th point. Therefore, for any channel Λ*, the mutual entropy becomes

I(ρ; Λ*) = Σ_k λ_k S(Λ*δ_k, Λ*ρ),

which equals the following usual expression when one of the two terms is finite for an infinite dimensional Hilbert space:

I(ρ; Λ*) = S(Λ*ρ) − Σ_k λ_k S(Λ*δ_k).

This expression was adopted by Levitin and Holevo (LH for short in the sequel); it is the one associated with a classical-quantum channel. Thus Ohya's quantum mutual entropy (called simply the quantum mutual entropy in the sequel) contains the LH quantum mutual entropy (called the semi-quantum mutual entropy in the sequel) as a special case.
Note that the definition of the quantum mutual entropy may also be written with the supremum taken over the set of all orthogonal finite decompositions of ρ; here two states are called orthogonal (written ρ_i ⊥ ρ_j) when the range of one is orthogonal to the range of the other. We briefly explain this equality in the next theorem.
Theorem 14 The two expressions of the quantum mutual entropy above coincide.
Moreover, the following fundamental inequality follows from the monotonicity of the relative entropy:

0 ≤ I(ρ; Λ*) ≤ min{ S(ρ), S(Λ*ρ) }.

Theorem 15 (Shannon's inequality) For two channels Λ*₁ and Λ*₂ one has the quantum data processing inequality; that is,

S(ρ) ≥ I(ρ; Λ*₁) ≥ I(ρ; Λ*₂ ∘ Λ*₁).

The second inequality follows from the monotonicity of the relative entropy. This is analogous to the classical data processing inequality for a Markov process X → Y → Z,

I(X; Y) ≥ I(X; Z),

where I(X; Y) is the mutual information between the random variables X and Y.
The mutual entropy is a measure not only of information transmission but also of state change, so this quantity can be applied to several topics in quantum dynamics. It can also be applied to some topics in quantum computers or computation, to assess the ability of information transmission.
8. Some Applications to Statistical Physics
8.1. Ergodic theorem
We have an ergodic type theorem with respect to quantum mutual entropy.
Theorem 16 Let a state φ be given by . - (1)
If a channel is deterministic, then .
- (2)
If a channel is chaotic, then .
- (3)
If ρ is a faithful state and every eigenvalue of ρ is nondegenerate, then .
(Remark: Here
ρ is said to be faithful if tr
implies
)
8.2. CCR and channel
We discuss the attenuation channel in the context of the Weyl algebra.
Let
T be a symplectic transformation of
to
,
i.e.,
. Then there is a homomorphism
such that
We may regard the Weyl algebra
) as
), and given a state
ψ on CCR(
), a channeling transformation arises as
where the input state
ω is an arbitrary state of
and
(this
ψ is a noise state above). To see a concrete example discussed in [
13], we choose
,
and
If
holds for the numbers
a and
b, this
F is an isometry, and a symplectic transformation, and we arrive at the channeling transformation
In order to have an alternative description of Λ in terms of density operators acting of
we introduce the linear operator
defined by
we have
hence
Theorem 17 Let ω be a state of CCR(H) which has a density D in the Fock representation. Then the output state of the attenuation channel has density tr in the Fock representation.
The lemma says that is really the same as the noisy channel with
We note that
the dual of
is a so-called quasifree completely positive mapping of
given as
Theorem 18 If ψ is a regular state of , that is, if is a continuous function on for every , then pointwise, where φ is a Fock state. It is worth noting that the singular state is an invariant state of CCR(H). On the other hand, the proposition applies to states possessing a density operator in the Fock representation. Therefore, we have
Corollary 19 regarded as a channel of has a unique invariant state, the Fock state, and correspondingly is ergodic.
is not only ergodic but it is completely dissipative in the sense that
may happen only in the trivial case when
A is a multiple of the identity, which was discussed by M. Fannes and A. Verbeure. In fact,
where
is given by (
1) and (
3) and
is a quasi-free state.
8.3. Irreversible processes
Irreversible phenomena can be treated by several different methods. One of them is based on the entropy change. However, it is difficult to explain the entropy change starting from reversible equations of motion such as the Schrödinger equation or the Liouville equation. Therefore we need some modifications to explain the irreversibility of nature:
(i) QM + "α", where α represents an effect of noise, coarse graining, etc.
Here we discuss some trials concerning (i) and (iii) above, essentially done in [
16]. Let
ρ be a state and
be some channels. Then we ask
(1) ?
(2) ?
(3) Consider the change of . ( should be decreasing!)
8.4. Entropy change in linear response dynamics
We first discuss the entropy change in the linear response dynamics. Let H be a lower bounded Hamiltonian and take
For a KMS state
φ given by a density operator
ρ and a perturbation
(
), the perturbed time evolution is defined by a Dyson series:
and the perturbed state is
where
The linear response time evolution and the linear response perturbed state are given by
This linear response perturbed state
is written as
where
The linear response time dependent state is
Put
The change of the linear response entropy
is shown in the following theorem [
16].
Theorem 20 If goes to as and , then as .
Remark: Even when
we always have
Concerning the entropy change in exact dynamics, we have the following general result [
16]:
Theorem 21 Let : be a channel satisfying Then
8.5. Time development of mutual entropy
Frigerio studied the approach to stationarity of an open system in [
68]. Let an input
and an output
be the same von Neumann algebra and
be a dynamical semigroup (
i.e.,
is a weak* continuous semigroup and
is a normal channel) on
having at least one faithful normal stationary state
θ (
i.e.,
for any
). For this
, put
and
Then
is a von Neumann subalgebra of
. Frigerio proved the following theorem [
68].
Theorem 22 (1) There exists a conditional expectation from to .
(2) When , for any normal states ω, converges to a stationary state in the w*- sense.
From the above theorem, we obtain [
16].
Theorem 23 For a normal channel and a normal state φ, if a measure , is orthogonal and if holds and A is type I, then decreases in time and approaches to as .
This theorem tells us that the mutual entropy decreases in time if the system is dissipative, so that the mutual entropy can serve as a measure of irreversibility.
9. Entropies for General Quantum States
We briefly discuss some basic facts of the entropy theory for general quantum systems, which may be needed to treat communication (computation) processes from a general standpoint, that is, independently of whether the system is classical or quantum.
Let be a C*-system. The entropy (uncertainty) of a state seen from a reference system, a weak*-compact convex subset of the whole state space on the C*-algebra, was introduced by Ohya [16]. This entropy contains von Neumann's entropy and the classical entropy as special cases.
Every state
has a maximal measure
μ pseudosupported on
(extreme points in
) such that
The measure
μ giving the above decomposition is not unique unless
is a Choquet simplex (
i.e., for the set
, define an order such that
iff
,
is a Choquet simplex if
is a lattice for this order), so that we denote the set of all such measures by
. Take
where
is the delta measure concentrated on
. Put
for a measure
.
Definition 24 The entropy of a general state w.r.t. is defined by When
is the total space
we simply denote
by
This entropy (mixing -entropy) of a general state φ satisfies the following properties.
Theorem 25 When and (i.e.
, for any ) with a unitary operator , for any state φ given by with a density operator ρ, the following facts hold: - (1)
.
- (2)
If φ is an α-invariant faithful state and every eigenvalue of ρ is non-degenerate, then where is the set of all α-invariant faithful states.
- (3)
If , then , where K is the set of all KMS states.
Theorem 26 For any , we have - (1)
.
- (2)
.
This (mixing) entropy gives a measure of the uncertainty observed from the reference system, so it has the following merits: even if the total entropy is infinite, it is finite for some reference systems, hence it explains a sort of symmetry breaking; properties similar to those of the von Neumann entropy hold for it; and it can be applied to characterize normal states and quantum Markov chains in von Neumann algebras.
The relative entropy for two general states φ and ψ was introduced by Araki and by Uhlmann, and the relation between their definitions was considered by Donald, Hiai and others.
<Araki's relative entropy> [8,9] Let be a σ-finite von Neumann algebra acting on a Hilbert space
and
be normal states on
given by
and
with
(a positive natural cone)
. The operator
is defined by
on the domain
, where
is the projection from
to
, the
-support of
y. Using this
, the relative modular operator
is defined as
, whose spectral decomposition is denoted by
(
is the closure of
). Then the Araki relative entropy is given by
Definition 27 where means that implies for .
<Uhlmann's relative entropy> [10] Let
be a complex linear space and
be two seminorms on
. Moreover, let
be the set of all positive hermitian forms
α on
satisfying
for all
. Then the quadratical mean
of
p and
q is defined by
There exists a family of seminorms
of
for each
satisfying the following conditions:
- (1)
For any , is continuous in t,
- (2)
,
- (3)
,
- (4)
.
This seminorm
is denoted by
and is called the quadratical interpolation from
p to
q. It is shown that for any positive hermitian forms
, there exists a unique function
of
with values in the set
such that
is the quadratical interpolation from
to
. The relative entropy functional
of
α and
β is defined as
for
. Let
be a *-algebra
and
be positive linear functionals on
defining two hermitian forms
such as
and
.
Definition 28 The relative entropy of φ and ψ is defined by
<Ohya's mutual entropy> [16] Next we discuss the mutual entropy in C*-systems. For any
and a channel
, define the compound states by
and
The first compound state generalizes the joint probability in classical systems and it exhibits the correlation between the initial state φ and the final state .
Definition 29 The mutual entropy w.r.t. and μ is and the mutual entropy w.r.t.
is defined as
where
The following fundamental inequality is satisfied for almost all physical cases.
The main properties of the relative entropy and the mutual entropy are shown in the following theorems.
Theorem 30 - (1)
Positivity : .
- (2)
Joint Convexity : for any .
- (3)
Additivity : .
- (4)
Lower Semicontinuity : If and , then . Moreover, if there exists a positive number λ satisfying , then .
- (5)
Monotonicity : For a channel from to , - (6)
Lower Bound :
Remark 31 This theorem is a generalization of Theorem 3.
<Connes-Narnhofer-Thirring Entropy>
Before closing this section, we mention the dynamical entropy introduced by Connes, Narnhofer and Thirring [
50].
The CNT entropy
of a C*-subalgebra
is defined by
where the supremum is taken over all finite decompositions
of
φ and
is the restriction of
φ to
. This entropy is the mutual entropy when a channel is the restriction to subalgebra and the decomposition is orthogonal. There are some relations between the mixing entropy
and the CNT entropy [
26].
Theorem 32. - (1)
For any state φ on a unital C*-algebra , - (2)
Let with a certain group G be a W*-dynamical system and φ be a G-invariant normal state of , then where is the fixed-point algebra of w.r.t. α. - (3)
Let be the C*-algebra of all compact operators on a Hilbert space , G be a group, α be a *-automorphic action of G, and φ be given by a G-invariant density operator. Then - (4)
There exists a model such that
10. Entropy Exchange and Coherent Information
First we define the entropy exchange [43,70,71,72,73]. If a quantum operation Λ* is represented by operators {A_j} as

Λ*ρ = Σ_j A_j ρ A_j*,

then the entropy exchange of the quantum operation Λ* with input state ρ is defined to be

S_e(ρ, Λ*) = −tr W log W,

where the matrix W has elements

W_ij = tr(A_i ρ A_j*) / tr(Λ*ρ).

Remark that if Σ_j A_j* A_j = I holds, then the quantum operation Λ* is a channel (trace preserving).
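A small sketch (ours) of the entropy exchange: given Kraus operators {A_j} of a trace-preserving operation, build the matrix W and take its von Neumann entropy. The phase-damping Kraus operators below are only a convenient example, not ones taken from the paper.

```python
import numpy as np

def entropy(M):
    """Von Neumann entropy of a positive matrix with unit trace."""
    w = np.linalg.eigvalsh(M)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log(w)))

def entropy_exchange(rho, kraus):
    """S_e(rho, Lambda) = S(W), with W_ij = tr(A_i rho A_j^dagger)."""
    W = np.array([[np.trace(Ai @ rho @ Aj.conj().T) for Aj in kraus] for Ai in kraus])
    return entropy(W)

# Example operation: qubit phase damping with parameter p (an assumed illustration).
p = 0.3
kraus = [np.sqrt(1 - p) * np.eye(2, dtype=complex),
         np.sqrt(p) * np.diag([1.0, 0.0]).astype(complex),
         np.sqrt(p) * np.diag([0.0, 1.0]).astype(complex)]
# Completeness: sum_j A_j^dagger A_j = I, so this operation is a channel.
print(np.allclose(sum(A.conj().T @ A for A in kraus), np.eye(2)))

rho = np.array([[0.6, 0.25], [0.25, 0.4]], dtype=complex)
rho_out = sum(A @ rho @ A.conj().T for A in kraus)
print(entropy_exchange(rho, kraus), entropy(rho_out))
```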
Definition 33 The coherent information is defined by

I_c(ρ, Λ*) = S(Λ*ρ) − S_e(ρ, Λ*).

Let ρ be a quantum state and let Λ*₁ and Λ*₂ be trace-preserving quantum operations. Then a data-processing-type relation holds, which resembles a property of the quantum mutual entropy. Another quantity is defined from this coherent information together with the von Neumann entropy S(ρ), namely

S(ρ) + I_c(ρ, Λ*) = S(ρ) + S(Λ*ρ) − S_e(ρ, Λ*);

we call this mutual-type information the coherent mutual entropy here. However, this coherent information cannot be considered as a candidate for the mutual entropy, due to a theorem in the next section.
11. Comparison of Various Quantum Mutual-Type Entropies
There exist several different kinds of information à la mutual entropy. We compare these mutual-type entropies [60,74].
Let
be a CONS in the input Hilbert space
, a quantum channel
is given by
where
is a one-dimensional projection satisfying
Then one has the following theorem:
Theorem 34 When is a projection-valued measure and dim(ran ) = 1, for an arbitrary state ρ we have (1) , (2) , (3) .
Proof. For any density operator
and the channel
given above, one has
tr
so that one has
Then the entropy exchange of
ρ with respect to the quantum channel
is
Since
the coherent information of
ρ with respect to the quantum channel
is given by
for any
The Lindblad-Nielsen entropy is defined by
for any
. The quantum mutual entropy becomes
where the sup is taken over all Schatten decompositions
so we obtain
where
and
This means that
takes various values depending on the input state
for instance, ■
We can further prove that the coherent information vanishes for a general class of channels.
Theorem 35 Let a CONS be given in the input Hilbert space and a sequence of density operators in the output Hilbert space. Consider the channel given by , where ρ is any state in the input Hilbert space (one can check that it is a trace-preserving CP map). Then the coherent information vanishes: for any state .
Remark 36 A channel of this form can be considered as a classical-quantum channel iff the classical probability distribution is given a priori.
For the attenuation channel
, one can obtain the following theorems [
74,
75]:
Theorem 37 For any state and the attenuation channel with , one has - 1.
(Ohya mutual entropy),
- 2.
(coherent entropy),
- 3.
(Lindblad-Nielsen entropy).
Theorem 38 For the attenuation channel and the input state, we have - 1.
(Ohya mutual entropy),
- 2.
(coherent entropy),
- 3.
(Lindblad-Nielsen entropy).
The above theorem shows that the coherent entropy takes negative values for and that the Lindblad-Nielsen entropy is greater than the von Neumann entropy of the input state ρ for .
From these theorems we see that only the Ohya mutual entropy satisfies the inequality that holds in classical systems, so that the Ohya mutual entropy is the most suitable candidate for a quantum extension of the classical mutual entropy.
12. Quantum Capacity and Coding
We discuss the following topics in quantum information: (1) the channel capacity for quantum communication processes, obtained by applying the quantum mutual entropy, and (2) formulations of quantum analogues of McMillan's theorem.
As discussed above, it is important to check the ability or efficiency of a channel. It is the channel capacity that describes this ability mathematically. Here we discuss two types of channel capacity, namely the capacity of a quantum channel and that of a classical (classical-quantum-classical) channel.
12.1. Capacity of quantum channel
The capacity of a quantum channel is the ability of information transmission of the channel itself, so it does not depend on how a message, treated as a classical object, is coded.
As discussed in the Introduction, the main theme of quantum information is to study the information carried by a quantum state and its change associated with a change of the quantum state due to the effect of a quantum channel, which describes a certain dynamics, in a generalized sense, of a quantum system. So the essential point of quantum communication through a quantum channel is the change of quantum states by the quantum channel, which should first be considered free from any coding of messages. A message is a classical object, so information transmission starting from messages and their quantum codings is semi-quantum and is discussed in the next subsection. This subsection treats the purely quantum case, in which the (pure) quantum capacity is discussed as a direct extension of the classical (Shannon) capacity.
Before starting the mathematical discussion, we explain a bit more what we mean by "pure quantum" transmission capacity. We have to start from an arbitrary quantum state and a channel, and then compute the supremum of the mutual entropy to define the "pure" quantum capacity. This point is often confused: for example, one starts from the coding of a message, computes the supremum of the mutual entropy, and calls that supremum the capacity of a quantum channel, although it is not purely quantum but rather a classical capacity through a quantum channel.
Even when the coding is a quantum coding and the coded message is sent to a receiver through a quantum channel, if one starts from a classical state, i.e., a probability distribution of messages, then the resulting capacity is not the capacity of the quantum channel itself. In that case the usual Shannon theory applies, because the conditional distribution can easily be computed in the usual (classical) way. The supremum obtained is the capacity of a classical-quantum-classical channel, and it belongs to the second category discussed in the next subsection.
The capacity of a quantum channel Λ* is defined as follows. Let S₀ be the set of all states prepared for the expression of information. Then the quantum capacity of the channel Λ* with respect to S₀ is defined by

C(Λ*; S₀) = sup{ I(ρ; Λ*) ; ρ ∈ S₀ },

where I(ρ; Λ*) is the mutual entropy given in Section 7. When S₀ is the whole state space, the capacity is denoted by C(Λ*) for simplicity. The capacity C(Λ*) is the largest amount of information that can be sent through the channel Λ*. We have
Remark 40 We also considered the pseudo-quantum capacity, defined in [76] with the pseudo-mutual entropy, in which the supremum is taken over all finite decompositions instead of all orthogonal pure decompositions. However, the pseudo-mutual entropy is not well-matched to the conditions explained above, and it is difficult to compute numerically. It is easy to see that the pseudo-quantum capacity dominates the quantum capacity. It is worth noting that in order to discuss the details of the transmission process for a sequence of n messages, we have to consider a channel on the n-tuple space and the average mutual entropy (transmission rate) per message.
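A crude numerical sketch (ours) of the definition C(Λ*) = sup_ρ I(ρ; Λ*): for a qubit depolarizing-type channel (an arbitrary example) we scan a grid of input states on the Bloch ball and take the largest Ohya mutual entropy. This only illustrates the definition; it is not an efficient or exact capacity computation, and for degenerate ρ it uses one admissible Schatten decomposition.

```python
import numpy as np

def matrix_log(A):
    w, V = np.linalg.eigh(A)
    return (V * np.log(np.clip(w, 1e-12, None))) @ V.conj().T

def rel_ent(rho, sigma):
    return float(np.trace(rho @ (matrix_log(rho) - matrix_log(sigma))).real)

def mutual_entropy(rho, chan):
    """Ohya mutual entropy via the eigen (Schatten) decomposition of rho."""
    lam, V = np.linalg.eigh(rho)
    out = chan(rho)
    return sum(lam[k] * rel_ent(chan(np.outer(V[:, k], V[:, k].conj())), out)
               for k in range(len(lam)) if lam[k] > 1e-12)

def bloch_state(r, theta, phi):
    """Qubit density matrix with Bloch vector of length r and angles (theta, phi)."""
    n = r * np.array([np.sin(theta) * np.cos(phi), np.sin(theta) * np.sin(phi), np.cos(theta)])
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.array([[1, 0], [0, -1]], dtype=complex)
    return 0.5 * (np.eye(2) + n[0] * sx + n[1] * sy + n[2] * sz)

p = 0.25
chan = lambda rho: (1 - p) * rho + p * np.eye(2) / 2     # depolarizing channel (example)

best = 0.0
for r in np.linspace(0.0, 1.0, 11):
    for theta in np.linspace(0.0, np.pi, 7):
        for phi in np.linspace(0.0, 2 * np.pi, 7):
            best = max(best, mutual_entropy(bloch_state(r, theta, phi), chan))
print("approximate quantum capacity:", best)
```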
12.2. Capacity of classical-quantum-classical channel
The capacity of the C-Q-C channel is the capacity of the information transmission process starting from the coding of messages, so it can be considered as the capacity including a coding (and a decoding). The first channel sends a classical state to a quantum one, and the second channel sends a quantum state to a classical one. Note that these channels can be considered as the dual maps of the coding and the measurement, respectively.
The capacity of the C-Q-C channel
is
where
is the set of all probability distributions prepared for input (a-priori) states (distributions or probability measures, so that classical states). Moreover the capacity for coding free is found by taking the supremum of the mutual entropy over all probability distributions and all codings
:
The last capacity is for both coding and decoding free and it is given by
These capacities
do not measure the ability of the quantum channel
itself, but measure the ability of
through the coding and decoding.
The above three capacities
satisfy the following inequalities
Here
is the Shannon entropy:
for the initial probability distribution
of the message.
12.3. Bound of mutual entropy and capacity
Here we discuss the bound of mutual entropy and capacity. The discussion of this subsection is based on the papers [
30,
31,
41,
42,
77].
To each input symbol there corresponds a state of the quantum communication system, which functions as the codeword of that symbol. The coded state is a convex combination of these states whose coefficients are the corresponding probabilities; each coefficient is the probability that the corresponding letter is transmitted over the channel. To each output symbol there corresponds a non-negative observable, that is, a self-adjoint operator on the output Hilbert space, and the family of these operators sums to the identity (such a family is called a POVM). In terms of the quantum states, the transition probabilities are given by traces, and so is the joint probability that a given letter was sent and another one is read. On the basis of this joint probability distribution the classical mutual information is given.
The next theorem provides a fundamental bound for the mutual information in terms of the quantum von Neumann entropy, which was proved by Holevo [
21] in 1973. Ohya introduced in 1983 the quantum mutual entropy by means of the relative entropy as discussed above.
Theorem 41 With the above notation, the classical mutual information I is bounded as

I ≤ S(Σ_i p_i ρ_i) − Σ_i p_i S(ρ_i).

Holevo's upper bound is thus expressed by the right-hand side.
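A short numerical illustration (ours): the classical mutual information extracted by any measurement is at most the Holevo quantity S(Σ_i p_i ρ_i) − Σ_i p_i S(ρ_i). The code compares the two for a pair of non-orthogonal qubit codewords measured in the computational basis (an arbitrary choice of POVM).

```python
import numpy as np

def entropy(rho):
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-np.sum(w * np.log(w)))

# Codewords rho_i with probabilities p_i (two non-orthogonal pure qubit states).
v0 = np.array([1.0, 0.0], dtype=complex)
v1 = np.array([np.cos(0.6), np.sin(0.6)], dtype=complex)
states = [np.outer(v, v.conj()) for v in (v0, v1)]
p = np.array([0.5, 0.5])
rho_bar = sum(pi * si for pi, si in zip(p, states))

holevo = entropy(rho_bar) - sum(pi * entropy(si) for pi, si in zip(p, states))

# Classical mutual information for a measurement in the computational basis (a POVM).
povm = [np.diag([1.0, 0.0]).astype(complex), np.diag([0.0, 1.0]).astype(complex)]
joint = np.array([[pi * np.trace(si @ F).real for F in povm]
                  for pi, si in zip(p, states)])           # P(i sent, j read)
px, py = joint.sum(axis=1), joint.sum(axis=0)
mask = joint > 1e-12
classical_I = float(np.sum(joint[mask] * np.log(joint[mask] / np.outer(px, py)[mask])))

print(classical_I <= holevo + 1e-12, classical_I, holevo)
```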
For the general quantum case, we have the following inequality according to Theorem 9.
Theorem 42 When the Schatten decomposition (i.e., the one-dimensional spectral decomposition) is unique, the inequality holds for any channel.
Going back to the general discussion, an input state
ρ is the probability distribution
of messages and its Schatten decomposition is unique as
with delta measures
, so the mutual entropy is written as
If the coding
is a quantum coding, then
is expressed by a quantum state. Denote the coded quantum state by
as above and put
Then the above mutual entropy in a
channel
is written as
This is the expression of the mutual entropy of the whole information transmission process starting from a coding of classical messages.
Remark that if
is finite, then (
18) becomes
Further, if
ρ is a probability measure having a density function
; that is,
where
A is an interval in
, and each
λ corresponds to a quantum coded state
then
and
One can prove that this is less than
This upper bound is a special one of the following inequality
which comes from the monotonicity of the relative entropy and gives the proof of Theorem 42 above
. We can use the Bogoliubov inequality, where A and B are two positive Hermitian operators, together with the monotonicity of the mutual entropy, to obtain the following bound on the mutual entropy [41].
Theorem 43 For a probability distribution and a quantum coded state , , , one has the following inequality for any quantum channel decomposed as such that , In the case that the channel
is the identity,
the above inequality reduces to the bound of Theorem 41:
where
Note that the upper and lower expressions are the quantum mutual entropies for special channels, as discussed above, and that the lower bound equals the classical mutual information, which depends on the POVM. Using the above upper and lower bounds of the mutual entropy, we can compute the corresponding bounds on the capacity in many different cases.
13. Computation of Capacity
Shannon’s communication theory is largely of an asymptotic character; the message length
N-fold tensor product of the input and output Hilbert spaces
and
,
Note that
A channel
sends density operators acting on
into those acting on
. In particular, we take a
memoryless channel which is the tensor product of the same
single site channels:
(
N-fold). In this setting we compute the quantum capacity and the classical-quantum-classical capacity, denoted by
and
below.
A
pseudo-quantum code (of order
N) is a probability distribution on
with finite support in the set of product states. So
is
a pseudo-quantum code if is a probability vector and are product states of. This code is nothing but a quantum code for a classical input (so a classical-quantum channel) such that
, as discussed in the previous chapter. Each quantum state
is sent over the quantum mechanical media (e.g., optical fiber) and yields the output quantum states
. The performance of coding and transmission is measured by the quantum mutual entropy
We regard
φ as the quantum state of the
n-component quantum system during the information transmission. Taking the supremum over certain classes of pseudo-quantum codes, we obtain various capacities of the channel. The supremum is over product states when we consider memoryless channels, so the capacity is
Next we consider a subclass of pseudo-quantum codes. A
quantum code is defined by the additional requirement that is a set of pairwise orthogonal pure states. This code is purely quantum; namely, we start from a quantum state
φ and take
orthogonal extremal decompositions, which are not unique. Here the coding consists in choosing such an orthogonal extremal decomposition. The quantum mutual entropy is
where the supremum is over all
orthogonal extremal decompositions as defined in
Section 7. Then we arrive at the capacity
It follows from the definition that
holds for every channel.
Proposition 44 For a memoryless channel the sequences and are subadditive.
Therefore the following limits exist and they coincide with the infimum.
(For multiple channels with some memory effect, one may take the limsup in (
20) to get a good concept of capacity per single use.)
Example 45 Let be a channel on the density matrices such that Consider the input density matrix For the orthogonal extremal decomposition is unique, in fact and we have However, . Since we conclude that
13.1. Divergence center
In order to estimate the quantum mutual entropy , we introduce the concept of divergence center. Let be a family of states and a constant .
Definition 46 We say that the state ω is a divergence center for a family of states with radius if In the following discussion of the geometry of relative entropy (or divergence, as it is called in information theory), the idea of the divergence center can be recognized very well.
Lemma 47 Let be a quantum code for the channel and ω a divergence center with radius for . Then Definition 48 Let be a family of states. We say that the state ω is an exact divergence center with radius r if and ω is a minimizer for the right hand side. When
r is finite, there exists a minimizer, because the relative entropy functional is lower semicontinuous with compact level sets (cf. Proposition 5.27 in [17]).
Lemma 49 Let and ω be states of B such that the Hilbert space is finite dimensional and set . If , are finite and then Lemma 50 Let be a finite set of states of such that the Hilbert space is finite dimensional. Then the exact divergence center is unique and it is in the convex hull on the states .
Theorem 51 Let be a channel with finite dimensional . Then the capacity is the divergence radius of the range of .
13.2. Comparison of capacities
Up to now our discussion has concerned the capacities of coding and transmission, which are bounds for the performance of quantum coding and quantum transmission. After a measurement is performed, the quantum channel becomes classical and Shannon’s theory is applied. The
total capacity (or
classical-quantum-classical capacity) of a quantum channel
is
where the supremum is taken over both all pseudo-quantum codes
and all measurements
. Due to the monotonicity of the mutual entropy
and similarly
holds for the capacities per single use.
Example 52 Any density operator has the following standard representation where are the Pauli matrices and with . For a positive semi-definite matrix A the application gives a channel when . Let us compute the capacities of . Since a unitary conjugation obviously does not change the capacity, we may assume that A is diagonal with eigenvalues . The range of is visualized as an ellipsoid with (Euclidean) diameter . It is not difficult to see that the tracial state τ is the exact divergence center of the segment connecting the states , and hence τ must be the divergence center of the whole range. The divergence radius is This gives the capacity according to Theorem 51. The inequality (19) tells us that the capacity cannot exceed this value. On the other hand, and we have . The relations among
,
and
form an important problem and are worthy of study. For a noiseless channel
was obtained in [
78], where
n is the dimension of the output Hilbert space (actually identical to the input one). Since the tracial state is the exact divergence center of all density matrices, we have
and also
.
We expect that for "truly quantum mechanical channels" but must hold for a large class of memoryless channels.
One can obtain the following results for the attenuation channel, which was discussed in Section 6.
Lemma 53 Let be the attenuation channel. Then , where the supremum is over all pseudo-quantum codes using n coherent states. The next theorem follows directly from this lemma.
Theorem 54 The capacity of the attenuation channel is infinite.
Since the argument of the proof of the above lemma works for any quasi-free channel, we can draw the same conclusion in that more general case. Another remark concerns the classical capacity. Since the states used in the proof of the lemma commute in the limit, the classical capacity is infinite as well; this also follows from the proof of the next theorem.
Theorem 55 The capacity of the attenuation channel is infinite.
Let us make some comments on the previous results. The theorems mean that an arbitrarily large amount of information can go through the attenuation channel; however, the theorems do not say anything about the price for it. The expectation value of the number of particles needed in the pseudo-quantum code of Lemma 53 tends to infinity; indeed, it increases rapidly with n. (Above, N denotes the number operator.) Hence the right question to ask is the capacity of the attenuation channel when some energy constraint is imposed.
To be more precise, we have posed a bound on the average energy; a different constraint would also be possible. Using the relation between the dual operator Λ of the channel and the number operator N, the constrained capacity is reduced to the maximization of the von Neumann entropy under a bound on the mean energy. The solution of this problem is the same as in the corresponding classical maximization, and the well-known maximizer is a so-called Gibbs state; therefore the constrained capacity is given by the entropy of this Gibbs state. This value can be realized as a classical capacity if the number states can be output states of the attenuation channel.
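As a hedged numerical illustration of the Gibbs-state maximizer just mentioned (the energy bound E, the Fock-space cutoff and the natural-log convention are choices of this sketch, not values taken from the text), one can check that among number-diagonal states with mean photon number E the geometric (Gibbs) distribution maximizes the von Neumann entropy, with the closed-form value (E+1)log(E+1) − E log E:

```python
import numpy as np

# Among states diagonal in the number basis with mean photon number E, the
# Gibbs (geometric) distribution p_n = (1-q) q^n with q = E/(E+1) maximizes
# the entropy; mean-preserving mixtures with other distributions never beat it.
# E and the truncation dimension are choices of this sketch.

E, dim = 2.0, 200

def entropy(p):
    p = p[p > 1e-300]
    return float(-np.sum(p * np.log(p)))

q = E / (E + 1.0)
gibbs = (1 - q) * q ** np.arange(dim)             # geometric distribution
closed_form = (E + 1) * np.log(E + 1) - E * np.log(E)
print("entropy of the Gibbs state:", entropy(gibbs))
print("(E+1)log(E+1) - E log E   :", closed_form)

# trial distributions with the same mean E, built by mixing the Gibbs
# distribution with a two-point distribution supported on {0, k}
for k in range(2, 12):
    other = np.zeros(dim)
    other[k] = E / k
    other[0] = 1.0 - E / k
    trial = 0.5 * gibbs + 0.5 * other             # still has mean E
    print(f"k={k:2d}: trial entropy {entropy(trial):.4f} <= {entropy(gibbs):.4f}")
```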
13.3. Numerical computation of quantum capacity
Let us consider the quantum capacity of the attenuation channel for a set of density operators consisting of two pure states under an energy constraint [31,79].
Let two subsets of density operators, each consisting of two pure states, be given. The quantum capacities of the attenuation channel with respect to these two subsets are computed under an energy constraint, for any admissible value of the constraint. Since the Schatten decomposition is unique for the above two state subsets, using a suitable notation we obtain the following result [31].
Theorem 56 (1) For the first subset, the quantum mutual entropy can be calculated rigorously. (2) For the second subset, the quantum mutual entropy can likewise be computed precisely. (3) For any admissible energy constraint, we have an inequality between the two quantum capacities.
Note that the two subsets represent the state subspaces generated by means of the modulations PSK (Phase-Shift-Keying) and OOK (On-Off-Keying), respectively [75].
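As a hedged numerical sketch related to Theorem 56 (an illustration under assumptions of this author, not the constrained optimization of the theorem), one can compare the Ohya mutual entropy of the attenuation channel for an equal-weight PSK-type pair of coherent states {|α⟩, |−α⟩} and an equal-weight OOK-type pair {|0⟩, |α⟩}; the channel is realized as a beam splitter of transmissivity η coupled to the vacuum on a truncated Fock space, and α, η, the truncation dimension and the equal weights are choices of this sketch.

```python
import numpy as np
from math import factorial
from scipy.linalg import expm

D, eta, alpha = 15, 0.6, 1.0                      # cutoff, transmissivity, amplitude

a = np.diag(np.sqrt(np.arange(1, D)), 1)          # annihilation operator
vac = np.zeros(D); vac[0] = 1.0

def coherent(al):
    n = np.arange(D)
    c = np.exp(-abs(al) ** 2 / 2) * al ** n / np.sqrt([factorial(int(k)) for k in n])
    return c / np.linalg.norm(c)                  # renormalize after truncation

# attenuation channel realized as a beam splitter with vacuum in the idle port
theta = np.arccos(np.sqrt(eta))
U = expm(theta * (np.kron(a, a.conj().T) - np.kron(a.conj().T, a)))

def attenuate(rho):
    big = U @ np.kron(rho, np.outer(vac, vac)) @ U.conj().T
    return np.einsum('iaja->ij', big.reshape(D, D, D, D))   # trace out the ancilla

def rel_entropy(rho, sigma):
    p, u = np.linalg.eigh(rho)
    q, v = np.linalg.eigh(sigma)
    w = np.abs(u.conj().T @ v) ** 2               # overlaps |<u_i|v_j>|^2
    s = 0.0
    for i, pi in enumerate(p):
        if pi > 1e-12:
            s += pi * np.log(pi) - pi * np.dot(w[i], np.log(np.clip(q, 1e-15, None)))
    return s

def ohya_mutual_entropy(kets):
    rho_in = sum(0.5 * np.outer(k, k.conj()) for k in kets)
    lam, vecs = np.linalg.eigh(rho_in)            # (unique) Schatten decomposition
    out = attenuate(rho_in)
    return sum(l * rel_entropy(attenuate(np.outer(v, v.conj())), out)
               for l, v in zip(lam, vecs.T) if l > 1e-12)

psk = [coherent(alpha), coherent(-alpha)]
ook = [coherent(0.0), coherent(alpha)]
print("I(PSK input):", ohya_mutual_entropy(psk))
print("I(OOK input):", ohya_mutual_entropy(ook))
```

The eigenvalues of the binary input mixture are non-degenerate, so its Schatten decomposition is unique, in line with the uniqueness remark preceding the theorem.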
14. Quantum Dynamical Entropy
Classical dynamical entropy is an important tool to analyse the efficiency of information transmission in communication processes. Quantum dynamical entropy was first studied by Connes and Størmer [48] and by Emch [49]. Since then, there have been many attempts to formulate or compute the dynamical entropy for various models [52,53,54,55,56,80]. Here we review four formulations, due to (a) Connes, Narnhofer and Thirring (CNT) [50], (b) Muraki and Ohya (complexity) [27,81], (c) Accardi, Ohya and Watanabe (AOW) [58], and (d) Alicki and Fannes (AF) [51], and we consider the mutual relations among these formulations [80].
A dynamical entropy due to Kossakowski, Ohya and Watanabe (KOW) [59], defined not only for a shift but also for a general completely positive (CP) map, is obtained by generalizing both the entropy defined through a quantum Markov chain and the AF entropy defined by a finite operational partition.
14.1. Formulation by CNT
Let a unital C*-algebra be given, let θ be an automorphism of it, and let φ be a stationary state with respect to θ, i.e., φ ∘ θ = φ. Let a finite dimensional *-subalgebra be fixed.
The CNT entropy [50] for a subalgebra is defined in terms of the restriction of the state φ to that subalgebra and of the relative entropy for *-algebras [7,8,10]. The CNT dynamical entropy with respect to θ and the chosen subalgebra is obtained by a mean along the orbit of the subalgebra under θ, and the dynamical entropy for θ is defined by taking the supremum over the finite dimensional subalgebras.
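Since the displayed formulas are omitted above, it may help to recall the standard textbook form of the CNT quantities, under the assumption that the omitted definitions agree with it; the symbols H_φ, h_φ and M (for the finite dimensional subalgebra) are notation of this sketch:

    H_φ(M) = sup { Σ_i μ_i S(φ_i|_M , φ|_M) : φ = Σ_i μ_i φ_i a finite convex decomposition into states },
    h_φ(θ; M) = lim_{n→∞} (1/n) H_φ(M, θ(M), …, θ^{n−1}(M)),
    h_φ(θ) = sup { h_φ(θ; M) : M a finite dimensional subalgebra },

where H_φ(M₁, …, M_n) denotes the CNT entropy of several subalgebras, whose longer definition is not reproduced here.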
14.2. Formulation by MO
We define three complexities as follows.
Based on the above complexities, we explain the quantum dynamical complexity (QDC) [14].
Let θ be a stationary automorphism of the input algebra and let a second stationary automorphism act on the output algebra, and let Λ (the dual map of the channel) be a covariant CP map between the two algebras (i.e., Λ intertwines the two automorphisms). A finite subalgebra of the input algebra and one of the output algebra are fixed. Moreover, CP unital maps connecting these finite subalgebras with the respective algebras are given, together with the states induced by them. Two compound states, with respect to the chosen decomposition of the input state, are then defined, and, using these compound states, the three transmitted complexities [81] are defined.
When the above data are specialized so that α is a unital CP map from one of the algebras to the other, the mean transmitted complexities are obtained by taking time averages of the corresponding quantities, and similarly for the other two complexities. These quantities have properties similar to those of the CNT entropy [27,81].
14.3. Formulation by AOW
A construction of the dynamical entropy is based on the quantum Markov chain [58].
Let a von Neumann algebra acting on a Hilbert space be given, let φ be a state on it, and consider a finite dimensional matrix algebra. Take the transition expectation of Accardi [36,82] determined by a finite partition of unity. The quantum Markov chain is then defined by the state φ, an automorphism θ of the algebra, and an embedding of the matrix algebra into the infinite tensor product algebra, iterated through the transition expectation.
Suppose that for φ there exists a unique density operator ρ such that φ(A) = tr(ρA) for any A in the algebra. Let us define a state on the n-fold algebra through the quantum Markov chain and denote by ρ_n its density operator. The dynamical entropy through QMC is then defined as the limit of the mean von Neumann entropies of the ρ_n. If the chain satisfies the Markov property, then this expression can be rewritten in a simpler form. The dynamical entropy through QMC with respect to θ and a von Neumann subalgebra of the original algebra is obtained by restricting the construction to that subalgebra.
14.4. Formulation by AF
Let a C*-algebra be given, let θ be an automorphism on it, let φ be a stationary state with respect to θ, and let a unital *-subalgebra be fixed. A set γ = (γ₁, …, γ_k) of elements of this subalgebra is called a finite operational partition of unity of size k if γ satisfies the condition Σ_{i=1}^{k} γ_i* γ_i = 1.
The operation ∘ is defined by taking all products of the elements of two partitions, so that γ ∘ δ is again a finite operational partition of unity for any partitions γ and δ. For any partition γ of size k, a k × k density matrix ρ[γ] is given by the correlation matrix ρ[γ]_{i,j} = φ(γ_j* γ_i). Then the dynamical entropy with respect to the partition γ and the shift θ is defined through the von Neumann entropy of the density matrices associated with the iterated partitions θ^{n−1}(γ) ∘ ⋯ ∘ θ(γ) ∘ γ, and the dynamical entropy is given by taking the supremum over all operational partitions of unity in the chosen subalgebra.
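As a hedged toy computation of the AF construction (assuming the standard Alicki–Fannes conventions recalled above, with the density matrix ρ[γ]_{i,j} = φ(γ_j* γ_i); the qubit state, the rotation automorphism and the projective partition are choices of this sketch):

```python
import numpy as np
from itertools import product

# Toy AF computation: for a partition gamma = (g_1, ..., g_k) with
# sum_i g_i^* g_i = 1, build the density matrix R[gamma]_{ij} = phi(g_j^* g_i)
# for the iterated partition theta^{n-1}(gamma) o ... o theta(gamma) o gamma
# and estimate the entropy rate S(R)/n for small n.

rho = np.diag([0.7, 0.3])                               # the state phi = tr(rho .)
t = 0.7
U = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])                 # theta = Ad(U)
gamma = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]      # projective partition

def theta_power(x, m):
    Um = np.linalg.matrix_power(U, m)
    return Um @ x @ Um.conj().T

def af_density_matrix(n):
    ops = []
    for idx in product(range(len(gamma)), repeat=n):
        x = np.eye(2)
        for m in range(n - 1, -1, -1):                  # theta^{n-1}(g) ... theta^0(g)
            x = x @ theta_power(gamma[idx[m]], m)
        ops.append(x)
    return np.array([[np.trace(rho @ xj.conj().T @ xi) for xj in ops] for xi in ops])

def entropy(R):
    p = np.linalg.eigvalsh(R)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

for n in (1, 2, 3, 4):
    R = af_density_matrix(n)
    print(f"n={n}: tr R = {np.trace(R).real:.6f},  S(R)/n = {entropy(R) / n:.4f}")
```

The trace check confirms that each R is a density matrix, which follows from the partition-of-unity condition.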
14.5. Relation between CNT and MO
In this section we discuss relations among the above four formulations. We also use the mixing entropy in GQS (general quantum system) introduced in [16], which is defined through the set of all finite partitions.
The following theorem [27,81] shows the relation between the formulation by CNT and that by complexity.
Theorem 57 Under the above settings, we have the following relations:
- (1) …;
- (2) …;
- (3) …, for any density operator ρ.
Since there exists a model showing that the two quantities differ, the mixing entropy distinguishes states more sharply than the CNT entropy.
Furthermore, we have the following results [83].
- (1) When the algebras involved are abelian C*-algebras and the map is an embedding, then the corresponding equalities are satisfied for any finite partitions on the underlying probability space (Ω being the spectrum of the algebra, equipped with the probability measure μ).
- (2) When Λ is the restriction to a subalgebra of the original algebra, an analogous relation holds.
We now show the relation between the formulation by complexity and that by QMC. Under the same settings as in Section 3, we define a map from the set of all density operators into the enlarged algebra, a map in the opposite direction, and their composition, in such a way that the required marginal relations hold. From the above theorem we then obtain the corresponding relation between the dynamical entropy by complexity and the dynamical entropy through QMC.
Let the sets of all bounded linear operators on two separable Hilbert spaces be given, and denote the corresponding sets of density operators accordingly. Let Γ be a normal, unital CP linear map; that is, Γ respects the limits of increasing nets of operators, is completely positive and maps the identity to the identity. For a normal state ω there exists a density operator associated to ω (i.e., ω is given by the trace against this density operator). Then the map defined from Γ and ω is a transition expectation in the sense of [84] (i.e., it is a linear unital CP map from the tensor product algebra to the first factor), whose dual map is a lifting in the sense of [84]; that is, it is a continuous map from the density operators of the first system to the density operators of the composite system.
For a normal, unital CP map Λ, its amplification by the identity map on the other factor is again a normal, unital CP map. Then one defines the associated transition expectation and lifting. The lifting obtained in this way has been called a quantum channel [13] from the input system to the output system, in which ρ is regarded as an input signal state and the auxiliary state is regarded as a noise state.
The equality, required for all observables of the composite system and for any density operator, defines
- (1) a lifting, and
- (2) marginal states.
Here, the lifted state is a compound state for the two marginal states in the sense of [13]. Note that, in general, this compound state does not coincide with the product of its marginals.
Definition 58 The quantum dynamical entropy with respect to Γ and ω is defined through the von Neumann entropy S [6] of the states constructed above; that is, S(ρ) = −tr ρ log ρ. The dynamical entropy with respect to Λ and ρ is defined analogously.
14.7. Generalized AF entropy and generalized AOW entropy
In this section, we generalize both the AOW entropy and the AF entropy. Then we compare the generalized AF entropy with the generalized AOW entropy.
Let θ be an automorphism of the algebra, let ρ be a density operator on the Hilbert space, and let a transition expectation be given as above.
One introduces a transition expectation from the enlarged algebra to the original one. The quantum Markov state on the infinite tensor product algebra is defined through this transition expectation, the defining relation being required for all local observables and any n. Let us consider another transition expectation; one can define the corresponding quantum Markov state in terms of it in the same way. Then we have the following theorem.
Theorem 59 Let a subalgebra be given. Taking the restriction of a transition expectation to it, i.e., restricting its arguments to the subalgebra, one obtains a transition expectation between the corresponding algebras. The QMC (quantum Markov chain) then defines a state on the infinite tensor product algebra through Equation (40).
The subalgebra can be constructed as follows. Let projection operators onto mutually orthogonal subspaces be given, summing up to the identity. The subalgebra is then generated by these projections. One observes that, in this case, the corresponding identity holds for any n, from which the following theorem is proved (cf. [17]).
Taking into account the above construction of the subalgebra, one can construct a transition expectation in the case that the subalgebra is finite. Let the relevant matrix algebra together with a family of normalized vectors be given, and let a finite operational partition of unity be fixed; then a transition expectation is defined in terms of these data. Remark that a completely positive map of the above type is also discussed in [85].
Let the subalgebra consisting of the diagonal elements of the matrix algebra be considered. Since an element of this subalgebra is diagonal, one can see how the restriction of the transition expectation to it acts. When Λ is a normal unital CP map, the two transition expectations are defined accordingly, and one obtains the corresponding quantum Markov states.
The above constructions are special cases of the lifting obtained by a particular choice of Γ and ω in Equation (33). Therefore the dynamical entropy of Equation (36) takes a corresponding special form. The dynamical entropies of Λ with respect to a finite dimensional subalgebra and to the two transition expectations introduced above are then given accordingly.
We call (56) and (57) a generalized AF entropy and a generalized dynamical entropy by QMC, respectively. When the partition is a PVM (projection valued measure) and Λ is an automorphism θ, the generalized dynamical entropy by QMC is equal to the AOW dynamical entropy by QMC [58]. When the partition is a POVM (positive operator valued measure) and the map is specialized correspondingly, the generalized AF entropy is equal to the AF entropy [51].
From Theorem 60 one obtains an inequality between the two quantities; that is, the generalized dynamical entropy by QMC is greater than the generalized AF entropy. Moreover, the dynamical entropy involving (48) is rather difficult to compute because of the off-diagonal parts appearing there, while the other one can be computed easily.
Here we note that the dynamical entropy defined above is related to the dynamical entropy of flows introduced by Emch [49], which was defined in terms of a conditional expectation, provided the relevant subalgebra is suitably chosen.
15. Conclusion
As mentioned above, we have reviewed the mathematical aspects of quantum entropy and discussed several applications to quantum communication and statistical physics, all of which have been studied by the present authors. Other topics in quantum information have recently been developed in various directions, such as quantum algorithms, quantum teleportation and quantum cryptography, which are discussed in [60].