Open access

Quantum Communication Processes and Their Complexity

Written By

Noboru Watanabe

Submitted: July 9th, 2012 Published: July 24th, 2013

DOI: 10.5772/56356


1. Introduction

Complex systems and their dynamics are treated in various ways. Ohya looked for a synthesizing method to treat complex systems and established Information Dynamics [36], a new concept unifying the dynamics of a state and the complexity of the system itself. By applying Information Dynamics, one can discuss in a unified frame problems in mathematics, physics, biology and information science. Information Dynamics has grown into a research field of its own; for instance, the international journal "Open Systems and Information Dynamics" appeared in 1992. In ID there are two types of complexity: (a) a complexity of a state describing the system itself and (b) a transmitted complexity between two systems. The entropies of classical and quantum information theory are examples of the complexities (a) and (b), respectively.

Shannon [52] found that the entropy, introduced into physical systems by Clausius and Boltzmann, can be used to express the amount of information carried by communication processes, and he proposed the so-called information communication theory in the middle of the 20th century. In his information theory, the entropy and the mutual entropy (information) are the most important concepts. The entropy corresponds to the complexity (a) of ID, measuring the amount of information of the state of a system. The mutual entropy (information) corresponds to the transmitted complexity (b) of ID, representing the amount of information correctly transmitted from the initial system to the final system through a channel. It was extended to the mutual entropy on continuous probability spaces by Gelfand, Kolmogorov and Yaglom [17,23], who defined it by means of the Kullback-Leibler relative entropy of two states [26].

Lasers are widely used in current communication, so a formulation of information theory able to treat quantum effects is necessary; this is the so-called quantum information theory. Quantum information theory is important in both mathematics and engineering. It has been developed together with quantum entropy theory and quantum probability. One of its important problems is to investigate how much information is exactly transmitted from the input system to the output system through a quantum channel. The amount of information of the quantum input system is described by the quantum entropy defined by von Neumann [29] in 1932. The C*-entropy was introduced in [33,35] and its properties are discussed in [28,21]. The quantum relative entropy was introduced by Umegaki [55] and extended to general quantum systems by Araki [4,5], Uhlmann [54] and Donald [14]. Furthermore, it was necessary to extend Shannon's mutual entropy (information) of classical information theory to the quantum setting. The classical mutual entropy is defined by means of the joint probability expressing the correlation between the input and output systems. However, Urbanik [56] showed that in quantum systems a joint probability distribution does not exist in general. A semi-classical mutual entropy was introduced by Holevo, Levitin and Ingarden [18,20] for classical input and output passing through a possibly quantum channel. By introducing a new notion, the so-called compound state, Ohya formulated in 1983 the mutual entropy [31,32] in a completely quantum mechanical system (i.e., input state, output state and channel are all quantum mechanical), which is called the Ohya mutual entropy. It was generalized to C*-algebras in [Oent84]. The quantum capacity [40] is defined by taking the supremum of the Ohya mutual entropy.
By using the Ohya quantum mutual entropy, one can discuss the efficiency of information transmission in quantum systems [28,27,44,34,35], which allows a detailed analysis of optical communication processes. Concerning quantum communication processes, several studies have been made in [31,32,35,40,41]. Recently, several mutual-entropy-type measures (the Lindblad-Nielsen entropy [10] and the coherent entropy [6]) were defined by using the entropy exchange. One can classify these mutual-entropy-type measures by computing them for concrete quantum channels. These entropy-type complexities are explained in [39,43].

The entangled state is an important concept in quantum theory and has recently been studied by several authors. One remarkable formulation for investigating entangled states is the Jamiolkowski isomorphism [22]. It might be related to the construction of the compound state in quantum communication processes. One can discuss the entangled states generated by beam splittings and squeezed states.

The classical dynamical (or Kolmogorov-Sinai) entropy S(T) [23] for a measure preserving transformation T was defined on a message space through finite partitions of the measurable space. The classical coding theorems of Shannon are important tools for analyzing communication processes and have been formulated by means of the mean dynamical entropy and the mean dynamical mutual entropy. The mean dynamical entropy represents the amount of information per letter of a signal sequence sent from the input source, and the mean dynamical mutual entropy represents the amount of information per letter of the signal received in the output system. In this chapter, we will discuss the complexity of quantum dynamical systems by calculating the mean mutual entropy with respect to modulated initial states and the attenuation channel for quantum dynamical systems [59].

The quantum dynamical entropy (QDE) was studied by Connes-Størmer [13], Emch [15], Connes-Narnhofer-Thirring [12], Alicki-Fannes [3], and others [9,48,19,57,11]. Their dynamical entropies were defined on spaces of observables. Recently, the quantum dynamical entropy and the quantum dynamical mutual entropy were studied by the present authors [34,35]: (1) the dynamical entropy defined in the state spaces through the complexity of Information Dynamics [36]; (2) the formulation through the quantum Markov chain (QMC) given in [2]; (3) the dynamical entropy for a completely positive (CP) map defined in [25]. In this chapter, we will discuss the complexity of the quantum dynamical process by calculating the generalized AOW entropy given by the KOW entropy for the noisy optical channel [58].

2. Quantum channels

The signal of the input quantum system is transmitted through a physical device, called a quantum channel. The concept of channel has played an important role in the progress of quantum information communication theory. Mathematically, a quantum channel is represented by a mapping from the input state space to the output state space. In particular, the attenuation channel [31] and the noisy optical channel [44] are remarkable examples of quantum channels describing quantum optical communication processes. These channels are related to the mathematical description of the beam splitter.

Here we review the definition of the quantum channels.

Let $(B(\mathcal{H}_1),\mathfrak{S}(\mathcal{H}_1))$ and $(B(\mathcal{H}_2),\mathfrak{S}(\mathcal{H}_2))$ be the input and output systems, respectively, where $B(\mathcal{H}_k)$ is the set of all bounded linear operators on a separable Hilbert space $\mathcal{H}_k$ and $\mathfrak{S}(\mathcal{H}_k)$ is the set of all density operators on $\mathcal{H}_k$ ($k=1,2$). A quantum channel $\Lambda^*$ is a mapping from $\mathfrak{S}(\mathcal{H}_1)$ to $\mathfrak{S}(\mathcal{H}_2)$.

  1. $\Lambda^*$ is called a linear channel if it satisfies $\Lambda^*(\lambda\rho_1+(1-\lambda)\rho_2)=\lambda\Lambda^*(\rho_1)+(1-\lambda)\Lambda^*(\rho_2)$ for any $\rho_1,\rho_2\in\mathfrak{S}(\mathcal{H}_1)$ and any $\lambda\in[0,1]$.

  2. $\Lambda^*$ is called a completely positive (CP) channel if $\Lambda^*$ is linear and its dual map $\Lambda$ from $B(\mathcal{H}_2)$ to $B(\mathcal{H}_1)$ satisfies

$$\sum_{i,j=1}^{n} A_i^*\,\Lambda(B_i^*B_j)\,A_j \ge 0$$

for any $n\in\mathbb{N}$, any $B_j\in B(\mathcal{H}_2)$ and any $A_j\in B(\mathcal{H}_1)$, where the dual map $\Lambda$ of $\Lambda^*$ is defined by $\mathrm{tr}\,\Lambda^*(\rho)B=\mathrm{tr}\,\rho\Lambda(B)$ for any $\rho\in\mathfrak{S}(\mathcal{H}_1)$ and any $B\in B(\mathcal{H}_2)$. Almost all physical transformations can be described by CP channels [30,39,21,46,43].
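As a minimal numerical sketch of a CP channel in Kraus form (the amplitude-damping operators below are a standard qubit example chosen for illustration, not the optical channels of this chapter), the code checks the completeness relation $\sum_i A_i^*A_i=I$, which makes the dual map unital, and verifies that trace and positivity are preserved on a sample state:

```python
import math

# Toy Kraus-form channel Lambda*(rho) = sum_i A_i rho A_i* on a qubit.

def mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def dag(X):
    return [[X[j][i].conjugate() for j in range(2)] for i in range(2)]

def add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

def channel(rho, kraus):
    out = [[0.0, 0.0], [0.0, 0.0]]
    for A in kraus:
        out = add(out, mul(mul(A, rho), dag(A)))
    return out

eta = 0.3  # damping parameter (illustrative)
kraus = [[[1.0, 0.0], [0.0, math.sqrt(1 - eta)]],
         [[0.0, math.sqrt(eta)], [0.0, 0.0]]]

# completeness: sum_i A_i* A_i = I (unitality of the dual map)
comp = add(mul(dag(kraus[0]), kraus[0]), mul(dag(kraus[1]), kraus[1]))
assert abs(comp[0][0] - 1) < 1e-12 and abs(comp[1][1] - 1) < 1e-12
assert abs(comp[0][1]) < 1e-12 and abs(comp[1][0]) < 1e-12

rho = [[0.25, 0.2], [0.2, 0.75]]           # a valid qubit density matrix
out = channel(rho, kraus)
assert abs((out[0][0] + out[1][1]) - 1) < 1e-12   # trace preserved
det = out[0][0] * out[1][1] - out[0][1] * out[1][0]
assert out[0][0] >= 0 and det >= -1e-12           # output remains positive
```
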

3. Quantum communication processes

Let $\mathcal{K}_1$ and $\mathcal{K}_2$ be two Hilbert spaces describing the noise and loss systems, respectively. A quantum communication process including the influence of noise and loss is described by the following scheme [31]. Let $\rho$ be an input state in $\mathfrak{S}(\mathcal{H}_1)$ and $\zeta$ a noise state in $\mathfrak{S}(\mathcal{K}_1)$:

$$\begin{array}{ccc}\mathfrak{S}(\mathcal{H}_1) & \xrightarrow{\ \Lambda^*\ } & \mathfrak{S}(\mathcal{H}_2)\\ \gamma^*\big\downarrow & & \big\uparrow a^*\\ \mathfrak{S}(\mathcal{H}_1\otimes\mathcal{K}_1) & \xrightarrow{\ \Pi^*\ } & \mathfrak{S}(\mathcal{H}_2\otimes\mathcal{K}_2)\end{array}$$

The maps $\gamma^*$, $a^*$ above are given by

$$\gamma^*(\rho)=\rho\otimes\zeta,\quad \rho\in\mathfrak{S}(\mathcal{H}_1);\qquad a^*(\sigma)=\mathrm{tr}_{\mathcal{K}_2}\,\sigma,\quad \sigma\in\mathfrak{S}(\mathcal{H}_2\otimes\mathcal{K}_2).$$

The map $\Pi^*$ is a CP channel from $\mathfrak{S}(\mathcal{H}_1\otimes\mathcal{K}_1)$ to $\mathfrak{S}(\mathcal{H}_2\otimes\mathcal{K}_2)$ determined by the physical properties of the device transmitting the signal. Hence the channel for the above process is given by

$$\Lambda^*(\rho)\equiv \mathrm{tr}_{\mathcal{K}_2}\,\Pi^*(\rho\otimes\zeta)=(a^*\circ\Pi^*\circ\gamma^*)(\rho)$$

for any $\rho\in\mathfrak{S}(\mathcal{H}_1)$. Based on this scheme, the noisy optical channel is constructed as follows.

4. Noisy optical channel

The noisy optical channel $\Lambda^*$ with a noise state $\zeta$ was defined by Ohya and Watanabe [44] as

$$\Lambda^*(\rho)\equiv \mathrm{tr}_{\mathcal{K}_2}\,\Pi^*(\rho\otimes\zeta)=\mathrm{tr}_{\mathcal{K}_2}\,V(\rho\otimes\zeta)V^*,$$

where $\zeta=|m_1\rangle\langle m_1|$ is the $m_1$ photon number state in $\mathfrak{S}(\mathcal{K}_1)$ and $V$ is a mapping from $\mathcal{H}_1\otimes\mathcal{K}_1$ to $\mathcal{H}_2\otimes\mathcal{K}_2$ given by

$$V(|n_1\rangle\otimes|m_1\rangle)=\sum_{j=0}^{n_1+m_1} C_j^{n_1,m_1}\,|j\rangle\otimes|n_1+m_1-j\rangle,$$

where

$$C_j^{n_1,m_1}=\sum_{r=L}^{K}(-1)^{n_1+j-r}\,\frac{\sqrt{n_1!\,m_1!\,j!\,(n_1+m_1-j)!}}{r!\,(n_1-r)!\,(j-r)!\,(m_1-j+r)!}\,\alpha^{m_1-j+2r}\,(-\bar{\beta})^{n_1+j-2r},$$

$|n_1\rangle$ is the $n_1$ photon number state vector in $\mathcal{H}_1$, and $\alpha$, $\beta$ are complex numbers satisfying $|\alpha|^2+|\beta|^2=1$. $K$ and $L$ are the constants $K=\min\{n_1,j\}$, $L=\max\{j-m_1,0\}$. We have the following theorem.
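Since $V$ acts isometrically on each photon-number sector, the coefficients must satisfy $\sum_j|C_j^{n_1,m_1}|^2=1$. A minimal numerical check of the formula as written above (with the summation range $L=\max\{j-m_1,0\}$ and the factor $(n_1-r)!$); for a 50/50 splitter and the $|1\rangle\otimes|1\rangle$ input it also reproduces the well-known Hong-Ou-Mandel cancellation of the $|1\rangle\otimes|1\rangle$ output amplitude:

```python
import math

# Beam-splitter matrix element C_j^{n1,m1} as in the text (real alpha, beta here).
def C(j, n1, m1, alpha, beta):
    K = min(n1, j)
    L = max(j - m1, 0)
    s = 0.0
    for r in range(L, K + 1):
        num = math.sqrt(math.factorial(n1) * math.factorial(m1)
                        * math.factorial(j) * math.factorial(n1 + m1 - j))
        den = (math.factorial(r) * math.factorial(n1 - r)
               * math.factorial(j - r) * math.factorial(m1 - j + r))
        s += ((-1) ** (n1 + j - r) * num / den
              * alpha ** (m1 - j + 2 * r) * (-beta) ** (n1 + j - 2 * r))
    return s

alpha, beta = 0.6, 0.8  # |alpha|^2 + |beta|^2 = 1
for n1 in range(4):
    for m1 in range(4):
        norm = sum(abs(C(j, n1, m1, alpha, beta)) ** 2 for j in range(n1 + m1 + 1))
        assert abs(norm - 1) < 1e-10        # unitarity on each number sector

# Hong-Ou-Mandel: 50/50 splitter, |1>|1> input -> no |1>|1> output component
a = b = 1 / math.sqrt(2)
assert abs(C(1, 1, 1, a, b)) < 1e-12
```
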

Theorem. The noisy optical channel $\Lambda^*$ with noise state $|m\rangle\langle m|$ is described by

$$\Lambda^*(\rho)=\sum_{i=0}^{\infty} O_iVQ^{(m)}\rho\,(Q^{(m)})^*V^*O_i^*,$$

where $Q^{(m)}\equiv\sum_{l=0}^{\infty}(|y_l\rangle\otimes|m\rangle)\langle y_l|$, $O_i\equiv\sum_{k=0}^{\infty}|z_k\rangle(\langle z_k|\otimes\langle i|)$, $\{|y_l\rangle\}$ is a CONS in $\mathcal{H}_1$, $\{|z_k\rangle\}$ is a CONS in $\mathcal{H}_2$ and $\{|i\rangle\}$ is the set of number states in $\mathcal{K}_2$.

In particular, for the coherent input state

$$\rho=|\xi\rangle\langle\xi|\otimes|\kappa\rangle\langle\kappa|\in\mathfrak{S}(\mathcal{H}_1\otimes\mathcal{K}_1),$$

the output state of $\Pi^*$ is

$$\Pi^*(|\xi\rangle\langle\xi|\otimes|\kappa\rangle\langle\kappa|)=|\alpha\xi+\beta\kappa\rangle\langle\alpha\xi+\beta\kappa|\otimes|-\bar{\beta}\xi+\bar{\alpha}\kappa\rangle\langle-\bar{\beta}\xi+\bar{\alpha}\kappa|.$$

5. Attenuation channel

The noisy optical channel with vacuum noise is called the attenuation channel; it was introduced in [31] by

$$\Lambda_0^*(\rho)\equiv \mathrm{tr}_{\mathcal{K}_2}\,\Pi_0^*(\rho\otimes\zeta_0)=\mathrm{tr}_{\mathcal{K}_2}\,V_0(\rho\otimes|0\rangle\langle 0|)V_0^*,$$

where $\zeta_0=|0\rangle\langle 0|$ is the vacuum state in $\mathfrak{S}(\mathcal{K}_1)$ and $V_0$ is a mapping from $\mathcal{H}_1\otimes\mathcal{K}_1$ to $\mathcal{H}_2\otimes\mathcal{K}_2$ given by

$$V_0(|n_1\rangle\otimes|0\rangle)=\sum_{j=0}^{n_1} C_j^{n_1}\,|j\rangle\otimes|n_1-j\rangle,\qquad C_j^{n_1}=\sqrt{\frac{n_1!}{j!\,(n_1-j)!}}\,\alpha^{j}\,(-\bar{\beta})^{n_1-j}.$$
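From the vacuum-noise coefficients, $|C_j^{n_1}|^2=\binom{n_1}{j}|\alpha|^{2j}|\beta|^{2(n_1-j)}$, so the attenuation channel sends the number state $|n_1\rangle\langle n_1|$ to a binomial mixture of number states: each photon independently survives with probability $|\alpha|^2$. A short sketch of this interpretation:

```python
import math

# Output photon-number distribution of Lambda*_0 applied to |n1><n1|.
def output_distribution(n1, alpha, beta):
    ta, tb = abs(alpha) ** 2, abs(beta) ** 2
    return [math.comb(n1, j) * ta ** j * tb ** (n1 - j) for j in range(n1 + 1)]

alpha, beta = 0.6, 0.8
for n1 in (1, 3, 7):
    p = output_distribution(n1, alpha, beta)
    assert abs(sum(p) - 1) < 1e-12                    # a probability distribution
    mean = sum(j * pj for j, pj in enumerate(p))
    assert abs(mean - n1 * abs(alpha) ** 2) < 1e-12   # mean photon number n1*|alpha|^2
```
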

In particular, for the coherent input state

$$\rho=|\xi\rangle\langle\xi|\otimes|0\rangle\langle 0|\in\mathfrak{S}(\mathcal{H}_1\otimes\mathcal{K}_1),$$

one obtains the output state

$$\Pi_0^*(|\xi\rangle\langle\xi|\otimes|0\rangle\langle 0|)=|\alpha\xi\rangle\langle\alpha\xi|\otimes|-\bar{\beta}\xi\rangle\langle-\bar{\beta}\xi|.$$

The lifting $\mathcal{E}_0^*$ from $\mathfrak{S}(\mathcal{H})$ to $\mathfrak{S}(\mathcal{H}\otimes\mathcal{K})$ in the sense of Accardi and Ohya [1] is given by

$$\mathcal{E}_0^*(|\xi\rangle\langle\xi|)=|\alpha\xi\rangle\langle\alpha\xi|\otimes|\beta\xi\rangle\langle\beta\xi|.$$

$\mathcal{E}_0^*$ (or $\Pi_0^*$) is called a beam splitting. Based on liftings, the beam splitting was studied by Accardi and Ohya [1] and by Fichtner, Freudenberg and Liebscher [16].

6. Information dynamics

We are interested in studying the dynamics of state change and the complexity of states for various systems. Information Dynamics (ID) is a concept introduced by Ohya [36] to construct a theory synthesizing these investigation schemes. In ID, a complexity of a state describing the system itself and a transmitted complexity between two systems are used. Examples of these complexities are Shannon's entropy and the mutual entropy (information) of classical entropy theory. In quantum entropy theory, the von Neumann entropy and the Ohya mutual entropy correspond to these complexities. Recently, several mutual-entropy-type measures (the Lindblad-Nielsen entropy [10] and the coherent entropy [6]) were proposed by means of the entropy exchange for an input state and a channel.

7. Concept of information dynamics

Ohya introduced Information Dynamics (ID) synthesizing the dynamics of state change and the complexity of states. Based on ID, one can study various problems of physics and other fields. The channel and two complexities are the key concepts of ID. Two kinds of complexities, $C^S(\rho)$ and $T^S(\rho;\Lambda^*)$, are used in ID: $C^S(\rho)$ is the complexity of a state $\rho$ measured from a subset $S$, and $T^S(\rho;\Lambda^*)$ is the transmitted complexity according to the state change from $\rho$ to $\Lambda^*\rho$. Let $S$, $\bar{S}$, $S_t$ be subsets of $\mathfrak{S}(\mathcal{H}_1)$, $\mathfrak{S}(\mathcal{H}_2)$, $\mathfrak{S}(\mathcal{H}_1\otimes\mathcal{H}_2)$, respectively. These complexities should fulfill the following conditions.

8. Complexity of system

  1. For any $\rho\in S$, $C^S(\rho)$ is nonnegative (i.e., $C^S(\rho)\ge 0$).

  2. For a bijection $j$ from $\mathrm{ex}\,\mathfrak{S}(\mathcal{H}_1)$ to $\mathrm{ex}\,\mathfrak{S}(\mathcal{H}_1)$,

$$C^S(\rho)=C^S(j(\rho))$$

holds, where $\mathrm{ex}\,\mathfrak{S}(\mathcal{H}_1)$ is the set of all extremal points of $\mathfrak{S}(\mathcal{H}_1)$.

  3. For $\rho\otimes\sigma\in S_t$ with $\rho\in S$, $\sigma\in\bar{S}$,

$$C^{S_t}(\rho\otimes\sigma)=C^S(\rho)+C^{\bar{S}}(\sigma).$$

This means that the complexity of the state $\rho\otimes\sigma$ of two totally independent systems is given by the sum of the complexities of the states $\rho$ and $\sigma$.

9. Transmitted complexity

(1') For any $\rho\in S$ and a channel $\Lambda^*$, $T^S(\rho;\Lambda^*)$ is nonnegative (i.e., $T^S(\rho;\Lambda^*)\ge 0$).

(4) $C^S(\rho)$ and $T^S(\rho;\Lambda^*)$ satisfy the inequality $0\le T^S(\rho;\Lambda^*)\le C^S(\rho)$.

(5) If the channel $\Lambda^*$ is the identity map $id$, then $T^S(\rho;id)=C^S(\rho)$ holds.

Examples of the above complexities are the Shannon entropy $S(p)$ for $C^S(p)$ and the classical mutual entropy $I(p;\Lambda^*)$ for $T^S(p;\Lambda^*)$, respectively. Let us consider these complexities for quantum communication processes.

10. Quantum entropy

Since present-day optical communication uses optical signals involving quantum effects, a new information theory dealing with such quantum phenomena is needed in order to discuss the efficiency of information transmission in optical communication processes rigorously. Quantum information theory is important in both mathematics and engineering, and it contains several topics, for instance quantum entropy theory, quantum communication theory, quantum teleportation, quantum entanglement, quantum algorithms, quantum coding theory and so on. It has been developed together with quantum entropy theory and quantum probability. One of its important problems is to investigate how much information is exactly transmitted from the input system to the output system through a quantum channel. The amount of information of a quantum communication system is described by the quantum mutual entropy defined by Ohya [31], based on the quantum entropy of von Neumann [29] and the quantum relative entropy of Umegaki [55], Araki [4] and Uhlmann [54]. Quantum information theory relates directly to quantum communication theory; see for instance [40,41,45]. One of the most important communication processes is quantum teleportation, whose new treatment was studied in [24]. It is also important to classify quantum states. One such classification is the study of entanglement and separability of states (see [7,8]). There have been many attempts in finite dimensional Hilbert spaces; however, quantum mechanics should basically be discussed in infinite dimensional Hilbert spaces, so such a classification must also be studied there.

10.1. Von Neumann entropy

The study of the entropy in quantum system was begun by von Neumann [29] in 1932. For any state given by the density operator ρ, the von Neumann entropy is defined by

$$S(\rho)=-\mathrm{tr}\,\rho\log\rho,\qquad \rho\in\mathfrak{S}(\mathcal{H}).$$

Since the von Neumann entropy satisfies conditions (1), (2), (3) of the complexity of state of ID, it can be regarded as an example of the complexity of state: $C(\rho)=S(\rho)$.
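As a small sketch of the definition, the von Neumann entropy of a qubit state can be computed from the eigenvalues of its 2x2 Hermitian density matrix; a pure state gives 0 and the maximally mixed state gives $\log 2$ (natural logarithm, as used in this chapter):

```python
import math

def von_neumann_entropy(rho):
    # eigenvalues of a 2x2 Hermitian matrix via trace and determinant
    t = (rho[0][0] + rho[1][1]).real
    d = (rho[0][0] * rho[1][1] - rho[0][1] * rho[1][0]).real
    disc = max(t * t - 4 * d, 0.0)
    eigs = [(t + math.sqrt(disc)) / 2, (t - math.sqrt(disc)) / 2]
    return -sum(x * math.log(x) for x in eigs if x > 1e-15)

pure = [[0.5, 0.5], [0.5, 0.5]]      # |+><+|
mixed = [[0.5, 0.0], [0.0, 0.5]]     # I/2
assert abs(von_neumann_entropy(pure)) < 1e-12
assert abs(von_neumann_entropy(mixed) - math.log(2)) < 1e-12
```
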

10.2. Entropy for general systems

Here we briefly explain general entropies of states in C*-dynamical systems. The C*-entropy (S-mixing entropy) was introduced by Ohya in [33,35] and its properties are discussed in [28,21].

Let $(\mathcal{A},\mathfrak{S}(\mathcal{A}),\alpha(G))$ be a C*-dynamical system and $\mathcal{S}$ a weak* compact and convex subset of $\mathfrak{S}(\mathcal{A})$. For example, $\mathcal{S}$ is given by $\mathfrak{S}(\mathcal{A})$ (the set of all states on $\mathcal{A}$), $I(\alpha)$ (the set of all invariant states for $\alpha$), $K(\alpha)$ (the set of all KMS states), and so on. Every state $\varphi\in\mathcal{S}$ has a maximal measure $\mu$ pseudosupported on $\mathrm{ex}\,\mathcal{S}$ such that

$$\varphi=\int_{\mathcal{S}}\omega\, d\mu,$$

where $\mathrm{ex}\,\mathcal{S}$ is the set of all extreme points of $\mathcal{S}$. The measure $\mu$ giving the above decomposition is not unique unless $\mathcal{S}$ is a Choquet simplex. The set of all such measures is denoted by $M_\varphi(\mathcal{S})$, and $D_\varphi(\mathcal{S})$ is the subset of $M_\varphi(\mathcal{S})$ consisting of the measures

$$D_\varphi(\mathcal{S})=\Big\{\mu\in M_\varphi(\mathcal{S})\,;\,\exists\,\{\mu_k\}\subset\mathbb{R}^+ \text{ and } \{\varphi_k\}\subset\mathrm{ex}\,\mathcal{S} \text{ s.t. } \sum_k\mu_k=1,\ \mu=\sum_k\mu_k\,\delta(\varphi_k)\Big\},$$

where $\delta(\varphi)$ is the Dirac measure concentrated on a state $\varphi$. For a measure $\mu\in D_\varphi(\mathcal{S})$, the entropy-type functional $H(\mu)$ is given by

$$H(\mu)=-\sum_k\mu_k\log\mu_k.$$

For a state $\varphi\in\mathcal{S}$, Ohya introduced the C*-entropy (S-mixing entropy) [33,35] defined by

$$S^{\mathcal{S}}(\varphi)=\begin{cases}\inf\{H(\mu)\,;\,\mu\in D_\varphi(\mathcal{S})\} & (D_\varphi(\mathcal{S})\ne\emptyset)\\ +\infty & (D_\varphi(\mathcal{S})=\emptyset).\end{cases}$$

It describes the amount of information of the state $\varphi$ measured from the subsystem $\mathcal{S}$. If $\mathcal{S}=\mathfrak{S}(\mathcal{A})$, then $S^{\mathfrak{S}(\mathcal{A})}(\varphi)$ is denoted by $S(\varphi)$. This entropy is an extension of the von Neumann entropy mentioned above.
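The infimum matters because $H(\mu)$ depends on the chosen extremal decomposition. A minimal illustration for the maximally mixed qubit state $I/2$: the orthogonal decomposition $\frac{1}{2}|0\rangle\langle 0|+\frac{1}{2}|1\rangle\langle 1|$ gives $H=\log 2$, while the non-orthogonal equal-weight decomposition over the four pure states $|0\rangle,|1\rangle,|+\rangle,|-\rangle$ (which represents the same state) gives $H=\log 4$, so it cannot attain the infimum:

```python
import math

def H(mu):
    # entropy-type functional H(mu) = -sum_k mu_k log mu_k
    return -sum(m * math.log(m) for m in mu if m > 0)

orthogonal = [0.5, 0.5]                 # weights over |0>, |1>
four_state = [0.25, 0.25, 0.25, 0.25]   # weights over |0>, |1>, |+>, |->
assert abs(H(orthogonal) - math.log(2)) < 1e-12
assert abs(H(four_state) - math.log(4)) < 1e-12
assert H(orthogonal) < H(four_state)    # infimum reached on the orthogonal one
```
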

10.3. Quantum relative entropy

The classical relative entropy on continuous probability spaces was defined by Kullback and Leibler [26] and was later developed in noncommutative probability spaces. The quantum relative entropy was first defined by Umegaki [55] for σ-finite von Neumann algebras; it expresses a certain difference between two states. It was extended by Araki [4] and Uhlmann [54] to general von Neumann algebras and *-algebras, respectively.

10.4. Umegaki relative entropy

The relative entropy of two states was introduced by Umegaki in [55] for σ-finite and semi-finite von Neumann algebras. Corresponding to the classical relative entropy, for two density operators $\rho$ and $\sigma$ it is defined as

$$S(\rho,\sigma)=\begin{cases}\mathrm{tr}\,\rho(\log\rho-\log\sigma) & (s(\rho)\le s(\sigma))\\ +\infty & (\text{else}),\end{cases}$$

where $s(\rho)\le s(\sigma)$ means that the support projection $s(\sigma)$ of $\sigma$ is greater than the support projection $s(\rho)$ of $\rho$. It measures a certain difference between the two quantum states $\rho$ and $\sigma$. The Umegaki relative entropy satisfies (1) positivity, (2) joint convexity, (3) symmetry (invariance under unitary transformations), (4) additivity, (5) lower semicontinuity and (6) monotonicity. Araki [4] and Uhlmann [54] extended this relative entropy to more general quantum systems.
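For commuting (simultaneously diagonalizable) density operators, the Umegaki relative entropy reduces to the Kullback-Leibler divergence of the eigenvalue distributions. A minimal sketch, including the support condition that yields $+\infty$:

```python
import math

def relative_entropy(p, q):
    # p, q: eigenvalue lists of rho and sigma in a common eigenbasis
    if any(pi > 0 and qi == 0 for pi, qi in zip(p, q)):
        return math.inf                      # support condition s(rho) <= s(sigma) violated
    return sum(pi * (math.log(pi) - math.log(qi)) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.3]
q = [0.5, 0.5]
assert relative_entropy(p, p) == 0           # vanishes iff the states coincide
assert relative_entropy(p, q) > 0            # positivity
assert relative_entropy([1.0, 0.0], [0.0, 1.0]) == math.inf
```
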

10.5. Relative entropy for general systems

The relative entropy for two general states was introduced by Araki [4,5] in von Neumann algebras and by Uhlmann [54] in *-algebras. The above properties also hold for these relative entropies.

10.5.1. Araki's relative entropy [4,5]

Let $\mathcal{N}$ be a σ-finite von Neumann algebra acting on a Hilbert space $\mathcal{H}$ and let $\varphi,\psi$ be normal states on $\mathcal{N}$ given by $\varphi(\cdot)=\langle x,\cdot\,x\rangle$ and $\psi(\cdot)=\langle y,\cdot\,y\rangle$ with $x,y\in\mathcal{K}$ (a positive natural cone). On the domain $\mathcal{N}y+(I-s^{\mathcal{N}}(y))\mathcal{H}$, the operator $S_{x,y}$ is defined by

$$S_{x,y}(Ay+z)=s^{\mathcal{N}}(y)A^*x,\qquad A\in\mathcal{N}\ \ (z \text{ satisfying } s^{\mathcal{N}}(y)z=0),$$

where $s^{\mathcal{N}}(y)$ (the $\mathcal{N}$-support of $y$) is the projection from $\mathcal{H}$ onto the closure of $\mathcal{N}'y$. Using this $S_{x,y}$, the relative modular operator $\Delta_{x,y}$ is defined as $\Delta_{x,y}=(\overline{S_{x,y}})^*\,\overline{S_{x,y}}$, whose spectral decomposition is denoted by $\int_0^\infty\lambda\,de_{x,y}(\lambda)$ ($\overline{S_{x,y}}$ is the closure of $S_{x,y}$). Then the Araki relative entropy is given by:

Definition. The Araki relative entropy of $\psi$ and $\varphi$ is defined by

$$S(\psi,\varphi)=\begin{cases}-\displaystyle\int_0^\infty\log\lambda\, d\langle y,e_{x,y}(\lambda)y\rangle & (\psi\ll\varphi)\\ +\infty & (\text{otherwise}),\end{cases}$$

where $\psi\ll\varphi$ means that $\varphi(A^*A)=0$ implies $\psi(A^*A)=0$ for $A\in\mathcal{N}$.

10.5.2. Uhlmann's relative entropy [54]

Let $\mathcal{L}$ be a complex linear space and $p,q$ two semi-norms on $\mathcal{L}$. Let $H(\mathcal{L})$ be the set of all positive Hermitian forms $\alpha$ on $\mathcal{L}$ satisfying $|\alpha(x,y)|\le p(x)q(y)$ for all $x,y\in\mathcal{L}$. For $x\in\mathcal{L}$, the quadratical mean $QM(p,q)$ of $p$ and $q$ is defined by

$$QM(p,q)(x)=\sup\{\alpha(x,x)^{1/2}\,;\,\alpha\in H(\mathcal{L})\}.$$

There exists a family of semi-norms $p_t(x)$, $t\in[0,1]$, called the quadratical interpolation from $p$ to $q$, satisfying the following conditions:

  1. For any $x\in\mathcal{L}$, $p_t(x)$ is continuous in $t$;

  2. $p_{1/2}=QM(p,q)$;

  3. $p_{t/2}=QM(p,p_t)$ $(t\in[0,1])$;

  4. $p_{(t+1)/2}=QM(p_t,q)$ $(t\in[0,1])$.

This semi-norm $p_t$ is denoted by $QI_t(p,q)$. It is shown that for any positive Hermitian forms $\alpha,\beta$ there exists a unique function $QF_t(\alpha,\beta)$ of $t\in[0,1]$ with values in the set $H(\mathcal{L})$ such that $QF_t(\alpha,\beta)(x,x)^{1/2}$ is the quadratical interpolation from $\alpha(x,x)^{1/2}$ to $\beta(x,x)^{1/2}$. For $x\in\mathcal{L}$, the relative entropy functional $S(\alpha,\beta)(x)$ of $\alpha$ and $\beta$ is defined as

$$S(\alpha,\beta)(x)=-\liminf_{t\to+0}\frac{1}{t}\big\{QF_t(\alpha,\beta)(x,x)-\alpha(x,x)\big\}.$$

Let $\mathcal{A}$ be a *-algebra. For positive linear functionals $\varphi,\psi$ on $\mathcal{A}$, two Hermitian forms $\varphi^L,\psi^R$ are given by $\varphi^L(A,B)=\varphi(A^*B)$ and $\psi^R(A,B)=\psi(BA^*)$.

Definition. The Uhlmann relative entropy of $\psi$ and $\varphi$ is defined by

$$S(\psi,\varphi)=S(\psi^R,\varphi^L)(I).$$

10.5.3. Ohya mutual entropy [31]

The Ohya mutual entropy [31] with respect to an initial state $\rho$ and a quantum channel $\Lambda^*$ is defined by

$$I(\rho;\Lambda^*)\equiv\sup\Big\{\sum_n\lambda_n S(\Lambda^*E_n,\Lambda^*\rho)\,;\,\rho=\sum_n\lambda_nE_n\Big\},$$

where $S(\cdot,\cdot)$ is the Umegaki relative entropy and $\rho=\sum_n\lambda_nE_n$ represents a Schatten-von Neumann (one-dimensional orthogonal) decomposition [49] of $\rho$. Since the Schatten-von Neumann decomposition of a state $\rho$ is not unique unless every eigenvalue of $\rho$ is non-degenerate, the Ohya mutual entropy is defined by taking the supremum over all Schatten-von Neumann decompositions of $\rho$. The Ohya mutual entropy satisfies the following Shannon-type inequality [31]:

$$0\le I(\rho;\Lambda^*)\le\min\{S(\rho),S(\Lambda^*\rho)\},$$

where $S(\rho)$ is the von Neumann entropy. These inequalities show that the Ohya mutual entropy represents the amount of information correctly carried from the input system to the output system through the quantum channel. The capacity expresses the ability of a communication process to transmit information; it was studied in [40,41,45].
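For an input state diagonal in the number basis with non-degenerate eigenvalues, the Schatten decomposition $E_n=|n\rangle\langle n|$ is unique, the attenuation channel maps each $E_n$ to a binomial mixture of number states, and the Ohya mutual entropy reduces to a classical mutual information. The sketch below (with illustrative eigenvalues and transmissivity) checks the Shannon-type inequality numerically:

```python
import math

def shannon(p):
    return -sum(x * math.log(x) for x in p if x > 0)

def binomial_row(n, eta, size):
    # distribution of Lambda*_0 |n><n| over |j><j|, transmissivity eta = |alpha|^2
    return [math.comb(n, j) * eta ** j * (1 - eta) ** (n - j) if j <= n else 0.0
            for j in range(size)]

lam = [0.5, 0.3, 0.2]    # eigenvalues of rho on |0>, |1>, |2> (illustrative)
eta = 0.36               # |alpha|^2 (illustrative)
rows = [binomial_row(n, eta, 3) for n in range(3)]
out = [sum(lam[n] * rows[n][j] for n in range(3)) for j in range(3)]  # Lambda*_0 rho

# I(rho; Lambda*_0) = sum_n lam_n S(Lambda*_0 E_n, Lambda*_0 rho)
I = sum(lam[n] * sum(p * (math.log(p) - math.log(out[j]))
                     for j, p in enumerate(rows[n]) if p > 0)
        for n in range(3))

S_in, S_out = shannon(lam), shannon(out)
assert 0 <= I <= min(S_in, S_out) + 1e-12   # Shannon-type inequality
```
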

For a subset $\mathcal{S}\subset\mathfrak{S}(\mathcal{H}_1)$ satisfying certain physical conditions, the capacity of a quantum channel $\Lambda^*$ [40] is defined by

$$C_q^{\mathcal{S}}(\Lambda^*)\equiv\sup\{I(\rho;\Lambda^*)\,;\,\rho\in\mathcal{S}\}.$$

If $\mathcal{S}=\mathfrak{S}(\mathcal{H}_1)$, the capacity is denoted by $C_q(\Lambda^*)$. The following theorem for the attenuation channel was proved in [40].

Theorem. For the subset $\mathcal{S}_n\equiv\{\rho\in\mathfrak{S}(\mathcal{H}_1)\,;\,\dim s(\rho)=n\}$, the capacity of the attenuation channel $\Lambda_0^*$ satisfies

$$C_q^{\mathcal{S}_n}(\Lambda_0^*)=\log n,$$

where $s(\rho)$ is the support projection of $\rho$.

10.6. Mutual entropy for general systems

Based on the classical relative entropy, the mutual entropy was discussed by Shannon to study the information transmission in classical systems and it was extended by Ohya [33,34,35] for fully general quantum systems.

Let $(\mathcal{A},\mathfrak{S}(\mathcal{A}),\alpha(G))$ be a unital C*-system and $\mathcal{S}$ a weak* compact convex subset of $\mathfrak{S}(\mathcal{A})$. For an initial state $\varphi\in\mathcal{S}$ and a channel $\Lambda^*:\mathfrak{S}(\mathcal{A})\to\mathfrak{S}(\mathcal{B})$, two compound states are

$$\Phi_\mu^{\mathcal{S}}=\int_{\mathcal{S}}\omega\otimes\Lambda^*\omega\, d\mu,\qquad \Phi_0=\varphi\otimes\Lambda^*\varphi.$$

The compound state $\Phi_\mu^{\mathcal{S}}$ expresses the correlation between the input state $\varphi$ and the output state $\Lambda^*\varphi$. The mutual entropy with respect to $\mathcal{S}$ and $\mu$ is given by

$$I_\mu^{\mathcal{S}}(\varphi;\Lambda^*)=S(\Phi_\mu^{\mathcal{S}},\Phi_0)$$

and the mutual entropy with respect to $\mathcal{S}$ is defined by Ohya [33] as

$$I^{\mathcal{S}}(\varphi;\Lambda^*)=\sup\{I_\mu^{\mathcal{S}}(\varphi;\Lambda^*)\,;\,\mu\in M_\varphi(\mathcal{S})\}.$$

10.7. Mutual entropy type complexity

Shor [53] and Bennett et al. [6,10] proposed mutual-entropy-type measures, the so-called coherent entropy and Lindblad-Nielsen entropy, by using the entropy exchange [50] defined by

$$S_e(\rho,\Lambda^*)=-\mathrm{tr}\,W\log W,$$

where $W$ is the matrix $W=(W_{ij})_{i,j}$ with

$$W_{ij}\equiv\mathrm{tr}\,A_i\rho A_j^*$$

for a state $\rho$ and a Stinespring-Sudarshan-Kraus form

$$\Lambda^*(\cdot)\equiv\sum_j A_j\,(\cdot)\,A_j^*$$

of the channel $\Lambda^*$. Then the coherent entropy $I_C(\rho;\Lambda^*)$ [53] and the Lindblad-Nielsen entropy $I_L(\rho;\Lambda^*)$ [10] are given by

$$I_C(\rho;\Lambda^*)\equiv S(\Lambda^*\rho)-S_e(\rho,\Lambda^*),\qquad I_L(\rho;\Lambda^*)\equiv S(\rho)+S(\Lambda^*\rho)-S_e(\rho,\Lambda^*).$$

In this section we compare these mutual-entropy-type measures for quantum information communication processes. One has the following theorem [47]:

Theorem. Let $\{A_j\}$ be a projection-valued measure with $\dim A_j=1$. For an arbitrary state $\rho$ and the quantum channel $\Lambda^*(\cdot)\equiv\sum_jA_j(\cdot)A_j^*$, one has

  1. $0\le I(\rho;\Lambda^*)\le\min\{S(\rho),S(\Lambda^*\rho)\}$ (Ohya mutual entropy),

  2. $I_C(\rho;\Lambda^*)=0$ (coherent entropy),

  3. $I_L(\rho;\Lambda^*)=S(\rho)$ (Lindblad-Nielsen entropy).
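The theorem above can be checked directly for a qubit: with one-dimensional projections $A_j=|j\rangle\langle j|$ as Kraus operators, $W_{ij}=\mathrm{tr}\,A_i\rho A_j^*=\delta_{ij}\langle i|\rho|i\rangle$, so $S_e(\rho,\Lambda^*)=S(\Lambda^*\rho)$, hence $I_C=0$ and $I_L=S(\rho)$. A numerical sketch (the sample density matrix is illustrative):

```python
import math

def entropy(eigs):
    return -sum(x * math.log(x) for x in eigs if x > 1e-15)

def qubit_eigs(rho):
    t = rho[0][0] + rho[1][1]
    d = rho[0][0] * rho[1][1] - abs(rho[0][1]) ** 2
    disc = max(t * t - 4 * d, 0.0)
    return [(t + math.sqrt(disc)) / 2, (t - math.sqrt(disc)) / 2]

rho = [[0.7, 0.2], [0.2, 0.3]]        # real symmetric density matrix (illustrative)
S_rho = entropy(qubit_eigs(rho))

diag = [rho[0][0], rho[1][1]]         # Lambda* rho = sum_j |j><j| rho |j><j| is diagonal
S_out = entropy(diag)
S_e = entropy(diag)                   # W is diagonal with entries <j|rho|j>

I_C = S_out - S_e
I_L = S_rho + S_out - S_e
assert abs(I_C) < 1e-12               # coherent entropy vanishes
assert abs(I_L - S_rho) < 1e-12       # Lindblad-Nielsen entropy equals S(rho)
```
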

For the attenuation channel $\Lambda_0^*$, one can obtain the following theorems [47]:

Theorem. For any state $\rho=\sum_n\lambda_n|n\rangle\langle n|$ and the attenuation channel $\Lambda_0^*$ with $|\alpha|^2=|\beta|^2=\frac{1}{2}$, one has

  1. $0\le I(\rho;\Lambda_0^*)\le\min\{S(\rho),S(\Lambda_0^*\rho)\}$ (Ohya mutual entropy),

  2. $I_C(\rho;\Lambda_0^*)=0$ (coherent entropy),

  3. $I_L(\rho;\Lambda_0^*)=S(\rho)$ (Lindblad-Nielsen entropy).

Theorem. For the attenuation channel $\Lambda_0^*$ and the input state $\rho=\lambda|0\rangle\langle 0|+(1-\lambda)|\theta\rangle\langle\theta|$, we have

  1. $0\le I(\rho;\Lambda_0^*)\le\min\{S(\rho),S(\Lambda_0^*\rho)\}$ (Ohya mutual entropy),

  2. $-S(\rho)\le I_C(\rho;\Lambda_0^*)\le S(\rho)$ (coherent entropy),

  3. $0\le I_L(\rho;\Lambda_0^*)\le 2S(\rho)$ (Lindblad-Nielsen entropy).

The above theorem shows that the coherent entropy $I_C(\rho;\Lambda_0^*)$ takes negative values for $|\alpha|^2<|\beta|^2$ and that the Lindblad-Nielsen entropy $I_L(\rho;\Lambda_0^*)$ exceeds the von Neumann entropy of the input state $\rho$ for $|\alpha|^2>|\beta|^2$. From these theorems and other results [47], we conclude that the Ohya mutual entropy is the most suitable measure for discussing the efficiency of information transmission in quantum communication processes. This means that the Ohya mutual entropy can be considered as the transmitted complexity for quantum communication processes.

11. Quantum dynamical entropy

The classical dynamical (or Kolmogorov-Sinai) entropy S(T) [23] for a measure preserving transformation T was defined on a message space through finite partitions of the measurable space.

The classical coding theorems of Shannon are important tools for analyzing communication processes and have been formulated by means of the mean dynamical entropy and the mean dynamical mutual entropy. The mean dynamical entropy represents the amount of information per letter of a signal sequence sent from an input source, and the mean dynamical mutual entropy represents the amount of information per letter of the signal received in an output system.

The quantum dynamical entropy (QDE) was studied by Connes-Størmer [13], Emch [15], Connes-Narnhofer-Thirring [12], Alicki-Fannes [3], and others [9,48,19,57,11]. Their dynamical entropies were defined on spaces of observables. Recently, the quantum dynamical entropy and the quantum dynamical mutual entropy were studied by the present authors [34,35]: (1) the dynamical entropy defined in the state spaces through the complexity of Information Dynamics [36]; (2) the formulation through the quantum Markov chain (QMC) given in [2]; (3) the dynamical entropy for completely positive (CP) maps introduced in [25].

12. Mean entropy and mean mutual entropy

The classical Shannon coding theorems are an important subject in the study of communication processes; they are formulated by means of the mean entropy and the mean mutual entropy based on the classical dynamical entropy. The mean entropy expresses the amount of information per letter of a signal sequence of an input source, and the mean mutual entropy expresses the amount of information per letter of the signal received in an output system. These mean entropies were extended to general quantum systems.

In this section, we briefly explain a formulation of the quantum mean mutual entropy of Kolmogorov-Sinai (K-S) type given by Ohya [35,27].

In quantum information theory, a stationary information source is described by a C*-triple $(\mathcal{A},\mathfrak{S}(\mathcal{A}),\theta_{\mathcal{A}})$ with a stationary state $\varphi$ with respect to $\theta_{\mathcal{A}}$; that is, $\mathcal{A}$ is a unital C*-algebra, $\mathfrak{S}(\mathcal{A})$ is the set of all states over $\mathcal{A}$, $\theta_{\mathcal{A}}$ is an automorphism of $\mathcal{A}$, and $\varphi\in\mathfrak{S}(\mathcal{A})$ is a state over $\mathcal{A}$ with $\varphi\circ\theta_{\mathcal{A}}=\varphi$.

Let the output C*-dynamical system be the triple $(\mathcal{B},\mathfrak{S}(\mathcal{B}),\theta_{\mathcal{B}})$, and let $\Lambda^*:\mathfrak{S}(\mathcal{A})\to\mathfrak{S}(\mathcal{B})$ be a covariant channel, the dual of a completely positive unital map $\Lambda:\mathcal{B}\to\mathcal{A}$ such that $\Lambda\circ\theta_{\mathcal{B}}=\theta_{\mathcal{A}}\circ\Lambda$.

In this section we explain the functionals $S_\mu^{\mathcal{S}}(\varphi;\alpha^M)$, $S^{\mathcal{S}}(\varphi;\alpha^M)$, $I_\mu^{\mathcal{S}}(\varphi;\alpha^M,\beta^N)$ and $I^{\mathcal{S}}(\varphi;\alpha^M,\beta^N)$ introduced in [35,27] for a pair of finite sequences $\alpha^M=(\alpha_1,\alpha_2,\dots,\alpha_M)$, $\beta^N=(\beta_1,\beta_2,\dots,\beta_N)$ of completely positive unital maps $\alpha_m:\mathcal{A}_m\to\mathcal{A}$, $\beta_n:\mathcal{B}_n\to\mathcal{B}$, where $\mathcal{A}_m$ and $\mathcal{B}_n$ ($m=1,\dots,M$; $n=1,\dots,N$) are finite dimensional unital C*-algebras.

Let $\mathcal{S}$ be a weak* convex subset of $\mathfrak{S}(\mathcal{A})$ and $\varphi$ a state in $\mathcal{S}$. We denote by $M_\varphi(\mathcal{S})$ the set of all regular Borel probability measures $\mu$ on the state space $\mathfrak{S}(\mathcal{A})$ of $\mathcal{A}$ which are maximal in the Choquet ordering and represent $\varphi=\int_{\mathfrak{S}(\mathcal{A})}\omega\, d\mu(\omega)$. Such measures are called extremal decomposition measures of $\varphi$; by Choquet's theorem, one can show that such a measure exists for any state $\varphi\in\mathfrak{S}(\mathcal{A})$. For a given finite sequence of completely positive unital maps $\alpha_m:\mathcal{A}_m\to\mathcal{A}$ from finite dimensional unital C*-algebras $\mathcal{A}_m$ ($m=1,\dots,M$) and a given extremal decomposition measure $\mu$ of $\varphi$, the compound state of $\alpha_1^*\varphi,\alpha_2^*\varphi,\dots,\alpha_M^*\varphi$ on the tensor product algebra $\bigotimes_{m=1}^M\mathcal{A}_m$ is given by [35,27]

$$\Phi_\mu^{\mathcal{S}}(\alpha^M)=\int_{\mathfrak{S}(\mathcal{A})}\bigotimes_{m=1}^M\alpha_m^*\omega\, d\mu(\omega).$$

Furthermore, $\Phi_\mu^{\mathcal{S}}(\alpha^M\cup\beta^N)$ is the compound state of $\Phi_\mu^{\mathcal{S}}(\alpha^M)$ and $\Phi_\mu^{\mathcal{S}}(\beta^N)$, with $\alpha^M\cup\beta^N\equiv(\alpha_1,\alpha_2,\dots,\alpha_M,\beta_1,\beta_2,\dots,\beta_N)$, constructed as

$$\Phi_\mu^{\mathcal{S}}(\alpha^M\cup\beta^N)=\int_{\mathfrak{S}(\mathcal{A})}\Big(\bigotimes_{m=1}^M\alpha_m^*\omega\Big)\otimes\Big(\bigotimes_{n=1}^N\beta_n^*\omega\Big)d\mu.$$

For any pair $(\alpha^M,\beta^N)$ of finite sequences $\alpha^M=(\alpha_1,\dots,\alpha_M)$ and $\beta^N=(\beta_1,\dots,\beta_N)$ of completely positive unital maps $\alpha_m:\mathcal{A}_m\to\mathcal{A}$, $\beta_n:\mathcal{B}_n\to\mathcal{B}$ from finite dimensional unital C*-algebras, and any extremal decomposition measure $\mu$ of $\varphi$, the entropy functional $S_\mu$ and the mutual entropy functional $I_\mu$ are defined in [35,27] as

$$S_\mu^{\mathcal{S}}(\varphi;\alpha^M)=\int_{\mathfrak{S}(\mathcal{A})}S\Big(\bigotimes_{m=1}^M\alpha_m^*\omega,\ \Phi_\mu^{\mathcal{S}}(\alpha^M)\Big)d\mu(\omega),\qquad I_\mu^{\mathcal{S}}(\varphi;\alpha^M,\beta^N)=S\big(\Phi_\mu^{\mathcal{S}}(\alpha^M\cup\beta^N),\ \Phi_\mu^{\mathcal{S}}(\alpha^M)\otimes\Phi_\mu^{\mathcal{S}}(\beta^N)\big),$$

where $S(\cdot,\cdot)$ is the relative entropy.

For a given pair of finite sequences of completely positive unital maps $\alpha^M=(\alpha_1,\dots,\alpha_M)$, $\beta^N=(\beta_1,\dots,\beta_N)$, the functional $S^{\mathcal{S}}(\varphi;\alpha^M)$ (resp. $I^{\mathcal{S}}(\varphi;\alpha^M,\beta^N)$) is given in [35,27] by taking the supremum of $S_\mu^{\mathcal{S}}(\varphi;\alpha^M)$ (resp. $I_\mu^{\mathcal{S}}(\varphi;\alpha^M,\beta^N)$) over all possible extremal decompositions $\mu$ of $\varphi$:

$$S^{\mathcal{S}}(\varphi;\alpha^M)=\sup\{S_\mu^{\mathcal{S}}(\varphi;\alpha^M)\,;\,\mu\in M_\varphi(\mathcal{S})\},\qquad I^{\mathcal{S}}(\varphi;\alpha^M,\beta^N)=\sup\{I_\mu^{\mathcal{S}}(\varphi;\alpha^M,\beta^N)\,;\,\mu\in M_\varphi(\mathcal{S})\}.$$

Let $\mathcal{A}$ (resp. $\mathcal{B}$) be a unital C*-algebra with a fixed automorphism $\theta_{\mathcal{A}}$ (resp. $\theta_{\mathcal{B}}$), let $\Lambda:\mathcal{B}\to\mathcal{A}$ be a covariant completely positive unital map, and let $\varphi$ be an invariant state over $\mathcal{A}$, i.e., $\varphi\circ\theta_{\mathcal{A}}=\varphi$. Put

$$\alpha^N\equiv(\alpha,\ \theta_{\mathcal{A}}\circ\alpha,\dots,\theta_{\mathcal{A}}^{N-1}\circ\alpha),\qquad \beta_\Lambda^N\equiv(\Lambda\circ\beta,\ \Lambda\circ\theta_{\mathcal{B}}\circ\beta,\dots,\Lambda\circ\theta_{\mathcal{B}}^{N-1}\circ\beta).$$

For each completely positive unital map $\alpha:\mathcal{A}_0\to\mathcal{A}$ (resp. $\beta:\mathcal{B}_0\to\mathcal{B}$) from a finite dimensional unital C*-algebra $\mathcal{A}_0$ (resp. $\mathcal{B}_0$) to $\mathcal{A}$ (resp. $\mathcal{B}$), the functionals $\tilde{S}^{\mathcal{S}}(\varphi;\theta_{\mathcal{A}},\alpha)$ and $\tilde{I}^{\mathcal{S}}(\varphi;\Lambda,\theta_{\mathcal{A}},\theta_{\mathcal{B}},\alpha,\beta)$ are given in [35,27] by

$$\tilde{S}^{\mathcal{S}}(\varphi;\theta_{\mathcal{A}},\alpha)=\liminf_{N\to\infty}\frac{1}{N}S^{\mathcal{S}}(\varphi;\alpha^N),\qquad \tilde{I}^{\mathcal{S}}(\varphi;\Lambda,\theta_{\mathcal{A}},\theta_{\mathcal{B}},\alpha,\beta)=\liminf_{N\to\infty}\frac{1}{N}I^{\mathcal{S}}(\varphi;\alpha^N,\beta_\Lambda^N).$$

The functionals $\tilde{S}^{\mathcal{S}}(\varphi;\theta_{\mathcal{A}})$ and $\tilde{I}^{\mathcal{S}}(\varphi;\Lambda,\theta_{\mathcal{A}},\theta_{\mathcal{B}})$ are defined by taking the supremum over all possible $\mathcal{A}_0$'s, $\alpha$'s, $\mathcal{B}_0$'s and $\beta$'s:

$$\tilde{S}^{\mathcal{S}}(\varphi;\theta_{\mathcal{A}})=\sup_\alpha\tilde{S}^{\mathcal{S}}(\varphi;\theta_{\mathcal{A}},\alpha),\qquad \tilde{I}^{\mathcal{S}}(\varphi;\Lambda,\theta_{\mathcal{A}},\theta_{\mathcal{B}})=\sup_{\alpha,\beta}\tilde{I}^{\mathcal{S}}(\varphi;\Lambda,\theta_{\mathcal{A}},\theta_{\mathcal{B}},\alpha,\beta).$$

Then the fundamental inequality of information theory holds for $\tilde{S}^{\mathcal{S}}(\varphi;\theta_{\mathcal{A}})$ and $\tilde{I}^{\mathcal{S}}(\varphi;\Lambda,\theta_{\mathcal{A}},\theta_{\mathcal{B}})$ [35].

12.1. Proposition

$$0\le\tilde{I}^{\mathcal{S}}(\varphi;\Lambda,\theta_{\mathcal{A}},\theta_{\mathcal{B}})\le\min\{\tilde{S}^{\mathcal{S}}(\varphi;\theta_{\mathcal{A}}),\ \tilde{S}^{\mathcal{S}}(\Lambda^*\varphi;\theta_{\mathcal{B}})\}.$$

The functionals $\tilde{S}^{\mathcal{S}}(\varphi;\theta_{\mathcal{A}})$ and $\tilde{I}^{\mathcal{S}}(\varphi;\Lambda,\theta_{\mathcal{A}},\theta_{\mathcal{B}})$ are constructed from the functionals $S_\mu^{\mathcal{S}}(\varphi;\alpha^N)$ and $I_\mu^{\mathcal{S}}(\varphi;\alpha^N,\beta^N)$ coming from information theory, and they are obtained by means of a channel transformation, so they contain the dynamical entropy as a special case [35,27]. Moreover, they contain the usual K-S entropies as follows [35,27].

Proposition. If $\mathcal{A}_k,\mathcal{A}$ are abelian C*-algebras and each $\alpha_k$ is an embedding, then these functionals coincide with the classical K-S entropies:

$$S_\mu^{\mathfrak{S}(\mathcal{A})}(\varphi;\alpha^M)=S_\mu^{classical}\Big(\bigvee_{m=1}^M\tilde{\mathcal{A}}_m\Big),\qquad I_\mu^{\mathfrak{S}(\mathcal{A})}(\varphi;\alpha^M,\beta_{id}^N)=I_\mu^{classical}\Big(\bigvee_{m=1}^M\tilde{\mathcal{A}}_m,\ \bigvee_{n=1}^N\tilde{\mathcal{B}}_n\Big)$$

for any finite partitions $\tilde{\mathcal{A}}_m$, $\tilde{\mathcal{B}}_n$ of a probability space $(\Omega,\mathcal{F},\varphi)$.

In general quantum structure, we have the following theorems [35,27].

Theorem. Let $\alpha_m$ be a sequence of completely positive maps $\alpha_m:\mathcal{A}_m\to\mathcal{A}$ such that there exist completely positive maps $\alpha_m':\mathcal{A}\to\mathcal{A}_m$ satisfying $\alpha_m\circ\alpha_m'\to id_{\mathcal{A}}$ in the pointwise topology. Then:

$$\tilde{S}^{\mathcal{S}}(\varphi;\theta_{\mathcal{A}})=\lim_{m\to\infty}\tilde{S}^{\mathcal{S}}(\varphi;\theta_{\mathcal{A}},\alpha_m).$$

Theorem. Let $\alpha_m$ and $\beta_m$ be sequences of completely positive maps $\alpha_m:\mathcal{A}_m\to\mathcal{A}$ and $\beta_m:\mathcal{B}_m\to\mathcal{B}$ such that there exist completely positive maps $\alpha_m':\mathcal{A}\to\mathcal{A}_m$ and $\beta_m':\mathcal{B}\to\mathcal{B}_m$ satisfying $\alpha_m\circ\alpha_m'\to id_{\mathcal{A}}$ and $\beta_m\circ\beta_m'\to id_{\mathcal{B}}$ in the pointwise topology. Then one has

$$\tilde{I}^{\mathcal{S}}(\varphi;\Lambda,\theta_{\mathcal{A}},\theta_{\mathcal{B}})=\lim_{m\to\infty}\tilde{I}^{\mathcal{S}}(\varphi;\Lambda,\theta_{\mathcal{A}},\theta_{\mathcal{B}},\alpha_m,\beta_m).$$

The above theorem is a Kolmogorov-Sinai type convergence theorem for the mutual entropy [35,27,28,34].

In particular, a quantum extension of the classical formulation of information transmission, giving a basis for Shannon's coding theorems, can be considered in the case that $\mathcal{A}$ and $\mathcal{B}$ are the two-sided infinite tensor products of $\mathcal{A}_0$ and $\mathcal{B}_0$, respectively, $\mathcal{S}=\mathfrak{S}(\mathcal{A})$, and $\theta_{\mathcal{A}},\theta_{\mathcal{B}}$ are shift operators, both denoted by $\theta$. In this case, the channel capacity is defined as [40,41,45,46,38,39,42,43]

$$\tilde{C}(\Lambda^*)\equiv\sup\{\tilde{I}^{\mathcal{S}}(\varphi;\Lambda,\theta)\,;\,\varphi\in\mathcal{S}\}.$$

Using this capacity, one can consider Shannon's coding theorems in fully quantum systems.

13. Computations of mean entropies for modulated states

Based on the paper [59], we here explain general modulated states and briefly review some examples of modulated states (PPM, OOK, PSK).

Let $\{a_1,\dots,a_N\}$ be the alphabet set of the input signals and $\mathcal{N}\equiv\{E_1,\dots,E_N\}$ a set of one dimensional projections on a Hilbert space $\mathcal{H}$ satisfying:

  1. $E_n\perp E_m$ $(n\ne m)$;

  2. $E_n$ corresponds to the alphabet $a_n$.

The set of density operators on $\mathcal{H}$ generated by $\mathcal{N}$ is denoted by

$$\mathcal{S}_0\equiv\Big\{\rho_0=\sum_{n=1}^N\lambda_nE_n\,;\,\rho_0\ge 0,\ \mathrm{tr}\,\rho_0=1\Big\},$$

where an element of $\mathcal{S}_0$ represents a state of the quantum input system. The state is transmitted from the quantum input system to the quantum modulator in order to send information effectively; the transmitted state is called the quantum modulated state. The quantum modulated states are described as follows. Let $M$ be an ideal modulator and $\mathcal{N}^{(M)}\equiv\{E_1^{(M)},\dots,E_N^{(M)}\}$ the set of one dimensional projections on a Hilbert space $\mathcal{H}^{(M)}$ for modulated signals satisfying $E_n^{(M)}\perp E_m^{(M)}$ $(n\ne m)$. We denote the set of density operators on $\mathcal{H}^{(M)}$ generated by $\mathcal{N}^{(M)}$ by

$$\mathcal{S}_0^{(M)}\equiv\Big\{\rho_0^{(M)}=\sum_{n=1}^N\mu_nE_n^{(M)}\,;\,\rho_0^{(M)}\ge 0,\ \mathrm{tr}\,\rho_0^{(M)}=1\Big\},$$

where an element of S0(M) represents a modulated state of the quantum input system. There are many expressions for the modulations. In this section, we take the modulated states by means of the photon number states.

A map $\gamma^{(M)}$ from $\mathcal{S}_0$ to $\mathcal{S}_0^{(M)}$ with $\gamma^{(M)}(E_n)=E_n^{(M)}$ is called a modulator $M$ if $\gamma^{(M)}$ is a completely positive unital map from $\mathcal{A}_0$ to $\mathcal{A}$. Moreover, $\gamma^{(IM)}$ is called an ideal modulator $IM$ if (1) $\gamma^{(IM)}$, with $\gamma^{(IM)}(E_n)=E_n^{(M)}$, is a modulator from $\mathcal{S}_0$ to $\mathcal{S}_0^{(M)}$, and (2) $\gamma^{(IM)}(E_n)\perp\gamma^{(IM)}(E_m)$ for any orthogonal $E_n,E_m\in\mathcal{S}_0$ $(n\neq m)$. Some examples of ideal modulators are given as follows:

  1. For any $E_n\in\mathcal{S}_0$, the PPM (Pulse Position Modulator) is defined by

$\gamma^{(PPM)}(E_n)\equiv E_n^{(PPM)}=E_0^{(PAM)}\otimes\cdots\otimes E_0^{(PAM)}\otimes E_d^{(PAM)}\otimes E_0^{(PAM)}\otimes\cdots\otimes E_0^{(PAM)},$

where $E_d^{(PAM)}$ sits in the $n$-th position and $E_0^{(PAM)}$ is the vacuum state on $\mathcal{H}^{(PAM)}$.

  2. For $E_1,E_2\in\mathcal{S}_0$, the OOK (On-Off Keying) is defined by

$\gamma^{(OOK)}(E_1)\equiv E_1^{(OOK)}=|0\rangle\langle 0|,\qquad \gamma^{(OOK)}(E_2)\equiv E_2^{(OOK)}=|\kappa\rangle\langle\kappa|,$

where $|\kappa\rangle\langle\kappa|$ is the coherent state on $\mathcal{H}_{OOK}$.

  3. For $E_1,E_2\in\mathcal{S}_0$, the PSK (Phase Shift Keying) is defined by

$\gamma^{(PSK)}(E_1)\equiv E_1^{(PSK)}=|-\kappa\rangle\langle-\kappa|,\qquad \gamma^{(PSK)}(E_2)\equiv E_2^{(PSK)}=|\kappa\rangle\langle\kappa|,$

where $|-\kappa\rangle\langle-\kappa|$ and $|\kappa\rangle\langle\kappa|$ are the coherent states on $\mathcal{H}_{PSK}$.
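Assuming a truncated Fock-space representation (the cutoff dimension 40 and the amplitude $\kappa=1.2$ below are illustrative choices, not values from the text), the OOK and PSK modulated states can be realized numerically, and the coherent-state overlap $\langle\kappa|-\kappa\rangle=e^{-2|\kappa|^2}$ used repeatedly in the sequel can be checked directly:

```python
import numpy as np
from math import factorial, sqrt, exp

def coherent(kappa, dim=40):
    # truncated Fock expansion |kappa> = e^{-|kappa|^2/2} sum_n kappa^n/sqrt(n!) |n>
    amps = np.array([kappa ** n / sqrt(factorial(n)) for n in range(dim)])
    return exp(-abs(kappa) ** 2 / 2) * amps

kappa = 1.2
plus, minus = coherent(kappa), coherent(-kappa)
E1_psk = np.outer(minus, minus)                  # gamma_PSK(E_1) = |-kappa><-kappa|
E2_psk = np.outer(plus, plus)                    # gamma_PSK(E_2) = |kappa><kappa|
E1_ook = np.outer(coherent(0.0), coherent(0.0))  # OOK vacuum state |0><0|

overlap = float(plus @ minus)  # <kappa|-kappa> = exp(-2 kappa^2) for real kappa
print(overlap, exp(-2 * kappa ** 2))
```

The truncation error is negligible at this cutoff; for larger amplitudes the dimension would have to grow accordingly.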

Now we briefly review the calculation of the mean mutual entropy of K-S type for the modulated state (PSK) by means of the coherent state. Other calculations are obtained in [59].

$\alpha_N^{(IM)}$ and $\beta_N^{(IM)}$ are given by

$\alpha_N^{(IM)}\equiv\left(\alpha\circ\tilde\gamma^{(IM)},\ \theta_A\circ\alpha\circ\tilde\gamma^{(IM)},\ \ldots,\ \theta_A^{N-1}\circ\alpha\circ\tilde\gamma^{(IM)}\right),\qquad \beta_N^{(IM)}\equiv\left(\tilde\gamma^{(IM)}\circ\Lambda\circ\beta,\ \tilde\gamma^{(IM)}\circ\Lambda\circ\theta\circ\beta,\ \ldots,\ \tilde\gamma^{(IM)}\circ\Lambda\circ\theta^{N-1}\circ\beta\right),$

where $\tilde\Lambda_i=\Lambda$ and $\tilde\gamma_i^{(IM)}=\gamma^{(IM)}$ hold.

PSK. For an initial state $\rho=\bigotimes_{i=-\infty}^{\infty}\rho_i\in\bigotimes_{i=-\infty}^{\infty}\mathcal{S}_i$, let $\rho_i=\nu|\kappa\rangle\langle\kappa|+(1-\nu)|-\kappa\rangle\langle-\kappa|$ $(0\leq\nu\leq 1)$. The Schatten decomposition of $\rho_i$ is obtained as

$\rho_i=\sum_{n_i=1}^{2}\lambda_{n_i}E_{n_i}^{(PSK)},$

where the eigenvalues $\lambda_{n_i}$ of $\rho_i$ are

$\lambda_1=\frac{1}{2}\left\{1+\sqrt{1-4\nu(1-\nu)\left(1-e^{-|2\kappa|^2}\right)}\right\},\qquad \lambda_2=\frac{1}{2}\left\{1-\sqrt{1-4\nu(1-\nu)\left(1-e^{-|2\kappa|^2}\right)}\right\}.$

Two projections $E_{n_i}^{(PSK)}$ $(n_i=1,2)$ and the eigenvectors $|e_{n_i}^{(PSK)}\rangle$ associated with $\lambda_{n_i}$ $(n_i=1,2)$ are given by

$E_{n_i}^{(PSK)}=|e_{n_i}^{(PSK)}\rangle\langle e_{n_i}^{(PSK)}|,$

$|e_{n_i}^{(PSK)}\rangle=a_{n_i}|\kappa\rangle+b_{n_i}|-\kappa\rangle\quad(n_i=1,2),$

where

$|b_{n_i}|^2=\frac{1}{\tau_{n_i}^2+2e^{-2|\kappa|^2}\tau_{n_i}+1},\qquad |a_{n_i}|^2=\tau_{n_i}^2|b_{n_i}|^2,\qquad a_{n_i}\overline{b_{n_i}}=\overline{a_{n_i}}b_{n_i}=\tau_{n_i}|b_{n_i}|^2,$

$\tau_1=\frac{(1-2\nu)+\sqrt{1-4\nu(1-\nu)\left(1-e^{-|2\kappa|^2}\right)}}{2(1-\nu)e^{-2|\kappa|^2}},\qquad \tau_2=\frac{(1-2\nu)-\sqrt{1-4\nu(1-\nu)\left(1-e^{-|2\kappa|^2}\right)}}{2(1-\nu)e^{-2|\kappa|^2}}.$
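The closed-form eigenvalues $\lambda_1,\lambda_2$ above admit a quick numerical check: on the span of $|\kappa\rangle$ and $|-\kappa\rangle$, the state $\rho_i$ acts as the matrix $DG$, with $D$ the diagonal matrix of weights and $G$ the Gram matrix of the two coherent vectors. The values of $\nu$ and $\kappa$ below are illustrative:

```python
import numpy as np

nu, kappa = 0.3, 0.8
s = np.exp(-2 * kappa ** 2)          # overlap <kappa|-kappa> for real kappa
disc = np.sqrt(1 - 4 * nu * (1 - nu) * (1 - s ** 2))  # s^2 = exp(-|2 kappa|^2)
lam1, lam2 = (1 + disc) / 2, (1 - disc) / 2           # closed form from the text

# numerical check: the nonzero eigenvalues of rho_i are those of D @ G
D = np.diag([nu, 1 - nu])
G = np.array([[1.0, s], [s, 1.0]])
numeric = np.sort(np.linalg.eigvals(D @ G).real)
print(numeric, (lam2, lam1))
```

This works because for $\rho=\sum_i p_i|v_i\rangle\langle v_i|$ the action on a vector $\sum_j c_j|v_j\rangle$ is, in the coefficients $c$, multiplication by $DG$.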

For the above state $E_{n_i}^{(PSK)}$, one can obtain the output state for the attenuation channel $\Lambda$ as follows:

$\Lambda E_{n_i}^{(PSK)}=\sum_{\bar n_i=1}^{2}\lambda_{n_i,\bar n_i}E_{n_i,\bar n_i}^{(PSK)}\quad(n_i=1,2),$

where the eigenvalues $\lambda_{n_i,\bar n_i}$ of $\Lambda E_{n_i}^{(PSK)}$ are given by $(n_i=1,2)$

$\lambda_{n_i,1}=\frac{1}{2}\left\{1+\sqrt{1-4\mu_{n_i}(1-\mu_{n_i})\left(1-|\langle u_{n_i,1},u_{n_i,2}\rangle|^2\right)}\right\},\qquad \lambda_{n_i,2}=\frac{1}{2}\left\{1-\sqrt{1-4\mu_{n_i}(1-\mu_{n_i})\left(1-|\langle u_{n_i,1},u_{n_i,2}\rangle|^2\right)}\right\},$

$\mu_{n_i}=\frac{1}{2}\left(1+e^{-(1-\eta)|\kappa|^2}\right)\frac{\tau_{n_i}^2+2e^{-2|\alpha\kappa|^2}\tau_{n_i}+1}{\tau_{n_i}^2+2e^{-2|\kappa|^2}\tau_{n_i}+1},$

$|\langle u_{n_i,\bar n_i},u_{n_i,\bar n_i}\rangle|^2=1,\qquad \langle u_{n_i,1},u_{n_i,2}\rangle=\frac{\tau_{n_i}^2-1}{\sqrt{\left(\tau_{n_i}^2+1\right)^2-4e^{-|2\alpha\kappa|^2}\tau_{n_i}^2}}\quad(n_i=1,2).$

$E_{n_i,\bar n_i}^{(PSK)}$ are the eigenstates associated with $\lambda_{n_i,\bar n_i}$. Then we have

$\Phi_E\left(\alpha_N^{(PSK)}\right)=\bigotimes_{i=0}^{N-1}\gamma^{(PSK)}\circ\alpha\circ\theta_A^i(\rho)=\bigotimes_{i=0}^{N-1}\gamma^{(PSK)}(\rho_i)=\sum_{n_0=1}^{2}\cdots\sum_{n_{N-1}=1}^{2}\left(\prod_{k=0}^{N-1}\lambda_{n_k}\right)\left(\bigotimes_{i=0}^{N-1}E_{n_i}^{(PSK)}\right).$

When $\Lambda$ is given by the attenuation channel, we get

$\Phi_E\left(\beta_{\Lambda N}^{(PSK)}\right)=\bigotimes_{i=0}^{N-1}\beta\circ\theta_B^i\circ\Lambda\circ\gamma^{(PSK)}(\rho)=\sum_{n_0=1}^{2}\cdots\sum_{n_{N-1}=1}^{2}\left(\prod_{k=0}^{N-1}\lambda_{n_k}\right)\left(\bigotimes_{i=0}^{N-1}\Lambda E_{n_i}^{(PSK)}\right).$

The compound states through the attenuation channel $\Lambda$ become

$\Phi_E\left(\alpha_N^{(PSK)}\otimes\beta_{\Lambda N}^{(PSK)}\right)=\sum_{n_0=1}^{2}\cdots\sum_{n_{N-1}=1}^{2}\left(\prod_{k=0}^{N-1}\lambda_{n_k}\right)\sum_{\bar n_0=1}^{2}\cdots\sum_{\bar n_{N-1}=1}^{2}\left(\prod_{k'=0}^{N-1}\lambda_{n_{k'},\bar n_{k'}}\right)\left(\bigotimes_{i=0}^{N-1}E_{n_i}^{(PSK)}\right)\otimes\left(\bigotimes_{i=0}^{N-1}E_{n_i,\bar n_i}^{(PSK)}\right),$

$\Phi_E\left(\alpha_N^{(PSK)}\right)\otimes\Phi_E\left(\beta_{\Lambda N}^{(PSK)}\right)=\sum_{n_0=1}^{2}\cdots\sum_{n_{N-1}=1}^{2}\left(\prod_{k=0}^{N-1}\lambda_{n_k}\right)\sum_{m_0=1}^{2}\cdots\sum_{m_{N-1}=1}^{2}\left(\prod_{k'=0}^{N-1}\lambda_{m_{k'}}\right)\sum_{\bar m_0=1}^{2}\cdots\sum_{\bar m_{N-1}=1}^{2}\left(\prod_{k''=0}^{N-1}\lambda_{m_{k''},\bar m_{k''}}\right)\left(\bigotimes_{i=0}^{N-1}E_{n_i}^{(PSK)}\right)\otimes\left(\bigotimes_{i=0}^{N-1}E_{m_i,\bar m_i}^{(PSK)}\right).$

Lemma For an initial state $\rho=\bigotimes_{i=-\infty}^{\infty}\rho_i\in\bigotimes_{i=-\infty}^{\infty}\mathcal{S}_i$, we have

$I_E\left(\rho;\alpha_N^{(PSK)},\beta_N^{(PSK)}\right)=\sum_{n_0=1}^{2}\cdots\sum_{n_{N-1}=1}^{2}\sum_{\bar n_0=1}^{2}\cdots\sum_{\bar n_{N-1}=1}^{2}\left(\prod_{k=0}^{N-1}\lambda_{n_k}\lambda_{n_k,\bar n_k}\right)\log\frac{\prod_{k=0}^{N-1}\lambda_{n_k,\bar n_k}}{\sum_{m_0=1}^{2}\cdots\sum_{m_{N-1}=1}^{2}\left(\prod_{k=0}^{N-1}\lambda_{m_k}\lambda_{m_k,\bar n_k}\right)}.$

By using the above lemma, we have the following theorem.

Theorem For an initial state $\rho=\bigotimes_{i=-\infty}^{\infty}\rho_i\in\bigotimes_{i=-\infty}^{\infty}\mathcal{S}_i$, we have

$\tilde S\left(\rho;\theta_A,\alpha_N^{(PSK)}\right)=\lim_{N\to\infty}\frac{1}{N}S\left(\rho;\alpha_N^{(PSK)}\right)=-\sum_{n=1}^{2}\lambda_n\log\lambda_n$

and

$\tilde I\left(\rho;\Lambda,\theta_A,\theta_B,\alpha_N^{(PSK)},\beta_N^{(PSK)}\right)=\sum_{n=1}^{2}\sum_{\bar n=1}^{2}\lambda_n\lambda_{n,\bar n}\log\frac{\lambda_{n,\bar n}}{\sum_{m=1}^{2}\lambda_m\lambda_{m,\bar n}}.$
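The mean mutual entropy in the theorem has the form of a classical mutual information of the joint distribution $\lambda_n\lambda_{n,\bar n}$. A minimal sketch of the computation, with illustrative eigenvalues and transition values rather than ones derived from the PSK formulas:

```python
import numpy as np

def mean_mutual_entropy(lam, trans):
    # I = sum_{n, nbar} lam_n lam_{n, nbar} log( lam_{n, nbar} / sum_m lam_m lam_{m, nbar} )
    lam, trans = np.asarray(lam), np.asarray(trans)
    out = lam @ trans                    # output-side marginal
    I = 0.0
    for n in range(len(lam)):
        for k in range(trans.shape[1]):
            joint = lam[n] * trans[n, k]
            if joint > 0:
                I += joint * np.log(trans[n, k] / out[k])
    return I

lam = [0.6, 0.4]                         # eigenvalues lambda_n of the input
trans = [[0.9, 0.1], [0.2, 0.8]]         # lambda_{n, nbar}: rows sum to one
I = mean_mutual_entropy(lam, trans)
print(I)
```

The quantity is nonnegative and bounded by the input entropy, as expected for a transmitted complexity.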

14. KOW dynamical entropy

In this section, we briefly explain the definition of the KOW entropy according to [25].

For a normal state $\omega$ on $B(\mathcal{K})$ and a normal, unital CP linear map $\Gamma$ from $B(\mathcal{K})\otimes B(\mathcal{H})$ to $B(\mathcal{K})\otimes B(\mathcal{H})$, one can define a transition expectation $\mathcal{E}_{\Gamma,\omega}$ from $B(\mathcal{K})\otimes B(\mathcal{H})$ to $B(\mathcal{H})$ by

$\mathcal{E}_{\Gamma,\omega}(\widetilde A)=(\omega\otimes\mathrm{id})\Gamma(\widetilde A)=\mathrm{tr}_{\mathcal{K}}\left[(\widetilde\omega\otimes I)\Gamma(\widetilde A)\right],\qquad \widetilde A\in B(\mathcal{K})\otimes B(\mathcal{H}),$

in the sense of [1,25], where $\widetilde\omega\in\mathcal{S}(\mathcal{K})$ is the density operator associated with $\omega$. The dual map $\mathcal{E}_{\Gamma,\omega}^*$ is a lifting from $\mathcal{S}(\mathcal{H})$ to $\mathcal{S}(\mathcal{K}\otimes\mathcal{H})$ given by

$\mathcal{E}_{\Gamma,\omega}^*(\rho)=\Gamma^*(\widetilde\omega\otimes\rho)$

in the sense of Accardi and Ohya [1]. For a normal, unital CP map $\Lambda$ from $B(\mathcal{H})$ to $B(\mathcal{H})$ and the identity map $\mathrm{id}$ on $B(\mathcal{K})$, the transition expectation is

$\mathcal{E}_{\Lambda\Gamma,\omega}(\widetilde A)=(\omega\otimes\mathrm{id})\left((\mathrm{id}\otimes\Lambda)\Gamma(\widetilde A)\right),\qquad \widetilde A\in B(\mathcal{K})\otimes B(\mathcal{H}),$

and the lifting is defined by

$\mathcal{E}_{\Lambda\Gamma,\omega}^*(\rho)=\Gamma^*\left(\widetilde\omega\otimes\Lambda^*(\rho)\right),\qquad \rho\in\mathcal{S}(\mathcal{H}),$

where $\mathrm{id}\otimes\Lambda$ is a normal, unital CP map from $B(\mathcal{K})\otimes B(\mathcal{H})$ to $B(\mathcal{K})\otimes B(\mathcal{H})$ and $\Lambda^*$ is a quantum channel [30,21,31,39,44,46,43] from $\mathcal{S}(\mathcal{H})$ to $\mathcal{S}(\mathcal{H})$ with respect to an input signal state $\rho$ and a noise state $\widetilde\omega$. Based on the following relation

$\mathrm{tr}_{\left(\otimes_1^n\mathcal{K}\right)\otimes\mathcal{H}}\,\Phi_{\Lambda,n}^{\Gamma,\omega}(\rho)\left(A_1\otimes\cdots\otimes A_n\otimes B\right)\equiv\mathrm{tr}_{\mathcal{H}}\,\rho\,\mathcal{E}_{\Lambda\Gamma,\omega}\left(A_1\otimes\mathcal{E}_{\Lambda\Gamma,\omega}\left(A_2\otimes\cdots\otimes A_{n-1}\otimes\mathcal{E}_{\Lambda\Gamma,\omega}\left(A_n\otimes B\right)\cdots\right)\right)$

for all $A_1,A_2,\ldots,A_n\in B(\mathcal{K})$, $B\in B(\mathcal{H})$ and any $\rho\in\mathcal{S}(\mathcal{H})$, a lifting $\Phi_{\Lambda,n}^{\Gamma,\omega}$ from $\mathcal{S}(\mathcal{H})$ to $\mathcal{S}\left(\left(\otimes_1^n\mathcal{K}\right)\otimes\mathcal{H}\right)$ and marginal states are given by

$\rho_{\Lambda,n}^{\Gamma,\omega}\equiv\mathrm{tr}_{\mathcal{H}}\,\Phi_{\Lambda,n}^{\Gamma,\omega}(\rho)\in\mathcal{S}\left(\otimes_1^n\mathcal{K}\right)\quad\text{and}\quad\bar\rho_{\Lambda,n}^{\Gamma,\omega}\equiv\mathrm{tr}_{\otimes_1^n\mathcal{K}}\,\Phi_{\Lambda,n}^{\Gamma,\omega}(\rho)\in\mathcal{S}(\mathcal{H}),$

where $\Phi_{\Lambda,n}^{\Gamma,\omega}(\rho)$ is a compound state with respect to $\bar\rho_{\Lambda,n}^{\Gamma,\omega}$ and $\rho_{\Lambda,n}^{\Gamma,\omega}$ in the sense of [25,31].
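The two marginal states above are partial traces of the compound state. A minimal numpy sketch for the bipartite case $\mathcal{K}\otimes\mathcal{H}$ (the product compound state used below is only a trivial example chosen so the marginals are known in advance):

```python
import numpy as np

def marginals(Phi, dK, dH):
    # marginal states of a compound state Phi on K (x) H via partial traces
    T = Phi.reshape(dK, dH, dK, dH)
    rho_K = np.einsum('ijkj->ik', T)   # tr_H Phi
    rho_H = np.einsum('ijil->jl', T)   # tr_K Phi
    return rho_K, rho_H

omega = np.diag([0.7, 0.3])                  # noise state on K
rho = np.array([[0.6, 0.2], [0.2, 0.4]])     # input signal state on H
Phi = np.kron(omega, rho)                    # trivial (product) compound state
rho_K, rho_H = marginals(Phi, 2, 2)
print(rho_K, rho_H)
```

For a genuine compound state produced by a nontrivial lifting the marginals are correlated, but the same partial-trace code applies unchanged.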

Definition The quantum dynamical entropy with respect to $\Lambda$, $\rho$, $\Gamma$ and $\omega$ is defined by

$\tilde S(\Lambda;\rho,\Gamma,\omega)\equiv\limsup_{n\to\infty}\frac{1}{n}S\left(\rho_{\Lambda,n}^{\Gamma,\omega}\right),$

where $S\left(\rho_{\Lambda,n}^{\Gamma,\omega}\right)$ is the von Neumann entropy of $\rho_{\Lambda,n}^{\Gamma,\omega}\in\mathcal{S}\left(\otimes_1^n\mathcal{K}\right)$ defined by $S\left(\rho_{\Lambda,n}^{\Gamma,\omega}\right)=-\mathrm{tr}\,\rho_{\Lambda,n}^{\Gamma,\omega}\log\rho_{\Lambda,n}^{\Gamma,\omega}$. The dynamical entropy with respect to $\Lambda$ and $\rho$ is defined as

$\tilde S(\Lambda;\rho)\equiv\sup\left\{\tilde S(\Lambda;\rho,\Gamma,\omega)\,;\,\Gamma,\omega\right\}.$

15. Formulation of generalized AF and AOW entropies by KOW entropy

In this section, we briefly explain the generalized AF and AOW entropies based on the KOW entropy [25].

For a finite operational partition of unity $\gamma_1,\ldots,\gamma_d\in B(\mathcal{H})$, i.e., $\sum_{i=1}^{d}\gamma_i^*\gamma_i=I$, and a normal unital CP map $\Lambda$ from $B(\mathcal{H})$ to $B(\mathcal{H})$, transition expectations $\mathcal{E}_\Lambda^\gamma$ from $M_d\otimes B(\mathcal{H})$ to $B(\mathcal{H})$ and $\mathcal{E}_\Lambda^{\gamma(0)}$ from $M_d^0\otimes B(\mathcal{H})$ to $B(\mathcal{H})$ are defined by

$\mathcal{E}_\Lambda^\gamma\left(\sum_{i,j=1}^{d}E_{ij}\otimes A_{ij}\right)\equiv\sum_{i,j=1}^{d}\Lambda\left(\gamma_i^*A_{ij}\gamma_j\right),\qquad \mathcal{E}_\Lambda^{\gamma(0)}\left(\sum_{i,j=1}^{d}E_{ij}\otimes A_{ij}\right)\equiv\sum_{i=1}^{d}\Lambda\left(\gamma_i^*A_{ii}\gamma_i\right),$

where $E_{ij}=|e_i\rangle\langle e_j|$ with normalized vectors $e_i$ $(i=1,2,\ldots,d\equiv\dim\mathcal{K})$, $M_d$ is the $d\times d$ matrix algebra and $M_d^0$ is the subalgebra of $M_d$ consisting of its diagonal elements. Then the quantum Markov states

$\rho_{\Lambda,n}^\gamma=\sum_{i_1,\ldots,i_n=1}^{d}\sum_{j_1,\ldots,j_n=1}^{d}\mathrm{tr}\left[\rho\,\Lambda\left(W_{j_1i_1}\left(\Lambda\left(W_{j_2i_2}\left(\cdots\Lambda\left(W_{j_ni_n}(I)\right)\cdots\right)\right)\right)\right)\right]E_{i_1j_1}\otimes\cdots\otimes E_{i_nj_n}$

and ρΛ,nγ(0) is obtained by

$\rho_{\Lambda,n}^{\gamma(0)}=\sum_{i_1,\ldots,i_n=1}^{d}\mathrm{tr}\left[\rho\,\Lambda\left(W_{i_1i_1}\left(\Lambda\left(W_{i_2i_2}\left(\cdots\Lambda\left(W_{i_ni_n}(I)\right)\cdots\right)\right)\right)\right)\right]E_{i_1i_1}\otimes\cdots\otimes E_{i_ni_n}=\sum_{i_1,\ldots,i_n=1}^{d}p_{i_1,\ldots,i_n}E_{i_1i_1}\otimes\cdots\otimes E_{i_ni_n},$

where

$W_{ij}(A)\equiv\gamma_i^*A\gamma_j,\quad A\in B(\mathcal{H}),\qquad W_{ij}^*(\rho)\equiv\gamma_j\rho\gamma_i^*,\quad \rho\in\mathcal{S}(\mathcal{H}),$

$p_{i_1,\ldots,i_n}\equiv\mathrm{tr}\left[\rho\,\Lambda\left(W_{i_1i_1}\left(\Lambda\left(W_{i_2i_2}\left(\cdots\Lambda\left(W_{i_ni_n}(I)\right)\cdots\right)\right)\right)\right)\right]=\mathrm{tr}\left[W_{i_ni_n}^*\left(\Lambda^*\left(\cdots W_{i_2i_2}^*\left(\Lambda^*\left(W_{i_1i_1}^*\left(\Lambda^*(\rho)\right)\right)\right)\cdots\right)\right)\right].$

Therefore the generalized AF entropy $\tilde S(\Lambda;\rho)$ and the generalized AOW entropy $\tilde S^{(0)}(\Lambda;\rho)$ of $\Lambda$ and $\rho$ with respect to a finite-dimensional subalgebra of $B(\mathcal{H})$ are obtained by

$\tilde S(\Lambda;\rho)\equiv\sup_{\{\gamma_i\}}\tilde S(\Lambda;\rho,\{\gamma_i\}),\qquad \tilde S^{(0)}(\Lambda;\rho)\equiv\sup_{\{\gamma_i\}}\tilde S^{(0)}(\Lambda;\rho,\{\gamma_i\}),$

where the dynamical entropies $\tilde S(\Lambda;\rho,\{\gamma_i\})$ and $\tilde S^{(0)}(\Lambda;\rho,\{\gamma_i\})$ are given by

$\tilde S(\Lambda;\rho,\{\gamma_i\})\equiv\limsup_{n\to\infty}\frac{1}{n}S\left(\rho_{\Lambda,n}^\gamma\right),\qquad \tilde S^{(0)}(\Lambda;\rho,\{\gamma_i\})\equiv\limsup_{n\to\infty}\frac{1}{n}S\left(\rho_{\Lambda,n}^{\gamma(0)}\right).$
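For the diagonal Markov state $\rho_{\Lambda,n}^{\gamma(0)}$, the probabilities $p_{i_1,\ldots,i_n}$ form a classical Markov chain, so $\frac{1}{n}S$ converges to the classical entropy rate. A sketch with an illustrative $2\times 2$ transition matrix and its stationary distribution (not values derived from a physical channel):

```python
import numpy as np

def entropy(p):
    p = np.asarray([x for x in p if x > 0])
    return float(-np.sum(p * np.log(p)))

def word_probs(pi, P, n):
    # p_{i1..in} = pi_{i1} P_{i1 i2} ... P_{i_{n-1} i_n} (diagonal Markov state)
    probs = {(i,): pi[i] for i in range(len(pi))}
    for _ in range(n - 1):
        probs = {w + (j,): pw * P[w[-1], j]
                 for w, pw in probs.items() for j in range(len(pi))}
    return list(probs.values())

P = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([2 / 3, 1 / 3])            # stationary: pi @ P == pi
rates = [entropy(word_probs(pi, P, n)) / n for n in (2, 6, 10)]
rate_limit = sum(pi[i] * entropy(P[i]) for i in range(2))
print(rates, rate_limit)
```

For a stationary chain the finite-$n$ values obey $\frac{1}{n}S_n = \frac{1}{n}H(\pi)+\frac{n-1}{n}\sum_i\pi_iH(P_i)$, decreasing monotonically to the entropy rate.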

16. Computations of generalized AOW entropy for modulated states

Then we have the following theorem [25]:

16.1. Theorem

$\tilde S(\Lambda;\rho)\leq\tilde S^{(0)}(\Lambda;\rho).$

$\tilde S^{(0)}(\Lambda;\rho)$ is equal to the AOW entropy if $\{\gamma_i\}$ is a PVM (projection-valued measure) and $\Lambda$ is given by an automorphism $\theta$. $\tilde S(\Lambda;\rho)$ is equal to the AF entropy if $\{\gamma_i^*\gamma_i\}$ is a POVM (positive operator-valued measure) and $\Lambda$ is given by an automorphism $\theta$. For the noisy optical channel, the generalized AOW entropy was computed in [58] as follows.

Theorem [58] When $\rho$ is given by $\rho=\lambda|0\rangle\langle 0|+(1-\lambda)|\xi\rangle\langle\xi|$ and $\Lambda$ is the noisy optical channel with the coherent noise $|\kappa\rangle\langle\kappa|$ and parameters $\alpha,\beta$ satisfying $|\alpha|^2+|\beta|^2=1$, the quantum dynamical entropy with respect to $\Lambda$, $\rho$ and $\{\gamma_j\}$ is obtained by

$\tilde S^{(0)}(\Lambda;\rho,\{\gamma_j\})=-\sum_{j,k}q_{k,j}\,q_j\log q_{k,j},$

where

$q_j=\lambda|\langle\beta\kappa,x_j\rangle|^2+(1-\lambda)|\langle\alpha\xi+\beta\kappa,x_j\rangle|^2,$

$q_{k,j}=\nu_j^+|\langle x_k,y_j^+\rangle|^2+\left(1-\nu_j^+\right)|\langle x_k,y_j^-\rangle|^2,\qquad |y_j^+\rangle=a_j^+|\beta\kappa\rangle+b_j^+|\alpha\xi+\beta\kappa\rangle,\qquad |y_j^-\rangle=a_j^-|\beta\kappa\rangle-b_j^-|\alpha\xi+\beta\kappa\rangle,$

$a_j^+=\varepsilon_j^+a_j,\qquad a_j^-=\varepsilon_j^-a_j,\qquad b_j^+=\varepsilon_j^+b_j,\qquad b_j^-=\varepsilon_j^-b_j,$

$\varepsilon_j^+=\sqrt{\frac{\tau_j^2+2e^{-\frac{1}{2}|\xi|^2}\tau_j+1}{\tau_j^2+2e^{-\frac{1}{2}|\alpha\xi|^2}\tau_j+1}},\qquad \varepsilon_j^-=\sqrt{\frac{\tau_j^2+2e^{-\frac{1}{2}|\xi|^2}\tau_j+1}{\tau_j^2-2e^{-\frac{1}{2}|\alpha\xi|^2}\tau_j+1}},\qquad \nu_j^+=\frac{1}{2}\left(1+e^{-\frac{1}{2}\left(1-|\alpha|^2\right)|\xi|^2}\right)\frac{1}{\left(\varepsilon_j^+\right)^2},$

$\tau_j=\frac{(1-2\lambda)+(-1)^j\sqrt{1-4\lambda(1-\lambda)\left(1-e^{-|\xi|^2}\right)}}{2(1-\lambda)e^{-\frac{1}{2}|\xi|^2}},$

$|b_j|^2=\frac{1}{\tau_j^2+2e^{-\frac{1}{2}|\xi|^2}\tau_j+1},\qquad |a_j|^2=\tau_j^2|b_j|^2,\qquad \bar a_jb_j=a_j\bar b_j=\tau_j|b_j|^2.$

Theorem [58] For $n\geq 3$, the above compound state $\rho_{\Lambda,n}^{\gamma(0)}$ is written as

$\rho_{\Lambda,n}^{\gamma(0)}=\sum_{j_1,\ldots,j_n=1}^{2}q_{j_1,\ldots,j_n}\bigotimes_{k=1}^{n}|x_{j_k}\rangle\langle x_{j_k}|,$

where

$q_{j_1,\ldots,j_n}\equiv\mathrm{tr}\left[W_{j_nj_n}^*\left(\Lambda^*\left(\cdots W_{j_2j_2}^*\left(\Lambda^*\left(W_{j_1j_1}^*\left(\Lambda^*(\rho)\right)\right)\right)\cdots\right)\right)\right],$

$\Lambda^*(\rho)=\lambda|\beta\kappa\rangle\langle\beta\kappa|+(1-\lambda)|\alpha\xi+\beta\kappa\rangle\langle\alpha\xi+\beta\kappa|,$

$W_{jj}^*\left(\Lambda^*(\rho)\right)\equiv\gamma_j\Lambda^*(\rho)\gamma_j^*=\left(\lambda|\langle\beta\kappa,x_j\rangle|^2+(1-\lambda)|\langle\alpha\xi+\beta\kappa,x_j\rangle|^2\right)|x_j\rangle\langle x_j|.$

Based on [40,41,45,46], one can obtain

$\Lambda^*\left(|x_j\rangle\langle x_j|\right)=\nu_j^+|y_j^+\rangle\langle y_j^+|+\left(1-\nu_j^+\right)|y_j^-\rangle\langle y_j^-|.$

Thus we have

$q_{j_1,\ldots,j_n}=\prod_{k=2}^{n}\left(\nu_{j_{k-1}}^+|\langle x_{j_k},y_{j_{k-1}}^+\rangle|^2+\left(1-\nu_{j_{k-1}}^+\right)|\langle x_{j_k},y_{j_{k-1}}^-\rangle|^2\right)\times\left(\lambda|\langle\beta\kappa,x_{j_1}\rangle|^2+(1-\lambda)|\langle\alpha\xi+\beta\kappa,x_{j_1}\rangle|^2\right)=\left(\prod_{k=2}^{n}q_{j_k,j_{k-1}}\right)q_{j_1}.$

If $\sum_j q_{k,j}\,q_j=q_k$ holds, then we obtain the dynamical entropy with respect to $\Lambda$, $\rho$ and $\{\gamma_j\}$ as

$\tilde S^{(0)}(\Lambda;\rho,\{\gamma_j\})=-\sum_{j,k}q_{k,j}\,q_j\log q_{k,j}.$
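Under the stationarity condition $\sum_j q_{k,j}q_j=q_k$, the entropy formula above is a $q_j$-weighted sum of the column entropies of the transition matrix $q_{k,j}$. A sketch with illustrative values (a hypothetical column-stochastic $Q$, with $q$ chosen stationary for it, not probabilities computed from the optical-channel formulas):

```python
import numpy as np

def aow_entropy(q, Q):
    # S = - sum_{j,k} q_{k,j} q_j log q_{k,j}, with Q[k, j] = q_{k,j}
    S = 0.0
    for j in range(len(q)):
        for k in range(Q.shape[0]):
            if Q[k, j] > 0:
                S -= Q[k, j] * q[j] * np.log(Q[k, j])
    return S

q = np.array([2 / 3, 1 / 3])               # q_j, stationary for Q below
Q = np.array([[0.9, 0.2], [0.1, 0.8]])     # columns sum to one; sum_j Q[k,j] q_j = q_k
S = aow_entropy(q, Q)
print(S)
```

Since each column of $Q$ sums to one, the result is simply $\sum_j q_j H(Q_{\cdot,j})$, the conditional entropy of one step of the chain.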

References

  1. Accardi, L., and Ohya, M., Compound channels, transition expectations and liftings, Appl. Math. Optim., 39, 33-59 (1999).
  2. Accardi, L., Ohya, M., and Watanabe, N., Dynamical entropy through quantum Markov chain, Open Systems and Information Dynamics, 4, 71-87 (1997).
  3. Alicki, R. and Fannes, M., Defining quantum dynamical entropy, 32, 75-82 (1994).
  4. Araki, H., Relative entropy for states of von Neumann algebras, Publ. RIMS Kyoto Univ., 11, 809-833 (1976).
  5. Araki, H., Relative entropy for states of von Neumann algebras II, 13, 173-192 (1977).
  6. Barnum, H., Nielsen, M.A., and Schumacher, B.W., Information transmission through a noisy quantum channel, Physical Review A, 57, No.6, 4153-4175 (1998).
  7. Belavkin, V.P., and Ohya, M., Quantum entropy and information in discrete entangled states, Infinite Dimensional Analysis, Quantum Probability and Related Topics, 4, No.2, 137-160 (2001).
  8. Belavkin, V.P., and Ohya, M., Entanglement, quantum entropy and mutual information, Proceedings of the Royal Society of London, Series A, Mathematical and Physical Sciences, 458, 209-231 (2002).
  9. Benatti, F., Trieste Notes in Physics, Springer-Verlag (1993).
  10. Bennett, C.H., Shor, P.W., Smolin, J.A., and Thapliyal, A.V., Entanglement-assisted capacity of a quantum channel and the reverse Shannon theorem, quant-ph/0106052.
  11. Choda, M., Entropy for extensions of Bernoulli shifts, 16, 1197-1206 (1996).
  12. Connes, A., Narnhofer, H. and Thirring, W., Dynamical entropy of C*-algebras and von Neumann algebras, 112, 691-719 (1987).
  13. Connes, A., and Størmer, E., Entropy for automorphisms of von Neumann algebras, Acta Math., 134, 289-306 (1975).
  14. Donald, M. J., On the relative entropy, 105, 13-34 (1985).
  15. Emch, G.G., Positivity of the K-entropy on non-abelian K-flows, Z. Wahrscheinlichkeitstheorie verw. Gebiete, 29, 241 (1974).
  16. Fichtner, K.H., Freudenberg, W., and Liebscher, V., Beam splittings and time evolutions of Boson systems, Fakultät für Mathematik und Informatik, Math/Inf/96/39, Jena, 105 (1996).
  17. Gelfand, I. M. and Yaglom, A. M., Calculation of the amount of information about a random function contained in another such function, 12, 199-246 (1959).
  18. Holevo, A.S., Some estimates for the amount of information transmittable by a quantum communication channel (in Russian), Problemy Peredachi Informacii, 9, 3-11 (1973).
  19. Hudetz, T., Topological entropy for appropriately approximated C*-algebras, 35, 4303-4333 (1994).
  20. Ingarden, R. S., Quantum information theory, 10, 43-73 (1976).
  21. Ingarden, R.S., Kossakowski, A., and Ohya, M., Information Dynamics and Open Systems, Kluwer (1997).
  22. Jamiołkowski, A., Linear transformations which preserve trace and positive semidefiniteness of operators, Rep. Math. Phys., 3, 275-278 (1972).
  23. Kolmogorov, A. N., Theory of transmission of information, Ser.2, 33, 291-321 (1963).
  24. Kossakowski, A. and Ohya, M., New scheme of quantum teleportation, Infinite Dimensional Analysis, Quantum Probability and Related Topics, 10, No.3, 411-420 (2007).
  25. Kossakowski, A., Ohya, M. and Watanabe, N., Quantum dynamical entropy for completely positive map, Infinite Dimensional Analysis, Quantum Probability and Related Topics, 2, No.2, 267-282 (1999).
  26. Kullback, S. and Leibler, R., On information and sufficiency, 22, 79-86 (1951).
  27. Muraki, N., and Ohya, M., Entropy functionals of Kolmogorov-Sinai type and their limit theorems, Letters in Math. Phys., 36, 327-335 (1996).
  28. Muraki, N., Ohya, M., and Petz, D., Entropies of general quantum states, Open Systems and Information Dynamics, 1, 43-56 (1992).
  29. von Neumann, J., Die Mathematischen Grundlagen der Quantenmechanik, Springer, Berlin (1932).
  30. Ohya, M., Quantum ergodic channels in operator algebras, J. Math. Anal. Appl., 84, 318-328 (1981).
  31. Ohya, M., On compound state and mutual information in quantum information theory, IEEE Trans. Information Theory, 29, 770-774 (1983).
  32. Ohya, M., Note on quantum probability, Lett. Nuovo Cimento, 38, 402-404 (1983).
  33. Ohya, M., Entropy transmission in C*-dynamical systems, J. Math. Anal. Appl., 100, No.1, 222-235 (1984).
  34. Ohya, M., State change and entropies in quantum dynamical systems, 1136, 397-408 (1985).
  35. Ohya, M., Some aspects of quantum information theory and their applications to irreversible processes, Rep. Math. Phys., 27, 19-47 (1989).
  36. Ohya, M., Information dynamics and its application to optical communication processes, Springer Lecture Notes in Physics, 378, 81-92 (1991).
  37. Ohya, M., State change, complexity and fractal in quantum systems, Quantum Communications and Measurement, 2, 309-320 (1995).
  38. Ohya, M., Fundamentals of quantum mutual entropy and capacity, 6, 69-78 (1999).
  39. Ohya, M., and Petz, D., Quantum Entropy and its Use, Springer, Berlin (1993).
  40. Ohya, M., Petz, D., and Watanabe, N., On capacity of quantum channels, Probability and Mathematical Statistics, 17, 179-196 (1997).
  41. Ohya, M., Petz, D., and Watanabe, N., Numerical computation of quantum capacity, International Journal of Theoretical Physics, 37, No.1, 507-510 (1998).
  42. Ohya, M. and Volovich, I. V., On quantum entropy and its bound, 6, 301-310 (2003).
  43. Ohya, M., and Volovich, I., Mathematical Foundations of Quantum Information and Computation and Its Applications to Nano- and Bio-systems, Springer (2011).
  44. Ohya, M., and Watanabe, N., Construction and analysis of a mathematical model in quantum communication processes, Electronics and Communications in Japan, Part 1, 68, No.2, 29-34 (1985).
  45. Ohya, M., and Watanabe, N., Quantum capacity of noisy quantum channel, Quantum Communication and Measurement, 3, 213-220 (1997).
  46. Ohya, M., and Watanabe, N., Foundation of Quantum Communication Theory (in Japanese), Makino Pub. Co. (1998).
  47. Ohya, M., and Watanabe, N., Comparison of mutual entropy-type measures, TUS preprint (2003).
  48. Park, Y. M., Dynamical entropy of generalized quantum Markov chains, 32, 63-74 (1994).
  49. Schatten, R., Norm Ideals of Completely Continuous Operators, Springer-Verlag (1970).
  50. Schumacher, B.W., Sending entanglement through noisy quantum channels, Physical Review A, 54, 2614 (1996).
  51. Schumacher, B.W., and Nielsen, M.A., Quantum data processing and error correction, Physical Review A, 54, 2629 (1996).
  52. Shannon, C. E., A mathematical theory of communication, Bell System Technical Journal, 27, 379-423 and 623-656 (1948).
  53. Shor, P., The quantum channel capacity and coherent information, Lecture Notes, MSRI Workshop on Quantum Computation (2002).
  54. Uhlmann, A., Relative entropy and the Wigner-Yanase-Dyson-Lieb concavity in interpolation theory, Commun. Math. Phys., 54, 21-32 (1977).
  55. Umegaki, H., Conditional expectations in an operator algebra IV (entropy and information), Kodai Math. Sem. Rep., 14, 59-85 (1962).
  56. Urbanik, K., Joint probability distribution of observables in quantum mechanics, 21, 117-133 (1961).
  57. Voiculescu, D., Dynamical approximation entropies and topological entropy in operator algebras, 170, 249-281 (1995).
  58. Watanabe, N., Some aspects of complexities for quantum processes, Open Systems and Information Dynamics, 16, 293-304 (2009).
  59. Watanabe, N., Note on entropies of quantum dynamical systems, Foundations of Physics, 41, 549-563, DOI 10.1007/s10701-010-9455-x (2011).
