Complex systems and their dynamics are treated in various ways. Ohya looked for a synthesizing method to treat complex systems and established Information Dynamics, a new concept unifying the dynamics of a state and the complexity of the system itself. By applying Information Dynamics, one can discuss in a unified frame problems arising in mathematics, physics, biology, and information science. Information Dynamics is growing as a research field; for instance, the international journal "Open Systems and Information Dynamics" appeared in 1992. In ID there are two types of complexity: (a) the complexity of a state describing the system itself and (b) the transmitted complexity between two systems. The entropies of classical and quantum information theory are examples of the complexities (a) and (b), respectively.
Shannon  found that the entropy, introduced in physical systems by Clausius and Boltzmann, can be used to express the amount of information carried by communication processes, and in the middle of the 20th century he proposed the so-called information communication theory. In his information theory, the entropy and the mutual entropy (information) are the most important concepts. The entropy corresponds to the complexity of ID, measuring the amount of information of the state of a system. The mutual entropy (information) corresponds to the transmitted complexity of ID, representing the amount of information correctly transmitted from the initial system to the final system through a channel; it was extended to the mutual entropy on continuous probability spaces by Gelfand, Kolmogorov and Yaglom [17,23], defined by means of the Kullback-Leibler relative entropy of two states .
Laser light is often used in current communication. A formulation of information theory able to treat quantum effects is therefore necessary: the so-called quantum information theory. Quantum information theory is important in both mathematics and engineering, and it has been developed together with quantum entropy theory and quantum probability. One of its important problems is to investigate how much information is exactly transmitted from the input system to the output system through a quantum channel. The amount of information of the quantum input system is described by the quantum entropy defined by von Neumann  in 1932. The C*-entropy was introduced in [33,35] and its properties are discussed in [28,21]. The quantum relative entropy was introduced by Umegaki  and extended to general quantum systems by Araki [4,5], Uhlmann  and Donald . Furthermore, it was necessary to extend Shannon's mutual entropy (information) of classical information theory to the quantum setting. The classical mutual entropy is defined by means of the joint probability expressing a correlation between the input system and the output system. However, Urbanik  showed that in quantum systems a joint probability distribution does not generally exist. The semi-classical mutual entropy was introduced by Holevo, Levitin and Ingarden [18,20] for a classical input and output passing through a possibly quantum channel. By introducing a new notion, the so-called compound state, Ohya formulated in 1983 the mutual entropy [31,32] in a completely quantum mechanical system (i.e., the input state, output state and channel are all quantum mechanical), which is called the Ohya mutual entropy. It was generalized to C*-algebras in [Oent84]. The quantum capacity  is defined by taking the supremum of the Ohya mutual entropy.
By using the Ohya quantum mutual entropy, one can discuss the efficiency of information transmission in quantum systems [28,27,44,34,35], which allows a detailed analysis of optical communication processes. Concerning quantum communication processes, several studies have been carried out in [31,32,35,40,41]. Recently, several mutual entropy type measures (the Lindblad-Nielsen entropy  and the coherent entropy ) were defined by means of the entropy exchange. One can classify these mutual entropy type measures by computing them for concrete quantum channels. These entropy type complexities are explained in [39,43].
The entangled state is an important concept in quantum theory and has recently been studied by several authors. One remarkable formulation for characterizing entangled states is Jamiolkowski's isomorphism . It may be related to the construction of the compound state in quantum communication processes. One can also discuss the entangled states generated by beam splitting and by squeezed states.
The classical dynamical (or Kolmogorov-Sinai) entropy S(T)  for a measure preserving transformation T was defined on a message space through finite partitions of the measurable space. The classical coding theorems of Shannon, formulated via the mean dynamical entropy and the mean dynamical mutual entropy, are important tools for analyzing communication processes. The mean dynamical entropy represents the amount of information per letter of a signal sequence sent from the input source, and the mean dynamical mutual entropy the amount of information per letter of the signal received in the output system. In this chapter, we discuss the complexity of quantum dynamical systems by calculating the mean mutual entropy with respect to modulated initial states and the attenuation channel for quantum dynamical systems .
The quantum dynamical entropy (QDE) was studied by Connes-Størmer , Emch , Connes-Narnhofer-Thirring , Alicki-Fannes , and others [9,48,19,57,11]. Their dynamical entropies were defined on the observable spaces. Recently, the quantum dynamical entropy and the quantum dynamical mutual entropy were studied by the present authors [34,35]: (1) the dynamical entropy is defined in the state spaces through the complexity of Information Dynamics ; (2) a definition through the quantum Markov chain (QMC) was given in ; (3) the dynamical entropy for a completely positive (CP) map was defined in . In this chapter, we discuss the complexity of the quantum dynamical process by calculating the generalized AOW entropy, given by the KOW entropy, for the noisy optical channel .
2. Quantum channels
The signal of the input quantum system is transmitted through a physical device called a quantum channel. The concept of channel has played an important role in the progress of quantum information communication theory. Mathematically, a quantum channel is a mapping from the input state space to the output state space. In particular, the attenuation channel  and the noisy optical channel  are remarkable examples of quantum channels describing quantum optical communication processes. These channels are related to the mathematical description of the beam splitter.
Here we review the definition of the quantum channels.
Let $\mathcal{H}_1$ and $\mathcal{H}_2$ be the Hilbert spaces of the input and output systems, respectively, $B(\mathcal{H}_k)$ the set of all bounded linear operators on the separable Hilbert space $\mathcal{H}_k$, and $\mathfrak{S}(\mathcal{H}_k)$ the set of all density operators on $\mathcal{H}_k$. A quantum channel $\Lambda^*$ is a mapping from $\mathfrak{S}(\mathcal{H}_1)$ to $\mathfrak{S}(\mathcal{H}_2)$.
$\Lambda^*$ is called a linear channel if it satisfies
$\Lambda^*(\lambda\rho_1 + (1-\lambda)\rho_2) = \lambda\Lambda^*\rho_1 + (1-\lambda)\Lambda^*\rho_2$
for any $\rho_1, \rho_2 \in \mathfrak{S}(\mathcal{H}_1)$ and any $\lambda \in [0,1]$.
$\Lambda^*$ is called a completely positive (CP) channel if it is linear and its dual map $\Lambda$ from $B(\mathcal{H}_2)$ to $B(\mathcal{H}_1)$ satisfies
$\sum_{i,j=1}^{n} A_i^* \Lambda(B_i^* B_j) A_j \ge 0$
for any $n \in \mathbb{N}$, any $A_i \in B(\mathcal{H}_1)$ and any $B_j \in B(\mathcal{H}_2)$.
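A CP channel can be realized concretely in Kraus form, $\rho \mapsto \sum_i K_i \rho K_i^*$. The following is a minimal numerical sketch, not tied to the notation above: it uses plain 2x2 complex matrices and assumed amplitude-damping Kraus operators with transmissivity `eta` (a toy analogue of photon loss) to verify complete positivity with trace preservation and to apply the channel to a density matrix.

```python
import math

# A quantum channel in Kraus form: rho -> sum_i K_i rho K_i^dagger.
# Sketch with plain nested-list 2x2 matrices; eta is an assumed parameter.

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def dagger(a):
    n = len(a)
    return [[a[j][i].conjugate() for j in range(n)] for i in range(n)]

def madd(a, b):
    return [[a[i][j] + b[i][j] for j in range(len(a))] for i in range(len(a))]

def trace(a):
    return sum(a[i][i] for i in range(len(a)))

def apply_channel(kraus, rho):
    out = [[0j] * len(rho) for _ in rho]
    for k in kraus:
        out = madd(out, matmul(matmul(k, rho), dagger(k)))
    return out

eta = 0.8  # assumed transmissivity
K0 = [[1, 0], [0, math.sqrt(eta)]]
K1 = [[0, math.sqrt(1 - eta)], [0, 0]]

# Trace preservation of a CP channel requires sum_i K_i^dagger K_i = I.
comp = madd(matmul(dagger(K0), K0), matmul(dagger(K1), K1))

rho = [[0.3, 0.2], [0.2, 0.7]]        # an input density operator
sigma = apply_channel([K0, K1], rho)  # the output state, still trace one
```

The check `comp == I` is exactly the dual-map condition $\Lambda(I) = I$ for the unital dual of a trace-preserving channel.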
3. Quantum communication processes
Let $\mathcal{K}_1$ and $\mathcal{K}_2$ be two Hilbert spaces expressing the noise and loss systems, respectively. A quantum communication process including the influence of noise and loss is described by the following scheme : let $\rho$ be an input state in $\mathfrak{S}(\mathcal{H}_1)$ and $\xi$ a noise state in $\mathfrak{S}(\mathcal{K}_1)$.
The above maps are given as
The map $\Pi^*$ is a CP channel from $\mathfrak{S}(\mathcal{H}_1\otimes\mathcal{K}_1)$ to $\mathfrak{S}(\mathcal{H}_2\otimes\mathcal{K}_2)$ determined by the physical properties of the device transmitting the signals. Hence the channel for the above process is given as
$\Lambda^*(\rho) = \mathrm{tr}_{\mathcal{K}_2}\,\Pi^*(\rho\otimes\xi)$
for any $\rho \in \mathfrak{S}(\mathcal{H}_1)$. Based on this scheme, the noisy optical channel is constructed as follows:
4. Noisy optical channel
The noisy optical channel with a noise state was defined by Ohya and Watanabe  as
where is the photon number state in and is a mapping from to denoted by
and is the photon number state vector in , and , are complex numbers satisfying . and are constants given by , . We have the following theorem.
where , is a CONS in , is a CONS in and is the set of number states in .
In particular for the coherent input states
the output state of is obtained by
5. Attenuation channel
The noisy optical channel with the vacuum noise state is called the attenuation channel; it was introduced in  by
where is the vacuum state in and is a mapping from to given by
In particular, for the coherent input state $\rho = |\theta\rangle\langle\theta|$,
one can obtain the output state $\Lambda^*(\rho) = |a\theta\rangle\langle a\theta|$, again a coherent state, with attenuated amplitude.
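Numerically this can be illustrated through photon statistics: a coherent state has a Poisson photon-number distribution, and attenuation rescales its mean photon number from $|\theta|^2$ to $|a|^2|\theta|^2$. The sketch below assumes $|\theta|^2 = 4$ and transmissivity $|a|^2 = 0.25$.

```python
import math

# Photon-number distribution of a coherent state: Poisson with the given
# mean photon number. Attenuation maps mean |theta|^2 to |a|^2 |theta|^2.

def coherent_number_dist(mean_photons, nmax):
    return [math.exp(-mean_photons) * mean_photons ** n / math.factorial(n)
            for n in range(nmax + 1)]

theta2, eta = 4.0, 0.25          # assumed |theta|^2 and transmissivity |a|^2
p_out = coherent_number_dist(eta * theta2, 40)
mean_out = sum(n * p for n, p in enumerate(p_out))
# mean_out is close to eta * |theta|^2 = 1.0 (truncation tail is negligible)
```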
Lifting from to in the sense of Accardi and Ohya  is denoted by
(or its marginal state) is called a beam splitting. Based on liftings, the beam splitting was studied by Accardi-Ohya  and Fichtner-Freudenberg-Liebscher .
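On coherent amplitudes a lossless beam splitter acts linearly: in the standard convention (an assumption of this sketch; sign conventions vary) the pair $(\alpha,\beta)$ is mapped to $(a\alpha + b\beta,\, -b\alpha + a\beta)$ with $a^2 + b^2 = 1$. The sketch checks that the total mean photon number is conserved.

```python
import math

# Beam-splitter action on coherent amplitudes, assumed real transmissivity
# a^2 = 0.6 and arbitrary complex input amplitudes.
a, b = math.sqrt(0.6), math.sqrt(0.4)
alpha, beta = 2 + 1j, -0.5j

out1 = a * alpha + b * beta      # transmitted-port amplitude
out2 = -b * alpha + a * beta     # reflected-port amplitude

# Energy conservation: |alpha|^2 + |beta|^2 == |out1|^2 + |out2|^2
energy_in = abs(alpha) ** 2 + abs(beta) ** 2
energy_out = abs(out1) ** 2 + abs(out2) ** 2
```

Tracing out one port of the lifted state reproduces the attenuation channel of the previous section, which is why the beam splitting and the lifting are discussed together.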
6. Information dynamics
We are interested in studying the dynamics of state change and the complexity of states for various systems. Information Dynamics (ID) is a concept introduced by Ohya  to synthesize these schemes of investigation into one theoretical framework. In ID, a complexity of a state describing the system itself and a transmitted complexity between two systems are used. Examples of these complexities are Shannon's entropy and the mutual entropy (information) in classical entropy theory. In quantum entropy theory, the von Neumann entropy and the Ohya mutual entropy correspond to these complexities. Recently, several mutual entropy type measures (the Lindblad-Nielsen entropy  and the coherent entropy ) were proposed by means of the entropy exchange for an input state and a channel.
7. Concept of information dynamics
Ohya introduced Information Dynamics (ID) by synthesizing the dynamics of state change and the complexity of states. Based on ID, one can study various problems in physics and other fields. The channel and two complexities are the key concepts of ID. Two kinds of complexities are used: one is a complexity of a state measured from a subset, and the other is a transmitted complexity according to the state change from to . Let be subsets of , respectively. These complexities should fulfill the following conditions:
8. Complexity of system
For any , is nonnegative (i.e., )
For a bijection from to ,
holds, where is the set of all extremal points of .
For , , ,
This means that the complexity of the state of totally independent systems is given by the sum of the complexities of the states and .
9. Transmitted complexity
(1’) For any and a channel , is nonnegative (i.e., )
(4) and satisfy the following inequality .
(5) If the channel is given by the identity map , then holds.
Examples of the above complexities are the Shannon entropy for the complexity of a state and the classical mutual entropy for the transmitted complexity, respectively. Let us consider these complexities for quantum communication processes.
10. Quantum entropy
Since present optical communication uses optical signals that include quantum effects, it is necessary to construct a new information theory dealing with those quantum phenomena in order to discuss rigorously the efficiency of information transmission in optical communication processes. Quantum information theory is important in both mathematics and engineering, and it contains several topics, for instance, quantum entropy theory, quantum communication theory, quantum teleportation, quantum entanglement, quantum algorithms, quantum coding theory and so on. It has been developed together with quantum entropy theory and quantum probability. One of its important problems is to investigate how much information is exactly transmitted from the input system to the output system through a quantum channel. The amount of information of the quantum communication system is described by the quantum mutual entropy defined by Ohya , based on the quantum entropy by von Neumann  and the quantum relative entropy by Umegaki , Araki  and Uhlmann . Quantum information theory directly relates to quantum communication theory, for instance, [40,41,45]. One of the most important communication processes is quantum teleportation, a new treatment of which was studied in . It is also important to classify quantum states. One such classification is the study of entanglement and separability of states (see [7,8]). There have been many attempts in finite dimensional Hilbert spaces; however, quantum mechanics should basically be discussed in infinite dimensional Hilbert spaces, so such a classification has to be studied there as well.
10.1. Von Neumann entropy
The study of entropy in quantum systems was begun by von Neumann  in 1932. For any state given by a density operator $\rho$, the von Neumann entropy is defined by
$S(\rho) = -\mathrm{tr}\,\rho\log\rho.$
Since the von Neumann entropy satisfies conditions (1), (2) and (3) of the complexity of state of ID, it can be regarded as an example of the complexity of state.
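Since $S(\rho)$ depends only on the spectrum of $\rho$, it reduces to the Shannon entropy of the eigenvalue list once $\rho$ is diagonalized. The sketch below computes it for an assumed spectrum (in nats, i.e. with the natural logarithm).

```python
import math

# von Neumann entropy S(rho) = -tr(rho log rho), evaluated on the
# eigenvalues of the density operator; zero eigenvalues contribute nothing.

def von_neumann_entropy(eigenvalues):
    return -sum(p * math.log(p) for p in eigenvalues if p > 0)

spectrum = [0.5, 0.25, 0.25]          # assumed eigenvalues of some rho
S = von_neumann_entropy(spectrum)
# A pure state (a single eigenvalue 1) has zero entropy.
```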
10.2. Entropy for general systems
Here we briefly comment on general entropies of states in C*-dynamical systems. The C*-entropy (mixing entropy) was introduced by Ohya in [33,35] and its properties are discussed in [28,21].
Let be a C*-dynamical system and be a weak* compact and convex subset of . For example, is given by (the set of all states on ), (the set of all invariant states for ), (the set of all KMS states),and so on. Every state has a maximal measure pseudosupported on such that
where is the set of all extreme points of . The measure giving the above decomposition is not unique unless is a Choquet simplex. The set of all such measures is denoted by and is the subset of constituted by
where is the Dirac measure concentrated on an initial state . For a measure , the entropy type functional is given by
It describes the amount of information of the state measured from the subsystem . If , then is denoted by . This entropy is an extension of the von Neumann entropy mentioned above.
10.3. Quantum relative entropy
The classical relative entropy on continuous probability spaces was defined by Kullback-Leibler . It was then developed in noncommutative probability spaces. The quantum relative entropy was first defined by Umegaki  for σ-finite von Neumann algebras; it expresses a certain difference between two states. It was extended by Araki  and Uhlmann  to general von Neumann algebras and *-algebras, respectively.
10.4. Umegaki relative entropy
The relative entropy of two states was introduced by Umegaki in  for $\sigma$-finite and semi-finite von Neumann algebras. Corresponding to the classical relative entropy, for two density operators $\rho$ and $\sigma$ it is defined as
$S(\rho,\sigma) = \mathrm{tr}\,\rho(\log\rho - \log\sigma)$ (if $s(\rho) \le s(\sigma)$; $+\infty$ otherwise),
where $s(\rho) \le s(\sigma)$ means that the support projection of $\sigma$ is greater than the support projection of $\rho$. It quantifies a certain difference between the two quantum states $\rho$ and $\sigma$. The Umegaki relative entropy satisfies (1) positivity, (2) joint convexity, (3) symmetry, (4) additivity, (5) lower semicontinuity, and (6) monotonicity. Araki  and Uhlmann  extended this relative entropy to more general quantum systems.
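For commuting states (simultaneously diagonalizable $\rho$ and $\sigma$), the Umegaki relative entropy reduces to the classical Kullback-Leibler divergence of the two spectra, which is the case sketched below with assumed spectra.

```python
import math

# Relative entropy of commuting states: S(rho, sigma) reduces to the
# Kullback-Leibler divergence of the eigenvalue lists p and q.

def relative_entropy(p, q):
    # assumes support condition: q[i] > 0 wherever p[i] > 0
    return sum(pi * (math.log(pi) - math.log(qi))
               for pi, qi in zip(p, q) if pi > 0)

rho_spec = [0.7, 0.3]                  # assumed spectrum of rho
sigma_spec = [0.5, 0.5]                # assumed spectrum of sigma
D = relative_entropy(rho_spec, sigma_spec)
# Positivity: S(rho, sigma) >= 0, with equality iff rho == sigma.
```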
10.5. Relative entropy for general systems
10.5.1. Araki's relative entropy[4,5]
Let be a -finite von Neumann algebra acting on a Hilbert space and be normal states on given by and with (i.e., is a positive natural cone) . On the domain , the operator is defined by
where (the -support of ) is the projection from to . Using this , the relative modular operator is defined as , whose spectral decomposition is denoted by (is the closure of ). Then the Araki’s relative entropy is given by
where means that implies for .
10.5.2. Uhlmann's relative entropy
Let be a complex linear space and be two semi-norms on . is the set of all positive Hermitian forms on satisfying for all . For , the quadratical mean of and is defined by
For each , there exists a family of semi-norms of , which is called the quadratical interpolation from to , satisfying the following conditions:
For any , is continuous in ,
This semi-norm is denoted by . It is shown that for any positive Hermitian forms , there exists a unique function of with values in the set such that is the quadratical interpolation from to . For , the relative entropy functional of and is defined as
Let be a *-algebra . For positive linear functional on , two Hermitian forms are given by and .
10.5.3. Ohya mutual entropy 
The Ohya mutual entropy  with respect to an initial state $\rho$ and a quantum channel $\Lambda^*$ is defined by
$I(\rho;\Lambda^*) = \sup_E \sum_n \lambda_n S(\Lambda^* E_n,\, \Lambda^*\rho),$
where $S(\cdot,\cdot)$ is the Umegaki relative entropy and $\rho = \sum_n \lambda_n E_n$ represents a Schatten-von Neumann (one dimensional orthogonal) decomposition  of $\rho$. Since the Schatten-von Neumann decomposition of a state is not unique unless the eigenvalues of $\rho$ are non-degenerate, the supremum is taken over all Schatten-von Neumann decompositions of $\rho$. The Ohya mutual entropy then satisfies the following Shannon-type inequality :
$0 \le I(\rho;\Lambda^*) \le \min\{S(\rho),\, S(\Lambda^*\rho)\},$
where $S(\rho)$ is the von Neumann entropy. These inequalities show that the Ohya mutual entropy represents the amount of information correctly carried from the input system to the output system through the quantum channel. The capacity expresses the ability of the communication process to transmit information; it was studied in [40,41,45].
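When the input state and the channel outputs all commute, the Schatten decomposition is the eigen-decomposition and the Ohya mutual entropy reduces to a classical mutual information. The toy sketch below assumes such a commuting model: the channel acts on the spectrum as a stochastic matrix (a symmetric flip with assumed error probability), and the Shannon-type inequality is checked numerically.

```python
import math

# Ohya mutual entropy in a commuting toy model:
# I(rho; Lambda) = sum_n lambda_n S(Lambda E_n, Lambda rho),
# where each Lambda E_n has spectrum channel_rows[n].

def kl(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_entropy(lam, channel_rows):
    # spectrum of Lambda(rho) is the lam-average of the rows
    out = [sum(lam[n] * channel_rows[n][j] for n in range(len(lam)))
           for j in range(len(channel_rows[0]))]
    return sum(lam[n] * kl(channel_rows[n], out) for n in range(len(lam)))

lam = [0.5, 0.5]                      # eigenvalues of the input state
flip = 0.1                            # assumed error probability
rows = [[1 - flip, flip], [flip, 1 - flip]]
I = mutual_entropy(lam, rows)
S_in = -sum(p * math.log(p) for p in lam)   # von Neumann entropy of input
# Shannon-type inequality: 0 <= I <= S_in.
```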
For a certain set $\mathcal{S}$ satisfying some physical conditions, the capacity of a quantum channel  is defined by
$C^{\mathcal{S}}(\Lambda^*) = \sup\{\, I(\rho;\Lambda^*) : \rho \in \mathcal{S} \,\}.$
If $\mathcal{S}$ is the set of all states, the capacity is denoted by $C(\Lambda^*)$. The following theorem for the attenuation channel was proved in .
where is the support projection of
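In the commuting toy model introduced above, the capacity becomes a supremum of the mutual entropy over input spectra, which can be approximated by a grid search over the input eigenvalue (this is only an illustrative sketch with an assumed stochastic-matrix channel, not the quantum capacity formula of the theorem).

```python
import math

# Capacity as sup over input states of the mutual entropy, in a commuting
# two-level toy model; the channel rows are an assumed stochastic matrix.

def kl(p, q):
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual(lam, rows):
    out = [sum(lam[n] * rows[n][j] for n in range(len(lam)))
           for j in range(len(rows[0]))]
    return sum(lam[n] * kl(rows[n], out) for n in range(len(lam)))

rows = [[0.9, 0.1], [0.2, 0.8]]       # assumed channel action on spectra
grid = [i / 1000 for i in range(1, 1000)]
capacity = max(mutual([t, 1 - t], rows) for t in grid)
# capacity is bounded by log 2 (nats) for a two-level output.
```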
10.6. Mutual entropy for general systems
Based on the classical relative entropy, the mutual entropy was discussed by Shannon to study information transmission in classical systems, and it was extended by Ohya [33,34,35] to fully general quantum systems.
Let be a unital -system and be a weak* compact convex subset of . For an initial state and a channel , two compound states are
The compound state expresses the correlation between the input state and the output state . The mutual entropy with respect to and is given by
and the mutual entropy with respect to is defined by Ohya  as
10.7. Mutual entropy type complexity
These measures are based on the entropy exchange $S_e(\rho,\Lambda^*) = -\mathrm{tr}\,W\log W$, where $W$ is the matrix with entries
$W_{ij} = \mathrm{tr}\,A_i\rho A_j^*$
for a state $\rho$ and a Stinespring-Sudarshan-Kraus form $\Lambda^*(\rho) = \sum_i A_i\rho A_i^*$ of the channel.
In this section, we compare these mutual entropy type measures. By comparing them for quantum information communication processes, we have the following theorem :
(Ohya mutual entropy),
For the attenuation channel , one can obtain the following theorems :
(Ohya mutual entropy),
(Ohya mutual entropy),
The above theorem shows that the coherent entropy takes a negative value for  and that the Lindblad-Nielsen entropy is greater than the von Neumann entropy of the input state for . From these theorems and other results , we conclude that the Ohya mutual entropy is the most suitable measure for discussing the efficiency of information transmission in quantum communication processes; that is, the Ohya mutual entropy can be considered as the transmitted complexity for quantum communication processes.
11. Quantum dynamical entropy
The classical dynamical (or Kolmogorov-Sinai) entropy S(T)  for a measure preserving transformation T was defined on a message space through finite partitions of the measurable space.
The classical coding theorems of Shannon, formulated via the mean dynamical entropy and the mean dynamical mutual entropy, are important tools for analysing communication processes. The mean dynamical entropy represents the amount of information per letter of a signal sequence sent from an input source, and the mean dynamical mutual entropy the amount of information per letter of the signal received in an output system.
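For a stationary memoryless (Bernoulli) source the block entropy of length-$n$ messages grows linearly, and the per-letter rate $S_n/n$ equals the single-letter entropy for every $n$. The sketch below checks this for an assumed Bernoulli(0.3) source; for sources with memory the rate would only converge in the limit.

```python
import math
from itertools import product

# Mean (dynamical) entropy of a memoryless source: block entropy over
# length-n words divided by n equals the per-letter Shannon entropy.

def block_entropy(p, n):
    total = 0.0
    for word in product(range(len(p)), repeat=n):
        prob = math.prod(p[i] for i in word)
        total -= prob * math.log(prob)
    return total

p = [0.3, 0.7]                               # assumed letter probabilities
h = -sum(x * math.log(x) for x in p)         # per-letter entropy (nats)
rates = [block_entropy(p, n) / n for n in (1, 2, 3)]
# For a Bernoulli source every rate equals h exactly.
```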
The quantum dynamical entropy (QDE) was studied by Connes-Størmer , Emch , Connes-Narnhofer-Thirring , Alicki-Fannes , and others [9,48,19,57,11]. Their dynamical entropies were defined on the observable spaces. Recently, the quantum dynamical entropy and the quantum dynamical mutual entropy were studied by the present authors [34,35]: (1) the dynamical entropy is defined in the state spaces through the complexity of Information Dynamics ; (2) a definition through the quantum Markov chain (QMC) was given in ; (3) the dynamical entropy for completely positive (CP) maps was introduced in .
12. Mean entropy and mean mutual entropy
Shannon's classical coding theorems are an important subject in the study of communication processes; they have been formulated via the mean entropy and the mean mutual entropy based on the classical dynamical entropy. The mean entropy shows the amount of information per letter of a signal sequence of an input source, and the mean mutual entropy the amount of information per letter of the signal received in an output system. These mean entropies were extended to general quantum systems.
In quantum information theory, a stationary information source is described by a triple with a stationary state with respect to ; that is, is a unital C*-algebra, is the set of all states over , is an automorphism of , and is a state over with .
Let an output -dynamical system be the triple , and be a covariant channel which is a dual of a completely positive unital map such that .
In this section, we explain functional , , and introduced in [35,27] for a pair of finite sequences of , of completely positive unital maps , where and (, , , , , ) are finite dimensional unital -algebras.
Let be a weak* convex subset of and be a state in . We denote the set of all regular Borel probability measures on the state space of by , so that is maximal in the Choquet ordering and represents . Such measures are called extremal decomposition measures for . Using Choquet's theorem, one can show that such a measure exists for any state . For given finite sequences of completely positive unital maps from finite dimensional unital C*-algebras and a given extremal decomposition measure of , the compound state of on the tensor product algebra is given by [35,27]
Furthermore is a compound state of and with , constructed as
For any pair of finite sequences and of completely positive unital maps , from finite dimensional unital -algebras and any extremal decomposition measure of , the entropy functional and the mutual entropy functional are defined in [35,27] such as
where is the relative entropy.
For a given pair of finite sequences of completely positive unital maps , , the functional (resp. ) is given in [35,27] by taking the supremum of (resp. ) over all possible extremal decompositions 's of :
Let (resp. ) be a unital -algebra with a fixed automorphism (resp. ), be a covariant completely positive unital map from to , and be an invariant state over , i.e., .
The functional and are defined by taking the supremum for all possible 's, 's, 's, and 's:
Then the fundamental inequality in information theory holds for and .
These functionals and are constructed from the functionals and coming from information theory, and they are obtained by using a channel transformation, so that they contain the dynamical entropy as a special case [35,27]. Moreover, these functionals contain the usual K-S entropies as follows [35,27].
for any finite partitions of a probability space
In particular, a quantum extension of the classical formulation of information transmission, giving a basis for Shannon's coding theorems, can be considered in the case that , , and are shift operators, both denoted by . In this case, the channel capacity is defined as [40,41,45,46,38,39,42,43]
Using this capacity, one can consider Shannon's coding theorems in fully quantum systems.
13. Computations of mean entropies for modulated states
Based on the paper , we here explain general modulated states and briefly review some examples of modulated states (PPM, OOK, PSK).
Let be an alphabet set constituting the input signals and be the set of one dimensional projections on a Hilbert space satisfying
corresponds to the alphabet .
We denote the set of all density operators on generated by
where an element of represents a state of the quantum input system. The state is transmitted from the quantum input system to the quantum modulator in order to send information effectively, whose transmitted state is called the quantum modulated state. The quantum modulated states are denoted as follows: Let be an ideal modulator and be the set of one dimensional projections on a Hilbert space for modulated signals satisfying
, and we represent the set of all density operators on by
where an element of represents a modulated state of the quantum input system. There are many expressions for the modulations. In this section, we take the modulated states by means of the photon number states.
is called a modulator if it is a map from to satisfying (1) is a completely positive unital map from to . Moreover, is called an ideal modulator if (1) is a modulator from to and (2) holds for any orthogonal . Some examples of ideal modulators are given as follows:
For any , the PPM (Pulse Position Modulator) is defined by
where is the vacuum state on .
For , the OOK (On-Off Keying) is defined by
where is the coherent state on .
For , the PSK (Phase Shift Keying) is defined by
where are the coherent states on .
Now we briefly review the calculation of the mean mutual entropy of K-S type for the PSK modulated state by means of coherent states. Other calculations are given in .
are given by
where and hold.
PSK. For an initial state , let . The Schatten decomposition of is obtained as
where the eigenvalues of are
Two projections () and the eigenvectors of () are given by
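The Schatten decomposition of the PSK input can be made explicit in a simple case. For the equal mixture $\rho = \tfrac12(|\alpha\rangle\langle\alpha| + |-\alpha\rangle\langle-\alpha|)$ of two coherent states, the overlap $\langle\alpha|-\alpha\rangle = e^{-2|\alpha|^2}$ gives the two nonzero eigenvalues $(1 \pm e^{-2|\alpha|^2})/2$; this is a sketch with an assumed amplitude, stated under that standard overlap formula.

```python
import math

# Eigenvalues of the PSK input mixture of |alpha> and |-alpha>:
# lambda_(+/-) = (1 +/- kappa) / 2 with kappa = <alpha|-alpha> = exp(-2|alpha|^2).

def psk_eigenvalues(alpha_abs2):
    kappa = math.exp(-2.0 * alpha_abs2)
    return (1 + kappa) / 2, (1 - kappa) / 2

lam_plus, lam_minus = psk_eigenvalues(1.0)   # assumed |alpha|^2 = 1
# The eigenvalues sum to 1; for large |alpha| both approach 1/2,
# i.e. the two signal states become distinguishable.
```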
For the above initial state , one can obtain the output state for the attenuation channel as follows:
where the eigenvalues of are given by ( )
are the eigenstates with respect to . Then we have
When is given by the attenuation channel, we get
The compound state through the attenuation channel becomes
By using the above lemma, we have the following theorem.
14. KOW dynamical entropy
In this section, we briefly explain the definition of the KOW entropy according to .
For a normal state on and a normal, unital CP linear map from to , one can define a transition expectation from to by
in the sense of Accardi and Ohya . For a normal, unital CP map from to and the identity map on , the transition expectation
and the lifting is defined by
for all and any , a lifting from to and marginal states are given by
where is the von Neumann entropy of defined by . The dynamical entropy with respect to and is defined as
15. Formulation of generalized AF and AOW entropies by KOW entropy
In this section, we briefly explain the generalized AF and AOW entropies based on the KOW entropy .
For a finite operational partition of unity , i.e., , and a normal unital CP map from to , transition expectations from to and from to are defined by
where with normalized vectors , in is the matrix algebra and is a subalgebra of consisting of diagonal elements of . Then the quantum Markov states
and is obtained by
Therefore the generalized AF entropy and the generalized AOW entropy of and with respect to a finite dimensional subalgebra are obtained by
where the dynamical entropies and are given by
16. Computations of generalized AOW entropy for modulated states
Then we have the following theorem :
is equal to the AOW entropy if is a PVM (projection valued measure) and is given by an automorphism . It is equal to the AF entropy if is a POVM (positive operator valued measure) and is given by an automorphism . For the noisy optical channel, the generalized AOW entropy was obtained in  as follows.
Thus we have
If holds, then we get the dynamical entropy with respect to , and as