Information Capacity of Quantum Transfer Channels and Thermodynamic Analogies

We will begin with a simple type of stationary stochastic systems of quantum physics, treating them within the framework of Shannon information theory and thermodynamics, but starting from their algebraic representation. Based on this algebraic description, a model of information transmission in these systems is stated by defining the Shannon information in terms of a variable describing the system state. Measurement on such a system is then defined as the spectral decomposition of the operators of the measured quantities. The information capacity formulas, at this point of narrow-band nature, are then derived for the simple system governed by the Bose-Einstein (B-E) law [the bosonic (photonic) channel] and for the one governed by the Fermi-Dirac (F-D) law [the fermionic (electron) channel]. A nonzero value of the average input energy needed for information transmission to exist in F-D systems is established [11, 12].


Introduction
We will begin with a simple type of stationary stochastic 1 systems of quantum physics, treating them within the framework of Shannon information theory and thermodynamics, but starting from their algebraic representation. Based on this algebraic description, a model of information transmission in these systems is stated by defining the Shannon information in terms of a variable describing the system state. Measurement on such a system is then defined as the spectral decomposition of the operators of the measured quantities. The information capacity formulas, at this point of narrow-band nature, are then derived for the simple system governed by the Bose-Einstein (B-E) law [the bosonic (photonic) channel] and for the one governed by the Fermi-Dirac (F-D) law [the fermionic (electron) channel]. A nonzero value of the average input energy needed for information transmission to exist in F-D systems is established [11, 12].
Further, the wide-band information capacity formulas for the B-E and F-D cases are stated. Also, the original thermodynamic derivation of the capacity of the wide-band photonic channel, as stated by Lebedev and Levitin in 1966, is revised. This revision is motivated by an apparent relationship between the B-E (photonic) wide-band information capacity and the heat efficiency of a certain heat cycle, which is further considered as a demonstration model for the processes of information transfer in the original wide-band photonic channel. The information characteristics of a model reverse heat cycle, the information arrangement of which is set up to be most analogous to the structure of the photonic channel considered, are analyzed by this model; we see the necessity of returning the transfer medium (the channel itself) to its initial state as a condition for a sustained, repeatable transfer. This is not regarded in [12, 30], where only a single act of information transfer is considered. Or the return is regarded, but the energy needed for it is covered from the environment by opening the whole transfer chain, not counted within the transfer chain itself, as we do now. The result is a corrected capacity formula for wide-band photonic (B-E) channels used for an information transfer organized cyclically.

1 We deal with a system which takes on, at times t = 0, 1, ..., states θ_t from a state space Θ. If for any t_0 the relative frequencies of events B ⊂ Θ, (1/T)·Σ_{t=t_0+1}^{t_0+T} I_B(θ_t), tend for T → ∞ to probabilities p_{t_0}(B), we speak about a stochastic system. If these probabilities do not depend on the beginning t_0, a stationary stochastic system is spoken about.

Information transfer channel
An information transfer channel K is defined as an arranged tri-partite structure [5]:
- X is an input stochastic quantity, a source of input messages a ∈ A⁺ = X, a transceiver,
- Y is an output stochastic quantity, a source of output messages b ∈ B⁺ = Y, a receiver,
- output messages b ∈ Y are stochastically dependent on input messages a ∈ X and are received by the receiver of messages Y,
- ε is the maximal probability of an error in the transfer of any symbol x ∈ A in an input message a ∈ X [the maximal probability of erroneously receiving y ∈ B (inappropriate for x) in an output message b ∈ Y],
- A denotes a finite alphabet of elements x of the source of input messages,
- B denotes a finite alphabet of elements y of the source of output messages,
- p_X(·) is the probability distribution of occurrence of any symbol x ∈ A in an input message,
- p_Y(·) is the probability distribution of occurrence of any symbol y ∈ B in an output message.

The structure (X, K, Y) is termed a transfer (Shannon) chain. The symbols H(X) and H(Y) respectively denote the input information (Shannon) entropy and the output information (Shannon) entropy of the channel K, discrete for now. The symbol H(X|Y) denotes the loss entropy and the symbol H(Y|X) denotes the noise entropy of the channel K. These entropies are defined as follows, where the symbol p_·|·(·|·) denotes the conditional and the symbol p_·,·(·,·) the simultaneous (joint) probabilities.
From (2) and (3), together with the definitions of p_·|·(·|·) and p_·,·(·,·), it is provable that the transinformation is symmetric. Then the equation of entropy (information) conservation is valid. The information capacity of the channel K (both discrete and continuous) is defined by maximization over all possible probability distributions q(·), p(·). It is the maximum (supremum) of the mean value of the usable amount of information about the input message x contained within the output message y.
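The symmetry of the transinformation and the conservation law can be checked numerically on a toy discrete channel. The input distribution and error probability below are hypothetical values chosen only for illustration.

```python
import numpy as np

# Joint distribution p(x, y) for a toy binary channel: input X with
# p_X = (0.6, 0.4) and symbol error probability eps = 0.1 (hypothetical values).
p_X = np.array([0.6, 0.4])
eps = 0.1
p_Y_given_X = np.array([[1 - eps, eps], [eps, 1 - eps]])
p_XY = p_X[:, None] * p_Y_given_X          # simultaneous probabilities p(x, y)
p_Y = p_XY.sum(axis=0)

def H(p):
    """Shannon entropy in nats, ignoring zero-probability terms."""
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

H_X, H_Y, H_XY = H(p_X), H(p_Y), H(p_XY.ravel())
H_X_given_Y = H_XY - H_Y                   # loss entropy H(X|Y)
H_Y_given_X = H_XY - H_X                   # noise entropy H(Y|X)

# The transinformation is symmetric ...
T_XY = H_X - H_X_given_Y
T_YX = H_Y - H_Y_given_X
assert abs(T_XY - T_YX) < 1e-12
# ... and the conservation law H(X) + H(Y|X) = H(Y) + H(X|Y) holds.
assert abs((H_X + H_Y_given_X) - (H_Y + H_X_given_Y)) < 1e-12
```

Maximizing T_XY over the input distribution p_X then yields the channel capacity.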

Representation of physical transfer channels
The simplest way of describing stationary physical systems is a Euclidean space Ψ of their states expressed as linear operators. 2 This approach enables the mathematical formulation of the term (physical) state and, generally, the term (physical) quantity. Physical quantities α, associated with a physical system Ψ represented by the Euclidean space Ψ, are expressed by symmetric operators from the linear space L(Ψ) of operators on Ψ, α ∈ A ⊂ L(Ψ) [7]. The supposition is that any physical quantity can achieve only those real values α' which are the eigenvalues of the associated symmetric operator α (the symmetric matrix [α_{i,j}]_{n,n}, n = dim Ψ). They are elements of the spectrum S(α) of the operator α. The eigenvalues α' ∈ S(α) ⊂ R of the quantity α measured on the system Ψ ≅ Ψ depend on the (inner) states θ of this system Ψ.
• The pure states of the system Ψ ≅ Ψ are represented by eigenvectors ψ ∈ Ψ. It is valid that the scalar product (ψ, ψ) = 1; in quantum physics they are called normalized wave functions.
• The mixed states are nonnegative quantities θ ∈ A; the trace of a square matrix (operator) α is defined as Tr(α) = Σ_{i=1}^n α_{ii}. The symmetric (orthogonal) projector π{ψ} = π[Ψ({ψ})] on the one-dimensional subspace Ψ({ψ}) of the space Ψ is a nonnegative quantity for which Tr(π{ψ}) = 1 is valid. The projector π{ψ} represents [on the set of quantities A ⊂ L(Ψ)] the pure state ψ of the system Ψ. Thus, an arbitrary state of the system Ψ can be defined as a nonnegative quantity θ ∈ Θ ⊂ A for which Tr(θ) = 1 is valid. For a pure state θ it then holds that θ² = θ, and the state space Θ of the system Ψ is defined as the set of all states θ of the system Ψ.
• The special case of the p-distribution is that for measuring the values of α ≡ θ, θ ∈ Θ,

p(θ'|θ|θ) = Tr(θπ_{θ'}) = θ'·dim Ψ(θ'), θ' ∈ S(θ). (16)

It is the case of measuring the θ itself. Thus, by equation (14), the term measuring means just the spectral decomposition of the unit operator 1 within the spectral relation with α [4, 20]. The act of measuring the quantity α ∈ A in the (system) state θ gives the value α' from the spectrum S(α) with the probability p(α'|α|θ) = Tr(θπ_{α'}) given by the p-distribution. The measured quantity α in the state θ is a stochastic quantity with its values [occurring with probabilities p(·|α|θ)] from its spectrum S(α), and its mathematical expectation (mean value) is defined accordingly. Nevertheless, in the pure state θ = π{ψ_i} the value α' = α_i ∈ S(α) is measured with certainty. Let Ψ be an arbitrary stationary physical system and θ ∈ Θ ⊂ A its arbitrary state. The physical entropy H(θ) of the system Ψ in the state θ is defined by the equality H(θ) = −Tr(θ ln θ). When {π_{θ'} : θ' ∈ S(θ)} is the decomposition of 1 spectrally equivalent with θ, the entropy is expressible through the eigenvalues of θ. 3.1.0.3. Theorem: For a physical system Ψ in any state θ ∈ Θ, the Shannon entropy of the p-distribution, the physical entropy and the information divergence are bound together, where H(·) is the Shannon entropy, I(·‖·) is the information divergence, p(·|θ|θ) is the p-distribution for the state θ, q(·|θ) is the q-distribution for the state θ, d(·|θ) is the d-distribution for the state θ, and n = dim Ψ.
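The state-operator formalism above can be sketched numerically: a density operator with unit trace, the Born-rule p-distribution Tr(θπ_α), the von Neumann entropy −Tr(θ ln θ), and the projector property of a pure state. The 3-dimensional system and the eigenvalue distribution are hypothetical illustration values.

```python
import numpy as np

# A 3-dimensional toy system: mixed state theta, nonnegative with Tr(theta) = 1.
q = np.array([0.5, 0.3, 0.2])        # eigenvalue (q-)distribution, hypothetical
psi = np.linalg.qr(np.random.default_rng(0).normal(size=(3, 3)))[0]  # orthonormal eigenvectors
theta = psi @ np.diag(q) @ psi.T
assert abs(np.trace(theta) - 1) < 1e-12

# Born rule p(a|alpha|theta) = Tr(theta pi_a) for an observable whose
# eigenprojectors pi_a are the standard-basis projectors (hypothetical observable).
probs = [float(np.trace(theta @ np.outer(e, e))) for e in np.eye(3)]
assert abs(sum(probs) - 1) < 1e-12   # the p-distribution is a probability distribution

# Physical (von Neumann) entropy H(theta) = -Tr(theta ln theta)
# equals the Shannon entropy of the eigenvalue distribution q.
H_theta = float(-(q * np.log(q)).sum())
eigvals = np.linalg.eigvalsh(theta)
assert abs(H_theta + (eigvals * np.log(eigvals)).sum()) < 1e-9

# A pure state is an orthogonal projector: theta^2 = theta.
theta_pure = np.outer(psi[:, 0], psi[:, 0])
assert np.allclose(theta_pure @ theta_pure, theta_pure)
```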

Narrow-band quantum transfer channels
Let the symmetric operator ε of the energy of a quantum particle be considered, the spectrum of whose eigenvalues ε_i is S(ε). Equidistant energy levels are now supposed: in a pure state θ_i of the measured (observed) system Ψ, the eigenvalue is ε_i = i·ε, ε > 0. Further, the output quantity α of the observed system Ψ is considered. Such a situation arises when a particle with energy ε_i is excited by an impact from the outer environment. The jump of the energy level of the impacted particle is from ε_i up to ε_{i+j}, i + j = k. The output ε_{i+j} for the excited particle is measured (it is the value on the output of the channel K ≅ Ψ). This transition j occurs with the probability distribution

Pr(j), j ∈ {0, 1, 2, ...}. (26)

Let the narrow-band systems (with one constant level of particle energy) Ψ of the B-E or F-D type [27] be considered (denoted further by Ψ_{B-E,ε}, Ψ_{F-D,ε}).
• In the B-E (bosonic) system, e.g. the photonic gas, the B-E distribution is valid,

Pr(j) = (1 − p)·p^j, j = 0, 1, 2, ...

• In the F-D (fermionic) system, e.g. the electron gas, the F-D distribution is valid,

Pr(1) = p/(1 + p), Pr(0) = 1/(1 + p),

where the parameter p = e^{−ε/kΘ} varies with the absolute temperature Θ > 0; k is the Boltzmann constant.
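A numerical sketch of the two occupation statistics, assuming (as is standard) the geometric form Pr(j) = (1 − p)p^j for the B-E case and the two-valued form Pr(1) = p/(1 + p) for the F-D case; the value of p below is hypothetical.

```python
import math

# Hypothetical parameter: p = exp(-eps / (k * Theta)); here eps = k * Theta.
p = math.exp(-1.0)

# B-E (bosonic) case: geometric distribution Pr(j) = (1 - p) p^j, j = 0, 1, 2, ...
be = [(1 - p) * p**j for j in range(200)]
mean_be = sum(j * pr for j, pr in enumerate(be))
assert abs(sum(be) - 1) < 1e-12
assert abs(mean_be - p / (1 - p)) < 1e-9   # Planck mean occupation 1/(e^{eps/kT} - 1)

# F-D (fermionic) case: at most one particle per phase-space cell,
# Pr(1) = p / (1 + p), Pr(0) = 1 / (1 + p).
fd = [1 / (1 + p), p / (1 + p)]
mean_fd = fd[1]
assert abs(mean_fd - 1 / (math.exp(1.0) + 1)) < 1e-12  # Fermi-Dirac mean occupation
```

The mean occupations p/(1 − p) and p/(1 + p) are exactly the Planck and Fermi-Dirac mean occupation numbers at the given temperature.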
A collision with a bundle of j particles, each with constant energy ε, absorbing the energy j·ε of the bundle, is also conceivable. E.g., through Ψ (e.g. Ψ_{B-E,ε} is the photonic gas) monochromatic impulses with amplitudes i ∈ S are transferred, generated, nevertheless, from an environment of the same type but at the temperature T_W, T_W > T_0, where T_0 is the temperature of the transfer system Ψ ≅ K (the noise temperature).
It is supposed that both the pure states θ_i at the place where the input message is coded (on the input of the channel Ψ ≅ K) and the measurable values of the quantity α observed at the place where the output message is decoded (on the output of the channel Ψ ≅ K) can be arranged in such a way that, in a given i-th pure state θ_i of the system Ψ ≅ K, only the values α_k ∈ S(α), k = i + j, are measurable, and that the probability of measuring the k-th value is Pr(j) = Pr(k − i). This probability distribution describes the additive noise in the given channel Ψ ≅ K. It is just this noise which creates the observed values from the output spectrum S(α) = {α_i, α_{i+1}, ...}, the selecting space of the stochastic quantity α.
The pure states with energy level ε_i = i·ε are achievable by sending i particles, each with energy ε. When the environment through which these particles pass generates a bundle of j particles with probability Pr(j), then with the same probability the energy ε_{i+j} = k·ε is decoded on the output.
An infinite number of states θ_i and an infinite spectrum S(α) of the measured quantity are also supposed. The narrow-band, memory-less (quantum) channel, additive (with additive noise), operating on the energy level ε ∈ S(ε), is defined by the tri-partite structure (1).

Capacity of Bose-Einstein narrow-band channel
Let now θ_i ≡ i, i = 0, 1, ..., be the pure states of a system Ψ_{B-E,ε} ≅ K and let α be the output quantity taking on, in the state θ_i, the values α' ∈ S with probabilities p(α'|α|θ_i) = Pr(α' − i), α' ≥ i. Thus the distribution p(·|α|θ_i) is determined by the forced (inner-input) state θ_i = π{ψ_i} representing the coded input energy at the value ε_i = i·ε ∈ S(ε), ε = const. The mean value W of the input energy is W = ε·Σ_i i·q(i|θ); the quantity W is the mean value of the energy coding the input signal i.
For the mean value of the number of particles j = α − i ≥ 0 with the B-E statistics, E(j) = p/(1 − p) is valid. The quantity E(α) is the mean value of the output quantity α, where p(α'|α|θ) is the probability of measuring the eigenvalue α' = k of the output variable α.
This probability is defined by the state θ. From the differential equation with the condition for α' = 0 it follows, for the mean value E(α) of the output stochastic variable α, that E(α) = W/ε + p/(1 − p). The quantity H(α|θ_i) is the p-entropy of measuring α for the input i ∈ S represented by the pure state θ_i of the system. The quantity H(α|θ) is the conditional Shannon entropy of the stochastic quantity α in the state θ of the system Ψ_{B-E,ε}, not depending on θ (the noise entropy). For the capacity C_{B-E,ε} of the channel K ≅ Ψ_{B-E,ε}, following the capacity definition, the supremum is taken over the set Θ_0 = {θ ∈ Θ, E(θ) = W ≥ 0}, which represents the coding procedure of the input i ∈ S. The quantity H(α|θ) = H(p(·|α|θ)) is the p-entropy of the output quantity α. Its supremum is determined by the method of Lagrange multipliers: the conditions determining the bound extreme are the normalization Σ_{α'∈S(α)} p(α') = 1 and the prescribed mean value E(α); the Lagrange function gives the condition for the extreme. Then for the mean value E(α) the following result is obtained: by (35) and (36), the parameter p(W) depends only on ε/kT_W, or on the absolute temperature T_W respectively.
Thus the mean value E(α) is obtained. From (35) and (46) it is visible that p(W), or T_W respectively, is the only root of the stated equation. From (34) and (45) it follows that the state θ ∈ Θ_0, or the q-distribution q(·|θ) respectively, in which the value H(α|θ) is maximal, is that in which α achieves the distribution q(·|θ) = p(·). For the effective temperature T_W of coding the input messages, the distribution (45) supremizes (maximizes) the p-entropy H(α|θ) of α, and by using (37) with p(W) its maximal value is gained. From (39) and (49) the capacity follows [12, 37]. By (35), (36) and (46), the mean value W of the input message i ∈ S is derived. By (31), the condition for the minimal average energy W_Krit needed for coding the input message is obtained. The relations (35), (36), (46) and (52), (53) yield the explicit capacity formula. From (47) and (54) follows what is valid for the defined direction of the signal (message) transmission at the temperature T_W of its sending and decoding.
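The derivation above can be sketched numerically: the output p-entropy, at a fixed mean output energy, is maximized by a geometric distribution, and the capacity is that maximal p-entropy minus the geometric noise entropy. The function below is a sketch under these assumptions, in units of ε, not the paper's closed-form result.

```python
import math

def H_geom(p):
    """Entropy (nats) of the geometric distribution Pr(j) = (1 - p) p^j."""
    return -math.log(1 - p) - (p / (1 - p)) * math.log(p)

def narrowband_BE_capacity(W_over_eps, eps_over_kT0):
    """Sketch of C_{B-E,eps}: maximal output p-entropy at fixed mean output
    energy minus the noise entropy; the maximizer is again geometric."""
    p0 = math.exp(-eps_over_kT0)              # noise parameter at temperature T0
    mean_out = W_over_eps + p0 / (1 - p0)     # E(alpha) = W/eps + noise mean
    pW = mean_out / (1 + mean_out)            # geometric parameter with that mean
    return H_geom(pW) - H_geom(p0)            # maximal p-entropy minus noise entropy

# With zero input energy the capacity vanishes; it grows with W (hypothetical units).
assert abs(narrowband_BE_capacity(0.0, 1.0)) < 1e-12
assert narrowband_BE_capacity(1.0, 1.0) > narrowband_BE_capacity(0.5, 1.0) > 0
```

The parameter pW corresponds to the effective coding temperature T_W through pW = exp(−ε/kT_W).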

Capacity of Fermi-Dirac narrow-band channel
Let there now be considered, in the same way as in the B-E system, the pure states θ_i ≡ i of the system Ψ which code the input messages i = 0, 1, ..., and the output stochastic quantity α having its selecting space S. On the spectrum S, probabilities of the realizations α' ∈ S are defined, expressing the additive stochastic transformation of an input i into the output α', for which α' = i or α' = i + 1 is valid. 5 The uniform energy level ε = const. of particles is considered.
The quantity W is the mathematical expectation of the energy coding the input signal. The mean value of a stochastic quantity with the F-D statistics is E(j) = p/(1 + p). The quantity E(α) is the mean value of the output quantity α, where p(α'|α|θ) is the probability of realization of α' ∈ S in the state θ of the system Ψ ≅ Ψ_{F-D,ε}. From the differential equation with the stated boundary condition it follows that, for the mean value E(α) of the output stochastic variable α, E(α) = W/ε + p/(1 + p) is valid. The quantity H(α|θ_i) is the p-entropy of measuring α for the input i ∈ S represented by the pure state θ_i of the system Ψ_{F-D,ε}. The quantity H(α|θ) is the conditional (noise) Shannon entropy of the stochastic quantity α in the system state θ, but independent of this θ. For the capacity C_{F-D,ε} of the channel K ≅ Ψ_{F-D,ε}, by the capacity definition in (23)-(24), the supremum is taken over the set Θ_0 = {θ ∈ Θ : E(θ) = W > 0}, which represents the coding procedure.
The quantity H(α|θ) = H(p(·|α|θ)) is the p-entropy of the stochastic quantity α in the state θ of the system Ψ_{F-D,ε}. Its supremum is determined by the method of Lagrange multipliers in the same way as in the B-E case, and with the same results for the probability distribution p(·|α|θ) (geometric) and the mean value E(α), depending only on ε/kT_W, or on the absolute temperature T_W respectively. By (63) it is seen that p(W), or T_W respectively, is the only root of the stated equation [12, 30]. For the q-distribution q(·|θ) = p(·) of the states θ ∈ Θ_0, for which the relations (62) and (67) are gained, it follows that, for the effective temperature T_W of coding the input messages, the distribution (67) supremizes (maximizes) the p-entropy H(α|θ) of α, in the same way as in (49). By using (70) in (66), the formula for the capacity C_{F-D,ε} is gained [12, 37]. The mean value W of the input i = 0, 1, 2, ... is limited by a minimal, nonzero and positive 'bottom' value W_Krit, which follows from (58), (63) and (68); it bounds the average coding energy W when the channel K_{F-D,ε} acts on a uniform energy level ε. For the F-D channel it is then possible to speak about the effect of nonzero capacity even when the difference between the coding temperature T_W and the noise temperature T_0 is zero. 6 This phenomenon is, by necessity, given by the properties of the cells of the F-D phase space.
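As a simplified numerical sketch (not the paper's full derivation), a single F-D cell can be treated as a binary occupation channel: the noise and coding occupations follow the Fermi-Dirac mean p/(1 + p), and the entropy difference of the two occupation distributions is nonnegative when the coding temperature exceeds the noise temperature. The temperatures below are hypothetical.

```python
import math

def H_bin(q):
    """Binary entropy (nats) of occupation probability q."""
    return -q * math.log(q) - (1 - q) * math.log(1 - q)

def fd_occupation(eps_over_kT):
    """Mean F-D occupation p/(1+p) = 1/(exp(eps/kT) + 1)."""
    return 1.0 / (math.exp(eps_over_kT) + 1.0)

# Noise occupation at T0 and coding occupation at an effective T_W > T0
# (hypothetical temperatures, energies in units of eps).
q0 = fd_occupation(2.0)              # eps/(k T0) = 2
qW = fd_occupation(1.0)              # eps/(k T_W) = 1, i.e. T_W = 2 T0
C_fd_cell = H_bin(qW) - H_bin(q0)    # output p-entropy minus noise entropy
assert C_fd_cell > 0                 # qW is closer to 1/2, so its entropy is larger
assert q0 < qW < 0.5                 # occupations stay below half filling
```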

Wide-band quantum transfer channels
Till now, the narrow-band variant of an information transfer channel K_ε, ε ∈ S(ε), card S(ε) = 1, has been dealt with. Let there now be considered the symmetric operator of energy ε of a particle, having the spectrum of eigenvalues ε_i = i·h/τ, where τ > 0 denotes the time length of the input signal and h denotes the Planck constant. The multi-band physical transfer channel K, memory-less, with additive noise, is defined by the (arranged) set of narrow-band, independent components K_ε, ε ∈ S(ε). Due to the independence of the narrow-band components K_ε, the vector quantities i_ε, α_ε, θ_ε, j_ε are independent stochastic quantities too.
The simultaneous q-distribution of the input vector of values i_ε and the simultaneous p-distribution of measuring the output vector of values α_ε (of the individual narrow-band components K_ε) are products of the component distributions. The system of quantities θ_ε (the set of states of the narrow-band components K_ε) is the state θ of the multi-band channel K, in which the (canonic) q-distribution of the system K is defined.
The quantities in (79) are the numbers of the input, output and additive (noise) particles of the multi-band channel K. In this channel the stochastic transformation of the input i into the output α is performed, determined by the additive stochastic transformations of the inputs i_ε into the outputs α_ε in the individual narrow-band components K_ε.
Realizations of the stochastic systems i, α, θ, j are the vectors (sequences) i, α, θ, j. For the probability of the additive stochastic transformation (77), (78) of the input i into the output α, the product over the components is valid. The symbol θ_{ε,i_ε} denotes the pure state coding the input i_ε ∈ S of a narrow-band component K_ε. For the multi-band channel K the following quantities are defined: 7
• the p-entropy of the output α, for which, following the output narrow-band B-E and F-D components K_ε ∈ K, additivity over ε ∈ S(ε) is valid,
• the conditional noise entropy (the entropy of the multi-band B-E | F-D noise),
• the transinformation T(α; θ) and the information capacity C(K), where the supremum is taken over states θ representing a coding procedure of the input i of K into θ_i [by transforming each input i_ε into the pure state θ_{ε,i_ε}, ∀ε ∈ S(ε)].

7 Using the chain rule for simultaneous probabilities it is found that the information entropy of an independent stochastic system is the sum of the component entropies. Thus the physical entropy H(θ) of the independent stochastic system θ = {θ_ε}_ε is the sum of H_ε[q(·|θ_ε)] over ε ∈ S(ε) too.

Transfer channels with continuous energy spectrum
Let a spectrum of energies with finite cardinality n + 1 and a finite time interval τ > 0 be considered. For a transfer channel with a continuous spectrum of particle energies and with the bandwidth equal to ε_n/h, the corresponding limit relations are valid. But here the infinitely wide band and the infinite number of particles (τ → ∞, n → ∞) will be dealt with; the wide-band spectrum S(ε) of energies is then the whole interval (0, ∞). With the denotation α'_ε = α', i = i_ε, j = j_ε, i_ε, j_ε ∈ S, α'_ε ∈ S, ε ∈ S(ε), the p-entropy of the output α of the wide-band transfer channel K_{B-E|F-D} is expressed as an integral over ε ∈ S(ε). By (52) and (74), the average number of particles on the input of a narrow-band component K_ε is determined, and then the whole average number of input particles, and the whole input energy W, of the wide-band transfer channel K are obtained by integration over S(ε), where W is the whole input energy and T_W is the effective coding temperature, supposed to be at the common value T_{W_ε}, T_W = T_{W_ε}, ∀ε ∈ S(ε).
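The wide-band energy formulas used below rest on two standard integrals, which can be checked numerically: ∫₀^∞ x/(eˣ − 1) dx = π²/6 (B-E/Planck) and ∫₀^∞ x/(eˣ + 1) dx = π²/12 (F-D), giving the average energies P(Θ) = π²k²Θ²/6h and π²k²Θ²/12h respectively.

```python
import math

def integrate(f, a, b, n=200000):
    """Simple midpoint rule on [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Numerical check of the integrals behind the wide-band energy formulas.
I_be = integrate(lambda x: x / math.expm1(x), 1e-9, 60.0)       # B-E integrand
I_fd = integrate(lambda x: x / (math.exp(x) + 1.0), 1e-9, 60.0)  # F-D integrand
assert abs(I_be - math.pi**2 / 6) < 1e-4
assert abs(I_fd - math.pi**2 / 12) < 1e-4
```

Substituting x = ε/kΘ and multiplying by (kΘ)²/h turns these dimensionless integrals into the stated average energies.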

Bose-Einstein wide-band channel capacity
By the derivations (86) and (92), (93), the capacity C(K_{B-E}) is given as the difference of two integrals [12]. For the first, or for the second integral respectively, the values π²kT_W/3h and π²kT_0/3h are obviously valid. Then, for the capacity of the wide-band B-E transfer channel K_{B-E},

C(K_{B-E}) = (π²k/3h)·(T_W − T_0), (98)

and for the whole average output energy, π²k²T_W²/6h is valid (99). For the whole average energy of the B-E noise, π²k²T_0²/6h must be valid. From the relations (79) among the energies of the output α, of the noise j and of the input i, the effective coding temperature T_W is derivable, T_W = T_0·√(1 + 6hW/(π²k²T_0²)). Using it in (98) gives

C(K_{B-E}) = (π²kT_0/3h)·[√(1 + 6hW/(π²k²T_0²)) − 1].

For T_0 → 0, the quantum approximation of C(K_{B-E}) is gained, independent of the heat noise energy (which diminishes as the temperature tends to absolute 0 K). The classical approximation of C(K_{B-E}) is gained for temperatures T_0 ≫ 0 (T_0 → ∞ respectively). It is near the value W/kT_0, the Shannon capacity of the wide-band Gaussian channel with the whole noise energy kT_0 and with the whole average input energy W. For T_0 from (101), great enough, this is gained. 8
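A numerical sketch of the wide-band B-E capacity and its two limiting regimes, assuming the form C = (π²k/3h)(T_W − T_0) with T_W = T_0√(1 + 6hW/(π²k²T_0²)); the input power value is hypothetical.

```python
import math

k = 1.380649e-23     # Boltzmann constant [J/K]
h = 6.62607015e-34   # Planck constant [J s]

def C_BE(W, T0):
    """Wide-band B-E capacity in nats per second; W is the average input power."""
    TW = T0 * math.sqrt(1.0 + 6.0 * h * W / (math.pi**2 * k**2 * T0**2))
    return math.pi**2 * k / (3.0 * h) * (TW - T0)

W = 1e-12            # hypothetical average input power [W]

# Quantum approximation for T0 -> 0: C ~ pi * sqrt(2 W / (3 h)),
# independent of the thermal noise.
assert abs(C_BE(W, 1e-6) / (math.pi * math.sqrt(2 * W / (3 * h))) - 1) < 1e-3

# Classical approximation for large T0: C ~ W / (k T0), the Shannon
# wide-band Gaussian capacity with whole noise energy k T0.
assert abs(C_BE(W, 1e6) / (W / (k * 1e6)) - 1) < 1e-3
```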

Fermi-Dirac wide-band channel capacity
By the derivations (86) and (92), (93), the capacity C(K_{F-D}) is given analogously [12]. By evaluating (105), the capacity of the wide-band F-D channel K_{F-D} is gained, and for T_W > T_0 it is writable as

C(K_{F-D}) = (π²k/3h)·(T_W − T_0/2).

For the whole average output energy, the same is valid as in the B-E case, π²k²T_W²/6h. For the whole average F-D wide-band noise energy, π²k²T_0²/12h is derived. From the relation (79) among the whole output, input and noise energies, the effective coding temperature follows, T_W = T_0·√(1/2 + 6hW/(π²k²T_0²)). Using it in (107), the result is [24]

C(K_{F-D}) = (π²kT_0/3h)·[√(1/2 + 6hW/(π²k²T_0²)) − 1/2].

For T_0 → 0, the quantum approximation of the capacity C(K_{F-D}), independent of the heat noise energy kT_0, is gained (the same as in the B-E case (103)). The classical approximation of the capacity C(K_{F-D}) is gained for T_0 ≫ 0. 9 By (74), the condition for the mean value of the input particles of a narrow-band component is stated, from which the condition for the whole input energy of the wide-band channel K_{F-D} follows; by (95) it is gained.
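A numerical sketch of the wide-band F-D capacity, assuming the reconstructed form C = (π²k/3h)(T_W − T_0/2) with T_W = T_0√(1/2 + 6hW/(π²k²T_0²)); the values of W and T_0 are hypothetical.

```python
import math

k = 1.380649e-23     # Boltzmann constant [J/K]
h = 6.62607015e-34   # Planck constant [J s]

def C_FD(W, T0):
    """Sketch of the wide-band F-D capacity in nats per second (assumed form)."""
    TW = T0 * math.sqrt(0.5 + 6.0 * h * W / (math.pi**2 * k**2 * T0**2))
    return math.pi**2 * k / (3.0 * h) * (TW - T0 / 2.0)

# The quantum approximation (T0 -> 0) coincides with the B-E case:
W, T0 = 1e-12, 1e-6
assert abs(C_FD(W, T0) / (math.pi * math.sqrt(2 * W / (3 * h))) - 1) < 1e-3

# The capacity stays positive down to T_W = T0/2 -- the F-D channel keeps a
# nonzero capacity even where the B-E capacity would vanish:
assert C_FD(0.0, 300.0) > 0
```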

Physical information transfer and thermodynamics
Whether the considered information transfers are narrow-band or wide-band, their algebraic-information description remains the same. So let an arbitrary stationary physical system Ψ of these two band types be considered as usable for information transfer.
From the relation θ → θ' it is also visible that it is a reflexive and transitive relation between states and thus defines a (partial) ordering on the state space Θ. The terminal, maximal state of this ordering is the equilibrium state θ⁺ of the system Ψ: it is the successor of an arbitrary system state, including itself.
The statistical (Shannon) information entropy H(·) is a generalization of the physical entropy H(θ). The I-divergence I(·‖·) is, by (21), a generalization of the physical quantity I(p‖d) = H(θ⁺) − H(θ), where θ⁺ is the equilibrium state of the system Ψ. The probability distribution over the canonic components θ_i of θ⁺ is uniform and thus H(θ⁺) = ln n = ln dim Ψ. Let for the states θ, θ' ∈ Θ of the system Ψ it be valid that θ → θ'. Then
6.0.0.6. Proof: (a) For the strictly convex function f(u) = u·ln u, the Jensen inequality is valid [23].

11 It is the matrix of the unitary operator u(t) expressing the time evolution of the system Ψ.
The H-theorem says that a reversible transition is not possible between any two different states θ ≠ θ'. From the inequality (119) it also follows that any state θ ∈ Θ of the system Ψ is the successor of itself, θ → θ, and that any reversal of the relation θ → θ' (the transition θ' → θ) is not possible within the system alone; it is not possible without opening this system Ψ. The difference represents the information-theoretic expression of the Brillouin (maximal) entropy defect ΔH (the Brillouin negentropic information principle [2, 30]). For the state θ⁺ it is valid that θ → θ⁺, ∀θ ∈ Θ. It is also called the terminal state or the attractor of the time evolution of the system Ψ. 12

6.0.0.7. Gibbs Theorem: For all θ, θ̃ ∈ Θ of the system Ψ, H(θ) ≤ −Tr(θ ln θ̃) is valid, and the equality arises only for θ = θ̃ [38].
The operator θ in the base {ψ₁, ψ₂, ..., ψₙ} is obtained as θ'_{ij} = Σ_{k=1}^n u_{ki}·u_{kj}·q(k|θ), and thus the operators ln θ' and Tr(θ ln θ') are expressed. For the information divergence of the distributions q(·|θ'), q(·|θ) and the entropy H(θ'), nonnegativity is valid; the equality holds for θ' = θ. The Gibbs theorem expresses, in the deductive (mathematical-logical) way, the phenomenon of the Gibbs paradox. 13 From the formulas (47), (55), (56) and (68), (72), (73) for the narrow-band B-E and F-D capacities, relation (127) follows, and it is seen that the quantity temperature is decisive for the studied information transfers. The last relation inevitably evokes the opinion that these transfers can be modeled by a direct reversible Carnot cycle with efficiency η_max ∈ (0, 1). Conditions leading to C_[·|·] < 0 would mean, in such a direct thermodynamic model, that its efficiency should be η_max < 0. This is a contradiction with the Equivalence Principle of Thermodynamics [19]; it expresses only that the transfer is running in the opposite direction (as for the temperatures).
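The two entropy relations above, the Brillouin defect I(p‖d) = H(θ⁺) − H(θ) with d uniform, and the Gibbs inequality, can be checked numerically on ordinary probability distributions; the 4-point distributions below are hypothetical.

```python
import math

# 1) Divergence from the uniform (equilibrium) distribution d on n points:
#    I(q || d) = ln n - H(q) -- the Brillouin entropy defect.
# 2) Gibbs inequality: -sum q ln q <= -sum q ln q', equality iff q = q'.
n = 4
q  = [0.40, 0.30, 0.20, 0.10]      # hypothetical state distribution
q2 = [0.25, 0.25, 0.25, 0.25]      # the uniform (equilibrium) distribution

H_q  = -sum(p * math.log(p) for p in q)
I_qd = sum(p * math.log(p * n) for p in q)       # divergence from uniform
assert abs(I_qd - (math.log(n) - H_q)) < 1e-12   # defect = ln n - H(q)
assert I_qd >= 0                                 # the defect is nonnegative

cross = -sum(p * math.log(r) for p, r in zip(q, q2))
assert H_q <= cross + 1e-12                      # Gibbs inequality
```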
As for the B-E channel: the supposition W < 0 would yield the inequalities T_W < T_0 and p(W) < p, which is a contradiction with (35), (36) and (47). It would be a situation in which the information is transferred in a different direction and under a different operation mode. Our insisting on the original organization of the transfer, for T_W > T_0, then leads to the contradiction mentioned above, saying only that we are mistaken about the actual direction of the information transfer. In the case T_W = T_0, the capacity C_{B-E} is zero. As for the F-D channel: the supposition W < 2p²/(1 − p²) yields T_W < T_0 and p(W) < p, which is a contradiction with (68). For T_W = T_0, the value of C_{F-D} from (71) is valid. 14 Let the relations between the wide-band B-E and F-D capacities and the model heat efficiency η_max yet be noticed. For the B-E capacity (98) it is gained that

C(K_{B-E}) = (π²kT_W/3h)·η_max, η_max = 1 − T_0/T_W.

It is the information capacity of such a direct Carnot cycle where H(X) = π²kT_W/3h. For the wide-band F-D capacity, the analogous relation follows from (105).

Again, the phenomenon of nonzero capacity is seen here when the difference between the coding temperature T_W and the noise temperature T_0 is zero. The capacities C(K_{F-D}) ≥ 0 are, surely, considerable for T_W ∈ ⟨T_0/2, T_0⟩, being given by the property of the cells of the F-D phase space. Capacities C(K_{B-E}) < 0 and C(K_{F-D}) < 0 are without sense for the given direction of information transfer.
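Assuming, as discussed above, C(K_{B-E}) = (π²k/3h)(T_W − T_0), the B-E capacity factorizes as the input entropy rate H(X) = π²kT_W/3h times the Carnot efficiency η_max = 1 − T_0/T_W. A quick numerical check with hypothetical temperatures:

```python
import math

k = 1.380649e-23     # Boltzmann constant [J/K]
h = 6.62607015e-34   # Planck constant [J s]

T0, TW = 300.0, 450.0                          # hypothetical noise / coding temperatures [K]
C = math.pi**2 * k / (3 * h) * (TW - T0)       # wide-band B-E capacity [nats/s]
H_X = math.pi**2 * k * TW / (3 * h)            # input entropy rate at T_W
eta_max = 1 - T0 / TW                          # Carnot efficiency of the model cycle

assert abs(C - H_X * eta_max) < 1e-6 * C       # C = H(X) * eta_max
assert 0 < eta_max < 1                         # direct-cycle efficiency range
```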

Nevertheless, it will be shown that these processes are not organized cyclically 'by themselves'.
Further, the relation between the information transfer in a wide-band B-E (photonic) channel organized in a cyclical way and a relevant (reverse) heat cycle will be dealt with. But first, the way in which the capacity formula for an information transfer system of photons was derived in [30] will be reviewed.

Thermodynamic derivation of wide-band photonic capacity
A transfer channel is now created by the electromagnetic radiation of a system L ≅ K_{L-L} of photons emitted from an absolute black body at temperature T_0 and within a frequency bandwidth Δν = R⁺, where ν is the frequency. The energy of this radiation is the energy of the noise. A source of input messages (signals) transmits monochromatic electromagnetic impulses (numbers a_i of photons) into this environment with an average input energy W. This source is defined by an alphabet of input messages (signals) {a_i}_{i=1}^n, with a probability distribution p_i = p(a_i), i = 1, 2, ..., n. 15 The output (whole, received) signal is created by the additive superposition of the input signal and the noise signal. The input signal a_i, at a frequency ν, is represented by the occupation number m = m(ν), the number of photons of the input field with the energy level ε(ν) = hν. The output signal is represented by the occupation number l = l(ν). The noise signal, created by the photons emitted by the absolute-black-body radiation at temperature T_0, is represented by the occupation number n = n(ν). The mean values of these quantities (the spectral densities of the input, noise and output photonic streams) are denoted m̄, n̄ and l̄. In accordance with the Planck radiation law, the spectral density r of the photonic stream of absolute-black-body radiation at temperature Θ and frequency ν is given by the Planck distribution,

r(ν) = 1/(e^{hν/kΘ} − 1).

Thus, for the average energy P of the radiation at temperature Θ within the bandwidth Δν = R⁺ it is gained that

P(Θ) = ∫₀^∞ ε(ν, Θ) dν = π²k²Θ²/6h,

where ε(ν, Θ) = r(ν)hν and dP(Θ)/dΘ = π²k²Θ/3h.
Then, for the average noise energy P₁ at temperature T_0, and for the average output energy P₂ at temperature T_W, both within the bandwidth Δν = R⁺, P₁ = π²k²T_0²/6h and P₂ = π²k²T_W²/6h are valid. The entropy H of the radiation at temperature ϑ is derived from the Clausius definition of heat entropy, and thus

H(Θ) = ∫₀^Θ (1/ϑ)·(dP/dϑ) dϑ = π²k²Θ/3h.

Thus, for the entropy H₁ of the noise signal and for the entropy H₂ of the output signal on the channel L, H₁ = π²k²T_0/3h and H₂ = π²k²T_W/3h. The information capacity C_{T_W,T_0} of the wide-band photonic transfer channel L is given by the maximal entropy defect [2, 30]. For P₂ = P₁ + W, where W is the average energy of the input signal, T_W = T_0·√(1 + 6hW/(π²k²T_0²)) is then valid. Then, in accordance with (102), (103), (104) [30], the capacity formula follows.
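The thermodynamic derivation above can be sketched numerically: the Clausius integral of dP/dϑ = π²k²ϑ/3h over ϑ gives the linear entropy H(Θ) = π²k²Θ/3h, and the entropy defect (H₂ − H₁)/k reproduces the capacity (π²k/3h)(T_W − T_0) in nats per second. Temperatures are hypothetical.

```python
import math

k = 1.380649e-23     # Boltzmann constant [J/K]
h = 6.62607015e-34   # Planck constant [J s]

def H_rad(Theta, n=100000):
    """Clausius entropy of the radiation: integral of (1/t) dP/dt over (0, Theta],
    with dP/dt = pi^2 k^2 t / 3h; the integrand reduces to a constant."""
    dt = Theta / n
    return sum((math.pi**2 * k**2 * t / (3 * h)) / t
               for t in (dt * (i + 0.5) for i in range(n))) * dt

T0, TW = 300.0, 500.0                       # hypothetical noise / coding temperatures [K]
H1, H2 = H_rad(T0), H_rad(TW)
assert abs(H1 - math.pi**2 * k**2 * T0 / (3 * h)) < 1e-6 * H1

# Capacity as the maximal entropy defect, converted to nats per second:
C = (H2 - H1) / k
assert abs(C - math.pi**2 * k * (TW - T0) / (3 * h)) < 1e-6 * C
```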

Reverse heat cycle and transfer channel
A reverse and reversible Carnot cycle O_rrev starts with the isothermal expansion at temperature T_0 (the diathermic contact [31] is made between the system L and the cooler B), when L is receiving the pumped-out, transferred heat ΔQ_0 from B. During the isothermal compression, when the temperature of both the system L and the heater A is at the same value T_W, T_W > T_0 > 0, the output heat ΔQ_W is delivered to A, where ΔA is the input mechanical energy (work) delivered into L during this isothermal compression. It follows from [2, 8, 28] that when an average amount of information ΔI is recorded, transmitted, computed, etc. at temperature Θ, there is a need for the average energy ΔW ≥ k·Θ·ΔI; in this case ΔW = ΔA. Thus O_rrev can be considered as a thermodynamic model of the information transfer process in the channel K ≅ L [14]. The following values are the changes of the information entropies defined on K, 16 where k is the Boltzmann constant. The information transfer in K ≅ L is without losses caused by friction or noise heat (ΔQ_{0x} = 0), and thus H(X|Y) = 0. By assuming, for the changes (140), that H(X|Y) = 0, the channel equations (4), (5) and (23) are valid, and the result follows. But another information arrangement (description) of a reverse Carnot cycle will be used further. In a general (reversible) discrete heat cycle O (with the temperatures of its heat reservoirs changing in a discrete way), considered as a model of the information transfer process in a transfer channel K ≅ L [17, 19], the stated relation is valid for the elementary changes H(Θ_k)·η_{max,k} of the information entropies of L, 17 where n ≥ 2 is the maximal number of its elementary Carnot cycles O_k. 18 The change of the heat of the system L at temperature Θ_k is ΔQ(Θ_k).
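The energy bookkeeping of one run of the reverse, reversible Carnot cycle can be sketched numerically: reversibility forces ΔQ_0/T_0 = ΔQ_W/T_W, the input work closes the balance, and with ΔW = ΔA the transferred information is bounded by ΔA/(kT_W). The heat and temperature values are hypothetical, with k set to 1.

```python
# Hypothetical values, Boltzmann constant k set to 1 for readability.
T0, TW = 300.0, 450.0     # cooler (noise) and heater (coding) temperatures
dQ0 = 2.0                 # heat pumped out of the cooler B into L

# Reversibility: dQ0 / T0 = dQW / TW, and the input work closes the balance.
dQW = dQ0 * TW / T0       # heat delivered to the heater A
dA = dQW - dQ0            # input mechanical work delivered into L

# With dW = dA and dW >= k * TW * dI, the transferred information is at most
dI = dA / TW              # (k = 1 units)
assert abs(dI - dQ0 * (1 / T0 - 1 / TW)) < 1e-12
assert dI > 0             # a positive temperature gap allows a positive transfer
```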
In a general (reversible) continuous cycle O [with temperatures changing continuously, n → ∞ in the previous discrete scheme; at Θ, the heat change will be ΔQ(Θ)], considered as an information transfer process in a transfer channel K ≅ L, the analogous integral relation is valid. For the whole cycle O_rrev, T_W > T_0 > 0, let H(X|Y) = 0; then the transinformation follows. Further, it is obvious that T(X; Y) is the capacity C_{T_W,T_0} of the channel K ≅ L too.

17 In reality, for the least elementary heat change, δQ = ħν holds, where ħ = h/2π and h is the Planck constant. 18 It is provable that the Carnot cycle itself is elementary, not divisible [18].

Triangular heat cycle
An elementary change dΘ of the temperature Θ of the environment of the general continuous cycle O, and thus of its working medium L (both are in diathermic contact at Θ), causes the elementary reversible change of the heat Q*(Θ) delivered (radiated) into L, just by the value δQ*(Θ). The heat Q*(Θ_W) is the whole heat delivered (reversibly) into L (at the end temperature Θ_W). For the infinitesimal heat δQ*(Θ) delivered (reversibly) into L at temperature Θ, in accordance with the Clausius definition of heat entropy S*_L [22], it is valid that For the whole change of entropy ΔS*_L(Θ_W), or for the entropy S*_L(Θ_W) respectively, delivered into the medium L by its heating within the temperature interval (0, Θ_W⟩, it is valid that Then, by the mean value theorem,^19 it is valid that For the extremal values T_0 and T_W of the cooler temperature Θ of O, by (151), With S*_L(0) = 0, for the (end) temperatures Θ, T_0, T_W of L and the relevant heats and their entropies, it is valid For the change ΔS*_L of the thermodynamic entropy S*_L of the system L, at the temperature Θ running through the interval ⟨T_0, T_W⟩, by gaining heat from its environment (the environment of the cycle O), it is valid By the result of the derivation (155)-(156), for an arbitrary temperature Θ of the medium L it is valid that^20 Obviously, from (154), for Θ ∈ ⟨T_0, T_W⟩ it is derivable that Let such a reverse cycle be given that its medium L takes, through the elementary isothermal expansions at temperatures Θ ∈ ⟨T_0, T_W⟩, the whole heat ΔQ_0 or, with mean values,^20 consuming the work ΔA and giving, at its higher temperature t_W (the average temperature of the heater of our general cycle), the same heat ΔQ_W; it is valid that Then For the elementary work δA(·, ·) corresponding to the heat Q*(Θ) pumped out (reversibly) from L at the (end, output) temperature Θ of L and for the entropy S*_L(Θ) of the whole For the whole work ΔA(Θ_W; T_0) consumed by the general reverse cycle O between the temperatures T_0
and Θ_W, covered by the elementary cycles (162), it is valid that But then, from the results for ΔA in (160), (161) and (164), it follows that and its efficiency is For the works δA(Θ, dΘ; T_0) of the elementary Carnot cycles covering the cycle O_rrev (166), the range of their working temperatures is dΘ and, for the given heater temperature Θ ∈ ⟨T_0, Θ_W⟩, by (162)-(163), it is valid that For the whole heats ΔQ_0 and ΔQ_W exchanged mutually between the working medium L of the whole triangular cycle O_rrev and its environment (166), and for the work ΔA in its equivalent Carnot cycle O_rrev with the working temperatures (T_0 + T_W)/2 and T_W, it will be valid that^21
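The heat and entropy relations underlying the triangular cycle can be checked numerically. The sketch below is our illustration, not part of the source; with λ = π²k²/(6ℏ) and l = 2λ as laid down in footnote 21, the heat Q*(Θ) = λΘ² is recovered by integrating δQ*(Θ) = lΘ dΘ, and the Clausius entropy comes out as S*_L(Θ) = lΘ, so that Q*(Θ) = S*_L(Θ) · Θ/2.

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant [J*s]
k = 1.380649e-23         # Boltzmann constant [J/K]
lam = math.pi**2 * k**2 / (6 * hbar)   # lambda from footnote 21
l = 2 * lam                            # l = pi^2 k^2 / (3 hbar)

def Q_star(Theta):
    """Whole heat delivered reversibly into L heated from 0 to Theta: Q*(Theta) = lam*Theta^2."""
    return lam * Theta**2

def S_star(Theta):
    """Clausius entropy S*_L(Theta) = integral of dQ*/Theta = integral of l dTheta = l*Theta."""
    return l * Theta

def Q_star_numeric(Theta, n=1000):
    """Integrate dQ* = l*Theta dTheta by the midpoint rule (exact for a linear integrand)."""
    d = Theta / n
    return sum(l * (i + 0.5) * d * d for i in range(n))

TW = 300.0
print(Q_star(TW), Q_star_numeric(TW))   # the closed form and the integral agree
print(Q_star(TW), S_star(TW) * TW / 2)  # Q*(Theta) = S*_L(Theta) * Theta / 2
```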

Capacity corrections for the wide-band photonic transfer channel
The average output energy P_2(Θ_W) of the message received within the interval (0, T_W⟩ of the temperature Θ of the medium L ≅ K_{L−L} from [30], when 0 < Θ_0 ≤ Θ ≤ Θ_W, Θ_0 ≤ T_0 and Θ_W ≤ T_W are valid, is given by the sum of the average input energy W(Θ_W, Θ_0) and the average energy P_1(Θ_0) of the additive noise. The output message bears the whole average output information H_2(Θ_W). By the mean value theorem it is possible, for a certain maximal temperature Θ_W ≤ T_W of the temperature Θ ∈ (0, Θ_W⟩, to consider that the receiving of the output message is performed at the average Then for the whole change of the output information ^21 Further it will be laid down that λ = π²k²/(6ℏ) and l = π²k²/(3ℏ) = 2 · λ.
By (153) it is valid that Q*(Θ) = λΘ² and δQ*(Θ) = lΘ dΘ. Thus for Θ ∈ (0, T_0⟩ and Θ ∈ (0, T_W⟩ it is valid With Θ_W = T_W and Θ_0 = T_0 and with the reducing temperature T_W/2 it is possible to write By the channel equation (4), (5), by the equations (23)-(24), by the definitions (174)-(175) and with the loss entropy H(X|Y) = 0, it must be valid for the transinformation T(·; ·) that T(X; Y) = H(X) − H(X|Y) = H(X) = T(X, Y), and, using l = π²k²/(3ℏ) and β = T_0/T_W, For the given extremal temperatures T_0, T_W the value T(X; Y) stated this way is the only one, and thus it is also the information capacity C_{T_0,T_W}(W) of the channel K_{L−L} (the first correction). The information capacity correction (177) of the wide-band photonic channel K_{L−L} [30], stated this way, is (1 + β)-times higher than the formulas (102) and (138). The first abscissa represents the phase of noise generation and the second one the phase of input signal generation; the whole composed abscissa represents the phase of the whole output signal generation.
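The ratios between the one-act capacities and the cyclical one can be sketched numerically. The formulas (102), (138) and (177) themselves are not reproduced in this excerpt; the sketch below is our illustration only, assuming the closed form C = W/(kT_W) stated later in (190), the stated factor 2 between (190) and (177), and the stated factor (1 + β) between (177) and (138).

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant [J*s]
k = 1.380649e-23         # Boltzmann constant [J/K]
lam = math.pi**2 * k**2 / (6 * hbar)   # lambda from footnote 21

T0, TW = 3.0, 12.0           # extremal temperatures [K], illustrative
beta = T0 / TW               # beta = T0/TW as introduced with (177)
W = lam * (TW**2 - T0**2)    # input energy heating L from T0 to TW

C_cyclic = W / (k * TW)      # cyclical capacity, closed form of (190)
C_177 = 2 * C_cyclic         # assumption: (190) is 2-times less than (177)
C_138 = C_177 / (1 + beta)   # assumption: (177) is (1+beta)-times higher than (138)

print(C_177 / C_cyclic)      # factor 2 between the one-act and cyclical capacity
print(C_177 / C_138)         # factor 1 + beta = 1.25 here
```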
For a sustained, in the sense repeatable, cyclical information transfer, the renewal of the initial (starting) state of the transfer channel K_{L−L} ≅ L is needed after each individual information transfer act (the sending of the input message and the receiving of the output message) has been accomplished.
Nevertheless, in the derivations of the formulas (102), (177) and (138) this return of the physical medium L into its starting state after each individual information transfer act is either not considered at all or, on the contrary, it is considered, but in such a way that the whole transfer chain is opened in order to cover the energy needs of this return transition from outer resources other than those within the transfer chain itself. In both these cases the channel equation is fulfilled. This enables any individual act of information transfer to be realized by an external, forced, repeated start of each such individual transfer act.^22,23 If only the resources of the transfer chain are used for the creation of a cycle, the need for another correction, this time of (177), arises. To express it, the full cyclical thermodynamic analogy K of K_{L−L} used cyclically will be employed. The information transfer will be modeled by the cyclical thermodynamic process O_rrev of reversible changes in the channel K ≅ K_{L−L} ≅ L, without opening the transfer chain. (Also K ≅ K_{B−E}.)

Return of the transfer medium into its initial state, second correction
Now the further correction of the capacity formulas (102), (138) and (177) will be dealt with, for the case that the return of the medium L into its initial, starting state is performed within the transfer chain only. It will be envisaged by a triangular reverse heat cycle O_rrev created by the oriented abscissas within the apexes in the S−T diagram (166). The abstract experiment from [30] will now be realized, formally and as an analogy, by this reverse and reversible heat cycle O_rrev ≡ O_rrev, described informationally and thought of as modeling the information transfer process in a channel K ≅ K_{L−L|B−E} ≅ L. Thus the denotation K ≡ K is usable. By (153)-(157) it will be Q*(Θ) = π²k²Θ²/(6ℏ) = (l/2) · Θ², Q*_W = (l/2) · T_W², Q*_0 = (l/2) · T_0² (179). The working temperatures Θ_0 of cooling and Θ_W of heating change by (157). ^22 For both these cases it is not possible to construct relevant heat cycles described in a proper informational way. ^23 But modeling by a direct cycle such as in (128) is possible, for the II. Principle of Thermodynamics is valid in any case, giving the possibility of the cycle description.

It is visible that the quantity H(Y) [= H(X) + H(Y|X)] is introduced correctly, since by (185) it is valid that
For the transinformation and the information capacity of the transfer organized this way, (177) is valid. With the extremal temperatures T_0 and T_W the information capacity C*_{T_0,T_W}(W) is given by T(X; Y) = C*_{T_0,T_W}(W), and then C_max = lim From the difference ΔQ_0 = W = Q*_W − Q*_0 (in L) it follows that the temperature T_W = T_0 · √(1 + 6ℏ · W/(π²k²T_0²)). Then, for the transinformation, in the same way as in (187), it is now valid The transinformation T(X; Y) is the capacity C(K) and it is possible to write

T(X; Y) = C(K) = C*_{T_0,T_W}(W) = [π²kT_0/(6ℏT_W)] · [√(1 + 6ℏ · W/(π²k²T_0²)) − 1] · (T_0 + T_W)   (190)
= C*_{T_W}(W) = W/(kT_W) = C*_{T_0}(W) = W/[kT_0 · √(1 + 6ℏ · W/(π²k²T_0²))] = C(K_{L−L|B−E}),

which value is 2-times less than (177) and 2/(1 + β)-times less than (138).
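That the two expressions in (190) coincide follows from W = (l/2)(T_W² − T_0²) and T_W = T_0 · √(1 + 6ℏW/(π²k²T_0²)); this can also be checked numerically. The sketch below is our illustration, with T_0 and W chosen arbitrarily.

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant [J*s]
k = 1.380649e-23         # Boltzmann constant [J/K]

T0 = 3.0      # noise (cooler) temperature [K], illustrative
W = 1.0e-10   # average input energy [J], illustrative

x = 6 * hbar * W / (math.pi**2 * k**2 * T0**2)
TW = T0 * math.sqrt(1 + x)   # temperature reached by L, as stated before (190)

# first form of (190)
C_long = (math.pi**2 * k * T0 / (6 * hbar * TW)) * (math.sqrt(1 + x) - 1) * (T0 + TW)
# closed form of (190): C = W / (k * TW)
C_short = W / (k * TW)

print(C_long, C_short)   # the two forms of (190) coincide
```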
For T_0 → 0 the quantum approximation C(W) of the capacity C*_{T_0,T_W}(W) is obtained, independent of the noise energy (the noise power diminishes near the absolute 0 K). The classical approximation C_{T_0}(W) of C*_{T_0,T_W}(W) is gained for T_0 ≫ 0. This value is near the Shannon capacity of the wide-band Gaussian channel with the noise energy kT_0 and with the whole average input energy W; in the same way as in (104) it is now gained The mutual difference of the results (189) and (102), (138) [12,30] is given by the necessity of returning the transfer medium, the channel K ≅ K_{L−L|B−E} ≅ L, into its initial state after each individual information transfer act has been accomplished, and by the relevant temperature reduction of the heat ΔQ_0 [by T_W in (183)-(189)]. Thus, our thermodynamic cyclical model K ≅ O_rrev for the repeatable information transfer through the channel K_{L−L|B−E} has the information capacity (189), while in [12,30] the information capacity of the one-act information transfer is stated.^25 By (189) the whole energy cost of the cyclical information transfer considered can be computed.^26
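Both limiting regimes can be checked against the closed form C*_{T_0,T_W}(W) = W/(kT_W) of (190). The limiting expressions used below — the quantum approximation C(W) = π · √(W/(6ℏ)) for T_0 → 0 and the classical approximation W/(kT_0) for T_0 ≫ 0 — are our readings derived from that closed form, since the approximation formulas themselves are elided in this excerpt; the chosen numbers are illustrative.

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant [J*s]
k = 1.380649e-23         # Boltzmann constant [J/K]

def capacity(W, T0):
    """C*_{T0,TW}(W) = W/(k*TW), TW = T0*sqrt(1 + 6*hbar*W/(pi^2 k^2 T0^2)), from (190)."""
    TW = T0 * math.sqrt(1 + 6 * hbar * W / (math.pi**2 * k**2 * T0**2))
    return W / (k * TW)

W = 1.0e-15   # average input energy [J], illustrative

# quantum approximation, T0 -> 0: the capacity becomes independent of the noise
C_quantum = math.pi * math.sqrt(W / (6 * hbar))
# classical approximation, T0 >> 0: Shannon-like W/(k*T0) with noise energy k*T0
T0_hot = 1.0e6
C_classical = W / (k * T0_hot)

print(capacity(W, 1.0e-6) / C_quantum)    # -> close to 1 for small T0
print(capacity(W, T0_hot) / C_classical)  # -> close to 1 for large T0
```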

Conclusion
After each completed 'transmission of an input message and receipt of an output message' ('one-act' transfer), the transferring system must be reverted to its starting state; otherwise a constant (in the sense repeatable) flow of information could not exist. The author believes that in the original derivation in [30] either the opening of the chain was presupposed, or the return of the transferring system to its starting state was not considered at all (it was not counted in). In our derivations this needed state transition is considered to be powered within the transfer chain itself, without opening it. Although our derivation of the information capacity for the cyclical case (using the cyclic thermodynamic model) results in a lower value than the original one, it seems to be more exact and its result more precise from the theoretical point of view, extending and not negating the previous, original result [12,30], which retains its technological value. It also forces us to be aware of, and to respect, the global costs of (any) communication and its evaluation and, as such, it is of a gnoseologic character.