Information Thermodynamics of Cell Signal Transduction

Intracellular signal transduction is among the most important research topics in cell biology, and model studies based on network theory have long been in progress in systems biology. This article reviews cell signaling from the viewpoint of information thermodynamics and describes a method for quantitatively describing signaling. In particular, a theoretical basis for evaluating the efficiency of intracellular signal transduction is presented, in which information transmission through the cascade is maximized by using entropy coding and the fluctuation theorem. An important conclusion is obtained: the average entropy production rate is constant through the signal cascade.


Introduction
The analysis of intracellular signal transduction is one of the most important research topics in cell molecular biology. Determining the mechanisms by which intracellular information is communicated in the steady state, how cells respond to changes in the external environment, and how such changes are converted into the expression of genetic information is a significant problem. The quantitative analysis presented here may enable a comparison of signal transduction pathways and an evaluation of their efficiency, and should help realize quantitative reproducibility of data in cell molecular biology and precise theoretical construction.
The gene expression cascade has been extensively studied from the network perspective [1]. Correlation analysis of the expression pattern of a given gene is expected to give useful information for clinical diagnosis [2,3]. Along with this evolution, protein-protein network theory has developed greatly through graph theory and topological analysis [4,5]. Teschendorff et al. applied signaling entropy, defined by correlation and transition probabilities between the proteins of interest, to omics data analysis [5]. Chemokine and immunological networks are also an important theme of network research [6]. Meanwhile, few studies have discussed signal transduction in terms of the specific reaction kinetics and thermodynamics of individual reactions; examples are largely limited to chemotaxis models of Escherichia coli [7], and several theoretical studies of the mitogen-activated protein kinase (MAPK) cascade concerning its bistability, ultrasensitivity, and feedback controllability have been reported [8][9][10][11][12][13]. In addition, the information thermodynamics of the MAPK cascade has recently been reported [14][15][16]. This article reviews these recent studies from the standpoint of information thermodynamics in relation to the fluctuation theorem.

Signaling cascade model
Intracellular signal transduction is carried out by a chain network of intracellular biochemical reactions. The network is operated by protein-protein interactions [4,[17][18][19][20][21][22][23]. The cell signal cascade considered here has an interesting chain-reaction mechanism: the molecule that is the substrate of the biochemical reaction at one step becomes the enzyme at the next step, so that it acts as a signal molecule at each step. This can be interpreted as signal conversion rather than mere chemical change, and it can be modeled with chemical reaction equations. With ATP, ADP, and Pi representing adenosine triphosphate, adenosine diphosphate, and inorganic phosphate, respectively, the j th signaling step of the cascade may be described as

X_{j-1}* + X_j + ATP → X_{j-1}* + X_j* + ADP,  X_j* → X_j + Pi.  (1)

Among signal pathways, the best known is the MAPK cascade. As a ligand, the epidermal growth factor (EGF) stimulates a single cell via the EGF receptor (EGFR), triggering sequential phosphorylation of c-Raf, MAP kinase-extracellular signal-regulated kinase kinase (MEK), and extracellular signal-regulated kinase (ERK), as shown in Figure 1. This cascade transmits the signal from the cell membrane to the nucleus (Figure 1).
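As a numerical illustration of such a cascade, the chain of phosphorylation reactions can be sketched as coupled rate equations and integrated directly. The rate constants, stimulus level, and three-step depth below are illustrative assumptions, not parameters from the text:

```python
# Minimal sketch of a three-step phosphorylation cascade (e.g., Raf -> MEK -> ERK).
# Rate constants, totals, and stimulus are illustrative assumptions.

def simulate_cascade(steps=3, k_act=1.0, k_deact=0.2, stimulus=0.5,
                     dt=0.01, t_end=50.0):
    """Euler integration of d(Xj*)/dt = k_act * X_{j-1}* * Xj - k_deact * Xj*,
    with conservation Xj + Xj* = 1 at every step; the stimulus plays the
    role of the activated receptor driving step 1."""
    active = [0.0] * steps                      # fraction of phosphorylated Xj*
    for _ in range(int(t_end / dt)):
        upstream = stimulus                     # activated receptor drives step 1
        new = []
        for j in range(steps):
            inactive = 1.0 - active[j]
            dxdt = k_act * upstream * inactive - k_deact * active[j]
            new.append(active[j] + dt * dxdt)
            upstream = active[j]                # Xj* is the kinase for step j+1
        active = new
    return active

levels = simulate_cascade()                     # steady-state active fractions
```

With these parameters the active fraction settles at stimulus/(stimulus + k_deact) for step 1 and rises slightly at each downstream step, reflecting the amplifying chain structure.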

Encoding of signal events
It is possible to apply information theory by considering information-source coding of the signal molecules. X_j and X_j* represent the signal molecules, where the symbol * indicates the activated state; mutual conversion is possible between the two. In the actual reaction, the change from the active state back to the inactive state takes a sufficiently longer time than the change from the inactive state to the active state. As shown in Eq. (1), a signal series is established in which the activated signal molecule of step j-1 activates the molecule of step j. Following the order in which the concentration fluctuation of each signal molecule becomes significantly larger than its fluctuation at the steady state, a series such as X_1 X_2 X_1* X_3 … or X_2 X_3 X_2* X_1 … is obtained.
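The ordering rule just described can be sketched in code: record the time at which each molecule's deviation from its steady-state level first exceeds a threshold, and read off the symbols in that order. The traces and threshold are fabricated for illustration:

```python
# Hedged sketch: encode a signal event as a symbol sequence by the order in
# which each molecule's fluctuation first exceeds a threshold above its
# steady-state fluctuation. Traces and threshold are illustrative only.

def encode_event(traces, threshold=2.0):
    """traces: dict name -> list of |deviation from steady state| per time step.
    Returns the symbols in the order each molecule first crosses the threshold."""
    first_crossing = {}
    for name, series in traces.items():
        for t, dev in enumerate(series):
            if dev > threshold:
                first_crossing[name] = t
                break
    return [name for name, _ in sorted(first_crossing.items(),
                                       key=lambda kv: kv[1])]

traces = {
    "X1*": [0.1, 2.5, 3.0, 2.0],   # crosses the threshold first
    "X2*": [0.0, 0.5, 2.2, 3.1],   # crosses second
    "X3*": [0.2, 0.3, 0.8, 2.4],   # crosses last
}
code = encode_event(traces)
```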
If the probability that a signal molecule appears in one signal event is proportional to its concentration, then

p_j = X_j / X,  (3)
p_j* = X_j* / X,  (4)

where τ denotes the duration of the overall signal event and the total number of signal events occurring during that time is taken as the total number of signal molecules X. The total number of signal-event sequences Ψ in a given reaction event can then be described as follows:

Ψ = X! / Π_j (X_j! X_j*!).  (6)

The entropy of the signal event can be defined logarithmically. The logarithm of Ψ is approximated according to Stirling's formula [10]:

log Ψ ≈ -X Σ_j (p_j log p_j + p_j* log p_j*).

Here, we used Eqs. (1) and (2). The right-hand side has the well-known form of mixed entropy: each step of the signal pathway is considered a mixed state of two kinds of signal molecules.
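The Stirling approximation step can be checked numerically: for moderately large molecule numbers, the exact log-multinomial count log Ψ is already close to the mixed-entropy form. The molecule counts below are hypothetical:

```python
import math

# Numerical check with hypothetical molecule numbers: the exact
# log Psi = log[ X! / prod_j(Xj! Xj*!) ] approaches the mixed-entropy form
# -X * sum_j (pj log pj + pj* log pj*) as X grows (Stirling's approximation).

counts = {"X1": 300, "X1*": 200, "X2": 350, "X2*": 150}   # illustrative
X = sum(counts.values())

# Exact log of the multinomial coefficient via log-gamma.
log_psi = math.lgamma(X + 1) - sum(math.lgamma(n + 1) for n in counts.values())

# Mixed-entropy (Stirling) approximation.
mixed_entropy = -X * sum((n / X) * math.log(n / X) for n in counts.values())

rel_err = abs(log_psi - mixed_entropy) / mixed_entropy   # small for large X
```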

Definition of code length
Here, the code length for the time series formed by the cellular signaling molecules is defined according to the theory of information-source coding (Figure 2). τ_j^+ is the duration of the phase in which the phosphorylated molecule is increasing, and τ_j^- is the duration of the phase in which the non-phosphorylated molecule is increasing (the decline phase of the phosphorylated molecule). A positive sign is assigned to τ_j^+ and a negative sign to τ_j^-, in consideration of the direction of signal transduction. For example, even if a signal is transmitted in the positive direction, if the same amount of signal is transmitted in the opposite direction, the net signal becomes zero; to evaluate such a signal amount, the direction must be considered, and positive and negative signs are therefore assigned to the durations. The total code length, i.e., the total length of the given signal event, is defined as the signed sum of these durations over the steps. Then, Eqs. (3) and (4) can be used to rewrite it in terms of the appearance probabilities. Here, the average entropy production rate (AEPR) is defined during the phosphorylation, or activation, of the signaling molecule using an arbitrary parameter s_j representing the progress of the reaction.
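A minimal sketch of this signed bookkeeping, with hypothetical rise and decline durations (the function names and numbers are illustrative, not from the text):

```python
# Hedged sketch: a signed "code length" for one signaling step, assigning a
# positive sign to the rise phase tau_plus and a negative sign to the decline
# phase tau_minus, so that forward and backward transmission can cancel.
# Durations below are illustrative numbers, not measured data.

def code_length(tau_plus, tau_minus):
    """Net signed duration of one step."""
    return tau_plus - tau_minus

def total_code_length(phases):
    """phases: list of (tau_plus, tau_minus) pairs, one per cascade step."""
    return sum(code_length(tp, tm) for tp, tm in phases)

phases = [(5.0, 2.0), (4.0, 1.5), (6.0, 3.0)]    # minutes, hypothetical
L_total = total_code_length(phases)
```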

Entropy coding
In order to maximize the number of signal events in a given duration, the relationship between the appearance probability (4) and the code length (9) should be calculated. The method of Lagrange undetermined multipliers is adopted for this. With the constraint conditions given by (5) and (9), the function G can be defined as follows [16]. Taking the partial derivatives of G with respect to the occurrence probability and the total number of signal molecules, respectively, yields the entropy-coding distribution.

Figure 2. A common time course of the j th step in the cascade. The vertical axis represents the ratio of the active signaling molecule concentration, X_j*, to that in the steady state, X_j*^st. The horizontal axis denotes the duration (min or time units). τ_j^+ and τ_j^- denote the durations of the two phases of the j th step. The line X_j* = X_j*^st denotes the X_j* concentration at the initial steady state before the signal event.
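The outcome of the Lagrange argument can be illustrated numerically: among all distributions with a fixed average code length, entropy is maximized by a Gibbs-type distribution p_j ∝ exp(-β l_j). With hypothetical code lengths (1, 2, 3) and mean length fixed at 2, the optimum is the uniform distribution (β = 0):

```python
import math

# Numerical illustration with hypothetical code lengths: scan all distributions
# on three symbols with sum(p) = 1 and mean code length 2 (one free parameter),
# and confirm that entropy peaks at the Gibbs-form optimum, here uniform.

lengths = (1.0, 2.0, 3.0)
best_p3, best_H = None, -1.0
for i in range(1, 500):
    p3 = i / 1000.0                      # scan the one free parameter
    p1, p2 = p3, 1.0 - 2.0 * p3          # enforces sum = 1 and mean length = 2
    H = -sum(p * math.log(p) for p in (p1, p2, p3) if p > 0)
    if H > best_H:
        best_p3, best_H = p3, H
```

The scan finds the maximum at p3 ≈ 1/3 with entropy ≈ log 3, the uniform distribution predicted by the multiplier calculation.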

Application of binary code theory
In practice, the signal transduction system can be classified according to two types of signaling molecules: the activated type, which is phosphorylated at each step of the reaction chain, and the inactive type, which is non-phosphorylated.
In terms of this change, the objective is to evaluate the information transmitted between successive steps of the cascade. An increase in the active form induces a chemical potential arising from the mixed-entropy change of each step in the signaling cascade and thereby allows biological signaling. The j th step component is extracted from Eq. (3). Now consider the entropy flow between the steps. For example, when the cell system is stimulated by the external environment, or the state of a receptor at the boundary fluctuates (e.g., is activated) because of a change in the external environment, the signal cascades down to step j (i.e., is transmitted). Because the signal has not yet been transmitted to step j + 1, the concentration fluctuation of X_j or X_j* differs between steps j and j + 1, and an entropy flow can occur.
When the signal event starts and the signal is transmitted to step j, the fluctuation at step j gives

s_j = -k_B X [(p_j + dp_j) log(p_j + dp_j) + (p_j* + dp_j*) log(p_j* + dp_j*)].

The signal has not yet reached step j + 1; hence, the entropy of the (j + 1) th molecule remains

s_{j+1} = -k_B X [p_{j+1} log p_{j+1} + p_{j+1}* log p_{j+1}*].

We can then calculate the entropy current between the two steps. Here the logarithm of the ratio of the inactive to the active signal molecule appears, a form that recurs throughout this theory. Assuming that there is no new generation of signal molecules,

dp_j + dp_j* = 0.

With this condition, we define the entropy current per signal molecule, c_j:

c_j = k_B log (p_j / p_j*) = k_B log (X_j / X_j*).
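A minimal sketch of the resulting per-molecule entropy current, with illustrative molecule numbers (the function name is a convenience, not from the text):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

# Hedged sketch: the per-molecule entropy current derived above reduces to the
# logarithm of the ratio of inactive to active molecule numbers,
# c_j = k_B * log(X_j / X_j*). The molecule numbers below are illustrative.

def entropy_current_per_molecule(x_inactive, x_active):
    return k_B * math.log(x_inactive / x_active)

c = entropy_current_per_molecule(800.0, 200.0)   # more inactive than active
```

With more inactive than active molecules the current is positive, i.e., activating one more molecule increases the mixed entropy of the step.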

Fluctuation and signal transduction
Even in the steady state, signal events represented by this code sequence are occurring. When there is a minor change in the extracellular environment, the amount of binding complex between the receptor on the cell membrane surface and the stimulant ligand increases. This fluctuation increases the phosphorylated form of the signal molecule next to the complex and, through a chain reaction, increases the fluctuation of the active-type signal molecules downstream. Relative to the signaling in the steady state, the increase in fluctuation indicates a signal response. In this manner, cell signaling can be distinguished as being in the steady state or as a fluctuation response to a change in the external environment.

Adaptation of fluctuation theorem to analysis of signal transduction
We define the transitional probability p(j + 1|j), the probability of step j + 1 given step j, and p(j|j + 1), the transitional probability of step j given step j + 1, during τ_j. Dividing the logarithm of the ratio p(j + 1|j) / p(j|j + 1) by τ_j^+ - τ_j^- and taking the limit, the AEPR from the j th to the (j + 1) th step satisfies the steady-state fluctuation theorem (FT) [24], where s_j is an arbitrary parameter representing the progression of a reaction event. This fluctuation theorem leads to various nonequilibrium relations among the cumulants of the current.
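The steady-state fluctuation theorem can be verified exactly for the simplest toy model, a biased jump process: a trajectory with net displacement d produces entropy σ = d ln(p/q) in units of k_B, and the binomial trajectory probabilities satisfy P(σ) / P(-σ) = e^σ identically. The bias p and trajectory length n below are arbitrary choices for the demonstration:

```python
import math

# Exact check of the detailed fluctuation theorem for a biased jump process:
# forward probability p, backward probability q = 1 - p. A trajectory with
# k forward jumps out of n has net displacement d = 2k - n and produces
# entropy sigma = d * ln(p/q); then P(sigma) / P(-sigma) = exp(sigma).

p, n = 0.7, 10
q = 1.0 - p

def prob_net(k):
    """Probability of exactly k forward jumps out of n (binomial law)."""
    return math.comb(n, k) * p**k * q**(n - k)

ratios_ok = True
for k in range(n + 1):
    d = 2 * k - n                        # net displacement
    sigma = d * math.log(p / q)          # entropy production (units of k_B)
    lhs = prob_net(k) / prob_net(n - k)  # P(sigma) / P(-sigma)
    if abs(lhs - math.exp(sigma)) > 1e-9 * math.exp(abs(sigma)):
        ratios_ok = False
```

The ratio holds for every k because the binomial coefficients cancel, leaving (p/q)^(2k-n) = e^σ, the same structure as the steady FT quoted in the text.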
Using the signal current density, we have the corresponding relation [16,24]. Substituting the right-hand side of Eq. (23) into the right-hand side of Eq. (27) gives an important result [16]: by further substituting Eqs. (15) and (16), the entropy coding expressed by Eqs. (30) and (31) is obtained. In conclusion, the AEPR is conserved through the whole cascade of signal transduction [16].

Conclusion
Here, each signaling step has been handled as an actual biochemical reaction based on kinetics and thermodynamics, and signal transduction has been interpreted in terms of coding theory and the fluctuation theorem. Regarding the relationship between information and entropy in thermodynamic mechanisms [25][26][27], information thermodynamics has seen remarkable development in recent years, and information theory and thermodynamics are easier to understand when they are integrated. In particular, theoretical results based on the analysis of the Szilard engine model [28] have made it possible to compute the mutual information [29,30] and the amount of work that can be extracted from a system through free-energy changes [15]. Thus, information thermodynamics may provide a theoretical basis for signal transduction.