## Abstract

In this chapter we present a detailed analysis of the different regimes of certain chaotic systems and their correspondence with changes in the normalized Shannon entropy, the statistical complexity, and the Fisher information measure. We construct a two-dimensional plane from a pair of the informational tools mentioned above (a causal plane is defined), in which the different dynamical regimes appear very clearly and give more information about the underlying process. In this way, a plane composed of the normalized Shannon entropy and the statistical complexity, or of the normalized Shannon entropy and the Fisher information measure, can be applied to follow changes in the behavior of nonlinear systems.

### Keywords

- chaotic dynamics
- statistical complexity
- information theory quantifiers
- Shannon entropy
- Fisher information measure
- Bandt-Pompe probability distribution function

## 1. Introduction

In the space of a few decades, chaos theory has jumped from the scientific literature into the popular realm, being regarded as a new way of looking at complex systems like brains or ecosystems. It is believed that the theory manages to capture the disorganized order that pervades our world. Chaos theory is a facet of the complex-systems paradigm having to do with deterministic randomness. As many others before us, we wish to approach it from the information theory viewpoint.

In 1959 Kolmogorov pointed out that the probabilistic theory of information developed by Shannon could be applied to symbolic encodings of the phase-space descriptions of physical nonlinear dynamical systems, and that, with a more or less direct line of reasoning, a process could be characterized in terms of *its Kolmogorov-Sinai entropy* [1, 2]. This has been a cornerstone of the modern theory of dynamical systems, complemented by Pesin’s theorem in 1977 [3]. With this theorem, Pesin proved that for certain deterministic nonlinear dynamical systems exhibiting chaotic behavior, an estimation of the *Kolmogorov-Sinai entropy* can be computed as the sum of the positive Lyapunov exponents of the process.

As is well known, chaotic systems are sensitive to initial conditions, which means instability everywhere in phase space and leads to nonperiodic motion (chaotic time series) [4]. One of the main characteristics of this kind of system is long-term unpredictability despite the deterministic character of the temporal trajectory. In a system undergoing chaotic motion, two close neighboring points in phase space show, after a short elapsed time, an exponential divergence of their respective trajectories. In this sense, a chaotic system can be regarded as an *information source*.

As has been shown in the literature for many simple nonlinear processes, the Lyapunov exponents may be computed very precisely with different algorithms. In such a way, a nonlinear dynamical system may be considered as an information source from which information-related quantifiers may help visualize relevant details of the chaotic process. The existence of simple “calibrated” sources such as the logistic map provides a means for a precise evaluation of the performance of these information quantifiers. In this communication we take advantage of this fact in order to show that planar representations constructed with two information theory-based quantifiers offer a possibility of easily visualizing many interesting details of chaotic characteristics, including the fine structure of chaotic attractors. We exemplify their use with two chaotic maps: the logistic map and the delayed logistic map.

## 2. Information theory quantifier prescription

Many systems generate, during their functioning, a sequence of measurable values constituting what is called in science a time series (TS). The analysis aims to extract as much information as possible from them in order to understand the changes characterizing the different dynamical regimes. One usually computes the experimental, or when the case permits the theoretical, probability distribution function (PDF) of the regimes exhibited by the TS, from here on denoted as $P$.

The mathematical tools applied once the PDF is available receive the name of informational tools or, more precisely, information theory quantifiers [5]. The main feature of these quantifiers is exactly to quantify the amount of information coming from the TS, originating in the dynamical system.

### 2.1. Shannon entropy, Fisher information measure, and statistical complexity

The concept of entropy has many interpretations arising from a wide diversity of scientific and technological fields. Among them, it is associated with disorder, with the volume of state space, and also with a lack of information. There are various definitions, according to the way this important magnitude is computed in studying the dynamics of systems; one of the most frequent, which could be considered foundational, is the so-called *Shannon entropy* [6], which can be interpreted as a measure of uncertainty. The *Shannon entropy* can be considered one of the most representative examples of information quantifiers.

Let a continuous PDF be denoted by $\rho(x)$, with $x \in \Omega$ and $\int_\Omega \rho(x)\,dx = 1$. Its associated *Shannon entropy* is

$$S[\rho] = -\int_\Omega \rho(x) \ln \rho(x)\, dx. \tag{1}$$

This concept provides a global measure of the information contained in the TS; it has a low degree of sensitivity to strong changes in the distribution originating from a small-sized region of the set $\Omega$.

For a time series described by the discrete PDF $P = \{p_i;\ i = 1, \dots, N\}$, with $\sum_{i=1}^{N} p_i = 1$ and $N$ the number of accessible states, the *Shannon entropy* (formally *Shannon’s logarithmic information*) [7] is defined by

$$S[P] = -\sum_{i=1}^{N} p_i \ln p_i. \tag{2}$$

Eq. (2) constitutes a function of the probabilities $p_i$ associated with the $N$ accessible states alone.

It is useful to define the so-called normalized Shannon entropy, denoted $H[P]$, whose expression is

$$H[P] = \frac{S[P]}{S_{\max}}, \tag{3}$$

(0 ≤ *H*[*P*] ≤ 1) with $S_{\max} = \ln N$, the entropy of the uniform distribution $P_e = \{p_i = 1/N;\ i = 1, \dots, N\}$.
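As a minimal illustration, the normalized Shannon entropy of a discrete PDF can be computed directly from its definition. The following Python sketch is our own (function names are hypothetical, not from the references):

```python
import math

def shannon_entropy(p):
    """Shannon entropy S[P] = -sum_i p_i ln p_i, with 0 ln 0 taken as 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def normalized_entropy(p):
    """Normalized Shannon entropy H[P] = S[P] / S_max, with S_max = ln N."""
    return shannon_entropy(p) / math.log(len(p))
```

A uniform PDF gives $H[P] = 1$ (maximal uncertainty), while a PDF concentrated on a single state gives $H[P] = 0$.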

In order to analyze the local aspects of the variations in the information content of a TS, the use of Fisher’s information measure (FIM) is well established. The FIM involves the gradient content of the PDF and, unlike the Shannon entropy, as can be seen from its definition given in expression (4), it reflects tiny localized perturbations. It reads [8, 9]

$$F[\rho] = \int_\Omega \frac{1}{\rho(x)} \left[\frac{d\rho(x)}{dx}\right]^2 dx, \tag{4}$$

where $\rho(x)$ is a continuous PDF on $\Omega$.

In this sense, the Fisher information is a local information quantifier. It has various interpretations; among others, it can be thought of as a measure of the ability to estimate a parameter. In other cases, it is applied to calculate the amount of information that can be extracted from a TS, and also as a measure of the state of disorder of a system or phenomenon [8]. The so-called Cramer-Rao bound can be considered the most important property in which the FIM participates [9]. The local sensitivity of FIM is useful in cases in which the analysis requires an appeal to a notion of *order*: when the PDF concentrates around certain points of the set $\Omega$, the FIM takes high values, reflecting a more ordered state.

The discretization of the signal carries a problem of loss of information, studied extensively by several authors; see, for example, [10, 11] and references therein. In particular, it entails the loss of Fisher’s shift invariance, which is not relevant in the present chapter. With the above considerations in mind, the discrete normalized FIM, which runs over the interval [0, 1], is given by [12]

$$F[P] = F_0 \sum_{i=1}^{N-1} \left(\sqrt{p_{i+1}} - \sqrt{p_i}\right)^2, \tag{5}$$

where the normalization constant $F_0$ reads

$$F_0 = \begin{cases} 1 & \text{if } p_{i^*} = 1 \text{ for } i^* = 1 \text{ or } i^* = N, \\ 1/2 & \text{otherwise.} \end{cases}$$

The local sensitivity of FIM for discrete PDFs is reflected by the fact that the specific *i*-ordering of the discrete values $p_i$ must be taken into account in its evaluation.

In a system with perfect order, described by a PDF concentrated at a single state ($p_{i^*} = 1$), the normalized FIM attains its maximum, $F[P] = 1$, while for maximal disorder (the uniform PDF $P_e$) one has $F[P] = 0$.
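The discrete normalized FIM can be sketched in Python as follows (our own illustration; the normalization constant follows the convention just described, so that $F = 1$ for a PDF concentrated at a single state and $F = 0$ for the uniform one):

```python
import math

def fisher_information(p):
    """Discrete normalized FIM: F[P] = F0 * sum_i (sqrt(p_{i+1}) - sqrt(p_i))^2.

    F0 = 1 when all probability sits at the first or last bin, 1/2 otherwise,
    so that 0 <= F[P] <= 1.
    """
    f0 = 1.0 if (p[0] == 1.0 or p[-1] == 1.0) else 0.5
    return f0 * sum((math.sqrt(p[i + 1]) - math.sqrt(p[i])) ** 2
                    for i in range(len(p) - 1))
```

For the uniform PDF all consecutive differences vanish and $F = 0$; for a delta-like PDF the sum of squared amplitude jumps, times $F_0$, equals 1.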

The third information quantifier applied in this chapter is the *statistical complexity measure* (SCM), which is a global informational quantifier. All the computations made in the present work were done with the definition introduced by López-Ruiz et al. in their seminal paper [14], with the improvements advanced by Lamberti et al. [15]. For a discrete probability distribution function (PDF) $P$, it reads

$$C[P] = Q_J[P, P_e] \cdot H[P], \tag{6}$$

where $H[P]$ is the normalized Shannon entropy defined above and $Q_J[P, P_e]$ is the disequilibrium, given in terms of the Jensen-Shannon divergence between $P$ and the uniform PDF $P_e$ [15]:

$$Q_J[P, P_e] = Q_0 \left\{ S\!\left[\tfrac{P + P_e}{2}\right] - \tfrac{1}{2} S[P] - \tfrac{1}{2} S[P_e] \right\}. \tag{7}$$

The normalization constant $Q_0$ is such that $0 \le Q_J \le 1$; it equals the inverse of the maximum possible value of the Jensen-Shannon divergence, attained when one component of $P$ equals one and the remaining ones vanish.

In this way, we have a quantity that vanishes in the two extreme cases of perfect order and maximal randomness.

A null $C[P]$ means that the signal possesses no structure. In between these two extreme instances, a large range of possible stages of physical structure may be realized by a dynamical system. These stages should be reflected in the features of the obtained PDF and quantified by a nonzero $C[P]$.

The global character of the SCM arises from the fact that its value does not change under different orderings of the PDF components; in this it contrasts with the local character of the FIM.
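A minimal Python sketch of the SCM (our own illustration, following the Jensen-Shannon disequilibrium of Lamberti et al. [15]; the constant `q0` is the inverse of the maximum Jensen-Shannon divergence, attained by a delta-like PDF against the uniform one):

```python
import math

def shannon(p):
    """Shannon entropy S[P] = -sum_i p_i ln p_i (0 ln 0 := 0)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def statistical_complexity(p):
    """Statistical complexity C[P] = Q_J[P, P_e] * H[P]."""
    n = len(p)
    pe = [1.0 / n] * n                        # uniform (equilibrium) PDF
    h = shannon(p) / math.log(n)              # normalized Shannon entropy
    jsd = shannon([(a + b) / 2.0 for a, b in zip(p, pe)]) \
        - 0.5 * shannon(p) - 0.5 * shannon(pe)
    # normalization: inverse of the maximal Jensen-Shannon divergence
    q0 = -2.0 / (((n + 1.0) / n) * math.log(n + 1.0)
                 - 2.0 * math.log(2.0 * n) + math.log(n))
    return q0 * jsd * h
```

Both extremes give zero complexity: the uniform PDF has zero disequilibrium, and a delta-like PDF has zero entropy; intermediate PDFs yield $C[P] > 0$.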

### 2.2. The Bandt and Pompe approach to building up a PDF

At the beginning of this section, it was mentioned that one of the first steps in the analysis of a TS is the computation of the associated PDF. Immediately a question emerges: what is the appropriate PDF that can be computed from the TS? Regrettably, the answer is not unique: the statistics literature offers no universal nonparametric algorithm for this task.

To shed light on this subject, Bandt and Pompe (BP) [17] introduced a simple and robust symbolic method that takes into account the time causality connected with the dynamics of the system. They proposed to use a symbol sequence constructed in a natural way from the TS. The PDF introduced by Bandt and Pompe (BP-PDF) thus makes no assumption about the model, in general unknown, underlying the dynamics. To compute the BP-PDF, the “partitions” are constructed by comparing the order of neighboring relative values in the TS rather than by apportioning amplitudes according to different levels, as in the usual amplitude-statistics methodology.

One remaining problem is the lack of information associated with temporal causality in the methodologies that compute amplitude histograms. To give an answer to this problem, Kowalski and co-workers [18], using the Cressie-Read family of divergence measures, showed in a quantitative assessment the advantages of the BP-PDF over any scheme based upon the construction of the corresponding amplitude histogram, and also that the BP-PDF brings additional insight into the dynamics of the physical problem.

Two parameters must be defined when computing the BP-PDF, namely, the embedding dimension $D$ and the embedding delay $\tau$. To clarify these crucial concepts, we give the following details. Let the TS be $\{x_t;\ t = 1, \dots, M\}$.

So the methodology proposed by Bandt and Pompe has as a starting point, for every time $s$, a $D$-dimensional vector built from the values of the series:

$$(s) \mapsto \left(x_{s-(D-1)\tau},\, x_{s-(D-2)\tau},\, \dots,\, x_{s-\tau},\, x_s\right). \tag{10}$$

Once the ordinal pattern of order $D$ is settled, the $D$ values of each vector are sorted in increasing order, and the permutation of $(0, 1, \dots, D-1)$ that realizes this sorting is retained as the ordinal pattern of the vector.

At this stage of the BP-PDF procedure, the vector defined by Eq. (10) is converted into a definite symbol $\pi$, namely the permutation of $(0, 1, \dots, D-1)$ just described.

Considering all the $M - (D-1)\tau$ vectors obtained from the TS, the Bandt-Pompe PDF is given by the relative frequencies of the $D!$ possible permutations $\pi$:

$$p(\pi) = \frac{\#\{s :\ s \text{ has ordinal pattern } \pi\}}{M - (D-1)\tau}. \tag{12}$$

In Eq. (12) the symbol $\#$ stands for “number of.”

Time series amplitude information is not considered, which is a clear disadvantage of the methodology proposed by BP, but this is compensated by the valuable information given by the intrinsic structure of the process under analysis. The scheme proposed by BP can be understood as a symbolic representation of the time series by recourse to a comparison of consecutive points of the series.

Among other properties, we can mention the following characteristics as reasons for the selection of the BP-PDF: (i) the reduced number of parameters needed contributes to its simplicity of implementation (only the embedding dimension $D$ and the embedding delay $\tau$ are required); (ii) its robustness in the presence of observational noise; and (iii) its invariance under monotonic transformations of the data [17].

Parameter *D*, required by the BP-PDF methodology, determines the number of accessible states, which is given by *D!*. Moreover, the minimum length of the TS must satisfy the condition *M* >> *D!* in order to achieve reliable statistics and a proper distinction between stochastic and deterministic dynamics [20]. The seminal work of BP [17] includes advice on the choice of these parameters: with time lag $\tau = 1$, the embedding dimension $D$ should be picked in the interval $3 \le D \le 7$.
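The construction above can be sketched in Python (a minimal implementation for illustration only; function and variable names are our own):

```python
from collections import Counter
from itertools import permutations

def bandt_pompe_pdf(ts, D=3, tau=1):
    """Bandt-Pompe PDF: the relative frequency of each of the D!
    ordinal patterns over all windows of D values spaced tau apart."""
    counts = Counter()
    n_windows = len(ts) - (D - 1) * tau
    for i in range(n_windows):
        window = ts[i:i + (D - 1) * tau + 1:tau]
        # the permutation listing the indices in increasing order of value
        pattern = tuple(sorted(range(D), key=window.__getitem__))
        counts[pattern] += 1
    return {pi: counts[pi] / n_windows for pi in permutations(range(D))}
```

For a monotonically increasing series, only the pattern (0, 1, 2) occurs; patterns with zero frequency are kept in the PDF so that all $D!$ accessible states are represented.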

### 2.3. Ordinal patterns for deterministic processes

It is a demonstrated fact, due to Amigó et al. [21, 22], that for deterministic one-dimensional maps, independently of the TS length, *not all possible ordinal patterns* of the BP methodology [17] can effectively be realized by orbits in phase space. This is a new kind of dynamical property: the existence of *forbidden ordinal patterns*. This property is linked neither to the proximity of patterns nor to correlations [21, 22]. Thus the informational quantifiers capture a new characteristic in the analysis of chaotic or deterministic TS.
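The existence of forbidden patterns is easy to verify numerically. The following Python sketch is our own illustration (not taken from [21, 22]): for the fully chaotic logistic map $x_{n+1} = 4x_n(1-x_n)$ with $D = 3$, a strictly decreasing triple never occurs, since $x_n > x_{n+1}$ forces $x_n > 3/4$, hence $x_{n+1} < 3/4$ and therefore $x_{n+2} > x_{n+1}$:

```python
from collections import Counter

def ordinal_pattern_counts(ts, D=3):
    """Count occurrences of each ordinal pattern of order D (tau = 1)."""
    counts = Counter()
    for i in range(len(ts) - D + 1):
        window = ts[i:i + D]
        counts[tuple(sorted(range(D), key=window.__getitem__))] += 1
    return counts

# a long orbit of the fully chaotic logistic map
x, orbit = 0.1, []
for _ in range(100_000):
    x = 4.0 * x * (1.0 - x)
    orbit.append(x)

counts = ordinal_pattern_counts(orbit, D=3)
# (2, 1, 0) encodes a strictly decreasing window: it is forbidden here,
# while the other five of the 3! = 6 patterns all occur
```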

### 2.4. Causal informational planes

To characterize a given dynamical system described by a TS, we are able to use two representation spaces: (a) one with global-global characteristics, called the causal entropy-complexity plane ($H \times C$), and (b) one with global-local characteristics, called the causal entropy-Fisher plane ($H \times F$).

The time causal nature of the Bandt and Pompe PDF gives a criterion to separate and differentiate chaotic and stochastic systems into different regions of both informational planes ($H \times C$ and $H \times F$).

## 3. Description of the chaotic maps

We focus our attention on two chaotic maps, namely, the logistic map and logistic map with delay.

### 3.1. The logistic map

One of the most used examples of deterministic chaotic systems is the logistic map. Its simplicity and easy computational implementation have made it one of the most useful tools for explaining chaotic behavior. It is the quadratic map

$$x_{n+1} = r\,x_n(1 - x_n),$$

where $x_n \in [0, 1]$ and $r$ is the control parameter, $0 \le r \le 4$. Its Lyapunov exponent can be evaluated along an orbit as

$$\lambda = \lim_{N \to \infty} \frac{1}{N}\sum_{n=1}^{N} \ln\left|r\,(1 - 2x_n)\right|,$$

where a positive $\lambda$ signals chaotic motion.

In the bifurcation diagram (Figure 1a), for fixed values of the control parameter $r$, the asymptotic values visited by the orbit are displayed; Figure 1b shows the corresponding Lyapunov exponent as a function of $r$.

In Figures 1a and 1b, we marked eight zones in order to analyze the behavior of the logistic map; they are labeled *Zone 1* to *Zone 8*.

Periodic windows “interrupt” chaotic behavior in noticeable fashion. At the beginning of a window, there is a sudden and dramatic change in the long-term behavior of the logistic map. Consider, for example, the behavior inside such a window: the map runs again through *period-doubling* to *chaos* (*band splitting*) and *chaos* (*band merging*), albeit at a much smaller scale. The same findings are encountered at all the other periodic windows, including miniature windows within the larger windows, as evidence of self-similarity.
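As an illustration of how such diagrams are obtained, the following Python sketch (our own, for illustration) iterates the map and estimates the Lyapunov exponent as the orbit average of $\ln|f'(x)| = \ln|r(1-2x)|$:

```python
import math

def logistic_lyapunov(r, n=10000, x0=0.1, burn=1000):
    """Estimate the Lyapunov exponent of x_{n+1} = r x (1 - x) as the
    orbit average of ln|f'(x)| = ln|r (1 - 2 x)|."""
    x = x0
    for _ in range(burn):                # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        acc += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return acc / n
```

At $r = 4$ the estimate approaches the exact value $\ln 2 \approx 0.693$, while in a periodic regime (e.g., the period-2 cycle at $r = 3.2$) the exponent is negative.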

### 3.2. The logistic map with delay

In 1948 Hutchinson [27] introduced a delay in the logistic equation to improve its applications in the study of population dynamics. The model proposed by Hutchinson has been applied in population dynamics [28], deterministic chaotic systems [29], the analysis of random discrete delay equations [30, 31, 32], etc. We face a discrete logistic equation with delay [33] given by the difference equation

$$x_{n+1} = r\,x_n(1 - x_{n-1}),$$

with $r$ the control parameter.

It is convenient to convert the second-order difference equation into an equivalent pair of first-order difference equations. The logistic map with delay is thus recast as a two-dimensional map:

$$x_{n+1} = r\,x_n(1 - y_n), \qquad y_{n+1} = x_n,$$

and the corresponding Lyapunov exponents can be evaluated numerically [26] via

$$\lambda_k = \lim_{N \to \infty} \frac{1}{N} \sum_{n=1}^{N} \ln \left\| J_n\, v_k^{(n)} \right\|, \qquad k = 1, 2,$$

and the Jacobian matrix of the map,

$$J_n = \begin{pmatrix} r(1 - y_n) & -r\,x_n \\ 1 & 0 \end{pmatrix},$$

with $v_1^{(n)}$ and $v_2^{(n)}$ a pair of tangent vectors kept orthonormal along the orbit by Gram-Schmidt re-orthonormalization.
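A numerical sketch of this procedure in Python (our own illustration): both exponents are obtained by propagating two tangent vectors with the Jacobian of the map and re-orthonormalizing them at every step (Gram-Schmidt, in the spirit of the standard Benettin algorithm):

```python
import math

def delayed_logistic_lyapunov(r, n=20000, burn=1000):
    """Estimate both Lyapunov exponents of the delayed logistic map
    x_{n+1} = r x_n (1 - y_n), y_{n+1} = x_n, by propagating two tangent
    vectors with the Jacobian and re-orthonormalizing them (Gram-Schmidt)."""
    x, y = 0.1, 0.1
    for _ in range(burn):                      # discard the transient
        x, y = r * x * (1.0 - y), x
    v1, v2 = (1.0, 0.0), (0.0, 1.0)
    s1 = s2 = 0.0
    for _ in range(n):
        j11, j12 = r * (1.0 - y), -r * x       # first Jacobian row
        w1 = (j11 * v1[0] + j12 * v1[1], v1[0])  # second row is (1, 0)
        w2 = (j11 * v2[0] + j12 * v2[1], v2[0])
        n1 = math.hypot(*w1)                   # growth along direction 1
        u1 = (w1[0] / n1, w1[1] / n1)
        dot = w2[0] * u1[0] + w2[1] * u1[1]    # orthogonalize w2 against u1
        w2 = (w2[0] - dot * u1[0], w2[1] - dot * u1[1])
        n2 = math.hypot(*w2)
        v1, v2 = u1, (w2[0] / n2, w2[1] / n2)
        s1 += math.log(n1)
        s2 += math.log(n2)
        x, y = r * x * (1.0 - y), x
    return s1 / n, s2 / n
```

As a sanity check, at $r = 1.5$ the orbit converges to the stable fixed point $x^* = 1 - 1/r = 1/3$, where the Jacobian has complex eigenvalues of modulus $\sqrt{1/2}$, so both exponents tend to $\tfrac12\ln(1/2) \approx -0.347$, and their sum equals the mean of $\ln|\det J_n| = \ln(1/2)$.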

The pertinent bifurcation diagram and the corresponding Lyapunov exponents are displayed as functions of the control parameter $r$.

This map has common characteristics with the usual logistic map. In particular, it also exhibits regular, periodic, and chaotic regimes as the control parameter increases.

## 4. Results and discussion

For each chaotic map previously described, time series of the same length were generated for each value of the control parameter, and the informational quantifiers were evaluated over them.

For ordinal entropic quantifiers of the Shannon kind (global quantifiers), the BP-PDF provides a univocal prescription. However, some ambiguities arise when one wishes to employ the BP-PDF to construct local quantifiers: the local sensitivity of the Fisher information measure for discrete PDFs is reflected in the fact that the specific “$i$”-ordering of the ordinal patterns must be fixed beforehand.

For the logistic map, the period-doubling zone is detected by all the quantifiers. After the onset of chaos, the chaotic zones and the periodic windows are likewise identified.

For the delayed logistic map, a regular dynamic (steady state) is observed for the lower range of the control parameter, followed by more complex regimes. The causal planes $H \times C$ and $H \times F$ reflect these regime changes as the control parameter varies.

In Figure 6, we show the behavior of the delayed logistic map for the whole range of the control parameter.

## 5. Conclusions

We have shown that, taking as a starting point a probabilistic description of a dynamical system that considers the inherent temporal causality of the generated time series through the Bandt-Pompe methodology, it is possible to evaluate information quantifiers of global or local character. A complete and detailed characterization of the dynamical system can then be successfully achieved with reference to an information causal plane in which the two coordinate axes are different information quantifiers. The causal information planes so defined are the global-global $H \times C$ plane and the global-local $H \times F$ plane.

For the discrete systems considered here, the logistic map and the delayed logistic map, we find that both causal planes clearly display the different dynamical regimes and the transitions between them.