Open access peer-reviewed chapter

A Chaos Auto-Associative Model with Chebyshev Activation Function

Written By

Masahiro Nakagawa

Submitted: 07 June 2022 Reviewed: 28 June 2022 Published: 28 July 2022

DOI: 10.5772/intechopen.106147

From the Edited Volume

Chaos Theory - Recent Advances, New Perspectives and Applications

Edited by Mykhaylo Andriychuk


Abstract

In this work, we shall put forward a novel chaos memory retrieval model with a Chebyshev-type activation function as an artificial chaos neuron. On the basis of numerical analyses of the present association model with an autocorrelation connection matrix between neurons, the dependence of the memory retrieval properties on the initial Hamming distance between the input pattern and the target pattern to be retrieved among the embedded patterns will be presented, in order to examine the retrieval abilities, i.e. the memory capacity, of the associative memory.

Keywords

  • Chebyshev Chaos neuron
  • associative memory
  • computer simulation
  • numerical analysis methods
  • memory retrieval

1. Introduction

Over the past quarter century, it has been extensively reported that inherently chaotic dynamics may exist in the human electroencephalogram (EEG), as shown in a variety of biological experiments [1, 2, 3, 4, 5, 6, 7, 8]. In addition, from the viewpoint of artificial neuron models, including chaos neurons, many researchers have investigated associative memory models with various types of activation functions [9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33] so as to improve their memory retrieval capabilities, as briefly reviewed in the following.

First, as an artificial neuron model, Tsuda reported a dynamic retrieval model as well as dynamic linking of associative memories, and pointed out a crucial role of chaos in those dynamics [9, 10, 11]. In addition, Davis and Nara et al. put forward a new memory search model with chaos control [12, 13]. Subsequently, several applications of chaos neural networks with a sigmoidal, i.e. monotonous, activation function were reported by Aihara et al. [14] and Nakamura and Nakagawa [15]. In practice, however, as confirmed by Kasahara and Nakagawa [16], the chaotic dynamics of the association process with such a sigmoidal (monotonous) activation function encounters an inevitable difficulty: the complete association of the embedded patterns becomes considerably difficult once the loading rate α = L/N, where L and N are the numbers of embedded patterns and neurons, respectively, exceeds a certain critical value α_c ∼ 0.05 for the autocorrelation learning model.

In contrast to the aforementioned monotonous chaos neuron models [14, 15, 16], the neurodynamics of a nonmonotonous activation function was investigated by Morita [19], who reported that nonmonotonous mapping in a neurodynamics may raise the critical loading rate to α_c ∼ 0.27, superior to the α_c ∼ 0.15 of the conventional association model, the Associatron, with a signum activation function [17]. Later, Shiino and Fukai examined the memory capacity for a somewhat simplified, step-like nonmonotonous activation function in a continuous-time model, instead of the aforementioned discrete-time ones, in an appropriate statistical manner, and concluded that complete association could be achieved up to a critical loading rate of ∼0.42 [20]. Subsequently, several models of nonmonotonous activation functions have been proposed, including a sinusoidal activation function to control chaos dynamics [21, 22, 23, 24, 25, 26, 27, 28, 29]. In practice, the present author reported a chaos neuron model with a periodic activation function, with which a chaos association model with discrete time as well as an orthogonal learning scheme was constructed [21, 22, 23, 24]. Therein, the memory capacity was found to be increased up to α_c ∼ 0.4, even in the search mode without any key information in the input vector, beyond the previously proposed monotonous discrete-time dynamic models [14, 15, 16]. As further possible applications, such chaotic dynamics was incorporated into the synergetic neural network [25, 26]. The earlier-noted chaos neuron models with sinusoidal periodic activation functions, referred to as the sinusoidal chaos neuron model in the present work, have been extensively applied to chaos association models, mainly to evaluate their memory capacities [21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34].

From the earlier-noted aspects, in this work we shall propose a novel chaos auto-association model with the recently proposed Chebyshev-type activation function [35], which may have the advantage of promoting the memory capacity of the associative memory beyond the conventional models proposed to date, e.g. those with the sinusoidal activation function and the signum one [9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34].

In §2, the theoretical framework of the present association model will be described, making use of the Chebyshev-type chaos activation function [35]. Therein, a chaos control related to the chaotic simulated annealing mentioned previously [21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35] will be introduced to transfer the system from a strongly chaotic initial state to a moderate one, which may lead the associative model to a memory retrieval point. Furthermore, some computer simulation results for the memory retrieval characteristics will be given in §3, in which an apparent advantage of the present chaos associative model will be elucidated in comparison with the previously proposed sinusoidal activation function model [21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33] as well as the Associatron [17]. Finally, §4 is devoted to a few concluding remarks on the presently proposed chaos associative model as well as future work to be investigated further.


2. Theory

Denoting the i-th component of the r-th embedded, or memorized, vector as e_i^r, the autocorrelation connection matrix can be defined as follows, in accordance with the conventional autocorrelation learning model [9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19]:

$$ w_{ij} = \frac{1}{N}\sum_{r=1}^{L} e_i^r e_j^r, \tag{1} $$

where N and L are the numbers of neurons and embedded vectors, respectively. Then the loading rate α, which corresponds to the ratio of the number of embedded vectors to the number of neurons, is defined by:

$$ \alpha = L/N. \tag{2} $$

Therefore, one may evaluate the memory retrieval performance, depending on the loading rate α, for an initial vector that involves a certain amount of noise relative to one of the embedded vectors, as will be explained later in detail. The updating rule of the internal state of the i-th neuron, σ_i(t), in the present discrete-time model may be introduced by:

$$ \sigma_i(t+1) = (1-k_s)\,\sigma_i(t) + k_s \sum_{j=1}^{N} w_{ij}\, s_j(t), \tag{3} $$

where k_s (0 < k_s ≤ 1) is considered to be a relaxation parameter of the updating process of the internal state σ_i(t) [34], and the output of the i-th neuron, s_i(t), is related to σ_i(t) by

$$ s_i(t) = f(\sigma_i(t)); \tag{4} $$

here f is an activation function of the neuron [9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35], which is closely related to the dynamics of the neural network [34], as has been reported for the conventional models with several nonmonotonous activation functions [9, 10, 11, 12, 13, 18, 19, 20] as well as the chaos associative models [21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34] with periodic activation functions.

In the present work, we shall focus our interest on the recently proposed Chebyshev-type activation function [35]:

$$ s_i(t) = \tanh^{-1}\!\left[\sin\!\left(\nu \sin^{-1}\left(\tanh \sigma_i(t)\right)\right)\right], \tag{5} $$

where ν may be regarded as a chaos control parameter related to the strength of chaos, as will be mentioned later [35]. If we ignore the operations of tanh^(-1) and tanh in Eq. (5), the present activation function readily reduces to the original Chebyshev function of the second kind [36], as previously argued in detail [35]. That is, the activation function of Eq. (5) is a modified version of the original Chebyshev function [36], adapted to practical applications in which the range of the internal states σ_i(t) is not restricted in general to |σ_i(t)| ≤ 1, as is enforced in the original Chebyshev function [35, 36].
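For illustration, the activation function of Eq. (5) can be sketched directly in NumPy. This is our own minimal rendering, not the authors' code; the function name is ours:

```python
import numpy as np

def chebyshev_activation(sigma, nu):
    """Chebyshev-type activation of Eq. (5):
    s = tanh^(-1)[ sin( nu * sin^(-1)( tanh(sigma) ) ) ]."""
    return np.arctanh(np.sin(nu * np.arcsin(np.tanh(sigma))))
```

For ν = 1 the three inner operations cancel and the function acts as the identity, while for even ν the output saturates toward tanh^(-1)[sin(νπ/2)] as |σ| grows, consistent with Eq. (6) below.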

To review the chaotic property of the present chaotic neuron [35], let us present the fundamental properties of a single neuron in what follows. Several profiles of the activation function of Eq. (5) for several values of ν are depicted in Figure 1. As a specific case, taking the limit σ_i(t) → ±∞ in Eq. (5), one finds

Figure 1.

The presently introduced Chebyshev-type activation function [35]. Herein, the control parameter ν is set to ν=2,3,4,5 for each profile of the activation function defined by Eq. (5). The solid, dashed, dashed-dotted, and dotted lines are for ν=2, ν=3, ν=4, and ν=5, respectively.

$$ s_i(t) = \lim_{\sigma_i(t)\to\pm\infty} \tanh^{-1}\!\left[\sin\!\left(\nu \sin^{-1}\left(\tanh \sigma_i(t)\right)\right)\right] = \tanh^{-1}\!\left[\sin\!\left(\pm\frac{\nu\pi}{2}\right)\right] = \pm\tanh^{-1}\!\left[\sin\!\left(\frac{\nu\pi}{2}\right)\right], \tag{6} $$

as is confirmed in Figure 1. Herein, the subscript i of σ_i(t) and s_i(t) is omitted for the moment, since a single neuron is considered.

Then, in order to examine the chaotic dynamics of the present chaos neuron model, replacing s_i(t) with σ_i(t+1) in Eq. (5), we have the following updating rule for the internal state σ_i(t) of the i-th neuron [35]:

$$ \sigma_i(t+1) = \tanh^{-1}\!\left[\sin\!\left(\nu \sin^{-1}\left(\tanh \sigma_i(t)\right)\right)\right]. \tag{7} $$

As an example, the chaotic behavior of the internal state σ(t) according to Eq. (7) for a single chaos neuron, in which the subscript i is again omitted, is presented in Figure 2, where ν was set to ν = 2. Moreover, the corresponding bifurcation diagram is depicted in Figure 3 over the range τ_0 ≤ τ ≤ 2 (0 < τ_0 ≪ 1), i.e. ν = 2/τ ≥ 1, along the abscissa, where ν is related to τ by [35]:

Figure 2.

An example of chaotic behavior of the internal state σ(t) according to Eq. (7), where ν = 2, i.e. τ = 2/ν = 1.

Figure 3.

The bifurcation diagram for the Chebyshev-type mapping given by Eq. (7) [35].

$$ \nu = 2/\tau. \tag{8} $$

Therein, one may confirm the variation of the chaos strength, or the complexity of the dynamics, depending on τ. In the special case ν = 1, or τ = 2/ν = 2, one may readily confirm the following relation:

$$ \sigma_i(t+1) = \tanh^{-1}\!\left[\sin\!\left(\sin^{-1}\left(\tanh \sigma_i(t)\right)\right)\right] = \sigma_i(t), \tag{9} $$

which corresponds to an identity mapping, i.e. s_i(t) = σ_i(t).
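Both the identity mapping of Eq. (9) and the sensitive dependence on initial conditions in the chaotic regime can be checked by iterating the map of Eq. (7) directly. A small sketch under our own naming; the inner sine is clipped slightly inside (-1, 1) so that tanh^(-1) stays finite in floating point:

```python
import numpy as np

def iterate_map(sigma0, nu, steps):
    """Iterate the single-neuron Chebyshev-type map of Eq. (7)."""
    traj = [float(sigma0)]
    for _ in range(steps):
        # clip keeps arctanh finite if the sine rounds to +-1
        inner = np.clip(np.sin(nu * np.arcsin(np.tanh(traj[-1]))),
                        -1 + 1e-15, 1 - 1e-15)
        traj.append(float(np.arctanh(inner)))
    return traj
```

With ν = 1 the orbit stays fixed (Eq. (9)); with ν = 2 two orbits started 10^(-8) apart separate rapidly, in accordance with the positive Lyapunov exponent discussed below.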

From the updating rule of Eq. (7), the corresponding Lyapunov exponent λ can be derived as [35]:

$$ \lambda = \lim_{T\to\infty}\frac{1}{T}\sum_{t=0}^{T-1}\log\left|\frac{d\sigma(t+1)}{d\sigma(t)}\right| = \lim_{T\to\infty}\frac{1}{T}\sum_{t=0}^{T-1}\log\left|\frac{\nu\cos z(t)}{\cos \nu z(t)}\right| = \log\nu + \lim_{T\to\infty}\frac{1}{T}\sum_{t=0}^{T-1}\log\left|\frac{\cos z(t)}{\cos \nu z(t)}\right|, \tag{10} $$

where

$$ z(t) = \sin^{-1}\left(\tanh \sigma(t)\right). \tag{11} $$

To derive Eq. (10), we have used the following derivative:

$$ \frac{d\sigma(t+1)}{d\sigma(t)} = \frac{dg_\nu(\eta(t))}{d\sigma(t)} = \frac{dg_\nu}{df_\nu}\,\frac{df_\nu}{d\eta}\,\frac{d\eta}{d\sigma} = \frac{\nu\cos z(t)}{\cos \nu z(t)}, \tag{12} $$

where g_ν, f_ν, η(t), and z(t) are defined as:

$$ g_\nu(\eta(t)) = \tanh^{-1}\left(f_\nu(\eta(t))\right), \tag{13} $$
$$ f_\nu(\eta(t)) = \sin\!\left(\nu \sin^{-1}\eta(t)\right) = \sin\left(\nu z(t)\right), \tag{14} $$
$$ \eta(t) = \sin z(t) = \tanh \sigma(t), \tag{15} $$

respectively [35].

The dependence of the Lyapunov exponent λ, derived by Eq. (10), on τ = 2/ν is illustrated with dot symbols in Figure 4, where λ is found to decrease as τ increases toward ν = 2/τ = 1, i.e. the identity mapping of Eq. (9). In addition, the following approximate expression [34] for the Lyapunov exponent is also depicted as a solid curve in Figure 4:

Figure 4.

The Lyapunov exponents for the Chebyshev-type mapping given by Eq. (7) for 0 < τ = 2/ν < 2. Herein, the Lyapunov exponents on the ordinate are normalized by log 2 for convenience. The dots and the solid line are for the numerical results derived by Eqs. (7) and (10) and the approximate expression log(2/τ)/log 2 of Eq. (16), respectively.

$$ \lambda \simeq \log\nu = \log\frac{2}{\tau}. \tag{16} $$

Therein, one may confirm that the aforementioned expression holds approximately in comparison with the numerical results of Eqs. (7) and (10). Hence, according to Eqs. (10) and (16), the following relation is considered to be approximately satisfied [35]:

$$ \lim_{T\to\infty}\frac{1}{T}\sum_{t=0}^{T-1}\log\left|\frac{\cos z(t)}{\cos \nu z(t)}\right| \simeq 0. \tag{17} $$

Especially for the specific case ν = 2, or τ = 1, an analytical expression of the Lyapunov exponent λ has been derived as [35]

$$ \lambda = \log 2, \tag{18} $$

which is exactly the same as that of the logistic mapping, because of the topological conjugacy of the Chebyshev-type mapping of Eq. (5) with the conventional logistic one [35]. In the aforementioned work [35], the corresponding invariant measure was also investigated further, in comparison with the sinusoidal and the monotonous activation functions [37].
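The analytical value λ = log 2 for ν = 2 in Eq. (18) can be checked against a direct numerical evaluation of Eq. (10), accumulating the log-derivative of Eq. (12) along an orbit of Eq. (7). A sketch with our own naming and default parameters:

```python
import math

def lyapunov_chebyshev(nu, steps=100000, sigma0=0.3):
    """Numerical estimate of the Lyapunov exponent of Eq. (10) for the map of
    Eq. (7), using the derivative nu*cos(z)/cos(nu*z) of Eq. (12)."""
    sigma, total = sigma0, 0.0
    for _ in range(steps):
        z = math.asin(math.tanh(sigma))                              # Eq. (11)
        total += math.log(abs(nu * math.cos(z) / math.cos(nu * z)))  # Eq. (12)
        # clip keeps atanh inside its open domain (-1, 1)
        s = max(-1 + 1e-15, min(1 - 1e-15, math.sin(nu * z)))
        sigma = math.atanh(s)                                        # Eq. (7)
    return total / steps
```

For integer ν the estimate should also approach log ν, in agreement with the approximation of Eq. (16) together with Eq. (17).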

From these results, the strength of chaos is found to be properly controllable through the parameter τ = 2/ν, as in the previous chaos associative memories [21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34] as well as the learning models with the back-propagation algorithm [38, 39, 40].

In a similar manner to the previous models [21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34, 37, 38, 41], the control dynamics of the parameter τ(t) may be simply introduced as follows:

$$ \tau(t+1) = \tau(t) + \kappa\left(\tau_\infty - \tau(t)\right), \tag{19} $$

where 0 < κ < 1 is regarded as a parameter controlling the number of steps, or the speed, of the transient retrieval process from an initial chaotic state with a relatively small τ(0) = τ_0 ≪ 1 to a memory retrieval point with τ(t) → τ_∞. Since ν = 2/τ, ν varies from 2/τ_0 ≫ 1 to 2/τ_∞ as τ(t) changes from τ_0 to τ_∞ according to Eq. (19). Solving the difference equation (19) with the initial value τ(0) = τ_0 ≪ 1, one may readily derive the following analytical expression for τ(t):

$$ \tau(t) = \tau_\infty - (\tau_\infty - \tau_0)(1-\kappa)^t = \tau_\infty - (\tau_\infty - \tau_0)\exp(-t/T_c), \tag{20} $$

where T_c is regarded as a time constant of the retrieval process from τ_0 (0 < τ_0 ≪ 1) to τ_∞ and is given by:

$$ T_c = -\frac{1}{\log(1-\kappa)} = \frac{1}{\log\dfrac{1}{1-\kappa}}. \tag{21} $$

Updating τ(t) from a certain initial value τ(0) = τ_0 (0 < τ_0 ≪ 1) and solving Eq. (3) together with Eqs. (5), (8), and (19) simultaneously, under an initial vector set s_i(0) (1 ≤ i ≤ N) that has a certain Hamming distance with respect to a target vector e_i^s (1 ≤ i ≤ N, 1 ≤ s ≤ L) to be retrieved among the embedded vectors, we may construct a chaos auto-associative model with the Chebyshev-type activation function [35].
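The chaos-annealing schedule of Eqs. (19)-(21) can be sketched as follows; the function names are ours, and the closed form of Eq. (20) is verified against the iterated recurrence:

```python
import math

def tau_iterated(tau0, tau_inf, kappa, steps):
    """Iterate the control dynamics of Eq. (19)."""
    taus = [tau0]
    for _ in range(steps):
        taus.append(taus[-1] + kappa * (tau_inf - taus[-1]))
    return taus

def tau_closed_form(tau0, tau_inf, kappa, t):
    """Closed-form solution of Eq. (20)."""
    return tau_inf - (tau_inf - tau0) * (1.0 - kappa) ** t

def time_constant(kappa):
    """Time constant of Eq. (21): Tc = -1/log(1 - kappa)."""
    return -1.0 / math.log(1.0 - kappa)
```

With κ = 0.5 the schedule relaxes from τ_0 toward τ_∞ within a few tens of steps, matching the geometric decay (1 - κ)^t.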


3. Computer simulation results

In this section, we shall show some numerical results derived from the aforementioned chaos auto-association model defined by Eqs. (1), (3), (5), (8), and (19).

For the numerical simulations, the embedded vectors e_i^r (1 ≤ i ≤ N, 1 ≤ r ≤ L) are defined as random bipolar vectors as follows:

$$ e_i^r = \operatorname{sign}\left(z_i^r\right), \tag{22} $$

where sign(·) is the signum function such that

$$ \operatorname{sign}(x) = \begin{cases} +1 & (x>0) \\ 0 & (x=0) \\ -1 & (x<0), \end{cases} \tag{23} $$

and z_i^r (1 ≤ i ≤ N, 1 ≤ r ≤ L) are pseudo-random numbers between -1 and +1. In addition, the e_i^r (1 ≤ r ≤ L) are assumed to be linearly independent of each other, to avoid redundancy among the embedded vectors, such that

$$ \det\left[\sum_{i=1}^{N} e_i^r e_i^s\right] \neq 0, \tag{24} $$

where det[·] denotes the determinant of the L×L matrix whose (r, s) element is given by the sum in Eq. (24). Then the connection matrix between the neurons, i.e. the autocorrelation matrix w_ij, is straightforwardly derived in terms of Eq. (1).
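Eqs. (22), (1), and (2) amount to the following construction. This is a self-contained sketch with our own variable names; the explicit independence check of Eq. (24) is omitted here because random bipolar vectors with L ≪ N are almost surely linearly independent (which the rank of W reflects):

```python
import numpy as np

rng = np.random.default_rng(7)
N, L = 200, 20                        # numbers of neurons and embedded patterns
alpha = L / N                         # loading rate, Eq. (2)
z = rng.uniform(-1.0, 1.0, size=(L, N))
e = np.sign(z)                        # random bipolar patterns, Eq. (22)
W = (e.T @ e) / N                     # autocorrelation connection matrix, Eq. (1)
```

Note that W is symmetric with diagonal elements equal to L/N = α, since each (e_i^r)^2 = 1.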

The initial conditions for s_i(t) (1 ≤ i ≤ N) may be set [21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33] in the following manner:

$$ s_i(0) = \begin{cases} -e_i^s & (1 \leq i \leq H_d) \\ e_i^s & (H_d+1 \leq i \leq N), \end{cases} \tag{25} $$

where H_d is the Hamming distance between the initial vector s_i(0) and the concerned target vector e_i^s (s ∈ {1, 2, …, L}) to be retrieved, which is randomly chosen among the embedded vectors e_i^r (1 ≤ r ≤ L). Of course, since the presently introduced embedded patterns are defined as random bipolar vectors as in Eq. (22), Eq. (25) is substantially equivalent to a random choice of the bit-reversed components in an initial vector s_i(0), the H_d bit-reversed components corresponding to the noise involved in s_i(0). For the minimum noise, or minimum Hamming distance, in the initial vector s_i(0), H_d is set to 1, which corresponds to a single-bit reversal in the initial vector s_i(0) relative to e_i^s. Given the initial vector s_i(0), the corresponding initial values of the internal states σ_i(0) may be set as:

$$ \sigma_i(0) = \sum_{j=1}^{N} w_{ij}\, s_j(0), \tag{26} $$

setting σ_i(t+1) = σ_i(t) = σ_i(0) and s_i(t) = s_i(0) in Eq. (3).

For convenience, the ratio of the number of distorted components according to Eq. (25), i.e. the Hamming distance H_d, to the number of neurons N may be introduced as:

$$ r_d = H_d/N. \tag{27} $$

Hence, the directional cosine, i.e. the normalized inner product, ς^s between the input vector s_i(0) and the target vector e_i^s can be evaluated in terms of r_d as follows:

$$ \varsigma^s = \frac{1}{N}\sum_{i=1}^{N} e_i^s s_i(0) = 1 - \frac{2H_d}{N} = 1 - 2r_d. \tag{28} $$

It is well known that successful memory retrieval becomes apparently troublesome if ς^s is decreased from ς^s = 1 toward ς^s ≃ 0 in the initial condition, since the information available for retrieving the target vector e_i^s tends to be depressed as r_d increases from r_d = 0 toward r_d ≃ 0.5 [9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34, 41].
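The construction of Eqs. (25)-(28) can be sketched as below. This is a self-contained illustration with our own names; the target index is chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(1)
N, L, Hd = 100, 5, 10
e = np.sign(rng.uniform(-1.0, 1.0, size=(L, N)))   # embedded patterns, Eq. (22)
W = (e.T @ e) / N                                   # connection matrix, Eq. (1)

s_idx = 2                       # target pattern chosen among the embedded ones
s0 = e[s_idx].copy()
s0[:Hd] *= -1.0                 # Eq. (25): reverse the first Hd components
sigma0 = W @ s0                 # initial internal states, Eq. (26)

rd = Hd / N                                # Eq. (27)
cosine = float(e[s_idx] @ s0) / N          # directional cosine, Eq. (28)
```

Flipping H_d of the N bipolar components changes the inner product by 2H_d, which is exactly the relation ς^s = 1 - 2r_d of Eq. (28).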

Then, in a similar manner to the previous works [21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34, 41] on associative memories, the overlaps o^r(t) between the embedded vectors e_i^r (1 ≤ r ≤ L) and the output vector s_i(t) may be defined as:

$$ o^r(t) = \frac{1}{N}\sum_{i=1}^{N} e_i^r \operatorname{sign}\left(s_i(t)\right). \tag{29} $$

If one of the overlaps o^r(t) (1 ≤ r ≤ L), e.g. o^s(t), reaches +1 or -1, then one may conclude that a complete memory retrieval of e_i^s is achieved, with s_i(t) = ±e_i^s. It has to be noted here that there exists an uncertainty in the sign of o^s(t), because of the invariance of the dynamic equation, Eq. (3), and the activation function, Eq. (5), under s_i(t) → -s_i(t) together with σ_i(t) → -σ_i(t). Thus, the success rate over different random embedded sets e_i^r = sign(z_i^r) (1 ≤ r ≤ L) may be evaluated by judging each memory retrieval according to the following criteria:

$$ \left|\,\max_{1\leq r\leq L}\left|o^r(t)\right| - 1\,\right| = \left|\,\max_{1\leq r\leq L}\left|\frac{1}{N}\sum_{i=1}^{N} e_i^r \operatorname{sign}\left(s_i(t)\right)\right| - 1\,\right| < \varepsilon \ll 1, \tag{30} $$

and

$$ \left|\tau(t+1) - \tau(t)\right| < \varepsilon \ll 1, \tag{31} $$

as τ(t) approaches τ_∞ starting from τ(0) = τ_0 ≪ 1 in accordance with Eq. (19) [33, 34]. Provided that both conditions, Eqs. (30) and (31), are satisfied simultaneously, the current memory retrieval is judged to be successful. Alternatively, if only the latter condition, Eq. (31), is satisfied, the retrieval trial is judged to have failed. In either case, the next retrieval process with Eq. (3) is restarted with another random set of embedded vectors as in Eq. (22), constituting the connection matrix w_ij as in Eq. (1), resetting τ(t) to the initial value τ_0 and s_i(0) and σ_i(0) to Eqs. (25) and (26), with α and r_d fixed. This retrieval process is successively repeated up to the total number of trials, T_s.
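A single retrieval trial, combining the dynamics of Eq. (3), the activation of Eqs. (5) and (8), the annealing of Eq. (19), the overlaps of Eq. (29), and the stopping criteria of Eqs. (30)-(31), may be sketched as below. This is our own condensed reading of the procedure, not the authors' code; the inner sine is clipped slightly inside (-1, 1) so that tanh^(-1) stays finite, and the single-pattern usage example is only a minimal sanity check, far below the loading rates studied in the chapter:

```python
import numpy as np

def retrieval_trial(W, e, s0, ks=0.2, kappa=0.5, tau0=1e-10,
                    tau_inf=1.0, eps=1e-10, max_steps=1000):
    """One memory-retrieval trial; returns (success, overlaps)."""
    N = W.shape[0]
    s = s0.astype(float).copy()
    sigma = W @ s                                    # initial states, Eq. (26)
    tau = tau0
    for _ in range(max_steps):
        sigma = (1.0 - ks) * sigma + ks * (W @ s)    # Eq. (3)
        nu = 2.0 / tau                               # Eq. (8)
        inner = np.clip(np.sin(nu * np.arcsin(np.tanh(sigma))),
                        -1 + 1e-15, 1 - 1e-15)
        s = np.arctanh(inner)                        # Eq. (5)
        overlaps = e @ np.sign(s) / N                # Eq. (29)
        tau_next = tau + kappa * (tau_inf - tau)     # Eq. (19)
        if abs(tau_next - tau) < eps:                # Eq. (31)
            success = abs(np.max(np.abs(overlaps)) - 1.0) < eps  # Eq. (30)
            return success, overlaps
        tau = tau_next
    return False, overlaps

# minimal sanity check: a single embedded pattern with a one-bit-reversed input
rng = np.random.default_rng(3)
N = 100
e = np.sign(rng.uniform(-1.0, 1.0, size=(1, N)))
W = (e.T @ e) / N
s0 = e[0].copy()
s0[0] *= -1.0                                        # Hd = 1
ok, ov = retrieval_trial(W, e, s0)
```

With κ = 0.5 and ε = 10^(-10), the criterion of Eq. (31) is triggered after roughly 30 steps, at which point the overlap of Eq. (29) is evaluated.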

Thus, through the present numerical simulations for given values of α = L/N and r_d = H_d/N, one may evaluate the number of successful trials N_s(α, r_d) among the T_s trials with the aforementioned different random bipolar vector sets e_i^r (1 ≤ i ≤ N, 1 ≤ r ≤ L) and connection matrices w_ij, defined in terms of Eqs. (22) and (1). Here, the number of trials T_s corresponds to the number of retrieval processes from τ(0) = τ_0 (0 < τ_0 ≪ 1) to τ_∞, according to Eq. (19) or Eq. (20), with the criterion of Eq. (31). Then, counting the total number of successful trials N_s(α, r_d) among the T_s trials, in accordance with the criteria of Eqs. (30) and (31) for a pair of fixed parameters α = L/N and r_d = H_d/N, the success rate of association S_r(α, r_d) is introduced as follows:

$$ S_r(\alpha, r_d) = \frac{N_s(\alpha, r_d)}{T_s}. \tag{32} $$

Thus, the memory retrieval ability of the associative memory model may be evaluated in terms of the rate of successful memory retrievals satisfying the conditions of Eqs. (30) and (31) for a fixed value of r_d = H_d/N, i.e. the S_r(α, r_d) vs. α = L/N characteristics, as will be presented later. Hence, the memory capacity M_c(r_d) may be defined as the total area under the S_r(α, r_d) (ordinate) vs. α = L/N (abscissa) curve, as follows:

$$ M_c(r_d) = \int_0^1 S_r(\alpha, r_d)\, d\alpha. \tag{33} $$
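Given a measured success-rate curve, the memory capacity of Eq. (33) is a simple numerical quadrature. A sketch with a hypothetical, purely illustrative S_r curve (not data from this chapter):

```python
import numpy as np

def memory_capacity(alphas, sr):
    """Trapezoidal estimate of Eq. (33): Mc = integral of Sr over alpha in [0, 1]."""
    return float(np.sum(0.5 * (sr[1:] + sr[:-1]) * np.diff(alphas)))

# hypothetical success-rate curve dropping near alpha = 0.5 (illustration only)
alphas = np.linspace(0.0, 1.0, 201)
sr = 1.0 / (1.0 + np.exp((alphas - 0.5) / 0.02))
mc = memory_capacity(alphas, sr)
```

A curve that stays near 1 up to α ≈ 0.5 and then drops yields M_c ≈ 0.5, which is how the capacities quoted below should be read.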

In the following practical simulations, the aforementioned parameters N, k_s, κ, T_s, τ_∞, τ_0, and ε (see Eqs. (30) and (31)) are set to 1000, 0.2, 0.5, 100, 1, 10^(-10), and 10^(-10), respectively, unless otherwise mentioned.

In Figure 5a-d, the retrieval characteristics, i.e. the dependence of the success rate of association S_r(α, r_d) defined by Eq. (32) on the loading rate α = L/N, are depicted for r_d = 1/N = 0.001 (ς^s = 0.998), r_d = 0.1 (ς^s = 0.8), r_d = 0.2 (ς^s = 0.6), and r_d = 0.3 (ς^s = 0.4), respectively. From these results, the memory retrieval abilities of the presently proposed Chebyshev-type associative memory model are found to be apparently superior to those of the conventional association models [9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34, 41]. As r_d = H_d/N increases, i.e. as ς^s decreases according to Eq. (28), the memory capacities M_c(r_d) defined by Eq. (33) are found to decrease, similarly to the conventional associative models [9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34, 41]. In practice, the memory capacities were evaluated as M_c(0.001) = 0.4858, M_c(0.1) = 0.4292, M_c(0.2) = 0.3680, M_c(0.3) = 0.2472, and M_c(0.4) = 0.1168. From these findings, one may confirm the advantage of the present chaos associative model with the Chebyshev-type activation function of Eq. (5) over the conventional association models [21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34, 41]. This point will be discussed later in detail with reference to Figure 6.

Figure 5.

Memory retrieval characteristics for the Chebyshev-type activation function of Eq. (5). Here, the abscissa and the ordinate are the loading rate α = L/N and the success rate of association S_r(α, r_d) defined by Eq. (32). (a) r_d = 0.001, (b) r_d = 0.1, (c) r_d = 0.2, and (d) r_d = 0.3.

Figure 6.

Comparison of the memory capacities defined by Eq. (33), i.e. the total areas under the S_r(α, r_d) vs. α = L/N curves given in Figures 5, 8 and 9, depending on r_d = H_d/N, for the three models examined here. The solid, dashed-dotted, and dotted lines are for the Chebyshev-type activation function of Eq. (5), the sinusoidal activation function of Eq. (34), and the Associatron with the signum activation function of Eq. (35), respectively.

In Figure 7, let us present some examples of the time dependences of the internal states σ_i(t) in Eq. (3), the outputs s_i(t) derived from Eq. (5), and the overlaps o^r(t) defined by Eq. (29), under the updating process of τ(t) according to Eq. (19) with the initial value τ(0) = τ_0, as shown in Figure 7d, for the loading rate α = L/N = 0.3 and r_d = H_d/N = 0.1, 0.2, 0.3. From these results, one may confirm the success of the memory retrievals in (a-1)-(a-3) with r_d = 0.1 as well as in (b-1)-(b-3) with r_d = 0.2, whereas the memory retrieval is no longer attained in (c-1)-(c-3) with r_d = 0.3. This tendency coincides with the previously shown success rates for r_d = 0.1, 0.2, 0.3 in Figure 5b-d, respectively. In addition, even for α = 0.3 and r_d = 0.2, one may find successful memory retrievals in which o^s(t) = ±max_r |o^r(t)| approaches +1 or -1, satisfying Eq. (30). Of course, it is well appreciated that, for such a case with α = 0.3 and r_d = 0.2, successful memory retrieval becomes considerably troublesome in the conventional association models [9, 10, 11, 12, 13, 14, 18, 19, 20, 21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34]. Comparing the time dependences of σ_i(t) or s_i(t) between the successful cases (see Figure 7a and b) and the failed one (see Figure 7c), it may be found that, in the failed retrieval case of Figure 7(c-3), there no longer exists a separated regime with σ_i(t) < 0 and σ_i(t) > 0, or with s_i(t) < 0 and s_i(t) > 0, as τ(t) → τ_∞, as can be seen in Figure 7(c-1) and (c-2). This tendency in Figure 7(c-1) and (c-2) is obviously different from Figure 7(a-1), (a-2) and Figure 7(b-1), (b-2). Accordingly, as confirmed in Figure 7(c-3), max_r |o^r(t)| no longer approaches 1 under such behaviors of σ_i(t) and s_i(t). Now we are in a position to compare the present results with the conventional models with the sinusoidal and the signum activation functions.
In practice, we shall compare the present numerical results for the Chebyshev-type activation function with the previously reported sinusoidal activation function model [21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34] and the Associatron with the signum activation function proposed by Nakano [17], which are given by:

Figure 7.

Some examples of the time dependences of the internal states σ_i(t) in Eq. (3), the outputs s_i(t) derived from Eq. (5), and the overlaps o^r(t) defined by Eq. (29), under the updating process of τ(t) according to Eq. (19) with the initial value τ(0) = τ_0, for α = L/N = 0.3 and r_d = H_d/N = 0.1, 0.2, 0.3. (a-1) The internal states σ_i(t) for r_d = 0.1. (a-2) The outputs s_i(t) for r_d = 0.1. (a-3) The overlaps o^r(t) for r_d = 0.1. (b-1) The internal states σ_i(t) for r_d = 0.2. (b-2) The outputs s_i(t) for r_d = 0.2. (b-3) The overlaps o^r(t) for r_d = 0.2. (c-1) The internal states σ_i(t) for r_d = 0.3. (c-2) The outputs s_i(t) for r_d = 0.3. (c-3) The overlaps o^r(t) for r_d = 0.3. (d) The time dependence of τ(t), which follows Eq. (19) and is common to all panels (a-1)-(c-3).

$$ s_i(t) = \sin\!\left(\frac{\pi}{2\tau}\,\sigma_i(t)\right), \tag{34} $$

and

$$ s_i(t) = \operatorname{sign}\left(\sigma_i(t)\right), \tag{35} $$

respectively; here τ ranges over τ_0 ≤ τ ≤ τ_∞ in Eq. (34), which may be controlled in terms of Eq. (19), and sign(·) is again the signum function defined by Eq. (23). The fundamental chaotic property of the sinusoidal chaos neuron defined by Eq. (34) has been reported in the previous paper [35], in which, in a similar manner to the present Chebyshev-type activation function, the strength of chaos, i.e. the Lyapunov exponent, was found to be approximated by [34, 35, 37]:

$$ \lambda \simeq \log\frac{\pi}{4\tau}. \tag{36} $$

Therefore, the chaos dynamics is expected to be properly controlled by the parameter τ [35], which is updated through Eq. (19) or Eq. (20) in a similar manner to the aforementioned Chebyshev-type association model.
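For reference, the two comparison activations of Eqs. (34) and (35) are trivial to state in code (function names are ours):

```python
import numpy as np

def sinusoidal_activation(sigma, tau):
    """Sinusoidal chaos-neuron activation, Eq. (34)."""
    return np.sin(np.pi * sigma / (2.0 * tau))

def signum_activation(sigma):
    """Signum activation of the Associatron, Eq. (35)."""
    return np.sign(sigma)
```

At τ = 1 the sinusoidal activation passes through ±1 at σ = ±1, while for smaller τ it oscillates more rapidly in σ, which is what makes the dynamics more strongly chaotic according to Eq. (36).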

The resultant retrieval characteristics for Eq. (34) (sinusoidal activation function) and Eq. (35) (signum activation function) are given in Figures 8 and 9, respectively. First, in Figure 8 for the sinusoidal activation function of Eq. (34), according to the α vs. S_r(α, r_d) curves, the memory capacities of Eq. (33) were evaluated as M_c(0.001) = 0.3234, M_c(0.1) = 0.2078, M_c(0.2) = 0.1254, M_c(0.3) = 0.1028, and M_c(0.4) = 0.0560. Here, the control parameter τ = τ(t) in Eq. (34) was updated in terms of Eq. (19), in the same manner as for the previously mentioned Chebyshev-type model in Figure 5.

Figure 8.

Memory retrieval characteristics for the sinusoidal activation function of Eq. (34) instead of Eq. (5). Here, the abscissa and the ordinate are the loading rate α = L/N and the success rate S_r(α, r_d) defined by Eq. (32). (a) r_d = 0.001, (b) r_d = 0.1, (c) r_d = 0.2, and (d) r_d = 0.3.

Figure 9.

Memory retrieval characteristics for the signum function of Eq. (35) instead of Eq. (5). Here, the abscissa and the ordinate are the loading rate α = L/N and the association rate defined by Eq. (32). (a) r_d = 0.001, (b) r_d = 0.1, (c) r_d = 0.2, and (d) r_d = 0.3.

Then, according to the α vs. S_r(α, r_d) curves in Figure 9, the memory capacities M_c(r_d) of Eq. (33) for the Associatron with Eq. (35) were evaluated as M_c(0.001) = 0.2282, M_c(0.1) = 0.1454, M_c(0.2) = 0.1402, M_c(0.3) = 0.1278, and M_c(0.4) = 0.070. These values are lower than those of the aforementioned Chebyshev-type associative model with Eq. (5) over 0.001 < r_d = H_d/N < 0.4, and lower than those of the sinusoidal associative model with Eq. (34) especially for relatively small r_d, such that r_d = H_d/N < 0.1.

In Figure 6, let us present the dependence of the memory capacities M_c(r_d) on r_d = H_d/N for the three models presently concerned, i.e. the Chebyshev-type activation function of Eq. (5) [35], the sinusoidal activation function of Eq. (34) [21, 22, 23, 26, 27, 28, 29, 30, 31, 32, 33, 34], and the Associatron of Eq. (35) [17], for comparison. From these results, one may find the advantage of the presently proposed Chebyshev-type activation function of Eq. (5) over the previous models of Eq. (34) [21, 22, 23, 26, 27, 28, 29, 30, 31, 32, 33, 34] and Eq. (35) [17], as well as over the other association models with the autocorrelation connection matrix w_ij and the discrete-time updating scheme of Eq. (3) [9, 10, 11, 12, 13, 14, 15, 18, 20]. Especially for relatively small values of r_d, such that r_d = H_d/N < 0.1, it is confirmed that the chaos retrieval models with the Chebyshev-type and the sinusoidal activation functions possess a remarkable advantage over the non-chaos association model, the Associatron [17]. Conversely, for relatively large r_d, such that r_d = H_d/N > 0.2, the memory capacities of the sinusoidal chaos associative memory are found to be depressed and become slightly lower than those of the Associatron [17]. As a whole, however, the Chebyshev-type chaos association model shows an apparent advantage over both of these models, i.e. the sinusoidal chaos model [21, 22, 23, 26, 27, 28, 29, 30, 31, 32, 33, 34] and the Associatron [17], over r_d = 0.001-0.4. Therefore, one may conclude that the chaos retrieval memories, such as the Chebyshev-type and sinusoidal models, have a great advantage especially for relatively small r_d rather than for relatively large r_d. This finding may indicate that the information contained in the input vector s_i(0), related to r_d as defined by Eq. (25), is closely related to the retrieval ability even in the chaos memory retrieval processes, similarly to the conventional non-chaos association model, the Associatron [17, 34].
It seems instructive to note here that the memory capacity M_c(r_d = 1/N) ≃ 0.486 of the present Chebyshev-type association model with the autocorrelation connection matrix w_ij is considerably close to ∼0.5, which is the critical value of the memory capacity for the orthogonal learning models with the signum-type activation function of Eq. (35), investigated by means of a statistical approach in analogy with the spin glass system, as previously reported [42, 43].


4. Concluding remarks

In this chapter, we have proposed a chaos associative memory model with the Chebyshev-type activation function [35] instead of the conventional ones, such as the sinusoidal one [21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34] and the signum one [17]. From the present computer simulation results, we have confirmed the apparent advantage of the present chaos associative model over the aforementioned conventional models from the viewpoint of the memory capacity, which is related to the robustness against the included noise, i.e. the distorted components in an input vector, in the memory retrieval. In practice, for the extreme case r_d = 1/N ≪ 1, the memory capacities of the Chebyshev-type activation function of Eq. (5), the sinusoidal activation function of Eq. (34), and the signum function of Eq. (35) are found to be 0.486, 0.341, and 0.228, respectively. Moreover, it has been found that the presently focused Chebyshev-type activation function has a remarkable advantage over the sinusoidal association models [21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34] as well as over the non-chaos association model, the Associatron [17], as can be confirmed by the r_d vs. M_c(r_d) characteristics in Figure 6. Moreover, for relatively large values of r_d, such that r_d > 0.2, the memory capacities of the sinusoidal chaos associative memory were found to be slightly lower than those of the Associatron [17], as can be seen in Figure 6, whereas the Chebyshev-type model has shown relatively large memory capacities beyond both the sinusoidal chaos model [21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34] and the Associatron [17]. Especially, it has to be borne in mind here again that the memory capacity of the Chebyshev-type model for H_d = 1, or r_d = 1/N, is promoted up to M_c ∼ 0.486, which is remarkably close to that of the Associatron with the orthogonal learning model [42, 43], for which M_c ∼ 0.5.

Several investigations on an optimization scheme for the model parameters, such as k_s, κ, τ_0, and τ_∞, are now in progress and will be reported elsewhere in the near future.

Moreover, it also seems interesting to investigate the orthogonal learning model [42, 43] as well as the continuous-time model [20]. It seems worthwhile to introduce the present Chebyshev-type activation function into the chaotic learning problems previously reported [38, 39, 40], as well as into the design of classifiers applied to sensibility measurement schemes [44], in connection with applications to communication technology, e.g. a silent speech interface as seen in human-brain interfaces. In addition, a chaos learning model with the present Chebyshev-type activation function is to be investigated in comparison with the other chaos learning models with sinusoidal activation functions [38, 39, 40].

Acknowledgments

The present work was supported in part by the START Program of the Japan Science and Technology Agency in 2014-2016. The author gratefully acknowledges the Grants-in-Aid for Scientific Research of MEXT, No. 21300081, No. 23650109, and No. 24300084. The author would also like to sincerely thank his academic staff and his family for their continuous heartwarming support.

References

  1. Babloyantz A, Salazar JM, Nicolis C. Evidence of chaotic dynamics of brain activity during the sleep cycle. Physics Letters A. 1985;111:152-156
  2. Siska J. On some problems encountered in the estimation of the correlation dimension of the EEG. Physics Letters A. 1986;118:63-66
  3. Mayer-Kress G, Layne SP. Dimensionality of the human electroencephalogram. Annals of the New York Academy of Sciences. 1987;504:62-87
  4. Mayer-Kress G, Yates FE, Benton L, Keidel M, Tirsch W, Poppl SJ, et al. Dimensional analysis of nonlinear oscillations in brain, heart, and muscle. Mathematical Biosciences. 1988;90:155-182
  5. Nan X, Jinghua X. The fractal dimension of EEG as a physical measure of conscious human brain activities. Bulletin of Mathematical Biology. 1988;50:559-565
  6. Nakagawa M. On the chaos and fractal properties in EEG data. IEICE Transactions on Fundamentals. 1995;78:161-168
  7. Nakagawa M. Chaos and fractal properties in EEG data. Electronics and Communications in Japan. 1995;78:27-36
  8. Duke DW, Pritchard WS, editors. Measuring Chaos in the Human Brain: Proceedings of the Conference. World Scientific; 1991
  9. Tsuda I. Chaotic neural networks and thesaurus. In: Holden AV, Kryukov VI, editors. Proc. of the International Conference on Neurocomputers and Attention. Manchester University Press; 1991. pp. 405-424
  10. Tsuda I. Dynamic link of memory: Chaotic memory map in non-equilibrium neural networks. Neural Networks. 1992;5:313-326
  11. Tsuda I, Koerner E, Shimizu H. Memory dynamics in asynchronous neural networks. Progress of Theoretical Physics. 1987;78:51-71
  12. Davis P. Application of optical chaos to temporal pattern search in a nonlinear optical resonator. Japanese Journal of Applied Physics. 1990;29(7A):L1238
  13. Nara S, Davis P, Totsuji H. Memory search using complex dynamics in a recurrent neural network model. Neural Networks. 1993;6:963-973
  14. Aihara K, Takabe T, Toyoda M. Chaotic neural networks. Physics Letters A. 1990;144:333-340
  15. Nakamura K, Nakagawa M. On the associative model with parameter controlled chaos neurons. Journal of the Physical Society of Japan. 1993;62:2942-2955
  16. Kasahara T, Nakagawa M. Parameter-controlled chaos neural networks. Electronics and Communications in Japan, Part III: Fundamentals. 1995;78:1-10
  17. Nakano K. Associatron: A model of associative memory. IEEE Transactions on Systems, Man, and Cybernetics. 1972;SMC-2:380-388
  18. Nozawa H. A neural network model as a globally coupled map and applications based on chaos. Chaos. 1992;2:377-386
  19. Morita M. Associative memory with nonmonotone dynamics. Neural Networks. 1993;6:115-126
  20. Fukai T. Self-consistent signal-to-noise analysis of the statistical behavior of analog neural networks and enhancement of the storage capacity. Physical Review E. 1993;48:867-897
  21. Nakagawa M. A model of chaos neuro-dynamics with a periodic activation function. In: Proc. 1994 International Conference on Neural Information Processing (ICONIP'94). Seoul; 1994. pp. 609-613
  22. Nakagawa M. A model of chaos neuro-dynamics. In: Proc. 1994 International Conference on Dynamical Systems and Chaos (ICDC'94). Tokyo; 1995. pp. 603-607
  23. Nakagawa M. An artificial neuron model with a periodic activation function. Journal of the Physical Society of Japan. 1994;64:1023-1031
  24. Kasahara T, Nakagawa M. A study of association model with periodic chaos neurons. Journal of the Physical Society of Japan. 1995;64:4964-4977
  25. Nakagawa M. A synergetic neural network. IEICE Transactions on Fundamentals. 1995;E78-A:412-423
  26. Nakagawa M. Chaos synergetic neural network. Journal of the Physical Society of Japan. 1995;64:3112-3119
  27. Nakagawa M. Periodic chaos neural networks. In: Proc. 1995 International Conference on Neural Networks (ICNN'95). Australia; 1995. pp. 3028-3033
  28. Nakagawa M. A novel chaos neuron model with a periodic mapping. Journal of the Physical Society of Japan. 1997;66:263-267
  29. Nakagawa M. A parameter controlled chaos neural network. Journal of the Physical Society of Japan. 1996;65:1859-1867
  30. Nakagawa M. An autonomously controlled chaos neural network. In: Proc. 1996 International Conference on Neural Networks (ICNN'96). Washington DC; 1996. pp. 862-867
  31. Tanaka T, Nakagawa M. A chaos association model with a time-dependent periodic activation function. IEICE Transactions on Fundamentals. J79-A:1826
  32. Nakagawa M. A super memory retrieval with chaos associative model. Journal of the Physical Society of Japan. 1999;68:2457-2465
  33. Nakagawa M. A chaos associative model with a sinusoidal activation function. Chaos, Solitons and Fractals. 1999;10:1437-1452
  34. Nakagawa M. Chaos and Fractals in Engineering. World Scientific; 1999
  35. Nakagawa M. On the chaos neuron models with Chebyshev type activation functions. Journal of the Physical Society of Japan. 2021;90:014001
  36. Courant R, Hilbert D. Methoden der Mathematischen Physik. Berlin: Springer; 1962
  37. Nakagawa M. On the chaos neurons and invariant measure from a symmetry consideration. IEICE Transactions. 2003;86(2):103-108
  38. Okada T, Nakagawa M. A study of back propagation learning with periodic chaos neurons. Chaos, Solitons and Fractals. 1999;10:77-97
  39. Onozuka Y, Nakagawa M. IEICE Transactions on Fundamentals. 2001;J84-A:33-41
  40. Maeda H, Nakagawa M. A back propagation model with periodic chaos neurons. In: Proc. 1996 International Conference on Neural Information Processing (ICONIP'96). Hong Kong; 1996. pp. 567-571
  41. Nakagawa M. Statistical properties of chaos associative memory. Journal of the Physical Society of Japan. 2002;71:2316-2325
  42. Personnaz L, Guyon I, Dreyfus G. Information storage and retrieval in spin-glass like neural networks. Journal de Physique Lettres. 1985;46:L359-L365
  43. Kanter I, Sompolinsky H. Associative recall of memory without errors. Physical Review A. 1987;35:380-392
  44. Nakagawa M. Chaos and Fractals: Sensibility Information Engineering. Nikkan-Kogyo Shinbun; 2010
