1. Introduction
From the dawn of civilization, mankind has been aware of the importance of language. We live, think, acquire knowledge, and have our being in language. All knowledge of the world and of ourselves is expressed and mediated through language. It is therefore not surprising that in ancient views language (Greek
However, the ancient dimensions faded in time, and it is only recently that modern science has begun to investigate language seriously. A considerable research effort has been, and still is, devoted to it. One direction of investigation is conceptual, and aims to answer the fundamental philosophical questions regarding the nature, origins, and usage of language. Modern philosophy of language follows the same type of speculative inquiry into language by pure
The other direction is mostly empirical in nature (Sampson, 2002), i.e., based on observation and experimentation, and is specific to the modern scientific approach. Initially, the study of language, referred to by the term philology, was concerned mainly with the historical development of languages and the associated literature in its cultural context. Later, the scientific study of language became known as linguistics. Its main theoretical purpose is the construction of a general theory of the structure of language (grammar) and the study of meaning (semantics) (Aronoff & Rees-Miller, 2003; Fromkin, 2000). Linguistics tries to discover the elements common to all languages, or universals, and to devise a predictive scientific theory of them. Applied linguistics involves the application of the theory to practical tasks such as language teaching and learning, linguistic competence, and communication (Cook, 2003; Davies & Elder, 2004).
The early attempts in linguistics were oriented towards an idealistic conception of language. Later, the dominant view became the classical premise of structuralism, namely that language is a formal system of discrete symbolic units and their combinations. In Saussure’s conception, the language system, called
After a period dominated mainly by behaviorist attitudes towards language, modern linguistics became influenced by the view that language is a non-finite but denumerable set that can be defined by an algorithm. Natural languages were considered syntactically rule-governed, and the goal of linguistics became the investigation of these rules. Both statistical and algebraic approaches have been pursued, with much more emphasis on the latter. In Chomsky's approach (Chomsky, 1957), natural language resembles artificial formal languages and is therefore defined by a finite system of generative rules for each language. The most significant accomplishment is transformational grammar, which uses rules to express relationships among the various elements of a sentence and to generate accordingly the grammatical sentences of a language. The purpose of linguistic analysis of a language
Computational linguistics is a precursor of artificial intelligence and originated in the task of using computers to translate texts from other languages (Hutchins & Somers, 1992). Later, the task extended to the study of computer systems for natural language understanding and generation (Mitkov, 2005). Specifically, human-computer interaction became the field of natural language processing (NLP), which has the goal of developing techniques for both speech recognition and synthesis (Huang et al., 2001; Jurafsky & Martin, 2000). An important class of methods for language recognition and generation is based on probabilistic models (Jurafsky & Martin, 2000; Manning & Schütze, 1999), such as n-gram models, hidden Markov models, and maximum entropy models. Given a sequence of units (words, letters, morphemes, sentences, etc.), these models compute a probability distribution over possible labels and choose the best label sequence. Another approach in NLP is to use neural networks, in particular self-organizing maps of symbol strings (Kohonen, 2001; Somervuo, 2003). Still, an important challenge for any NLP approach, which may hinder its success, is the capability of dealing with the dynamic character of the language phenomenon. The main progress achieved so far pertains principally to a concise articulation of language competence rather than to providing a working model of language performance. This is the effect of reducing language to the formal or uttered word, which is viewed only as a symbol carrying information that a computer can store and retrieve. The higher dimensions of language, experienced for instance when the right words are found to express the nuances of a thought, are missed.
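As an illustration of the probabilistic models mentioned above, the following minimal bigram model computes the probability of a word sequence from corpus counts with add-one smoothing. The toy corpus and the smoothing constant are illustrative assumptions, not taken from the cited works:

```python
from collections import Counter

# Toy corpus; in practice the counts come from a large text collection.
corpus = "the cat sat on the mat the cat ran".split()

# Unigram and bigram counts collected from the corpus.
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def bigram_prob(w_prev, w, alpha=1.0):
    """P(w | w_prev) with add-alpha (Laplace) smoothing."""
    vocab = len(unigrams)
    return (bigrams[(w_prev, w)] + alpha) / (unigrams[w_prev] + alpha * vocab)

def sequence_prob(words):
    """Probability of a word sequence as a product of bigram probabilities."""
    p = 1.0
    for w_prev, w in zip(words, words[1:]):
        p *= bigram_prob(w_prev, w)
    return p
```

A sequence in corpus order, such as "the cat sat", receives a higher probability than a scrambled one such as "sat the cat"; this is the sense in which such models "choose the best label sequence".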
Out of dissatisfaction with formal approaches to language, a relatively new trend has manifested in linguistics. This is cognitive linguistics, which starts from the fundamental premise that language reflects patterns of thought (Croft & Cruse, 2004; Evans & Green, 2006; Geeraerts & Cuyckens, 2007). In contrast with the Chomskyan approach, where an autonomous separate module for language acquisition is present in the mind, cognitive linguistics assumes that human linguistic ability is conceptual in nature and no different from other cognitive functions. Knowledge of language, being a cognitive process, emerges from language use. Cognitive semantics, as part of cognitive linguistics, goes beyond the classic theories in semantics, which try to explain meaning in terms of necessary and sufficient conditions and truth-conditions. Meaning is conceptual, i.e., it corresponds to a concept present in the mind and is based on personal understanding. An increasingly influential role in the domain is played by cognitive grammar (Langacker, 2008; Taylor, 2002). According to this view, the mental grammar takes the form of symbolic assemblies, consisting of conventional pairings of form and meaning, without the need for abstract rules that cannot be naturally discerned in language use. The competence versus performance issue is dissolved, as is the principled separation between lexicon and grammar. Grammar is an integral part of cognition, like the other cognitive abilities. A central place in this theory is occupied by conceptual archetypes. These are related to the grammatical components and lexical classes and play a cognitive role in experiential gestalts. It is interesting to note the dynamic dimensions introduced by archetypes such as physical objects, locations, motion of an object through space, events, participants in an event, energy transfer from one participant to another, etc. (Langacker, 2008).
A distinct place in the realm of dynamic view in linguistics and semantics is occupied by the Catastrophe Theory proposed by Thom (Thom, 1983), starting from Tesnière’s fundamental ideas of actant and valency (Tesnière, 1959). There is a correspondence, in this theory, between the evolution of sentence structures, where the verb has the central role, and the morphogenesis of biological forms. Thom identified a set of archetypal morphologies that play a role in the actantial interactions originated by the verb. The morphogenesis of sentence-structures is described in terms of the number of actants involved by the verb and the evolution of interactions in time.
In recent years, an increased scientific awareness has been manifested towards the importance of dealing with the dynamics of phenomena for a deeper understanding in any domain of science. Within cognitive linguistics, the dynamical perspective on linguistic phenomena has been argued by several authors to be a promising alternative to the symbolic paradigm based on logic and algebraic algorithms (Andersen, 2002; Manjali, 1995; Peregrin, 2003; Wildgen, 1986; Wildgen, 2008). Any attempt at explaining natural language remains incomplete unless there is an understanding of its dynamics. Dynamical system theory describes the behavior of complex dynamical systems by employing differential and difference equations. Using the tools offered by this theory, the modeling of the dynamics of linguistic phenomena can be performed more effectively. Further support for the dynamical approach comes from the newly established field of cognitive neurodynamics, which has provided evidence that event-related brain potentials reflect lexical-semantic integration and syntactic processes that can be interpreted in terms of dynamical system theory (Graben et al., 2008; Rabinovich et al., 2006; Vogels et al., 2005).
In the above context, it appears well motivated to explore the possibility of using dynamical systems in modeling language and meaning. Starting from the premise that natural language phenomena can be viewed as a dynamical system, the purpose of this chapter is to investigate the possibility of modeling the meaning of words and sentences by the superposition of chaotic attractors. In the following sections we present the details of this dynamic approach.
2. Meaning and dynamical systems
The essence of language is to be meaningful. The inquiry into the nature of meaning is one of the most profound philosophical quests of the human mind. What is meaning? What does it mean to mean something? How can something meaningful to one person be known by, or transmitted to, someone else? Several approaches have been proposed for a theory of meaning, such as meaning as reference, meaning as truth, meaning as usage, thought and language, a naturalized account of meaning, etc. (Collins, 2001; Greenberg & Harman, 2005; Lycan, 2000; Stainton, 1996). It is not our purpose to analyze these theories here, but some common characteristics can be outlined. All theories of meaning encounter the same difficulty: they try to explain meaning using other meaningful concepts, and for this reason are prone to limitations of one kind or another. In general, since all human knowledge is encompassed within language, in order to explain language we need to use language. However, a way out of this difficulty, one that can also lead to a certain degree of objectivity, is to account for the object language in metalinguistic terms. For instance, in formal semantics approaches, a metadescription is obtained by assigning labels to sentence constituents (the syntactic category
On the other hand, we have to consider also the following problem. In the classical view, the information content of what a sentence means can be generated from information about the meaning of the sentence's constituents and the ways they are related to each other. On this view, natural languages are necessarily compositional. The compositionality constraint has to be satisfied by any theory of meaning for the simple reason that the theory has to show how the meanings of sentences are determined by properties of the simple constituents of the sentences, coupled with the combination or order in which the constituents appear (LePore & Ludwig, 2005). In this context, the proposed dynamic approach tries to answer several fundamental questions (Crisan, 2008; Crisan, 2009b). One is about the formation of meaning. A word is composed of phonemes which are individually uttered by the speaker and individually perceived by the hearer. If the component phonemes of a word are distinct elements in the process of word uttering and perception, how can these distinct elements be cognized as a whole so that the meaning of the word is understood as a resulting composite phoneme-unit? What constitutes the morpheme, or the unit of meaning? Do the individual words or even syllables (letters) have a separate meaning by themselves, or is meaning present only when they are combined together? The individual phonemes do not manifest separate meaning by themselves. Only when they are combined together in a word is the meaning revealed. Similarly, the several words which constitute a sentence can convey the unitary meaning at the sentence level only when combined together. However, the problem is that these phonemes/words are never together at the same time as a whole. They appear in a sequence, one after another. Words containing the same syllables in a different order have a different meaning or no meaning at all.
Yet, the order seems less strict for the words in a sentence when it comes to conveying meaning. In principle, the same words can be used in a rephrased sentence to convey the same meaning. One may argue from a structuralist position that it is the last phoneme/word actually perceived, combined with the memory of the previous phonemes or words, that brings about the meaning as a whole. In other words, the word is nothing more than the phonemes themselves, or the whole results from the sum of its parts. But how is such a combination possible, given that it must obviously also take into account the order of phonemes in a word or of words in a sentence? What would determine the order of phonemes/words in the absence of an underlying dynamic condition which engenders the unitary meaning? Another problem is to account for the role an individual word might have in sentence-meaning. For instance, not all words refer to a specific thing, and a given word may be used in a wide variety of contexts and circumstances. Also, there are causes that may create difficulties, such as the similarity/dissimilarity of word forms (polysemy, homonymy, homophony, etc.). From such considerations, we may expect that only a holistic dynamic concept could encompass the nonlinear phenomena of meaning manifestation from a sentence's constituents. Verbal communication is made possible by the presence of similar dynamic linguistic properties in both the speaker and the hearer. It is the role of the dynamic approach to account for such a unifying principle of meaning generation out of the dynamic contribution of the component elements. Therefore, in the dynamic view, we encounter the principle of gestalt theory that the whole appears greater than the sum of its parts. This is also consistent with the basic principles of cognitive linguistics (Geeraerts & Cuyckens, 2007).
We may start with the assumption that at least one kind of internal state is interrelated with language, or in other words that there is no cognition without the operation of the word. This is not in the sense that we have a thought and then look for a word with which to express it, or that we have an isolated word which we try to associate with a thought. Our approach assumes that the speaker's purpose is to convey a thought structure, and that the speaker therefore uses language to encode that structure, hoping that this code will be understood by the hearer. Understanding is equivalent to the formation of a similar thought in the hearer's mind. Thus, meaning appears to be inseparably tied to such concepts as belief, judgment, desire, intention, knowledge, and understanding. Therefore, the understanding of meaning presupposes the capacity of the receiver to extract and retrieve the thought structure of the transmitter from particular utterances.
Observation suggests that people do not speak in individual words. Linguistic communication is based on a meaning concept taken as a whole at the level of indivisible sentences. Although the individual words or even letters have meaning, the sentence is the complete form of a meaningful thought. An ideal receiver has to have the “capacity” to extract meaning from a sentence. This capacity is what qualifies linguistic competency, and can be described through the cognition of the cognitive properties a sentence has been assigned by the transmitter. It is useful to consider the following semantic-bearing criteria: (1) semantic competency, (2) expectancy, (3) contiguity in space and time, and (4) the transmitter's intention (Matilal, 1985). These cognitive properties are the requirements for defining a grammatical and meaning-bearing sentence. A sentence is said to have semantic competency when the objects denoted by the respective words are compatible with one another. For instance, the sentence "
In defining meaning as something that must have a finite and objective significance, we postulate the concept of the undivided meaning whole (UMW), which exists internally in the mind (the agent's information level or knowledge base) (Crisan, 2006). This is structured information, and may be conceived similarly to the informational structure of an algorithm. A somewhat similar assumption can be found in (Steels & Hanappe, 2006). Even if the UMW is a unitary information structure, it is describable rationally in terms of cognitive semantic units. These semantic units are the generating principle for producing the sequence of uttered words. When an agent wants to communicate, it begins with the UMW existing internally in its mind. A sentence (utterance) is significant or meaningful if it can generate knowledge in an ideal receiver (reader or hearer). This knowledge is the result of a reaction mechanism triggered by the series of words in the sentence. When words are uttered, producing different sounds in sequence, there appears at first to be only differentiation. Ultimately, the sound sequence is perceived as a unity, or UMW, and only then is the word meaning, which is also inherently present in the receiver's mind, identified.
The above-described capacity of the receiver to extract meaning from a series of words leads to another assumption: the whole word/sentence meaning has to be inherently present in the mind of each agent according to a similar dynamic process. Thus, it can be explained how it is possible for the UMW to be grasped by the hearer even before the whole sentence has been uttered. The sounds, which differ from one another because of differences in pronouncement, cause the cognition of the one changeless UMW without determining any change in it. Sometimes, reasoning may have to be applied to the components of the sentence so that the cognition is sufficiently clear to make possible the perception of the meaning-whole. It appears that the unitary word-meaning is an object of each agent's own cognitive perception. When a word such as “tree” is pronounced or read, there is the unitary perception or simultaneous cognition of trunk, branches, leaves, fruits, etc. in the receiver's mind. Communication (verbal or written) between people is only possible because of the existence of the UMW, which is potentially perceivable by all and dynamically revealed by the words' sounds or symbols.
The concept of the UMW is consistent with a more general view, suggested by Bohm, regarding the possibility for wholeness in quantum theory to have an objective significance (Bohm, 1990). This is in contrast with the classical view, which must treat a whole as merely a convenient way of thinking about what is considered to be, in reality, nothing but a collection of independent parts in a mechanical kind of interaction. If wholeness and non-locality are an underlying reality, then all other natural phenomena must, one way or another, be consistent with such a model. Natural language generation and understanding is a phenomenon that might be modeled in such a way. The UMW is like “active information” in Bohm's language, and is the activity of form, rather than of substance. As Bohm puts it clearly (Bohm, 1990), “…when we read a printed page, we do not assimilate the substance of the paper, but only the forms of the letters, and it is these forms which give rise to an information content in the reader which is manifested actively in his or her subsequent activities.” A similar, so-called mind-like quality of matter reveals itself strongly at the quantum level: the form of the wave function manifests itself in the movements of the particles. From here, a new possibility of modeling the mind as a dynamical system arises. In line with Kantian thought, in (Coward, 1980) we find a similar insight regarding linguistic apprehension. This is the interplay of two factors of different levels: (a) the empirical manifold of the separate letters or words and (b) the
Usually, a dynamical system is a smooth action of the real numbers or the integers on a manifold. The manifold is the state space or phase space of the system. Having a continuous function,
The same system can behave either predictably or chaotically, depending on small changes in a single term of the equations that describe the system. Equation (1) can also be viewed as a difference equation (
In our approach, the dynamic continuity can be found in the domain of dynamical systems and chaos theory. The UMW concept and its type of dynamics appear consistent with chaotic attractor modeling. There is a fundamental connection between chaos and information. This view is also supported by other works which demonstrated that a chaotic system can be manipulated to encode a symbolic representation of a desired message (Bollt & Dolnik, 1997; Lai, 2000). Also, in other approaches (Moisl, 2001; Yang, 2003), chaotic attractors are used for coding words and sentences in a process of dynamic interaction.
3. Chaos-based word modeling
In quantum experiments, when particles interact, it is as if they were all connected by indivisible links into a single whole. The same behavior is manifested by the chaotic solutions in an attractor, as we will see in this section. In spite of the apparent random behavior of these phenomena, there is an ordered pattern given by the form of the quantum wave (or potential) in the former case, and by the equations of the dynamic system in the latter.
Let’s consider the simplest case of the quadratic iterated map described by the equation:
Even though it is so simple, this map is nonlinearly stable and can manifest chaotic solutions. Trajectories started from the initial conditions may be drawn to a special type of attractor called a chaotic attractor. This may appear as a complicated geometrical object which gives the form of the dynamic behavior.
In nonlinear dynamics, the problem is to predict whether a given flow will pass through a given region of state space in finite time. One way to decide whether a nonlinear system is stable is to actually simulate the dynamics of the equation. The primary method in the field of nonlinear dynamical systems is simply to vary the coefficients of the nonlinear terms in a nonlinear equation and examine the behavior of the solutions. The initial values of the components of the model vector,
The sum is taken from a value of
Table 1. The coefficient values and the Lyapunov exponents for ten attractors of (2).

No. | a1 | a2 | a3 | LE |
1 | 1.2 | -0.9 | -0.9 | 0.3106 |
2 | 1.1 | -1 | -0.6 | 6.6073 |
3 | 1.1 | -0.7 | -0.9 | 0.1538 |
4 | 0.8 | -1.1 | -1 | 0.2805 |
5 | 0.7 | -1.2 | -0.8 | 0.2001 |
6 | -0.4 | -1.2 | 1.2 | 0.3144 |
7 | -0.7 | -1.1 | 1.2 | 0.3033 |
8 | -0.8 | -1.1 | 0.7 | 6.9382 |
9 | -0.8 | -0.9 | 1.1 | 0.2214 |
10 | -1.2 | -0.9 | 0.8 | 0.2793 |
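The entries in the table can be exercised numerically if one assumes, as a plausible reading of the elided equation (2), the one-dimensional quadratic map x_{n+1} = a1 + a2·x_n + a3·x_n²; the largest Lyapunov exponent is then the orbit average of ln|a2 + 2·a3·x_n|. The map form and the initial condition below are assumptions, since (2) itself is not reproduced here:

```python
import math

def iterate_quadratic(a1, a2, a3, x0=0.05, n=5000, burn=500):
    """Iterate x' = a1 + a2*x + a3*x*x (assumed form of eq. (2)) and
    estimate the Lyapunov exponent as the mean of ln|a2 + 2*a3*x|
    along the orbit, after discarding a transient of `burn` steps."""
    x = x0
    le_sum, count, orbit = 0.0, 0, []
    for i in range(n):
        x = a1 + a2 * x + a3 * x * x
        if abs(x) > 1e6:           # orbit escaped: not a bounded attractor
            return None, orbit
        if i >= burn:
            deriv = abs(a2 + 2.0 * a3 * x)
            if deriv > 0.0:
                le_sum += math.log(deriv)
                count += 1
            orbit.append(x)
    return le_sum / count, orbit

# Row 1 of the table: a bounded orbit with a positive exponent estimate.
le, orbit = iterate_quadratic(1.2, -0.9, -0.9)
```

A positive estimate indicates sensitive dependence on initial conditions; whether the numerical values reproduce the table exactly depends on the actual form of (2).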
It is interesting to analyze in more detail the behavior of a chaotic attractor. The idea of self-organizing maps is to project the
In Fig. 1(b) the same attractor is shown after only a few iterations (2000). We can observe the sparse distribution of dots, which nevertheless lie along the ordered path. This type of behavior is similar to quantum phenomena such as the distribution of photons along the interference pattern lines in the two-slit interference experiment, where the photons are emitted in series, one after the other. This is also akin to the quality of the perception act (understanding word meaning). It is an observed fact that a word's meaning is at first perceived vaguely and then more and more clearly. Thus, through the process of repeated perception, or iterations, the meaning is finally revealed. Therefore, we may suggest that meaning can be mathematically modeled as a basin of attraction.
Another interesting property is the symmetry between
A wide variety of other attractors can be obtained by tuning the values of the coefficients. The shape of the attractor changes smoothly with small variations of the coefficients. Even though the interval of variation is rather small, dramatic changes in the shape of the map can be obtained. In Fig. 2(a), the dynamic behavior of (2) can be observed for
An important change in shape can be obtained for
The above analysis revealed that chaotic attractors offer dynamic properties that can map feature vectors in a continuous manner according to some input patterns. In the process of language communication, the dynamics of each phoneme, as it is uttered, contributes to the dynamics of the entire word. The goal is to construct a unified word feature that may account for the word meaning, or UMW, by encapsulating the phonemes' dynamics into the unitary description of a chaotic attractor.
For a generic word
In the first approach, the quadratic maps
where
Each valid word of length
A second possibility is to use a linear superposition of
where
In order to exemplify our approach, let’s consider the phonemes /
where
The resulting attractor for the word
and is represented in Fig. 4(a). Similarly, the resulting attractor for the word
There is a clear difference between the dynamics of (10) and (11), although common trajectory patterns can be identified in both chaotic attractors. This is in line with our expectations, since both words are composed of the same phonemes. At the same time, the meanings encapsulated by the respective words' dynamics are clearly different.
In the second approach involving linear superposition, according to (5), the dynamics of the word
where
A similar linear superposition can be used for the word
keeping the same superposition parameters as in (12). The dynamics of (13) appears in Fig. 5(b). When compared to (12), a clear global difference can be noticed, together with the identification of common trajectory patterns.
The simulation results for both linear and nonlinear superposition of the phonemes' dynamics support the validity of the dynamic approach in modeling meaning and semantics. The model can also account for the synthetic interplay between the separate linguistic components and the ultimate unitary manifestation of meaning. The key element in this approach is to emphasize the role of individual phonemes in the formation of the composite phoneme-unit at the word level. In this regard, the above linear superposition enfoldment of the phoneme attractors is suggestive, but the enfoldment process can be further refined, as we exemplify below.
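The linear-superposition variant can be sketched as follows, assuming each phoneme contributes a one-dimensional quadratic map of the form a1 + a2·x + a3·x² and that eq. (5) combines the maps with weights summing to one. The coefficient sets (borrowed from rows of Table 1) and the weights are illustrative, not the values used for the words discussed in the text:

```python
def make_phoneme_map(a1, a2, a3):
    """One-dimensional quadratic map standing in for a phoneme attractor."""
    return lambda x: a1 + a2 * x + a3 * x * x

def superpose(maps, weights):
    """Word map as a weighted linear superposition of phoneme maps,
    one plausible reading of eq. (5); weights should sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return lambda x: sum(w * f(x) for w, f in zip(weights, maps))

# Three hypothetical phoneme maps and illustrative weights.
maps = [make_phoneme_map(1.2, -0.9, -0.9),
        make_phoneme_map(1.1, -0.7, -0.9),
        make_phoneme_map(0.8, -1.1, -1.0)]
word_map = superpose(maps, [0.4, 0.3, 0.3])

# Iterate the composite "word" map from an arbitrary initial condition.
x, orbit = 0.05, []
for _ in range(2000):
    x = word_map(x)
    orbit.append(x)
```

Note that a linear superposition of quadratic maps is itself a quadratic map, so the composite "word" inherits the bounded regime of its components for suitable weights.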
The word is not only a linear sum of its phoneme components but a dynamic compound. Therefore, in a series of phonemes uttered one after another during the word's generation, the dynamic influence of one phoneme should be manifested in the behavior of the next one in sequence. Then, the combined influence of these two phonemes is manifested on the next one, and so on. We exemplify this process in the following set of simulations (Crisan, 2009b). We may start from the general form of a two-dimensional nonlinear quadratic system:
If the coefficients are chosen from the approximate interval [−1, +1], the system can exhibit behavior that is stable or bounded, non-degenerate, non-periodic, and deterministically chaotic. This is a rich source of chaotic attractors suitable for modeling the syllable components of words.
Let’s consider a series of three phonemes, say /
Coefficient | /o/ | /r/ | /t/
A00 | −0.375 | −0.164 | 0.723 |
A01 | −0.033 | 0.179 | 0.883 |
A02 | 0.065 | 0.895 | 0.178 |
A10 | 0.519 | −0.377 | −0.907 |
A11 | 0.533 | 0.442 | −0.419 |
A12 | −0.51 | 0.106 | −0.448 |
A20 | 0.255 | −0.625 | −0.044 |
A21 | −0.822 | 0.914 | −0.08 |
A22 | 0.376 | −0.117 | 0.124 |
B00 | 0.011 | −0.663 | 0.34 |
B01 | 0.032 | 0.525 | −0.169 |
B02 | −0.683 | 0.43 | −0.931 |
B10 | −0.952 | −0.075 | −0.145 |
B11 | 0.229 | 0.942 | −0.876 |
B12 | 0.182 | 0.011 | −0.941 |
B20 | −0.046 | 0.56 | 0.152 |
B21 | −0.624 | −0.728 | −0.198 |
B22 | 0.032 | −0.81 | −0.812 |
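The coefficients in the table can be fed into a small simulation, under the assumption that eq. (14) has the full bi-quadratic form x' = Σ A_ij x^i y^j, y' = Σ B_ij x^i y^j with i, j ∈ {0, 1, 2}; since the equation itself is not reproduced here, this reading and the initial condition are assumptions:

```python
# Coefficients of the phoneme /o/ from the table above, A[i][j] ~ Aij.
A_o = [[-0.375, -0.033,  0.065],
       [ 0.519,  0.533, -0.510],
       [ 0.255, -0.822,  0.376]]
B_o = [[ 0.011,  0.032, -0.683],
       [-0.952,  0.229,  0.182],
       [-0.046, -0.624,  0.032]]

def step(A, B, x, y):
    """One iteration, assuming eq. (14) means
    x' = sum over i,j of A[i][j] * x**i * y**j (likewise for y')."""
    px = (1.0, x, x * x)   # powers of x
    py = (1.0, y, y * y)   # powers of y
    xn = sum(A[i][j] * px[i] * py[j] for i in range(3) for j in range(3))
    yn = sum(B[i][j] * px[i] * py[j] for i in range(3) for j in range(3))
    return xn, yn

x, y, traj = 0.05, 0.05, []
for _ in range(1000):
    x, y = step(A_o, B_o, x, y)
    if max(abs(x), abs(y)) > 1e6:   # stop if the orbit escapes
        break
    traj.append((x, y))
```

Whether the orbit settles on a chaotic attractor, a periodic cycle, or escapes depends on the actual polynomial form of (14), so the escape guard is essential.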
Let’s consider next the generation process of the syllable or word
Therefore, resuming the above example, the phoneme /
The phoneme /
where
Next, it is interesting to study comparatively the dynamics of the syllable
The correspondent dynamics can be seen in Fig. 7(b) for
The last phoneme in a generated word is dominant because the meaning is revealed only after the last phoneme is uttered. We can observe this effect by considering the influence of the phoneme /t/ upon the previous syllable
where [
The combined influence of the three attractors can be clearly observed in an interesting pattern. Also, following a similar process, it is interesting to observe the formation of the phoneme-unit
The described nonlinear model of attractor enfoldment is stable and preserves rather well the chaotic behavior of the components. The enfoldment of the chaotic trajectories of one phoneme into another is clearly demonstrated. This proves more refined in modeling the phoneme-unit than the linear superposition used in (5). However, both methods provide a resultant chaotic attractor with a clearly distinct pattern for the entire word.
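A minimal sketch of this sequential enfoldment, read simply as "each phoneme map starts from the state left by the previous one": the three two-dimensional maps below are Hénon-family stand-ins, not the /o/, /r/, /t/ attractors of the text, whose exact enfoldment equations are not reproduced here:

```python
def henon(a, b):
    """Hénon-family 2-D quadratic map used as a stand-in phoneme attractor."""
    return lambda x, y: (1.0 - a * x * x + y, b * x)

def enfold(maps, n_per_phoneme=500, x0=0.05, y0=0.05):
    """Iterate each phoneme map starting from the final state of the
    previous one, so earlier dynamics conditions the later dynamics."""
    x, y, trajectory = x0, y0, []
    for f in maps:
        for _ in range(n_per_phoneme):
            x, y = f(x, y)
            if max(abs(x), abs(y)) > 1e6:   # guard against escape
                return trajectory
            trajectory.append((x, y))
    return trajectory

# Hypothetical "phonemes": three members of the Hénon family.
trajectory = enfold([henon(1.4, 0.3), henon(1.2, 0.3), henon(1.0, 0.3)])
```

Each segment of the trajectory carries the imprint of the state inherited from the previous phoneme, which is the qualitative effect the enfoldment process aims at.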
4. Sentence as the semantic unit of language
We start by considering, along with other theories of language, that the sentence (utterance) is the semantic unit of language. Although the individual words may have meaning, this meaning is not complete. Only at the sentence level is a complete unitary meaning revealed. Communication is done with sentences. The individual words have only a conducive role in the formation of the sentence meaning as a unity. The dynamics of the component words of a sentence can be modeled successfully by chaotic attractors. The dynamics of the entire sentence results according to an informational structure which defines the coupling process of the words' attractors and the contribution of each word's dynamics to the ensemble.
The role of the dynamics of each component word can be analyzed by considering its relations and position in a sentence. In a general formalization, we consider a word to be any phoneme-sequence possessing the property of inflection. Normally, each word takes either a verbal, i.e., conjugational, inflection, in which case it is called a verb, or a nominal, i.e., declensional, inflection, in which case it is of a non-verbal category (substantives, adjectives, participles, etc.). All the other words which do not have declensional inflections, such as prepositions, may be considered to possess invariant inflection. However, classifying words only in terms of their inflection property is incomplete and does not seem to help much in explaining how meaning, as structured information, is conveyed by a sentence. We have to take into account the role of the specific dynamics of each component word in a sentence in relation to the other words. Therefore, we suggest the employment of dynamic criteria in defining the notion of a word. The semantic criterion determines the minimum sequence length of phonemes which conveys a meaning. Thus, words may vary in complexity, from the shortest meaning-bearing ones to the more complex compound words. Based on meaningful words, we may define, in general terms, a sentence as a cluster of words capable of generating a cognitive meaning in a competent receiver (reader or hearer). This concept is further articulated by specifying the dynamic influence and interdependence of words. Here we emphasize the importance of the verb's function in each sentence. Containing a verb is a necessary condition for being a sentence. The verb's role as the organizing centre that distributes the actantial places was discussed in the earlier works of Tesnière and Thom (Tesnière, 1959; Thom, 1983). In the present approach we extend this concept by emphasizing the role of the verb's dynamics in forming the sentence-meaning (Crisan, 2009c).
At the sentence level, a process of attractor enfoldment similar to that used for words can account for the formation of the UMW. In order to model such a process we need a metalinguistic description. One possibility is to apply the attractor enfoldment process in conjunction with the differentiated cognition model (Crisan, 2006; Matilal, 1985). According to this concept, we describe cognition as knowing something as something else. In other words, we may know an object by its property of being known. If a certain object
the meaning can be described in the following terms of cognized objects and properties:
where
where
The verb is modeled by a chaotic attractor that provides a suitable dynamics for making a natural connection to the other words in the sentence. The verb in the head position attracts the other words to it. The two C’ nodes in Fig. 9 represent the slots of attraction in the manifold. One slot attracts the nominal element (
If the verb’s dynamics obeys a potential-field behavior, as mentioned above, the best description is through differential equations. We consider differential equations of the following general form:
This can provide a very rich variety of dynamics according to the values of the coefficients and may suitably model different verb constructs. The feature vector of the verb’s component phonemes can be mapped to the twelve coefficients of (22). A remark should be made at this stage. Naturally, in order to model the dynamics of verbs, a similar process of word construction out of component phonemes, as in the previous section, should be considered. However, for the sake of simplicity, in our example, the verb
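Since (22) is given only in general form, a minimal numerical sketch may clarify the construction. The fragment below integrates an assumed three-variable quadratic flow with twelve coefficients a1, ..., a12; both the particular arrangement of quadratic terms and the illustrative coefficient values (which recover the Lorenz system as a special case) are our assumptions, not the chapter's actual eq. (22):

```python
def flow(s, a):
    """Three-variable quadratic flow with twelve coefficients (an assumed
    general form; eq. (22) itself is not reproduced in this excerpt):
      dx/dt = a1*x + a2*y  + a3*z  + a4*y*z
      dy/dt = a5*x + a6*y  + a7*z  + a8*x*z
      dz/dt = a9*x + a10*y + a11*z + a12*x*y"""
    x, y, z = s
    return (a[0]*x + a[1]*y + a[2]*z + a[3]*y*z,
            a[4]*x + a[5]*y + a[6]*z + a[7]*x*z,
            a[8]*x + a[9]*y + a[10]*z + a[11]*x*y)

def rk4(a, s, dt=0.01, steps=5000):
    """Fourth-order Runge-Kutta integration; returns the trajectory."""
    traj = [s]
    for _ in range(steps):
        k1 = flow(s, a)
        k2 = flow(tuple(si + 0.5*dt*ki for si, ki in zip(s, k1)), a)
        k3 = flow(tuple(si + 0.5*dt*ki for si, ki in zip(s, k2)), a)
        k4 = flow(tuple(si + dt*ki for si, ki in zip(s, k3)), a)
        s = tuple(si + (dt/6.0)*(k1i + 2*k2i + 2*k3i + k4i)
                  for si, k1i, k2i, k3i, k4i in zip(s, k1, k2, k3, k4))
        traj.append(s)
    return traj

# Illustrative coefficient choice (the Lorenz system arises as a special
# case); the verb's phoneme feature vector would be mapped onto a1..a12.
a = [-10.0, 10.0, 0.0, 0.0,
      28.0, -1.0, 0.0, -1.0,
       0.0,  0.0, -8.0/3.0, 1.0]
traj = rk4(a, (1.0, 1.0, 1.0))
```

Mapping a verb's feature vector onto a1..a12 then amounts to selecting one flow, and hence one attractor, from this twelve-parameter family.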
Next, we suggest that the dynamics of the nominal element can be modeled by a sine-map. This type of map can naturally adapt to the type of dynamics implied by the substance-like nominal element. In the present approach we investigate the following form of sine-map (Okuda & Tsuda, 1994):
where the parameters
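As a sketch of such nominal-element dynamics (the exact Okuda-Tsuda form of (23) is not reproduced in this excerpt), the fragment below iterates an illustrative stand-in sine-map, x_{n+1} = alpha*sin(pi*x_n) + beta, where alpha and beta merely play the role of the parameters of (23):

```python
import math

def sine_map(x, alpha, beta):
    """Illustrative sine-map form: x_{n+1} = alpha*sin(pi*x_n) + beta.
    A stand-in for eq. (23); alpha, beta play the role of its parameters."""
    return alpha * math.sin(math.pi * x) + beta

def iterate(x0, alpha, beta, n):
    """Iterate the map n times, returning the full orbit."""
    orbit = [x0]
    for _ in range(n):
        orbit.append(sine_map(orbit[-1], alpha, beta))
    return orbit

# Illustrative parameter values; the nominal element's feature vector
# would be mapped onto these parameters.
orbit = iterate(0.3, alpha=0.95, beta=0.0, n=200)
```

A useful property for modeling the substance-like nominal element is that the orbit remains confined: with beta = 0, every iterate after the first lies within [-alpha, alpha].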
According to the DCP description, the verb is the central dynamic element of the entire sentence, coupling both the nominal element and the verb’s object. When the first word in the sentence is uttered, it is attracted, according to its type of dynamics, to the corresponding slot in relation with the verb. Even if the verb is missing, the receiver tends to supply it in order to complete the sentence meaning. In other words, whole ideas are verbalized. For instance, after the substantive noun
where
The verb’s object can be modeled by another type of dynamics such as a two-dimensional quadratic map of the following form:
where the eight coefficients can be adjusted to map the word’s feature vector. In our example, for the simulation of the verb’s object
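The precise arrangement of the eight coefficients in (25) is not shown in this excerpt; the sketch below assumes one plausible distribution of a1..a8 over the two update equations (the Hénon map arises as an illustrative special case of this form):

```python
def quad_map(x, y, a):
    """Two-dimensional quadratic map with eight coefficients (an assumed
    form; eq. (25) itself is not reproduced in this excerpt):
      x_{n+1} = a1 + a2*x + a3*x^2 + a4*y
      y_{n+1} = a5 + a6*y + a7*y^2 + a8*x"""
    return (a[0] + a[1]*x + a[2]*x*x + a[3]*y,
            a[4] + a[5]*y + a[6]*y*y + a[7]*x)

def orbit(x0, y0, a, n):
    """Iterate the map n times from (x0, y0), returning the orbit."""
    pts = [(x0, y0)]
    for _ in range(n):
        pts.append(quad_map(*pts[-1], a))
    return pts

# Illustrative coefficients (the Henon map as a special case); the object
# word's eight-component feature vector would be mapped onto a1..a8.
a = [1.0, 0.0, -1.4, 1.0, 0.0, 0.0, 0.0, 0.3]
pts = orbit(0.1, 0.1, a, 1000)
```

Adjusting a1..a8 then selects the particular bounded (possibly chaotic) attractor that encodes the object word.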
In the DCP description, the verb dynamics naturally attracts an object. For instance, when a verb is uttered first, the receiver naturally expects a verb object to follow. This process can be modeled by coupling the systems (22) and (25) as follows:
where
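One way such a flow-to-map coupling can be realized numerically is sketched below: the continuous verb flow is sampled once per map iteration and a weak bounded coupling term enters the object map's update. The sampling scheme, the tanh-bounded coupling term, and the particular illustrative flow (Lorenz) and map (Hénon) are our assumptions for the sketch, not the chapter's eq. (26):

```python
import math

def verb_flow_step(s, dt=0.01, substeps=20):
    """Advance an illustrative chaotic verb flow (Lorenz, as a stand-in
    for (22)) by one sampling interval using Euler integration."""
    x, y, z = s
    for _ in range(substeps):
        dx, dy, dz = 10.0*(y - x), x*(28.0 - z) - y, x*y - 8.0/3.0*z
        x, y, z = x + dt*dx, y + dt*dy, z + dt*dz
    return (x, y, z)

def coupled_orbit(n, eps=0.01):
    """Drive the discrete object map with the sampled verb flow; eps is
    an assumed coupling strength, kept small so the map stays bounded."""
    s = (1.0, 1.0, 1.0)   # verb (flow) state
    u, v = 0.1, 0.1       # object (map) state, Henon form for the sketch
    out = []
    for _ in range(n):
        s = verb_flow_step(s)
        # Bounded unidirectional coupling: eps*tanh(x_verb) perturbs
        # the map's x-update by at most eps.
        u, v = 1.0 - 1.4*u*u + v + eps*math.tanh(s[0]), 0.3*u
        out.append((u, v))
    return out

pts = coupled_orbit(500)
```

The choice of a bounded coupling term reflects the DCP picture that the verb modulates, rather than overwhelms, the object word's own dynamics.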
Finally, according to the DCP semantic description (21) we can couple (24) with (26) and obtain the dynamics for the entire sentence (19). The dynamic behavior can be seen in Fig. 12(b) for
5. Conclusion
Our purpose was to study the possibility of using dynamical systems in modeling natural language. Of particular interest in any model of meaning, and semantics in general, is to account for the interplay between the empirical manifold of individual phonemes or separate words and the unitary character of meaning as a whole. We started from the premise of UMW and the observed facts of language apprehension, and noted a similarity to the chaotic behavior of dynamical systems. The attractor behavior as studied for a series of iterated maps seems to be robust enough to accept feature vectors for phonemes that may compose any word of length
References
- 1. Andersen, P. B. (2002). Dynamic semiotics. Semiotica, 139(1/4), 161–210. ISSN 0037-1998.
- 2. Aronoff, M. & Rees-Miller, J. (2003). The Handbook of Linguistics. Blackwell Publishing, Malden. ISBN 1-40510-252-7.
- 3. Bohm, D. (1990). A new theory of the relationship of mind and matter. Philosophical Psychology, 3(2), 271–286. ISSN 0951-5089.
- 4. Bollt, E. M. & Dolnik, M. (1997). Encoding information in chemical chaos by controlling symbolic dynamics. Physical Review E, 55(6), June 1997, 6404–6413. ISSN 1539-3755.
- 5. Bozsak, E. et al. (2002). KAON: towards a large scale semantic web. LNCS, Springer, 304–313.
- 6. Chomsky, N. (1957). Syntactic Structures. Mouton & Co., The Hague. ISBN 3-11017-279-8.
- 7. Collins, J. (2001). Truth Conditions Without Interpretation. 13, 52–71. ISSN 1135-1349.
- 8. Cook, G. (2003). Applied Linguistics. Oxford University Press, Oxford. ISBN 978-0-19437-598-6.
- 9. Coward, H. G. (1980). The Sphota Theory of Language: A Philosophical Analysis. Motilal Banarsidass. ISBN 8-12080-181-4.
- 10. Crisan, M. (2006a). Meaning as Cognition. 369–373, Badajoz, Spain, Oct. 2006, Open Institute of Knowledge. ISBN 8461131053.
- 11. Crisan, M. (2006b). Information Machine and the Gödelian Case. 51(65), 4, 45–50. ISSN 1224-600X.
- 12. Crisan, M. (2008). Chaos-Based Meaning Modeling. Proceedings NCM 2008, Vol. 2, 314–319, Gyeongju, Korea, Sep. 2008, IEEE CS, CPS, Los Alamitos, California. ISBN 978-0-76953-322-3.
- 13. Crisan, M. (2009a). Dynamic Modeling of Natural Language. 54(68), 1, 39–44. ISSN 1224-600X.
- 14. Crisan, M. (2009b). Upon Dynamic Natural Language Processing. 487–492, Beijing, China, July 2009, IEEE CS, CPS, Los Alamitos, CA. ISBN 9780769536873.
- 15. Crisan, M. (2009c). Upon the Dynamic Modeling of Sentence Meaning. Proceedings NCM 2009, Seoul, Korea, August 2009, IEEE CS (in press).
- 16. Croft, W. & Cruse, D. A. (2004). Cognitive Linguistics (Cambridge Textbooks in Linguistics). Cambridge University Press, Cambridge/New York. ISBN 0-52166-770-4.
- 17. Davies, A. & Elder, C. (2004). The Handbook of Applied Linguistics. Blackwell Publishing, Oxford/Malden. ISBN 0-63122-899-3.
- 18. Evans, V. & Green, M. (2006). Cognitive Linguistics: An Introduction. Edinburgh University Press, Edinburgh. ISBN 0-74861-832-5.
- 19. Fromkin, V. A. et al. (2000). Linguistics: An Introduction to Linguistic Theory. Blackwell, Oxford/Malden. ISBN 0-63119-711-7.
- 20. Geeraerts, D. & Cuyckens, H. (2007). The Oxford Handbook of Cognitive Linguistics. Oxford University Press, Oxford/New York. ISBN 0195143787.
- 21. beim Graben, P., Gerth, S. & Vasishth, S. (2008). Towards dynamical system models of language-related brain potentials. Cognitive Neurodynamics, 2(3), Sept. 2008, 229–255. ISSN 1871-4080.
- 22. Greenberg, M. & Harman, G. (2005). Conceptual Role Semantics. In: Ernie Lepore & Barry Smith (eds.), The Oxford Handbook of Philosophy of Language, Oxford University Press.
- 23. Heim, I. & Kratzer, A. (1998). Semantics in Generative Grammar. Blackwell, Oxford. ISBN 0-63119-713-3.
- 24. Huang, X., Acero, A. & Hon, H. W. (2001). Spoken Language Processing: A Guide to Theory, Algorithm and System Development. Prentice-Hall, Upper Saddle River, NJ. ISBN 0-13022-616-5.
- 25. Hutchins, W. J. & Somers, H. L. (1992). An Introduction to Machine Translation. Academic Press, London. ISBN 012362830X.
- 26. Jurafsky, D. & Martin, J. H. (2000). Speech and Language Processing: An Introduction to Natural Language Processing, Computational Linguistics, and Speech Recognition. Prentice-Hall, Upper Saddle River, NJ. ISBN 0-13095-069-6.
- 27. Kohonen, T. (2001). Self-Organizing Maps (3rd extended ed.). Springer-Verlag, Berlin. ISBN 3-54067-921-9.
- 28. Lai, Y.-C. (2000). Encoding Digital Information Using Transient Chaos. International Journal of Bifurcation and Chaos, 10(4), 787–795. ISSN 0218-1274.
- 29. Langacker, R. (2008). Cognitive Grammar: A Basic Introduction. Oxford University Press, New York. ISBN 0-19533-196-7.
- 30. Lycan, W. G. (2000). Philosophy of Language: A Contemporary Introduction. Routledge, London. ISBN 0-41517-116-4.
- 31. Malmkjaer, K. (2009). The Routledge Linguistics Encyclopedia (3rd edition). Routledge, London. ISBN 0415424321.
- 32. Manjali, F. (1995). Dynamic Semiotics or the Case for Actantial Case. 6–7, December 1995, 85–97. ISSN 1160-9907.
- 33. Manning, C. D. & Schütze, H. (1999). Foundations of Statistical Natural Language Processing. MIT Press. ISBN 0-26213-360-1.
- 34. Matilal, B. K. (1985). Logic, Language and Reality. Motilal Banarsidass, New Delhi. ISBN 8-12080-717-0.
- 35. Mel’čuk, I. (2004). Actants in semantics and syntax. I: Actants in semantics. Linguistics, 42(1), 1–66. ISSN 0024-3949.
- 36. Mitkov, R. (2005). The Oxford Handbook of Computational Linguistics. Oxford University Press, Oxford/New York. ISBN 019927634X.
- 37. Modrak, D. K. W. (2001). Aristotle's Theory of Language and Meaning. Cambridge University Press, Cambridge/New York. ISBN 0-52177-266-4.
- 38. Moisl, H. L. (2001). Linguistic Computation with State Space Trajectories. In: S. Wermter, J. Austin & D. Willshaw (eds.), Emergent Neural Computational Architectures Based on Neuroscience, 442–460, Springer, London. ISBN 3-540-42363-X.
- 39. Morris, M. (2007). An Introduction to the Philosophy of Language. Cambridge University Press, Cambridge/New York. ISBN 978-0-521-84215-0.
- 40. Nirenburg, S. & Raskin, V. (2004). Ontological Semantics. The MIT Press. ISBN 0-262-14086-1.
- 41. Okuda, H. & Tsuda, I. (1994). A coupled chaotic system with different time scales: Possible implications of observations by dynamical systems. International Journal of Bifurcation and Chaos, 4(4), 1011–1022. ISSN 0218-1274.
- 42. Peregrin, J. (2003). Meaning: The Dynamic Turn. Current Research in the Semantics/Pragmatics Interface. Elsevier, London. ISBN 0-08044-187-4.
- 43. LePore, E. & Ludwig, K. (2005). Donald Davidson: Meaning, Truth, Language, and Reality. Oxford University Press, Oxford/New York. ISBN 0199251347.
- 44. Rabinovich, M. et al. (2006). Dynamical principles in neuroscience. Reviews of Modern Physics, 78(4), 1213–1265. ISSN 0034-6861.
- 45. Sampson, G. (2002). Empirical Linguistics. Continuum International, London/New York. ISBN 0-82644-883-6.
- 46. Saussure, F. de (2006). Writings in General Linguistics. Oxford University Press, Oxford. ISBN 019926144X.
- 47. Somervuo, P. (2003). Speech Dimensionality Analysis on Hypercubical Self-Organizing Maps. Neural Processing Letters, 17(2), April 2003, 125–136. ISSN 1370-4621.
- 48. Sprott, J. C. (2003). Chaos and Time-Series Analysis. Oxford University Press, Oxford/New York. ISBN 0-19850-840-9.
- 49. Stainton, R. J. (1996). Philosophical Perspectives on Language. Broadview Press, Peterborough, Ont.
- 50. Steels, L. & Hanappe, P. (2006). Interoperability through Emergent Semantics: A Semiotic Dynamics Approach. 143–167, Springer, Berlin/Heidelberg. ISBN 978-3-54036-712-3.
- 51. Taylor, J. R. (2002). Cognitive Grammar. Oxford University Press, Oxford/New York. ISBN 0-19870-033-4.
- 52. Tesnière, L. (1959). Éléments de syntaxe structurale. Klincksieck, Paris. ISBN 2-25201-861-5.
- 53. Thom, R. (1983). Mathematical Models of Morphogenesis. Ellis Horwood / Halsted Press, Chichester/New York. ISBN 0470274999.
- 54. Vogels, T. P., Rajan, K. & Abbott, L. F. (2005). Neural Network Dynamics. Annual Review of Neuroscience, 28, July 2005, 357–376. ISSN 0147-006X.
- 55. Wildgen, W. (1986). Processual Semantics of the Verb. Journal of Semantics, 5(4), 321–344. ISSN 0167-5133.
- 56. Wildgen, W. (2008). The “dynamic turn” in cognitive linguistics. Studies in Variation, Contacts and Change in English, Vol. 3: Approaches to Language and Cognition. ISSN 1797-4453.
- 57. Yang, T. (2003). Dynamics of vocabulary evolution. 1(1), March 2003, 1–19. ISSN 1542-8060.