Shared Neural Correlates for Speech and Gesture

Written By

Meghan L. Healey and Allen R. Braun

Submitted: 29 June 2012 Published: 19 June 2013

DOI: 10.5772/56493

From the Edited Volume

Functional Brain Mapping and the Endeavor to Understand the Working Brain

Edited by Francesco Signorelli and Domenico Chirchiglia

1. Introduction

Humans are inherently social creatures: we spend a remarkable portion of our waking hours communicating with one another. We share our thoughts, goals, and desires, tell stories about what happened at lunch and make plans for the weekend. Although messages can be written, signed, or typed, the majority of this communication occurs through spoken language and face-to-face dialogue. These interactions demand that message recipients attend not only to words and sentences, but also to numerous nonverbal cues that include body language, facial expressions, and gestures, among others.

Hand gestures have been the focus of a substantial body of research in recent decades. While the body as a whole can be used to signify general emotional state, hand gestures tend to represent more precise semantic content. These spontaneous movements can be used independently or in conjunction with speech. For example, a “thumbs up” sign in the absence of any speech may indicate “I’m okay” after a bad fall, while wiggling the index and middle fingers while saying “I went to the store earlier” may indicate that the subject walked rather than drove. These and other examples suggest that gestures convey semantic and/or pragmatic information in much the same way that speech does. In light of this, some researchers have suggested that gesture, which our closest primate relatives still rely upon for communication, may constitute the evolutionary basis of spoken language [1]. The following chapter will offer a comprehensive look at this intimate relationship between gesture and language, as well as a critique of the so-called “gestural origins theory.” More specifically, we will address the following questions: (1) Are gesture and speech fundamentally linked, representing two parts of a single system that underlies human communication? (2) Did language initially emerge as a purely manual system?

2. Overview of gesture

While we may fail to recognize it, we use gestures constantly to convey and extract meaning. The variety of gestures we use on a daily basis also goes somewhat unnoticed. Some gestures are idiosyncratic, while others are more conventionalized. Some require the co-presence of speech to be interpretable, while others can stand alone. Although researchers have begun to focus on the characteristics of different gesture types, the field still lacks a consistent nomenclature system. Types of gestures overlap, sub-groups are combined, and definitions vary slightly, all depending on who is doing the labeling. Of course, this makes it difficult to formally conceptualize the nature of gestural communication and to compare findings across studies conducted by different research groups. Figure 1 below illustrates the wide range of gestures that have been individually defined.

Efforts have been made to develop a more systematic method for categorizing gesture types. The simplest of these schemes may be the one McNeill [2] termed “Kendon’s continuum.” According to this scheme, hand movements progress in the following linear sequence:

gesticulations → speech-framed gestures → pantomimes → emblems → sign languages

Moving from left to right along the continuum, the necessity for concurrent speech disappears and the presence of language-like properties increases. At the left extreme of the spectrum, gesticulations are defined as spontaneous and idiosyncratic movements of the hands and arms that rarely occur independently of speech (in fact, these gestures are temporally synchronized with the speech they accompany ninety percent of the time). Within this category, McNeill distinguishes between iconics, metaphorics, deictics, and beats. He explains that each gesture type performs a different function within discourse: iconic gestures refer to concrete events or features of a scene, metaphoric gestures to abstract concepts or relationships, deictic gestures to locations and orientations, and beat gestures to thematic highlights (see [2] for more information). The majority of research, including the next sections of this chapter, focuses on these subcategories of gesticulations. See Figure 1 below for definitions and examples of speech-framed gestures, pantomimes, emblems, and sign languages.

Regardless of type, gesture production can be divided into three stages: preparation, stroke, and retraction. The stroke of the gesture contains the content of the message. Gestures are generally performed in front of the body; McNeill writes that “the gesture space can be visualized as a shallow disk in front of the speaker, the bottom half flattened when the speaker is seated” ([2], p. 86).

Gesticulations. Definition: Spontaneous and idiosyncratic movements of the hands and arms; rarely occur in the absence of speech and require speech for full comprehension. Example: any iconic, metaphoric, deictic, or beat gesture.
Iconic Gestures. Definition: Visually represent the co-expressive speech content. Example: while describing a car accident, the hands form a T-shape, representing how the two cars collided.
Metaphoric Gestures. Definition: Represent abstract concepts or relationships. Example: using the hands to form a spherical shape, representing the idea of “wholeness.”
Deictic Gestures. Definition: Also known as pointing gestures; locate objects and actions in space; can be concrete or abstract. Example: the classic deictic gesture is an extended index finger.
Beat Gestures. Definition: Also known as “baton” gestures; provide temporal highlighting to speech and signal that the speaker feels part of the message is particularly important. Example: generally a rhythmic waving of the hands or arms.
Speech-framed Gestures. Definition: Fill a grammatical slot in a sentence; do not overlap with speech, but require speech to set up the context. Example: “The ball went [gesture indicates the ball bounced up and down repeatedly].”
Pantomimes. Definition: The hands are used to imitate objects or actions; speech is not obligatory, and multiple gestures can be combined to demonstrate a sequence. (The term transitive gesture is also used for pantomimes imitating the use of everyday tools; intransitive gestures, by contrast, do not involve tools.) Example: the hands assume the shape of a camera and the index finger moves downward, imitating taking a photograph.
Emblems. Definition: Arbitrary but conventionalized representations of linguistic meaning; can function independently of speech and are culturally specific. (Instrumental gestures, i.e. gestures intended to influence the behavior of another, e.g. “come here,” generally fall into this category.) Examples: a thumbs-up sign means “I’m okay” or “Everything is good”; one finger to the lips means “be quiet.”
Instrumental Gestures. Definition: Meant to influence or direct the behavior of another; generally these can also be classified as emblems. Example: the “come here” sign, with one finger extending and then hooking back toward the speaker.
Expressive Gestures. Definition: Express inner feeling states; may also be classified as emblems. Example: hands turned up and to the sides to indicate “I don’t know.”
Sign Languages. Definition: Full-fledged language systems with syntactic structure and a community of users. Examples: American Sign Language, Nicaraguan Sign Language.

Figure 1.

Names, definitions, and examples of commonly referenced gesture types.

3. Competing theories

While there is a general consensus that gestures are used to communicate, the exact nature of the relationship between gesture and speech is still a matter of some controversy. David McNeill [2] was the first to propose that, at their core, gesture and speech reflect the same cognitive process: only the modality of expression differs. Others, such as Robert Krauss [3], take an alternate view, arguing that gesture and speech are separate and independent systems, only loosely related. According to this second camp, gesture is merely used as an auxiliary support when speech processing is unusually difficult.

Evidence is accumulating in favor of the first proposal that gesture and speech are intimately connected and combine to form a single system of meaning. While they are undoubtedly used to bolster communication under adverse conditions (e.g. loud environments), gestures are used far more widely than this hypothesis would suggest. Instead, McNeill explains that gestures are able to convey ideas that cannot always be captured with conventional spoken language (e.g. information about spatial relationships). While speech is highly structured and arbitrary, gesture provides information in a more holistic and imagistic fashion [4]. Gesture and speech serve distinct but complementary functions in this regard: a speaker’s message cannot always be expressed, or understood, in its entirety without this composite signal. The movement of the hands is not just a “bonus” feature; it is fundamental to successful transmission of the message.

There are several lines of evidence that support McNeill’s claim of an intimate relationship between speech and gesture: 1) gesture and speech are temporally synchronized, 2) speech and gesture co-develop in children, 3) there is a correlation between handedness and the cerebral lateralization of language, 4) people readily incorporate gestural information into the retelling of speech-only content, and 5) the use of gesture does not disappear when people are physically removed from their audience [5-19]. Each of these arguments will be explored in more detail below.

4. Temporal synchronization of speech and gesture

When we produce gestures, we instinctively produce them so they overlap with their co-expressive speech. Consider an example cited by McNeill [2]: while describing a scene from a comic in which a character bends a tree towards the ground, the speaker grips an imaginary branch and pulls it inwards and down (from the upper gesture space to the body). The gesture stroke concludes as the subject finishes the utterance “he grabs a big oak tree and he bends it way back” [2, p.25]. Here, the gesture and speech are carefully synchronized so the hand movement can be linked to the content it both depends on and elaborates upon. The gesture stroke generally precedes speech onset within a restricted time window, and it is rarely, if ever, initiated after the speech it is meant to represent or supplement.
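To make this temporal criterion concrete, synchrony of this kind can be quantified directly from timestamped annotations of stroke and speech onsets. The Python sketch below is purely illustrative: the timestamps, function names, and the tolerance window are hypothetical placeholders rather than values taken from the studies discussed here.

    # Illustrative only: quantify gesture-speech asynchrony from annotated onsets.
    # Timestamps and the 275 ms tolerance window are hypothetical examples.

    stroke_onsets = [1.20, 3.85, 7.10]   # gesture stroke onsets, in seconds
    speech_onsets = [1.32, 3.90, 7.05]   # onsets of the co-expressive words, in seconds

    def asynchronies(strokes, words):
        """Stroke-minus-speech onset differences (negative values mean the stroke leads)."""
        return [s - w for s, w in zip(strokes, words)]

    def proportion_synchronized(strokes, words, window=0.275):
        """Proportion of strokes beginning within +/- window seconds of speech onset."""
        diffs = asynchronies(strokes, words)
        return sum(abs(d) <= window for d in diffs) / len(diffs)

    print(asynchronies(stroke_onsets, speech_onsets))
    print(proportion_synchronized(stroke_onsets, speech_onsets))

In annotated narration corpora, a summary statistic of this sort is what underlies claims such as strokes preceding or coinciding with their co-expressive speech in the large majority of cases.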

Several researchers have examined the sensitive nature of the temporal relationship between speech and gesture. For example, Rauscher, Krauss, and Chen [5] manipulated participants’ ability to gesture while they described a cartoon to a listener. In those conditions where hand movement was restricted, subjects spoke less fluently and produced more unfilled pauses. Based on these findings, the authors argue that gestures facilitate the speech production process itself (in particular, access to the mental lexicon), rather than serving as a backup mechanism for communication once speech has failed.

Mayberry and Jaques [6] reach a similar conclusion in their work on persons who stutter. When these individuals narrate cartoons, gestures are only produced alongside fluent speech. In cases where a gesture has been initiated prior to a stuttering event, the gesture stroke is frozen until speech resumes and the two can continue to co-occur. Again, the results directly contradict the independent systems theory: if gesture and speech were separate processes, persons who stutter would be expected to continue gesturing even when speech is temporarily interrupted. In fact, these individuals would likely gesture more in order to compensate for the breakdown in speech. This tight coupling, in which the gesture stroke is halted in time with the stuttering event, suggests speech and gesture must be linked at a deep, neural level. Mayberry and Jaques [6] exclude the possibility that a simple “manual-motor shutdown” prevents gesturing during stuttering events by showing that only speech-related hand movements (and not simultaneous button-pressing, finger-tapping, etc.) are suspended during dysfluencies. Instead, the two must be connected at a planning stage, prior to motor execution.

5. Co-development of speech and gesture

Speech and gesture are known to show similar developmental trajectories in children. Bates and Dick [7] provide a comprehensive review of these parallel milestones, starting with the co-emergence of rhythmic hand movements and babbling in six- to eight-month-olds. The same trends continue as children age and language abilities expand rapidly. Between twelve and eighteen months, gesture and naming are positively correlated (children who gesture earlier also name objects earlier). By eighteen months of age, toddlers begin to form both gesture-word and gesture-gesture combinations, and at twenty-four months, the ability to reproduce arbitrary sequences of manual actions is correlated with grammatical competence [7,8]. This tight developmental link between speech and gesture can be easily understood if speech and gesture are supported by a common and amodal system of communication.

Interestingly, hand banging is significantly correlated with the onset of babbling and single word production even in infants with Williams Syndrome (WS), a rare genetic disorder causing broad developmental delays. More importantly, these manual movements in infants with WS are not correlated with other motor milestones; the link is specific to these early precursors of spoken language and gesture [9]. Also interesting is the observation that in congenitally deaf children, the emergence of manual babbling is developmentally appropriate, coinciding with the emergence of vocal babbling in hearing children [10]. This suggests that infants are innately disposed to acquire language, but that the system is flexible in terms of the input (e.g. visual or auditory) it will accept and later imitate.

Relatedly, studies have also shown that language and handedness both emerge early in development. The left hemisphere has long been known to support language function, and the majority of the global population develops a right-handed bias for motor activity (motor activity on the right side of the body is also controlled by the left hemisphere of the brain). Interestingly, this handedness effect is stronger when producing symbolic rather than non-communicative hand movements [11]. These results suggest that there is a common network within the left hemisphere that may support any type of communicative act, whether it is achieved through spoken language or manual movements.

6. Incorporation of gesture into speech retell

Numerous studies have demonstrated that people incorporate gestural information into the retelling of stories [12-15, among others]. For example, Church, Garber, and Rogalski [12] compared subjects’ recall for ambiguous statements (e.g. “My brother went to the gym”) presented alone versus accompanied by a complementary gesture (e.g. shooting a basketball). At testing, researchers found a significant memory enhancement effect when both speech and gesture were available to subjects. Moreover, when asked to recall the speech items, 75% of the subjects added pieces of information based on the accompanying gestures. This pattern of results suggests that the brain does not “tag” the incoming information as originating in separate channels, but immediately integrates the two sources and processes them together.

Subjects may also add new content to a narrative in order to resolve potential mismatches between speech and gesture. For example, a conflict is introduced if a subject hears the phrase “and then Granny gives him a penny” but sees a gesture suggesting that Granny was actually on the receiving end of the interaction. In this case, the subject might insert additional information in their retelling: “and she threw him a penny, so he picked up the penny.” Now, the gesture towards the body is aligned with “he picked up the penny,” which is more logical than the mismatch that was originally presented [13]. Importantly, the subject does not ignore the gestural information in favor of the speech. Instead, the two are seen as equally viable sources of information that must be linked in some fashion.

7. Gesture in self-only conditions

An additional line of evidence verifying the intimate relationship between speech and gesture comes from the repeated observation that gesture does not disappear entirely when a speaker’s audience is removed (i.e. separated by a partition, on the phone, etc.). While the rate of gesturing is higher in conditions where the receiver of the message is visible to the speaker, we do not stop gesturing in monologue or non-face-to-face conditions. Why gesture if it cannot ease the comprehension load of our listener? Some researchers hypothesize that in these instances, gestures benefit the speaker by facilitating word retrieval and lexical access, while others suggest that gesturing is simply the result of habit. However, in the context of other research, it seems most likely that because gesture and speech are so tightly and inextricably linked, it becomes difficult to produce the speech without simultaneously producing the gesture [16-18].

Similarly, there is evidence that congenitally blind individuals gesture as well, suggesting that, since they have never observed gesture, their use of it and its association with speech is innate rather than learned. Moreover, they gesture at a rate that is comparable to sighted individuals [19]. This behavior persists even when they are talking to individuals whom they know to be blind and who therefore could not benefit from the visual input.

8. Evidence from neuroimaging

While the behavioral studies described above are somewhat convincing, neuroimaging techniques may provide more compelling evidence that speech and gesture are best described as two examples of a single process. Functional magnetic resonance imaging (fMRI) and electroencephalography/event-related potentials (EEG/ERP) provide useful methods to explore what the brain is doing as it processes speech and gesture, either separately or together. Results of imaging studies have demonstrated that 1) gestures influence the earliest stages of speech processing, 2) gestures are subject to the same semantic processing as speech, and 3) speech and gesture activate a common neural network.

9. Early sensory processing

A handful of studies have indicated that gestures can affect the earliest stages of language processing [20-25]. In an ERP experiment, Kelly, Kravitz, and Hopkins [21] showed a modulatory effect of gesture on the sensory P1-N1 and P2 components elicited at frontal sites. Since these early components generally reflect low-level, automatic sensory processing, this suggests that the interaction between speech and gesture occurs obligatorily and prior to any conscious semantic processing. Such a finding directly contradicts the view that gesture is an “add-on” or “bonus” feature, only used post hoc in cases when speech fails. Similarly, in an fMRI experiment, Hubbard et al. [23] presented subjects with videos of speech accompanied by spontaneously produced beat gestures (i.e. rapid movements of the hands which provide ‘temporal highlighting’ to accompanying speech; [1]), nonsense hand movements, or no hand movements. Analysis revealed higher BOLD signal in brain regions relevant to speech perception, including the left superior temporal gyrus and the right planum temporale, in the beat gesture condition.
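For readers less familiar with how such condition differences are identified in fMRI, the sketch below outlines the general logic of a voxelwise general linear model contrast (here, beat gestures versus no hand movement). It is a simplified, simulated example in Python using NumPy and SciPy; the block onsets, durations, hemodynamic response shape, and data are invented and do not reproduce the actual pipeline of Hubbard et al. [23].

    # Minimal sketch of a voxelwise GLM contrast (beat gesture > no hand movement).
    # All timings, regressors, and data are simulated; this is not the authors' pipeline.
    import numpy as np
    from scipy.stats import gamma

    TR, n_scans = 2.0, 200
    frame_times = np.arange(n_scans) * TR

    def hrf(t):
        """Simple double-gamma haemodynamic response function (canonical-like shape)."""
        return gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)

    def boxcar(onsets, duration):
        """Block regressor convolved with the HRF, sampled at each TR."""
        box = np.zeros(n_scans)
        for onset in onsets:
            box[(frame_times >= onset) & (frame_times < onset + duration)] = 1.0
        kernel = hrf(np.arange(0, 32, TR))
        return np.convolve(box, kernel)[:n_scans]

    # Hypothetical block onsets (seconds) for the three conditions
    X = np.column_stack([
        boxcar([20, 140, 260], 20),   # speech + beat gestures
        boxcar([60, 180, 300], 20),   # speech + nonsense movements
        boxcar([100, 220, 340], 20),  # speech, no hand movements
        np.ones(n_scans),             # intercept
    ])

    # Simulated data: 1000 "voxels" of noise (stand-in for a preprocessed 4D time series)
    Y = np.random.randn(n_scans, 1000)

    beta, _, _, _ = np.linalg.lstsq(X, Y, rcond=None)   # OLS fit per voxel
    c = np.array([1.0, 0.0, -1.0, 0.0])                 # contrast: beat > no movement
    resid = Y - X @ beta
    dof = n_scans - np.linalg.matrix_rank(X)
    sigma2 = (resid ** 2).sum(axis=0) / dof
    se = np.sqrt(sigma2 * (c @ np.linalg.pinv(X.T @ X) @ c))
    t_map = (c @ beta) / se                              # one t value per voxel

In a real analysis, the resulting per-subject contrast maps would be carried to a group-level test; the point here is simply that “higher BOLD signal in the beat gesture condition” corresponds to a positive contrast of condition regressors.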

Gestures do not only affect how we process speech; they also affect how we produce it. Bernardis and Gentilucci [24] compared the properties of speech and gesture emitted in multimodal (speech + gesture) conditions versus unimodal (speech only or gesture only) conditions. The authors found increased second formant (F2) values and pitch in the vocal spectra when words were accompanied by meaningful gestures, but no effect when words were accompanied by aimless arm movements. Similarly, speaking a word, but not a pseudoword, aloud reduced the maximal height reached by the hands and the duration of meaningful gestures. These findings offer clear evidence that there is a bi-directional relationship between speech and gesture: producing one automatically and reflexively influences how we produce the other. Krahmer and Swerts [25] confirm that producing a gesture (in this case, a beat gesture) influences the acoustic features (emphasis, duration, frequency, etc.) of the co-occurring speech. The reverse is also true: when participants can see a speaker’s gesture, they rate the accompanying word as more “prominent.”

10. Semantic processing of speech and gesture

A series of ERP experiments has shown that speech and gesture undergo the same semantic and cognitive processing. These experiments focus on the N400 component, which is thought to index semantic integration and is commonly elicited by both words and gestures that are incongruent with the ongoing discourse. While the N400 was initially reported for incongruent or unexpected words [26], the N400 to incongruent gestures is now a highly robust finding [21, 27-29, among others]. For example, Kelly, Kravitz, and Hopkins [21] showed participants video clips in which an actor gestured to one of two objects (a short, wide dish or a tall, thin glass) and then described the same object aloud. The N400 was smallest when the gesture and verbal descriptor referred to the same object and largest when they referred to different objects. Similarly, Holle and Gunter [27] used homonyms to investigate the ability of gesture to disambiguate speech. An N400 effect to the homonym was found when the ongoing discourse failed to support the meaning that was previously indicated via gesture.
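As a concrete illustration of how an N400 effect of this kind is typically quantified, the short Python sketch below compares mean amplitude in a 300-500 ms post-stimulus window across simulated congruent and incongruent trials. All values here are simulated placeholders; real analyses would use preprocessed, baseline-corrected EEG epochs and the specific windows and channels reported in the cited studies.

    # Illustrative N400 analysis on simulated single-channel epochs.
    import numpy as np
    from scipy import stats

    sfreq = 500                                  # sampling rate (Hz)
    times = np.arange(-0.2, 0.8, 1 / sfreq)      # epoch from -200 to +800 ms
    n400_window = (times >= 0.3) & (times <= 0.5)

    rng = np.random.default_rng(0)
    congruent = rng.normal(0.0, 1.0, (40, times.size))    # 40 trials x time points
    incongruent = rng.normal(0.0, 1.0, (40, times.size))
    incongruent[:, n400_window] -= 2.0                     # simulated larger negativity

    def mean_amplitude(epochs):
        """Mean voltage in the N400 window, per trial."""
        return epochs[:, n400_window].mean(axis=1)

    t, p = stats.ttest_ind(mean_amplitude(incongruent), mean_amplitude(congruent))
    print(f"incongruent - congruent N400 effect: t = {t:.2f}, p = {p:.3g}")

A more negative mean amplitude for incongruent trials in this window is what is meant above by a “larger N400” to mismatching gestures or words.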

11. Shared neural networks

A smaller body of research has examined the processing of autonomous gestures, like emblems and pantomimes. Studying these gesture types, rather than the gesticulations that depend on speech for context, allows researchers to contrast the brain’s response to each form of communication separately. For example, a recent fMRI study [30] demonstrated that language and symbolic gestures both activate a common, left-lateralized network of inferior frontal and posterior temporal regions, including the inferior frontal gyrus/Broca’s area (IFG), posterior middle temporal gyrus (pMTG), and superior temporal sulcus (STS) (see Figure 2 for illustration). The authors suggest that these regions are not language-specific but rather function more broadly to link symbols with their meaning. This is true regardless of the modality or form the symbol adopts: sounds, words, gestures, pictures, etc.

Figure 2.

Common areas of activation for processing symbolic gestures and spoken language minus their respective baselines, identified using a random effects conjunction analysis. The resultant t map is rendered on a single subject T1 image: 3D surface rendering above, axial slices with associated z axis coordinates, below. See [30] for more details.
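To clarify what a conjunction analysis of this kind computes, the following Python sketch applies the minimum-statistic logic to two simulated group-level t maps (one for gestures minus baseline, one for speech minus baseline): a voxel counts as shared only if it exceeds threshold in both contrasts. The array sizes, threshold, and data are hypothetical, and the sketch is not a reproduction of the actual analysis used in [30].

    # Minimum-statistic conjunction over two simulated whole-brain t maps.
    import numpy as np

    rng = np.random.default_rng(1)
    t_gesture = rng.normal(0, 1, (64, 64, 40))   # stand-in t map, gestures > baseline
    t_speech = rng.normal(0, 1, (64, 64, 40))    # stand-in t map, speech > baseline

    t_threshold = 3.1                            # hypothetical voxelwise threshold

    # Conjunction-null logic: a voxel survives only if BOTH contrasts exceed threshold,
    # i.e. thresholding the voxelwise minimum of the two t maps.
    conjunction_map = np.minimum(t_gesture, t_speech)
    shared_voxels = conjunction_map > t_threshold

    print(f"{shared_voxels.sum()} voxels active for both gesture and speech")

The surviving voxels correspond to the kind of shared inferior frontal and posterior temporal activity shown in Figure 2.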

12. The gestural origins theory

The findings that speech and gesture are tightly integrated at multiple stages of processing and that they appear to activate a common neural system have significant implications for the question of how language evolved. The Gestural Origins Theory, made popular by Michael Arbib, Michael Tomasello, and Michael Corballis, proposes that spoken language emerged from the system of gestural communication we still see today in non-human primates (see [31] for review). In humans, a growth in brain size and the development of the vocal tract permitted a gradual transition to a more complex language system based upon vocalizations. Subsequently, and although we still use gestures to express ourselves, spoken language became the dominant mode of communication because it freed the hands for simultaneous tool use, was less demanding of energy resources, and did not require the speaker and addressee to be in the same physical (not to mention well-lit) location.

13. Gesture in our primate relatives

Renowned primatologist Jane Goodall, as well as many other scientists, cites our sophisticated spoken language system as the crucial difference between humans and chimpanzees. Our primate relatives do produce sounds in order to communicate, but these vocalizations are limited in their scope and function and are used mainly to direct attention. Instead, it is their gesticulations that serve a more “language-like” function. These gestures are numerous: pointing, shaking, begging, and offering are all common [32]. These manual gestures can also be used intentionally, flexibly, and across many contexts, unlike facial and vocal gestures which are more automatic and ritualized [33].

The question, then, is what is unique about humans that supports spoken language. Speech requires the same careful coordination of motor systems as manual gesture; on the gestural account, the fine motor control once devoted to the hands was gradually extended to analogous movements of the vocal tract. This transition was only possible due to anatomical changes: the lowering of the larynx, lengthening of the tongue and neck, etc. A popular theory claims that a mutation in the FOXP2 gene, located on chromosome 7, may be responsible for the development of the fine motor skills necessary for articulation and vocalization [34].

14. Gesture and the mirror neuron system

The discovery of the mirror neuron system (MNS) lent added credence to the gestural origins theory. Mirror neurons were first identified in area F5 of the monkey ventral premotor cortex and fire whether an animal executes or observes an action (for review, see [35]). A similar system is thought to exist in humans, and the areas of the human MNS, activated both by speech and by gesture, overlap largely with the classical language areas (i.e. Brodmann area 44/Broca’s area). In terms of the gestural origins theory, the mirror neuron system accounts for what Michael Arbib terms parity: the fact that what a listener hears and understands is the message that the speaker intended to send [36]. However, the role of the MNS has been hotly debated in recent years, with some researchers arguing that it cannot account for the complex semantic features of our language system [37] and others suggesting that its role in action understanding may be overstated [38-39].

15. Gesture as a universal language

The existence of a communication system is a feature of every human culture. However, spoken language is not a unitary phenomenon: depending on geographic location and the community we belong to, we speak one or two (or in some circumstances, maybe three or four) out of hundreds of modern languages. When an English speaker travels to China for the first time, for example, it is highly unlikely he will understand even simple words or phrases if he has not spent extensive time memorizing vocabulary and practicing with fluent speakers first. In these situations, we turn to gestures. Unlike speech, gestures such as pointing are relatively consistent across cultures (emblems, of course, are culturally bound and the exception to this rule). For example, Liszkowski et al. [40] showed that infants and caregivers from seven different cultures all pointed with the same general frequency and under the same circumstances, suggesting a universal and prelinguistic basis for communication.

Many studies have examined the frequency of gesture usage in situations where no common language exists between speakers or when an individual is speaking in a non-native language. In general, speakers rely more upon gesture when communicating in their second language (L2) [41-42]; gesture under these circumstances likely functions to decrease the production burden for the speaker and increase the likelihood of comprehension for the listener. Another line of research has studied the role of gesture in L2 vocabulary acquisition. This work has demonstrated that learning novel words paired with meaningful gestures helps learners retain the material over time [43-45].

Similarly, it seems that it is easier for members of deaf communities to develop a common gesture- or sign-based language than it is for members of separate speech communities to develop a new spoken language. The most notable example is perhaps Nicaraguan Sign Language, which emerged in the 1970s after the opening of a special education school that brought deaf children in the community together for the first time [46]. In sum, the facts that 1) we rely upon gesture as a common platform for communication when we lack a common language and 2) signed (but not spoken) languages still arise spontaneously suggest that gestures may indeed form the core of our communication system.

16. Conclusions

Evidence overwhelmingly favors the view that speech and gesture are tightly integrated with one another, at both the behavioral and neural levels, suggesting that forms of verbal and nonverbal communication are parts of one amodal system that enables complex human communication.

Considered broadly, evidence also seems to support a view of language evolution rooted in manual gesture. The mechanisms that underlie this, however, are still somewhat unclear. The mirror neuron system may be the center of the “language-ready brain,” but this theory is not free from controversy. Equally viable (and not mutually exclusive) is the proposal we advocate here: the system that supported nonverbal communication was co-opted over the course of evolution to support spoken language.

Nevertheless, David McNeill, whose work we see as central to both of these hypotheses, is actually a critic of the “gesture-first” view, instead claiming that speech and gesture emerged alongside one another and in response to the same environmental pressures. Challenging this view, however, is the literature on comparative biology, primate vocalization and gesture, molecular genetics, and the developmental trajectories of gesture and speech in children, all of which suggest that speech lags behind gesture in our evolutionary history.

In the end, the question of how language evolved and whether or not it emerged from a system built on manual gestures is not as important as what the relationship is between speech and gesture, now that they both exist. The intimate relationship between the two, which is now well established, has important implications for education, acquisition of second languages, effective public speaking, treatment of patients with communication disorders, and much, much more.

References

  1. Call, J., & Tomasello, M. (2007). The gestural communication of apes and monkeys. New York: Lawrence Erlbaum Associates.
  2. McNeill, D. (1992). Hand and mind: What gestures reveal about thought. Chicago: University of Chicago Press.
  3. Krauss, R. M. (1998). Why do we gesture when we speak? Current Directions in Psychological Science, 7, 54-59.
  4. McNeill, D., Cassell, J., & McCullough, K. (1994). Communicative effects of speech-mismatched gestures. Research on Language and Social Interaction, 27(3), 223-237.
  5. Rauscher, F. H., Krauss, R. M., & Chen, Y. (1996). Gesture, speech, and lexical access: The role of lexical movements in speech production. Psychological Science, 7, 226-231.
  6. Mayberry, R., & Jaques, J. (2000). Gesture production during stuttered speech: Insights into the nature of gesture-speech integration. In D. McNeill (Ed.), Language and gesture (pp. 199-214). Cambridge: Cambridge University Press.
  7. Bates, E., & Dick, F. (2002). Language, gesture and the developing brain. Developmental Psychobiology, 40, 293-310.
  8. Bauer, P. J., Herstgaard, L. A., Dropik, P., & Daly, B. P. (1998). When even arbitrary order becomes important: Developments in reliable temporal sequencing of arbitrarily ordered events. Memory, 6, 165-198.
  9. Masataka, N. (2001). Why early linguistic milestones are delayed in children with Williams syndrome: Late onset of hand banging as a possible rate-limiting constraint on the emergence of canonical babbling. Developmental Science, 4, 158-164.
  10. Petitto, L. A., & Marentette, P. F. (1991). Babbling in the manual mode: Evidence for the ontogeny of language. Science, 251(5000), 1493-1496.
  11. Bates, E., O'Connell, B., Vaid, J., Sledge, P., & Oakes, L. (1986). Language and hand preference in early development. Developmental Neuropsychology, 2(1), 1-15.
  12. Church, R. B., Garber, P., & Rogalski, K. (2007). The role of gesture in memory and social communication. Gesture, 7(2), 137-158.
  13. Cassell, J., McNeill, D., & McCullough, K. (1999). Speech-gesture mismatches: Evidence for one underlying representation of linguistic and nonlinguistic information. Pragmatics & Cognition, 7(1), 1-34.
  14. Kelly, S. D. (2001). Broadening the units of analysis in communication: Speech and nonverbal behaviours in pragmatic comprehension. Journal of Child Language, 28(2), 325-349.
  15. Kelly, S. D., Barr, D. J., Church, R. B., & Lynch, K. (1999). Offering a hand to pragmatic understanding: The role of speech and gesture in comprehension and memory. Journal of Memory and Language, 40(4), 577-592.
  16. Bavelas, J. B., Gerwing, J., Sutton, C., & Prevost, D. (2008). Gesturing on the telephone: Independent effects of dialogue and visibility. Journal of Memory and Language, 58, 495-520.
  17. Alibali, M. W., Heath, D. C., & Myers, H. J. (2001). Effects of visibility between speaker and listener on gesture production: Some gestures are meant to be seen. Journal of Memory and Language, 44(2), 169-188.
  18. Clark, H. H. (1996). Using language. Cambridge: Cambridge University Press.
  19. Iverson, J. M., & Goldin-Meadow, S. (1997). What's communication got to do with it? Gesture in children blind from birth. Developmental Psychology, 33(3), 453-467.
  20. Wu, Y. C., & Coulson, S. (2010). Gestures modulate speech processing early in utterances. Neuroreport, 21(7), 522-526.
  21. Kelly, S. D., Kravitz, C., & Hopkins, M. (2004). Neural correlates of bimodal speech and gesture comprehension. Brain and Language, 89(1), 253-260.
  22. Skipper, J. I., Goldin-Meadow, S., Nusbaum, H. C., & Small, S. L. (2007). Speech-associated gestures, Broca's area, and the human mirror system. Brain and Language, 101(3), 260-277.
  23. Hubbard, A. L., Wilson, S. M., Callan, D. E., & Dapretto, M. (2009). Giving speech a hand: Gesture modulates activity in auditory cortex during speech perception. Human Brain Mapping, 30(3), 1028-1037.
  24. Bernardis, P., & Gentilucci, M. (2006). Speech and gesture share the same communication system. Neuropsychologia, 44, 178-190.
  25. Krahmer, E., & Swerts, M. (2007). The effects of visual beats on prosodic prominence: Acoustic analyses, auditory perception and visual perception. Journal of Memory and Language, 57, 396-414.
  26. Kutas, M., & Hillyard, S. A. (1980). Reading senseless sentences: Brain potentials reflect semantic incongruity. Science, 207, 203-204.
  27. Holle, H., & Gunter, T. C. (2007). The role of iconic gestures in speech disambiguation: ERP evidence. Journal of Cognitive Neuroscience, 19(7), 1175-1192.
  28. Ozyurek, A., Willems, R. M., Kita, S., & Hagoort, P. (2007). On-line integration of semantic information from speech and gesture: Insights from event-related brain potentials. Journal of Cognitive Neuroscience, 19(4), 605-616.
  29. Bernardis, P., Salillas, E., & Caramelli, N. (2008). Behavioural and neurophysiological evidence of semantic interaction between iconic gestures and words. Cognitive Neuropsychology, 25(7-8), 1114-1128.
  30. Xu, J., Gannon, P. J., Emmorey, K., Smith, J. F., & Braun, A. R. (2009). Symbolic gestures and spoken language are processed by a common neural system. Proceedings of the National Academy of Sciences, 106(49), 20664-20669.
  31. Gentilucci, M., & Corballis, M. C. (2006). From manual gesture to speech: A gradual transition. Neuroscience & Biobehavioral Reviews, 30(7), 949-960.
  32. Tomasello, M., Call, J., & Gluckman, A. (1997). Comprehension of novel communicative signs by apes and human children. Child Development, 68(6), 1067-1080.
  33. Pollick, A. S., & de Waal, F. B. M. (2007). Ape gestures and language evolution. Proceedings of the National Academy of Sciences, 104(19), 8184-8189.
  34. Corballis, M. C. (2004). FOXP2 and the mirror neuron system. Trends in Cognitive Sciences, 8(3), 95-96.
  35. Rizzolatti, G., & Craighero, L. (2004). The mirror-neuron system. Annual Review of Neuroscience, 27, 169-192.
  36. Arbib, M. A. (2008). From grasp to language: Embodied concepts and the challenge of abstraction. Journal of Physiology, 102(1-3), 4-20.
  37. Tettamanti, M., & Moro, A. (2012). Can syntax appear in a mirror system? Cortex, 48(7), 923-935.
  38. Hickok, G. (2009). Eight problems for the mirror neuron theory of action understanding in monkeys and humans. Journal of Cognitive Neuroscience, 21(7), 1229-1243.
  39. Lingnau, A., Gesierich, B., & Caramazza, A. (2009). Asymmetric fMRI adaptation reveals no evidence for mirror neurons in humans. Proceedings of the National Academy of Sciences, 106(24), 9925-9930.
  40. Liszkowski, U., Brown, P., Callaghan, T., Takada, A., & de Vos, C. (2012). A prelinguistic gestural universal of human communication. Cognitive Science, 36(4), 698-713.
  41. Gullberg, M. (1998). Gesture as a communication strategy in second language discourse: A study of learners of French and Swedish. Lund, Sweden: Lund University Press.
  42. Hadar, U., Dar, R., & Teitelman, A. (2001). Gesture during speech in first and second language: Implications for lexical retrieval. Gesture, 1(2), 151-165.
  43. Kelly, S. D., McDevitt, T., & Esch, M. (2009). Brief training with co-speech gesture lends a hand to word learning in a foreign language. Language and Cognitive Processes, 24, 313-334.
  44. Macedonia, M., & Knosche, T. (2011). Body in mind: How gestures empower foreign language learning. Mind, Brain, and Education, 35(4), 196-211.
  45. Macedonia, M., Muller, K., & Friederici, A. D. (2011). The impact of iconic gestures on foreign language word learning and its neural substrate. Human Brain Mapping, 31(6), 982-998.
  46. Senghas, A., Kita, S., & Ozyurek, A. (2004). Children creating core properties of language: Evidence from an emerging sign language in Nicaragua. Science, 305(5691), 1779-1782.
