Open access peer-reviewed chapter

Proprioceptive and Kinematic Profiles for Customized Human‐Robot Interaction for People Suffering from Autism

Written By

Pauline Chevalier, Brice Isableu, Jean‐Claude Martin, Christophe Bazile and Adriana Tapus

Submitted: 24 May 2016 Reviewed: 22 December 2016 Published: 12 April 2017

DOI: 10.5772/67339

From the Edited Volume

Autism - Paradigms, Recent Research and Clinical Applications

Edited by Michael Fitzgerald and Jane Yip


Abstract

In this chapter, we present a method to define individual profiles in order to develop new personalized robot-based social interactions for individuals with autism spectrum disorder (ASD), based on the hypothesis that hyporeactivity to visual motion and an overreliance on proprioceptive information are linked to difficulties in integrating social cues and in engaging in successful interactions. We succeeded in forming three groups among our 19 participants (children, teenagers, and adults with ASD), describing each participant's response to visual and proprioceptive inputs. We conducted a first experiment to present the robot Nao as a social companion and to avoid fear of or stress toward the robot in future experiments. No direct link between the participants' behavior toward the robot and their proprioceptive and visual profiles was observed. Still, we found encouraging results going in the direction of our hypothesis. In addition, almost all of our participants showed great interest in Nao. Defining such individual profiles prior to social interactions with a robot could provide promising strategies for designing successful and adapted human-robot interaction (HRI) for individuals with ASD.

Keywords

  • autism
  • personalized interaction
  • socially assistive robotics
  • proprioception
  • kinematics

1. Introduction

Autism spectrum disorders (ASDs) are characterized by deficits in communication and social skills and the presence of restricted and repetitive patterns of behaviors, interests, or activities, as described in the DSM‐V [1]. The ASD literature widely describes the impairments in communication, interaction, emotion recognition, joint attention, and imitation [2]. Children with ASD show a great affinity for robots, computers, and mechanical components [3]. In the field of socially assistive robotics (SAR), robots are used as tools in socialization therapies for children with ASD in order to enhance social engagement, imitation, and joint attention skills [4–7].

In Ref. [8], the authors suggest that individuals with ASD show an overreliance on proprioceptive information. Proprioception can be defined as an individual's sense of the relative position of body segments (i.e., joint position sense) and of the strength of effort employed to produce movements. This sense is derived from complex somatosensory signals provided to the brain by different sensors in the body: muscles [9–11], joints [12], and skin receptors [13]. Individuals with ASD show normal to exacerbated integration of proprioceptive cues compared to typically developed (TD) individuals [14]. TD individuals have repeatedly been shown to rely more heavily on vision in various perceptivo-cognitive and sensorimotor tasks, followed by a progressive age-related decline of visual dependency [15, 16]. Proprioceptive integration in ASD is studied so as to better understand how the contribution of these cues influences interactive and social capacities. In Ref. [8], the authors examined the link between proprioception and social and imitation skills in children with ASD. Results showed that the more the children relied on proprioceptive cues, the more they exhibited impairments in social functions and imitation.

Moreover, children with ASD show impaired visual processing skills that would lead to difficulties in managing social interactions [17]. Indeed, vision is an important component in communication and social skills. In individuals with ASD, the visual processing impairment may lead to unusual eye contact, difficulty in following the gaze of others or supporting joint attention, and difficulty in interpreting facial and bodily expressions of emotions [17]. In addition, visual field‐dependent individuals are considered to show more social skills than visual field‐independent individuals [18, 19].

Our project aims to develop a new personalized human-robot interaction model for individuals with ASD, as a complement to standard therapy. We based our work on the interindividual sensory differences between individuals with ASD: indeed, there are strong interindividual differences in ASD [1], and adapted interaction is a need in ASD therapies. We built our work on the hypothesis that an individual's own integration of proprioceptive and visual cues affects the way he/she interacts with a humanoid robot [8, 18–20]. We hypothesize that hyporeactivity to visual motion and an overreliance on proprioceptive information are linked, in individuals with ASD, to their difficulties in integrating social cues and engaging in successful interactions (H1). The chapter is structured as follows: Section 2 presents the related work on the atypical integration of cues in ASD and on SAR for individuals with ASD; Section 3 presents the participants of our study; Section 4 presents our method to define participants' perceptivo-cognitive and sensorimotor profiles with respect to the integration of visual and proprioceptive inputs; Section 5 describes the first interaction between the robot Nao and our participants; finally, Section 6 concludes our work.


2. Related work

2.1. Atypical integration of proprioceptive and visual cues in ASD

Motor, sensory, and visual processing impairments are present in autism and were recently taken into account with the publication of the DSM‐V [1, 14, 17]. These deficits influence the quality of life of individuals suffering from ASD and their social development. In Ref. [21], the authors showed that children with motor impairments are more likely to engage in solitary play and interact less with peers in comparison with TD children. They explore their physical and social environment less, which leads to social and emotional difficulties. Visual deficiencies are known to lead to difficulties in social behaviors and have been widely documented in the literature.

An overreliance on proprioceptive information in ASD has been suggested [8, 14, 22–25]. Individuals with autism show normal to exacerbated integration of proprioceptive cues compared to TD individuals [14]. More specifically, in Refs. [22] and [23], the authors observed abnormal postural behavior in autism: they found that individuals with ASD show fewer age-related postural behaviors and are less stable than TD individuals. Results in Ref. [22] suggest that postural hyporeactivity to visual information is present in the tested individuals with autism (individuals suffering from ASD with IQs comparable to those of TD individuals). Furthermore, Gepner et al. [24] pointed out that individuals with ASD show very poor postural responses to visual motion and have movement perception impairments. This result was also observed in Ref. [25]. Proprioceptive integration in ASD has been studied to better understand how the contribution of these cues influences interactive and social capacities. In Ref. [8], the authors observed a stronger than normal association between self-generated motor commands and proprioceptive feedback in the autistic brain. This would confirm that individuals with ASD have an overreliance on proprioceptive cues. Furthermore, they observed that the more the children with ASD rely on proprioception, the more they exhibit impairments in social function and imitation.

2.2. Robots in ASD therapy

Over the past decade, SAR has been a growing research area, with great interest in therapy for individuals with ASD. Indeed, robots have been shown to be appealing, attractive, and engaging for individuals with ASD. In addition to their mechanical nature, which is known to attract people with ASD [3, 5], they offer simple, repetitive, and predictable behaviors, which can be reassuring for individuals with ASD. SAR focuses on several topics [26]: the design of robots adapted to individuals with ASD, the design of autonomous interactions between children and robots, and the evaluation of the therapies proposed for children with ASD.

Numerous studies report positive effects of robots on children with ASD. In Ref. [27], the authors observed increased collaborative behavior with a human partner after an intervention of dyadic play interactions between children with ASD and a robot. In Ref. [5], the authors observed that a robot was a successful social bridge to a human partner for children with ASD in triadic interactions. In therapy designed for children with autism [7], the authors found that children with ASD engaged spontaneously in dyadic play with the robot Keepon; this study was also expanded to a triadic interaction between a robot, an adult, and a child. In Ref. [27], the authors used the robot Probo as a social storytelling agent for children with ASD and observed that, in specific situations, the social performance of children with ASD improved more significantly when the stories were told by the robot Probo than by a human reader. In light of these encouraging findings, many challenges in SAR for individuals with ASD must still be addressed. Because of small subject pools and/or short-term experiments, the generalizability of the reported skill improvements is often questionable [28]. In addition, there is great variability in the human-robot interaction (HRI) setups, which may influence the findings in SAR for individuals with ASD [29]. The next challenge of SAR will be to identify how to reduce the variability in HRI therapies for individuals with ASD. In particular, in Ref. [30], the authors propose a new step in robot-assisted therapy: robot-assisted therapeutic scenarios should develop more substantial levels of autonomy, which would allow the robot to adapt to the individual needs of children over longer periods of time.


3. Participants

We conducted our research in collaboration with three care facilities for people suffering from ASD: IME MAIA (France) and IME Notre Ecole (France), associations for children and teenagers with ASD, and FAM La Lendemaine (France), a medical care home for adults with ASD. Informed consent for participation was obtained from the parents, or from the participants themselves when they were able to give it. The experimental protocol was approved by the EA 4532 local university ethics committee.

Our subject pool is composed of 12 children and teenagers with ASD (11.7 ± 2.6 years old) and seven adults with ASD (26.8 ± 7.9 years old) from these three care facilities. There are 14 male and five female participants. For confidentiality reasons, we coded the participants’ identities as follows: CH#, with # from 1 to 12 for children and teenagers and AD#, with # from 1 to 7 for adults. In Table 1, we give a short description of each participant.

ID# | Gender | Age | Comments
CH3 | M | 12 |
CH5 | F | 11 | Nonverbal; West syndrome (an uncommon to rare epileptic disorder [31])
CH8 | M | 15 |
CH10 | M | 10 |
CH11 | M | 17 |
AD2 | F | 25 | Diagnosed with creatine transporter deficiency
AD4 | F | 21 |
AD6 | M | 25 | Echolalia (i.e., the unsolicited repetition of vocalizations made by another person)
CH1 | M | 11 |
CH4 | M | 13 | High level of cognition; asked to be part of the program to meet Nao
CH7 | M | 8 |
CH9 | F | 12 |
CH12 | M | 9 |
AD1 | F | 25 |
AD3 | M | 44 | Asperger syndrome
CH2 | M | 9 | Echolalia
CH6 | M | 13 |
AD5 | M | 27 | Epilepsy
AD7 | M | 21 |

Table 1.

Participants’ description.


4. Defining proprioceptive and kinematic profiles

4.1. Methods

The first step of our work was to determine how to define participants' perceptivo-cognitive and sensorimotor profiles. We used two methodologies: (1) the perceptivo-cognitive Adolescent/Adult Sensory Profile (AASP) developed by Brown and Dunn [31] and (2) an experimental sensorimotor setup designed to assess the individual's reliance on visual over proprioceptive inputs to control postural balance while facing a moving virtual visual room.

The AASP was completed for all participants. We selected this questionnaire because it has been successfully used in ASD [32–34]. As described in Ref. [35], it enabled us to assess an individual's sensory processing preferences, described in terms of the quadrants of Dunn's model of sensory processing [35] (a toy scoring sketch follows the list):

  • Low registration: a tendency to miss stimuli or take a long time to respond to them.

  • Sensation seeking: a tendency to try to create additional stimuli or to look for environments that provide sensory stimuli.

  • Sensory sensitivity: a tendency to respond quickly to stimuli.

  • Sensation avoiding: a tendency to be overwhelmed or bothered by sensory stimuli and to actively reduce such stimuli in one's environment.
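As an illustration, here is a minimal Python sketch of how 1–5 Likert responses to the retained AASP items could be aggregated into per-modality quadrant scores. The item-to-quadrant mapping and the answer values are made-up placeholders, not the published AASP scoring key [31].

```python
# Toy aggregation of AASP item responses (1 = almost never ... 5 = almost always)
# into per-modality quadrant scores. The item-to-quadrant mapping below is a
# placeholder; the real assignment comes from the AASP scoring key [31].

ITEM_MAP = {
    ("movement", "low_registration"): [1, 2],
    ("movement", "sensation_seeking"): [3, 4],
    ("visual", "sensory_sensitivity"): [5, 6],
    ("visual", "sensation_avoiding"): [7, 8],
}

def quadrant_scores(responses):
    """Sum the Likert responses of the items assigned to each (modality, quadrant)."""
    return {key: sum(responses[i] for i in items) for key, items in ITEM_MAP.items()}

# One participant's answers, keyed by item id (hypothetical values)
answers = {1: 4, 2: 5, 3: 1, 4: 2, 5: 3, 6: 4, 7: 5, 8: 2}
print(quadrant_scores(answers))
```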

As most of our participants did not have the cognitive level to complete the questionnaire themselves, it was completed with the help of their caregivers, who know their habits and responses to everyday sensory stimuli well. The questionnaire's instructions specify that it can be completed (1) by the person himself/herself; (2) with the help of a caregiver or parent; or (3) by a caregiver or parent; indeed, the questionnaire also targets individuals with intellectual deficiencies. We asked the caregivers to complete the questionnaire because we had direct contact with them and could inform them well about its conditions and format. We assessed movement, visual, touch, and auditory processing using 29 of the 60 items of the AASP. We eliminated the taste/smell processing and activity level items, which were not relevant to the purpose of our study or not suitable for individuals with invasive ASD behaviors.

We designed a sensorimotor experimental setup to assess the sensory integration of each participant. The setup has been used in several studies [36, 37]. It evaluates (1) the effect of a moving virtual visual room on postural control and (2) one's capability to use proprioceptive information to reduce visual dependency [36, 37]. It has been shown that the integration of proprioceptive cues differs among individuals in unstable postures [38–40]. Visual-dependent individuals integrate proprioceptive cues less than other individuals, and when they are exposed to visual motion in an unstable posture, their body sway follows the visual stimulus [41].

To assess the visual dependence of our participants with ASD, they were asked to stand quietly in two postural conditions, (1) normal and (2) tandem Romberg (i.e., one foot in front of the other), in front of a virtual room that was either static (SVR) or rolling (RVR) at 0.25 Hz with an inclination of ±10°. We chose a rolling frequency of 0.25 Hz because virtual room setups frequently use rolling frequencies between 0.1 and 0.5 Hz [24, 25, 42]: a frequency of 0.2 Hz has been found to produce the strongest, most synchronized body sway, while frequencies above 0.5 Hz produce little body sway [43]. (A sketch of the roll signal follows the condition list below.)

They were asked to stand on a force platform in front of the virtual room, static or rolling, in three conditions (see Figure 1):

  • C1—stable position with SVR: the participant stands upright on the force platform, feet hip-width apart. The virtual room stays still. The recording lasts 30 seconds.

  • C2—stable position with RVR: the participant stands upright on the force platform, feet hip-width apart. The virtual room moves sinusoidally. The recording lasts 50 seconds.

  • C3—tandem Romberg position with RVR: the participant stands upright on the force platform, one foot in front of the other. The virtual room moves sinusoidally. The recording lasts 50 seconds.
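For concreteness, here is a short Python sketch of the roll-angle signal driving the rolling virtual room, using the parameters stated above (0.25 Hz, ±10°, 50 s recordings); the rendering frame rate is an assumed value, not reported in the chapter.

```python
# Roll angle of the rolling virtual room (RVR): a 0.25 Hz sinusoid with a
# +/-10 degree peak inclination, as described above.
import numpy as np

AMPLITUDE_DEG = 10.0  # peak inclination of the room
FREQ_HZ = 0.25        # rolling frequency
FPS = 60              # assumed rendering frame rate
DURATION_S = 50       # duration of conditions C2 and C3

t = np.arange(0, DURATION_S, 1 / FPS)
roll_deg = AMPLITUDE_DEG * np.sin(2 * np.pi * FREQ_HZ * t)
# roll_deg[k] would set the room's roll rotation at frame k, e.g., as an
# Euler angle on the room object in the Blender scene.
```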

Figure 1.

Experimental setup for adult participants in condition C3.

The virtual room was a 3D environment (see Figure 2) created with Blender, a free 3D rendering software package. It was designed as a child's bedroom and decorated with toys and children's furniture, as we aimed to create a friendly environment. We placed a toy plane in the participants' line of sight to help them focus on the task and not be distracted: we instructed them to focus on this plane [24, 25]. The virtual room was projected onto a white wall with a short-focal-length projector in a dark room. For the adult group, the projection measured 2.4 m wide × 1.8 m high and the participants stood 1.3 m from the point of observation. For the children group, it measured 1.75 m wide × 1.30 m high and the participants stood at 1 m. This allowed us to keep the angular diameter around 31° horizontally and 41° vertically in both setups.

Figure 2.

Screenshot of the virtual room used in the experiment, developed with Blender.

We investigated whether the age of our participants influenced their center of pressure (CoP) behavior. Indeed, among TD individuals, children show more dramatic postural reactions to visual sway than adults [44]. However, as shown in Ref. [25], children with ASD showed less response to the visual stimulus than TD children in virtual room experiments in a stable position.

We expect that:

  1. child participants will sway more than adults in the stable position without a visual stimulus (C1);

  2. as we are working with a population with ASD, the postural reaction to visual sway will not be influenced by age;

  3. the effect of the RVR should be maximal in the unstable postural condition (C3); and

  4. the effect in (3) should not be larger in adults than in children.

4.2. Data analysis

We used an AMTI OR6‐5‐1000 force platform to record the displacement of the CoP of our participants. The sampling frequency was 1 kHz. To reduce noise, the recorded data were filtered with a Butterworth filter with a cut-off frequency of 10 Hz. The root mean square (RMS) of the displacement of the CoP in the mediolateral direction was computed as an indicator of an individual's stability: as described in Ref. [45], the RMS provides information about the variability of the CoP in space. The frequency power at 0.25 Hz (Fpo) of the CoP was computed to evaluate the postural response to the visual stimulus; the more an individual is coupled with the visual stimulus, the higher the Fpo. We observed the CoP behavior in the mediolateral direction, as it is the direction of our visual stimulus (a sketch of this processing pipeline follows the list below). RMS and Fpo should be correlated if our participants with ASD follow the RVR movement:

  1. If the RMS and the Fpo are correlated, then we can expect that these coupling capabilities with contextual cues promise higher social interaction capabilities.

  2. If the RMS and the Fpo are not correlated (no coupling), then one can conclude that the visual stimulus is integrated as noise, inducing disorientation and instability.
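A minimal sketch of this processing pipeline follows, assuming a 4th-order zero-phase Butterworth filter and a simple periodogram power estimate; the chapter specifies only the 1 kHz sampling rate, the 10 Hz cut-off, and the 0.25 Hz analysis frequency.

```python
# Sketch of the CoP analysis: low-pass filtering, RMS of the mediolateral
# displacement, and spectral power at the 0.25 Hz stimulus frequency (Fpo).
# Filter order, zero-phase filtering, and the power estimator are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000.0       # sampling frequency (Hz)
CUTOFF = 10.0     # Butterworth cut-off frequency (Hz)
STIM_FREQ = 0.25  # frequency of the rolling virtual room (Hz)

def preprocess(cop_ml):
    """Low-pass filter the mediolateral CoP trace (assumed 4th-order, zero-phase)."""
    b, a = butter(4, CUTOFF / (FS / 2), btype="low")
    return filtfilt(b, a, cop_ml)

def rms(cop_ml):
    """Variability of the CoP around its mean position."""
    centered = cop_ml - cop_ml.mean()
    return float(np.sqrt(np.mean(centered ** 2)))

def power_at(cop_ml, freq=STIM_FREQ):
    """Periodogram power at the stimulus frequency."""
    n = len(cop_ml)
    spectrum = np.abs(np.fft.rfft(cop_ml - cop_ml.mean())) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=1 / FS)
    return float(spectrum[np.argmin(np.abs(freqs - freq))])

# Usage on one 50 s recording (condition C2 or C3):
# x = preprocess(raw_cop_ml); print(rms(x), power_at(x))
```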

We performed a repeated measures analysis on the RMS and the Fpo for the age groups (adults; children) and the conditions (C1; C2; C3). The significance threshold was set to p < 0.05. We used Statistica version 13 to perform the analyses.

Except for the clustering analysis, we excluded four participants (AD6, CH6, CH7, and CH12) from the statistical analysis, as they showed distress, nervousness, and/or agitation during the recording, resulting in dramatic changes in their CoP behavior during some recordings.

4.3. Results

4.3.1. Displacements and root mean square (RMS)

As expected, we found a significant main effect of the participants' age on their RMS (F(1; 13) = 6.92; p < 0.05) across all conditions. The mediolateral RMS of the adults was smaller (M = 0.84; SD = 0.57) than that of the children (M = 1.70; SD = 1.12), indicating that the children were globally more variable than the adults (see Figure 3). The conditions (C1 vs. C2 vs. C3) did not impact the displacements of the CoP of the participants. The conditions × age interaction on the RMS was not significant.

Figure 3.

Mean RMS for the adults and children in the three conditions.

4.3.2. Displacements and frequency power response at 0.25 Hz (Fpo)

As expected, the main effect of condition on the Fpo was significant (F(2; 26) = 13.11; p < 0.05). Participants had the smallest Fpo in condition C1 (M = 0.17; SD = 0.16), followed by condition C2 (M = 0.34; SD = 0.38), with the highest Fpo in condition C3 (M = 0.67; SD = 0.31). This suggests that the participants' CoP displacements were coupled with the movement of the visual room when they were exposed to it and that this coupling was maximized in the more difficult stance. In addition, the Fpo was positively correlated with the RMS in condition C3 (R = 0.73; p < 0.01), indicating that the larger displacements of the CoP were possibly coupled with the RVR (i.e., the visual stimulus). We found a significant main effect of the participants' age (F(1; 13) = 5.37; p < 0.05): the Fpo of the adults was smaller (M = 0.29; SD = 0.33) than that of the children (M = 0.46; SD = 0.36) across all conditions. This suggests a higher coupling between the postural response to the RVR and the displacements of the CoP in children than in adults. However, the age × condition interaction on the Fpo was not significant. In Figure 4, we can observe that the children's and adults' postural responses to the visual stimulus are similar in condition C3, indicating that children with ASD do not respond more strongly to visual cues than adults with ASD. The frequency of 0.25 Hz is present in natural swaying [43], and the RMS was higher in children than in adults. As the age × condition interaction on the Fpo was not significant, we can assume that the children's higher Fpo over the whole experiment was induced by the greater variability of the displacements of their CoP. This result suggests that our participants' postural behavior was not driven by age as we expected; rather, their postural coupling to the virtual room was driven by the conditions (C1; C2; C3).

Figure 4.

Mean Fpo for the adults and children in the three conditions.

4.3.3. Grouping the participants

We performed a clustering analysis (dendrogram, Ward's method) on the AASP items on movement and visual sensory preferences, the RMS, and the Fpo of all 19 participants (12 children and seven adults with ASD) (see Table 2 for the specific items selected). We sought to identify whether the postural response to the visual stimulus and the AASP scores could discriminate our participants, as we aimed to use these profiles to propose personalized interactions with robots (a clustering sketch follows the group list below). The dendrogram gave us three groups (see Figure 5):

  • G1: eight participants: CH3; CH5; CH8; CH10; CH11; AD2; AD4; AD6.

  • G2: seven participants: CH1; CH4; CH7; CH9; CH12; AD1; AD3.

  • G3: four participants: CH2; CH6; AD5; AD7.
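A minimal sketch of this grouping step, assuming one standardized feature row per participant (the Table 2 variables) and substituting SciPy's Ward linkage for the Statistica implementation; the input file name and the z-scoring are assumptions.

```python
# Ward hierarchical clustering on one feature vector per participant (AASP
# movement/visual quadrant scores plus RMS and Fpo for C1-C3, per Table 2).
import numpy as np
from scipy.cluster.hierarchy import dendrogram, fcluster, linkage

features = np.loadtxt("participant_features.csv", delimiter=",")  # hypothetical file, shape (19, n_features)
z = (features - features.mean(axis=0)) / features.std(axis=0)     # z-score each column

link = linkage(z, method="ward")
groups = fcluster(link, t=3, criterion="maxclust")  # cut the tree into three groups
print(groups)      # cluster label (1-3) for each of the 19 participants
dendrogram(link)   # the tree of Figure 5 (rendering requires matplotlib)
```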

Figure 5.

Dendrogram analysis.

Movement low registration
Visual low registration
Movement sensation seeking
Visual sensation seeking
Movement sensory sensitivity
Visual sensory sensitivity
Movement sensation avoiding
Visual sensation avoiding
Mediolateral RMS for condition C1, C2, and C3
Fpo for conditions C1, C2, and C3 in mediolateral direction

Table 2.

AASP items and CoP behavior selected for the dendrogram analysis.

The RMS in all conditions for the three detected groups is shown in Figure 6. Figure 7 shows the Fpo in all conditions. In Figure 8, the mean AASP scores (movement sensory sensitivity (MSS), visual sensory sensitivity (VSS), and visual sensation avoiding (VSA)) for each group are illustrated.

Figure 6.

Histogram of the mean RMS for the groups defined by clustering analysis for the three conditions.

Figure 7.

Histogram of frequency power at 0.25 Hz in all conditions of the three groups.

Figure 8.

Histograms of three relevant AASP item scores for the groups.

A repeated measures analysis was applied to the RMS and the Fpo for the groups (G1; G2; G3) and the conditions (C1; C2; C3). We found no main effect of group on the RMS or the Fpo of the participants. However, we found a significant group × condition interaction effect on the RMS (F(4; 24) = 3.55; p < 0.05), see Figure 6. Participants from groups G1 and G2 showed great CoP variability in all conditions, unlike participants from group G3, who showed greater CoP variability only in condition C3. Figure 6 also shows that the RMS of participants in group G1 decreased from C1 to C3, whereas the RMS of participants in group G3 increased from C1 to C3. This indicates that participants from group G1 maximized the use of proprioception to reduce the effect of the visual stimulus in the unstable position. Similarly, we found a significant group × condition interaction effect on the Fpo (F(4; 24) = 9.79; p < 0.001), see Figure 7, indicating that each group had a different postural response to the visual stimulus. Figure 7 suggests that in condition C1, participants from groups G1 and G2 showed a higher coupling with the frequency of the visual stimulus than participants from group G3. As the participants are not exposed to the RVR in this condition, this result reflects the higher instability of these participants in comparison to group G3. The coupling with the rolling visual stimulus was similar in conditions C2 and C3 for participants from groups G1 and G2, indicating that the difficulty of the postural task (stable or unstable) did not increase the strength of the coupling with the rolling virtual room: being in a stable or unstable posture did not affect their response to the visual stimulus. Participants from G3 showed a greater coupling to the visual stimulus in condition C3 than in the other conditions, indicating that they responded more strongly to the visual stimulus in an unstable posture and are thus more visually dependent (see Figure 7). Furthermore, we examined the correspondence between the scores on the AASP items listed in Table 2 and the data obtained from the CoP recordings:

  • High scores in MSS (i.e., a tendency to respond quickly to movement stimuli) were inversely correlated with the mediolateral Fpo in condition C3 (R = 0.53; p < 0.05), indicating that participants whose AASP responses showed a tendency to respond quickly to movement stimuli were less driven by the visual stimulus in the unstable position.

  • High scores in VSS (i.e., a tendency to respond quickly to visual stimuli) were positively correlated with the mediolateral RMS in condition C3 (R = 0.61; p < 0.05). This suggests that participants whose AASP responses showed a tendency to respond quickly to visual stimuli were more unstable when exposed to the visual stimulus in the unstable position.

  • High scores in VSA (i.e., a tendency to be overwhelmed or bothered by visual sensory stimuli) were positively correlated with the mediolateral RMS in condition C3 (R = 0.59; p < 0.05), indicating that participants whose AASP responses showed a tendency to be overwhelmed or bothered by visual stimuli were more unstable when exposed to the visual stimulus in the unstable position.

Three of the selected AASP items were thus correlated with the variability of the postural response to a visual stimulus in an unstable position (C3), confirming that these AASP items match the behavioral responses of the participants.
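For illustration, one of these item-vs-posture correlations could be computed as below, assuming Pearson's r (the chapter reports R and p values but does not name the estimator); the arrays are hypothetical per-participant values, not the study data.

```python
# Correlation of an AASP item score with a CoP measure across participants.
# Pearson's r is an assumption; the values below are hypothetical.
from scipy.stats import pearsonr

vss_scores = [12, 9, 15, 7, 11, 14, 8, 10, 13, 6, 12, 9, 15, 11, 10]  # AASP VSS per participant
rms_c3 = [1.9, 1.1, 2.4, 0.8, 1.5, 2.2, 0.9, 1.3, 2.0, 0.7, 1.8, 1.0, 2.5, 1.6, 1.4]  # mediolateral RMS, C3

r, p = pearsonr(vss_scores, rms_c3)
print(f"VSS vs. mediolateral RMS (C3): R = {r:.2f}, p = {p:.3f}")
```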

4.4. Conclusion

Thanks to this experiment, we succeeded in forming three groups among our participants, describing each participant's response to visual and proprioceptive inputs:

  • Group G1 (participants CH3; CH5; CH8; CH10; CH11; AD2; AD4; AD6) includes participants with high scores in MSS, low scores in VSS, and low scores in VSA. These participants showed strong visual independence from the RVR, suggesting an overreliance on proprioceptive cues and hyporeactivity to visual cues.

  • Group G2 (participants CH1; CH4; CH7; CH9; CH12; AD1; AD3) includes participants with high scores in MSS, low scores in VSS, and low scores in VSA. These participants showed moderate reactivity to the RVR, suggesting that they rely evenly on visual and proprioceptive cues.

  • Group G3 (participants CH2; CH6; AD5; AD7) includes participants with low scores in MSS, high scores in VSS, and high scores in VSA. These participants showed hyperreactivity to the RVR, suggesting a hyporeactivity to proprioceptive cues and an overreliance on visual cues.

We found that, overall, children had greater CoP variability than adults, which was expected (due to biomechanical, anthropometric, and sensory integration factors and the maturation of these processes). The different conditions of posture and exposure to the visual stimulus (C1; C2; C3) did not impact the variability of the CoP of our participants. However, the participants' CoP displacements were more strongly coupled with the frequency of the visual stimulus in condition C3 than in conditions C1 and C2. We observed that the coupling to the frequency of the visual stimulus was higher in children than in adults over the whole experiment, but not within the conditions. As the children's CoP was more variable and as the frequency of 0.25 Hz (i.e., the frequency of our visual stimulus) is present in natural swaying [43], we posit that the children's higher coupling to this frequency over the whole experiment was induced by the greater variability of the displacements of their CoP. This result suggests that our participants' postural behavior was not driven by age as we expected.

Among TD individuals, children show more dramatic postural reactions to a visual stimulus [44]. In children with ASD, sensory integration is suggested to differ from that of TD individuals [8, 14, 24, 25]. We found in this experiment that the age of the participants with ASD did not impact the strength of the postural coupling with the visual stimulus, unlike in TD individuals. As in TD individuals, children with ASD had more variability in the displacement of their CoP than adults with ASD.

We also observed great variability in our participants' postural behavior. In our results, we found three groups with different sensory integration among our participants. As in Refs. [46] and [47], we found among our participants individuals (both children and adults) with weak proprioceptive integration and strong visual dependency. In Ref. [46], the authors found an impairment of proprioceptive input in autism: children with ASD used more visual cues to reduce sway and maintain balance. In Ref. [47], the authors found that, unlike typically developed individuals, individuals with ASD have impaired proprioceptive development. Their sensorimotor signals appear to remain at the kinesthetic stage of typically developed 3-4-year-old children, so they have to rely on visual inputs. The authors also conjectured that the impaired proprioception of physical micromovements in individuals with ASD also impedes their visual perception of micromovements in others during real-time interactions, impairing their ability to interact with people. And, as in Refs. [8] and [25], we found among our participants individuals relying more on proprioceptive inputs, with weak visual dependency.

With these results and our hypothesis H1, we are able to make assumptions about the behavior each individual will display during human-robot interaction sessions. We therefore posit that individuals from group G1 will have less successful social interactions than those from groups G2 and G3 and that individuals from group G3 will have the most successful social interactions.


5. Greetings with Nao

5.1. Objectives

A first analysis of the participants' behavior toward the Nao robot was conducted. Nao is a mini humanoid robot developed by SoftBank Robotics (formerly Aldebaran Robotics). The purpose of the interaction was to present the robot to the children and adults with ASD for a short duration (up to 2 minutes). Indeed, some individuals with ASD are averse to unusual events and changes in their daily routine, so the robot was introduced gradually to avoid fear of it. In addition, in Ref. [48], the authors observed that children who saw the robot act in a social-communicative way were more likely to follow its gaze than those who did not. Hence, we believe that gradually introducing Nao as a social partner in the context of a short greeting task may help the participants interact with the robot in further experiments.

We also wanted to verify that the behavior of our participants was linked to their proprioceptive and visual profiles as described in Section 4. To do so, we analyzed videos of the interaction with the Nao robot and annotated our participants' social behaviors following the items described in Table 3.

Smiles, laughter
Speech to Nao on his/her own initiative
Speech to Nao after being encouraged by his/her caregiver
Social gesture toward Nao (waving back)
Gaze of the participant toward:
  1. The robot
  2. His/her caregiver
  3. Other (somewhere else)

Table 3.

Description of the tracked social behaviors.

5.2. Method

The scenario of this first interaction with the robot consisted of two phases. First, the greeting: the participant was seated in front of the robot, and Nao said "Hello, I am Nao. You and I, we are going to be friends" while waving to him/her (Figure 9). Then, if the participant was verbal, the robot asked his/her name and repeated it. Second, the dance: the robot asked whether the participant wanted it to dance, and then danced (Figure 10). During the interaction with the robot, all participants were accompanied by their caregiver. The caregivers were instructed to encourage the participants to look at and answer the robot.

Figure 9.

Nao greets a child.

Figure 10.

Nao dances for a child.

5.3. Data analysis

We analyzed the videos of the interaction with the robot and observed the parameters described in Table 3 for each participant. A first coder (the first author) annotated all of the videos of the interaction. A second coder, unaware of the hypotheses of the setup, annotated 21% of the videos, randomly selected. The intraclass correlation coefficient (ICC) was used to ensure intercoder reliability. The ICC score was 0.99, indicating very good reliability.
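Intercoder agreement of this kind can be computed as sketched below, using the pingouin library; the long-format layout, the scores, and the choice of ICC variant are assumptions, as the chapter does not say which software or variant was used.

```python
# ICC between the two coders' annotations on the doubly coded video segments.
# The data layout and scores below are hypothetical.
import pandas as pd
import pingouin as pg

# One row per (video segment, coder) pair
df = pd.DataFrame({
    "segment": [1, 1, 2, 2, 3, 3, 4, 4],
    "coder":   ["A", "B"] * 4,
    "score":   [5, 5, 3, 4, 7, 7, 2, 2],
})
icc = pg.intraclass_corr(data=df, targets="segment", raters="coder", ratings="score")
print(icc[["Type", "ICC"]])  # table of ICC variants; pick the one matching the design
```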

AD3 was removed from the statistical analysis, as he had been increasingly withdrawn from social interaction for a few weeks and was unwilling to participate in the tasks. CH5 was removed from the statistical analysis of the speech data, as this participant is nonverbal. For the gaze and smile behavior analyses, we performed a one-way ANOVA among the groups. We performed Fisher's exact test on the speech and gesture behaviors. The significance threshold was set to p < 0.05. We used Statistica version 13 to perform the analyses.
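A sketch of these group comparisons follows, with SciPy standing in for Statistica. The gaze values are the "gaze toward Nao" percentages from Table 4 (AD3 excluded, as above); SciPy's fisher_exact handles only 2×2 tables, so the three-group speech comparison is collapsed here to G1 versus G2+G3, a simplification of the reported test.

```python
# Group comparisons on the annotated behaviors, using SciPy instead of Statistica.
from scipy.stats import f_oneway, fisher_exact

gaze_g1 = [63.1, 81.90, 100.0, 88.52, 83.28, 61.73, 83.05, 23.14]
gaze_g2 = [74.61, 88.31, 85.13, 71.68, 81.24, 94.47]  # AD3 removed
gaze_g3 = [91.86, 99.09, 99.12, 66.70]

f_stat, p_gaze = f_oneway(gaze_g1, gaze_g2, gaze_g3)  # one-way ANOVA on gaze

# Spontaneous speech to Nao: [spoke, did not speak], collapsed to 2x2
speech = [[1, 6],   # G1: 1 of 7 verbal participants (CH5 excluded)
          [8, 2]]   # G2 + G3: 5/6 + 3/4
odds, p_speech = fisher_exact(speech)
print(f"gaze ANOVA p = {p_gaze:.3f}; speech Fisher p = {p_speech:.3f}")
```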

5.4. Results

The participants' behaviors are described in Tables 4 and 5. Overall, except for AD6 (G1) and AD3 (G2), participants from all groups looked at the robot more than 60% of the time. No statistically significant relation was found between the participants' gazing behavior and the groups. However, we can still observe in Figure 11 the frequency/percentage distribution of the participants' gaze toward the robot among the groups. Participants from group G3 appeared to show more interest in (gaze more frequently at) the robot than the two other groups, and participants from group G1 appeared to gaze less at the robot than the two other groups. Participants from all groups smiled during the interaction. No statistically significant relation was found between smiling behavior and the groups. However, we can still notice in Figure 11 that participants from group G1 appeared to smile more than participants from groups G2 and G3. The number of participants speaking to Nao on their own initiative was significantly different among the groups (p < 0.05). Only one participant out of seven from group G1 responded to the robot on his/her own initiative, whereas five out of six participants from group G2 and three out of four participants from group G3 spoke to Nao on their own initiative. Participants CH8, CH11, and AD2 from group G1 were the only participants who did not respond to the robot with or without encouragement. No statistically significant relation was found between social gestures and the groups. However, we observed that participants from group G1 did not show social gestures toward the robot, whereas two participants from group G2 and two participants from group G3 did.

Group | ID# | Gaze: Nao | Gaze: caregiver | Gaze: other | Smiles, laughter | Speech to Nao (own initiative) | Speech to Nao (after caregiver encouragement) | Social gesture toward Nao (waving back)
G1 | CH3 | 63.1% | 18.17% | 18.73% | 19.93% | 0 | 1 | 0
G1 | CH5 | 81.90% | 2.76% | 15.34% | 0% | – | – | 0
G1 | CH8 | 100% | 0% | 0% | 100% | 0 | 0 | 0
G1 | CH10 | 88.52% | 5.35% | 6.13% | 43% | 0 | 2 | 0
G1 | CH11 | 83.28% | 0% | 16.72% | 0% | 0 | 0 | 0
G1 | AD2 | 61.73% | 34.13% | 4.13% | 71.22% | 0 | 0 | 0
G1 | AD4 | 83.05% | 14.15% | 2.81% | 91.47% | 4 | 1 | 0
G1 | AD6 | 23.14% | 0% | 76.86% | 8.99% | 0 | 1 | 0
G2 | CH1 | 74.61% | 1.59% | 23.8% | 0% | 0 | 1 | 0
G2 | CH4 | 88.31% | 0% | 11.69% | 0% | 2 | 0 | 0
G2 | CH7 | 85.13% | 4.21% | 10.66% | 58.61% | 1 | 2 | 0
G2 | CH9 | 71.68% | 0% | 28.32% | 6.54% | 1 | 1 | 0
G2 | CH12 | 81.24% | 7.74% | 11.02% | 4.85% | 2 | 2 | 2
G2 | AD1 | 94.47% | 2.33% | 3.20% | 46.37% | 4 | 0 | 2
G2 | AD3 | 18.87% | 2.04% | 79.1% | 3.26% | 0 | 2 | 0
G3 | CH2 | 91.86% | 0.79% | 7.35% | 0% | 1 | 1 | 1
G3 | CH6 | 99.09% | 0% | 0.91% | 22.09% | 2 | 0 | 3
G3 | AD5 | 99.12% | 0% | 0.88% | 61.81% | 1 | 0 | 0
G3 | AD7 | 66.70% | 5.58% | 27.72% | 12.34% | 2 | 0 | 0

Table 4.

Participants’ behavior during the interaction with the robot.

Group | ID# | Comments
G1 | CH3 | He was really amazed by the robot and switched his gaze to his caregiver to share his amazement.
G1 | CH5 | Her caregiver was really impressed by the concentration she showed for the robot.
G1 | CH8 |
G1 | CH10 |
G1 | CH11 | His caregiver was impressed that he looked at the robot so much.
G1 | AD2 | She talked to and looked at her caregiver more than at Nao but was enthusiastic about participating in the task.
G1 | AD4 | She was really amazed by the robot, and the task matched one of her routines: asking the name of the person near her.
G1 | AD6 |
G2 | CH1 |
G2 | CH4 | He was really excited and impressed to see the robot (this boy asked to participate in the project).
G2 | CH7 | At first, he was impressed by the robot and stood back. He finally approached it and gave it a kiss at the end of the interaction.
G2 | CH9 | She was impressed by the robot and stood back. She moved several times across the room without paying much attention to the robot.
G2 | CH12 |
G2 | AD1 | She was scared of the robot when its arms moved toward her.
G2 | AD3 | He was reported to have been more withdrawn for a few weeks and unwilling to participate in tasks.
G3 | CH2 |
G3 | CH6 | He was particularly enthusiastic to see the robot moving and talking to him (saying his name, waving).
G3 | AD5 | He showed some reluctance toward the robot, saying "we should put it in the garbage."
G3 | AD7 | His caregiver reported that he is really shy toward new things.

Table 5.

Descriptive comments on the behavior of the participants during the interaction.

Figure 11.

Plot of the percentage of participants’ gaze toward the robot and smiles.

5.5. Conclusion

The presentation of the Nao robot to the participants with ASD permitted us to introduce it as a social partner. Most of the participants answered it, and some made social gestures toward it. This introduction may help the participants interact more easily with the robot in further experiments, as found in Ref. [48]. We also removed the "surprise" and "novelty" effect of the robot. Some participants appeared slightly afraid of and impressed by the robot; these participants seemed reassured by the end of the interaction. Participants smiled frequently and looked toward the robot a great amount of the time. The statistical analysis showed a relation only between the participants' groups and their self-initiated answers to Nao: participants from group G3 showed more spontaneous speech to Nao than the two other groups, and participants from group G1 showed less than the others. Still, we can observe that participants from group G3 appeared to gaze at the robot and make social gestures more than participants from group G1, and that participants from group G1 gazed at the robot and made social gestures less than participants from groups G2 and G3. However, participants from G1 appeared to smile more than participants from groups G2 and G3. Unfortunately, this first experiment did not permit us to validate that the behavior of the participants was linked to their proprioceptive and visual profiles. However, we have some encouraging results going in the direction of our hypothesis.


6. General conclusion and discussion

The long-term goal of our work is to define individual profiles in order to develop new personalized robot-based social interactions for individuals with ASD. We hypothesized that hyporeactivity to visual motion and an overreliance on proprioceptive information would be linked to difficulties in integrating social cues and in engaging in successful interactions. We worked with 19 children, teenagers, and adults with ASD from three care facilities.

Our first experiment enabled us to form three groups among our participants, describing each participant's response to visual and proprioceptive inputs. With these results and our hypothesis H1, we were able to make assumptions about the behavior each individual would display during human-robot interaction sessions. We posit that individuals from group G1 will have less successful interactions than those from groups G2 and G3 and that individuals from group G3 will have the most successful interactions.

A first interaction with the robot was then conducted: the robot Nao was presented to our 19 participants. The purpose of this experiment was to present the robot Nao as a social companion and to avoid fear of or stress toward the robot in future experiments. We did not observe a direct link between the behavior of the participants toward the robot and their proprioceptive and visual profiles, but we still found encouraging results going in the direction of our hypothesis. As has already been seen in SAR, almost all of our participants (children, teenagers, and adults) showed great interest in their new mechanical companion.

Defining such individual profiles prior to social interactions with a robot could provide promising strategies for designing successful and adapted HRI for individuals with ASD. The behavior of our participants in emotion recognition [49] and joint attention [50] tasks has already been studied. We are currently planning to investigate these issues in repeated interactions involving imitation, where the behavior of the robot is adapted to the profile of the participant.

References

  1. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders (DSM‐5). American Psychiatric Publishing; 2013.
  2. Charman, T, Swettenham, J, Baron‐Cohen, S, Cox, A, Baird, G, and Drew, A. Infants with autism: an investigation of empathy, pretend play, joint attention, and imitation. Developmental Psychology. 1997;33(5):781.
  3. Hart, M. Autism/excel study. Proceedings of the 7th International ACM SIGACCESS Conference on Computers and Accessibility. 2005:136–141.
  4. Tapus, A, Peca, A, Aly, A, Pop, C, Jisa, L, Pintea, S, Rusu, AS, and David, DO. Children with autism social engagement in interaction with Nao, an imitative robot-A series of single case experiments. Interaction Studies. 2012;13(3):315–347.
  5. Kim, ES, Berkovits, LD, Bernier, EP, Leyzberg, D, Shic, F, Paul, R, and Scassellati, B. Social robots as embedded reinforcers of social behavior in children with autism. Journal of Autism and Developmental Disorders. 2013;43(5):1038–1049.
  6. Wainer, J, Dautenhahn, K, Robins, B, and Amirabdollahian, F. Collaborating with Kaspar: Using an autonomous humanoid robot to foster cooperative dyadic play among children with autism. 10th IEEE‐RAS International Conference on Humanoid Robots (Humanoids), 2010. 2010:631–638.
  7. Kozima, H, Nakagawa, C, and Yasuda, Y. Interactive robots for communication‐care: A case‐study in autism therapy. IEEE International Workshop on Robot and Human Interactive Communication, 2005. ROMAN 2005. 2005:341–346.
  8. Haswell, CC, Izawa, J, Dowell, LR, Mostofsky, SH, and Shadmehr, R. Representation of internal models of action in the autistic brain. Nature Neuroscience. 2009;12(8):970–972.
  9. Goodwin, GM, McCloskey, DI, and Matthews, PBC. Proprioceptive illusions induced by muscle vibration: Contribution by muscle spindles to perception? Science. 1972;175(4028):1382–1384.
  10. Burke, D, Hagbarth, K‐E, Löfstedt, L, and Wallin, BG. The responses of human muscle spindle endings to vibration of non‐contracting muscles. The Journal of Physiology. 1976;261(3):673–693.
  11. Roll, JP, and Vedel, JP. Kinaesthetic role of muscle afferents in man, studied by tendon vibration and microneurography. Experimental Brain Research. 1982;47(2):177–190.
  12. Ferrell, WR, Gandevia, SC, and McCloskey, DI. The role of joint receptors in human kinaesthesia when intramuscular receptors cannot contribute. The Journal of Physiology. 1987;386(1):63–71.
  13. Edin, BB. Cutaneous afferents provide information about knee joint movements in humans. The Journal of Physiology. 2001;531(1):289–297.
  14. Gowen, E, and Hamilton, A. Motor abilities in autism: A review using a computational context. Journal of Autism and Developmental Disorders. 2013;43(2):323–344.
  15. Bagust, J, Docherty, S, Haynes, W, Telford, R, and Isableu, B. Changes in rod and frame test scores recorded in schoolchildren during development-A longitudinal study. PloS One. 2013;8(5):e65321.
  16. Assaiante, C. Development of locomotor balance control in healthy children. Neuroscience & Biobehavioral Reviews. 1998;22(4):527–532.
  17. Simmons, DR, Robertson, AE, McKay, LS, Toal, E, McAleer, P, and Pollick, FE. Vision in autism spectrum disorders. Vision Research. 2009;49(22):2705–2739.
  18. Liu, W, and Chepyator‐Thomson, JR. Field dependence‐independence and physical activity engagement among middle school students. Physical Education and Sport Pedagogy. 2009;14(2):125–136.
  19. Saracho, O. Matching teachers’ and students’ cognitive styles. Early Child Development and Care. 2003;173(2–3):161–173.
  20. Coates, S, Lord, M, and Jakabovics, E. Field dependence‐independence, social‐non‐social play and sex differences in preschool children. Perceptual and Motor Skills. 1975;40(1):195–202.
  21. Bar‐Haim, Y, and Bart, O. Motor function and social participation in kindergarten children. Social Development. 2006;15(2):296–310.
  22. Greffou, S, Bertone, A, Hahler, E‐M, Hanssens, J‐M, Mottron, L, and Faubert, J. Postural hypo‐reactivity in autism is contingent on development and visual environment: a fully immersive virtual reality study. Journal of Autism and Developmental Disorders. 2012;42(6):961–970.
  23. Kohen‐Raz, R, Volkman, FR, and Cohen, DJ. Postural control in children with autism. Journal of Autism and Developmental Disorders. 1992;22(3):419–432.
  24. Gepner, B, Mestre, D, Masson, G, and de Schonen, S. Postural effects of motion vision in young autistic children. NeuroReport. 1995;6(8):1211–1214.
  25. Gepner, B, and Mestre, DR. Brief report: postural reactivity to fast visual motion differentiates autistic from children with Asperger syndrome. Journal of Autism and Developmental Disorders. 2002;32(3):231–238.
  26. Feil‐Seifer, D, and Mataric, M. Defining socially assistive robotics. 9th International Conference on Rehabilitation Robotics, 2005. ICORR 2005. 2005:465–468.
  27. Dautenhahn, K, Nehaniv, CL, Walters, ML, Robins, B, Kose‐Bagci, H, Mirza, NA, and Blow, M. KASPAR-A minimally expressive humanoid robot for human‐robot interaction research. Applied Bionics and Biomechanics. 2009;6(3–4):369–397.
  28. Scassellati, B, Admoni, H, and Mataric, M. Robots for use in autism research. Annual Review of Biomedical Engineering. 2012;14:275–294.
  29. Salter, T, Michaud, F, and Larouche, H. How wild is wild? A taxonomy to characterize the ‘wildness’ of child‐robot interaction. International Journal of Social Robotics. 2010;2(4):405–415.
  30. Thill, S, Pop, CA, Belpaeme, T, Ziemke, T, and Vanderborght, B. Robot‐assisted therapy for autism spectrum disorders with (partially) autonomous control: Challenges and outlook. Paladyn, Journal of Behavioral Robotics. 2012;3(4):209–217.
  31. Brown, C, and Dunn, W. Adolescent‐Adult Sensory Profile: User's Manual. Therapy Skill Builders, Tucson, AZ; 2002.
  32. Jones, CRG, Happe, F, Baird, G, Simonoff, E, Marsden, AJS, Tregay, J, Phillips, RJ, Goswami, U, Thomson, JM, and Charman, T. Auditory discrimination and auditory sensory behaviours in autism spectrum disorders. Neuropsychologia. 2009;47(13):2850–2858.
  33. Ferguson, H, Myles, BS, and Hagiwara, T. Using a personal digital assistant to enhance the independence of an adolescent with Asperger syndrome. Education and Training in Developmental Disabilities. 2005;40(1):60–67.
  34. Crane, L, Goddard, L, and Pring, L. Sensory processing in adults with autism spectrum disorders. Autism. 2009;13(3):215–228.
  35. Dunn, W. The Sensory Profile: User's Manual. Psychological Corporation, San Antonio, TX; 1999.
  36. Bray, A, Subanandan, A, Isableu, B, Ohlmann, T, Golding, JF, and Gresty, MA. We are most aware of our place in the world when about to fall. Current Biology. 2004;14(15):R609–R610.
  37. Isableu, B, Fourre, B, Vuillerme, N, Giraudet, G, and Amorim, M‐A. Differential integration of visual and kinaesthetic signals to upright stance. Experimental Brain Research. 2011;212(1):33–46.
  38. Kluzik, J, Peterka, RJ, and Horak, FB. Adaptation of postural orientation to changes in surface inclination. Experimental Brain Research. 2007;178(1):1–17.
  39. Isableu, B, and Vuillerme, N. Differential integration of kinaesthetic signals to postural control. Experimental Brain Research. 2006;174(4):762–768.
  40. Isableu, B, Ohlmann, T, Cremieux, J, Vuillerme, N, Amblard, B, and Gresty, MA. Individual differences in the ability to identify, select and use appropriate frames of reference for perceptuo‐motor control. Neuroscience. 2010;169(3):1199–1215.
  41. Ravaioli, E, Oie, KS, Kiemel, T, Chiari, L, and Jeka, JJ. Nonlinear postural control in response to visual translation. Experimental Brain Research. 2005;160(4):450–459.
  42. Greffou, S, Bertone, A, Hanssens, J‐M, and Faubert, J. Development of visually driven postural reactivity: A fully immersive virtual reality study. Journal of Vision. 2008;8(11):15.
  43. Van Asten, WNJC, Gielen, CCAM, and van der Gon, JJD. Postural adjustments induced by simulated motion of differently structured environments. Experimental Brain Research. 1988;73(2):371–383.
  44. Schmuckler, MA. Children's postural sway in response to low‐ and high‐frequency visual information for oscillation. Journal of Experimental Psychology: Human Perception and Performance. 1997;23(2):528.
  45. Chiari, L, Rocchi, L, and Cappello, A. Stabilometric parameters are affected by anthropometry and foot placement. Clinical Biomechanics. 2002;17(9):666–677.
  46. Molloy, CA, Dietrich, KN, and Bhattacharya, A. Postural stability in children with autism spectrum disorder. Journal of Autism and Developmental Disorders. 2003;33(6):643–652.
  47. Torres, EB, Brincker, M, Isenhower, RW, Yanovich, P, Stigler, KA, Nurnberger, JI, Metaxas, DN, and José, JV. Autism: The micro‐movement perspective. Frontiers in Integrative Neuroscience. 2013;7:32.
  48. Meltzoff, AN, Brooks, R, Shon, AP, and Rao, RPN. “Social” robots are psychological agents for infants: A test of gaze following. Neural Networks. 2010;23(8):966–972.
  49. Chevalier, P, Martin, J‐C, Isableu, B, Bazile, C, and Tapus, A. Impact of sensory preferences of individuals with autism on the recognition of emotions expressed by two robots, an avatar, and a human. Autonomous Robots. 2016;40(5):1–23.
  50. Chevalier, P, Martin, J‐C, Isableu, B, Bazile, C, Iacob, DO, and Tapus, A. Joint attention using human‐robot interaction: Impact of sensory preferences of children with autism. 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN). New York; 2016:849–854.
