Procedia Computer Science 67 (2015) 241 - 251
6th International Conference on Software Development and Technologies for Enhancing Accessibility and Fighting Infoexclusion (DSAI 2015)
Do children in the spectrum of autism interact with real-time emotionally expressive human controlled avatars?
Olga Mantziou, Ioannis Vrellis, Tassos A. Mikropoulos*
Educational Approaches to Virtual Reality Technologies Laboratory, Department of Primary Education, The University of Ioannina, Ioannina 45110, Greece
Abstract
Children with Autism Spectrum Disorder (ASD) are characterized by impairments in social skills and usually face difficulties in recognizing facial emotion expressions. Early intervention and treatment are of major concern, and a number of detection and teaching methods have therefore been developed, including the use of ICT. This article presents a concise literature review on the way tutors are represented in digital environments. The results showed that there is a need for further investigation into the effectiveness of the methods used for the interaction of children with ASD with technology tools, as well as into knowledge transfer. Since there is a lack of empirical data concerning the preference of children with ASD for different interaction modalities, this article also reports on an exploratory study conducted to investigate the acceptance and preference of three different real-time modalities used in facial emotion recognition by two children with ASD (low and high functioning autism). The results indicated a discrepancy between the two children, which can mainly be attributed to the differences accompanying the categorization of children with ASD into low and high functioning autism.
© 2015 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
Peer-review under responsibility of the organizing committee of the 6th International Conference on Software Development and Technologies for Enhancing Accessibility and Fighting Info-exclusion (DSAI 2015)
Keywords: Autism Spectrum Disorder; Facial Emotion Recognition; low and high functioning autism; avatar
1. Introduction
Autism Spectrum Disorder (ASD) is a term characterizing a group of disorders which includes Autism, Asperger, Rett's and Childhood Disintegrative Disorders as well as Pervasive Developmental Disorder-Not Otherwise Specified [1]. This range of disorders is under the umbrella of Pervasive Developmental Disorders and
* Corresponding author. Tel.: +30-2651005697; fax: +30-2651005854. E-mail address: amikrop@uoi.gr
doi: 10.1016/j.procs.2015.09.268
is therefore related to disorders of child development, a matter of major concern. Demographic data show an alarming increase in the number of people diagnosed with ASD [2], while early intervention and treatment of its symptoms is crucial to child development, which in turn calls for immediate attention from the scientific community [3].
According to the American Psychiatric Association Diagnostic and Statistical Manual (DSM-IV-TR) [1] and the International Classification of Diseases (ICD-10) [4], there is a triad of diagnostic criteria for ASD, and especially for Autism. These concern qualitative impairments in social interaction and communication (verbal and non-verbal), as well as restricted, repetitive and stereotyped patterns of behavior, interests and activities. In Asperger syndrome, the difference is that there is no recorded delay in language or cognitive development. The above impairments also include deficits in emotion processing: the use of multiple nonverbal behaviors such as eye-to-eye gaze, facial expression, body postures, and gestures to regulate social interaction. According to Herba & Phillips [5] and Begeer et al. [6], these items reflect deficits in the production of an emotional state and the regulation of that state, the two core elements of emotion processing. Deficits in the identification of emotional cues, including facial ones, are not a criterion for ASD diagnosis. Nevertheless, they are commonly assumed to be a critical factor in studying the behavior of children with ASD. More specifically, a large number of individuals with autism present difficulties in recognizing facial emotion expressions, especially those representing complex emotions (e.g., jealousy, embarrassment, pride) or mental states [7, 8]. These difficulties could have their origins either in an impairment in configural processing of facial emotion recognition, which often leads children with ASD to attend only to local facial features, making them incapable of holistic emotional perception [9], or in abilities concerning Theory of Mind (the ability to attribute mental states to oneself and to other people) [10]. Other researchers attribute the deficits to neurological causes, such as the atypical amygdala functioning hypothesis [11].
The literature reveals no consistency among researchers about a typical profile of characteristics of individuals with ASD regarding the recognition of facial emotional expressions. This is attributed to the high inter-individual variability among individuals with ASD, which is usually related more to the Intelligence Quotient (IQ) of individuals and less to other demographic characteristics. Indeed, there is no upper ceiling on the IQ of a person with ASD. Hence, in the case of Autistic Disorder, researchers usually refer to high functioning autism (normal or high IQ) and low functioning autism (low IQ) to denote the difference in cognitive skills. Moreover, some researchers assume that there are probably compensatory mechanisms in individuals with ASD which account for performance in cognitive tasks and thus lead to the appearance of an atypical processing style in every individual [7]. However, regardless of the range or the intensity of the recorded deficits in abilities associated with the expression or recognition of facial expressions, it can be said that these reflect difficulties in social communication, because the human face is regarded as fundamental both in the production and recognition of emotions and in communication. Thus, it is extremely important for children in the spectrum of autism to learn how to recognize facial emotion expressions in order to facilitate their social life.
Intervention and assessment methods related to Facial Emotion Recognition (FER), apart from the traditional ones, are supported by Information and Communication Technologies (ICT). Many studies investigate the effect of various intervention methods on FER, including Virtual Reality (VR). However, the majority of the studies use pre-constructed learning material (pictures, videos, pre-animated avatars) in order to represent the various emotional expressions to be learned. Regardless of the way tutors are represented (e.g. pre-animated avatars), there is a lack of real-time communication between children and tutors. Furthermore, the acceptance of the technological tool by the participant with ASD is taken for granted.
This article presents a concise literature review on the way the tutor is represented in FER treatment. Moreover, it presents an exploratory study regarding the acceptance of emotionally expressive human controlled avatars.
2. Literature Review
This is a review of the way tutors are represented in digital environments, as well as of the interaction between children and the various facial expression representations.
Most ICT-based intervention methods focus on the recognition of basic emotions from pictures or photographs. Bölte and colleagues [12] developed a program for the teaching and assessment of FER which consisted of 1000 photographs of male and female adult actors reflecting the basic facial expressions, namely anger, disgust, fear, happiness, sadness and surprise [13]. Ten adolescent participants with high functioning autism and Asperger syndrome had to accomplish tasks involving the matching of an emotion word label on the screen with the photo, or part of the photo (eyes, mouth), presented. According to the authors, the results demonstrated the effectiveness of teaching FER with such learning material.
Recent approaches to teaching facial emotion expressions to individuals with ASD use more interactive, game-based solutions. "FaceSay" [14], a software application for teaching facial expressions, was presented to 49 students (6-15 years) with high- and low-functioning autism and appeared to improve their skills in this area. The tasks included eye-gaze attention and the recognition of facial expressions from photos of persons. "Let's Face It! Emotion Skills Battery" is a tool developed by Tanaka et al. [15] to assess the ability of FER in children with autism; it was administered to 85 adolescents and young adults, as well as to a control group of 130 typically developing participants. All users completed assignments such as matching word labels to facial emotions, matching the emotional expression of a particular face to the same facial expression on three probe faces of different identities, and matching a facial emotion expression to the suitable isolated part of it. The material consisted of photos of people depicting the six basic facial emotions. In summary, the findings showed that the participants with ASD recognized the six basic facial emotional expressions, but failed to generalize them across different identities or to use holistic approaches to facial emotion recognition. Furthermore, another software tool called "Mind Reading" [16] was administered to 22 participants with high functioning autism and Asperger syndrome. "Mind Reading" comprises a library of videos of facial emotional expressions and mental states. Each emotion is depicted in silent films of faces, voice recordings and written stories of situations. Access to this emotion database is provided through the following applications: (a) an emotion library where users can look into the material of the emotions and make comments, notes or even comparisons, (b) a learning centre where lessons and quizzes are offered in a more structured and repetitive manner, and (c) a game zone.
According to the assessment results which followed the teaching sessions, students improved well in the recognition of complex emotions and mental states, but did not perform well on generalization tasks. The same software was also used by Lacava et al. [17] in a study where four boys with ASD were instructed in FER and social behavior change. The boys improved in both types of tasks, although no strong causal relations among the variables were measured. The effectiveness of another educational program, the "Transporters", especially designed for young children with autism (2-8 years), was investigated in three studies. The software includes fifteen episodes about key emotional feelings or mental states (happy, sad, angry, afraid, disgusted, surprised, excited, tired, unfriendly, kind, sorry, proud, jealous, joking and ashamed) and quizzes. The key characters are vehicles such as trains, trams and trucks with a human face featured on the front of them. In a study by Golan and colleagues [18], children with ASD (4-7 years) watched the Transporters DVD for a period of four weeks and were assessed in defining emotion words as well as in matching familiar or novel situations to facial expressions. The authors concluded that this method led to a significant development of emotion recognition in faces. Moreover, Young & Posselt [19] evaluated the Transporters application and compared it to a control DVD (Thomas the tank engine, a TV series). They conducted a study with children with ASD (4-8 years). The results indicated a significant improvement in emotional expression recognition, which was assessed by showing pictures of basic and complex facial expressions and by the NEPSY-II Affect Recognition Task. Finally, Williams, Gray & Tonge [20] experimented with teaching facial emotional expressions to children with Autism and co-morbid Intellectual Disability (4-7 years). Again, a comparison method was used, including the control DVD of "Thomas the tank engine".
Overall, findings showed limited efficacy in teaching basic emotion recognition skills to young children with autism with a lower range of cognitive ability. The improvement was limited to the recognition of expressions of anger, with poor maintenance of these skills at follow-up.
Furthermore, a number of computer-based game tools, similar to those mentioned above, were designed for the treatment of facial emotion recognition but have not yet shown empirically validated results [21, 22, 23, 24]. Kaliouby & Robinson [21] presented the "Emotional Hearing Set" (EHS), a portable assistive tool designed for people with mild symptoms of ASD or Asperger Syndrome. Based on the assumption that the "reliability of facial expressions as indicators of emotion is significantly improved when they are perceived in relation to contextual events instead of in isolation", the EHS combines facial emotional expressions with "temporal and contextual clues" presented in the frames of the video showing a facial expression. In addition, Tseng & Yi-Luen Do [23] designed, for the same purposes, a tool named "Facial Expression Wonderland" (FEW), with cartoons based on the story of "Alice in
Wonderland". The users have to match an emotion to the facial expression presented in a cartoon story, or select options/stories according to a specific facial expression.
In some other studies, researchers experimented with the use of avatars in the teaching and assessment of facial emotion expressions. Miranda et al. [25] developed the game "LIFE is GAME", where children with ASD are encouraged to: (a) identify a specific expression from a set of presented expressions regarding full faces, half faces or mixed faces of avatars in comic cartoon format, (b) construct a facial expression in order to match a specific emotion, (c) modify the expression of an avatar according to given instructions, or (d) perform expressions according to the situations presented in stories. The game was presented for assessment purposes to nine participants with Autism (six with high functioning autism) and Asperger syndrome, who played only its first session, the identification of facial expressions. According to preliminary results, there was a positive relation between the use of the game and the development of facial emotion recognition skills. Moreover, Fabri et al. [26] developed an Instant Messaging tool called "Virtual Messenger" which "allowed two users to virtually enter a meeting place in order to communicate and collaborate on a given task. Users see each other's virtual representations and chat with each other, as well as visibly express emotions via their animated avatar heads". Although the authors assumed that such tools can be useful in teaching facial emotions to people with ASD, in a case study of 30 individuals with high functioning autism and four with low functioning autism [27, 26] they developed a single-user computer system which concerned four basic facial expressions (happy, sad, angry and frightened) and included three stages related to the identification of the facial emotion expression of a drawn character, the prediction of the emotion caused by certain drawn events presented, and vice-versa. Overall, the children with high functioning autism managed to use the game at a level higher than that of chance.
In contrast, the children with low functioning autism presented difficulties in understanding the emotions of the avatars. Thus, the authors suggested studying the use of this specific tool with individuals with high functioning autism for more complete validity, and noted the need for further work to verify a long-term benefit for participants. In another study, Valeria & Theng [28] developed the tool "Learn with Me", which is based on the concept of Affective Tutoring Systems (ATS). "Learn with Me" includes videos and songs related to specific basic emotions as learning content for teaching and assessment, and a Behavior Monitoring System which recognizes the facial expressions of the user through a typical camera. Based on the information about facial expressions and the labels matched to them, the fourth component, a Virtual Tutor, includes a video of a real tutor who responds and encourages the user to continue with the learning material. The results from an experiment conducted to evaluate the effectiveness of the tool for children with disabilities aged 8-15 years (10 of whom were diagnosed with ASD) showed positive outcomes for students' performance with the technological tool and the learning content. Another system, designed by Cheng & Ye [29], represents two settings, a classroom and an outdoor setting. Two 3D animated social context scenes with read-aloud sound were presented to three individuals with high functioning autism, who were asked to answer questions related to the stories. The communication was held via a set of communication channels (speech, text communication and a 3D expressive avatar), and the results were positive. Konstantinidis, Luneski & Nikolaidou [30] used ACALPA (Affective Computer-Aided Learning Platform for Children with Autism) to support the teaching process of educators working in a special school for people with autism.
ACALPA uses avatars, synthesized speech, images, videos and other multimedia content, and is broken down into several modules with different learning activities, such as the identification of emotions from visual expressions or through the use of semi-virtual contexts. According to an empirical study conducted in a specialized school with 50 persons with ASD, the use of ACALPA revealed a clear potential of this tool for teaching emotions to individuals with autism. Furthermore, by enhancing a VR design package (Vizard) for teaching four adolescents with ASD, Lahiri and colleagues [31] concluded that there was significant potential in using VR-based systems for the improvement of socially appropriate mechanisms in conversational tasks. The learning modules involved (a) the narration of stories by an avatar which makes pointing gestures and moves dynamically while a relevant scene is displayed in the background, and (b) a pre-constructed conversation between the avatar and the participant through the use of a menu-driven structure. Bekele et al. [32] also tested the usability of a dynamic VR system (based on the Unity game engine) with eye tracking in 20 teenagers (ten with ASD). The tasks included the identification of emotional facial expressions (joy, surprise, contempt, sadness, fear, disgust, and anger) displayed by seven avatars. Each emotion was divided into four animations according to the four most common intensity levels of emotion appearance (low, medium, high, and extreme). The results showed no discrepancy between the two groups in basic affect recognition, regardless of intensity.
An emphasis on facial emotion expressions in conjunction with dialogue disambiguation was given by Grynszpan et al. [33], who designed the software game "What to choose". A dialogue containing pragmatic subtleties and an avatar face displaying one of the six basic emotions were presented to users, who had to choose the correct assertion associated with them. Two groups were formed: a clinical group consisting of 10 teenage boys diagnosed with high functioning autism, and a control group. According to the results of pre- and post-evaluations, a potential in the use of such modalities was detected.
The utilization of 3D MUVEs (Multi-User Virtual Environments) in teaching social skills, social cognition and social functioning was investigated by Kandalaft et al. [34]. The platform of Second Life was used to develop social scenarios such as "meeting new people, dealing with a roommate conflict, negotiating financial or social decisions and interviewing for a job". Eight adolescents with Asperger Syndrome and Pervasive Developmental Disorder-Not Otherwise Specified logged into the MUVE and completed a series of tasks associated with various learning objectives, including the emotional expressions of others in voice, face and social contextual cues. Although avatars in the Second Life environment cannot express emotional cues on their faces, and the study did not specifically address emotion recognition from faces, the results indicated an improvement in that area, which according to the authors can be attributed to a treatment effect in a socially relevant domain of social cognition.
A different approach to the treatment of facial emotion expressions in ASD is proposed by Deriso and colleagues [35], who developed "Emotion Mirror". The method focuses on real-time feedback on the child's production of a facial emotion expression. Using a face tracking program, the system "mirrors" the facial expressions of the user on screen with the help of cartoon human or animal characters. The intervention includes mimicking the avatar's expressions and vice versa, in order to improve a child's performance in expressing emotions with facial motor movements. In a more sophisticated approach, Tsay & Lin [36] assumed that implementing interaction between the user and an educational game, by displaying the participant's facial expression within a computer-based story line, may be an effective way of teaching facial emotion expressions to children with ASD. Hence, they designed a game where the user has to recognize and express the same emotion in order for the story to proceed. Both studies offer suggestions for future treatment without presenting any empirical study.
Finally, robotic technology has also been used in technological intervention methods for the treatment of deficits in facial recognition in children with ASD. Kaspar is a robot that can produce all of the basic facial expressions and is used in the treatment of ASD. In a study by Dickerson, Robins and Dautenhahn [37], the interaction between a child with ASD and the humanoid robot was examined, and according to the authors there is potential value in such methods.
The aforementioned studies reveal a number of issues for further discussion. Although the interventions presented show certain improvements in the recognition of facial expressions, they are limited in terms of the generalization of their effects. There are mainly two factors accounting for this: (a) the specific difficulties children with ASD face in transferring knowledge to new domains, especially in terms of socialization, which creates the need for the development of appropriate tasks, and (b) the acceptance of interaction with the technological tools. Moreover, it is noted that individuals with ASD lack imagination and thus have difficulties in knowledge transfer [38]. Therefore, intervention methods should provide more real-world training, with learning material and tasks which represent, and not merely resemble, isolated cases and parts of reality.
In the above interventions, the types of stimuli used vary and include line drawings, schematic faces and photographs, or in some cases videotaped facial expressions of real people. In the case of drawings or schematic faces of humans, animals or other graphical figures, facial expressions are defined as a group of particular face parts in the same configuration, whereas in photos, static images of prototypical expressive patterns are presented. In both cases, facial expressions are presented as static slices of emotional facial motion. This raises two questions concerning the generalization and validity of the learning tasks. Firstly, it is doubtful whether a representative figure such as an animal face or a cartoon character can be generalized to human facial characteristics. Secondly, facial expressions are presented in isolation from context, and therefore no relevant information (e.g. a specific situation) is presented that provides a set of cues contributing to the comprehension of emotions. In some cases, this issue was addressed by using photos of the relevant scene, text and drawings of the situation. However, all of the above cannot replicate a real situation unless imagination is used, something that is difficult for children with ASD. Furthermore, a number of factors are involved in facial emotion decoding, such as regulation, low-intensity behavior and, particularly, dynamic measures [39].
In this line of research, although there is a consensus among educators concerning certain stereotypical patterns which represent emotions or mental states, it should be noted that an emotion can be represented by more than one facial expression [40]. Moreover, facial emotion expressions in real life exist in a dynamic form. The comparison between static and dynamic stimuli of facial expressions has been investigated by a number of researchers [41, 42], who concluded that dynamically displayed information can play a more significant role in facial emotion recognition.
Concerning the learning tasks, they usually regard the matching of the following:
• drawings or photos of facial expressions to verbal labels
• drawings or photos of facial expressions across identity
• videotaped facial expressions to photographs of faces or parts of them and schematic drawings.
In the case of ASD, there is a need to investigate the effectiveness of such tasks according to the learning profile of children with a syndrome under the umbrella of ASD. For example, children with Asperger syndrome and children with high functioning autism are capable of learning through language-based (verbal) activities. However, in the case of low functioning autism this is not effective, as there is an impairment in language. Usually these children do not talk at all, or they use a small set of basic words under the tutor's guidance. Therefore, such tasks are not effective in these learning settings. Furthermore, interaction is significant to the learning process. Since children in this population face problems in social activities (and learning activities are a subcategory of them), it is rather surprising that there is no relevant research.
Therefore, this article proposes the design of a real-time emotionally expressive human controlled avatar to teach FER to children with ASD. As a pilot empirical evaluation, an exploratory study was conducted with two individuals. The aim was to compare this novel modality (real-time avatar) with two more conventional ones (videoconference and face to face interaction) in terms of preference and acceptance by the children.
3. Exploratory Study
The aim of this study was to investigate the feasibility of VR technology (a real-time emotive avatar) for the treatment of FER deficits in individuals with ASD. More specifically, the study compared the acceptance and preference of this modality with both face to face and videoconference modalities. The study reports on two case studies of children with autism (one with low-functioning autism and one with high-functioning autism). It is noted that studies in special education often involve case study designs [43], because there is extremely large variation in symptomatology and inter-individual characteristics.
3.1. Experimental Protocol
The experimental protocol comprised four sessions. The first was dedicated to the establishment of a relationship between the child and the educator. The next three sessions were dedicated to presenting the three modalities to the students: (a) face-to-face communication, (b) videoconference and (c) avatar.
In face to face communication, the experimenter interacted directly with the children with ASD and mimicked the basic emotional states on his face.
Skype was used in order to implement the videoconference sessions. Figure 1 shows the experimenter expressing the six basic emotions during a videoconference session.
Finally, in order to create a real-time emotionally expressive avatar, the Mixamo Face Plus plugin for Unity was used. This plugin allows real-time control of the facial expressions of a 3D character by capturing the facial expressions of the user via the webcam. One of the demo 3D characters provided with the plugin, the "Battery Boy", was employed, after simplifying its appearance (the sci-fi armor was removed). The avatar was presented to the subjects through an audio-only Skype session with desktop sharing. Figure 1 shows the experimenter expressing the six basic emotions during an avatar session.
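To illustrate the general principle behind such webcam-driven avatars (not the actual Mixamo Face Plus implementation, whose internals are proprietary), the sketch below maps a few tracked facial measurements to avatar blendshape weights. The parameter names, blendshape names and weighting scheme are hypothetical; a real plugin exposes many more channels (per-side brows, eyelids, jaw, lip shapes) and updates them every captured frame.

```python
# Illustrative sketch of real-time expression transfer: normalized measurements
# from a webcam face tracker are converted into blendshape weights that a game
# engine would apply to the avatar's face mesh each frame.

def expression_to_blendshapes(mouth_open, mouth_corner_raise, brow_raise):
    """Map normalized tracker measurements (expected range 0.0-1.0) to
    blendshape weights. Out-of-range inputs are clamped so a tracking
    glitch cannot distort the avatar's face."""
    clamp = lambda v: max(0.0, min(1.0, v))
    return {
        "jawOpen": clamp(mouth_open),            # drives the avatar's jaw
        "mouthSmile": clamp(mouth_corner_raise), # raised corners -> smile
        "browUp": clamp(brow_raise),             # raised brows -> surprise cue
    }

# A surprised face combines raised brows with an open jaw; a smile leaves
# the brows near their rest pose. Each captured frame would update the
# avatar mesh with these weights.
weights = expression_to_blendshapes(mouth_open=0.8,
                                    mouth_corner_raise=0.1,
                                    brow_raise=0.9)
print(weights["jawOpen"])  # 0.8
```

Because the mapping runs per frame, the avatar reproduces the experimenter's expressions with negligible delay, which is what distinguishes this modality from pre-animated avatars.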
Fig. 1. Videoconference and avatar session screenshots: (a) happiness; (b) sadness; (c) anger; (d) fear; (e) surprise; (f) disgust.
3.2. Case 1: VM, high-functioning autism
3.2.1. Social and school background
VM is a nine year-old boy who is attending the third grade of a mainstream primary school. He is a newcomer at this school, since his family moved recently from another town.
3.2.2. Psychological and learning profile
VM displays some stereotypical behaviors and some rigidness in free motion. He also shows certain deficit in social skills and a self-isolating tendency. He has been diagnosed with high-functioning autism. Regarding his cognitive abilities, no learning difficulties are observed. VM responds satisfactorily to gross and fine motor skills tests and is capable of accomplishing all assigned cognitive tasks. He only displays some writing difficulties related to the disciplines of Language and Mathematics. Nevertheless, VM doesn't fall short of the skills that are expected of a child of his age.
The teacher had difficulties approaching VM and creating a relationship, due to his deficit in social skills. It took a considerable amount of time and many different approaches by the teacher to build a trusting relationship. Once the relationship with his teacher was established, VM was open to learning tasks, especially those
involving the use of computers. He also started to feel comfortable in the inclusion classroom and could coexist with others without problems. Nevertheless, VM would not cooperate with others, except with his classmate Alban.
3.2.3. Facial emotion recognition
As a member of the inclusion classroom of the school, VM is supervised by the school psychologist, who performed some informal testing regarding associative attention, imitation and facial emotion recognition. VM's performance was satisfactory. Although he recognized all six basic emotions (happiness, sadness, anger, fear, surprise, disgust) in a face to face condition, he recognized only four (happiness, sadness, anger and fear) when presented with photographs of the same person. The most difficult emotions for him to recognize were surprise and disgust.
During the sessions of the present study (face to face, videoconference, avatar), VM only recognized four emotions (happiness, sadness, anger and fear). When asked to imitate the emotions himself, he succeeded only in the case of happiness by showing a smile. In the case of the other emotions, he displayed irrelevant facial expressions.
VM seems to recognize the basic emotions when presented as dynamic stimuli, but his performance degrades when emotions are presented as static stimuli.
3.2.4. Interaction with various modalities (face to face, videoconference, avatar)
Regarding the interaction of VM with the three modalities, the following have been recorded:
• Videoconference: when VM first saw the experimenter on the screen, he reached out for the hand of his teacher for reassurance and would only look at the screen peripherally. Nevertheless, after a while, VM became more receptive and seemed to enjoy the experience. Although he did not succeed in recognizing and imitating most of the emotions, he tried willingly. This willingness to communicate was something that his teacher had not observed before.
• Face to face: the interaction was very good. VM held sufficient eye contact with the experimenter and replied to his casual questions (hi, how are you, do you like computers?). He also pointed out that he remembered the experimenter from the videoconference session (which preceded the face to face session), where he was "making faces". VM was also positive about the possibility of a future visit by the experimenter.
• Avatar: VM was very receptive to the avatar. He held constant eye contact with it and replied to the casual questions promptly. When asked if he liked the appearance of and the interaction with the avatar, he answered positively.
Overall, VM was focused on the cognitive task of emotion recognition and imitation, regardless of the modality. He also seemed to enjoy the interaction with the experimenter, regardless of the context.
3.3. Case 2: SG, low-functioning autism
3.3.1. Social and school background
SG was diagnosed early with developmental disorders in the spectrum of autism. He receives various educational interventions such as special pedagogy, speech therapy, therapeutic horseback riding, drama therapy and music (playing drums).
3.3.2. Psychological and learning profile
SG wants to be in constant motion and usually hums a rhythm that does not correspond to any particular song. He does not initiate communication, but he is receptive to certain instructions from persons in his close environment or sometimes from his teachers who try to communicate with him.
SG displays low functioning autism characteristics, so he has a substantial deficit in social skills, communication and interaction. His verbal speech is composed only of single words or phrases of two or three words, and is not spontaneous. He can greet if asked, reply with a single "yes" or "no", or reply with a single word when addressed with a two-choice question (e.g. would you like water or candy?). He cannot write at all.
3.3.3. Facial emotion recognition
Initially, SG could not recognize any of the basic emotions during his interaction with his teacher. After considerable effort he learned to recognize happiness and could express it on his face with a smile. He could also satisfactorily recognize and express the feeling of sadness. No other emotions could be learned, mainly due to his inability to focus on a task and his deficits in speech perception and expression. During the sessions of the study (face to face, videoconference, avatar), SG could not recognize any emotion presented in any modality.
3.3.4. Interaction with various modalities (face to face, videoconferencing, avatar)
Regarding the interaction of SG with the three modalities, the following have been recorded:
• Face to face: At first, SG was somewhat receptive to communication with the experimenter, although he did not seem to enjoy it. The interaction was not spontaneous and was directed by his teacher, who guided him to greet, make eye contact, and say some basic words. When SG was asked to recognize the emotion of happiness or sadness shown by the experimenter, he merely repeated the last words of his teacher's prompts (is he happy or sad?). Later, SG started to avoid communication and physical proximity with the experimenter.
• Videoconference: SG did not show any interest in communicating with the experimenter through videoconferencing. He did not want to focus on the screen or on the task requirements.
• Avatar: Similarly, SG did not show any interest in communicating with the experimenter via the avatar. He was spinning on a swivel chair while humming an unintelligible rhythm.
Overall, SG did not show any interest in communicating with the experimenter through the computer. This might be related to his general negative attitude towards computer-based activities and to his inability to concentrate on any given cognitive task.
4. Conclusion
This article reviewed the use of ICT tools for the treatment of facial emotion deficits in individuals with ASD.
The review suggests that although these types of interventions hold promise for FER treatment, their effectiveness needs further investigation. A number of issues remain concerning the generalization of the results, the validity of the methods, and the specific problems of transferring the acquired knowledge to novel situations [38]. In addition, there is a lack of research on how children with ASD interact with technology tools; more specifically, the acceptance of technology by children with ASD is taken for granted. There is also a lack of empirical data concerning the preference for different interaction modalities.
For the above reasons, an exploratory study was conducted to investigate the acceptance and preference of different real-time modalities in FER assessment by two children with ASD (low and high functioning autism).
The results indicate that the participant with high functioning autism showed an interest in interacting with the teacher in all modalities, although better acceptance was observed in the modalities involving ICT tools.
Indeed, the literature reveals the great value of using ICT in the education of children with ASD across different modalities: face to face [44], ICT tools [45, 46], or more sophisticated interactive environments, whether virtual or not [47]. This is related to the monotropic interest system of people with ASD. They therefore prefer educational environments composed of restricted stimuli, with clearly defined boundaries and controllable, text-free instructional conditions [48]. In the case of FER, according to the articles reviewed, a high acceptance of ICT in high functioning ASD can also be assumed, which suggests further use of ICT tools in this area of cognition in ASD.
However, it can be said that the advantages of using ICT in the education of people with ASD are limited to children with high functioning ASD. The differences between high and low functioning autism can be considered a critical factor in the effectiveness of various instructional methods [7]. Thus, regarding the use of ICT modalities, and specifically the treatment of FER, many researchers report a discrepancy between individuals with high and low functioning autism in studies where the sample consisted of both types of ASD [14, 26, 27].
The same pattern emerged in our exploratory study. The child with low functioning autism did not interact at all with the ICT modalities; he preferred only face to face interaction, the mode in which he has learned to interact during his academic courses. Moreover, he always needs close guidance from a teacher he is familiar with, which makes it extremely difficult to hold a course with a computer-based human proxy teacher. Furthermore, such courses are based on verbal communication, an area in which students with low functioning autism face many difficulties [4, 7].
Moreover, regarding the recognition of facial expressions, a useful outcome was obtained only for the individual with high functioning autism, since the individual with low functioning autism faced many difficulties in every modality. Although the former could recognize a variety of facial emotional expressions, this did not carry over to the interaction in each modality. One possible reason is that the expressions shown by the teacher were close to static ones; this might cause problems related to the absence of dynamicity in facial expressions of emotion [41, 42]. Furthermore, the expressions were presented without any social context, which can also lead to confusion in FER [39]. An interesting point of discussion is that, when asked, the individual presented facial emotional expressions different from those generally used in such cases [13], which raises the question of whether there are particular predefined emotional expressions, or rather a number of expressions that can correspond to the same emotion [40]. Consequently, in the treatment of FER in ASD, therapists need to take into account students' profiles in learning and transferring knowledge. Interactive ICT should provide situations closer to real-life ones in order to meet students' preferences and acceptance as well as the instructional goals designed.
In conclusion, the preferences and acceptance of different ICT tools by children with ASD should be taken into account when methods for the treatment of FER in ASD are designed. However, further research is needed in this field.
References
1. American Psychiatric Association. Diagnostic and statistical manual of mental disorders. 4th ed. 2000.
2. Kim YS, Leventhal BL, Koh YJ, Fombonne E, Laska E, Lim EC, Cheon KA, Kim SJ, Kim YK, Lee H, Song DH, Grinker RR. Prevalence of autism spectrum disorders in a total population sample. Am J Psychiatry 2011;168(9):904-12.
3. Interagency Autism Coordinating Committee. IACC strategic plan for autism spectrum disorder research. 2013.
4. World Health Organization. ICD Classification of Mental and Behavioural Disorders: Clinical Descriptions and Diagnostic Guidelines. Geneva.
5. Herba C, Phillips M. Development of facial expression from childhood to adolescence: behavioural and neurological perspectives. J Child Psychol Psychiatry 2004;45(7):1185-98.
6. Begeer S, Koot HM, Rieffe C, Meerum Terwogt M, Stegge H. Emotional competence in children with autism: diagnostic criteria and empirical evidence. Developmental Review 2008;28(3):342-369.
7. Harms MB, Martin A, Wallace GL. Facial emotion recognition in autism spectrum disorders: a review of behavioural and neuroimaging studies. Neuropsychol Rev 2010;20(3):290-322.
8. Uljarevic M, Hamilton A. Recognition of emotions in autism: a formal meta-analysis. J Autism Dev Disord 2013;43(7):1517-26.
9. Joseph RM, Tanaka J. Holistic and part-based face recognition in children with autism. J Child Psychol Psychiatry 2003;44(4):529-42.
10. Baron-Cohen S, Campbell R, Karmiloff-Smith A, Grant J, Walker J. Are children with autism blind to the mentalistic significance of the eyes? British J Developmental Psychology 1995;13:379-398.
11. Critchley HD, Daly EM, Bullmore ET, Williams SC, Van Amelsvoort T, Robertson DM, Rowe A, Phillips M, McAlonan G, Howlin P, Murphy DG. The functional neuroanatomy of social behaviour: changes in cerebral blood flow when people with autistic disorder process facial expressions. Brain 2000;123:2203-12.
12. Bolte S, Feineis-Matthews S, Leber S, Dierks T, Hubl D, Poustka F. The development and evaluation of a computer-based program to test and to teach the recognition of facial affect. Int J Circumpolar Health 2002;61(2):61-8.
13. Ekman P, Friesen WV. Unmasking the face. Englewood Cliffs, New Jersey: Spectrum-Prentice Hall; 1975.
14. Hopkins IM, Gower MW, Perez TA, Smith DS, Amthor FR, Wimsatt FC, Biasini FJ. Avatar assistant: improving social skills in students with an ASD through a computer-based intervention. J Autism Dev Disord 2011;41(11):1543-55.
15. Tanaka JW, Wolf JM, Klaiman C, Koenig K, Cockburn J, Herlihy L, Brown C, Stahl SS, South M, McPartland JC, Kaiser MD, Schultz RT. The perception and identification of facial emotions in individuals with autism spectrum disorders using the Let's Face It! Emotion Skills Battery. J Child Psychol Psychiatry 2012;53(12):1259-67.
16. Golan O, Baron-Cohen S. Systemizing empathy: teaching adults with Asperger syndrome or high-functioning autism to recognize complex emotions using interactive multimedia. Dev Psychopathol 2006;18(2):591-617.
17. LaCava PG, Rankin A, Mahlios E, Cook K, Simpson RL. A single case design evaluation of a software and tutor intervention addressing emotion recognition and social interaction in four boys with ASD. Autism 2010;14(3):161-78.
18. Golan O, Ashwin E, Granader Y, McClintock S, Day K, Leggett V, Baron-Cohen S. Enhancing emotion recognition in children with autism spectrum conditions: an intervention using animated vehicles with real emotional faces. J Autism Dev Disord 2010;40(3):269-279.
19. Young RL, Posselt M. Using the Transporters DVD as a learning tool for children with autism spectrum disorders (ASD). J Autism Dev Disord 2012;42:984-991.
20. Williams BT, Gray KM, Tonge BJ. Teaching emotion recognition skills to young children with autism: a randomised controlled trial of an emotion training programme. J Child Psychol Psychiatry 2012;53(12):1268-1276.
21. El Kaliouby R, Robinson P. The emotional hearing aid: an assistive tool for children with Asperger syndrome. Univ Access Inf Soc 2005;4:121-134.
22. Faja S, Aylward E, Bernier R, Dawson G. Becoming a face expert: a computerized face-training program for high-functioning individuals with autism spectrum disorders. Developmental Neuropsychology 2007;33(1):1-24.
23. Tseng R, Yi-Luen Do E. The role of information and computer technology for children with autism spectrum disorder and the Facial Expression Wonderland (FEW). International Journal of Computational Models and Algorithms in Medicine 2011;2(2):23-41.
24. Jain S, Tamersoy B, Zhang Y, Aggarwal JK. An interactive game for teaching facial expressions to children with autism spectrum disorders. In: Proceedings of the 5th International Symposium on Communications, Control and Signal Processing 2012;Rome Italy 2-4 May 2012.
25. Miranda JC, Fernandes T, Sousa AA, Orvalho V. Interactive technology: teaching people with autism to recognize facial emotions. In: Williams T, editor. Autism Spectrum Disorders - From Genes to Environment. Croatia: InTech; 2011. p. 299-31.
26. Fabri M, Elzouki SA, Moore D. Emotionally expressive avatars for chatting, learning and therapeutic intervention in human-computer interaction. In: Jacko J, editor. HCI intelligent multimodal interaction environments. Berlin Heidelberg: Springer; 2007. p. 275-285.
27. Moore D, Cheng Y, McGrath P, Powell NJ. Collaborative virtual environment technology for people with autism. Focus on Autism and Other Developmental Disabilities 2005;20(4):231-243.
28. Valeria N, Theng LB. Collaborative learning through facial expression for special children. International Journal of New Computer Architectures and their Applications 2011;1(2):490-509.
29. Cheng Y, Ye J. Exploring the social competence of students with autism spectrum conditions in a collaborative virtual learning environment - the pilot study. Computers & Education 2010;54(4):1068-1077.
30. Konstantinidis EI, Hitoglou-Antoniadou M, Bamidis PD, Nikolaidou MM. Using affective avatars and rich multimedia content for education of children with autism. In: Proceedings of the 2nd International Conference on Pervasive Technologies Related to Assistive Environments.
31. Lahiri U, Bekele E, Dohrmann E, Warren Z, Sarkar N. Design of a virtual reality based adaptive response technology for children with autism. IEEE Trans Neural Syst Rehabil Eng 2013;21(1):55-64.
32. Bekele E, Crittendon J, Zheng Z, Swanson A, Weitlauf A, Warren Z, Sarkar N. Assessing the utility of a virtual environment for enhancing facial affect recognition in adolescents with autism. J Autism Dev Disord 2014;44(7):1641-50.
33. Grynszpan O, Martin J-C, Nadel J. Multimedia interfaces for users with high functioning autism: an empirical investigation. Int J Human Computer Studies 2008;66:628-639.
34. Kandalaft MR, Didehbani N, Krawczyk DC, Allen TT, Chapman SB. Virtual reality social cognition training for young adults with high-functioning autism. J Autism Dev Disord 2013;43(1):34-44.
35. Deriso D, Susskind J, Krieger L, Bartlett M. Emotion Mirror: a novel intervention for autism based on real-time expression recognition. Lecture Notes in Computer Science 2012;7585:671-674.
36. Tsai T-W, Lin M-Y. An application of interactive game for facial expression of the autisms. Lecture Notes in Computer Science 2011;6872:204-211.
37. Dickerson P, Robins B, Dautenhahn K. Where the action is: a conversation analytic perspective on interaction between a humanoid robot, a co-present adult and a child with an ASD. Interaction Studies 2013;14(2):297-316.
38. Herrera G, Alcantud F, Jordan R, Blanquer A, Labajo G, Pablo DE. Development of symbolic play through the use of virtual reality tools in children with autistic spectrum disorders: two case studies. Autism: the International Journal of Research 2006;12(2):143-158.
39. Mauss IB, Robinson MD. Measures of emotion: a review. Cognition & Emotion 2009;23(2):209-237.
40. Russell JA, Bachorowski J-A, Fernandez-Dols J-M. Facial and vocal expressions of emotion. Annual Review of Psychology 2003;54(1):329-349.
41. Fiorentini C, Viviani P. Is there a dynamic advantage for facial expressions? Journal of Vision 2011;11(3):1-17.
42. Cunningham DW, Wallraven C. Dynamic information for the recognition of conversational expressions. Journal of Vision 2009;9(13):1-17.
43. Yin RK. Case study research: design and methods.2nd ed. Newbury Park: Sage Publications; 1994.
44. Pennington R-O. Computer-assisted instruction for teaching academic skills to students with autism spectrum disorders: a review of literature. Focus on Autism and Other Developmental Disabilities 2010;25(4):239-248.
45. Grynszpan O, Weiss P-L, Perez-Diaz F, Gal E. Innovative technology-based intervention for autism spectrum disorders: A meta-analysis. Autism 2013;18(4):346-361.
46. Jimoyiannis A, Tsiopela D. Pre-Vocational Skills Laboratory: development and investigation of a web-based environment for students with autism. Procedia Computer Science 2014;27:207-217.
47. Boucenna S, Narzisi A, Tilmont E, Muratori F, Pioggia G, Cohen D, Chetouani M. Interactive Technologies for Autistic Children: A review. Cognitive Computation 2014;6(4):722-740.
48. Ploog B-O, Scharf A, Nelson D, Brooks P-J. Use of computer-assisted technologies (CAT) to enhance social, communicative, and language development in children with autism spectrum disorders. J Autism Dev Disord 2013;43(2):301-322.