=Paper= {{Paper |id=Vol-1982/paper3 |storemode=property |title=Artificial Intelligence for Robot-Assisted Treatment of Autism |pdfUrl=https://ceur-ws.org/Vol-1982/paper3.pdf |volume=Vol-1982 |authors=Giuseppe Palestra,Berardina De Carolis,Floriana Esposito |dblpUrl=https://dblp.org/rec/conf/aiia/PalestraCE17 }} ==Artificial Intelligence for Robot-Assisted Treatment of Autism== https://ceur-ws.org/Vol-1982/paper3.pdf
                           Artificial Intelligence for
                      Robot-Assisted Treatment of Autism

                 Giuseppe Palestra, Berardina De Carolis, and Floriana Esposito

                     Department of Computer Science, University of Bari, Bari, Italy
                                   giuseppe.palestra@uniba.it



          Abstract. Designing robot-based treatments for children with Autism Spectrum
          Disorder (ASD) is a growing research field. This paper presents an artificial intel-
          ligence system for robot-assisted treatment of autism. The robot acts as a social
          mediator, trying to elicit specific behaviors in autistic children. A first preliminary
          evaluation of the system has been performed involving 3 high-functioning children
          with autism spectrum disorder. The experiments carried out make it possible to
          evaluate the behavioral response of the children in the eye contact exercise.

          Keywords: artificial intelligence, social robots, autism spectrum disorder, eye
          contact


1     Introduction
Autism is a severe developmental disorder characterized by social interaction and
communication difficulties and by a tendency to engage in repetitive patterns of behav-
ior. A fairly large number of early diagnosis and treatment protocols have been designed,
empirically tested, and published in the autism literature. The most recent protocols are
derived from Applied Behavior Analysis (ABA) [6] and focus on teaching new skills
to autistic children. Research in artificial intelligence, and in robotics in particular,
suggests that robots can play a promising role in building interventions that help autistic
children cope with their impairments related to eye contact, joint attention, imitation,
and emotion recognition and production. Several social robots are able to execute tasks
in autism treatment. Each social robot differs in physical appearance, targeted eliciting
behavior, and level of autonomy [9]. These characteristics are currently under investigation
to understand how and to what extent they influence the treatment. Nevertheless, in
the state of the art, significant attention is given to the robot characteristics, whereas
how artificial intelligence can be integrated into traditional autism treatments has not
been sufficiently investigated [10]. A natural robot-assisted treatment for ASD children
requires the robot to have or simulate intelligent behavior and interaction, based on
human speech and body language understanding, emotion recognition, eye contact ability,
and other typical intelligent behaviors. Building such a treatment requires a multidis-
ciplinary effort: therapists, psychologists, robot developers, and researchers are all
involved in designing robotic treatment protocols for autistic people. The aim of this
work is to present an artificial intelligence system based on a robot-assisted treatment
protocol for autistic children.

D. Impedovo and G. Pirlo (Eds.), Workshop on Artificial Intelligence with Application in Health, Bari, Italy, November 14, 2017.
Copyright held by the authors.
    The protocol has been partially used in the SARACEN (Social Assistive Robots
for Autistic Children EducatioN) project, aimed at developing innovative methods for
early diagnosis of ASD and therapy support for autistic children with socially assistive
robots. SARACEN has been partially supported by the Italian Ministry for Education,
University and Research (MIUR) and by the European Union in the framework of the
2007-2013 Smart Cities and Communities and Social Innovation program.
    This paper is organized as follows. Section 2 reports the state of the art on
robot-assisted treatments for autistic children. Section 3 presents an overview of
artificial intelligence in robot-child interaction. Section 4 describes the experimental
setup and the preliminary results. Finally, conclusions are drawn.


2     Related Work

Positive effects of social robots in the treatment of autistic children have already been
reported in the literature for eliciting specific behaviors in ASD children, such as emotion
generation [8], joint attention and triadic interaction [3], and eye contact and social gaze
[1]. Many other studies report evidence on the use of social robots in the treatment of
autistic children, considering just one aspect of the impairment at a time. Barakova and
Lourens [2] present three ABA interventions based on the NAO robot. The authors analyze
the need and the opportunity for combining artificial intelligence with an application
domain such as that of autism. Jarrold [4] proposes an AI-based tutoring system for ASD
children that teaches mind-reading skills. Only a few studies take into account more than
one aspect of the impairment of ASD children. Zheng et al. [11] present a robot-mediated
therapeutic system for imitation skill learning. The system is designed in such a manner
that it can operate autonomously or with a therapist, depending on the therapy needs.
Their study aims at drawing the attention of children with ASD and teaching gestures.
Palestra et al. [7] present the implementation of an interface for digital PECS therapy
that enables ASD people to overcome imitation and motor-skill difficulties. None of
these studies takes into account all the disabilities of autistic children during the
treatments. What is missing in the behavioral autism treatment of children is an artificial
intelligence technology that can deal with several disorders during the treatment.


3     Artificial Intelligence in Robot-Child Interaction

The proposed system for ASD treatment includes four main modules: the RGB-D cam-
era, the workstation, the social robot, and the robot camera. The overview of the
system is illustrated in Figure 1. The child's behavior is captured by two cameras: a
5-megapixel auto-focus camera on board the robot and an RGB-D camera. The social
robot is the SoftBank NAO H25 humanoid robot 1 . NAO has the following technical
characteristics:

    – 25 degrees of freedom (11 for the lower part and 14 for the upper part);
    – x86 AMD GEODE 500MHz CPU with 256 MB of RAM and 2GB of storage;
1 https://www.ald.softbankrobotics.com/en/cool-robots/nao
 – Ethernet and Wi-Fi connections.
The workstation is equipped with:
 – Intel Core i7-4700MQ CPU (2.40 GHz), with 8GB of RAM;
 – 1TB of storage;
 – Ubuntu GNU/Linux 16.10 as operating system;
and the following software modules:
 – NAOqi API;
 – Kinect SDK 2;
 – the Robot Intelligence Module (RIM);
 – the Behavior Manager (BM).
    The workstation uses the NAOqi API to communicate with the robot, both to capture
the video streaming from the robot camera and to activate specific robot behaviors.
The Kinect SDK 2 installed on the workstation is used to acquire the depth streaming
from the RGB-D camera. The video and depth frames acquired via the sensors are then
sent to the RIM, which is composed of four software components: head pose, body
posture, eye contact, and facial expression. Each component uses specific computer
vision algorithms. The RIM detects the child's non-verbal signals and transfers them
to the BM, where the treatment protocol is implemented. A log file in the BM reports
an anonymous code for the child, the behavior performed by the robot, the behavioral
response of the child, and the exercise performed by the social robot.




Fig. 1. Artificial Intelligence System for Robot-Assisted Treatment of Autism. The schema shows
the child-robot interaction loop and the software modules used by the robot to interact with
the child: the Robot Intelligence Module (RIM) and the Behavior Manager (BM). The RIM is
composed of four components: head pose, body posture, eye contact, and facial expression. The
BM consists of two components: the treatment protocol and the NAOqi API.
3.1   The Robot-Assisted Treatment Protocol
The protocol is designed to improve behaviors that are difficult for an autistic child. It
is based on the ABA program, which includes a stimulus presentation, a behavioral response,
and a reinforcement. The new aspect is the presence of a social robot as a partner performing
the treatment. The protocol has five exercises, each with three levels of difficulty (see
Figure 2). The exercises focus on: eye contact, joint attention, body imitation, facial
imitation, and facial expression imitation. The child has to perform each level 5 times,
and when he/she performs the exercise correctly he/she can pass to the next level. The
therapist can assign each exercise, or a set of exercises, to a child according to the
child's functioning level.
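The level-progression rule described above can be sketched as follows. This is a minimal interpretation of the protocol (5 trials per level, advance on a correct performance); the exact advancement criterion is an assumption, as are all names.

```python
# Hypothetical sketch of the protocol's level progression:
# each level is attempted up to 5 times, and the child advances
# to the next level once the exercise is performed correctly.
LEVELS = ["easy", "medium", "hard"]
TRIALS_PER_LEVEL = 5

def next_level(current: str, trial_results: list) -> str:
    """Return the level for the next round, given this round's outcomes."""
    assert len(trial_results) <= TRIALS_PER_LEVEL
    idx = LEVELS.index(current)
    if any(trial_results) and idx < len(LEVELS) - 1:
        return LEVELS[idx + 1]
    return current  # repeat the level if no correct performance yet

print(next_level("easy", [False, False, True]))  # -> medium
```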




Fig. 2. Robot-Assisted Treatment Protocol. The protocol is composed of five exercises (three
levels of difficulty for each exercise).


   In this study, only the eye contact exercise was carried out with the autistic children.
Therefore, only the eye contact exercise is described in this subsection.

3.2   Robot-Assisted Eye Contact Exercise
The eye contact exercise is designed to improve eye contact behavior, which is typically
reduced in ASD children and is essential for interpersonal communication [5]. The
exercise consists of three levels, which differ in terms of stimulus and reinforcement.

Easy Level The robot performs the stimulus: it calls the child by name and says "Look at
me". The robot repeats the stimulus until the child looks at the robot (behavioral response).
When eye contact occurs, NAO says "Good!" followed by the name of the child and plays
music (reinforcement).

Medium Level In this level the stimulus is changed: the robot calls the child by name
but does not say "Look at me".


Hard Level In this level the robot does not play the music.
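The three levels differ only in which stimulus and reinforcement actions the robot performs, so they can be encoded declaratively. This is an illustrative sketch; the action names and the trial loop are assumptions, not the system's actual API.

```python
# Hypothetical encoding of the three eye-contact levels: the stimulus
# shrinks and the reinforcement loses the music as difficulty rises.
EYE_CONTACT_LEVELS = {
    "easy":   {"stimulus": ["call_name", "say_look_at_me"],
               "reinforcement": ["say_good_plus_name", "play_music"]},
    "medium": {"stimulus": ["call_name"],
               "reinforcement": ["say_good_plus_name", "play_music"]},
    "hard":   {"stimulus": ["call_name"],
               "reinforcement": ["say_good_plus_name"]},
}

def run_trial(level, eye_contact_detected):
    """Repeat the stimulus until eye contact occurs, then reinforce."""
    cfg = EYE_CONTACT_LEVELS[level]
    while not eye_contact_detected():
        for action in cfg["stimulus"]:
            print("robot:", action)
    for action in cfg["reinforcement"]:
        print("robot:", action)

# Demo: the detector fails once, then reports eye contact.
attempts = iter([False, True])
run_trial("easy", lambda: next(attempts))
```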


3.3   Automatic Eye Contact Detection

This section describes the computer vision algorithm used to implement eye contact
detection. Our algorithm requires the RGB camera of the robot to be placed close (at most
40-50 cm) to the face of the child.
    The pipeline of the eye contact detector, illustrated in Figure 3, is composed of 5
steps: eye detection, preprocessing, iris detection, pupil detection, and pupil position.




                           Fig. 3. Eye Contact Detection Pipeline.


    The algorithm takes images from the camera as input (raw images); as a first step,
the eyes are detected using the well-known Viola-Jones detector implemented in OpenCV.
Then, in the preprocessing step, the right and left eye patches are converted into
8-bit gray-level images and several filters are applied: thresholding, erosion, and
morphological gradient. The next step of the pipeline draws the contours of the iris,
separating the dark part of the eye (iris) from the white background (sclera). Once the
location of the iris has been obtained, the pupil can be detected by calculating the
centroid of the iris. In the final step, the eye contact detection is performed: each
eye patch is divided into 8x8 sections and, if the pupils are in the center of this
grid, eye contact has occurred (see Figure 4); otherwise, the child is looking at
something else.
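The final grid test can be sketched in a few lines. The paper does not specify how many central cells count as "the center" of the 8x8 grid, so treating the 2x2 central block as the target region is an assumption, as are the function and parameter names.

```python
# Sketch of the final pipeline step: divide the eye patch into an 8x8
# grid and declare eye contact when the pupil centroid falls in the
# central cells (here assumed to be the 2x2 block at the grid center).
def pupil_in_center(pupil_xy, patch_shape, grid=8):
    """Return True if the pupil lies in the central cells of the grid."""
    h, w = patch_shape
    col = int(pupil_xy[0] / w * grid)
    row = int(pupil_xy[1] / h * grid)
    center_cells = {grid // 2 - 1, grid // 2}  # cells 3 and 4 for an 8x8 grid
    return row in center_cells and col in center_cells

print(pupil_in_center((32, 32), (64, 64)))  # pupil at patch center -> True
print(pupil_in_center((5, 5), (64, 64)))    # pupil near a corner -> False
```

In the full system the pupil coordinates would come from the centroid of the iris contour (e.g. via image moments), computed in the previous step of the pipeline.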


4     Experiments

In this section, an assessment of the system is provided by analyzing the behavioral
responses of the autistic children. The analysis has the following goals:

 1. to test the artificial intelligence system components in a real environment;
 2. to evaluate the behavioral responses elicited in ASD children.

    Three children (C1, C2, C3) with a diagnosis of high-functioning ASD (age range
6-13 years) have been involved in this study.
                           Fig. 4. Eye Contact Detection Example.



4.1   Procedure

According to the ethical guidelines, the personal data related to the children have been
anonymized so that the individual identities cannot be revealed. The parents of the chil-
dren signed the informed consent, written in Italian (the participants' mother tongue), of
which one copy has been kept by the therapists and the other one by the parents of the
child. Participants have been asked to perform three sessions (S1, S2, and S3) with the
interface. Each child tested the Eye Contact exercise (Easy, Medium, and Hard levels)
to achieve a first assessment of the child with the robotic interface. Subsequently, the
robotic treatment program will be tailored to the specific needs of the child. Each child
performed 15 eye contact trials in each session. A session lasted 20 minutes on average.
The experiment was conducted by an expert therapist. The children were admitted one at
a time into the experimental room. The therapist and the child entered the room together,
and the child was seated on a chair in front of the robot. Beforehand, all children
participated in a familiarization session lasting 10 minutes. Then, the therapist
introduced the social robot, providing a simple description of it and answering any of the
child's questions. Subsequently, once the children felt comfortable in the presence of the
social robot (usually after 10 minutes), the first experimental session (S1) started under
the supervision of the therapist. In the first session the robot started with the Easy
Level of the Eye Contact exercise, as detailed in Section 3.1. In the second session the
robot started with the Medium Level of the same exercise. Finally, the third session
started with the Hard Level of the Eye Contact exercise. In each session the robot
repeated the corresponding level 15 times. At the end of each session, a debriefing was
given to each participant.


4.2   Results

In general, the system was able to operate well in the treatment environment for all
the ASD children. To evaluate the behavior of the children during the interaction with
the system, the focus was on the number of eye contacts correctly performed (nEC).
The system considers 1000 seconds as the maximum time (tMAX) to perform the eye
contact.
       Table 1. Percentage of the eye contact act for each participant in S1, S2, and S3.

                                         S1     S2    S3
                                   C1 73.33 53.33 20.00
                                   C2 66.67 53.33 26.67
                                   C3 80.00 60.00 40.00
                                   AVG 73.33 55.55 28.89



     With respect to nEC, it has been observed that the eye contact act had an average
success rate of 73.33% in S1, 55.55% in S2, and 28.89% in S3, as reported in
Table 1.
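The session averages in Table 1 can be recomputed from the per-child rates. The snippet below does this directly; note that each percentage is consistent with a success count out of the 15 trials per session (e.g. 73.33% corresponds to 11/15).

```python
# Recompute the per-session average success rates from Table 1.
rates = {"C1": [73.33, 53.33, 20.00],
         "C2": [66.67, 53.33, 26.67],
         "C3": [80.00, 60.00, 40.00]}

for s in range(3):
    avg = sum(r[s] for r in rates.values()) / len(rates)
    print(f"S{s + 1}: {avg:.2f}%")  # -> 73.33%, 55.55%, 28.89%
```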
     Figure 5 depicts the nEC achieved by each child in S1, S2, and S3.
     A first preliminary evaluation of the system has been performed involving 3 high-
functioning children with autism spectrum disorder. Analyzing the nEC over the three
sessions, the results were encouraging: this measure makes it possible to assess, in an
objective way, the level of difficulty experienced by the child involved in the treatment,
and it can be useful to adjust the treatment in the next session with the robot. In the
experiment, all children were able to perform the Eye Contact exercise well at the easy
level, but they needed help when performing it at the medium and hard levels.




Fig. 5. Number of eye contact acts (nEC) grouped by child (C1, C2, C3). S1 = Easy level,
S2 = Medium level, S3 = Hard level.




5   Conclusions and Future Work
In this paper, an Artificial Intelligence system for robot-child interaction based on a
behavioral treatment protocol has been proposed. Results show that a social robot playing
the role of mediator can be successful in the robot-assisted treatment of autistic children.
The same children involved in this experiment will interact with the social robot in the
same exercises to test the follow-up of the treatment. Moreover, investigations including
experiments with a larger sample of autistic children interacting with the whole protocol
(eye contact, joint attention, body imitation, facial imitation, and facial expression
imitation) will allow us to test the exercises completely.


6    Acknowledgments

This work has been partially supported by the Italian Ministry for Education, University
and Research (MIUR) and by the European Union under Grant PON04a3_00201, SARACEN
project.


References
 1. Admoni, H., Scassellati, B.: Social eye gaze in human-robot interaction: A review. Journal
    of Human-Robot Interaction 6(1), 25–63 (2017)
 2. Barakova, E., Lourens, T.: Interplay between natural and artificial intelligence in training
    autistic children with robots. In: International Work-Conference on the Interplay Between
    Natural and Artificial Computation. pp. 161–170. Springer (2013)
 3. Chevalier, P., Martin, J.C., Isableu, B., Bazile, C., Iacob, D.O., Tapus, A.: Joint attention
    using human-robot interaction: Impact of sensory preferences of children with autism. In:
    Robot and Human Interactive Communication (RO-MAN), 2016 25th IEEE International
    Symposium on. pp. 849–854. IEEE (2016)
 4. Jarrold, W.L.: Treating autism with the help of artificial intelligence: a value proposition.
    In: Proceedings of the Agent-based Systems for Human Learning Workshop (ABSHL) at
    Autonomous Agents and Multiagent Systems (AAMAS). pp. 30–37 (2007)
 5. Jeffries, T., Crosland, K., Miltenberger, R.: Evaluating a tablet application and differential
    reinforcement to increase eye contact in children with autism. Journal of applied behavior
    analysis 49(1), 182–187 (2016)
 6. Leaf, J.B., Leaf, R., McEachin, J., Taubman, M., Ala'i-Rosales, S., Ross, R.K., Smith,
    T., Weiss, M.J.: Applied behavior analysis is a science and, therefore, progressive. Journal of
    autism and developmental disorders 46(2), 720–731 (2016)
 7. Palestra, G., Cazzato, D., Adamo, F., Bortone, I., Distante, C.: Assistive robot, rgb-d sensor
    and graphical user interface to encourage communication skills in asd population. Journal of
    Medical Robotics Research p. 1740002 (2016)
 8. Palestra, G., Varni, G., Chetouani, M., Esposito, F.: A multimodal and multilevel system
    for robotics treatment of autism in children. In: Proceedings of the International Workshop
    on Social Learning and Multimodal Interaction for Designing Artificial Agents. p. 3. ACM
    (2016)
 9. Scassellati, B., Admoni, H., Matarić, M.: Robots for use in autism research. Annual review
    of biomedical engineering 14, 275–294 (2012)
10. Yun, S.S., Kim, H., Choi, J., Park, S.K.: A robot-assisted behavioral intervention system
    for children with autism spectrum disorders. Robotics and Autonomous Systems 76, 58–67
    (2016)
11. Zheng, Z., Young, E.M., Swanson, A.R., Weitlauf, A.S., Warren, Z.E., Sarkar, N.: Robot-
    mediated imitation skill training for children with autism. IEEE Transactions on Neural Sys-
    tems and Rehabilitation Engineering 24(6), 682–691 (2016)