<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Artificial Intelligence for Robot-Assisted Treatment of Autism</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Giuseppe Palestra</string-name>
          <email>giuseppe.palestra@uniba.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Berardina De Carolis</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Floriana Esposito</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Department of Computer Science, University of Bari</institution>
          ,
          <addr-line>Bari</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>Designing robot-based treatments for children with Autism Spectrum Disorder (ASD) is a growing research field. This paper presents an artificial intelligence system based on a robot-assisted treatment of autism. The robot acts as a social mediator, trying to elicit specific behaviors in autistic children. A first preliminary evaluation of the system has been performed, involving 3 high-functioning children with autism spectrum disorders. The experiments carried out make it possible to evaluate the behavioral response of the children in the eye contact exercise.</p>
      </abstract>
      <kwd-group>
        <kwd>artificial intelligence</kwd>
        <kwd>social robots</kwd>
        <kwd>autism spectrum disorder</kwd>
        <kwd>eye contact</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        Autism is a severe developmental disorder characterized by difficulties in social
interaction and communication and by a tendency to engage in repetitive patterns of
behavior. A fairly large number of early diagnosis and treatment protocols have been designed,
empirically tested, and published in the autism literature. The most recent protocols are
derived from Applied Behavior Analysis (ABA) [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] and focus on teaching
new skills to autistic children. Artificial intelligence, and robotics in particular, suggests
that robots can play a promising role in building interventions that help autistic
children cope with their impairments related to eye contact, joint attention, imitation, and
emotion recognition and production. Several social robots are able to execute tasks in
autism treatment. Each social robot differs in physical appearance, targeted elicited
behavior, and level of autonomy [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. These characteristics are currently under investigation
to understand how, and to what extent, they influence the treatment. Nevertheless, in
the state of the art, significant attention is given to robot characteristics, whereas how
artificial intelligence can be integrated into traditional
autism treatments has not been sufficiently investigated [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. A natural robot-assisted treatment for ASD children requires the robot to
have, or simulate, intelligent behavior and interaction, based on understanding human speech and body
language, recognizing emotions, establishing eye contact, and other typically
intelligent behaviors. Building a natural assisted treatment for ASD children requires a
multidisciplinary effort: therapists, psychologists, robot developers, and
researchers are all involved in designing robotic treatment protocols for autistic people. The
aim of this work is to present an artificial intelligence system based on a robot-assisted
treatment protocol for autistic children.
      </p>
      <p>The protocol has been partially used in the SARACEN (Social Assistive Robots
for Autistic Children EducatioN) project aimed at developing innovative methods for
early diagnosis of ASD and therapy support for autistic children with socially assistive
robots. SARACEN has been partially supported by the Italian Ministry for Education,
University and Research (MIUR) and by the European Union in the framework of Smart
Cities and Communities and Social Innovation of 2007-2013.</p>
      <p>This paper is organized as follows. Section 2 reports the state of the art relative to
the robot-assisted treatments for autistic children. Section 3 presents an overview of the
Artificial Intelligence in Robot-Child Interaction. Section 4 describes the experimental
setup and the preliminary results. Finally, conclusions are drawn.
</p>
    </sec>
    <sec id="sec-2">
      <title>Related Work</title>
      <p>
        Positive effects of social robots in the treatment of autistic children are already reported in
the literature for eliciting specific behaviors in ASD children, such as emotion generation
[
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], joint attention and triadic interaction [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], and eye contact and social gaze [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Many
other studies report evidence of the use of social robots in the treatment of autistic children,
considering just one aspect of the impairment at a time. Barakova and Lourens [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ] present
three ABA interventions based on the NAO robot. The authors analyze the need and the
opportunity for combining artificial intelligence with an application domain such as
that of autism. Jarrold [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ] proposes an AI-based tutoring system for ASD children that
teaches mind-reading skills. Only some studies take into account more than one aspect of
the impairment of ASD children. Zheng et al. [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ] present a robot-mediated therapeutic
system for imitation skill learning. The system is designed so that it
can operate autonomously or with a therapist, depending on the therapy needs. Their
study is aimed at drawing the attention of children with ASD and teaching gestures.
Palestra et al. [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] present the implementation of an interface for digital PECS therapy
that enables ASD people to overcome imitation and motor skills difficulties. None of
these studies takes into account all the disabilities of autistic children during the
treatment. What is missing in the behavioral autism treatment of children is an
artificial intelligence technology that can deal with several disorders
during the treatment.
      </p>
    </sec>
    <sec id="sec-3">
      <title>Artificial Intelligence in Robot-Child Interaction</title>
      <p>The proposed system for ASD treatment includes four main modules: the RGB-D
camera, the workstation, the social robot, and the robot camera. An overview of the
system is illustrated in Figure 1. The child’s behavior is captured by two cameras: a
5-megapixel autofocus camera on board the robot, and an RGB-D camera. The social
robot is the SoftBank NAO H25 humanoid robot 1. NAO has the following technical
characteristics:
– 25 degrees of freedom (11 for the lower part and 14 for the upper part);
– x86 AMD Geode 500 MHz CPU with 256 MB of RAM and 2 GB of storage;</p>
      <sec id="sec-3-1">
        <title>1 https://www.ald.softbankrobotics.com/en/cool-robots/nao</title>
        <p>– Ethernet and Wi-Fi connections.</p>
        <p>The workstation is equipped with:
– Intel Core i7-4700MQ CPU (2.40 GHz), with 8GB of RAM;
– 1TB of storage;
– Ubuntu GNU/Linux 16.10 as operating system;
and the following software modules:
– NAOqi API;
– Kinect SDK 2;
– the Robot Intelligence Module (RIM);
– the Behavior Manager (BM).</p>
        <p>The workstation uses the NAOqi API to communicate with the robot, both to
capture the video stream from the robot camera and to activate specific robot
behaviors. The Kinect SDK 2 installed on the workstation is used to acquire the depth
stream from the RGB-D camera. The video and depth frames acquired via the
sensors are then sent to the RIM. This module is composed of four software components: head
pose, body posture, eye contact, and facial expression. Each component uses specific
computer vision algorithms. The RIM detects the child’s non-verbal signals and transfers
them to the BM, which implements the treatment protocol. A log file in the
BM reports an anonymous code for the child, the behavior performed by the robot,
the behavioral response of the child, and the exercise performed by the social robot.
The protocol is designed to improve a behavior that is difficult for an autistic child. It is based
on the ABA program, which includes a stimulus presentation, a behavioral response, and
a reinforcement. The new aspect is the presence of a social robot as a partner performing
the treatment. The protocol has five exercises with three levels of difficulty (see Figure
2). The exercises focus on: eye contact, joint attention, body imitation, facial imitation,
and facial expression imitation. The child has to perform each level 5 times, and
when he/she performs the exercise correctly he/she can pass to the next level. The therapist
can assign each exercise, or a set of exercises, to a child according to the functioning level.</p>
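        <p>The logging and level-progression behavior described above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions: the class and field names (BehaviorManager, LogRecord) and the pass-on-correct rule are ours, not the authors' implementation.</p>
        <preformat>
```python
from dataclasses import dataclass

LEVELS = ["easy", "medium", "hard"]

@dataclass
class LogRecord:
    child_code: str           # anonymous code for the child
    robot_behavior: str       # behavior performed by the robot
    child_response: str       # behavioral response of the child
    exercise: str             # exercise performed by the social robot

class BehaviorManager:
    """Minimal sketch of the BM: logs each trial and advances levels."""

    def __init__(self, child_code, exercise):
        self.child_code = child_code
        self.exercise = exercise
        self.level_idx = 0
        self.log = []

    @property
    def level(self):
        return LEVELS[self.level_idx]

    def record_trial(self, robot_behavior, child_response):
        """Log one trial; pass to the next level on a correct response."""
        self.log.append(LogRecord(self.child_code, robot_behavior,
                                  child_response, self.exercise))
        correct = child_response == "eye_contact"
        if correct and self.level_idx + 1 != len(LEVELS):
            self.level_idx += 1
        return correct
```
        </preformat>
        <p>Keeping every trial in the log, rather than only successes, is what allows the per-session counts analyzed in Section 4 to be reconstructed.</p>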
        <p>In this study only the eye contact exercise has been carried out with autistic children.
Therefore, only the eye contact exercise is described in this subsection.
</p>
        <sec id="sec-3-1-1">
          <title>Robot-Assisted Eye Contact Exercise</title>
          <p>
            The eye contact exercise is designed to improve eye contact, a behavior typically reduced
in ASD children and essential for interpersonal communication [
            <xref ref-type="bibr" rid="ref5">5</xref>
            ]. The
exercise consists of three levels, which differ in terms of stimulus and reinforcement.
Easy Level The robot performs the stimulus: it calls the child by name and says "Look at
me". The robot repeats the stimulus until the child looks at the robot (behavioral response).
When eye contact occurs, NAO says "Good!" followed by the name of the child and plays
music (reinforcement).
          </p>
          <p>Medium Level In this level, the stimulus is changed: the robot calls the child by name
but does not say "Look at me".</p>
        </sec>
      </sec>
      <sec id="sec-3-2">
        <title>Hard Level The robot does not play the music in this level.</title>
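        <p>Since the three levels differ only in their stimulus and reinforcement settings, they can be expressed naturally as configuration data. The following sketch is illustrative only; the dictionary keys and function names are assumptions, not the authors' code.</p>
        <preformat>
```python
# Each level toggles one stimulus option and one reinforcement option.
EYE_CONTACT_LEVELS = {
    "easy":   {"say_look_at_me": True,  "play_music": True},
    "medium": {"say_look_at_me": False, "play_music": True},
    "hard":   {"say_look_at_me": False, "play_music": False},
}

def stimulus(level, child_name):
    """Build the utterance the robot repeats until eye contact occurs."""
    cfg = EYE_CONTACT_LEVELS[level]
    parts = [child_name]
    if cfg["say_look_at_me"]:
        parts.append("Look at me")
    return ", ".join(parts)

def reinforcement(level, child_name):
    """Verbal praise at every level; music only when the level allows it."""
    cfg = EYE_CONTACT_LEVELS[level]
    actions = ["say: Good! " + child_name]
    if cfg["play_music"]:
        actions.append("play music")
    return actions
```
        </preformat>
        <p>With this structure, changing a stimulus or reinforcement, or adding a new level, is a one-line edit to the configuration table rather than a change to the control flow.</p>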
        <sec id="sec-3-2-1">
          <title>Automatic Eye Contact Detection</title>
          <p>This section describes the computer vision algorithm used to implement eye contact
detection. Our algorithm requires the
RGB camera of the robot to be placed close (max 40-50 cm) to the face of the child.</p>
          <p>The pipeline of the eye contact detector, illustrated in Figure 3, is composed of five
steps: eye detection, preprocessing, iris detection, pupil detection, and pupil position analysis.</p>
          <p>The algorithm takes images from the camera as input (raw images). As a first
step, the eyes are detected using the well-known Viola-Jones detector implemented in
OpenCV. Then, in the preprocessing step, the right and left eye patches are converted
to 8-bit gray-level images and several filters are applied: thresholding, erosion, and
morphological gradient. The next step
of the pipeline draws the contours of the iris, separating the dark part of the eye (iris) from
the white background (sclera). Once the location of the iris has been obtained, the pupil
can be detected by calculating the centroid of the iris. In the final step the eye contact
detection is performed. The eye patches are divided into an 8x8 grid of sections: if the pupils are
in the center of this grid, eye contact has occurred (see Figure 4); otherwise the child is
looking at something else.</p>
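          <p>The last two pipeline steps (pupil localization as the iris centroid, then the 8x8 grid test) can be sketched in pure Python on a binary iris mask. In the real system the mask comes from the OpenCV preprocessing described above; the function names and the exact choice of "center" cells here are our assumptions.</p>
          <preformat>
```python
def iris_centroid(mask):
    """Centroid (x, y) of a binary iris mask (list of rows of 0/1).
    Approximates the pupil position, as in the pipeline's fourth step."""
    xs, ys, n = 0, 0, 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None          # no iris pixels found in the patch
    return (xs / n, ys / n)

def eye_contact(mask):
    """8x8 grid test: eye contact iff the pupil falls in one of the
    four central cells of the eye patch (an assumed reading of 'center')."""
    c = iris_centroid(mask)
    if c is None:
        return False
    h = len(mask)
    w = len(mask[0])
    gx = int(c[0] * 8 // w)   # grid column, 0..7
    gy = int(c[1] * 8 // h)   # grid row, 0..7
    return gx in (3, 4) and gy in (3, 4)
```
          </preformat>
          <p>A pupil near the patch center maps to grid cells (3, 3) through (4, 4) and is classified as eye contact; a pupil near a corner maps outside those cells and is not.</p>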
        </sec>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Experiments</title>
      <p>In this section an assessment of the system is provided by analyzing the behavioral
responses of the autistic children. The analysis has the following goals:
1. to test the artificial intelligence system components in a real environment;
2. to evaluate the behavioral responses elicited in ASD children.</p>
      <p>Three children (C1, C2, C3) with a diagnosis of high-functioning ASD (age range
6-13 years) have been involved in this study.
According to the ethical guidelines, the personal data related to the children have been
anonymized so that individual identities cannot be revealed. The parents of the
children signed the informed consent, written in Italian (the participants’ mother tongue), of
which one copy has been kept by the therapists and the other by the parents of the
child. Participants have been asked to perform three sessions (S1, S2, and S3) with the
interface. Each child tested the Eye Contact exercise (Easy, Medium, and Hard levels)
to achieve a first assessment of the child with the robotic interface. Subsequently, the
robotic treatment program will be tailored to the specific needs of the child. Each
child played 15 eye contact exercises per session. A session lasted 20 minutes on
average. The experiment was conducted by an expert therapist. The children were
admitted one at a time into the experimental room. The therapist and the child entered the room
together, and the child was placed in front of the robot, sitting on a chair. Beforehand, all
children participated in a familiarization session lasting 10 minutes. Then, the therapist
introduced the social robot, providing a simple description of it and answering any of the child’s
questions. Subsequently, once the children felt comfortable in the presence of the social
robot (usually after 10 minutes), the first experimental session (S1) started under the supervision of
the therapist. In the first session the robot started with the Easy Level of the Eye
Contact exercise, as detailed in Section 3.1. In the second session the robot started with the
Medium Level of the same exercise. Finally, the third session started with the Hard Level
of the Eye Contact exercise. For each session the robot repeated 15 times the
corresponding level. At the end of each session, a debriefing was given to each participant.</p>
      <sec id="sec-4-1">
        <title>Results</title>
        <p>In general, the system was able to operate well in the treatment environment for all
the ASD children. To evaluate the behavior of the children during the interaction with
the system, the focus was on the number of eye contacts correctly performed (nEC).
The system considers 1000 seconds as the maximum time (tMAX) to perform the eye
contact.</p>
        <p>With respect to nEC, it has been observed that eye contact
had an average success rate of 73.33% in S1, 55.55% in S2, and 28.89% in S3, as
reported in Table 1.</p>
        <p>Figure 5 depicts the nEC achieved by each child in S1, S2, and S3.</p>
        <p>A first preliminary evaluation of the system has been performed, involving 3
high-functioning children with autism spectrum disorders. Analyzing the nEC in the three different
sessions, the results were encouraging. In fact, it is possible to assess in an
objective way the level of difficulty experienced by the child involved in the treatment. This measure
can be useful to adjust the treatment in the next session with the robot. In the experiment,
all children were able to perform the Eye Contact exercise well at the easy level, but they
needed help when performing the eye contact at the medium and hard levels.
In this paper an Artificial Intelligence system for Robot-Child interaction, based on a
behavioral treatment protocol, has been proposed. Results show that a social robot playing
the role of mediator can be successful in the robot-assisted treatment of autistic children.
The same children involved in this experiment will interact with the social robot
in the same exercises to test the follow-up of the treatment. Moreover, investigations
including experiments with a larger sample of autistic children interacting with the whole
protocol (eye contact, joint attention, body imitation, facial imitation, and facial
expression imitation) will allow us to test the exercises completely.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>Acknowledgments</title>
      <p>This work has been partially supported by the Italian Ministry for Education, University
and Research (MIUR) and by the European Union under Grant PON04a3_00201, SARACEN
project.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Admoni</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Scassellati</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          :
          <article-title>Social eye gaze in human-robot interaction: A review</article-title>
          .
          <source>Journal of Human-Robot Interaction</source>
          <volume>6</volume>
          (
          <issue>1</issue>
          ),
          <fpage>25</fpage>
          -
          <lpage>63</lpage>
          (
          <year>2017</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Barakova</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lourens</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          :
          <article-title>Interplay between natural and artificial intelligence in training autistic children with robots</article-title>
          .
          <source>In: International Work-Conference on the Interplay Between Natural and Artificial Computation</source>
          . pp.
          <fpage>161</fpage>
          -
          <lpage>170</lpage>
          . Springer (
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Chevalier</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Martin</surname>
            ,
            <given-names>J.C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Isableu</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bazile</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Iacob</surname>
            ,
            <given-names>D.O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tapus</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>Joint attention using human-robot interaction: Impact of sensory preferences of children with autism</article-title>
          .
          <source>In: Robot and Human Interactive Communication (RO-MAN)</source>
          ,
          <year>2016</year>
          25th IEEE International Symposium on. pp.
          <fpage>849</fpage>
          -
          <lpage>854</lpage>
          . IEEE (
          <year>2016</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Jarrold</surname>
            ,
            <given-names>W.L.</given-names>
          </string-name>
          :
          <article-title>Treating autism with the help of artificial intelligence: a value proposition</article-title>
          .
          <source>In: Proceedings of the Agent-based Systems for Human Learning Workshop (ABSHL) at Autonomous Agents and Multiagent Systems (AAMAS)</source>
          . pp.
          <fpage>30</fpage>
          -
          <lpage>37</lpage>
          (
          <year>2007</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Jeffries</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Crosland</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Miltenberger</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          :
          <article-title>Evaluating a tablet application and differential reinforcement to increase eye contact in children with autism</article-title>
          .
          <source>Journal of applied behavior analysis 49(1)</source>
          ,
          <fpage>182</fpage>
          -
          <lpage>187</lpage>
          (
          <year>2016</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Leaf</surname>
            ,
            <given-names>J.B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Leaf</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>McEachin</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Taubman</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ala'i-Rosales</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ross</surname>
            ,
            <given-names>R.K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Smith</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Weiss</surname>
            ,
            <given-names>M.J.</given-names>
          </string-name>
          :
          <article-title>Applied behavior analysis is a science and, therefore, progressive</article-title>
          .
          <source>Journal of autism and developmental disorders 46(2)</source>
          ,
          <fpage>720</fpage>
          -
          <lpage>731</lpage>
          (
          <year>2016</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Palestra</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cazzato</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Adamo</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bortone</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Distante</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>Assistive robot, rgb-d sensor and graphical user interface to encourage communication skills in asd population</article-title>
          .
          <source>Journal of Medical Robotics Research</source>
          p.
          <volume>1740002</volume>
          (
          <year>2016</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Palestra</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Varni</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chetouani</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Esposito</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          :
          <article-title>A multimodal and multilevel system for robotics treatment of autism in children</article-title>
          .
          <source>In: Proceedings of the International Workshop on Social Learning and Multimodal Interaction for Designing Artificial Agents</source>
          . p.
          <fpage>3</fpage>
          .
          ACM
          (
          <year>2016</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Scassellati</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Admoni</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Matarić</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>Robots for use in autism research</article-title>
          .
          <source>Annual review of biomedical engineering 14</source>
          ,
          <fpage>275</fpage>
          -
          <lpage>294</lpage>
          (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Yun</surname>
            ,
            <given-names>S.S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kim</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Choi</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Park</surname>
            ,
            <given-names>S.K.</given-names>
          </string-name>
          :
          <article-title>A robot-assisted behavioral intervention system for children with autism spectrum disorders</article-title>
          .
          <source>Robotics and Autonomous Systems</source>
          <volume>76</volume>
          ,
          <fpage>58</fpage>
          -
          <lpage>67</lpage>
          (
          <year>2016</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Zheng</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Young</surname>
            ,
            <given-names>E.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Swanson</surname>
            ,
            <given-names>A.R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Weitlauf</surname>
            ,
            <given-names>A.S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Warren</surname>
            ,
            <given-names>Z.E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sarkar</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          :
          <article-title>Robot-mediated imitation skill training for children with autism</article-title>
          .
          <source>IEEE Transactions on Neural Systems and Rehabilitation Engineering</source>
          <volume>24</volume>
          (
          <issue>6</issue>
          ),
          <fpage>682</fpage>
          -
          <lpage>691</lpage>
          (
          <year>2016</year>
          )
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>