<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Social Imaging and Human Technology for Empowering People</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>
            <given-names>Kenji</given-names>
            <surname>Suzuki</surname>
          </string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>University of Tsukuba</institution>
          ,
          <addr-line>Tsukuba</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>Social imaging is a technology for identifying and representing social behaviors. Using wearable devices and mixed reality technology, we aim to support people who have difficulty making facial expressions and interacting with others, helping them to express feelings and to act among people. In this paper, we give an overview of social imaging technologies and their application to empowering people. We define Social Imaging as the technology for identifying and representing social behavior, and Human Technology as the technology for understanding and shaping behavior, which brings out people's latent capabilities and potential abilities.</p>
      </abstract>
      <kwd-group kwd-group-type="author">
        <kwd>Wearable Device</kwd>
        <kwd>Affective Computing</kwd>
        <kwd>Computational Behavioral Science</kwd>
        <kwd>Autism Spectrum Disorders</kwd>
      </kwd-group>
      <kwd-group kwd-group-type="ACM">
        <kwd>H.5.2 [User Interfaces]: User-centered design</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>INTRODUCTION</title>
      <p>
        Social Imaging is defined as the technology for identifying
and representing social behavior. We regard actions for
interacting with other people according to one's own
motivation and initiative as social behavior, and we aim to
establish a methodology to support functions and
capabilities such as movement, functions of the mind and
body, activities of daily living, and social participation, in which
people play a role at work, in the family, and in the community, as
specified in the ICF (International Classification of
Functioning, Disability and Health [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]). Sociality, or social
skill, which can be observed in understanding facial
expressions, talking to someone eye-to-eye, understanding
others' stances and feelings, cooperating with each other,
expressing what you want to do, and adjusting to the environment
by moderating your own desires, is as important a characteristic
for people living in harmony with others as
intelligence and motor functions.</p>
      <p>© 2018. Copyright for the individual papers remains with the authors.
Copying permitted for private and academic purposes.
SymCollab '18, March 11, Tokyo, Japan.</p>
      <p>These technologies include: wearable technology to measure
implicit and explicit social behavior, robotics technology that
supports action and behavior modification through interactive
feedback to the sensory organs, and biosignal processing technology
that captures human intention and emotional responses. We
develop these technologies, collectively called human technology,
centered on advanced wearable technology that works with
people, to advance science and technology for
understanding and shaping behavior.</p>
    </sec>
    <sec id="sec-2">
      <title>HUMAN TECHNOLOGY</title>
      <p>One of the common global social challenges is the functional
rehabilitation and regeneration of people who have difficulties
and disabilities in movement. In addition to physically
impaired persons, the number of children with developmental disorders
such as Autism Spectrum Disorder (ASD) or Learning Disabilities
is increasing around the world. Research on
human-machine systems using robots and wearable devices
in the fields of medical care, nursing care, healthcare, medical
treatment, and education support has therefore become popular in recent
years. An example in the United States is the Cyber-Human
Systems (CHS) program, which aims to build human-machine systems
in which humans and machines coordinate with each other. In
Europe, within Horizon 2020, the societal implementation of
advances in robotics and ICT is an important issue that
has attracted considerable attention. This kind of
research on the innovation of human-machine systems
("shaping the human-technology frontier") is dramatically
increasing in importance around the world, and it is positioned
as one of the six main strategic targets of the NSF (National
Science Foundation, US), along with data science, quantum
mechanics, and astrophysics. Assistive technology is a term
used for devices for people with disabilities; it
supplements reduced physical and sensory functions. Assistive
technology aims to allow people with disabilities to
participate more fully in all aspects of life (home, school, and
community) and increases their opportunities for education,
social interaction, and meaningful employment.
At the same time, technologies for human augmentation
or enhancement have already become widely available. As
biomedical sciences and technologies advance,
new ethical, legal, and social implications must be
considered.</p>
      <p>We identify four primary categories of abilities underlying
the realization of social behavior (e.g., independent abilities)
as the major outcome measures related to movement and
understanding for individuals with disabilities. The
theoretical model is presented in Figure 1, which also shows
typical impairments and disabilities. These
factors, alone and in combination, together with an individual's type
and severity of impairment, determine how individuals
with disabilities integrate into society. In the past, shaping behavior or
training focused on voluntary actions rather than the
internal mechanisms underlying social behavior. As
mentioned above, the difficulties of children with ASD or
developmental disorders are regarded as problems in
involuntary movement. We develop a methodology that lets
them notice involuntary movements as voluntary
movements. This design methodology of assistance for
deficits in sensorimotor function may not suppress the
cause, but may lead the brain to find new solutions. It is a
triggering- or signaling-based assistance for shaping
behaviors: involuntary movements are represented as
voluntary movements, and individuals then understand the behavior
as a subjective experience.</p>
      <p>Voluntary Movement: Action is the process of acting or
doing something with one's own intention. "Voluntary"
here means that the human agent initiated the motion of her
body based on her (spontaneous) decision, regardless of
whether there has been a previous external stimulus
provoking a reflex or any internal constraint. The voluntary
(active) approach to motor learning is more effective than a
passive approach because the active approach entails
activation of the entire set of processes related to the intended
movement. Concerning the release of a controlling neural signal
(motor program), it is also irrelevant whether motion actually
occurs, since the signal may or may not result in bodily
motion (e.g., paralysis). Persons with difficulty in
voluntary movement are therefore mainly physically impaired: they have
a disability that limits their physical capacity to move and
coordinate actions, although their intention is intact.
Involuntary Movement: Persons with difficulty in
involuntary movement include patients with
neurodegenerative diseases, for example with tremor, chorea, or
myoclonus caused by Alzheimer's disease, ALS, or Parkinson's
disease. In the framework of social behavior, we assume that
persons with Autism Spectrum Disorder
(a neurodevelopmental disorder) are the main target group with
special needs. One of the main characteristics of persons with
ASD is related to social problems that include difficulty
communicating and interacting with others (DSM-5,
Diagnostic and Statistical Manual of Mental Disorders), as
well as restricted interests and repetitive behaviors [2]. We
follow the approach that ASD starts as a movement disorder
[3]. Children with ASD often exhibit clear movement
developmental delays and disorganization, often labeled as
clumsiness. The function of the brain is to organize
movement, combined with attention to the feeling of self [4].
We consider that this is related to the ability to perceive
differences, and that we therefore need to help their brains perceive
differences.</p>
      <p>Voluntary Understanding: Actions speak louder than
pictures when it comes to understanding what others are
doing and feeling [5]. The representative ability of voluntary
understanding is awareness, the state
or level of consciousness at which sense data can be confirmed
by an observer. Here we regard it as the
process of understanding one's own and others' behaviors in a
voluntary manner based on subjective experience.
However, current science, including psychology, philosophy, and
neuroscience, offers only limited explanations of subjective
experience. The mechanisms, as well as the influences of
knowledge, attention, and intention on sensory awareness,
including the perceived timing of events, are investigated in this
field. This also includes assessing the function of information
perceived without awareness in determining what is
perceived with awareness.</p>
      <p>
        Involuntary Understanding: Previous research on
perception without awareness in cognitive psychology [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] shows that
stimuli are perceived even when observers are unaware of
them. Involuntary cognitions also occur in everyday life
without a pre-determined focus [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. In particular, an
involuntary response to an unexpected and sudden stimulus
is closely linked to affective processing [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. Therefore, we
focus on emotion in this domain. In recent years, there
has been increased interest in detecting and supporting
human physical and psychological wellbeing. A potential
source of information is the experience of pleasure
expressed through facial expressions, which can be used to
infer a person's emotions and give insight into their internal
state and happiness.
      </p>
    </sec>
    <sec id="sec-3">
      <title>SOCIAL IMAGING</title>
      <p>Social imaging is regarded as a technology to identify and
represent social behaviors. By using wearable devices and
mixed reality technology, we aim to support people who
have difficulty making facial expressions and interacting
with other people, helping them to express feelings and to act among
people.</p>
      <p>Technologies that visualize biological functions and
behaviors that cannot be seen from the outside are called
"imaging". For example, "brain imaging" is the
visualization of the functions of the brain, which reveals the
structural and functional characteristics of each person's
brain. In recent years, several different sensing technologies
have been developed for the measurement and analysis of
human behavior. In this domain, "behavior imaging" has
been proposed in computational behavioral science, including
measurement from a personal viewpoint, which tries to
understand each person's behavior.</p>
      <p>
        The uses of technology in interventions for children with
autism have been widely reported (e.g., [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]). Previous studies
have shown the effectiveness of technology-based interventions
for teaching academic skills [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] or social skills [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ] [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ].
Our emerging data highlight the effects of technology
use in measuring and promoting positive social behaviors
(emotion recognition, face-to-face contact, approaching
peers, and touching peers) in children with ASD.
      </p>
      <p>
        In this study, we use several wearable devices to
measure interactions, as shown in Figure 2: EnhancedTouch
is a bracelet-type device to detect and enhance human-human
physical touch [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. Enhanced Reach is a bib-type device to
understand group dynamics and facilitate interaction [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ].
Facelooks is a head-band-type device to detect and enhance
face-to-face behavior. FUTUREGYM is an interactive
school gymnasium with a large-scale interactive floor
projection system, installed in a school setting in order to develop
social interaction skills such as prosocial behaviors, including
helping and cooperating [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ]. We also design and
implement an ergonomic wearable device, with high
reliability, for reading positive expressions from facial EMG
signals, capturing the physical and physiological signals of
emotions, such as smiles, elicited by these interactions
[
        <xref ref-type="bibr" rid="ref16">16</xref>
        ][
        <xref ref-type="bibr" rid="ref17">17</xref>
        ]. These new technologies to identify and represent social
interactions are what we propose as "social imaging." The measured
social behaviors are stored in a database as personal data,
which include physiological and behavioral characteristics.
There is a high demand in society for the establishment of an
effective developmental support method for children with
autism spectrum disorders (ASD). It is known that children
with ASD have difficulty in social interactions involving
understanding and using facial expressions. Therefore,
support of social interaction at an early stage of children's
lives has drawn considerable attention in the field of child
development. However, even though there have been several
studies of motivations for facial expressions and
communication towards people, it is difficult to carry out
objective evaluations, since it is not easy to make quantitative
measurements in daily life and at school. In order to
understand these kinds of behaviors, it is indispensable to
consider "social imaging", whose aim is to understand
people's social behaviors.
      </p>
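      <p>To make the biosignal-processing idea concrete, the following is a
minimal sketch of detecting smile-like activation bursts from a facial EMG
recording: a moving RMS envelope followed by simple thresholding. The window
length and threshold are illustrative assumptions, not the parameters of the
actual device described in [16][17].</p>
```python
# Hypothetical sketch of EMG-based smile detection. The window length and
# threshold below are assumed values for illustration only.
import math

def rms_envelope(signal, window=50):
    """Moving root-mean-square envelope of a sampled EMG signal."""
    env = []
    for i in range(len(signal)):
        start = max(0, i - window + 1)
        chunk = signal[start:i + 1]
        env.append(math.sqrt(sum(x * x for x in chunk) / len(chunk)))
    return env

def smile_events(envelope, threshold=0.5):
    """Return (start, end) index pairs where the envelope exceeds threshold."""
    events, start = [], None
    for i, v in enumerate(envelope):
        if start is None:
            if v > threshold:
                start = i
        elif not (v > threshold):
            events.append((start, i))
            start = None
    if start is not None:
        events.append((start, len(envelope)))
    return events
```
      <p>In practice the device applies far more careful processing, but this
two-stage structure (envelope extraction, then event segmentation) is a
common baseline for EMG-based expression detection.</p>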
    </sec>
    <sec id="sec-4">
      <title>DISCUSSION AND CONCLUSION</title>
      <p>
        Our goal is to provide long-term support for children with
special needs who have social impairments, and to evaluate
the effects of this support. Through a long-term feasibility
study, we have measured the smiles of children with ASD in
a quantitative manner and have reported that their smiles
might facilitate positive social behaviors [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ]. We are going
to develop an evidence-based developmental support system,
especially for shaping the social behaviors of children with ASD.
Gaining deeper insight into their behavior, and elucidating the
development of social interaction among people at an early
stage of their lives, is very significant for learning social
skills such as joint attention, imitation, and language
understanding, and for understanding the development of
communication skills. In this study, we aim to elucidate the
environmental conditions in which children with ASD can
demonstrate their abilities to the fullest, and to establish
an inclusive developmental support system
in which we support the formation of creativity and sociality, including
cognitive and language functions.
      </p>
      <p>Empathic design is an effective approach for paying attention
to users' feelings toward a solution in practical settings
such as schools, after-school playrooms, or homes. We need a
deep understanding of the existing environment, in particular
the unique goals and players of the organization. We need to
define the problem and the potential areas of impact, as well as
the allocated resources and operational processes. Then we
can align the current state of the technology with the target
environment and implement solutions in it together with the
players. Individuals with disabilities will be involved in each
activity as joint investigators of this project.</p>
      <p>We help children with developmental disorders (ASD, LD)
to interact with their peers or caregivers through given visual
aids. This helps these children to become aware of social cues, a
skill that is considered critical for the development of
social interaction. We provide them with social signals by
using visual or auditory feedback from wearable technologies.
In addition, an automatic, objective measure of
wellbeing could be used to support therapists, patients,
engineers, and doctors in improving therapeutic
activities and devices, and in understanding spontaneous
laughter and smiles. The developed wearable device for
reading positive expressions is used for this purpose. Based on further
investigations, together with verbal information, we will provide a
quantitative measure of laughter and smiles recorded during
long-term therapeutic interventions, to quantify and infer the
user's affective state in order to support psychological and
medical professionals.</p>
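      <p>As an illustrative sketch only, the per-session bookkeeping behind such
a quantitative measure could look like the following; the event format
(sample-index pairs), the field names, and the sampling rate are assumptions
for illustration, not the project's actual data format.</p>
```python
# Hypothetical per-session summary of detected smile events over a
# long-term intervention. Event format and sampling rate are assumed.

def session_summary(events, fs=1000.0):
    """Summarize (start, end) sample-index events into a count and total seconds."""
    total_seconds = sum((end - start) / fs for start, end in events)
    return {"smile_count": len(events), "smile_seconds": total_seconds}

def trend(summaries):
    """Per-session smile seconds, e.g. for plotting a long-term trend."""
    return [s["smile_seconds"] for s in summaries]
```
      <p>Aggregating such summaries across sessions is what turns individual
smile detections into a longitudinal measure that therapists can inspect.</p>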
    </sec>
    <sec id="sec-5">
      <title>ACKNOWLEDGMENTS</title>
      <p>This research was supported by JST-CREST, Japan (No.
JPMJCR14E2). The authors would like to thank all
colleagues, in particular the teachers and students of the special
needs school at Otsuka, for the FUTUREGYM project.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          World Health Organization.
          <year>2001</year>
          .
          <article-title>International Classification of Functioning, Disability and Health (ICF)</article-title>
          , Geneva, World Health Organization.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          <string-name>
            <given-names>Dimitris</given-names>
            <surname>Bolis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Leonhard</given-names>
            <surname>Schilbach</surname>
          </string-name>
          .
          <year>2017</year>
          .
          <article-title>Observing and Participating in Social Interactions: Action Perception and Action Control Across the Autistic Spectrum</article-title>
          ,
          <source>Dev Cogn Neurosci</source>
          , doi:10.1016/j.dcn.2017.01.009
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name>
            <given-names>Victoria L.</given-names>
            <surname>Chester</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Matthew</given-names>
            <surname>Calhoun</surname>
          </string-name>
          .
          <year>2012</year>
          .
          <article-title>Gait symmetry in children with autism</article-title>
          ,
          <source>Autism Research and Treatment</source>
          , Article ID
          <fpage>576478</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <string-name>
            <given-names>David</given-names>
            <surname>Franklin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Daniel</given-names>
            <surname>Wolpert</surname>
          </string-name>
          .
          <year>2011</year>
          .
          <article-title>Computational mechanisms of sensorimotor control</article-title>
          ,
          <source>Neuron</source>
          ,
          <volume>72</volume>
          ,
          <issue>3</issue>
          :
          <fpage>425</fpage>
          -
          <lpage>442</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          <string-name>
            <given-names>Charles</given-names>
            <surname>Darwin</surname>
          </string-name>
          .
          <year>1872</year>
          .
          <article-title>The expression of the emotions in man and animals</article-title>
          , London, John Murray.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          <string-name>
            <given-names>Philip M.</given-names>
            <surname>Merikle</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Daniel</given-names>
            <surname>Smilek</surname>
          </string-name>
          ,
          <string-name>
            <given-names>John D.</given-names>
            <surname>Eastwood</surname>
          </string-name>
          .
          <year>2001</year>
          .
          <article-title>Perception without awareness: perspectives from cognitive psychology</article-title>
          ,
          <source>Cognition</source>
          ,
          <volume>79</volume>
          ,
          <issue>1-2</issue>
          :
          <fpage>115</fpage>
          -
          <lpage>134</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          <string-name>
            <given-names>Julie</given-names>
            <surname>Krans</surname>
          </string-name>
          ,
          <string-name>
            <given-names>June</given-names>
            <surname>de Bree</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Michelle L.</given-names>
            <surname>Moulds</surname>
          </string-name>
          .
          <year>2015</year>
          .
          <article-title>Involuntary Cognitions in Everyday Life: Exploration of Type, Quality, Content, and Function</article-title>
          ,
          <source>Front Psychiatry</source>
          ,
          <volume>6</volume>
          :
          <fpage>7</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          <string-name>
            <given-names>Peter J.</given-names>
            <surname>Lang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Margaret M.</given-names>
            <surname>Bradley</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Bruce N.</given-names>
            <surname>Cuthbert</surname>
          </string-name>
          .
          <year>1990</year>
          .
          <article-title>Emotion, attention, and the startle reflex</article-title>
          ,
          <source>Psychol Rev</source>
          ,
          <volume>97</volume>
          :
          <fpage>377</fpage>
          -
          <lpage>395</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <string-name>
            <given-names>Joe</given-names>
            <surname>McCleery</surname>
          </string-name>
          .
          <year>2015</year>
          .
          <article-title>Comment on Technology-Based Intervention Research for Individuals on the Autism Spectrum</article-title>
          ,
          <source>J Autism Dev Disord</source>
          ,
          <volume>45</volume>
          , 12:
          <fpage>3832</fpage>
          -
          <lpage>5</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          <string-name>
            <given-names>Victoria</given-names>
            <surname>Knight</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Bethany R.</given-names>
            <surname>McKissick</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Alicia</given-names>
            <surname>Saunders</surname>
          </string-name>
          .
          <year>2013</year>
          .
          <article-title>A review of technology-based interventions to teach academic skills to students with autism spectrum disorder</article-title>
          ,
          <source>J Autism Dev Disord</source>
          ,
          <volume>43</volume>
          , 11:
          <fpage>2628</fpage>
          -
          <lpage>48</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          <string-name>
            <given-names>Florence D.</given-names>
            <surname>DiGennaro Reed</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Sarah R.</given-names>
            <surname>Hyman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Jason M.</given-names>
            <surname>Hirst</surname>
          </string-name>
          .
          <year>2011</year>
          .
          <article-title>Applications of Technology to Teach Social Skills to Children with Autism</article-title>
          ,
          <source>Research in Autism Spectrum Disorders</source>
          ,
          <volume>5</volume>
          ,
          <issue>3</issue>
          :
          <fpage>1003</fpage>
          -
          <lpage>1010</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          <string-name>
            <given-names>Anne M.</given-names>
            <surname>Donnellan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>David A.</given-names>
            <surname>Hill</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Martha R.</given-names>
            <surname>Leary</surname>
          </string-name>
          .
          <year>2012</year>
          .
          <article-title>Rethinking autism: implications of sensory and movement differences for understanding and support</article-title>
          ,
          <source>Front Integr Neurosci</source>
          ,
          <volume>6</volume>
          :
          <fpage>124</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          <string-name>
            <given-names>Kenji</given-names>
            <surname>Suzuki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Taku</given-names>
            <surname>Hachisu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Kazuki</given-names>
            <surname>Iida</surname>
          </string-name>
          .
          <year>2016</year>
          ,
          <article-title>EnhancedTouch: A Smart Bracelet for Enhancing Human-Human Physical Touch</article-title>
          ,
          <source>In Proceedings of SIGCHI Conference on Human Factors in Computing Systems (CHI'16)</source>
          ,
          <fpage>1282</fpage>
          -
          <lpage>1293</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          <string-name>
            <given-names>Asaki</given-names>
            <surname>Miura</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Takashi</given-names>
            <surname>Isezaki</surname>
          </string-name>
          and
          <string-name>
            <given-names>Kenji</given-names>
            <surname>Suzuki</surname>
          </string-name>
          .
          <year>2013</year>
          .
          <article-title>Social Playware with an Enhanced Reach for Facilitating Group Interaction</article-title>
          ,
          <source>In Proceedings of CHI '13 Extended Abstracts on Human Factors in Computing Systems</source>
          ,
          <fpage>1155</fpage>
          -
          <lpage>1160</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          <string-name>
            <given-names>Issey</given-names>
            <surname>Takahashi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Mika</given-names>
            <surname>Oki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Baptiste</given-names>
            <surname>Bourreau</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Itaru</given-names>
            <surname>Kitahara</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Kenji</given-names>
            <surname>Suzuki</surname>
          </string-name>
          .
          <year>2017</year>
          .
          <article-title>FUTUREGYM: A gymnasium with interactive floor projection for children with special needs</article-title>
          ,
          <source>Int J Child Comput Interact</source>
          , (in press)
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          <string-name>
            <given-names>Anna</given-names>
            <surname>Gruebler</surname>
          </string-name>
          and
          <string-name>
            <given-names>Kenji</given-names>
            <surname>Suzuki</surname>
          </string-name>
          .
          <year>2014</year>
          .
          <article-title>Design of a Wearable Device for Reading Positive Expressions from Facial EMG Signals</article-title>
          ,
          <source>IEEE Trans Affect Comput</source>
          ,
          <volume>5</volume>
          ,
          <issue>3</issue>
          :
          <fpage>227</fpage>
          -
          <lpage>237</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          <string-name>
            <given-names>Monica</given-names>
            <surname>Perusquia-Hernandez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Masakazu</given-names>
            <surname>Hirokawa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Kenji</given-names>
            <surname>Suzuki</surname>
          </string-name>
          .
          <year>2017</year>
          .
          <article-title>A Wearable Device for Fast and Subtle Spontaneous Smile Recognition</article-title>
          ,
          <source>IEEE Trans Affect Comput</source>
          ,
          <volume>8</volume>
          ,
          <issue>4</issue>
          :
          <fpage>522</fpage>
          -
          <lpage>533</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          <string-name>
            <given-names>Atsushi</given-names>
            <surname>Funahashi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Anna</given-names>
            <surname>Gruebler</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Takashi</given-names>
            <surname>Aoki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Hideki</given-names>
            <surname>Kadone</surname>
          </string-name>
          and
          <string-name>
            <given-names>Kenji</given-names>
            <surname>Suzuki</surname>
          </string-name>
          .
          <year>2014</year>
          .
          <article-title>The Smiles of a Child with Autism Spectrum Disorder During an Animal-assisted Activity May Facilitate Social Positive Behaviors - Quantitative Analysis with Smile-detecting Interface</article-title>
          ,
          <source>J Autism Dev Disord</source>
          ,
          <volume>44</volume>
          , 3:
          <fpage>685</fpage>
          -
          <lpage>693</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>