<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Towards Game-based Assessment of Executive Functions in Children</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Alexis Lueckenhoff</string-name>
          <email>alexis.lueckenhoff@uta.edu</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Maria Kyrarini</string-name>
          <email>maria.kyrarini@uta.edu</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Callen Wessels</string-name>
          <email>callen.wessels@uta.edu</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Fillia Makedon</string-name>
          <email>makedon@uta.edu</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Computer Science and Engineering Department, The University of Texas at Arlington</institution>
          ,
          <addr-line>Arlington, Texas</addr-line>
          ,
          <country country="US">USA</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>Executive Functions are very important mental skills that help us
coordinate, plan, pay attention, organize, and multitask, among
others. Weak executive functions may affect school or work
performance. Therefore, there is a need to identify executive
function deficits early in childhood and to enable interventions
that could improve executive functioning skills. In this work, we
present a game-based assessment system of executive functions in
children that could be performed at home. The proposed system
utilizes machine learning techniques to detect and track head and
eye movements from image frames and fuses these data with game
performance. A novel variation of the Flanker task has been
developed as a game to measure engagement, attention, working
memory, and processing speed. In the future, the proposed system
will be evaluated in a real-world study on children between 6 and
14 years old.</p>
      </abstract>
      <kwd-group>
        <kwd>Game-based assessment</kwd>
        <kwd>Executive Function</kwd>
        <kwd>Flanker Task</kwd>
        <kwd>Eye Gaze</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-2">
      <title>CCS CONCEPTS</title>
      <p>• Human-centered computing ~ Human-computer interaction
(HCI) ~ Interaction paradigms ~ Web-based interaction</p>
    </sec>
    <sec id="sec-3">
      <title>1 Introduction</title>
      <p>
        Executive Functions (EFs) are a set of cognitive skills that support
the regulation of thoughts, emotions, and behaviors. EFs play a
very important role, as they help us achieve goals in our daily
lives, whether we are planning an event, multitasking, or regulating
emotions. EFs are essential for school achievements, for the
preparation and adaptability of our future workforce, and for
avoiding a wide range of health problems [
        <xref ref-type="bibr" rid="ref1">1</xref>
]. EFs develop dramatically
during infancy and childhood. Executive function
deficits are common symptoms of some neurodevelopmental
disorders observed in children, such as Attention Deficit and
Hyperactivity Disorder (ADHD), Learning Disability (LD), and
Autism Spectrum Disorder (ASD) [
        <xref ref-type="bibr" rid="ref2 ref3">2, 3</xref>
]. In the U.S., an estimated
9.26% of children aged 6 to 11 years have ADHD, 8.02% have
LD, and 1.75% have ASD [
        <xref ref-type="bibr" rid="ref3">3</xref>
]. Therefore, there
is a fundamental need to help children with neurodevelopmental
disorders overcome EF deficits. The
development of EFs requires proper assessment and intervention
at the appropriate time during childhood [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. Traditionally,
psychologists and medical experts have been assessing EFs
through written closed-ended questionnaires that the children,
their parents, and their teachers are required to complete. However,
these assessments are subjective, as they are based on the personal
feelings and opinions of the respondents, and time-consuming, as
they require multiple visits. Therefore, an objective system to assess
EFs is vital.
      </p>
      <p>
        The NIH toolbox cognitive battery [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] is a set of computer-based
tests to assess EFs, such as working memory, inhibitory control,
attention, and processing speed. When a test is completed, the
NIH toolbox yields the measured scores. However, the NIH
toolbox calculates the scores based solely on the child’s
performance during the test. Nowadays, devices such as smartphones,
tablets, and laptops are part of the everyday life of children, and
most children play video games from a young age. Therefore, a child
may not be engaged by the NIH toolbox tests and, because of this, may
not perform well.
      </p>
      <p>
        Another assessment system is the Activate Test of Embodied
Cognition (ATEC) [
        <xref ref-type="bibr" rid="ref6">6</xref>
          ,
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], which is designed to measure EFs in
children through physically and cognitively demanding tasks.
Embodied cognition is a theory of cognitive psychology
suggesting that bodily actions can influence cognition [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. The
ATEC has 17 physical tasks with several variations and difficulty
levels, designed to provide measurements of executive and motor
functions. The ATEC was developed for school environments and
consists of two Kinect cameras, a large screen, and a tablet
interface for the administrator. However, the ATEC system is not
suited for a home environment.
      </p>
      <p>Moreover, children with weak EFs may remain undetected because
of limited access to health professionals. Identifying issues with
EF early can be beneficial for the child’s development and could
improve the likelihood of success in school and later in life.
Therefore, it is crucial to have an assessment system of EF that is
engaging and can be conducted at home with widely-used
everyday devices. In this paper, we propose a Game-based
Assessment Test of EFs (G-ATEF), which is web-based and
compatible with the most widely-used devices (e.g. smartphones,
laptops, tablets). Additionally, G-ATEF measures not only the
game performance metrics but also physiological measurements,
such as eye and head movements, from a camera already available
on the devices. The eye and head movements of the children
during the game can provide valuable information regarding
engagement and attention. Deep learning methods will be utilized
to identify the movements from the camera images and to
calculate the scores of attention, engagement, working memory,
and inhibition by combining the eye and head movements with
the game performance.</p>
      <p>The rest of the paper is organized as follows: Section 2 presents
an overview of the G-ATEF system, Section 3 discusses the
proposed game, and Section 4 concludes and provides future
directions.</p>
    </sec>
    <sec id="sec-4">
      <title>2 Overview of the Proposed System</title>
      <p>
        Figure 1 illustrates an overview of the proposed G-ATEF system.
The G-ATEF consists of a web-based Graphical User Interface
(GUI) that is compatible with most smartphones, tablets, laptops,
etc. At the beginning of the assessment, the parents are required
to give their consent and to provide an email address so they can receive
the assessment scores and additional information about EFs at the
end of the test. First, the GUI instructs the child to look at specific
locations on the screen, as calibration is required to enable
accurate eye tracking; a minimal sketch of such a calibration step is
given below. Consent from both the parent and the child is requested
and required for this camera-based functionality as well. Subsequently,
the child starts playing the proposed game described in Section 3.
      </p>
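      <p>As an illustration only, and not the exact G-ATEF procedure, such a calibration could fit a per-child affine map from the gaze network’s predicted directions to screen coordinates; the data shapes and function names below are our own assumptions:</p>
      <preformat>
# Hedged sketch: per-child gaze calibration. While the child fixates known
# on-screen targets, fit a least-squares affine map from the gaze network's
# predicted directions to screen pixels. Shapes and names are assumptions.
import numpy as np

def fit_calibration(gaze_vectors, screen_points):
    """gaze_vectors: (N, 3) gaze directions; screen_points: (N, 2) pixel targets."""
    X = np.hstack([gaze_vectors, np.ones((len(gaze_vectors), 1))])  # affine term
    W, residuals, rank, sv = np.linalg.lstsq(X, screen_points, rcond=None)
    return W  # (4, 2) mapping matrix

def gaze_to_screen(W, gaze_vector):
    """Map one predicted gaze direction to an estimated (x, y) screen point."""
    return np.append(gaze_vector, 1.0) @ W
      </preformat>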
      <p>
        During the game, image frames of the device’s camera are used to
detect and track the head and eyes of the child. The head is
detected and tracked by the framework developed in [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ], which
detects the face, estimates the position and orientation of the head,
and tracks the head’s pose in subsequent image frames. A
recurrent Convolutional Neural Network (CNN) developed by
[
        <xref ref-type="bibr" rid="ref10">10</xref>
] is used to detect the eye gaze; a simple head-motion feature that
this tracking could feed is sketched below.
      </p>
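      <p>A hedged sketch, assuming the head tracker exposes per-frame yaw angles (its actual output format may differ), of a rolling-variability feature over the head pose:</p>
      <preformat>
# Hedged sketch: convert per-frame head yaw from the head tracker into a
# rolling-variability "restlessness" feature. The tracker's output format
# is an assumption here; yaw is taken in degrees at roughly 30 fps.
import numpy as np

def head_restlessness(yaw_deg, window=30):
    """Rolling standard deviation of head yaw over `window` frames (~1 s)."""
    yaw = np.asarray(yaw_deg, dtype=float)
    if window > yaw.size:
        return np.array([])  # not enough frames yet
    frames = np.lib.stride_tricks.sliding_window_view(yaw, window)
    return frames.std(axis=1)  # one value per window position
      </preformat>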
      <p>
        In parallel, game performance
is analyzed to measure game metrics, such as correctness and
response time. The game metrics and the head and eye
movements are synchronized, and a deep learning framework is
used to fuse the data. The outputs of the framework are scores
for attention, working memory, engagement, and processing
speed, which are important EFs. Attention is scored based on the
correct answers in the game combined with the eye-gaze and
head-motion data. Working memory is scored based on correct
answers according to the rules of the game, and engagement is
computed from the eye-gaze and head-motion data. Processing
speed is computed from the response time in the game combined
with the eye gaze. The calculated scores are then grouped into
three classes, “low EF”, “medium EF”, and “high EF”, and are sent
to the parent with additional resources about EFs and contact
information for experts.
      </p>
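      <p>Before any learned fusion, the synchronized streams can be reduced to per-round features. The following is a minimal sketch under assumed data layouts (time-stamped gaze samples in screen pixels and per-round event times on a shared clock); it illustrates the kind of attention proxy described above, not the trained fusion model itself:</p>
      <preformat>
# Hedged sketch: align per-round game events with time-stamped gaze
# samples and derive simple attention proxies before any learned fusion.
# Data layouts below are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Round:
    t_start: float        # seconds, shared clock with the gaze stream
    t_end: float
    correct: bool
    response_time: float  # seconds

def on_target_fraction(gaze_samples, rnd, target_box):
    """gaze_samples: iterable of (t, x, y); target_box: (x0, y0, x1, y1) pixels."""
    x0, y0, x1, y1 = target_box
    inside = total = 0
    for t, x, y in gaze_samples:
        if t >= rnd.t_start and rnd.t_end >= t:   # sample falls in this round
            total += 1
            if x >= x0 and x1 >= x and y >= y0 and y1 >= y:
                inside += 1
    return inside / total if total else 0.0

def attention_proxy(rounds, fractions):
    """Blend correctness with gaze-on-target evidence (equal weights assumed)."""
    per_round = [0.5 * r.correct + 0.5 * f for r, f in zip(rounds, fractions)]
    return sum(per_round) / len(per_round)
      </preformat>
      <p>A learned fusion model could then consume such per-round features, together with raw movement statistics, to produce the final EF scores.</p>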
    </sec>
    <sec id="sec-5">
      <title>3 Proposed Game</title>
      <p>
        The NIH Toolbox proposes a Flanker Inhibitory Control and
Attention Test (Flanker Task) in order to measure EFs. In the
flanker task, the subjects are required to indicate the left or right
orientation of a centrally presented arrow that is surrounded by
two arrows on either side (i.e. the flankers) [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ].
      </p>
      <p>In this paper, we present a variation on the Flanker Task that
strives to be more engaging for children, in order to collect more
accurate measurements of their EFs. In the proposed game-based assessment
task, various sharks are arranged across the screen facing left or
right. The child is directed to focus on only one of them. Their goal is to
quickly identify its direction while ignoring the distractor sharks.
The task has different variations, or levels, in which the rules
slightly change. The first level is the closest to the traditional
flanker task. The child is instructed to focus only on the center
shark, and the trials present sequences of five sharks arranged
horizontally or vertically, or nine sharks arranged in a grid. An example
of the horizontal arrangement of the sharks is shown in Figure 2.
Level two uses a grid of nine sharks, but rather than focusing on
the middle shark, a spotlight briefly identifies the focus shark
before the sharks appear. The spotlight location changes every
round. Figure 3 shows an example of the second level of the
proposed game. Level three uses a grid layout of various-sized
sharks. The spotlight is used at this level as well. Figure 4
illustrates an example of level three.
There is an additional long-term goal for the child to focus on. If
at any time during the task the child spots a dolphin anywhere on
the screen, they are to press the dolphin button rather than the
direction of the shark in focus. The child is told about the dolphin
at the beginning of level one and is not reminded for the
remainder of the task. Figure 5 shows an example of the dolphin.
In addition to collecting the correctness and timing of each round,
head- and eye-movement data are used to discover trends in the
child’s attention and engagement. By analyzing the eye-gaze data,
we hope to be able to infer how the child approaches the task, why
the child incorrectly identifies a shark’s direction, and for how
long the child continues to look for the dolphin as the rounds
progress.</p>
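      <p>To make the round structure concrete, the following sketch generates and scores one level-one round; the congruency and dolphin probabilities and the response labels are illustrative assumptions rather than the game’s exact parameters:</p>
      <preformat>
# Hedged sketch of one level-one round of the shark game. The congruency
# and dolphin probabilities and the response labels are illustrative
# assumptions, not the exact parameters of the proposed game.
import random

def make_round(n_flankers=4, p_incongruent=0.5, p_dolphin=0.1):
    target = random.choice(["left", "right"])
    if random.random() > p_incongruent:           # congruent: flankers match
        flankers = [target] * n_flankers
    else:                                         # incongruent: flankers oppose
        flankers = ["left" if target == "right" else "right"] * n_flankers
    dolphin = random.random() > 1.0 - p_dolphin   # rare long-term-goal event
    return {"target": target, "flankers": flankers, "dolphin": dolphin}

def score_response(rnd, response, t_shown, t_answered):
    """response: 'left', 'right', or 'dolphin'; timestamps share one clock."""
    expected = "dolphin" if rnd["dolphin"] else rnd["target"]
    return {"correct": response == expected,
            "response_time": t_answered - t_shown}
      </preformat>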
    </sec>
    <sec id="sec-6">
      <title>4 Conclusion and Future Directions</title>
      <p>In this position paper, we have proposed a game-based assessment
system of EFs in children. A web-based GUI has been developed
to enable a child to play the game and the head and eye
movements are detected and tracked by a camera and advanced
machine learning techniques. We have designed a novel game
based on the flanker task, which can measure EFs, such as
engagement, attention, working memory, and processing speed.
The proposed system has the potential to be used as a home
assessment tool that provides parents with initial indications of
whether to seek further professional assistance. The next step of our research is to
conduct a real-world study with children in elementary school
(age range between 6 and 14 years old), to evaluate the proposed
system and its machine/deep learning algorithms.</p>
    </sec>
    <sec id="sec-7">
      <title>ACKNOWLEDGMENT</title>
      <p>This paper is based upon work supported by the National Science
Foundation under Grant No. 1565328. Any opinions, findings, and
conclusions or recommendations expressed in this paper are those
of the author(s) and do not necessarily reflect the views of the
National Science Foundation.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1] Center on the Developing Child at Harvard University,
          <year>2012</year>
          .
          <article-title>Executive Function (InBrief)</article-title>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name><surname>Scandurra</surname>, <given-names>V.</given-names></string-name>,
          <string-name><surname>Emberti Gialloreti</surname>, <given-names>L.</given-names></string-name>,
          <string-name><surname>Barbanera</surname>, <given-names>F.</given-names></string-name>,
          <string-name><surname>Scordo</surname>, <given-names>M.</given-names></string-name>,
          <string-name><surname>Pierini</surname>, <given-names>A.</given-names></string-name>, and
          <string-name><surname>Canitano</surname>, <given-names>R.</given-names></string-name>,
          <year>2019</year>
          .
          <article-title>Neurodevelopmental Disorders and Adaptive Functions: A Study of Children with Autism Spectrum Disorders (ASD) and/or Attention Deficit and Hyperactivity Disorder (ADHD)</article-title>
          .
          <source>Frontiers in Psychiatry</source>
          ,
          <volume>10</volume>
          , p.
          <fpage>673</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Zablotsky</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Black</surname>
            ,
            <given-names>L.I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Maenner</surname>
            ,
            <given-names>M.J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Schieve</surname>
            ,
            <given-names>L.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Danielson</surname>
            ,
            <given-names>M.L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bitsko</surname>
            ,
            <given-names>R.H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Blumberg</surname>
            ,
            <given-names>S.J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kogan</surname>
            ,
            <given-names>M.D.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Boyle</surname>
            ,
            <given-names>C.A.</given-names>
          </string-name>
          ,
          <year>2019</year>
          .
          <article-title>Prevalence and Trends of Developmental Disabilities among Children in the United States: 2009-2017</article-title>
          .
          <source>Pediatrics</source>
          ,
          <volume>144</volume>
          (
          <issue>4</issue>
          ), p.
          <fpage>e20190811</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>McClelland</surname>
            ,
            <given-names>M.M.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Cameron</surname>
            ,
            <given-names>C.E.</given-names>
          </string-name>
          ,
          <year>2012</year>
          .
          <article-title>Self‐regulation in early childhood: Improving conceptual clarity and developing ecologically valid measures</article-title>
          .
          <source>Child development perspectives</source>
          ,
          <volume>6</volume>
          (
          <issue>2</issue>
          ), pp.
          <fpage>136</fpage>
          -
          <lpage>142</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <surname>Zelazo</surname>
            ,
            <given-names>P.D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Anderson</surname>
            ,
            <given-names>J.E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Richler</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wallner‐Allen</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Beaumont</surname>
            ,
            <given-names>J.L.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Weintraub</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <year>2013</year>
          .
          <article-title>II. NIH Toolbox Cognition Battery (CB): Measuring executive function and attention</article-title>
          .
          <source>Monographs of the Society for Research in Child Development</source>
          ,
          <volume>78</volume>
          (
          <issue>4</issue>
          ), pp.
          <fpage>16</fpage>
          -
          <lpage>33</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Dillhoff</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tsiakas</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Babu</surname>
            ,
            <given-names>A.R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zakizadehghariehali</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Buchanan</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bell</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Athitsos</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Makedon</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <year>2019</year>
          , September.
          <article-title>An automated assessment system for embodied cognition in children: from motion data to executive functioning</article-title>
          .
          <source>In Proceedings of the 6th International Workshop on Sensor-based Activity Recognition and Interaction</source>
          (pp.
          <fpage>1</fpage>
          -
          <lpage>6</lpage>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name><surname>Ramesh Babu</surname>, <given-names>A.</given-names></string-name>,
          <string-name><surname>Zadeh</surname>, <given-names>M.Z.</given-names></string-name>,
          <string-name><surname>Jaiswal</surname>, <given-names>A.</given-names></string-name>,
          <string-name><surname>Lueckenhoff</surname>, <given-names>A.</given-names></string-name>,
          <string-name><surname>Kyrarini</surname>, <given-names>M.</given-names></string-name>, and
          <string-name><surname>Makedon</surname>, <given-names>F.</given-names></string-name>,
          <year>2020</year>
          .
          <article-title>A Multi-modal System to Assess Cognition in Children from their Physical Movements</article-title>
          .
          <source>In Proceedings of the 2020 International Conference on Multimodal Interaction</source>
          (pp.
          <fpage>6</fpage>
          -
          <lpage>14</lpage>
          ). ACM.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <surname>Babu</surname>
            ,
            <given-names>A.R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zakizadeh</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Brady</surname>
            ,
            <given-names>J.R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Calderon</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Makedon</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <year>2019</year>
          , August.
          <article-title>An Intelligent Action Recognition System to assess Cognitive Behavior for Executive Function Disorder</article-title>
          .
          <source>In 2019 IEEE 15th International Conference on Automation Science and Engineering (CASE)</source>
          (pp.
          <fpage>164</fpage>
          -
          <lpage>169</lpage>
          ). IEEE.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Goldau</surname>
            ,
            <given-names>F.F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shastha</surname>
            ,
            <given-names>T.K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kyrarini</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Gräser</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <year>2019</year>
          , June.
          <article-title>Autonomous multi-sensory robotic assistant for a drinking task</article-title>
          .
          <source>In 2019 IEEE 16th International Conference on Rehabilitation Robotics (ICORR)</source>
          (pp.
          <fpage>210</fpage>
          -
          <lpage>216</lpage>
          ). IEEE.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Palmero</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Selva</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bagheri</surname>
            ,
            <given-names>M.A.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Escalera</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <year>2018</year>
          .
          <article-title>Recurrent CNN for 3d gaze estimation using appearance and shape cues</article-title>
          . arXiv preprint arXiv:1805.03064.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>