<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">Towards Game-based Assessment of Executive Functions in Children</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Alexis</forename><surname>Lueckenhoff</surname></persName>
							<email>alexis.lueckenhoff@uta.edu</email>
							<affiliation key="aff0">
								<orgName type="department">Computer Science and Engineering Department</orgName>
								<orgName type="institution">The University of Texas at Arlington</orgName>
								<address>
									<settlement>Arlington</settlement>
									<region>Texas</region>
									<country key="US">USA</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Callen</forename><surname>Wessels</surname></persName>
							<email>callen.wessels@uta.edu</email>
							<affiliation key="aff1">
								<orgName type="department">Computer Science and Engineering Department</orgName>
								<orgName type="institution">The University of Texas at Arlington</orgName>
								<address>
									<settlement>Arlington</settlement>
									<region>Texas</region>
									<country key="US">USA</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Maria</forename><surname>Kyrarini</surname></persName>
							<email>maria.kyrarini@uta.edu</email>
							<affiliation key="aff2">
								<orgName type="department">Computer Science and Engineering Department</orgName>
								<orgName type="institution">The University of Texas at Arlington</orgName>
								<address>
									<settlement>Arlington</settlement>
									<region>Texas</region>
									<country key="US">USA</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Fillia</forename><surname>Makedon</surname></persName>
							<email>makedon@uta.edu</email>
							<affiliation key="aff3">
								<orgName type="department">Computer Science and Engineering Department</orgName>
								<orgName type="institution">The University of Texas at Arlington</orgName>
								<address>
									<settlement>Arlington</settlement>
									<region>Texas</region>
									<country key="US">USA</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">Towards Game-based Assessment of Executive Functions in Children</title>
					</analytic>
					<monogr>
						<imprint>
							<date/>
						</imprint>
					</monogr>
					<idno type="MD5">990281B0B856CE75B95C37B1802C693B</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2023-03-25T00:01+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Game-based assessment</term>
					<term>Executive Function</term>
					<term>Flanker Task</term>
					<term>Eye Gaze</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>Executive Functions are important mental skills that help us coordinate, plan, pay attention, organize, and multitask, among other abilities. Weak executive functions may affect school or work performance. Therefore, there is a need to identify executive function deficits early in childhood and to enable interventions that could improve executive functioning skills. In this work, we present a game-based assessment system for executive functions in children that can be performed at home. The proposed system utilizes machine learning techniques to detect and track head and eye movements from image frames and fuses these data with game performance. A novel variation of the Flanker Task has been developed as a game to measure engagement, attention, working memory, and processing speed. In the future, the proposed system will be evaluated in a real-world study with children between 6 and 14 years old.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1">Introduction</head><p>Executive Functions (EFs) are a set of cognitive skills that support the regulation of thoughts, emotions, and behaviors. EFs play an important role in our daily lives, helping us achieve goals, whether planning an event, multi-tasking, or regulating emotions. EFs are essential for school achievement, for preparing an adaptable future workforce, and for avoiding a wide range of health problems <ref type="bibr" target="#b0">[1]</ref>. EFs develop dramatically during infancy and childhood. Executive function deficits are common symptoms of several neurodevelopmental disorders observed in children, such as Attention Deficit and Hyperactivity Disorder (ADHD), Learning Disability (LD), and Autism Spectrum Disorder (ASD) <ref type="bibr" target="#b1">[2,</ref><ref type="bibr" target="#b2">3]</ref>. In the U.S., according to researchers, 9.26% of children between 6 and 11 years of age suffer from ADHD, 8.02% from LD, and 1.75% from ASD <ref type="bibr" target="#b2">[3]</ref>. Therefore, there is a fundamental need to help children with neurodevelopmental disorders overcome EF deficits. The development of EFs requires proper assessment and intervention at the appropriate time during childhood <ref type="bibr" target="#b3">[4]</ref>. Traditionally, psychologists and medical experts have assessed EFs through written closed-ended questionnaires that children, their parents, and their teachers are required to complete. However, these assessments are subjective, reflecting the personal feelings and opinions of the respondents, and time-consuming, as they require multiple visits. Therefore, an objective system to assess EFs is vital.</p><p>The NIH toolbox cognitive battery <ref type="bibr" target="#b4">[5]</ref> is a set of computer-based tests that assess EFs such as working memory, inhibitory control, attention, and processing speed. When a test is completed, the NIH toolbox yields the measured scores. However, these scores are based solely on the child's performance during the test. Nowadays, devices such as smartphones, tablets, and laptops are part of children's everyday lives, and most children play video games from a young age. A child may therefore not be engaged by the NIH toolbox tests and, as a result, may not perform well.</p><p>Another assessment system is the Activate Test of Embodied Cognition (ATEC) <ref type="bibr" target="#b5">[6]</ref> <ref type="bibr" target="#b6">[7]</ref>, which is designed to measure EFs in children through physically and cognitively demanding tasks. Embodied cognition is a theory of cognitive psychology suggesting that bodily actions can influence cognition <ref type="bibr" target="#b7">[8]</ref>. The ATEC has 17 physical tasks with several variations and difficulty levels, designed to provide measurements of executive and motor functions. The ATEC was developed for school environments and consists of two Kinect cameras, a large screen, and a tablet interface for the administrator. As a result, the ATEC system is not suited to a home environment.</p><p>Moreover, children with weak EFs may remain undetected because of limited access to health professionals. Identifying EF issues early can benefit the child's development and improve the likelihood of success in school and later in life. Therefore, it is crucial to have an EF assessment system that is engaging and can be conducted at home with widely-used everyday devices. In this paper, we propose a Game-based Assessment Test of EFs (G-ATEF), which is web-based and compatible with the most widely-used devices (e.g., smartphones, laptops, tablets). Additionally, G-ATEF measures not only game performance metrics but also physiological measurements, such as eye and head movements, captured by a camera already available on the device. 
The eye and head movements of children during the game can provide valuable information regarding engagement and attention. Deep learning methods will be utilized to identify these movements from the camera images and to compute scores for attention, engagement, working memory, and inhibition by combining the eye and head movements with game performance.</p><p>The rest of the paper is organized as follows: Section 2 presents an overview of the G-ATEF system, Section 3 discusses the proposed game, and Section 4 concludes and provides future directions.</p><p>The G-ATEF consists of a web-based Graphical User Interface (GUI) that is compatible with most smartphones, tablets, and laptops. At the beginning of the assessment, the parents are required to give their consent and to provide an email address so that they can receive the assessment scores and additional information about EFs at the end of the test. First, the GUI instructs the child to look at specific locations on the screen, as calibration is required for accurate eye tracking; consent from both parent and child is requested and required for this functionality as well. Subsequently, the child starts playing the proposed game described in Section 3. During the game, image frames from the device's camera are used to detect and track the child's head and eyes. The head is detected and tracked by the framework developed in <ref type="bibr" target="#b8">[9]</ref>, which detects the face, estimates the position and orientation of the head, and tracks the head's pose across subsequent image frames. A recurrent Convolutional Neural Network (CNN) developed by <ref type="bibr" target="#b9">[10]</ref> is used to estimate the eye gaze. In parallel, game performance is analyzed to measure game metrics such as correctness and response time. The game metrics and the head and eye movements are synchronized, and a deep learning framework is used to fuse the data. The outputs of the framework are scores for attention, working memory, engagement, and processing speed, which are important EFs. Attention is scored from the correct answers in the game combined with the eye gaze and head motion data. Working memory is scored from the correct answers according to the rules of the game, and engagement is computed from the eye gaze and head motion data. Processing speed is computed from the response time in the game combined with the eye gaze. The calculated scores are then grouped into three classes, "low EF", "medium EF", and "high EF", and are sent to the parent along with additional resources on EFs and contact information for experts. </p></div>
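The fusion and banding step above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' implementation: the per-round feature names (gaze_on_target, head_stability), the score formulas, and the banding cutoffs are hypothetical stand-ins for the learned deep-learning fusion framework.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Round:
    correct: bool          # was the answer (direction or dolphin) right?
    response_time: float   # seconds from stimulus onset to response
    gaze_on_target: float  # fraction of frames with gaze on the focus area
    head_stability: float  # 0..1; higher means less extraneous head motion

def score_session(rounds):
    """Fuse per-round game metrics with gaze/head features, then band each score."""
    accuracy = mean(1.0 if r.correct else 0.0 for r in rounds)
    attention = accuracy * mean(r.gaze_on_target for r in rounds)
    engagement = mean((r.gaze_on_target + r.head_stability) / 2 for r in rounds)
    working_memory = accuracy  # rule-following correctness as a proxy
    # Faster correct responses map to a higher processing-speed score.
    speed = mean(1.0 / (1.0 + r.response_time) for r in rounds if r.correct)

    def band(x):  # illustrative cutoffs, not validated norms
        return "high EF" if x >= 0.66 else "medium EF" if x >= 0.33 else "low EF"

    return {"attention": band(attention), "engagement": band(engagement),
            "working memory": band(working_memory), "processing speed": band(speed)}
```

In the proposed system these hand-written formulas would be replaced by the trained fusion model; the sketch only shows how synchronized game and camera-derived features reduce to the three reported classes.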
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2">Overview of the Proposed System</head></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3">Proposed Game</head><p>The NIH Toolbox proposes a Flanker Inhibitory Control and Attention Test (Flanker Task) to measure EFs. In the Flanker Task, subjects are required to indicate the left or right orientation of a centrally presented arrow that is surrounded by two arrows on either side (i.e. the flankers) <ref type="bibr" target="#b4">[5]</ref>.</p><p>In this paper, we present a variation of the Flanker Task that strives to be more engaging for children, in order to collect more accurate EF measurements. In the proposed game-based assessment task, various sharks are arranged across the screen facing left or right. The child is directed to focus on only one of them. The goal is to quickly identify its direction while ignoring the distractor sharks.</p><p>The task has different variations, or levels, in which the rules slightly change. The first level is the closest to the traditional Flanker Task: the child is instructed to focus only on the center shark, and sequences of five sharks arranged horizontally or vertically, or nine sharks arranged in a grid, are tested. An example of the horizontal arrangement of the sharks is shown in Figure <ref type="figure" target="#fig_2">2</ref>. Level two uses a grid of nine sharks, but rather than the child focusing on the middle shark, a spotlight briefly identifies the focus shark before the sharks appear. The spotlight location changes every round. Figure <ref type="figure" target="#fig_3">3</ref> shows an example of the second level of the proposed game. Level three uses a grid layout of various-sized sharks; the spotlight is used at this level as well. Figure <ref type="figure" target="#fig_4">4</ref> illustrates an example of level three. There is an additional long-term goal for the child to keep in mind: if at any time during the task the child spots a dolphin anywhere on the screen, they are to press the dolphin button rather than the direction of the focus shark. The child is told about the dolphin at the beginning of level one and is not reminded of it for the remainder of the task. Figure <ref type="figure" target="#fig_5">5</ref> shows an example of the dolphin.</p><p>In addition to the correctness and timing of each round, head and eye movement data are collected to uncover trends in the child's attention and engagement. By analyzing the eye gaze data, we hope to infer how the child approaches the task, why the child incorrectly identifies a shark's direction, and how long the child continues to look for the dolphin as the rounds progress.</p></div>
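The round logic above can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' game code: the function names, the dolphin probability, and the trial layout are hypothetical, and level one is shown only in its horizontal five-shark form.

```python
import random

DIRECTIONS = ("left", "right")

def make_trial(level, dolphin_prob=0.1):
    """Build one round: a focus shark among distractors, plus an
    occasional dolphin that overrides the direction rule."""
    n = 5 if level == 1 else 9                  # row of five, or a 3x3 grid
    # Level 1 fixes the focus on the center; later levels spotlight a random shark.
    focus = n // 2 if level == 1 else random.randrange(n)
    directions = [random.choice(DIRECTIONS) for _ in range(n)]
    dolphin = random.random() >= 1.0 - dolphin_prob  # True with probability dolphin_prob
    return {"level": level, "directions": directions,
            "focus_index": focus, "dolphin": dolphin}

def check_response(trial, response):
    """The dolphin button is the correct answer whenever a dolphin is shown;
    otherwise the focus shark's direction is the correct answer."""
    if trial["dolphin"]:
        return response == "dolphin"
    return response == trial["directions"][trial["focus_index"]]
```

Per-round correctness from check_response, together with response times and the synchronized gaze/head streams, is what feeds the scoring stage described in Section 2.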
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4">Conclusion and Future Directions</head><p>In this position paper, we have proposed a game-based assessment system for EFs in children. A web-based GUI has been developed to enable a child to play the game, while the head and eye movements are detected and tracked through a camera and advanced machine learning techniques. We have designed a novel game based on the Flanker Task, which can measure engagement, attention, working memory, and processing speed.</p><p>The proposed system has the potential to be used as a home assessment tool that provides parents with initial indications of whether to seek further professional assistance. The next step of our research is to conduct a real-world study with children in elementary school (age range between 6 and 14 years old) to evaluate the proposed system and its machine/deep learning algorithms.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Figure 1</head><label>1</label><figDesc>Figure 1 illustrates an overview of the proposed G-ATEF system.</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Figure 1 :</head><label>1</label><figDesc>Figure 1: Overview of the Proposed Game-based Assessment Test of Executive Functions (G-ATEF) System.</figDesc><graphic coords="2,319.20,385.76,237.59,235.65" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Figure 2 :</head><label>2</label><figDesc>Figure 2: An example of the first level of the proposed flanker task -Horizontal arrangement of the sharks.</figDesc><graphic coords="3,63.00,346.59,220.00,115.00" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_3"><head>Figure 3 :</head><label>3</label><figDesc>Figure 3: An example of the second level of the proposed flanker task -A spotlight will identify the focus-shark briefly before the sharks appear (left image). The sharks appear and the child has to identify the direction of the focus-shark (right image).</figDesc><graphic coords="3,318.00,82.20,243.00,56.88" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_4"><head>Figure 4 :</head><label>4</label><figDesc>Figure 4: An example of the third level of the proposed flanker task -Grid layout of various-sized sharks.</figDesc><graphic coords="3,336.98,221.55,202.05,130.15" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_5"><head>Figure 5 :</head><label>5</label><figDesc>Figure 5: An example of the long-term goal to spot a dolphin.</figDesc><graphic coords="3,337.00,410.25,202.50,120.50" type="bitmap" /></figure>
		</body>
		<back>

			<div type="acknowledgement">
<div xmlns="http://www.tei-c.org/ns/1.0"><head>ACKNOWLEDGMENT</head><p>This paper is based upon work supported by the National Science Foundation under Grant No 1565328. Any opinions, findings, and conclusions or recommendations expressed in this paper are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.</p></div>
			</div>

			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<monogr>
		<title level="m">Executive Function</title>
				<imprint>
			<publisher>InBrief</publisher>
			<date type="published" when="2012">2012</date>
		</imprint>
		<respStmt>
			<orgName>Center on the Developing Child at Harvard University</orgName>
		</respStmt>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">Neurodevelopmental Disorders and Adaptive Functions: A Study of Children with Autism Spectrum Disorders (ASD) and/or Attention Deficit and Hyperactivity Disorder (ADHD)</title>
		<author>
			<persName><forename type="first">V</forename><surname>Scandurra</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Emberti Gialloreti</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Barbanera</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Scordo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Pierini</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Canitano</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Frontiers in psychiatry</title>
		<imprint>
			<biblScope unit="volume">10</biblScope>
			<biblScope unit="page">673</biblScope>
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Prevalence and Trends of Developmental Disabilities among Children in the United States: 2009-2017</title>
		<author>
			<persName><forename type="first">B</forename><surname>Zablotsky</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><forename type="middle">I</forename><surname>Black</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">J</forename><surname>Maenner</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><forename type="middle">A</forename><surname>Schieve</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">L</forename><surname>Danielson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><forename type="middle">H</forename><surname>Bitsko</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">J</forename><surname>Blumberg</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">D</forename><surname>Kogan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><forename type="middle">A</forename><surname>Boyle</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Pediatrics</title>
		<imprint>
			<biblScope unit="volume">144</biblScope>
			<biblScope unit="issue">4</biblScope>
			<biblScope unit="page">e20190811</biblScope>
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Self-regulation in early childhood: Improving conceptual clarity and developing ecologically valid measures</title>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">M</forename><surname>Mcclelland</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><forename type="middle">E</forename><surname>Cameron</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Child development perspectives</title>
		<imprint>
			<biblScope unit="volume">6</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="136" to="142" />
			<date type="published" when="2012">2012</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">II. NIH Toolbox Cognition Battery (CB): Measuring executive function and attention</title>
		<author>
			<persName><forename type="first">P</forename><forename type="middle">D</forename><surname>Zelazo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">E</forename><surname>Anderson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Richler</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Wallner-Allen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">L</forename><surname>Beaumont</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Weintraub</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Monographs of the Society for Research in Child Development</title>
		<imprint>
			<biblScope unit="volume">78</biblScope>
			<biblScope unit="issue">4</biblScope>
			<biblScope unit="page" from="16" to="33" />
			<date type="published" when="2013">2013</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">An automated assessment system for embodied cognition in children: from motion data to executive functioning</title>
		<author>
			<persName><forename type="first">A</forename><surname>Dillhoff</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Tsiakas</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">R</forename><surname>Babu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Zakizadehghariehali</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>Buchanan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Bell</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Athitsos</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Makedon</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 6th international Workshop on Sensor-based Activity Recognition and Interaction</title>
				<meeting>the 6th international Workshop on Sensor-based Activity Recognition and Interaction</meeting>
		<imprint>
			<date type="published" when="2019-09">September 2019</date>
			<biblScope unit="page" from="1" to="6" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">A Multi-modal System to Assess Cognition in Children from their Physical Movements</title>
		<author>
			<persName><forename type="first">A</forename><surname>Ramesh Babu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">Z</forename><surname>Zadeh</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Jaiswal</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Lueckenhoff</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Kyrarini</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Makedon</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 2020 International Conference on Multimodal Interaction</title>
				<meeting>the 2020 International Conference on Multimodal Interaction</meeting>
		<imprint>
			<publisher>ACM</publisher>
			<date type="published" when="2020">2020</date>
			<biblScope unit="page" from="6" to="14" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">An Intelligent Action Recognition System to assess Cognitive Behavior for Executive Function Disorder</title>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">R</forename><surname>Babu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Zakizadeh</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">R</forename><surname>Brady</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Calderon</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Makedon</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">IEEE 15th International Conference on Automation Science and Engineering (CASE)</title>
				<imprint>
			<publisher>IEEE</publisher>
			<date type="published" when="2019-08">August 2019</date>
			<biblScope unit="page" from="164" to="169" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">Autonomous multi-sensory robotic assistant for a drinking task</title>
		<author>
			<persName><forename type="first">F</forename><forename type="middle">F</forename><surname>Goldau</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><forename type="middle">K</forename><surname>Shastha</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Kyrarini</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Gräser</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">IEEE 16th International Conference on Rehabilitation Robotics (ICORR)</title>
				<imprint>
			<publisher>IEEE</publisher>
			<date type="published" when="2019-06">June 2019</date>
			<biblScope unit="page" from="210" to="216" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b9">
	<monogr>
		<title level="m" type="main">Recurrent CNN for 3d gaze estimation using appearance and shape cues</title>
		<author>
			<persName><forename type="first">C</forename><surname>Palmero</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Selva</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">A</forename><surname>Bagheri</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Escalera</surname></persName>
		</author>
		<idno type="arXiv">arXiv:1805.03064</idno>
		<imprint>
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
	<note type="report_type">arXiv preprint</note>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
