<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Analysis of the Emotions Experienced by Learning Greedy Algorithms with Augmented Reality</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Maximiliano Paredes-Velasco</string-name>
          <email>maximiliano.paredes@urjc.es</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Mónica Gómez Rios</string-name>
          <email>mgomezr@ups.edu.ec</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Angel Velázquez-Iturbide</string-name>
          <email>angel.velazquez@urjc.es</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Universidad Politécnica Salesiana</institution>
          ,
          <addr-line>Guayaquil</addr-line>
          ,
          <country country="EC">Ecuador</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Universidad Rey Juan Carlos</institution>
          ,
          <addr-line>Móstoles</addr-line>
          ,
          <country country="ES">Spain</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Universidad Rey Juan Carlos</institution>
          ,
          <addr-line>Móstoles</addr-line>
          ,
          <country country="ES">Spain</country>
        </aff>
      </contrib-group>
      <abstract>
<p>Students have difficulties in understanding algorithm subjects, in particular how the source code of an algorithm proceeds to solve a problem. This article presents an augmented reality tool intended to assist in learning greedy algorithms. Students point their smartphone's camera at the source code of Dijkstra's algorithm as printed on paper, and the tool shows how the algorithm works. An experience was conducted in the classroom to assess students' emotions and level of knowledge. The results show that the positive emotions experienced by the students were almost twice as intense as the negative ones. Despite the complexity of the task (i.e. understanding Dijkstra's algorithm), the students' level of enjoyment remained constant during the experience. However, the anxiety experienced at the end was 28.5% higher than at the beginning.</p>
      </abstract>
      <kwd-group>
        <kwd>Emotions</kwd>
        <kwd>Learning</kwd>
        <kwd>Augmented reality</kwd>
        <kwd>Greedy algorithms</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>I. INTRODUCTION</title>
<p>In the algorithm design subjects of university computer science degrees, the development of algorithms that solve optimization problems is usually studied. In this educational context, greedy algorithms are one of the most complex topics for students, who are often unable to translate into source code the elements of their development (i.e., the set of candidates, the selection function, the feasibility function and the objective function) [1]. This difficulty may affect the emotional state of students and cause them to become discouraged during the learning process. Motivation and emotions play a fundamental role in learning, since they influence memory and logical reasoning and help improve attention [2]. Today, neuroscience research helps us understand how the brain works and how important emotions are for improving learning [3,4]. Moreover, if a student is not predisposed to learn, or experiences strong negative emotions, it is unlikely that he/she will achieve his/her full potential. Furthermore, learning is characterized as a cognitive and motivational process [5], where emotions may affect both the intrinsic and extrinsic motivation of the student [6,7].</p>
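      <p>As an illustration of the greedy elements just listed (set of candidates, selection function, feasibility function, objective function), consider the sketch below. The coin-change problem and all names in the code are illustrative assumptions, not taken from the paper; they only show how the four elements map onto source code, which is the translation step students find difficult.</p>
      <preformat><![CDATA[
```java
import java.util.*;

// Illustrative greedy schema (hypothetical example, not from the paper):
// pay an amount with as few coins as possible.
public class GreedySchema {
    static List<Integer> greedyCoins(int amount, int[] denominations) {
        int[] coins = denominations.clone();   // candidate set
        Arrays.sort(coins);                    // ascending; traversed largest-first
        List<Integer> solution = new ArrayList<>();
        int remaining = amount;                // objective: drive remaining to 0
        for (int i = coins.length - 1; i >= 0; i--) { // selection: largest first
            while (coins[i] <= remaining) {    // feasibility: coin still fits
                solution.add(coins[i]);        // extend the partial solution
                remaining -= coins[i];
            }
        }
        // A solution is reached only if the remaining amount is exactly 0.
        return remaining == 0 ? solution : Collections.emptyList();
    }

    public static void main(String[] args) {
        System.out.println(greedyCoins(63, new int[]{25, 10, 5, 1}));
        // prints [25, 25, 10, 1, 1, 1]
    }
}
```
]]></preformat>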
      <p>Emotions are not the only factor that can influence learning outcomes. The use of new technologies also plays a relevant role in learning [8]. Specifically, augmented reality (AR) technology may improve learning performance [9] and constitutes a technology option with great potential and effectiveness to activate positive emotions in students [<xref ref-type="bibr" rid="ref10">10</xref>]. Augmented reality not only provides immersive experiences of visibility or observation [<xref ref-type="bibr" rid="ref11 ref12 ref13">11-13</xref>], but it may also contribute to students&#8217; feeling of greater satisfaction [<xref ref-type="bibr" rid="ref14">14</xref>], improved usability [<xref ref-type="bibr" rid="ref15">15</xref>] and reduced cognitive load [<xref ref-type="bibr" rid="ref16">16</xref>] in the use of technological tools during the learning process. Augmented reality technologies have been applied to programming learning at different educational levels, showing satisfactory results: from early ages [<xref ref-type="bibr" rid="ref17 ref18">17,18</xref>], through high school and university students [<xref ref-type="bibr" rid="ref19">19</xref>], to professional adults [<xref ref-type="bibr" rid="ref20">20</xref>], mainly through teaching methodologies based on gamification [<xref ref-type="bibr" rid="ref21 ref22">21,22</xref>] and collaborative learning [<xref ref-type="bibr" rid="ref23">23</xref>]. The objective of this work is to conduct an exploratory study of the advancement of knowledge and the emotions that students experience while they study greedy algorithms using augmented reality technologies. An experience was conducted in a classroom where students used an augmented reality tool on their smartphones, called RA-AVD (Realidad Aumentada &#8211; Algoritmo Voraz de Dijkstra, in English Augmented Reality &#8211; Dijkstra&#8217;s Greedy Algorithm), along with paper notes provided by the teacher.</p>
    </sec>
    <sec id="sec-2">
      <title>II. METHODOLOGY</title>
      <sec id="sec-2-1">
        <title>A. Educational objective and context</title>
        <p>The objective of the experience is to assess the level of knowledge and the positive and negative emotions that students experience by using the RA-AVD tool to solve an optimization problem. The experience was conducted in the context of the Computer Science Degree at the Salesian Polytechnic University of Ecuador, specifically in the Data Structures course. At some point in this course, students have to learn and develop greedy algorithms. In particular, they must understand Dijkstra&#8217;s algorithm [<xref ref-type="bibr" rid="ref24">24</xref>], which solves a classic graph problem: determining the minimum length path from a source node to the rest of the nodes in the graph. Students participated voluntarily and had no previous contact with or knowledge of greedy algorithms, although they knew how to program and had basic knowledge of the Java programming language.</p>
      </sec>
      <sec id="sec-2-2">
        <title>B. Variables and measurement instruments</title>
        <p>The variables measured were the emotions experienced by the students and the level of knowledge they acquired after the experience. The level of knowledge was measured by a knowledge test on the behavior of Dijkstra&#8217;s algorithm. The test consisted of 5 multiple-choice questions, each scored with a maximum of 2 points.</p>
        <p>
          The instrument used to measure emotions was AEQ
(Achievement Emotions Questionnaire), which is a consistent
and validated scale in the educational context [
          <xref ref-type="bibr" rid="ref25">25</xref>
          ]. Taking
into account the principles of neuroeducation [
          <xref ref-type="bibr" rid="ref26">26</xref>
          ], the
emotional variables to be measured can be classified into two
types: 1) activation emotions, which are the emotions that
produce a higher degree of agitation (fear, anxiety, anger, etc.)
and 2) deactivation emotions, which produce lower agitation
(depression, calm, boredom, etc.). In addition, these emotions
can be classified by the positive (pleasant sensation) or
negative impact (unpleasant or uncomfortable sensation) that
they produce on participants. Overall, up to four classes of
emotions could be identified.
        </p>
        <p>The AEQ scale measures these emotions by offering a series of statements about the participant&#8217;s emotional state; students must assess the degree to which each statement describes their emotions and feelings. Each statement is rated on a Likert scale ranging from very little (value 1) to extremely (value 5). Table I details the emotion variables measured with AEQ (the number in parentheses corresponds to the total number of items to be assessed for that emotion). Note that the test only addresses three of the four classes of emotions and that it consists of 75 items which measure 8 emotions.</p>
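        <p>As an illustration (this is not the authors' analysis code, and the method name is hypothetical), the per-emotion means reported in the results can be obtained by averaging the 1-5 Likert ratings of the items belonging to each emotion:</p>
        <preformat><![CDATA[
```java
// Illustrative sketch: averaging 1-5 Likert ratings of the items that
// measure one emotion, as done to obtain the per-emotion means.
public class LikertMeans {
    static double mean(int[] ratings) {
        int sum = 0;
        for (int r : ratings) {
            // AEQ items are rated from 1 (very little) to 5 (extremely).
            if (r < 1 || r > 5) throw new IllegalArgumentException("Likert 1-5");
            sum += r;
        }
        return (double) sum / ratings.length;
    }

    public static void main(String[] args) {
        // Hypothetical ratings for one emotion's items from one student.
        System.out.println(mean(new int[]{4, 3, 5, 4})); // prints 4.0
    }
}
```
]]></preformat>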
        <p>(Table I is not reproduced here; as an example of its entries, Embarrassment comprises 11 items.)</p>
        <p>The AEQ scale is organized so that, at the end of the session, students assess their emotional state at three different times:</p>
        <list list-type="order">
          <list-item>
            <p>Before starting the learning task: the student assesses how he/she feels right at the beginning of the experience.</p>
          </list-item>
          <list-item>
            <p>During the task: the student assesses how he/she feels while doing the learning task.</p>
          </list-item>
          <list-item>
            <p>After the task: the student rates how he/she feels after completing the experience.</p>
          </list-item>
        </list>
        <p>Finally, the opinion of participants about the experience was collected by asking some of them their opinion in a short interview.</p>
      </sec>
      <sec id="sec-2-6">
        <title>C. Phases</title>
        <p>In the first phase, students attended a laboratory session and received printed sheets containing the markers used to present the information with augmented reality. These sheets contained the Java source code for Dijkstra's algorithm.</p>
        <p>Subsequently, students downloaded the application to their mobile devices and a brief explanation (lasting 10 minutes) was given about the task to be carried out, which consisted of reading and interpreting the source code that appeared on the sheets provided.</p>
        <p>While reading the Java code on paper, students could point the mobile at parts of the code and receive assistance from the application. This phase lasted 30 minutes.</p>
        <p>Once this phase was finished, the students carried out the evaluation of acquired knowledge and assessed the emotions experienced before, during and after the learning task, using 15 minutes for this. The whole experience was organized in a single session.</p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>III. APP DESIGN</title>
      <p>RA-AVD is a tool created for learning Dijkstra&#8217;s algorithm. Its use is based on the following idea. The student uses the teacher's notes, which show the Java code that implements Dijkstra&#8217;s algorithm (for the tool to recognize it, it has to be a specific source code). When the student has doubts about how the code solves the problem, he/she uses the tool, pointing his/her mobile at the source code. At that point, the tool shows the source code that the student is viewing on paper, augmenting it with comments and explanations that help to understand how it works. If the student points at the source code where the graph is declared, the RA-AVD tool draws the graph on the mobile&#8217;s screen (see Figure 1).</p>
      <p>The tool has a user interface that allows the student to control the execution of the algorithm step by step, obtaining a display of its execution trace, as in a debugger (see Figure 2). This trace screen shows the graph declared in the source code at the top, and the trace table along with the step-by-step execution controls below.</p>
      <p>Figure 2 shows the trace in an intermediate state of calculating the minimum paths from node 0 to the other nodes. Notice that some nodes present a bracketed label of the form [A,B]N, where A is the length of the path from the source node to the node, B is the predecessor node on that path, and N is the resolution stage number. The algorithm solves the problem in stages, in such a way that at each stage it analyzes possible new paths between the source node and each remaining node, recording a new path if it is shorter than the current one. Therefore, these labels are dynamically generated as tracing progresses, and they are replaced every time the algorithm finds a shorter path. In this case, the path expressed by the old label is discarded (a horizontal line crosses it out) and a new label is shown by the node, representing the new path.</p>
      <p>An example may assist in better understanding this notation. In Figure 2 we can see that node 3 has the label [8,1]4 with a crossing line, which means:</p>
      <list list-type="bullet">
        <list-item>
          <p>Number 8: the length of the path from the source node to node 3 (the labeled node).</p>
        </list-item>
        <list-item>
          <p>Number 1: the predecessor of node 3 on the path from the source node.</p>
        </list-item>
        <list-item>
          <p>The subscript 4: the step or stage in the solution construction process.</p>
        </list-item>
        <list-item>
          <p>Strikethrough label (crossing line): the path denoted by that label is discarded. Node 3 has two labels since, at that time, the algorithm had found two alternative paths (namely, [3,2]2 and [8,1]4). The path with the longer distance from the source to node 3 was discarded; in this case, the label [8,1]4, with distance 8, since the other path, with length 3, was shorter.</p>
        </list-item>
      </list>
      <p>Fig. 2. Running the algorithm step-by-step in RA-AVD</p>
      <p>The trace table has the following columns (see Figure 2):</p>
      <list list-type="bullet">
        <list-item>
          <p>Step: number of the stage of construction of the solution.</p>
        </list-item>
        <list-item>
          <p>Fixed node: the candidate node selected by the selection function at each stage (of all the candidate nodes, the one with the shortest path length from the source node is selected).</p>
        </list-item>
        <list-item>
          <p>Adjacent nodes: the nodes adjacent to the selected node.</p>
        </list-item>
        <list-item>
          <p>Fixed-origin length: for each adjacent node, its distance from the source node.</p>
        </list-item>
        <list-item>
          <p>Pending lengths: the lengths of the paths from the source node that have not been selected so far. It is made up of values from the "Fixed-origin length" column that have not been selected.</p>
        </list-item>
        <list-item>
          <p>Minimum length: the shortest of the lengths of the adjacent nodes of the fixed node and the pending lengths.</p>
        </list-item>
      </list>
      <p>Let's see an example to better understand the meaning of these columns (note that the source node is 0). Step 0 indicates the initial state. In step 1, the algorithm marks the origin node 0 as the fixed node (in the first step the origin node itself is chosen as the fixed node) and determines its adjacent nodes, which are nodes 1, 2 and 4, writing them down in the "Adjacent nodes" column. Next, the algorithm determines the distances from these nodes to the origin, whose lengths are 4, 2 and 8 respectively, and writes them down in the "Fixed-origin length" column. Subsequently, the algorithm selects the smallest of these paths (4, 2 and 8), in this case 2, and its corresponding node (node 2) is marked as the fixed node for the next stage (step 2). That is the end of the stage; a new stage then starts, and so on until the minimal path to all nodes has been found.</p>
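      <p>The staged construction just described can be sketched with a standard Dijkstra implementation. The adjacency matrix below is an assumption reconstructed from the lengths quoted in the text (fixed-origin lengths 4, 2 and 8 for nodes 1, 2 and 4, plus edges 2-3 of length 1 and 1-3 of length 4, consistent with the labels [3,2]2 and [8,1]4); it is not necessarily the exact graph shown by the tool.</p>
      <preformat><![CDATA[
```java
import java.util.*;

// Sketch of the staged construction that RA-AVD traces. The graph data is
// an ASSUMPTION inferred from the lengths and labels discussed in the text.
public class DijkstraTrace {
    // w[u][v] > 0 is the edge length between u and v; 0 means no edge.
    static int[] dijkstra(int[][] w, int source) {
        int n = w.length;
        int[] dist = new int[n];
        boolean[] fixed = new boolean[n];
        Arrays.fill(dist, Integer.MAX_VALUE);
        dist[source] = 0;
        for (int stage = 0; stage < n; stage++) {
            // Selection function: among pending candidates, fix the node
            // with the shortest known path from the source ("fixed node").
            int u = -1;
            for (int v = 0; v < n; v++)
                if (!fixed[v] && (u == -1 || dist[v] < dist[u])) u = v;
            if (dist[u] == Integer.MAX_VALUE) break; // remainder unreachable
            fixed[u] = true;
            // Relax the adjacent nodes: record a new label only if the new
            // path is shorter (a longer tentative label is "crossed out").
            for (int v = 0; v < n; v++)
                if (w[u][v] > 0 && !fixed[v] && dist[u] + w[u][v] < dist[v])
                    dist[v] = dist[u] + w[u][v];
        }
        return dist;
    }

    public static void main(String[] args) {
        int[][] w = {
            {0, 4, 2, 0, 8},
            {4, 0, 0, 4, 0},
            {2, 0, 0, 1, 0},
            {0, 4, 1, 0, 0},
            {8, 0, 0, 0, 0},
        };
        System.out.println(Arrays.toString(dijkstra(w, 0)));
        // prints [0, 4, 2, 3, 8]
    }
}
```
]]></preformat>
      <p>On this assumed graph, node 2 is fixed right after the origin and node 3 ends with distance 3, matching the label [3,2]2 and the discarded label [8,1]4 discussed above.</p>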
      <p>At the top of the trace table, the "Optimal route" and "Length" fields show, at each stage, the optimal path and its length for the selected fixed node. Initially, all nodes are painted in one color; as they are processed in the construction stages, their color changes to show which part of the graph is being processed.</p>
    </sec>
    <sec id="sec-4">
      <title>IV. RESULTS</title>
      <p>This section presents the statistical analysis of the data obtained during the experience, in which 18 students aged 19 to 22 participated. Eleven students (61.1%) were men and 7 (38.9%) were women. The analysis was performed with the IBM SPSS tool and the Pandas and Matplotlib libraries in Python.</p>
      <sec id="sec-4-1">
        <title>A. Acquired knowledge</title>
        <p>Table II shows the level of knowledge acquired by the students after the experience. Students had neither previous contact with nor knowledge of greedy algorithms at the beginning of the experience. However, at the end of the experience they obtained an average score close to 7 out of 10 (specifically 6.95). Table II displays the mean scores per question (maximum 2 points per question) and the total mean. It should be noted that half of the students in the group (50.5%) obtained a score higher than 8, the maximum score being 10 and the lowest 3. Furthermore, we can see that in three of the five questions students scored more than 1.7 out of 2 points. Therefore, the level of knowledge was quite satisfactory.</p>
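        <p>As a quick, illustrative arithmetic check (the method name below is hypothetical), the reported total mean of 6.95 equals the sum of the per-question means listed in Table II, since each question contributes up to 2 of the 10 points:</p>
        <preformat><![CDATA[
```java
// Illustrative check: the total mean score equals the sum of the
// per-question means (each question is scored from 0 to 2 points).
public class ScoreCheck {
    static double total(double[] perQuestionMeans) {
        double sum = 0.0;
        for (double m : perQuestionMeans) sum += m;
        // Round to two decimals to avoid floating-point noise.
        return Math.round(sum * 100.0) / 100.0;
    }

    public static void main(String[] args) {
        // Per-question means from Table II.
        System.out.println(total(new double[]{1.72, 1.78, 0.56, 1.89, 1.00}));
        // prints 6.95
    }
}
```
]]></preformat>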
        <table-wrap id="tab2">
          <label>TABLE II</label>
          <caption>
            <p>Knowledge evaluation questionnaire and scores obtained by question</p>
          </caption>
          <table>
            <thead>
              <tr>
                <th>Statement of the task or question</th>
                <th>Mean score</th>
              </tr>
            </thead>
            <tbody>
              <tr>
                <td>Select the distance of two adjacent nodes when they are equal</td>
                <td>1.72</td>
              </tr>
              <tr>
                <td>Indicate the sequence of edges that Dijkstra would calculate from the origin to node X</td>
                <td>1.78</td>
              </tr>
              <tr>
                <td>Find the shortest path from vertex X to vertex Y</td>
                <td>0.56</td>
              </tr>
              <tr>
                <td>Identify which is the predecessor node of node X in the graph</td>
                <td>1.89</td>
              </tr>
              <tr>
                <td>Starting from the origin node, what would be the predecessor node not selected for node X in step N</td>
                <td>1.00</td>
              </tr>
              <tr>
                <td>Total</td>
                <td>6.95</td>
              </tr>
            </tbody>
          </table>
        </table-wrap>
      </sec>
      <sec id="sec-4-1b">
        <title>B. Positive activation emotions</title>
        <p>Figure 3 shows the mean of the positive emotions (enjoyment, hope and pride) as rated by the students at three different times: before, during and after the experience of using the tool. Not all emotions were measured at all times. For example, measuring pride about accomplishment of the task does not make sense before having done it, nor is there any point in measuring the students&#8217; level of hope about what they will learn once they have learnt it. Therefore, not all the positive variables appear at all times in Figure 3.</p>
        <p>We can see that the average enjoyment remained roughly the same throughout the experience (from 3.72 to 3.70). We also found that hope decreased from 4.13 to 3.96 (i.e. 4.12%) while students did the task, and that the pride felt once students finished the task decreased from 3.86 down to 3.67 (4.92%).</p>
        <p>The fact that enjoyment remained constant can be interpreted as a favorable symptom: the learning task entailed concepts that were new and difficult to understand for students, and despite this they did not stop enjoying themselves during learning. The authors wonder whether the use of AR may have been the reason that the levels of enjoyment remained constant. In relation to this feeling of enjoyment, it should be noted that item 110 of the AEQ questionnaire (&#8220;I study more than necessary because I enjoy it a lot&#8221;) obtained the lowest score, with an average of 2.89, which indicates that students are not interested in studying more than strictly necessary. Note that we numbered the items in the same way as the AEQ questionnaire. Rather, it seems that they were interested in acquiring new knowledge that seemed significant to them, judging by the evaluation of item 139 (&#8220;I enjoy acquiring new knowledge&#8221;), which obtained the highest score (4.33 out of 5).</p>
        <p>Regarding the decrease in pride, it should be noted that the highest-scoring item of the AEQ questionnaire for measuring this emotion is item 135 (&#8220;When I excel in my work I feel proud&#8221;), which had an average of 4.00 out of 5. Thus, students value the work they do and feel proud of it. It is possible that they did not regard the task they had to do (analyzing Dijkstra&#8217;s algorithm) as an interesting one, and this could have caused pride to decrease at the end of the experience.</p>
      </sec>
      <sec id="sec-4-2">
        <title>C. Negative activation emotions</title>
        <p>In relation to these emotions, we could observe that the average anger decreased during learning and increased by 7.65% when the task was finished (see Figure 4). On the other hand, the level of anxiety increased notably from the beginning of the experience, being 15.83% higher during the experience and 28.5% higher at the end of it (Figure 4).</p>
        <p>It was also observed that students felt worried and anguished about having to deal with too many study materials and about having little time, since the items of the AEQ questionnaire related to this aspect of anxiety received the highest ratings (items 86, 96, 111 and 132 in Table III). This may partly explain why anxiety increased during the experience, reaching a high value at the end compared to the beginning (Figure 4). Note that, at the end, anxiety was the negative emotion experienced with the greatest intensity. This finding therefore constitutes an important aspect that should be taken into account in the design and construction of educational tools, in order to reduce this feeling for the student.</p>
        <p>Besides, we may observe in Figure 4 that the emotional
level of embarrassment decreased throughout the experience,
which probably means that the student started feeling more
confident as he/she moved forward in the learning activity.</p>
      </sec>
      <sec id="sec-4-3">
        <title>D. Negative deactivation emotions</title>
        <p>The results show that the hopelessness of students decreased while the experience with the AR tool was carried out. However, it increased again to its initial level when the task was finished (Figure 5). The authors cannot fully explain the reason for this effect, although they believe that it could be related to the fact that the use of the AR tool gave students hope of learning while they were using it.</p>
        <p>Regarding boredom, it was detected that it decreased while using the tool (Figure 5). This is an important aspect, since boredom leads to reduced intrinsic motivation and cognitive withdrawal from the task [<xref ref-type="bibr" rid="ref27">27</xref>]. The authors think that the use of augmented reality could have awakened students' attention and interest, and therefore could have reduced boredom, perhaps increasing their motivation.</p>
      </sec>
      <sec id="sec-4-4">
        <title>E. Comparison of positive and negative emotions</title>
        <p>Figure 6 shows the emotional levels grouped into positive activation (enjoyment, hope and pride), negative activation (anger, anxiety and embarrassment) and negative deactivation (hopelessness and boredom) emotions. It should be noted that the positive activation emotions experienced by the students as a whole were almost twice as intense as the negative ones. The authors wonder whether the use of AR in learning algorithms has something to do with positive emotions being experienced more intensely than negative ones, which raises an interesting line of future research. Regarding negative emotions (both activation and deactivation), it could be seen that they decreased slightly while students were using the tool in the task (Figure 6, &#8220;During&#8221; section of the diagram). However, at the end all these emotions increased.</p>
        <p>Fig. 6. General average of emotions according to time</p>
        <p>The decrease in these negative emotions during the
experience could be related to the use of the tool, while the
increase detected once the task was finished could be related
to the fact that they had to take a knowledge assessment test
at the end of the task. This could make them anxious, angry,
or hopeless.</p>
      </sec>
      <sec id="sec-4-5">
        <title>F. Students’ opinion</title>
        <p>It was observed that students preferred to work collaboratively during the task. They spontaneously came together in pairs and, in some cases, even in groups of three. Some students pointed out the lack of any collaborative interaction support in the tool. On the other hand, one observed problem was that some students did not have a smartphone with the minimal characteristics, so the camera focus was not optimal, forcing the student to bring the mobile closer to the paper several times until the tool detected the source code. The students commented that they were surprised to learn a new subject through the tool, even more so because some of them did not know augmented reality technology, which caused them great interest and curiosity since they had not commonly used it in the classroom. In general, they indicated that they were satisfied with the tool and that it had been useful in the experience.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>V. CONCLUSIONS AND FUTURE WORK</title>
      <p>This article has presented a classroom experience of learning greedy algorithms using an augmented reality tool called RA-AVD. In this experience, both the emotions and feelings of the students and the level of knowledge acquired were evaluated. The knowledge evaluation was satisfactory: the students obtained an average of almost 7 (6.95) out of 10, which means that most of them managed to understand the operation and behavior of the algorithm under study (Dijkstra&#8217;s algorithm). Note that the students had no previous knowledge of, nor contact with, greedy algorithms, thus the learning curve was steep.</p>
      <p>In relation to the emotions experienced when using the tool, the experience revealed that the students experienced positive emotions (enjoyment, hope and pride) much more intensely than negative ones (such as anxiety or anger), almost twice as intensely. In addition, boredom and embarrassment decreased notably during the use of the tool, while the feeling of enjoyment remained roughly constant. However, the students' level of anxiety increased from the beginning while using the tool, being 28.5% higher at the end of the experience than at the beginning. This last aspect is especially relevant and should be taken into account in the design and construction of future educational tools.</p>
      <p>As future projects, we plan to analyze in greater depth the
results obtained by carrying out a correlation analysis between
emotions and learning. Furthermore, we intend to replicate the
experience with a control group and compare traditional
learning with the use of augmented reality.</p>
    </sec>
    <sec id="sec-6">
      <title>ACKNOWLEDGEMENTS</title>
      <p>This work has been supported by research grants iProg (ref.
TIN2015-66731-C2-1-R) and e-Madrid-CM (ref.
P2018/TCS-4307) with FSE and FEDER funds. The support
of the GIIAR group of the Salesian Polytechnic University is
appreciated.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          <string-name>
            <surname>Velázquez-Iturbide</surname>
            ,
            <given-names>J. Á.</given-names>
          </string-name>
          <article-title>The design and coding of greedy algorithms revisited</article-title>
          .
          <source>In Proceedings of the 16th Annual Joint Conference on Innovation and Technology in Computer Science Education, ITiCSE'11</source>
          (pp.
          <fpage>8</fpage>
          -
          <lpage>12</lpage>
          ).
          <source>June</source>
          <year>2011</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          <string-name>
            <surname>McConnell</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <article-title>Emociones en educación: Cómo las emociones, cognición y motivación influyen en el aprendizaje y logro de los estudiantes</article-title>
          .
          <source>Revista Mexicana de Bachillerato a Distancia</source>
          ,
          <volume>11</volume>
          (
          <issue>21</issue>
          ),
          <year>2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name>
            <surname>Tapia</surname>
            ,
            <given-names>A. A. F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Anchatuña</surname>
            ,
            <given-names>A. L. A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cueva</surname>
            ,
            <given-names>M. C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Poma</surname>
            ,
            <given-names>R. M. M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jiménez</surname>
            ,
            <given-names>S. F. R.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Corrales</surname>
            ,
            <given-names>E. N. P.</given-names>
          </string-name>
          <article-title>Las neurociencias. Una visión de su aplicación en la educación</article-title>
          .
          <source>Open Journal Systems en Revista: Revista Órbita Pedagógica ISSN 2409-0131</source>
          , (
          <volume>61</volume>
          -
          <fpage>74</fpage>
          ),
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <string-name>
            <surname>Benavidez</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Flores</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          <article-title>La importancia de las emociones para la neurodidáctica</article-title>
          .
          <source>Wimblu</source>
          ,
          <volume>14</volume>
          (
          <issue>1</issue>
          ),
          <fpage>25</fpage>
          -
          <lpage>53</lpage>
          ,
          <year>2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          <article-title>Teaching Strategy of Programming Course Guided by Neuroeducation</article-title>
          .
          <source>In 2019 14th International Conference on Computer Science &amp; Education (ICCSE)</source>
          (pp.
          <fpage>406</fpage>
          -
          <lpage>409</lpage>
          ). IEEE,
          <year>August 2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          <string-name>
            <surname>Lacave</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Velázquez-Iturbide</surname>
            ,
            <given-names>J. Á.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Paredes-Velasco</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Molina</surname>
            ,
            <given-names>A. I.</given-names>
          </string-name>
          <article-title>Analyzing the influence of a visualization system on students' emotions: An empirical case study</article-title>
          .
          <source>Computers &amp; Education</source>
          ,
          <volume>149</volume>
          ,
          <fpage>103817</fpage>
          ,
          <year>2020</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          <source>Computers in Human Behavior</source>
          ,
          <volume>31</volume>
          ,
          <fpage>499</fpage>
          -
          <lpage>508</lpage>
          ,
          <year>2014</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          <string-name>
            <surname>Sánchez</surname>
            ,
            <given-names>M. D. R. G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Añorve</surname>
            ,
            <given-names>J. R.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Alarcón</surname>
            ,
            <given-names>G. G.</given-names>
          </string-name>
          <article-title>Las TIC en la educación superior, innovaciones y retos/The ICT in higher education, innovations and challenges</article-title>
          .
          <source>RICSH Revista Iberoamericana de las Ciencias Sociales y Humanísticas</source>
          ,
          <volume>6</volume>
          (
          <issue>12</issue>
          ),
          <fpage>299</fpage>
          -
          <lpage>316</lpage>
          ,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <string-name>
            <surname>Rios</surname>
            ,
            <given-names>M. D. G.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Paredes</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <article-title>Augmented reality as a methodology to development of learning in programming</article-title>
          .
          <source>In International Conference on Technology Trends</source>
          (pp.
          <fpage>327</fpage>
          -
          <lpage>340</lpage>
          ). Springer, Cham. August,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          <string-name>
            <surname>Fidan</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Tuncel</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <article-title>Integrating augmented reality into problem based learning: The effects on learning achievement and attitude in physics education</article-title>
          .
          <source>Computers &amp; Education</source>
          ,
          <volume>142</volume>
          ,
          <fpage>103635</fpage>
          ,
          <year>2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          <string-name>
            <surname>Kim</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Agarwal</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Marotta</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Leo</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Chau</surname>
            ,
            <given-names>D. H.</given-names>
          </string-name>
          <article-title>Mixed Reality for Learning Programming</article-title>
          .
          <source>In Proceedings of the 18th ACM International Conference on Interaction Design and Children</source>
          (pp.
          <fpage>574</fpage>
          -
          <lpage>579</lpage>
          ). June,
          <year>2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          <string-name>
            <surname>Magnenat</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ben-Ari</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Klinger</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Sumner</surname>
            ,
            <given-names>R. W.</given-names>
          </string-name>
          <article-title>Enhancing robot programming with visual feedback and augmented reality</article-title>
          .
          <source>In Proceedings of the 2015 ACM conference on innovation and technology in computer science education</source>
          (pp.
          <fpage>153</fpage>
          -
          <lpage>158</lpage>
          ). June,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          <string-name>
            <surname>Guenaga</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Menchaca</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>de Guinea</surname>
            ,
            <given-names>A. O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dziabenko</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>García-Zubía</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Salazar</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <article-title>Serious Games, Remote Laboratories and Augmented Reality to Develop and Assess Programming Skills</article-title>
          .
          <source>In International Simulation and Gaming Association Conference</source>
          (pp.
          <fpage>29</fpage>
          -
          <lpage>36</lpage>
          ). Springer, Cham. June,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          <string-name>
            <surname>Salazar Mesía</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sanz</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Gorga</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          <article-title>Augmented reality for programming teaching. Student satisfaction analysis</article-title>
          .
          <source>In 2016 International Conference on Collaboration Technologies and Systems (CTS)</source>
          . IEEE, (pp.
          <fpage>165</fpage>
          -
          <lpage>171</lpage>
          )
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          <string-name>
            <surname>Teng</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chen</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Chen</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          <article-title>Impact of augmented reality on programming language learning: Efficiency and perception</article-title>
          .
          <source>Journal of Educational Computing Research</source>
          ,
          <volume>56</volume>
          (
          <issue>2</issue>
          ),
          <fpage>254</fpage>
          -
          <lpage>271</lpage>
          ,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          <string-name>
            <surname>Melcer</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          <article-title>Moving to Learn: Exploring the Impact of Physical Embodiment in Educational Programming Games</article-title>
          .
          <source>In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems. ACM</source>
          , pp.
          <fpage>301</fpage>
          -
          <lpage>306</lpage>
          ,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          <string-name>
            <surname>Klopfenstein</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fedosyeyev</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Bogliolo</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <article-title>Bringing an unplugged coding card game to augmented reality</article-title>
          .
          <source>Proceedings of the 11th International Technology, Education and Development Conference (IATED)</source>
          . (pp.
          <fpage>9800</fpage>
          -
          <lpage>9805</lpage>
          ),
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          <string-name>
            <surname>Jin</surname>
            ,
            <given-names>Q.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wang</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Deng</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zheng</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Chiu</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <article-title>AR-Maze: a tangible programming tool for children based on AR technology</article-title>
          .
          <source>In Proceedings of the 17th ACM Conference on Interaction Design and Children</source>
          . ACM, pp.
          <fpage>611</fpage>
          -
          <lpage>616</lpage>
          ,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          <string-name>
            <surname>Tan</surname>
            ,
            <given-names>K. S. T.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Lee</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          <article-title>An Augmented Reality Learning System for Programming Concepts</article-title>
          .
          <source>In International Conference on Information Science and Applications</source>
          . Springer, pp.
          <fpage>179</fpage>
          -
          <lpage>187</lpage>
          ,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          <string-name>
            <surname>Stadler</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kain</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Giuliani</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mirnig</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Stollnberger</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Tscheligi</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <article-title>Augmented reality for industrial robot programmers: Workload analysis for task-based, augmented reality-supported robot control</article-title>
          .
          <source>In 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)</source>
          , pp.
          <fpage>179</fpage>
          -
          <lpage>184</lpage>
          .
          <year>August 2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          <string-name>
            <surname>Deng</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jin</surname>
            ,
            <given-names>Q.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wang</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Sun</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          <article-title>ARCat: A Tangible Programming Tool for DFS Algorithm Teaching</article-title>
          .
          <source>In Proceedings of the 18th ACM International Conference on Interaction Design and Children</source>
          . ACM, pp.
          <fpage>533</fpage>
          -
          <lpage>537</lpage>
          ,
          <year>2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          <string-name>
            <surname>Dass</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kim</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ford</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Agarwal</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Chau</surname>
            ,
            <given-names>D. H.</given-names>
          </string-name>
          <article-title>Augmenting coding: Augmented reality for learning programming</article-title>
          .
          <source>In Proceedings of the Sixth International Symposium of Chinese CHI</source>
          (pp.
          <fpage>156</fpage>
          -
          <lpage>159</lpage>
          ). April,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          <string-name>
            <surname>Toledo</surname>
            ,
            <given-names>J. A. J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Collazos</surname>
            ,
            <given-names>C. A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cantero</surname>
            ,
            <given-names>M. O.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Redondo</surname>
            ,
            <given-names>M. Á.</given-names>
          </string-name>
          <article-title>Collaborative strategy with augmented reality for the development of algorithmic thinking</article-title>
          .
          <source>In Iberoamerican Workshop on Human-Computer Interaction</source>
          (pp.
          <fpage>70</fpage>
          -
          <lpage>82</lpage>
          ). Springer, Cham. April,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          <string-name>
            <surname>McAllister</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          <source>Data structures and algorithms using Java</source>
          . Jones &amp; Bartlett Publishers,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          <string-name>
            <surname>Pekrun</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Goetz</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Frenzel</surname>
            ,
            <given-names>A. C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Barchfeld</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Perry</surname>
            ,
            <given-names>R. P.</given-names>
          </string-name>
          <article-title>Measuring emotions in students' learning and performance: The Achievement Emotions Questionnaire (AEQ)</article-title>
          .
          <source>Contemporary educational psychology</source>
          ,
          <volume>36</volume>
          (
          <issue>1</issue>
          ),
          <fpage>36</fpage>
          -
          <lpage>48</lpage>
          ,
          <year>2011</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          <string-name>
            <surname>Pekrun</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Goetz</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Titz</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Perry</surname>
            ,
            <given-names>R. P.</given-names>
          </string-name>
          <article-title>Academic emotions in students' self-regulated learning and achievement: A program of qualitative and quantitative research</article-title>
          .
          <source>Educational Psychologist</source>
          ,
          <volume>37</volume>
          (
          <issue>2</issue>
          ),
          <fpage>91</fpage>
          -
          <lpage>105</lpage>
          ,
          <year>2002</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          <string-name>
            <surname>Anaya-Durand</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Anaya-Huertas</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          <article-title>¿Motivar para aprobar o para aprender? Estrategias de motivación del aprendizaje para los estudiantes</article-title>
          .
          <source>Tecnología, Ciencia, Educación</source>
          ,
          <volume>25</volume>
          (
          <issue>1</issue>
          ),
          <fpage>5</fpage>
          -
          <lpage>14</lpage>
          ,
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>