<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Perceptions Regarding Online Assessment in Times of COVID-19: A Study of University Students in Lima</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Iván Montes-Iturrizaga</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Gloria María Zambrano Aranda</string-name>
          <email>gzambrano@pucp.edu.pe</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Yajaira Licet Pamplona-Ciro</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Klinge Orlando Villalba-Condori</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Pontificia Universidad Católica del Perú</institution>
          ,
          <addr-line>Av. Universitaria 1801, San Miguel, Lima,15088</addr-line>
          ,
          <country country="PE">Perú</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Universidad Catolica de Santa Maria</institution>
          ,
          <addr-line>San Jose S/N, Arequipa</addr-line>
          ,
          <country country="PE">Perú</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Universidad Continental</institution>
          ,
          <addr-line>Av. San Carlos 1980, Urb. San Antonio, Huancayo, 12000</addr-line>
          ,
          <country country="PE">Perú</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>The assessment of learning in the classroom requires relevant evidence (performance tests as opposed to answer selection tests) in order to establish adequate feedback processes. Thus, performance tests such as oral, essay or case exams are considered to offer an adequate guideline for engaging meaningfully with the study materials. In this context, a quantitative study was conducted to characterize students' perceptions of and preferences regarding the tests their teachers apply. The sample consisted of university students (n = 240) from a private university in Lima. A questionnaire was administered through Google Forms to students who were studying online (exceptionally) due to COVID-19. It was found that students were assessed mostly with multiple-choice tests combining rote questions and reasoning questions. In addition, it was found that students recognized essay exams as the most important for their training, yet they preferred multiple-choice exams.</p>
      </abstract>
      <kwd-group>
        <kwd>Assessment of learning</kwd>
        <kwd>multiple choice exam</kwd>
        <kwd>essay exam</kwd>
        <kwd>college students</kwd>
        <kwd>online education</kwd>
        <kwd>COVID-19</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        The assessment of learning in the classroom is one of the most relevant processes to lead students
towards the achievement of the competencies, skills, contents or abilities included in a curriculum
or program [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ], [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]– [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. This is because evaluation, if it is formative, should go hand in hand with
two relevant components. The first is associated with the quality of the evidence of
learning that we generate through the review of assignments, oral participation and exams
applied throughout a semester or academic cycle [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. This quality of evidence is directly
associated with the meaningfulness, realism and contextual validity of the assessment conditions
set for students [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. For example, an essay exam will have a closer correspondence to the
intellectual demands of a given professional field than one where students must simply select
the correct answer from a list of alternatives [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ], [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ].
      </p>
      <p>
        The second component or deployment is related to the possibility that university professors
use this evidence (hopefully those that obey performance exams) to provide feedback on their
teaching and learning processes. In this way, we could say that we would be facing an evaluative
process of formative nature; otherwise, we would be measuring or contemplating how students
are progressing in terms of their learning [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. For this reason, it could be affirmed that formative
assessment is a process oriented toward ensuring that students learn what they are meant to learn. In other
words, evaluation understood in this way is aimed at learning; this stands in total opposition to
practices that treat evaluation as a simple verification of what has been learned.
      </p>
      <p>0000-0002-9411-4716 (I. Montes-Iturrizaga); 0000-0001-6021-5757 (G. Zambrano Aranda);
0000-0001-9024-4444 (Y. Pamplona-Ciro); 0000-0002-8621-7942 (K. Villalba-Condori)
© 2023 Copyright for this paper by its authors.</p>
      <p>Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).</p>
      <p>CEUR Workshop Proceedings (CEUR-WS.org)</p>
      <p>
        On the other hand, it is worth mentioning that Peruvian universities, in the framework of the
reform generated by the current University Law 30220 of 2014, have seen the need to
deploy greater efforts toward better teaching. Under these influences, assessment would
have improved significantly thanks to the development of courses, workshops and specializations
in university teaching. A series of very determined actions had thus been initiated to optimize
professors' teaching skills and to investigate the impact of the measures undertaken, all in
a context of greater competitiveness among Peruvian universities seeking better positions in terms
of scientific production, respectability and standing in the spectrum of higher education
[
        <xref ref-type="bibr" rid="ref9">9</xref>
        ].
      </p>
      <p>
        However, these dynamics underwent abrupt changes and modifications as a result of the
COVID-19 pandemic [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. In this case, it is not that improvements in teaching had been stopped,
but rather, more urgent issues had to be addressed at that time, such as: the acquisition of
platforms for online teaching, teacher training in the use of these platforms and the transfer of
teaching materials to a virtual space little known by the vast majority of teachers [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. Likewise,
programs had to be designed to establish teaching skills through digital platforms. Some of these
training spaces would have been developed under a face-to-face education logic and others from
a perspective closer to engineering than to pedagogy. In other cases, situations have also been
reported where some university owners ordered the merger of several sections or classrooms
into one, and where a professor who previously taught 25 or 35 students went overnight
to having more than 100 students in a single virtual classroom [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]. This
would have occurred to a greater extent in private for-profit universities. Similar excessive
increases in the number of students per classroom would also have been
reported in other latitudes [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ].
      </p>
      <p>Thus, the situation described above would have had a significant impact on the quality of
teaching and of assessment itself: since it is practically impossible to assess 100 or 125
students through essay exams, teachers would have resorted to an excessive use of multiple-choice
exams. At the same time, it is to be expected that - in other cases - teachers transferred
the good or the bad of their face-to-face teaching (and their ways of assessing) to the online modality.
But this is also a problem for students, a significant percentage of whom graduate from secondary
education without having achieved the basic competencies needed to enter the university world.
This translates into problems in oral expression, in the search for reliable information, in writing
and at the level of thinking in general, among others. This is largely explained by the results of
PISA 2018 (Peru), which revealed the serious deficiencies of our young people who are close to
finishing high school.</p>
      <p>
        In addition, recent studies report the existence of problems that have become pandemic, such
as plagiarism, lack of academic integrity, the increase of artificial grading, the difficulty of grading
group work, and the absence of face-to-face relationships between professors and students [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ],
[
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. But also, the evidence points to favorable perceptions of inclusion, well-being and
satisfaction with online assessment in the context of the pandemic [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. On
the other hand, other studies report that in pandemics, scores would have risen considerably on
the tests [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ] and that students recognized that the online modality demanded a greater effort, but
with important advantages [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ]. It should be noted that most of these studies are oriented to
compare scores before and during the pandemic, but by means of multiple-choice exams [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ]; the
number of research studies (such as this one) that analyze the types of exams used in online
classes is smaller. This matters under a formative assessment perspective, where the type
of exam is especially important given the need to provide feedback to students [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ]. Other studies
have reported the need to transcend the traditional multiple-choice exam, which would have been
used insistently during this health crisis [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ], [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ]–[
        <xref ref-type="bibr" rid="ref18">18</xref>
        ].
      </p>
      <p>
        The scenario described above has become the object of study in numerous recently published
works that highlight the relevance of studying the evaluation process, as well as its instruments,
tools and conditions. By virtue of the aforementioned, the general objective of the present
research is to study the perceptions and preferences of university students with respect to the
assessment of their learning in the classroom, in the context of the emergency online teaching
delivered during the COVID-19 pandemic [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ], [19], [20].
      </p>
    </sec>
    <sec id="sec-2">
      <title>2. Methods</title>
      <p>A quantitative study was conducted using the survey method, with a questionnaire
constructed as the instrument [21]. In terms of its characteristics, this research is
observational (non-experimental), cross-sectional, descriptive, comparative and correlational.</p>
      <p>The sample consisted of 240 undergraduate university students from a private for-profit
university that serves vulnerable socioeconomic sectors of Metropolitan Lima (Peru). There were
181 (75.4%) women and 54 (24.6%) men. By degree program, they came from: 9 from
Nutrition (3.7%), 55 from Pharmacy (22.9%), 8 from Law (3.3%), 33 from Administration
(13.8%), 23 from Marketing (9.6%), 23 from Accounting (9.6%), 48 from Psychology (20%) and 41
from Nursing (17.1%). Ages ranged from 16 to 54 years, with a mean of 25.4 and a standard
deviation of 8.105.</p>
      <p>An instrument was developed in which the first part collected identification data such as sex,
age and degree program. The second part comprises 11 answer selection items that
explore students' perceptions regarding the type of tests (and their emphases) most frequently
applied by their university professors. The instrument also explores test preferences and students'
views on the most relevant instruments for their professional training. It should
be specified that this self-administered instrument was applied via Google Forms during
the second wave of the COVID-19 health crisis. It is also important to note that the questions
referred at all times to the assessment practiced by their teachers during emergency virtual
education in Peru. The instrument's content validity was established through the participation of
three expert judges. The questionnaire (anonymous) was answered voluntarily and under informed
consent.</p>
      <p>Statistical analyses were performed in SPSS version 27 for Windows, including descriptive,
comparative and correlational analyses.</p>
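      <p>As a minimal sketch of the descriptive statistics reported in this study (mean and standard deviation of age), the following uses hypothetical ages rather than the actual 240 responses; note that SPSS reports the sample standard deviation (n - 1 denominator):</p>

```python
import statistics

# Hypothetical ages for illustration; the actual 240 responses are not published here.
ages = [16, 19, 21, 23, 25, 27, 30, 34, 42, 54]

mean = statistics.mean(ages)
# statistics.stdev uses the n - 1 denominator, matching the sample
# standard deviation that SPSS reports.
sd = statistics.stdev(ages)
print(f"mean = {mean:.1f}, sd = {sd:.3f}")
```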
    </sec>
    <sec id="sec-3">
      <title>3. Results</title>
      <p>This study focuses on characterizing, from a quantitative approach, the perceptions that students
at a university in Lima (Peru) have regarding the tests used by their professors in the context of
emergency virtual education due to the health crisis caused by COVID-19. This is by virtue of the
fact that each of the professional training programs would present its own habits,
idiosyncrasies and evaluative practices, associated with its own pedagogical
traditions. In this way, these degree-specific emphases (and those that are common) would be
installed in the teaching staff, who have the power to decide which tests to use (and which
to exclude) in their classrooms.</p>
      <p>By virtue of the above, Table 1 shows that the students indicated that they would
have had to answer a greater number of answer selection tests (the so-called "objective
tests"). It should be pointed out that law students were the ones who considered that these tests
were least used with them (37.5%), whereas the students of all the other programs
considered that these instruments were the ones mostly applied to them. In this panorama, psychology
(91.7%) and nutrition (89.9%) students considered that these answer selection tests (PS) were
the ones most applied by their professors.</p>
      <p>The purpose was also to know how students viewed the answer selection tests (PS)
applied by their professors, by degree program. In this item, reported
in Table 2, only one alternative could be marked, and the students were asked how
these tests (PS) were applied by their professors (in general terms). It was found that the most
frequently applied PS tests combined rote items (demanding that irrelevant data
be memorized) with items oriented to thinking or reasoning. These are followed by PS tests with
items for understanding and thinking (reasoning) and, to a lesser extent, exclusively rote
tests, with much lower percentages (from 0.0% to 9.8%) across all the degree
programs. It is important to mention, as something favorable, that no student of psychology,
nutrition or law considered that exclusively rote answer selection tests were used in
their university courses. As for the PS tests more exclusively oriented to thinking or reasoning, it was
found that these were applied to a greater extent (from the students' perception) in nutrition
(44.4%), administration (39.4%) and law (37.5%).</p>
      <p>We were also interested in the preferences of students in all degree programs between
two types of instruments for evaluating their learning throughout their professional training (Table
3): essay tests (PE) and answer selection tests (PS). We found that the preferences of
university students pointed more to the PS than to the PE. It is worth mentioning that this
preference for PS was present in all degree programs except accounting (26.1%) and marketing
(43.5%). Likewise, we identified that nursing (68.3%) and pharmacy (65.5%) students were the
ones who preferred PS to the greatest extent.</p>
      <p>However, when students were asked about the most important tests for their university
education, we found that essay tests stood out in all the degree programs (Table 4). In other
words, PS tests are preferred in almost all degree programs even though students recognize that they
would not be as important, or would be the least relevant, for their preparation for the professional
field. In this case, it should be noted that 85% of the students at this university come (according to
data from its academic office) from the public schools with the lowest learning results in Metropolitan
Lima (Peru), as measured by the tests applied by the Ministry of Education. As is well known, the
students who graduate from these basic education schools evidence poor study habits, less
ability to understand what they read and incipient resources for expressing themselves adequately
in writing. Thus, this paradox would reveal a clear understanding that essay tests
(PE) are the most important; yet they are not preferred, because students would be anticipating
problems in facing them successfully given the deficiencies they would bring from high
school.</p>
      <p>We also wanted to know whether age (ordinal variable) was associated with the preference
for answer selection tests or essay tests. In this case, there was no association between these
variables (χ² = 0.430, df = 1, p = 0.512). Similarly, we did not find a significant association between
the age of the students and their view of the most relevant tests for professional training
(χ² = 1.566, df = 1, p = 0.211). In this sense, it was determined that the marked preference for
answer selection tests and the view of essay tests as the most relevant for professional
training were not associated with age. In other words, perceptions are
similar across all age groups.</p>
      <p>Along the same lines, we found that the sex of the students was not associated with
preferences for either answer selection or essay tests (χ² = 3.175, df = 2, p = 0.204). Likewise,
we found no association between the sex of the students and the
view of essay tests (PE) as the most important for their professional
training (χ² = 1.500, df = 2, p = 0.472).</p>
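      <p>The chi-square tests of independence reported above can be illustrated with a short computation. The sketch below uses hypothetical counts (the underlying contingency tables are not reproduced in this paper); for df = 1, the 5% critical value of the chi-square distribution is 3.841, so a statistic below that value indicates no significant association:</p>

```python
def chi2_statistic(table):
    """Pearson chi-square statistic for a contingency table (list of rows)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical 2x2 table: rows = age group (younger / older),
# columns = preferred test type (answer selection PS / essay PE).
table = [[80, 45], [72, 43]]
stat = chi2_statistic(table)
# df = (rows - 1) * (cols - 1) = 1; the 5% critical value is 3.841.
print(f"chi2 = {stat:.3f}, significant at 5%: {stat > 3.841}")
```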
    </sec>
    <sec id="sec-4">
      <title>4. Discussion</title>
      <p>
        This quantitative research was oriented to exploring the perceptions and preferences regarding
classroom evaluation among students of a private for-profit university that preferentially serves
vulnerable populations. In this context, the figures of this university report that 15% of students
are in poverty and 85% of them completed their high school studies in public schools in the
peripheral areas of the city of Lima. This datum is very important because we are probably dealing
with a student population (represented in the sample) with largely unsatisfactory school
experiences and poor cultural stimuli (books, reflective dialogues and student models). This fact
has most probably generated the paradox found, which reveals that PS tests are preferred while,
at the same time, essay tests (PE) are recognized as the most adequate for university education.
Given this, it is possible that for these students it is more complicated to have to compose a written
answer because, most likely, in basic education they had few opportunities to develop this
competence. On the other hand, answering an answer selection test - even a rote one -
would be more within reach given the low cognitive demand of these tests; even more so when
there are still teachers in much of the Peruvian educational system who are focused on
having their students retain large amounts of data, dates, principles and formulas in
memory [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ], [19].
      </p>
      <p>
        Added to this is the perceived emphasis of the tests applied by the university
professors in the study, where it was clearly identified that the most frequent instruments were
those of answer selection (PS) and, within these, those that combine rote items with
items that explore thinking or reasoning [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ].
      </p>
      <p>In any case, it will be necessary to develop qualitative studies aimed at understanding the
perceptions, attributions and reasons professors have for using some types of tests and not
others. Such studies would have to consider the professors of the specialty courses of each
degree, since the current study has found that not only the students but also the professors
would differ in terms of the evaluative process [22]. Also, other variables or
characteristics of the students should be considered in future studies in order to discriminate
more clearly the phenomena under study (even more so given that the degree program was the only
relevant variable we found). For example, considering the semester or cycle and the type of subject
(general studies and specialty) will make it possible to characterize and understand more
precisely the perceptions regarding the tests applied in the classroom [22]. It would also be
pertinent to study the university professors themselves to learn about the tests they apply and
the reasons they have for considering them for their classroom work.</p>
      <p>
        Finally, it is important to note that we have no previous studies on classroom assessment prior
to the pandemic, which would have played a baseline role in establishing reliable comparisons to
indicate whether assessment in virtual emergency education is the same or different from that
practiced in face-to-face settings. Nor have we considered evidence of academic performance in
the students in our sample: either their grades or their scores on tests administered by their
teachers. However, it is important to clarify that in this study we are focusing on perceptions
about the tests applied by teachers in their classes and on their preferences in terms of
assessment [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ]. In view of the above, and although we can hypothesize that classroom
evaluation would have suffered a setback in this health crisis at the university we have studied,
we do not have a study similar to the present one that could serve as a relevant comparator to
understand evaluation in these times [23].
      </p>
    </sec>
    <sec id="sec-5">
      <title>Acknowledgements</title>
      <p>Special thanks are extended to Professor Christian Pérez Sánchez for his collaboration in the
application of the questionnaire within the framework of this study. Gratitude is also expressed
to each of the university students who agreed to answer the questionnaire.</p>
      <p>Journal of Information and Education Technology, vol. 11, no. 5, pp. 220-228, May 2021, doi: 10.18178/ijiet.2021.11.5.1515.</p>
      <p>[19] F. J. García-Peñalvo, A. Corell, V. Abella-García, and M. Grande, "Online assessment in higher education in the time of COVID-19," Education in the Knowledge Society, vol. 21, pp. 1-26, 2020, doi: 10.14201/eks.23013.</p>
      <p>[20] W. J. Popham, Test better, teach better: the instructional role of assessment. Alexandria, Virginia: Association for Supervision and Curriculum Development, 2004. doi: 10.5860/choice.42-0445.</p>
      <p>[21] I. Montes Iturrizaga, L. M. Sime Poma, E. Salcedo Lobatón, E. Soria Valencia, and D. Briceño Vela, Investigación educativa: técnicas para el recojo y análisis de la información. Escuela de Posgrado, Maestría en Educación. [Online]. Available: https://posgrado.pucp.edu.pe/maestria/educacion/</p>
      <p>[22] D. B. Wayne, M. Green, and E. G. Neilson, "Medical education in the time of COVID-19," Sci Adv, vol. 6, no. 31, 2020, doi: 10.1126/sciadv.abc7110.</p>
      <p>[23] "Experience of e-learning and online assessment during the COVID-19 pandemic at the College of Medicine, Qassim University".</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[1] M. Z. Joya Rodríguez, "La evaluación formativa, una práctica eficaz en el desempeño docente," Revista Scientific, vol. 5, no. 16, pp. 179-193, 2020, doi: 10.29394/scientific.issn.2542-2987.2020.5.16.9.179-193.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>[2] I. Montes-Iturrizaga, "La evaluación en la universidad en tiempos de la virtualidad: ¿retroceso u oportunidad?," Revista Signo Educativo, Lima, Dec. 2020.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[3] I. Montes-Iturrizaga, Evaluación Educativa: reflexiones para el debate. Madrid: UDL Editores, 2020. [Online]. Available: https://www.amazon.com/-/es/Iván-Montes-Iturrizagaebook/dp/B08KRZ5DJ5</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[4] R. Jáuregui, L. Carrasco, and I. Montes, "Evaluando, evaluando: ¿Qué piensa y qué hace el docente en el aula?," Informe Final de Investigación, Universidad Católica Santa María, Perú, vol. 204, 2003. Recuperado desde http://cies.org.pe/files/active/0</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[5] C. Rosales, Evaluar es reflexionar sobre la enseñanza, 3rd ed. Madrid: Narcea Ediciones, 2014.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>[6] W. J. Popham, Evaluación trans-formativa: El poder transformador de la evaluación formativa. Narcea Ediciones.</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>[7] J. A. Román, "La educación superior en tiempos de pandemia: una visión desde dentro del proceso formativo," Revista Latinoamericana de Estudios Educativos, vol. 50, no. ESPECIAL, pp. 13-40, 2020, doi: 10.48102/rlee.2020.50.especial.95.</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] R. Stake, "The countenance of educational evaluation," Teach Coll Rec, pp. 523-540, 1967.</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[9] I. Montes-Iturrizaga, "Apreciaciones en torno a la propuesta de nueva Ley Universitaria," Revista Signo Educativo, Lima, pp. 26-28, May 2014.</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>A. H.</given-names>
            <surname>Al-Maqbali and R. M. R. Hussain</surname>
          </string-name>
          , “
          <article-title>The impact of online assessment challenges on assessment principles during COVID-</article-title>
          19 in Oman,
          <source>” Journal of University Teaching and Learning Practice</source>
          , vol.
          <volume>19</volume>
          , no.
          <issue>2</issue>
          , pp.
          <fpage>73</fpage>
          -
          <lpage>92</lpage>
          ,
          <year>2022</year>
, doi: 10.53761/1.19.2.6.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>A.</given-names>
            <surname>Friedman</surname>
          </string-name>
,
          <string-name>
            <given-names>I.</given-names>
            <surname>Blau</surname>
          </string-name>
          , and
          <string-name>
            <given-names>Y.</given-names>
            <surname>Eshet-Alkalai</surname>
          </string-name>
, “
          <article-title>Cheating and Feeling Honest: Committing and Punishing Analog versus Digital Academic Dishonesty Behaviors in Higher Education</article-title>
          ,”
          <source>Interdisciplinary Journal of e-Skills and Lifelong Learning</source>
          , vol.
          <volume>12</volume>
          , pp.
          <fpage>193</fpage>
          -
          <lpage>205</lpage>
          ,
          <year>2016</year>
          , doi: 10.28945/3629.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>M. A.</given-names>
            <surname>Rahman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Novitasari</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Handrianto</surname>
          </string-name>
          , and
          <string-name>
            <given-names>S.</given-names>
            <surname>Rasool</surname>
          </string-name>
          , “
<article-title>Challenges In Online Learning Assessment During The COVID-19 Pandemic</article-title>
          ,”
          <source>KOLOKIUM Jurnal Pendidikan Luar Sekolah</source>
          , vol.
          <volume>10</volume>
          , no.
          <issue>1</issue>
          , pp.
          <fpage>15</fpage>
          -
          <lpage>25</lpage>
          , Apr.
          <year>2022</year>
, doi: 10.24036/kolokium.v10i1.517.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>D.</given-names>
            <surname>Domínguez-Figaredo</surname>
          </string-name>
,
          <string-name>
            <given-names>I.</given-names>
            <surname>Gil-Jaurena</surname>
          </string-name>
          , and
          <string-name>
            <given-names>J.</given-names>
            <surname>Morentin-Encina</surname>
          </string-name>
          , “
          <article-title>The Impact of Rapid Adoption of Online Assessment on Students' Performance and Perceptions: Evidence from a Distance Learning University</article-title>
,”
          <source>The Electronic Journal of e-Learning</source>
          , vol.
          <volume>20</volume>
          , no.
          <issue>3</issue>
          , pp.
          <fpage>224</fpage>
          -
          <lpage>241</lpage>
          ,
          <year>2022</year>
          , [Online]. Available: www.ejel.org
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
[14] “
          <article-title>Online learning and assessment during the Covid-19 pandemic exploring the impact on undergraduate</article-title>
          .”
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>B.</given-names>
            <surname>Hassan</surname>
          </string-name>
et al., “
          <article-title>Online assessment for the final year medical students during COVID-19 pandemics; the exam quality and students' performance</article-title>
          ,”
          <source>Oncology and Radiotherapy</source>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
<string-name>
            <surname>Montes-Iturrizaga</surname>
          </string-name>
          and
          <string-name>
            <surname>Franco-Chalco</surname>
          </string-name>
          , “
          <article-title>Formative learning assessment Covid 19</article-title>
          .”
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>W. J.</given-names>
            <surname>Popham</surname>
          </string-name>
,
          <source>Classroom Assessment: What Teachers Need to Know</source>
          , Eighth ed. Los Angeles: Pearson,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <given-names>L.</given-names>
            <surname>Ali</surname>
          </string-name>
and
          <string-name>
            <given-names>N. A. H. H.</given-names>
            <surname>Al Dmour</surname>
          </string-name>
          , “
          <article-title>The shift to online assessment due to COVID-19: An empirical study of university students, behaviour and performance, in the region of UAE</article-title>
          ,” International
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>