<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Quality Assurance Methods Assessing Instructional Design in MOOCs that implement Active Learning Pedagogies: An evaluative case study</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Valeria Aloizou</string-name>
          <email>aloizouv@gmail.com</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sara Lorena Villagrá Sobrino</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Alejandra Martínez Monés</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Juan I. Asensio-Pérez</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sara García Sastre</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>GSIC/EMIC Research Group, Universidad de Valladolid</institution>
          ,
          <addr-line>Valladolid</addr-line>
          ,
          <country country="ES">Spain</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>University of Piraeus</institution>
          ,
          <addr-line>Piraeus</addr-line>
          ,
          <country country="GR">Greece</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2019</year>
      </pub-date>
      <fpage>14</fpage>
      <lpage>19</lpage>
      <abstract>
        <p>As more and more Massive Open Online Courses (MOOCs) appear, constituting an innovative part of online education, they should undergo quality assurance to assess their pedagogical design, just like other online courses. Recently, new studies have presented quality assurance (QA) methods for assessing the instructional design of MOOCs with pedagogical innovations, such as those implementing active learning pedagogies (e.g., collaboration, gamification), which provide designers and instructors with useful strategies for designing more interactive MOOCs. However, it is not clear how these efforts are being addressed, or to what extent they are appropriate for assessing active learning pedagogies. Therefore, a Systematic Literature Review (SLR) was conducted to identify the most mature existing MOOC QA methods. Afterwards, an evaluative case study was carried out, based on the Evaluand-oriented Responsive Evaluation Model (EREM), to apply the selected methods to a MOOC implementing active learning pedagogies. The results suggest that the instruments of the selected QA methods need enrichment to assess effectively instructional design based on active learning pedagogies, either by providing specific questions suited to this kind of MOOC, or by stating the underlying pedagogical model clearly so that designers can consider beforehand whether it is appropriate for their case. The results of the study are a first step towards defining new, enriched quality assessment methods for MOOCs that apply active learning approaches.</p>
      </abstract>
      <kwd-group>
        <kwd>MOOCs</kwd>
        <kwd>Quality Assurance Methods</kwd>
        <kwd>Quality Frameworks</kwd>
        <kwd>Instructional Design</kwd>
        <kwd>Active Learning Pedagogies</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        Millions of people are learning in hundreds of Massive Open Online Courses
(MOOCs). MOOC learners are a vast online learning community with diverse
motivational interests; therefore MOOCs should also undergo quality assurance like other
online courses [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Recently, some new studies have appeared presenting quality
assurance (QA) methods assessing specific aspects of the instructional design in
MOOCs, such as: collaboration, feedback, course overview, learning objectives,
assessment, instructional materials, learner interaction and engagement, learner support
and accessibility [
        <xref ref-type="bibr" rid="ref2 ref3">2,3</xref>
        ]. However, most existing QA approaches for MOOCs do not
cater for the increasing use of active learning pedagogies (e.g. collaboration,
gamification), that aim at proposing richer learning experiences that go beyond
one-size-fits-all approaches [
        <xref ref-type="bibr" rid="ref4 ref5 ref6">4-6</xref>
        ]. This paper addresses this research gap in the QA for
MOOCs implementing active learning pedagogies, by exploring the following
research question (RQ): How can we assess the instructional design quality of a MOOC
implementing active learning pedagogies? In order to answer this RQ, we firstly
carried out a Systematic Literature Review (SLR) to identify and select the most mature
existing MOOC QA methods. Secondly, an evaluative case study was carried out,
based on the Evaluand-oriented Responsive Evaluation Model (EREM), to apply the
selected methods to a MOOC implementing active learning pedagogies and evaluate
their instruments. Finally, the findings and the results of the data analysis are
reported, followed by the main conclusions and recommended lines for future work.
      </p>
    </sec>
    <sec id="sec-2">
      <title>Related Work</title>
      <p>
        Following the RQ of the present study, we first surveyed the terrain of QA
in MOOCs through an SLR [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] aiming to find out: (i) which QA methods have been
proposed so far? and (ii) what are their main strengths and weaknesses when assessing
the instructional design of a MOOC? Specific electronic databases were selected;
IEEE Xplore Digital Library, Springer Link, ACM Digital Library and Google
Scholar, and particular search strings were used: “MOOCs quality”, “pedagogical
quality in MOOCs” and “instructional design QA in MOOCs”, including journal
publications, conference proceedings, books and book chapters. The inclusion criteria
were publications: (i) written in English, (ii) referring to QA and accreditation, (iii)
analyzing quality frameworks and (iv) presenting quality indicators. The exclusion
criterion was publications not referring to online learning QA at all. The 10 initially identified QA
methods were assessed to find out which of them better fulfilled the following
criteria: (i) focus on MOOCs, (ii) inclusion of assessment instruments, (iii) focus on
instructional design, (iv) evidence of testing of the framework, (v) inclusion of the
process/methodology of analysis, and (vi) assessment of active learning pedagogies.
Three QA methods fulfilled most of the criteria and were finally selected to proceed
with the case study: the Ten-principle framework [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], the OpenupED Quality Label
[
        <xref ref-type="bibr" rid="ref9">9</xref>
        ] and Quality Matters [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ].
      </p>
    </sec>
    <sec id="sec-3">
      <title>Methodology</title>
      <sec id="sec-3-1">
        <title>3.1 Context</title>
        <p>In order to address the RQ in real contexts, we selected the “Innovative and
Collaborative Learning with ICT” MOOC (CLAT MOOC) carried out by the University of
Pompeu Fabra and the University of Valladolid. The course was designed based on
the principles of active learning pedagogies, applying innovative elements in terms of
collaboration and gamification, such as innovative group formation policies and
collaborative quizzes leading to badge acquisition. CLAT MOOC was a 6-week course,
with 759 total enrollments, delivered by 4 teachers. The course was targeted at
innovative pre-service and in-service teachers interested in incorporating collaboration
with technology into their own teaching practices.</p>
      </sec>
      <sec id="sec-3-2">
        <title>3.2 The case study</title>
        <p>
          The assessment tools of the three selected QA methods were applied in a real case
study of the CLAT MOOC. The design of the evaluative case study was based on the
Evaluand-oriented Responsive Evaluation Model (EREM), which highly encourages
the plurality of data gathering techniques and informants in order to obtain different
perspectives about the Evaluand, thus enriching the evaluation process [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ]. We
considered this research method the most appropriate, given the mixed data
gathering techniques and the multiple informants needed in this phase of the study.
The Issue (the evaluative tension in EREM terminology) of the case study was
defined as: To what extent did the selected QA methods help assess the instructional
design quality and the active learning pedagogies of the CLAT MOOC? To help
illuminate our research question, we performed an anticipatory data reduction process
during the evaluation design (see Figure 1), where the Issue was split into more
concrete topics (T). Each topic was explored through a number of Informative Questions
(IQ), in order to give answers to the issue of the CLAT MOOC study.
        </p>
        <p>
          Multiple data gathering techniques were used (see Table 1). Firstly, we analyzed
the design of the CLAT MOOC in order to understand its context, which was
enriched with data gathered through informal interviews with the CLAT MOOC
designers. Secondly, we carried out an evaluation of the quality tools through a
questionnaire and a focus group, addressed to seven (7) research experts in learning
design and assessment in MOOCs. Important findings emerged after the content
analysis of the transcription and the notes, which are presented in the following section
along with the supporting evidence [
          <xref ref-type="bibr" rid="ref12">12</xref>
          ].
        </p>
      </sec>
      <sec id="sec-3-3">
        <title>3.3 Findings</title>
        <table-wrap id="tab1">
          <label>Table 1</label>
          <caption>
            <p>Data gathering techniques used in the case study.</p>
          </caption>
          <table>
            <thead>
              <tr>
                <th>Description of technique</th>
                <th>N</th>
                <th>Label</th>
              </tr>
            </thead>
            <tbody>
              <tr>
                <td>Analysis of the CLAT MOOC instructional design to understand how the active learning strategies were designed and implemented.</td>
                <td>10</td>
                <td>[CLAT]</td>
              </tr>
              <tr>
                <td>Informal interviews with CLAT MOOC designers to analyze the instructional design of the CLAT MOOC.</td>
                <td>2</td>
                <td>[INT]</td>
              </tr>
              <tr>
                <td>Questionnaires provided to research experts, to gather their initial opinion regarding the QA tools after their application on the CLAT MOOC.</td>
                <td>7</td>
                <td>[QUEST]</td>
              </tr>
              <tr>
                <td>Focus group with research experts to gather their overall opinion about the QA methods.</td>
                <td>7</td>
                <td>[FOC]</td>
              </tr>
            </tbody>
          </table>
        </table-wrap>
        <p>The main findings derived from the data analysis, in terms of the Issue and its
corresponding topics and helping to illuminate the IQs, are presented below. T1: According to
the participants, the assessment tools could be useful since they: (i) provide guidelines to
MOOC designers, (ii) help MOOC designers identify gaps during the design
process, and (iii) could work as a good strategy when re-designing a MOOC, considering the
results of the quality assessment (see Table 2, [QUEST]-A). T2: Participants were
not satisfied with the questions referring to collaborative activities and active
learning pedagogies, arguing that there is room for further improvement (see
Table 2, [QUEST]-B). T3: The effort required to apply the quality tools was
assessed in terms of the time they demand, a factor affected both by how
comprehensible the vocabulary is and by the layout of the quality tools (see
Table 2, [QUEST]-C). T4: Participants stated that a QA method by itself cannot help
them assess the objectives that a designer sets (see Table 2, [FOC]-A). They also
suggested forming more specific questions, with special emphasis on reflection
about pedagogical aspects such as collaboration; in this way, the quality of particular
instructional design initiatives could be assessed more efficiently (see Table 2,
[FOC]-B). T5: Most of the participants considered the QA methods useful, since
they could reflect on important elements of their MOOCs; however, there is room for
improvement (see Table 2, [QUEST]-D). T6: Lastly, it was pointed out that one of
the main challenges MOOC teachers face is the assessment of students’ final
products, due to the massive scale of the courses (see Table 2, [FOC]-C).</p>
        <table-wrap id="tab2">
          <label>Table 2</label>
          <caption>
            <p>Selected evidence from the data sources.</p>
          </caption>
          <table>
            <thead>
              <tr>
                <th>Source</th>
                <th>Data</th>
              </tr>
            </thead>
            <tbody>
              <tr>
                <td>[QUEST]</td>
                <td>A. The QA tools were useful to identify aspects that should be taken into account when designing a MOOC, and possible weaknesses that have to be addressed. For example, in our case, the aspects related to assessment and the institutional support seem to be the weakest. B. The questions regarding active pedagogies were mainly focused on participation and collaboration, but there are many other strategies that promote active learning and were not evaluated (e.g., inquiry-based learning, problem-solving, role-playing, game-based learning or gamification). C. For me the Ten-principle framework dealt with the instructional design and it was easier for me to answer its questions. The other two had some questions that I did not find easy to answer. D. I’d say they helped me to realize that the instructional design had taken into account some “basic” elements of any learning design (e.g., stating objectives, defining assessment criteria, etc.). On the other hand, the assessment of feedback is not very well developed. Important aspects regarding the assessment that affect the quality design in a MOOC are not covered in this instrument.</td>
              </tr>
              <tr>
                <td>[FOC]</td>
                <td>A. I can’t expect from any instrument to say that this objective is correct, because it depends on each MOOC context. B. For example, for the collaborative learning, I would expect questions such as: “Did you choose specific criteria for forming the groups?” C. Also, we couldn’t assess the students’ final products. We couldn’t provide real assessment for the objectives after the activities submission.</td>
              </tr>
            </tbody>
          </table>
        </table-wrap>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Conclusions and future work</title>
      <p>The work presented in this paper has explored the area of QA in MOOCs, and its
findings have provided initial insights into how the most significant existing methods
can be enriched to assess effectively the instructional design in MOOCs implementing
active learning pedagogies. To sum up, there is indeed a need for QA methods in order
to: (i) detect weaknesses and elements of the course that need improvement, (ii)
assess elements that have not been taken into account while designing, (iii) acquire
important information when re-designing a MOOC. QA methods should state the
underlying pedagogical model clearly and include simple, clear questions that
also assess specific elements of active learning pedagogies. Thus, designers
could end up with more valid and accurate conclusions about the quality of their
MOOCs. A line of future work opened by this research is to consider the
above-mentioned insights and apply them to the definition of a new, enriched QA method for
assessing MOOCs.</p>
    </sec>
    <sec id="sec-5">
      <title>Acknowledgments</title>
      <p>This research has been partially funded by the European Regional Development
Fund and the National Research Agency of the Spanish Ministry of Science,
Innovations and Universities under project grants TIN2017-85179-C3-2-R and
TIN2014-53199-C3-2-R, by the European Regional Development Fund and the Regional
Ministry of Education of Castile and Leon under project grant VA257P18, and by the
European Commission under project grant 588438-EPP-1-2017-1-EL- EPPKA2-KA.
The authors thank the rest of the GSIC-EMIC research team for their valuable ideas
and support.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Gamage</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fernando</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Perera</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          :
          <article-title>Quality of MOOCs: A review of literature on effectiveness and quality aspects</article-title>
          .
          <source>Ubi-Media Computing (UMEDIA)</source>
          . pp.
          <fpage>224</fpage>
          -
          <lpage>229</lpage>
          . IEEE. Colombo, Sri Lanka. (
          <year>2015</year>
          ). doi:10.1109/UMEDIA.2015.7297459
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Yousef</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chatti</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Schroeder</surname>
            ,
            <given-names>U.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Wosnitza</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>What drives a successful MOOC? An Empirical Examination of Criteria to Assure Design Quality of MOOCs</article-title>
          .
          <source>IEEE 14th International Conference in Advanced Learning Technologies (ICALT)</source>
          . pp.
          <fpage>44</fpage>
          -
          <lpage>48</lpage>
          . Athens, Greece. (
          <year>2014</year>
          ). doi:10.1109/ICALT.2014.23
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Jansen</surname>
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rosewell</surname>
            <given-names>J.</given-names>
          </string-name>
          &amp;
          <string-name>
            <surname>Kear</surname>
            <given-names>K.</given-names>
          </string-name>
          :
          <article-title>Quality Frameworks for MOOCs</article-title>
          . In:
          <string-name>
            <surname>Jemni</surname>
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kinshuk</surname>
          </string-name>
          ,
          <string-name>
            <surname>Khribi</surname>
            <given-names>M.</given-names>
          </string-name>
          (eds):
          <source>Open Education: from OERs to MOOCs</source>
          . Lecture Notes in Educational Technology. pp.
          <fpage>261</fpage>
          -
          <lpage>281</lpage>
          . Springer. Berlin, Heidelberg. (
          <year>2017</year>
          ). doi:10.1007/978-3-662-52925-6_14
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Bonwell</surname>
            ,
            <given-names>C.C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Eison</surname>
            ,
            <given-names>J.A.</given-names>
          </string-name>
          :
          <article-title>Active Learning: Creating Excitement in the Classroom</article-title>
          .
          <article-title>ASHE-ERIC High</article-title>
          .
          <source>Educ. Rep. No. 1</source>
          . Washington, DC. (
          <year>1991</year>
          ). ERIC: ED340272
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Khalil</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wong</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>de Koning</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ebner</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Paas</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          :
          <article-title>Gamification in MOOCs: A Review of the State of the Art</article-title>
          .
          <source>Global Engineering Education Conference (EDUCON)</source>
          . pp.
          <fpage>1629</fpage>
          -
          <lpage>1638</lpage>
          . IEEE. Tenerife, Spain. (
          <year>2018</year>
          ). doi:10.1109/EDUCON.2018.8363430
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Stracke</surname>
            ,
            <given-names>C. M.:</given-names>
          </string-name>
          <article-title>The Quality of MOOCs: How to improve the design of open education and online courses for learners?</article-title>
          <source>International Conference on Learning and Collaboration Technologies</source>
          . pp.
          <fpage>285</fpage>
          -
          <lpage>293</lpage>
          . Springer. Vancouver, Canada (
          <year>2017</year>
          ). doi:10.1007/978-3-319-58509-3_23
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Kitchenham</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Charters</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Guidelines for performing systematic literature reviews in software engineering</article-title>
          .
          <source>Technical report. Ver. 2</source>
          .3. Durham, UK. EBSE. (
          <year>2007</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Margaryan</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bianco</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          &amp;
          <string-name>
            <surname>Littlejohn</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>Instructional quality of Massive Open Online Courses (MOOCs)</article-title>
          . pp.
          <fpage>77</fpage>
          -
          <lpage>83</lpage>
          . Computers &amp; Education. Glasgow, UK. (
          <year>2015</year>
          ). doi:10.1016/j.compedu.2014.08.005
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Rosewell</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>Benchmarks for MOOCs: the OpenupEd quality label</article-title>
          .
          <source>International Conference on Enhancement and Innovation in Higher Education. Glasgow</source>
          , UK. (
          <year>2015</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Lowenthal</surname>
            ,
            <given-names>R. L.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Hodges</surname>
            ,
            <given-names>C. B.</given-names>
          </string-name>
          :
          <article-title>In Search of Quality: Using Quality Matters to Analyze the Quality of Massive, Open, Online Courses (MOOCs)</article-title>
          .
          <source>The International Review of Research in Open and Distributed Learning</source>
          . pp.
          <fpage>83</fpage>
          -
          <lpage>100</lpage>
          . (
          <year>2015</year>
          ). doi:10.19173/irrodl.v16i5.2348
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Jorrín-Abellán</surname>
            ,
            <given-names>I. M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Stake</surname>
            ,
            <given-names>R. E.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Martínez-Monés</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>The needlework in evaluating a CSCL system: The Evaluand oriented responsive evaluation model</article-title>
          .
          <source>In Proceedings of the 9th international conference on Computer supported collaborative learning</source>
          . pp.
          <fpage>68</fpage>
          -
          <lpage>72</lpage>
          .
          <source>International Society of the Learning Sciences. Rhodes</source>
          , Greece. (
          <year>2009</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <surname>Miles</surname>
            ,
            <given-names>M. B.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Huberman</surname>
            ,
            <given-names>A. M.</given-names>
          </string-name>
          :
          <article-title>Qualitative data analysis: An expanded sourcebook</article-title>
          .
          <source>SAGE</source>
          . New Delhi, India. (
          <year>1994</year>
          ).
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>