=Paper=
{{Paper
|id=Vol-2066/seels2018paper03
|storemode=property
|title=Towards a Pattern Catalogue for E-Assessment System Integration
|pdfUrl=https://ceur-ws.org/Vol-2066/seels2018paper03.pdf
|volume=Vol-2066
|authors=Michael Striewe
|dblpUrl=https://dblp.org/rec/conf/se/Striewe18
}}
==Towards a Pattern Catalogue for E-Assessment System Integration==
<pdf width="1500px">https://ceur-ws.org/Vol-2066/seels2018paper03.pdf</pdf>
<pre>
         Towards a Pattern Catalogue for E-Assessment
                      System Integration

                        Michael Striewe
      paluno – The Ruhr Institute for Software Technology
                 University of Duisburg-Essen
                       Essen, Germany
             michael.striewe@paluno.uni-due.de

SEELS 2018: Software Engineering für E-Learning-Systeme @ SE18, Ulm, Germany

    Abstract— This paper presents preliminary results of an extensive
literature study on software components commonly used in e-assessment
systems. The purpose of the study is to prepare the creation of a pattern
catalogue for design patterns which can be used for integrating
e-assessment features into larger systems.

    Keywords— E-learning and e-assessment systems, design patterns,
system integration

                          I. INTRODUCTION

    Following a general tendency in system design and system architectures
in recent decades, educational systems transformed in three generations
from monolithic blocks via modular systems to service-oriented frameworks
[5]. This is a comprehensible development due to the many similarities
between educational systems and other software products. Consequently,
there has also been a tendency in very recent years to move forward to
cloud-based solutions in e-learning and e-assessment, which is considered
a fourth generation by some authors [11].

    These trends were driven not only by purely technical innovations, but
also by actual requirements in the context of these systems. For example,
service-oriented architectures were introduced in particular due to the
need for sharing materials or assessments across courses, teachers, or
even institutions [1, 4]. A similar need for sharing expert systems and
knowledge modules also led to modularization in the area of Intelligent
Tutoring Systems (ITS) [8], which usually also include some kind of
assessment features. Learning management systems (LMS) likewise included a
rising number of e-assessment features. Especially those systems that are
developed (as open source projects) by a distributed community (such as
MOODLE or ILIAS) benefit from modularization. With rising numbers of
students and in particular rising numbers of electronic assessments,
scalability became a crucial issue for e-assessment systems and thus
pushed arguments in favor of cloud solutions to the fore [25].

    While the notion of different generations of systems according to
their architecture refers in the first place to the internal structure of
these systems, modularization is also a prerequisite for constructing
integrated systems. Integrated e-assessment can be understood in two ways:
in the sense of technical integration of e-assessment features into other
tools, and in the sense of assessment activities integrated into larger
educational contexts. Neither case is possible from the software
engineering perspective without understanding educational systems as a
composition of components and services. Although situations might exist in
which a system offering only e-assessment features is appropriate to use,
ITS or learning management systems can be expected to integrate
e-assessment capabilities either as components of their own or as external
services. This view is particularly favorable when using a broad notion of
assessment that includes any kind of non-formal (self-)assessment that
might occur during learning and training. Consequently, there will be no
strict definition of how to tell an LMS with e-assessment features from an
e-assessment system with LMS features and the like.

    The remainder of this paper hence reports in chapter II on different
kinds of components found in the literature that typically appear in the
context of educational systems. The assumption is that these components
may be integrated with other components in a system offering e-assessment
features. The goal of this chapter is to compile an overview including a
rough description of component interfaces. The intention is to use this
overview as a baseline for subsequent considerations on architectural
patterns. Chapter III provides a first and preliminary sketch of these
considerations. It takes up some of these findings and provides abstract
descriptions of some recurring patterns found in the components and
systems mentioned above. The goal is not yet to provide a full pattern
catalogue, but to present and discuss abstractions on various granularity
levels by example. Chapter IV reviews these results in order to name
future work towards a more complete pattern catalogue.

                 II. A LITERATURE STUDY ON COMPONENTS

    The following sections provide an overview of typical components
related to e-assessment features that can be found in the literature. The
study includes publications from major conferences and journals in the
computer-aided assessment and intelligent tutoring systems community as
well as documentation for commercial tools. It particularly includes
(amongst other sources) a systematic review of papers from the
International Conference on Technology Enhanced Assessment (TEA)
(formerly known as the International Conference on Computer Assisted
Assessment
(CAA)), the IEEE Global Engineering Education Conference (EDUCON), the
International Conference on Intelligent Tutoring Systems (ITS), and the
IEEE Transactions on Learning Technology (TLT). Although the study
provides some remarks on the quantity of publications, its focus is on
the qualities and characteristics of the components.

A. User Interface Components

    The literature review identified three main user interface
components, one of which faces the students while two face the educators
or administrators.

    A student frontend (also called student LMS, student VLE, student
CMS, student agent, or learning interface) is mentioned most commonly in
the literature [1, 2, 3, 8, 10, 11, 12, 14, 18, 25, 26]. It offers
features to display assessments to the students and to retrieve their
answers. The student frontend is thus typically highly interactive, and
the number of different item types supported by an e-assessment system is
typically determined by the number of different types of interactions the
student frontend is able to offer. This in turn explains the large number
of papers on student interfaces, as publishing new features in this area
appears highly attractive for the community. Systems often employ one
student frontend component, which is extensible by plug-ins (see section
III.C).

    A teacher frontend (also called teacher LMS, teacher VLE, teacher
CMS, or admin agent) is mentioned explicitly less often in the literature
[2, 11, 17, 25]. It offers features for administration, authentication,
and assessment scheduling. It thus aggregates the features related to the
organizational aspects of assessments. As these aspects are less often
the focus of research, publication counts for these interfaces are low,
which does not imply that they are offered less often by e-assessment
systems.

    More often, an authoring tool is discussed explicitly in the
literature [3, 12, 17, 19, 20, 21, 22]. It offers features required to
create contents, which in particular refers to assessment items, item
pools, and grading schemas. It thus aggregates the features related to
the educational aspects of assessment and is related more closely to the
student interface and its features. It is therefore of greater interest
to research and mentioned more often in the literature, but also
remarkably often by commercial tools.

B. Educational Components

    The core of e-assessment systems is their educational qualities and
thus the algorithmic power they offer for generating content, providing
advice, and evaluating answers. The literature study identified four
components that relate to this area. They are discussed here in the order
of appearance during an assessment.

    An assessment generator (also called instructional manager,
curriculum agent, task selector, tutoring component, or steering
component) is mentioned very often in the literature [6, 8, 10, 15, 18,
20, 23, 25, 26]. It is concerned with preparing an assessment for
delivery to the student. This often includes selecting appropriate items
from an item pool in case of adaptive system behavior in order to
individualize training or assessment. However, it can also appear in
non-adaptive contexts in which a particular exam nevertheless needs to be
retrieved from a database to be delivered to a student. As the former
case attracts a lot of research, it is highly present in the literature.

    An additional problem generator (also called item constructor) is
mentioned sometimes in the literature as well [1, 20, 21, 26]. It is
concerned with filling item templates with actual content, for example by
creating random numbers. Consequently, it is not used in contexts in
which fixed items are used and in which any adaptations are performed by
the assessment generator mentioned above. This explains the lower number
of occurrences in the literature.

    A pedagogical module (also called hint generator) is mentioned
sometimes in the literature [6, 10, 18, 26]. It is concerned with
providing hints to students while they work on an assessment item.
Consequently, these components primarily occur in assessments that focus
on learning, training, or tutoring instead of formal evaluation of
student performance. Notably, a literature review from 2009 [24]
explicitly makes a distinction between plain feedback on correctness
(which would refer to an evaluator component as discussed in the next
paragraph) and more intelligent analysis as required by a pedagogical
module. Although one would expect the latter to be a crucial part of
intelligent tutoring systems, the literature review reports a low
occurrence rate of components for intelligent analysis of student
solutions in intelligent tutoring systems (3 out of 34).

    An evaluator component (also called checker, diagnose module,
assessor, or expert module) is mentioned very often in the literature [1,
2, 3, 8, 10, 14, 18, 21, 26]. It is concerned with analyzing submissions
from students and identifying mistakes that may occur in these
submissions. As part of that, it is also concerned with the generation of
feedback that is presented to the student. It is hence somewhat similar
to the pedagogical module mentioned above and may be used by such
modules. However, it may also be much simpler in that it basically just
applies a grading schema to a solution but is not able to provide any
hint on how to improve a wrong solution. As this seems to be sufficient
in several situations, an evaluator component is mentioned much more
often than a pedagogical module. Large e-assessment systems often employ
a large number of different evaluator components, each specialized to
process a specific type of input or to create a specific type of
feedback.

C. Knowledge Representation and Storing Components

    Virtually any e-assessment system contains a component for general
data storage for users, assessment items, and solutions. These very basic
features are common to almost every information processing system and are
thus out of scope for this literature study. However, there are also
components for storing more specific data, which are often mentioned in
the context of intelligent tutoring systems or adaptive assessment
systems.

    A domain knowledge model (also called knowledge base) is mentioned
often in the literature [3, 6, 7, 8, 10, 18, 22, 26]. It is responsible
for storing information on the domain of the
assessment that is not specific to a certain assessment item but reflects
facts or competencies of the particular domain. Domain knowledge models
are mentioned most often in conjunction with expert modules that are able
to evaluate a submission by using domain knowledge, but without
explicitly knowing the correct answer to the particular assessment item.
The same goes for connections to pedagogical modules that use domain
knowledge to generate hints.

    A student model is mentioned often in the literature as well [3, 6,
7, 8, 14, 15, 18, 22, 23, 26]. It is responsible for storing information
on a particular student, which again is not specific to a particular
assessment item. Instead, a student model reflects competencies or
similar properties that relate to the person and his or her capabilities
or performance. These may be designed as records referring to an
underlying competency model, which in turn is stored in a domain
knowledge model as mentioned above. Student models are mentioned most
often in conjunction with adaptive system behavior, where adaptation is
based on the information stored in the student model.

    Additional domain-specific data storage is mentioned only rarely in
the literature [16]. It is relevant only in domains in which submissions
to assessment items are large or complex objects, such as program code in
the domain of programming assessment. Consequently, specific components
for this purpose are explored only in conjunction with these domains and
almost never as part of general assessment systems.

D. Management Components

    The core features and requirements of e-assessment systems motivate
the components discussed so far. However, additional requirements may
introduce further components, and some components may exist primarily for
the sake of better software architectures. In general, these components
are far less present in the literature.

    A reservation service realizes an additional feature of e-assessment
systems that is reported sometimes in the literature and by commercial
tools [16, 17, 20]. It is responsible for registering students for
assessments and thus covers an additional part of the organizational
process around assessments, which is not necessarily covered by the
teacher frontend discussed in section II.A above.

    A service broker (also called spooler or middleware) is mentioned in
discussions of system architectures only [2, 9]. It connects frontend or
steering components to evaluator components that may run in parallel on
separate systems for performance or security reasons.

    An infrastructure agent is reported for cloud-based solutions only
[25]. It is responsible for starting and shutting down instances of other
components to adjust the size of the running system to the current needs.
It is only necessary in systems which are aware of being a cloud system.
In contrast, components can also be deployed as services in a cloud-based
or container-based environment in which the underlying cloud or container
infrastructure is responsible for starting and shutting down additional
instances.

        III. ARCHITECTURAL PATTERNS FOR E-ASSESSMENT SYSTEMS

    The previous chapter reported on typical building blocks for
e-assessment systems that have been found in recent literature. Based on
these findings, this chapter now reports on patterns that can be
considered useful when designing and engineering e-assessment systems
using some of these components. A particular focus of these
considerations is on questions regarding integration and thus also on
well-defined interfaces that describe suitable connections. The idea of
this chapter is to some extent inspired by the similar idea of
architectural patterns for intelligent tutoring systems explored by
Andreas Harrer et al. 10 to 15 years ago [7, 13]. Unlike that work, this
chapter does not focus on the decomposition of a complete system into
parts. Instead, it discusses system parts that can be integrated with
each other or with other systems in order to create meaningful
e-assessment features. To ensure a broader exploration of the design
space, it is not limited to patterns found directly in the literature.
Considering the limited space of this paper, the following sections
primarily look at static aspects of system architectures, interfaces,
and general data handling. Behavioral aspects (including adaptive
behavior) are not discussed in this paper.

A. Component Types

    As a general observation, one can identify two types of components:
Passive services wait for requests that are directly or indirectly
caused by user interactions. They perform some actions upon these
requests and then wait for the next request to process. They can be
considered a standard way of designing business information systems.
Some literature mentions them as a general principle of system design
[2, 4, 5]. In contrast to that, active agents have their own agenda on
what to do and thus perform their actions potentially even without any
user input. They are used both for educational components (such as
agents that generate hints or exercise suggestions without explicit
request from the user) and management components (such as agents
adjusting the cloud infrastructure to the current load). They are
particularly common in the domain of intelligent tutoring systems
[3, 23].

B. Data Storage

    Regardless of the number and design of components, many systems
employ the pattern of a central data storage, which accumulates data for
all components. This is particularly useful when using several agents
that are supposed to work on the same data. Moreover, data storage is
centralized in cases in which most components are realized as stateless
services. An alternative pattern is that of a distributed data storage,
which is used when components typically process specific data that is of
no meaning to other components, such as domain knowledge in different
expert modules. A third and rarely used pattern is that of a duplicate
storage, where data is prepared and stored in one place but copied to
another place on demand. This is used, for example, when item pools are
stored in one place for authoring and copied to another place when
running an actual assessment.
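    The storage patterns above can be illustrated with a minimal sketch.
All names in it (CentralStore, StatelessEvaluator, copy_item_pool) are
invented for illustration and do not come from any of the surveyed
systems; the grading rule is a trivial placeholder:

```python
# Illustrative sketch of the storage patterns above; all names are
# hypothetical and not taken from any surveyed system.

class CentralStore:
    """Central data storage: a single store accumulating data for all
    components."""

    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value

    def get(self, key):
        return self._data[key]


class StatelessEvaluator:
    """A stateless service component: it keeps no state of its own, so
    all data lives in the shared central store."""

    def __init__(self, store):
        self.store = store

    def grade(self, submission_id):
        submission = self.store.get(submission_id)
        score = 1.0 if submission == "expected answer" else 0.0  # toy rule
        self.store.put(submission_id + ":score", score)
        return score


def copy_item_pool(authoring_store, delivery_store, item_ids):
    """Duplicate storage: items are authored in one place and copied to
    another place on demand when an assessment is run."""
    for item_id in item_ids:
        delivery_store.put(item_id, authoring_store.get(item_id))


store = CentralStore()
store.put("sub-1", "expected answer")
print(StatelessEvaluator(store).grade("sub-1"))  # -> 1.0
```

Distributed storage would correspond, in this sketch, to each component
holding a private store of its own instead of sharing one CentralStore.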
C. Plug-In Types

    User interface components offer various ways to integrate into a
larger context. One pattern is that of a native plug-in, which
implements the full feature set of the component. It is written in the
same language as the host system and uses the data storage provided by
the host system. This is the standard way of implementing plug-ins in
the LMS MOODLE or ILIAS. An alternative pattern is that of a foreign
plug-in, which only implements a subset of the desired features
directly. Besides connecting to the plug-in API of the host system, it
also connects to a backend component of its own, which implements the
missing part of the feature set and often also offers its own data
storage mechanism. The third alternative is that of an external tool. In
this pattern, the host system redirects the user to the external tool
via some standard API and receives a callback when the user has finished
their duties there. This mechanism is also realized in LMS via the
IMS-LTI standard.
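    The three plug-in patterns can be sketched as follows. The
interfaces are invented for illustration; they correspond neither to the
actual MOODLE/ILIAS plug-in APIs nor to the IMS-LTI specification, and
the tool URL is a placeholder:

```python
# Illustrative sketch of the three plug-in patterns; all interfaces and
# names are hypothetical.

class HostSystem:
    """Host LMS offering a data store and a plug-in registration API."""

    def __init__(self):
        self.storage = {}   # data storage provided by the host
        self.plugins = {}

    def register(self, name, plugin):
        self.plugins[name] = plugin

    def render(self, name, item_id):
        return self.plugins[name].render(self, item_id)


class NativePlugin:
    """Native plug-in: implements the full feature set and uses the
    host's own data storage."""

    def render(self, host, item_id):
        return f"<item>{host.storage[item_id]}</item>"


class DemoBackend:
    """Stand-in for the separate backend of a foreign plug-in, with its
    own data storage mechanism."""

    def __init__(self):
        self.own_storage = {}

    def render_item(self, item_id):
        return f"[rendered by backend] {item_id}"


class ForeignPlugin:
    """Foreign plug-in: a thin adapter to the host's plug-in API that
    delegates the missing features to its own backend component."""

    def __init__(self, backend):
        self.backend = backend

    def render(self, host, item_id):
        return self.backend.render_item(item_id)


class ExternalTool:
    """External tool: the host only redirects the user and later
    receives a callback with the result."""

    def render(self, host, item_id):
        return f"redirect:https://tool.example.org/item/{item_id}"

    def callback(self, host, item_id, result):
        host.storage[f"{item_id}:result"] = result
```

In this sketch the host treats all three patterns uniformly via
register() and render(); the differences lie only in where features and
data live.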
D. Job Delegation

    The connection between the student frontend and an evaluator
component can be realized in many different ways. One pattern is that of
a synchronous push: user interaction directly triggers the grading
process, and the user has to wait until the input is processed. Systems
in which grading tasks are short-running and in which the next step
depends on the previous result usually employ this pattern. An
alternative is the asynchronous push pattern, which also triggers the
grading process directly, but without blocking user interaction by
waiting. A third alternative is asynchronous pull, in which user input
is stored in a queue and pulled from there by the evaluation module.
This pattern often occurs in conjunction with a service broker component
or with evaluator components realized as agents.
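    The three delegation patterns can be sketched with an in-process
queue as a stand-in for a real service broker. All names are invented
for illustration, and the grading function is a trivial placeholder:

```python
# Illustrative sketch of the three job delegation patterns, using
# queue.Queue as a stand-in for a service broker.

import queue
import threading


def grade(submission):
    return 1.0 if submission == "42" else 0.0  # placeholder evaluator


def synchronous_push(submission):
    """Synchronous push: user interaction triggers grading directly and
    blocks until the result is available."""
    return grade(submission)


def asynchronous_push(submission, on_done):
    """Asynchronous push: grading is triggered directly, but the user is
    not blocked; the result arrives via a callback."""
    worker = threading.Thread(target=lambda: on_done(grade(submission)))
    worker.start()
    return worker


job_queue = queue.Queue()


def submit(submission):
    """Frontend side of asynchronous pull: only enqueue the input."""
    job_queue.put(submission)


def evaluator_agent(results):
    """Evaluator side of asynchronous pull: pull jobs from the queue
    until a None sentinel arrives."""
    while (submission := job_queue.get()) is not None:
        results.append(grade(submission))


results = []
agent = threading.Thread(target=evaluator_agent, args=(results,))
agent.start()
submit("42")
submit("41")
job_queue.put(None)  # sentinel: no more jobs
agent.join()
print(results)  # -> [1.0, 0.0]
```

Replacing the in-process queue with a networked broker would allow the
evaluator agents to run on separate systems, as described for the
service broker component in section II.D.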
                          IV. CONCLUSIONS

    The results achieved so far are instrumental in two ways: First,
they suggest a structure for a classification of existing components and
a pattern catalogue derived from existing systems and components.
Second, they provide a preliminary overview of some design alternatives
for designing integrated e-assessment systems. However, these results
are far from complete, and more detailed research and work on pattern
descriptions is still required to provide a more complete picture. In
particular, a large body of standards existing in the domain of
e-learning systems has not been reviewed so far. Behavioral aspects also
need to be included during the next steps.

                            REFERENCES

[1]  Armenski, G.; Gusev, M.: E-testing based on service oriented
     architecture. Proceedings of the 10th CAA Conference, 2006.
[2]  Amelung, M.; Krieger, K.; Rösner, D.: E-assessment as a service.
     IEEE Transactions on Learning Technologies, 4:162–174, 2011.
[3]  Costa, E.; Silva, P.; Silva, M.; Silva, E.; Santos, A.: A
     multiagent-based ITS using multiple viewpoints for propositional
     logic. Intelligent Tutoring Systems (ITS 2012), pages 640–641, 2012.
[4]  Davies, W.; Howard, Y.; Davis, H.; Millard, D.; Sclater, N.:
     Aggregating assessment tools in a service oriented architecture.
     9th International CAA Conference, 2005.
[5]  Dagger, D.; O'Connor, A.; Lawless, S.; Walsh, E.; Wade, V.:
     Service-oriented e-learning platforms: From monolithic systems to
     flexible services. IEEE Internet Computing, 11(3):28–35, May 2007.
[6]  Devedzic, V.; Radovic, D.; Jerinic, L.: On the notion of components
     for intelligent tutoring systems. Intelligent Tutoring Systems
     (ITS 1998), volume 1452 of LNCS, pages 504–513, 1998.
[7]  Devedzic, V.; Harrer, A.: Architectural patterns in pedagogical
     agents. Intelligent Tutoring Systems (ITS 2002), volume 2363 of
     LNCS, pages 81–90, 2002.
[8]  El-Sheikh, E.; Sticklen, J.: Generating intelligent tutoring
     systems from reusable components and knowledge-based systems.
     Intelligent Tutoring Systems (ITS 2002), volume 2363 of LNCS, pages
     199–207, 2002.
[9]  Garmann, R.; Heine, F.; Werner, P.: Grappa – die Spinne im Netz der
     Autobewerter und Lernmanagementsysteme. DeLFI 2015 – Die 13.
     e-Learning Fachtagung Informatik der Gesellschaft für Informatik
     e.V. (GI), pages 169–181, 2015.
[10] Gonzalez-Sanchez, J.; Chavez-Echeagaray, M.; Van Lehn, K.;
     Burleson, W.; Girard, S.; Hidalgo-Pontet, Y.; Zhang, L.: A system
     architecture for affective meta intelligent tutoring systems.
     Intelligent Tutoring Systems (ITS 2014), pages 529–534, 2014.
[11] Gusev, M.; Ristov, S.; Armenski, G.; Velkoski, G.; Bozinoski, K.:
     E-Assessment Cloud Solution: Architecture, Organization and Cost
     Model. iJET, 8 (Special Issue 2):55–64, 2013.
[12] H5P.org: H5P documentation. https://h5p.org/documentation. Last
     accessed: 2017-12-01.
[13] Harrer, A.; Martens, A.: Towards a pattern language for intelligent
     teaching and training systems. Intelligent Tutoring Systems
     (ITS 2006), volume 4053 of LNCS, pages 298–307, 2006.
[14] Kenfack, C.; Nkambou, R.; Robert, S.; Nyamen Tato, A.; Brisson, J.;
     Kissok, P.: A brief overview of Logic-Muse, an intelligent tutoring
     system for logical reasoning skills. Intelligent Tutoring Systems
     (ITS 2016), 2016.
[15] Kurup, M.; Greer, J.; McCalla, G.: The fawlty article tutor.
     Intelligent Tutoring Systems (ITS 1992), 1992.
[16] Küppers, B.; Politze, M.; Schroeder, U.: Reliable e-assessment with
     GIT – practical considerations and implementation. EUNIS 23rd
     Annual Congress, 2017.
[17] LPLUS GmbH: LPLUS portfolio. https://lplus.de/en/lplus-portfolio/.
     Last accessed: 2017-12-01.
[18] Martens, A.: Time in the adaptive tutoring process model.
     Intelligent Tutoring Systems (ITS 2006), volume 4053 of LNCS, pages
     134–143, 2006.
[19] Martin, B.: Authoring educational games with Greenmind. Intelligent
     Tutoring Systems (ITS 2008), volume 5091 of LNCS, pages 684–686,
     2008.
[20] MapleSoft: Features in Maple T.A.
     https://www.maplesoft.com/products/mapleta/mainfeatures.aspx. Last
     accessed: 2017-12-01.
[21] Maths for More: Wiris Quizzes – technical description.
     http://www.wiris.com/en/quizzes/docs. Last accessed: 2017-12-01.
[22] Murray, T.: Having it all, maybe: Design tradeoffs in ITS authoring
     tools. Intelligent Tutoring Systems (ITS 1996), 1996.
[23] Neji, M.; Ben Ammar, M.: Agent-based collaborative affective
     e-learning framework. Electronic Journal of e-Learning,
     5(2):123–134, 2007.
[24] Papadimitriou, A.; Grigoriadou, M.; Gyftodimos, G.: Interactive
     problem solving support in the adaptive educational hypermedia
     system MATHEMA. TLT, 2(2):93–106, 2009.
[25] Ristov, S.; Gusev, M.; Armenski, G.; Velkoski, G.: Scalable and
     Elastic e-Assessment Cloud Solution. IEEE Global Engineering
     Education Conference (EDUCON), 2014.
[26] Rickel, J.: Intelligent computer-aided instruction: A survey
     organized around system components. IEEE Transactions on Systems,
     Man, and Cybernetics, 19(1):40–57, 1989.





</pre>