=Paper= {{Paper |id=Vol-1446/GEDM_2015_Submission_5 |storemode=property |title=Studio: Ontology-Based Educational Self-Assessment |pdfUrl=https://ceur-ws.org/Vol-1446/GEDM_2015_Submission_5.pdf |volume=Vol-1446 |dblpUrl=https://dblp.org/rec/conf/edm/WeberV15 }} ==Studio: Ontology-Based Educational Self-Assessment== https://ceur-ws.org/Vol-1446/GEDM_2015_Submission_5.pdf
     Studio: Ontology-Based Educational Self-Assessment
Christian Weber (Corvinno Technology Transfer Center, Budapest, Hungary) cweber@corvinno.com
Réka Vas (Corvinus University of Budapest, Budapest, Hungary) reka.vas@uni-corvinus.hu


ABSTRACT
Students, through all stages of education, grasp new knowledge in the context of knowledge memorized throughout their previous education. Self-assessment acts as important learning feedback for predicting one's personal proficiency in education. The in-house developed Studio suite for educational self-assessment makes it possible to model the educational domain as an ontology-based knowledge structure, connecting assessment questions and learning material to each element in the ontology. Self-assessment tests are then created from a sub-ontology, which frames a tailored testing environment fitting the targeted educational field. In this paper we give an overview of how the educational data is modeled as a domain ontology and present the different relation concepts used in the Studio system. We describe how the presented self-assessment makes use of the knowledge structure for online testing and how it adapts the test to the performance of the student. Further, we highlight potentials for the next stages of development.

Keywords

Education, adaptive test, self-assessment, educational ontology

1. INTRODUCTION
Students exploring new fields of education are always confronted with questions regarding their individual progress: how much do they know after iterations of learning, in which directions should they progress to cover the field most effectively, how can they grasp the outline and details of the field, and how much of their time should they invest in learning? Especially in higher education, where learning becomes a self-moderated, personalized process, students need continuous self-assessment to capture their current state of proficiency. At the same time, the unframed, informal self-prediction of students regarding their personal skills is often substantially and systematically flawed [1]. Here a systematic and objective solution for self-assessment is essential to prevent a wrong or biased self-evaluation and to support the self-prediction of personal proficiency.

Following Jonassen, knowledge in education can be split into nine types across three categories to capture human cognitive behavior. In his discussion, eight out of nine knowledge types underline that knowledge in the scope of learning is interrelated and strongly associated with previous experiences [2]. As such, a supporting solution for self-assessment should grasp and formalize the knowledge to assess in the context of related knowledge.

The Studio suite for educational self-assessment, presented in this paper, provides a software solution for testing personal proficiency in the context of related knowledge. It makes it possible to model areas of education as a substantial source for assessment and narrows the gap between a potentially flawed self-prediction and the real proficiency by offering an objective and adaptive online knowledge test. To follow the natural learning process and enable easy extension, the software embeds the assessed knowledge into a network of contextual knowledge, which allows the assessment to adapt to the responses of the students.

This paper gives an overview of the Studio educational domain ontology and the aspects of the system supporting personalized self-assessment. Further, it highlights potentials for data mining on the gathered educational data, with an outlook on the next stages of evaluation.

2. THE STUDIO APPROACH FOR SELF-ASSESSMENT
The basic concept of Studio is to model the focused education as an interrelated knowledge structure, which divides the education into sub-areas and knowledge items to know. The managed structure formalizes the relation between knowledge areas as a learning context and models the requirements to master specific parts of the education. This structure is used to create and support knowledge tests for students. Through this combination of assessment and knowledge structure, the student gains the freedom to explore not only single knowledge items but the education in the context of related knowledge areas, while the embedded requirements are used to map the modeled knowledge against the expected educational outcome.

The assessment system is designed to be accompanied by phases of learning within the system, where the student gets access to learning material, based on and supported by the test feedback. This combined approach offers a unique self-assessment to the students, where the backing knowledge context is used to adapt the assessment to the test performance of the student.

Before any regular examination, students may use Studio to assess their knowledge on their own. It is the tutor's responsibility to set the course of the self-assessment test in the Studio
system by selecting knowledge areas and sub-knowledge areas which are relevant for the target education from the domain ontology. The frame is then automatically completed with elements from the ontology which detail the selected knowledge areas and are modeled as required for this part of the education. As the system stores assessment questions for each knowledge element, Studio will then automatically prepare an assessment test, based on the defined selection and the domain ontology. The resulting knowledge test is then accessible as a self-assessment test for the student, who explores the backing knowledge structure, which pictures the expected learning outcome, in cycles of testing, reflection and learning. The process of test definition and assessment is shown in Figure 1, while the result preparation for reflection and learning is discussed in section 2.5.

Figure 1: The overall design, assess and reflection cycle of the system.

2.1 The Educational Domain Ontology
The Studio system is based on a predesigned educational ontology, explained in detail by Vas in [3]. Domain ontology is a frequently used term in the field of semantic technologies: it denotes the storage and conceptualization of domain knowledge, is used in a number of projects and solutions [4][5][6], and can address a variety of domains with different characteristics in their creation, structure and granularity, depending on the aim and the modeling person [7]. A specialization in terms of the field is the educational domain ontology, which is a domain ontology adapted to the area and concepts of education. Such ontologies can target different aspects of education, such as the curriculum or aspects relevant for the task of learning and course creation [8][9][10], or describe the design, use and retrieval of learning materials up to creating courses [11], as well as directly the learner within the education [12].

Within the area of educational ontologies, domain ontologies tend to model very specific details of the education, in an attempt to model the specific field as completely as possible. This enables a comprehensive view on the field, but it comes at the cost of generality, with the potential to be inflexible in handling changes. Other concepts model the education across different ontologies, matching concepts like the learner, the education and the course description, introducing a broad horizon but with additional overhead to combine modeled insights and reason on new instances.

The appeal of the Studio educational ontology is the size and focus of the main classes and the relationships between them. The knowledge to learn is the main connecting concept in the core of education. This gives the ontology great flexibility to be resourceful for different education-related questions. An example is the business process management extension PROKEX, which maps process requirements against knowledge areas to create assessment tests reflecting the requirements of the attached processes [13].

An important factor in learning is the distance between the expectation of the tutor and the learning performance of the student. Here a short cycle of repeated assessment and learning is a major factor for a better personal learning performance [14]. This aspect directly benefits from the focused concentration on knowledge areas as the main exchange concept between students and tutors. As, furthermore, the close connection between learners and educators via direct tutoring is one major enabler for computer-aided systems [15], each step towards a more direct interaction through focused concepts is an additional supporter.

The class structure fuses the idea of interrelated knowledge with a model of the basic types of educational concepts involved in situations of individual learning. Figure 2 visualizes the class concepts as knowledge elements, together with the relation types used to model the dependencies between different aspects of knowledge and learning within the educational ontology.

The Knowledge Area is the super-class and core concept of the ontology. The ontology defines two qualities of main relations between knowledge areas: a knowledge area can be a sub-knowledge area of other knowledge areas via the "has_sub-knowledge_area" relation, or be required for another knowledge area via the "requires_knowledge_of" relation. A knowledge area may have multiple connected knowledge areas, linked as a requirement or sub-area. The "requires_knowledge_of" relation defines that a node is required to complete the knowledge of a parent knowledge area. This strict concept models a requirement dependency between fields of knowledge in education and yields the potential to assess prerequisites of learning, analogous to the basic idea of prerequisites within knowledge spaces, developed by Falmagne [16].

Education is a structured process which splits the knowledge to learn into different sub-aspects of learning. Knowledge areas in the ontology are extended by an additional sub-layer of knowledge elements in order to effectively support educational
and testing requirements. Figure 2 visualizes the sub-elements and their relations. By splitting the assessed knowledge into sub-concepts, the coherence and correlation of self-assessment questions can be expressed more efficiently, with the potential of more detailed educational feedback.

Figure 2: Model of the educational ontology.

Theorems express in a condensed and structured way the fundamental insights within knowledge areas. They fuse and explain the basic concepts of the depicted knowledge and set them in relation to the environment of learning with examples. Multiple theorems can be "part_of" a knowledge area. Each theorem may define multiple Basic Concepts as a "premise" or "conclusion", to structure how the parts of the knowledge area are related. Examples enhance these parts as a strong anchor for self-assessment questions and "refer_to" the theorems and basic concepts as a "part_of" one or more knowledge areas.

2.2 The Testbank
In order to connect the task of self-assessment with the model of the educational domain, the system integrates a repository of assessment questions. Each question addresses one element of the overall knowledge and is directly associated with one knowledge area or knowledge element instance within the ontology. The domain ontology provides the structure for the online self-assessment, while the repository of questions supplements the areas as a test bank. The target of the self-assessment is to continuously improve the personal knowledge within the assessed educational areas by providing feedback on the performance after each phase of testing. To do so, the Studio system includes Learning Material connected to the test bank and the knowledge areas, analogous to the test questions. The learning material is organized into sections as structured text with mixed media, such as pictures and videos, and is based on a wiki engine to maintain the content, including external links.

2.3 Creating and Maintaining Tests
The creation and continuous maintenance of the domain ontology is a task of ontology engineering. The ontology engineer (the ontologist) creates, uses and evaluates the ontology [17], with a strong focus on maintaining the structure and content. Within Studio, this process is guided and supported by a specialized administration workflow and splits into three consecutive task areas, in line with decreasing access rights:

         Ontology engineering (instance level): The creation and linking of instances of the existing knowledge-area classes into the overall domain ontology.

         Test definition: Knowledge areas which are relevant to a target self-assessment test are selected and grouped into specialized containers called Concept Groups (CG). These concept groups are organized into a tree of groups, in line with the target of the assessment. The final tree in this regard captures a sub-ontology. Concept groups are internally organized based on the overall ontology and include all relations between knowledge elements, as defined within the domain ontology.

         Question and learning material creation: Questions and learning materials alike are directly connected to single knowledge areas within the designed test frame and get imported, if already existing, from the domain ontology. Additional questions and learning materials are then defined, in line with the needs of the targeted education, and are available for future tests.

The pre-developed structure of classes and relations is fixed as the central and integral design of the system. A view of the system interface for administration is provided in Figure 3. The left area shows the visualization of the current ontology section in revision and the right area shows the question overview with editing options. Tabs give access to additional editing views, including the learning material management and interfaces to modify relations between nodes and node descriptions.

2.4 Adaptive Self-Assessment
To prepare an online self-assessment test, the system has to load the relevant educational areas from the domain ontology and extract the questions and relations of the filtered knowledge areas.

The internal test algorithm makes use of two assumptions:

         Knowledge-area ordering: As the main knowledge areas are connected through "requires_knowledge_of" and "part_of" relations, every path starting with the start-element will develop on average from general concepts to detailed concepts, given that the concept groups in the test definition are also selected and ordered to lead from general to more detailed groups.

         Knowledge evaluation dependency: If a person taking the test fails on general concepts, he or she will potentially also fail on more detailed concepts. Further, if a high number of detailed concepts are failed, the parent knowledge is not sufficiently covered and will be derived as failed, too.
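The knowledge evaluation dependency of the test algorithm can be illustrated with a small sketch. The function below is illustrative only (the function name and the 50% threshold are assumptions for the example, not Studio's actual implementation): a parent knowledge area is derived as failed when too few of its detailed child areas were passed.

```python
# Sketch of the "knowledge evaluation dependency" assumption:
# a parent knowledge area counts as failed when too many of its
# detailed child areas were failed. Names and threshold are
# illustrative, not taken from the Studio system itself.

def parent_failed(child_results, threshold=0.5):
    """Derive a parent's outcome from its children's outcomes.

    child_results: list of booleans, True = child passed.
    threshold: minimal fraction of passed children required.
    """
    if not child_results:          # leaf area: nothing to derive
        return False
    passed = sum(1 for r in child_results if r)
    return passed / len(child_results) < threshold

# A parent with 1 of 4 children passed is derived as failed:
print(parent_failed([True, False, False, False]))   # True (failed)
print(parent_failed([True, True, False, True]))     # False (covered)
```

In the same spirit, failing a general concept skips its more detailed children entirely, since they would "potentially also fail".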
          Figure 3: The main ontology maintenance and administration interface, showing a part of the domain ontology.
The filtering is done based on the selection of a tutor, acting as an expert for the target educational area. The tutor chooses related areas, which are then created as a Test Definition containing Concept Groups, as described in section 2.3. The system then uses the test definition as a filtering list to extract knowledge areas. After the extraction, the structure is cached as a directed graph, while the top element of the initial concept group is set as a start element. Beginning with the start-element, the test then moves through the graph, administering the questions connected to knowledge areas and knowledge elements.

The loading of knowledge-elements follows three steps:

     1.   Each type of relation between two knowledge-elements implements a direction for the connection. Assuming the system loads all relations, starting with the start-element and ending on a knowledge-element, this creates a two-level structure where the start-node is a parent-element and all related, loaded elements are child-elements, as seen below in Figure 4.

     2.   The loading algorithm then selects one child-element, assumes it as a start-element and repeats the loading process of knowledge-elements.

     3.   When no knowledge-elements for a parent-element can be loaded, the sub-process stops. When all sub-processes have stopped, the knowledge structure is fully covered.

The test algorithm will now activate the child knowledge areas of the start element, select the first knowledge area to the left and draw a random question from the selected knowledge area. If the learner fails the question, the algorithm will mark the element as failed and select the next knowledge area from the same level. If the learner's answer is correct, the system will activate the child elements of the current node and draw a random question from the first left child.

Based on the tree-shaped knowledge structure, the assessment now follows these steps to run the self-assessment, supported by the extracted knowledge structure:

     1.   Starting from the start-element, the test algorithm activates the child knowledge-areas of the start element.

     2.   The algorithm then selects the first child-knowledge area and draws a random question out of the pool of available questions for this specific knowledge-element from the test bank.

     3.   If the learner fails the question, the algorithm will mark the element as failed and select the next knowledge area from the same level. If the learner's answer is correct, the system will activate the child elements of the current node and trigger the process for each child-element.

Figure 4: Excerpt from the sub-ontology visualization, with the visible parent-child relationship, as used in the data loading for preparing the self-assessment.
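The assessment steps above can be sketched as a depth-first walk over the extracted graph. The sketch below is a minimal, assumed in-memory representation (dictionaries of ordered child lists and question pools; the names and data layout are hypothetical, not Studio's actual code): failed elements are marked and their subtrees are never activated, while passed elements open up their children.

```python
import random

# Hypothetical extracted structure: each knowledge element maps to
# its ordered child elements and to a pool of question IDs.
children = {
    "start": ["general-A", "general-B"],
    "general-A": ["detail-A1", "detail-A2"],
    "general-B": [],
    "detail-A1": [], "detail-A2": [],
}
questions = {k: [f"{k}-q{i}" for i in range(3)] for k in children}

def run_assessment(node, ask, results=None):
    """Walk the graph from `node`; `ask(question_id)` returns True
    if the learner answers correctly. Failed nodes are marked and
    their children are never activated (they stay unreached)."""
    if results is None:
        results = {}
    for child in children[node]:                 # activate child areas
        question = random.choice(questions[child])
        if ask(question):                        # correct: descend
            results[child] = "passed"
            run_assessment(child, ask, results)
        else:                                    # failed: skip subtree
            results[child] = "failed"
    return results

# Example: a learner who only masters the "general-A" branch.
outcome = run_assessment(
    "start", ask=lambda q: q.startswith(("general-A", "detail-A")))
print(outcome)
# {'general-A': 'passed', 'detail-A1': 'passed',
#  'detail-A2': 'passed', 'general-B': 'failed'}
```

The adaptation shows in the output: because "general-B" was failed, no question for its (here empty) subtree would ever be drawn, matching step 3 above.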
An example question is shown below in Figure 5. Further following the testing algorithm, the system dives down within the domain ontology and triggers questions depending on the learner's answers and the extracted model of the relevant education. In this regard, the Studio system adapts the test on the fly to the performance of the learner. Correlating to the idea of adaptation, the learner will later gain access to learning material for each mastered knowledge area. As the learner continues to use the self-assessment to evaluate their personal knowledge, they will thus explore different areas of the target education, following their individual pace of learning.

Figure 5: Test interface with a randomly drawn test question.

2.5 Test Feedback and Result Visualization
An important aspect of the system is the test feedback and evaluation interface. The educational feedback is one of the main enablers for the student to grasp the current state and extent of the personal education. The domain ontology models the structure and the dependencies of the educational domain, and the grouped test definition extracts the relevant knowledge for the target area or education. As such, the visualization of the ontology structure extracted for the test, together with the indication of correct and incorrect answers, represents a map of the knowledge of the learner.

Throughout each view onto the ontology, the system uses the same basic visualization, making use of the Sencha Ext JS JavaScript framework [18]. The visualization itself is a custom build, similar to the Ext JS graph function "Radar" and based on the idea of Yee, Fisher, Dhamija and Hearst [19]. All views are able to zoom in and out of the graph, move the current excerpt and offer a color-code legend explaining the meaning of the colored nodes. In comparison with the state of the art, the interface offers no special grouping or additional visualization features like coding information into the size of nodes. Each interface offers an additional textual tree view to explore the knowledge-elements or concept groups in a hierarchical listing. This simple, straightforward approach to visualization correlates with the goal of direct and easy-to-grasp feedback through interfaces which have a flat learning curve and allow the user to catch the functionality in a small amount of time.

While this simple visualization is sufficient for the reasonable amount of knowledge-elements within the result view, it alone is not suitable for the domain ontology administration interface, as seen in Figure 3. Here Studio realizes methodologies to filter and transform the data to visualize. To do so it makes use of two supporting mechanisms:

         The maximum-level-selector defines the maximum level the system extracts from the domain ontology for full-screen visualization.

         In combination with the maximum level, the ontologist can select single elements within the domain ontology. This triggers an on-demand re-extraction of the visualized data, setting the selected knowledge-element as the centre element. The system then loads the connected nodes, based on their relations, into the orientation circles until the maximum defined level is reached. More details about the transformation are in [19].
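Taken together, the two mechanisms amount to a breadth-first extraction around a selected centre element, cut off at the maximum level. A minimal sketch, assuming a plain adjacency-list view of the ontology (the graph data and function name are illustrative, not Studio's actual data model):

```python
from collections import deque

# Hypothetical adjacency list over the domain ontology (an
# undirected view over all relation types, for visualization only).
neighbours = {
    "centre": ["a", "b"],
    "a": ["centre", "a1"],
    "b": ["centre"],
    "a1": ["a", "a2"],
    "a2": ["a1"],
}

def extract_levels(centre, max_level):
    """Collect nodes into orientation circles around `centre`,
    stopping once the maximum-level-selector value is reached."""
    levels = {centre: 0}
    queue = deque([centre])
    while queue:
        node = queue.popleft()
        if levels[node] == max_level:
            continue                  # cut-off: do not expand further
        for nxt in neighbours.get(node, []):
            if nxt not in levels:
                levels[nxt] = levels[node] + 1
                queue.append(nxt)
    return levels

print(extract_levels("centre", max_level=2))
# {'centre': 0, 'a': 1, 'b': 1, 'a1': 2}
```

Selecting a different element simply re-runs the extraction with that element as the new centre, which is what makes the on-demand re-extraction cheap.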
                                Figure 6: Result visualization as educational feedback for the learner.
Together, this selection and transformation mechanism enables fluent navigation within the complete domain ontology structure, while re-using the same visualization interface.

Figure 6 shows the main view of the result interface. The left area shows the sub-ontology extracted for the test, while the colored nodes represent the answers to the administered questions. A red node visualizes wrong answers, while orange nodes are rejected nodes with correct answers but an insufficient number of correctly answered child nodes, indicating a lack of the underlying knowledge. Green nodes represent accepted nodes with correct answers and a sufficient number of correctly answered questions for child nodes. Grey nodes are not administered nodes, which were not yet reached by the learner, as higher-order nodes had no adequate acceptance.
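The colour assignment above can be read as a small decision rule over a node's own answer and the state of its children. The sketch below is a hypothetical rendering of that rule; the concrete acceptance threshold is not stated in the text and is modelled here as a parameter:

```python
def classify_node(answered, correct, accepted_children, total_children,
                  required_ratio=1.0):
    """Map a knowledge node to its result colour.

    grey   - node not yet administered
    red    - the node's question was answered incorrectly
    orange - correct answer, but too few accepted child nodes
    green  - correct answer and sufficiently many accepted children
    The required_ratio threshold is an assumption, not Studio's actual rule.
    """
    if not answered:
        return "grey"
    if not correct:
        return "red"
    if total_children and accepted_children / total_children < required_ratio:
        return "orange"
    return "green"

print(classify_node(True, True, 1, 3))   # orange: only 1 of 3 children accepted
print(classify_node(True, True, 3, 3))   # green: all children accepted
```

Leaf nodes (no children) are accepted directly on a correct answer under this reading.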
Even though the target of the system is not a strict numerical evaluation, the percentage of solved and accepted knowledge elements helps the learner to track personal progress and can additionally be saved as a report for further consultation. Besides providing an overview of the self-assessment result, the result interface gives access to the integrated learning material. For every passed node, the learner can now open the correlated material and intensify the knowledge of successfully tested areas.

Retaking the test in cycles of testing and learning, while adapting the educational interaction, is the central concept of the Studio approach to self-assessment. As a consequence, the system will not disclose the right answers to questions or learning material for not yet administered knowledge areas, to promote individual reflection on the educational content beyond a flat memorization of content.
3. SYSTEM EVALUATION
The system has been used, extended and evaluated in a number of European and nationally funded research projects, including applications in business process management and innovation transfer [20], medical education [21] and job market competency matching [22].

Currently the system is being evaluated in a running study with 200 university students in the field of business informatics. The study will inform two current research streams aimed at improving the system's testing and analysis capabilities. The first direction looks into the potential of integrating learning styles into adaptive learning systems to offer valuable advice and instructions to teachers and students [23]. The second direction addresses the question of how to adapt the presented self-assessment further to the performance of the students, based on extracting assessment paths from the knowledge structure [24].

For each running test, Studio collects basic quantitative data about the number of assigned questions, how often tests are taken and how many students open which test and when. This is complemented by qualitative measures, recording which questions and knowledge elements the students passed or failed. To conclude further on the mechanisms and impacts of Studio within the current study, a new logging system was developed, collecting the interaction with the system and detailed information about the feedback as detailed events.

Each event stores information about the system in seven dimensions, as described in Table 1 below:

Table 1: Event blueprint to store events concerning system interaction.

  Event description code – Which type of event and which factors are relevant.
  Location code – On which part of the assessment process or interface the event occurred.
  Session identifier – Each access of the system is one session for one user.
  Numerical value storage – Multi-purpose field, filled depending on the event type.
  String value storage – Multi-purpose field, filled depending on the event type.
  Event-time – The time of the start of the event.
  Item reference – A unique reference code identifying the correlated item within the ontology, e.g. a question or a knowledge-element ID.

All events are stored in order of their occurrence, so if no explicit end event is defined, the next event for the same session and user acts as the implicit end date. Extending the existing storage of information within Studio, the new logging system stores the additional events shown in Table 2 below:

Table 2: Assessment events and descriptions.

  START_TEST – Marks the start of a test.
  END_TEST – Marks the end of a test.
  OPEN_WELCOME_LM – The user opened the welcome page.
  OPEN_LM_BLOCK – The student opened a learning material block on the test interface.
  OPEN_LM – The student opened the learning material tab on the test interface.
  RATE_LM – The student rated the learning material.
  CHECK_RESULT – The student opened a result page.
  CONTINUE_TEST – The student submitted an answer.
  FINISH_TEST – The test has been finished.
  SUSPEND_TEST – The user suspended the test.
  RESUME_TEST – The user has restarted a previously suspended test.
  SELECT_TEST_ALGORITHM – The algorithm used to actually test the student is selected.
  TEST_ALGORITHM_EVENT – The behavior of the current test algorithm changes, e.g. entering another stage of testing.
  ASK_TESTQUESTION – Sends out a test question to the user to answer.
  STUDIO_LOGOUT – The user logs out of the Studio system.
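Taken together, Table 1 and Table 2 suggest a simple event record plus the implicit-end rule, under which the next event of the same session closes its predecessor. The sketch below paraphrases the field names from Table 1 and simplifies storage to an in-memory list; it is an illustration, not the Studio data model:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Event:
    event_code: str                 # event description code, e.g. "START_TEST"
    location: str                   # part of the assessment process or interface
    session_id: str                 # one session per system access and user
    num_value: Optional[float]      # multi-purpose numerical field
    str_value: Optional[str]        # multi-purpose string field
    event_time: float               # start time of the event
    item_ref: Optional[str]         # ontology item, e.g. a question ID

def implicit_durations(events):
    """Pair each event with the next event of the same session, which acts
    as its implicit end, and return the resulting durations per event code."""
    durations = {}
    last_in_session = {}
    for ev in sorted(events, key=lambda e: e.event_time):
        prev = last_in_session.get(ev.session_id)
        if prev is not None:
            durations[prev.event_code] = ev.event_time - prev.event_time
        last_in_session[ev.session_id] = ev
    return durations

log = [
    Event("START_TEST", "test", "s1", None, None, 0.0, None),
    Event("ASK_TESTQUESTION", "test", "s1", None, None, 2.0, "q42"),
    Event("CONTINUE_TEST", "test", "s1", None, None, 9.0, "q42"),
]
print(implicit_durations(log))  # {'START_TEST': 2.0, 'ASK_TESTQUESTION': 7.0}
```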
To store the events, the system implements an additional logging database, splitting the logging concepts into a star schema for efficient extraction, transformation and loading. The logging system is modular, easy to extend with new concepts and easy to attach to potential event positions within the Studio runtime. Together with the existing logging of the assessment evaluation feedback, this new extension tracks the exploration of the sub-ontology within the assessment and enriches the feedback data with context information about the students' behavior in the system.
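A minimal sketch of such a star schema, with the seven event dimensions split into a central fact table and descriptive dimension tables, is given below using SQLite. All table and column names are assumptions for illustration; the paper does not specify the actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables hold the descriptive concepts of the logging;
# the central fact table stores one row per logged event.
cur.executescript("""
CREATE TABLE dim_event_type (id INTEGER PRIMARY KEY, code TEXT);
CREATE TABLE dim_location   (id INTEGER PRIMARY KEY, code TEXT);
CREATE TABLE fact_event (
    event_type_id INTEGER REFERENCES dim_event_type(id),
    location_id   INTEGER REFERENCES dim_location(id),
    session_id    TEXT,
    num_value     REAL,
    str_value     TEXT,
    event_time    TEXT,
    item_ref      TEXT
);
""")
cur.execute("INSERT INTO dim_event_type VALUES (1, 'START_TEST')")
cur.execute("INSERT INTO dim_location VALUES (1, 'TEST_INTERFACE')")
cur.execute("INSERT INTO fact_event VALUES "
            "(1, 1, 's1', NULL, NULL, '2015-05-04T10:00:00', NULL)")

# A typical extraction joins the fact table against its dimensions
row = cur.execute("""
    SELECT t.code, l.code, f.session_id
    FROM fact_event f
    JOIN dim_event_type t ON f.event_type_id = t.id
    JOIN dim_location   l ON f.location_id   = l.id
""").fetchone()
print(row)  # ('START_TEST', 'TEST_INTERFACE', 's1')
```

Keeping event types and locations in dimension tables is what makes the schema easy to extend with new concepts: adding an event type is a single dimension row, not a schema change.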
4. NEXT STEPS
The domain ontology offers a functional and semantically rich core for supporting learning and education. Yet not all of its semantic potential is fully leveraged to support and test the learner's progress. The "requires_knowledge_of" relation is a potential starting concept for modelling sub-areas as groups which together compose a dependency. This could act as an additional input for the assessment, where the system derives more complex decisions on how to further explore the related parts of the structure [25]. It could also be visualized, enabling the learner to grasp the personal knowledge as a visible group of concepts.
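One possible reading of this grouped dependency — still an open design question in the text — is that a "requires_knowledge_of" target is only satisfied when every member of one of its requirement groups has been accepted. The sketch below illustrates that assumed semantics with hypothetical knowledge-area names:

```python
def dependency_satisfied(groups, accepted):
    """A knowledge area's requirement is met if at least one of its
    requirement groups is completely covered by accepted elements."""
    return any(all(member in accepted for member in group) for group in groups)

# Hypothetical requirement: "databases" requires ("sql" AND "modelling"),
# or alternatively ("nosql" AND "modelling")
groups = [{"sql", "modelling"}, {"nosql", "modelling"}]
print(dependency_satisfied(groups, {"sql", "modelling"}))  # True
print(dependency_satisfied(groups, {"sql"}))               # False
```

Under this rule, the assessment could skip a sub-area entirely while any of its composing group members remains unaccepted, and the visualization could render each group as one visible cluster of concepts.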
Besides giving colors to the different types of relations, the visualization of edges between knowledge areas is as yet unfiltered, offering no further support for navigation. A next stage of implementation could be the introduction of a visual ordering and grouping of knowledge areas and relations. Underlying relations of sub-nodes could be interpreted visually through the thickness of relations between nodes, easing the perception of complex parts of the domain ontology, especially within administration and maintenance tasks.

The feedback of the current evaluation study of Studio will provide additional insights into the students' usage of the system. Based on this new data it is possible to mine profiles over time on the knowledge structure. One major application here is the creation of behavior profiles, as proposed in [23].
5. REFERENCES
[1] D. Dunning, C. Heath, and J. M. Suls, "Flawed self-assessment: Implications for health, education, and the workplace," Psychological Science in the Public Interest, vol. 5, no. 3, pp. 69–106, 2004.
[2] D. Jonassen, "Reconciling a Human Cognitive Architecture," in Constructivist Instruction: Success or Failure?, S. Tobias and T. M. Duffy, Eds. Routledge, 2009, pp. 13–33.
[3] R. Vas, "Educational Ontology and Knowledge Testing," Electronic Journal of Knowledge Management, vol. 5, no. 1, pp. 123–130, 2007.
[4] M. Y. Dahab, H. A. Hassan, and A. Rafea, "TextOntoEx: Automatic ontology construction from natural English text," Expert Systems with Applications, vol. 34, no. 2, pp. 1474–1480, Feb. 2008.
[5] S.-H. Wu and W.-L. Hsu, "SOAT: a semi-automatic domain ontology acquisition tool from Chinese corpus," in Proceedings of the 19th International Conference on Computational Linguistics – Volume 2, 2002, pp. 1–5.
[6] M. Missikoff, R. Navigli, and P. Velardi, "The Usable Ontology: An Environment for Building and Assessing a Domain Ontology," in The Semantic Web — ISWC 2002, I. Horrocks and J. Hendler, Eds. Springer Berlin Heidelberg, 2002, pp. 39–53.
[7] T. A. Gavrilova and I. A. Leshcheva, "Ontology design and individual cognitive peculiarities: A pilot study," Expert Systems with Applications, vol. 42, no. 8, pp. 3883–3892, May 2015.
[8] S. Sosnovsky and T. Gavrilova, "Development of Educational Ontology for C-programming," 2006.
[9] V. Psyché, J. Bourdeau, R. Nkambou, and R. Mizoguchi, "Making learning design standards work with an ontology of educational theories," in 12th Artificial Intelligence in Education (AIED2005), 2005, pp. 539–546.
[10] T. Nodenot, C. Marquesuzaá, P. Laforcade, and C. Sallaberry, "Model Based Engineering of Learning Situations for Adaptive Web Based Educational Systems," in Proceedings of the 13th International World Wide Web Conference on Alternate Track Papers & Posters, New York, NY, USA, 2004, pp. 94–103.
[11] A. Bouzeghoub, C. Carpentier, B. Defude, and F. Duitama, "A model of reusable educational components for the generation of adaptive courses," in Proc. First International Workshop on Semantic Web for Web-Based Learning in conjunction with CAiSE, 2003, vol. 3.
[12] W. Chen and R. Mizoguchi, "Learner model ontology and learner model agent," Cognitive Support for Learning – Imagining the Unknown, pp. 189–200, 2004.
[13] G. Neusch and A. Gábor, "PROKEX – Integrated Platform for Process-Based Knowledge Extraction," ICERI2014 Proceedings, pp. 3972–3977, 2014.
[14] H. L. Roediger III, "Relativity of Remembering: Why the Laws of Memory Vanished," Annual Review of Psychology, vol. 59, no. 1, pp. 225–254, 2008.
[15] J. D. Fletcher, "Evidence for learning from technology-assisted instruction," in Technology Applications in Education: A Learning View, pp. 79–99, 2003.
[16] J.-C. Falmagne, M. Koppen, M. Villano, J.-P. Doignon, and L. Johannesen, "Introduction to knowledge spaces: How to build, test, and search them," Psychological Review, vol. 97, no. 2, pp. 201–224, 1990.
[17] F. Neuhaus, E. Florescu, A. Galton, M. Gruninger, N. Guarino, L. Obrst, A. Sanchez, A. Vizedom, P. Yim, and B. Smith, "Creating the ontologists of the future," Applied Ontology, vol. 6, no. 1, pp. 91–98, 2011.
[18] "Ext JS API." [Online]. Available: http://www.objis.com/formationextjs/lib/extjs-4.0.0/docs/index.html. [Accessed: 04-May-2015].
[19] K.-P. Yee, D. Fisher, R. Dhamija, and M. Hearst, "Animated exploration of dynamic graphs with radial layout," in IEEE Symposium on Information Visualization (INFOVIS 2001), 2001, pp. 43–50.
[20] M. Arru, "Application of Process Ontology to improve the funding allocation process at the European Institute of Innovation and Technology," in 3rd International Conference on Electronic Government and the Information Systems Perspective (EGOVIS), Munich, Germany, 2014.
[21] M. Khobreh, F. Ansari, M. Dornhöfer, and M. Fathi, "An ontology-based Recommender System to Support Nursing Education and Training," in LWA 2013, 2013.
[22] V. Castello, L. Mahajan, E. Flores, M. Gabor, G. Neusch, I. B. Szabó, J. G. Caballero, L. Vettraino, J. M. Luna, C. Blackburn, and F. J. Ramos, "The Skill Match Challenge. Evidences from the SMART Project," ICERI2014 Proceedings, pp. 1182–1189, 2014.
[23] H. M. Truong, "Integrating learning styles and adaptive e-learning system: Current developments, problems and opportunities," Computers in Human Behavior, 2015.
[24] C. Weber, "Enabling a Context Aware Knowledge-Intense Computerized Adaptive Test through Complex Event Processing," Journal of the Scientific and Educational Forum on Business Information Systems, vol. 9, no. 9, pp. 66–74, 2014.
[25] C. Weber and R. Vas, "Extending Computerized Adaptive Testing to Multiple Objectives: Envisioned on a Case from the Health Care," in Electronic Government and the Information Systems Perspective, vol. 8650, A. Kő and E. Francesconi, Eds. Springer International Publishing, 2014, pp. 148–162.