    Learning Analytics Case Study in Blended Learning
                        Scenarios

                           Vlatko Lukarov1 and Ulrik Schroeder1
            1 RWTH Aachen University, Learning Technologies Research Group,

                     Ahornstrasse 55, 52074 Aachen, Germany
            {lukarov, schroeder}@informatik.rwth-aachen.de



       Abstract. In this paper we present a case study that provided a learning analytics
       tool in 53 course rooms on the learning platform of a higher education institution
       in Germany. The objective of the case study was to observe in which ways the
       teaching staff would use the learning analytics module while performing other
       teaching activities within the course room on the learning platform. We collected
       raw data from three different sources to examine the hypothesis of the case study
       in a quasi-experimental setting. The results showed that over the course of the
       study, the teaching staff in 40 courses used the learning analytics module at least
       once; in 25 courses the teaching staff used it more than five times; and in 36
       courses the teaching staff used it on multiple occasions within the same session
       while conducting teaching activities on the learning platform. As part of the data
       collection strategy for the case study, we conducted a two-part anonymous survey
       to collect qualitative feedback about the features and the user interface of the
       analytics prototype. The survey results revealed that the teaching staff used the
       analytics prototype mainly to observe the learning resources in their respective
       courses and to be aware of the student behavior in their courses on the learning
       platform. Overall, we concluded that Learning Analytics should be an integral
       part of the provisioned e-learning services in higher education institutions; that
       the place for delivering Learning Analytics is the virtual course room on the
       learning platform; and that the teaching staff, if provided with such a tool, would
       use it regularly, in short sessions, while doing their day-to-day teaching activities
       on the learning platform.

       Keywords: Learning Analytics, Case Study, Evaluation


1      Introduction

In blended learning scenarios, the teacher still holds the central role that influences
student learning and motivation. Teaching is an acquired mastery, and educators should
use all means at their disposal to provide the best learning setting and resources for their
students [2, 17]. In other words, teachers should design an appropriate pedagogical
approach and choose a fitting learning design; create suitable and diversified learning
resources; incorporate and carry out assessment within their pedagogical approach;
provide timely and appropriate feedback to the students; and be aware of student
engagement in the learning process. If the teachers want to enact continuous enhance-
ment and adaptation of their teaching, they need to receive feedback and information
about their teaching, analyze and reflect upon their work [5, 17]. Furthermore, teachers
need to be aware of how students perceive and behave in the learning setting on the
learning platform. This way they can identify which learning resources work well; dis-
cover with which resources or assignments students struggle; identify and categorize
learning patterns and strategies; understand which platform features are more effective
within their pedagogical approach and setting; be able to identify and adapt materials
to address the students’ needs; and guide the students to become successful and better
learners [10]. These activities are well within the scope of teaching and are crucial for
providing high-quality education. Nevertheless, most learning platforms still do not
support teachers in this respect and do not provide analytics features that help them
improve their teaching practices, even though analyzing this data can shed light on
unseen behavior and make visible information and insights that could not be observed
before and would otherwise go unnoticed and remain unactionable [6, 15].
In the context of this research, we understand learning analytics as the development
and exploration of methods and tools for visual analysis and pattern recognition in
educational data that permit institutions, teachers, and students to iteratively reflect on
learning processes and, thus, optimize learning designs on the one hand and improve
learning on the other [4].
   Despite numerous and extensive advances in the research field of Learning Analytics,
wide adoption and successful implementation of learning analytics as a service is still
not present [11]. The added value of Learning Analytics (LA) for learners and educators
is clearly recognized and identified, but little research has been done to provide
conclusive evidence that LA tools have desirable effects on the learning processes [18].
As a step towards scaling up and deploying LA within the learning processes, this
research investigates the effects of providing a learning analytics prototype on the
learning platform where the teaching staff applies blended learning scenarios. The scope
of this paper concentrates on the context in which analytics should be provided; on
supporting the different learning scenarios implemented in a selection of courses; and
on developing an understanding of teachers' use and incorporation of analytics and
statistics tools/interfaces in their day-to-day teaching activities within the learning
platform. The underlying idea of this research is that the teaching staff can freely explore
the analytics prototype while conducting their online teaching activities in blended
learning scenarios. The results provide an understanding of how teachers accomplish
teaching tasks on the learning platform and explain how they incorporate analytics into
their daily activities. The hypothesis that we investigated was that teachers, while doing
teaching activities within a course, would also use the analytics module on a regular
basis within the same session on the learning platform. This assumption derives from
the expectation that, because the analytics prototype is seamlessly integrated within the
course on the learning platform, teachers would be compelled to use it to get a glimpse
of what is going on in their course.


2      Method

   We used a case study research method to examine the hypothesis because it enabled
us to closely examine the usage of the learning analytics prototype within the context
of blended learning scenarios in a higher education institution [14, 19]. The case study
consisted of deploying an analytics prototype to a small number of courses (53 courses)
on the learning platform. This fits a case study scenario well because this learning
platform hosts around 3000 courses per semester, so the 53 courses are about 1.5%
of the running courses per semester (a small sample). The context of the study was the
technology aspect of the blended learning scenarios implemented in real courses on the
learning platform, in order to gain a more realistic understanding of how analytics
would be used within the learning platform. For the data-triangulation aspect, three
types of data collection mechanisms were built to collect case study data to provide
corroborating evidence to explain the observations and results of the case study [14].
For this purpose, we collected anonymous log data on the usage of the analytics proto-
type, collected log data on the users’ activities within these courses on the learning
platform, and conducted a two-part survey. The survey consisted of questions that col-
lected qualitative feedback about the analytics prototype, and a seven-point Likert scale
usability questionnaire based on the ISO 9241/10 international standard [9].


3      Study Setting and Design

   We randomly selected and contacted a wide audience of professors and teaching
assistants from different faculties at our university via email to invite them to participate
in the case study. Professors and teaching assistants of 53 courses agreed to participate.
33 courses were lectures connected with an exercise, 15 courses were practice-oriented
laboratory courses, and five courses were seminar courses. Regarding the course distri-
bution among different faculties, 17 courses were from the Faculty of Mathematics,
Computer Science and Natural Sciences; six courses were from the Faculty of Mechan-
ical Engineering; three courses were from the Faculty of Electrical Engineering and
Information Technology; 11 courses were from the Faculty of Arts and Humanities; 15
courses were from the School of Business and Economics; and one course from the
Faculty of Medicine. The number of students participating in each course varied from
20 students to 2200 students. In retrospect, one can conclude that although the number
of courses is small compared to the total number of courses per semester at our
university, the courses were distributed among six faculties (out of nine), the course
types were the three most common course types at our university, and with respect to
the number of students per course, the sample contains courses with small numbers of
students as well as very large courses with more than 2000 students.


3.1    Insights – Learning Analytics Prototype
   The Insights analytics prototype builds upon knowledge gained through previous
research on different Learning Analytics prototypes [7, 8, 16] developed and provided
as pilot projects at RWTH Aachen University. The visualizations (indicators) in the
prototype show how the students use the different aspects and modules of the course
rooms on a daily basis. Examples include how many different students show up in
the course room; what kinds of devices they use; which learning resources are the most
popular; or which collaboration techniques they prefer. The visualizations themselves
are interactive and enable the user to filter out specific parts and to select and zoom in
on other parts of the visualizations. The main idea of the prototype is to provide
descriptive statistics and analytics for each course in order to inspire teachers to reflect
upon and possibly improve their teaching in the learning process [7, 8, 16].
   The analytics prototype was made available to the study participants by the end of April
2017 as an integral module inside their courses on the learning platform. After activat-
ing the Insights module, all participants received instructions and explanations about
the module via email, descriptions about the visualizations, what kind of data is visual-
ized, and guidelines about possible (valid) interpretations of the represented data within
the visualizations. We did not provide special instructions about when and how they
were supposed to use the module, but rather encouraged them to incorporate it into
their daily activities.
They also received information that the module activities would be observed by auto-
matic logging tools, and towards the end of the pilot phase they would be given an
online survey about their experiences with the Insights module. The survey itself was
non-binding, meaning that the participants were not obliged to fill it in. The Insights
module was never deactivated from the courses, so the teaching staff could still use it
in their (expired) courses. However, only the period between the availability of the In-
sights module and the end of the semester has been considered for the analysis of this
case study.


3.2     Analytics prototype replication (with other learning platforms)
    The prototype we used in this case study was built on anonymized data collected
from the learning platform at our university, following the concept of data minimalism
[3]. Therefore, no personal data was collected that was not strictly needed to provide
analytics as a service, which in Germany can be very challenging due to stringent data
privacy laws. For this reason, we developed privacy-conformant data collection strate-
gies that anonymized the requests from the individual users, and the only openly iden-
tifiable element of the request was the course in which the activities were made. The
collected raw data arrived in the form of seven different parameters identifying a single
HTTP request made to the learning platform. These seven parameters, presented in
Table 1, come from the HTTP/1.1 protocol definition by the Internet Engineering Task
Force [12].

               Table 1. Structure of the anonymous logs of the learning platform

  Log Time | Client IP Address | Client Agent | Processing Time | Operation | URI | Result Code


   The first parameter is the exact date and timestamp when a specific HTTP request
was generated by the user. The second and third parameters identify the client in the
HTTP request. The client IP is the anonymized IP address of the user’s device from
which this HTTP request originated, while the client agent identifies from which device
the HTTP request was made to the learning platform. The fourth parameter is the
processing time for each request, i.e., how much time the learning platform took to
process the action. The fifth parameter is the HTTP opera-
tion method, or whether the activity was a simple read/view activity (GET), or it was a
create/edit activity (POST) in any of the modules of a course on the learning platform.
The sixth parameter was the URI, or the unified resource identifier of every item/re-
source/page on the platform. The URI identifies the resource upon which the request is
to be applied. In our case, the URI is built in such a way to identify the semester, the
course, the module, and the item which was requested or created by the user activity.
The last parameter is the HTTP status code, which conveys information about how
each request was completed [12].
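   To make the log structure concrete, the following minimal sketch parses one such
request record into its seven parameters. The field separator and the URI scheme used
here are assumptions for illustration; the paper does not specify the literal log format.

```python
from dataclasses import dataclass

@dataclass
class LogEntry:
    """One anonymized HTTP request, mirroring the seven parameters of Table 1."""
    log_time: str             # exact date and timestamp of the request
    client_ip: str            # anonymized IP address of the user's device
    client_agent: str         # device/browser from which the request was made
    processing_time_ms: int   # time the platform needed to process the request
    operation: str            # HTTP method: GET (read/view) or POST (create/edit)
    uri: str                  # identifies semester, course, module, and item
    result_code: int          # HTTP status code of the completed request

def parse_line(line: str) -> LogEntry:
    # Assumed format: seven space-separated fields; the client agent is
    # assumed to contain no spaces here, purely for simplicity.
    t, ip, agent, proc, op, uri, code = line.split(" ")
    return LogEntry(t, ip, agent, int(proc), op, uri, int(code))

def course_of(entry: LogEntry) -> str:
    # Assumed URI scheme /<semester>/<course>/<module>/<item>; the course
    # is the only openly identifiable element of the request.
    return entry.uri.strip("/").split("/")[1]

entry = parse_line(
    "2017-05-05T10:15:00 10.0.0.0 Firefox 42 GET /ss17/course42/documents/slides.pdf 200")
print(course_of(entry))  # -> course42
```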
   However, the learning platform in use at our university is a closed-source custom
solution created to support and implement the different blended learning scenarios.
Many German universities do not have the resources or the expertise to develop a
bespoke learning platform for supporting their students and teaching staff in their
teaching and learning processes. For this reason, these universities use an open source
learning platform, such as Moodle or ILIAS. Both learning platforms provide activity
logging and learning data collection which can be used as a basis for providing learning
analytics services in their respective learning scenarios. However, the data that is
collected with their built-in data collection techniques cannot be used as-is, because it
is highly personalized. Furthermore, the personal raw data is stored for an indefinite
amount of time and can be used to pinpoint individual users and observe their various
daily activities within their courses on the platform. These two aspects do not conform
with the current data privacy rules and regulations.

               Table 2. Structure of the logs of the Moodle learning platform

  Time | User Full Name | Affected User | Event Context | Component | Event Name | Description | Origin | IP Address
   Table 2 shows the structure in which the data collection methods log the users'
activities on the Moodle learning platform. Moodle, however, is a modular, open source
platform, and the data collection mechanisms can be changed and updated to conform
with the data privacy laws and to provide data which can be used as a basis for privacy-
conformant learning analytics. The developers and providers of learning analytics
services should develop plugins (methods) that pseudonymize or completely anonymize
the entries in the fields “User Full Name”, “Affected User”, and “IP Address” of the
collected log data. Furthermore, they need to develop raw data deletion strategies that
delete the collected personalized logs after the privacy transformation and delete the
pseudonymized (anonymized) logs after the analysis has been conducted on the data.
The pseudonymization of these fields will not remove the semantics of the logs, nor
reduce their value for providing learning analytics, and the provided analytics will be
on the same level as the analytics provided by the Insights module. Additionally, the
pre-processed raw data should be analyzed with a different application (and preferably
on a different physical layer), so that user experience and performance are unaffected
by the computationally heavy data analysis. Figure 1 shows an outline of a privacy-
conformant learning analytics framework for the Moodle learning platform.

                    Fig. 1. Proposed Learning Analytics Framework (preprocessed log data feeds
                    the data analysis; the analytics results are delivered via a Web API to the
                    platform and course analytics plugins)

   The log data
is pre-processed to be privacy conformant, and then different analytics methods analyze
the data, and produce and save the analytics results. Which analysis methods should be
incorporated into the data analysis module is context dependent, influenced by the
implemented learning scenarios, the questions that the teaching staff has, and the
requirements for the learning analytics prototype. After the analysis, these results are
available for representation and visualization via a predefined RESTful API. The In-
sights learning analytics prototype has a very similar architecture with one notable dif-
ference for delivering the analytics results. The Insights prototype is a standalone web
application which can be embedded in different courses on the developed learning plat-
form, while in Moodle the analytics results would be delivered via a Moodle plugin. As
a last step towards providing analytics as a service, an automation process should be
developed that automatically triggers every step of the process: data collection and
pre-processing; the data analysis and saving the results; providing them in the appro-
priate courses on the learning platform; and removing them completely from the system
in accordance with the pre-defined course lifecycle. By implementing this framework,
fellow researchers can develop learning analytics components, experiment with them
in different blended learning scenarios like the case study in this research work, and
potentially scale them up and provide them as a service in their institutions.
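   As an illustration of the pseudonymization step called for above, the following sketch
replaces the three personal fields of Table 2 with keyed hashes. This is a minimal sketch
in Python (an actual Moodle plugin would be written in PHP); the record layout and
field names used here are assumptions for illustration.

```python
import hashlib
import hmac

# Secret key known only to the analytics service; with an HMAC, pseudonyms
# cannot be recomputed from a known name without this key.
SECRET_KEY = b"replace-with-a-random-secret"

PERSONAL_FIELDS = ("user_full_name", "affected_user", "ip_address")

def pseudonymize(record: dict) -> dict:
    """Replace the personal fields of a Moodle-style log record (cf. Table 2)
    with stable pseudonyms: the same user always maps to the same pseudonym,
    so activity patterns stay analyzable without identifying anyone."""
    out = dict(record)
    for field in PERSONAL_FIELDS:
        value = record.get(field, "")
        if value:
            digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]
    return out

record = {
    "time": "2017-05-05 10:15", "user_full_name": "Jane Doe",
    "affected_user": "", "event_context": "Course: Example",
    "component": "File", "event_name": "Course module viewed",
    "description": "...", "origin": "web", "ip_address": "203.0.113.7",
}
print(pseudonymize(record)["user_full_name"])  # stable 16-hex-char pseudonym
```

Using a keyed hash rather than a plain one keeps the pseudonyms stable across log
entries while making it infeasible to recompute a pseudonym from a known name
without the key.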


4      Results and Discussions

   The case study results provided a sufficient amount of raw data and feedback to
conclude that the teaching staff used the Insights module. Overall, during the time of
the case study, the teaching staff in 40 of the 53 courses that had agreed to participate
used the Insights module at least once. The usage frequency of the Insights module
during the study was not evenly distributed, as presented in Table 3.

                                    Table 3. Insights usage distribution among courses

 Number of Courses                                            Number of uses
 15 courses                                                   More than ten times
 10 courses                                                   Between six and ten times
 9 courses                                                    Less than five times
 13 courses                                                   No usage detected

We computed descriptive statistics: the mean with standard deviation, and the median.
The average usage frequency is 15.5 times per course with a standard deviation of 22,
and the median is seven. In this case, the average usage frequency and the standard
deviation do not depict the real outcomes, because the standard deviation is larger than
the mean. For this reason, we calculated the coefficient of variation (CV = sd/mean *
100 = 141), which means that the usage frequency data was spread widely around the
mean. The median is more descriptive and suitable for the analysis because it separates
the higher half from the lower half of the module's usage frequency data. In other
words, the median shows that in half of the courses the teaching staff used the Insights
module on multiple occasions (median = 7).
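   As a worked illustration of these statistics, the following sketch computes the mean,
standard deviation, coefficient of variation, and median for hypothetical per-course
usage counts (the actual per-course counts are not published here).

```python
import statistics

# Hypothetical per-course usage counts; the real counts are not published.
usage = [2, 3, 5, 7, 9, 14, 20, 55]

mean = statistics.mean(usage)
sd = statistics.stdev(usage)   # sample standard deviation
cv = sd / mean * 100           # coefficient of variation, in percent
median = statistics.median(usage)

print(f"mean={mean:.1f} sd={sd:.1f} cv={cv:.0f} median={median}")
# -> mean=14.4 sd=17.5 cv=122 median=8.0
# When the standard deviation exceeds the mean (CV > 100), the mean is a
# poor summary and the median describes the typical course better.
```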
In Figure 2 it can be observed how the usage frequency of the Insights module developed
over the course of the case study. The x-axis shows the time span of the study (early
May to mid-September 2017), and the y-axis shows the number of different courses
from which the Insights module was accessed on each day.

          Fig. 2. Number of courses per day in which the Insights module was used

   After the initial peak of usage when the Insights module
was available to the teaching staff, regular weekly peaks of the module's usage can be
identified, with troughs on the weekends, and the activity in the Insights module
decreased towards the end of the semester. What we found interesting in the weekly
distribution of the Insights module usage was that, although the lectures had ended, the
number of courses in which the Insights module was used increased in the last three
weeks of July. In August and September, the number of courses in which the Insights
module was used steadily decreased.


4.1     Analytics prototype usage results
    We also inspected whether the teaching staff logged in to the learning platform to
explicitly use the Insights module, or whether they used it as part of their daily teaching
activities. For this purpose, the log data collected from the learning platform for
providing analytics and insights about the students' activities within the course was
re-purposed and re-analyzed to identify and aggregate the teaching activities within the
course rooms on the platform. We wanted to discover whether, within the same
session1 on the learning platform, the teaching staff performed teaching activities while
they used the Insights module (a sketch of this session-correlation analysis is given
after the list below). The teaching activities covered by the analysis were chosen based
on their relevance to and influence on the learning processes within the course. Hence,
bureaucratic and technical activities within the course room that have no influence on
the students and their learning were disregarded and not analyzed. The teaching
activities taken into consideration for the analysis were divided into four major groups:
(1) information distribution activities, (2) course organization activities, (3) distribution
of learning resources, and (4) formative assessment activities. In total, in 36 courses
the teaching staff performed a teaching activity whilst using the Insights module. The
distribution of the teaching activities follows in more detail:
    (1) The teaching staff can distribute course information either by posting an online
         announcement or by sending an email to the students. The correlation of the
         log-data analysis showed that in 17 courses the teaching staff posted an
         announcement within the same session in which they used the Insights module,
         and in 17 courses they sent an email to the students within the same session.
         Combining the two course lists to remove redundancy, the teaching staff of 26
         courses overall had distributed various information within the same session in
         which they had used the Insights module.
    (2) The correlation of the log-data analysis showed that in two courses the teaching
         staff created or edited course events in the calendar within the same session in
         which they used the Insights module, and that in one course the staff created or
         edited a survey in the same session.
    (3) The correlation of the log-data analysis showed that in 22 courses the teaching
         staff provided or uploaded learning materials within the same session in which
         they used the Insights module. In four courses they uploaded or embedded
         lecture videos, and in three they provided online resources as hyperlinks.
         Combining all course lists, the teaching staff of 24 courses overall had provided
         learning resources within the same session in which they had used the Insights
         module.


    (4) The correlation of the log-data analysis showed that in six courses the teaching
         staff provided or edited an assignment within the same session in which they
         used the Insights module. In three courses, the teaching staff used the Insights
         module when correcting student submissions. Combining the two course lists
         resulted in a total of seven courses where the teaching staff had performed
         activities within the formative assessment modules within the same session in
         which they had used the Insights module.

1 The term “session” is used in the context of an interaction session, when the user has logged in
    to the system via a web browser and has performed different activities and interactions
    within the system.
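   The following is a minimal sketch of the session-correlation analysis referenced
above. It assumes each log entry has already been enriched with a pseudonymous
session identifier and an activity label; the paper does not specify how sessions were
reconstructed from the raw logs, so the input format and activity names are
assumptions.

```python
from collections import defaultdict

TEACHING_ACTIVITIES = {
    "post_announcement", "send_email",        # (1) information distribution
    "edit_calendar", "edit_survey",           # (2) course organization
    "upload_material", "embed_video",         # (3) learning resources
    "edit_assignment", "correct_submission",  # (4) formative assessment
}

def courses_with_cooccurrence(log):
    """Return the courses in which at least one session contains both a use
    of the Insights module and a teaching activity. `log` is an iterable of
    (course, session_id, activity) tuples."""
    sessions = defaultdict(set)
    for course, session_id, activity in log:
        sessions[(course, session_id)].add(activity)
    return {course
            for (course, _), acts in sessions.items()
            if "insights_view" in acts and acts & TEACHING_ACTIVITIES}

log = [
    ("course-a", "s1", "insights_view"),
    ("course-a", "s1", "post_announcement"),
    ("course-b", "s7", "insights_view"),    # Insights used, but no teaching
    ("course-b", "s8", "upload_material"),  # activity in the same session
]
print(courses_with_cooccurrence(log))  # -> {'course-a'}
```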


4.2    Anonymous survey results

    The two-part anonymous survey provided qualitative feedback about the Insights
module, and a usability questionnaire based on the ISO 9241/10 standard was used to
assess the usability of the prototype [13]. In total, eight participants filled in the survey.
The first part of the survey collected feedback about (1) the positive aspects of the
Insights module; (2) the negative aspects of or experiences with the Insights module;
(3) which features and visualizations of the Insights module were most useful to the
participants; and (4) what the participants would wish to see in the Insights module to
better fulfill their needs and expectations.
    (1) According to the answers, the positive aspects of the Insights module included
         the possibility to get an overview of which learning materials are used most
         over time within the course room; the possibility to see whether the students
         used the media and the lecture recordings provided by the teacher; and the
         possibility to observe how the students' behavior developed over time in the
         course room. On one occasion, the teacher could infer with certainty when and
         how the students worked on the assignments and their submissions.
    (2) The negative experiences with the tool mainly concerned the data representation
         and visualization. The answers included statements about glitches in the
         zooming functionality, unfitting representations of the data on the charts' axes,
         and the lack of help and descriptions for the visualizations. An interesting claim
         marked as a negative experience was that one participant's fears that the
         students always studied and looked at learning resources just before the exam
         were confirmed.
    (3) According to the answers, the highlighted features of the Insights module were
         the ones that showed analytics and information about activities within the
         learning resources modules. The participants could see which were the most
         popular learning materials and resources within the course room. One
         participant mentioned that his expectations about the students' behavior were
         confirmed and that he could use the tool to adapt his learning offerings and
         teaching behavior.
    (4) As possible improvements, the participants suggested providing help and
         guidelines on how to interpret the visualizations, and smoother and clearer
         visualizations with better zooming functionality. They also requested the
         possibility to combine and export the visualized data for offline analysis and
         usage, and to make the tool available outside the university's network.


   The goal of the second part of the survey was to collect feedback and information
about the usability of the Insights module. Usability can be broken down into these
goals: effective to use (Effectiveness), efficient to use (Efficiency), having good utility
(Utility), easy to learn (Learnability), and easy to remember how to use (Memorability).
The survey contained a set of questions covering each of these goals. The questions
themselves were created based on the ISO 9241/10 standard [13]. The seven-point
Likert scale ranged from “Strongly Disagree” to “Strongly Agree” (1-7) and was used
as a ranked order across all 16 questions with the aim of receiving more consistent
results. The raw results of the survey are ordinal data, because the Likert scale uses
order (or rank) and one cannot consistently and correctly define the distances between
the categories. For ordinal data analysis, it is recommended to use methods that preserve
the ordering of the data so that there is no loss of power, such as computing the median
and the mode [1]. Table 4 presents the analyzed results of the usability survey.
The table columns represent the five usability goals, while each row reports the median
and the mode of the scale values for the corresponding question's answers from the survey.

                        Table 4. Results summary of the usability survey
        Effectiveness       Efficiency          Utility        Learnability    Memorability
        Median   Mode     Median    Mode    Median   Mode     Median   Mode    Median   Mode
 Q1      4.5       4         5        4      5.5          5    5.5         6     5       4
 Q2      5.5       6        5.5       6      5.75         6     6          7     5       5
 Q3       6        6         6        6       6           6     6          7     6       6
 Q4       -        -         6        7        -          -     -          -     6       7

   The results in general show that the survey participants rated the tool positively on
the five usability goals, because the lowest value on any question was four and the
highest seven. Out of the five categories, the one that fared best was Learnability, with
the highest median and mode, followed by Utility, Efficiency, and Memorability. The
goal that fared the poorest was Effectiveness, with the lowest median and mode.
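   As a short sketch of this ordinal analysis applied to one Likert question (the individual
answers shown are hypothetical; only the method of computing the median and the
mode follows the paper):

```python
import statistics

# Hypothetical answers of the eight respondents to one seven-point Likert
# question (1 = Strongly Disagree ... 7 = Strongly Agree).
answers = [4, 4, 5, 5, 6, 6, 6, 7]

# Median and mode preserve the ordering of ordinal data [1]; a mean would
# wrongly assume equal distances between the scale categories.
print(statistics.median(answers))     # -> 5.5
print(statistics.multimode(answers))  # -> [6]
```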


5      Findings

   The goal of the case study was to find out whether the teaching staff in blended
learning scenarios would use learning analytics tools on a regular basis and incorporate
them into their daily teaching activities within the learning platform. The underlying
hypothesis of the case study was that teachers, while doing teaching activities on the
learning platform, would also use the analytics prototype within the same session. The
case study itself can be labeled a quasi-experiment: the participants were not randomly
assigned to conditions, and the usage of the Insights module was not randomly sampled
among the teaching staff, because as a field study the environmental conditions and
factors could not be controlled. However, the results presented in the previous section
show that the experimental design provided sufficient control and substantial evidence
that the goal of the study was fulfilled.


5.1    Findings regarding the usage analysis
   According to the presented results, in almost half of the courses the teaching staff
explicitly used the Insights module on multiple occasions. This statement is
corroborated by the median number of uses per course (median = 7), meaning that in
at least half of the courses the teaching staff used the Insights module on multiple
occasions. The usage data of the Insights module shows regular weekly peaks and
troughs on the weekends. The analysis also showed that towards the end of the semester
the number of courses in which the Insights module was used steadily decreased. What
was unexpected was that in the weeks right after the lectures ended, the number of
courses in which the Insights module was used started increasing. One possible
explanation could be that the teaching staff wanted to observe and evaluate the
students' behavior over the span of the entire semester within the course room.
   The findings presented in the previous paragraph were summative because they dealt
with the overall number of courses over the given amount of time. This means there
was a possibility that the weekly usage peaks contained many courses with incidental
usage (although such requests were filtered out from the raw data). The usage data
analysis showed that the Insights module was used intentionally, especially whenever
there were peaks in the number of different courses. This finding aligns well with the
first part of the goal of this case study, namely that the teaching staff would use an
analytics tool on a regular basis within their course on the learning platform.


5.2    Using analytics results together with teaching activities
   The results of the analysis also showed that the teaching staff in 36 courses had
performed various teaching activities whilst using the Insights module within the same
session. In 24 courses the teaching staff had performed activities that provided various
learning resources (materials, slides, media, etc.) to their students. In 26 courses the
teaching staff had performed activities that distributed various course information to
their students, and in seven courses they corrected assignments, or provided new ones,
within the same session. This is a strong indicator that the teaching staff used the
Insights module as part of their teaching activities, and it confirms the second part of the
goal of the case study and its hypothesis. Considering the results that confirmed the goal and
the hypothesis of the study, it is safe to conclude that the correct place for providing
learning analytics solutions and visualizations in blended learning scenarios is the
course room on the learning platform. The teaching staff would use learning analytics
tools and results in their teaching activities while conducting their learning scenarios.
Nonetheless, this corroborated outcome does not provide evidence of whether the
teaching staff understood, observed, or even acted upon the visualizations and analytics
results while using the Insights module. These findings show only that the teaching
staff used the Insights module on a regular basis.


5.3    Anonymous survey findings
    The two-part anonymous survey collected information and its analysis provided
qualitative feedback about the features and the user interface of the Insights module. In
total, only eight participants from the case study decided to participate and provide their
feedback and answers to the survey. In comparison with the usage frequency and num-
ber of participating courses, the response rate was comparatively low. The results from
the qualitative feedback showed that the teaching staff used the visualizations and
analytics to get an overview of how the learning resources were used and to be more
aware of the student behavior in the different modules of the course room on the learning
platform. Despite the inconsistencies with the data representation and the lack of help
and documentation to guide them through the interface, they were aware of what was
happening in their course, and their assumptions about intermittent learning were con-
firmed by the Insights module. The suggested improvements for the tool were directed
towards improving the data visualization and interface, rather than providing
new data and different analyses. To better understand the perspective behind the
qualitative feedback in the survey, we analyzed the session durations, which showed
that the time the teaching staff spent on the Insights module ranged from 60 seconds
to seven minutes. Additionally, the time of day (morning or afternoon) at which they
used the Insights module was (almost) normally distributed, with a slight advantage to
the afternoon. The teaching staff had relatively short sessions while using the Insights
module, and during these short sessions they tried to get an overview or detect trends
within the visualized data. This finding has implications for the design of the
visualizations and the data representation. Their feedback about concentrating on
improving the provided visualizations and analytics representations further supports
this finding. The usability survey showed that the Insights module was easy to learn to
use and had good utility. However, the effectiveness of the Insights module was rated
the poorest. This indicates a need to take up the feedback for improvements so that the
Insights module becomes better at providing analytics in the courses.


5.4 Study Limitations

   This case study provided a comprehensive description of the interaction dynamics
with a learning analytics module in blended learning scenarios at a higher education institu-
tion. However, the study was in its essence a quasi-experiment because of the lack of
control. As such, the study can allow the existence of other hypotheses, and different
explanations and interpretations of the observed results. Moreover, this case study was
not immune to the three major concerns with case studies: the research/methodological
rigor, the subjectivity of the researcher, and the external validity and generalizability of
the results [14]. Considerable effort was invested in designing the case study so that it
can be easily reproduced, and in the methodological rigor to base the acceptance of the
study's hypothesis on quantitative data. As mentioned before, the data for analysis was
automatically collected, cleaned, and analyzed to remove the influence of human error.
Nonetheless, the methods of analyzing and interpreting the results of the study in
relation to its goal and underlying hypothesis were still subject to the personal
interpretation of the researchers. The case study
setting and implementation can be reproduced, and carried out again with different
courses. However, since the environment and conditions are not controlled, it is not
possible to predict the behavior of the participants, without observing their behavior.
Hence the results would be limited to describing the phenomenon at hand, rather than
predicting future behavior of the participants. In other words, there is no quantifiable
certainty that by repeating the case study the results would be the same as they are in
this case study. One way to remedy this situation is to involve repeated observations of
the same variables (or participants) over longer periods of time, so that enough longi-
tudinal data could be generated and analyzed.


6      Conclusion

   Learning Analytics should be an integral part of the e-learning services in higher
education institutions. We conducted a case study in which we provided a learning an-
alytics prototype in 53 courses with the objective to see whether the teaching staff
would use the analytics module while doing teaching activities within the course. The
case study results provided an ample amount of raw data and feedback to confirm the
objective and the hypothesis of the case study, and showed that in at least half of the
courses the learning analytics module was used on a regular basis and on multiple occasions. Fur-
thermore, the analysis of the case study data showed that in 36 courses the teaching
staff used the learning analytics module in the same session, while performing teaching
activities within the different modules on the learning platform. We also discovered
that their usage sessions with the analytics prototype ranged from one to seven minutes
and during these sessions they tried to get an overview and understand the behavior of
the students in their course rooms over the course of the semester. As part of the case
study’s data collection, we also conducted a two-part survey to collect qualitative feed-
back data about the different features and visualizations of the learning analytics mod-
ule, and its usability. Although the response rate was comparatively low, the qualitative
feedback showed that the teaching staff was mostly interested in insights about the stu-
dents’ behavior within the learning resources modules. The second part of the survey
provided feedback and knowledge that the analytics module was easy to learn to use,
but that it had poor effectiveness. Overall, the results of the case study confirm that the
place to provide learning analytics is within the courses on the learning platform, and
that the teaching staff (if provided with it) uses it on a regular basis while doing their
daily teaching activities on the learning platform. The results of this case study provide
enough evidence to continue forward on the path of providing analytics as a service and
evaluating its value and impact on a larger scale. The next concrete steps for us are to
provide the Insights module to a larger audience (~500 courses), and to evaluate the
value and effectiveness of the analytics results on the teaching staff in blended learning
scenarios in higher education institutions.


References
 1. Agresti, A.: Categorical Data Analysis: Second edition. (2002).
 2. Bonk, C.J. et al.: The Handbook of Blended Learning: Global Perspectives, Local Designs.
    High. Educ. 624 (2012).
 3. Bundesdatenschutzgesetz:            Bundesdatenschutzgesetz,           https://www.gesetze-im-
    internet.de/bdsg_1990/, (1990).
 4. Chatti, M.A. et al.: A reference model for learning analytics. Int. J. Technol. Enhanc. Learn.
    4, 5/6, 318 (2012).
 5. Dorothy E. Leidner, Sirkka L. Jarvenpaa: The Use of Information Technology to Enhance
    Management School Education: A Theoretical View. Manag. Inf. Syst. Q. 19, 3, 265–291
    (1995).
 6. Dyckhoff, A.L. et al.: Supporting action research with learning analytics. Proc. Third Int.
    Conf. Learn. Anal. Knowl. - LAK ’13. 220 (2013).
 7. Dyckhoff, A.L.: Action Research and Learning Analytics in Higher Education. (2014).
 8. Dyckhoff, A.L. et al.: Design and Implementation of a Learning Analytics Toolkit for
    Teachers. J. Educ. Technol. Soc. 15, 3, 58–76 (2012).
 9. Europäisches Komitee für Normung: EN ISO 9241-10. Iso 9241-10. 14 (1995).
10. Ferguson, R.: Learning analytics: drivers, developments and challenges. Int. J. Technol.
    Enhanc. Learn. 4, 5/6, 304 (2012).
11. Ferguson, R. et al.: Research Evidence on the Use of Learning Analytics. (2016).
12. Fielding, R. et al.: RFC2616 - Hypertext transfer protocol–HTTP/1.1. Internet Eng. Task
    Force. 1–114 (1999).
13. Figl, K.: ISONORM 9241/10 und Isometrics: Usability-Fragebögen im Vergleich.
    Tagungsband Mensch Comput. 143–152 (2009).
14. Lazar, J. et al.: Research Methods in Human-Computer Interaction. Evaluation. 426 (2010).
15. Lismont, J. et al.: Defining analytics maturity indicators: A survey approach. Int. J. Inf.
    Manage. 37, 3, 114–124 (2017).
16. Lukarov, V., Schroeder, U.: AiX-Analytics: Analytics Tool at RWTH Aachen University.
    In: Igel, C. et al. (eds.) Bildungsräume 2017: DeLFI 2017, Die 15. e-Learning Fachtagung
    Informatik, der Gesellschaft für Informatik e.V. (GI), 5. bis 8. September 2017, Chemnitz.
    pp. 393–394 Gesellschaft für Informatik, Bonn (2017).
17. Piccoli, G. et al.: Web-Based Virtual Learning Environments: A Research Framework and
    a Preliminary Assessment of Effectiveness in Basic IT Skills Training. MIS Q. 25, 4, 401
    (2001).
18. Scheffel, M. et al.: Quality Indicators for Learning Analytics. J. Educ. Technol. Soc. 17, 4,
    117–132 (2014).
19. Yin, R.K.: Case Study Research: Design and Methods. (2009).