=Paper= {{Paper |id=Vol-2294/DCECTEL2018_paper_16 |storemode=property |title=Implementation and Evaluation of a Trusted Learning Analytics Dashboard |pdfUrl=https://ceur-ws.org/Vol-2294/DCECTEL2018_paper_16.pdf |volume=Vol-2294 |authors=Daniel Biedermann,Jan Schneider,Hendrik Drachsler |dblpUrl=https://dblp.org/rec/conf/ectel/Biedermann0D18a }} ==Implementation and Evaluation of a Trusted Learning Analytics Dashboard== https://ceur-ws.org/Vol-2294/DCECTEL2018_paper_16.pdf
    Implementation and Evaluation of a Trusted
          Learning Analytics Dashboard

       Daniel Biedermann1 , Jan Schneider1 , and Hendrik Drachsler1,2,3
            1
             German Institute for International Educational Research
             {biedermann, schneider.jan, drachsler}@dipf.de
     2
       Open Universiteit, Valkenburgerweg 177, 6419 AT Heerlen, Netherlands
          3
            Goethe University Frankfurt, Frankfurt am Main, Germany



      Abstract. The research described in this article covers the user-facing
      components of a learning analytics environment which is developed with
      the premise of trust as an essential factor for the adoption of learning ana-
      lytics. It investigates the influence of privacy settings and personalization
      on the acceptance and adoption of learning analytics. By ensuring com-
      pliance with data protection legislation, and by providing transparency
      in the means and results of data collection, we aim to reduce doubts
      and fears in the learning analytics process. By respecting the needs of
      individuals, we hope to create an environment where learning analytics
      is perceived as something positive.


1   Introduction
Trust, or rather the lack of trust in analytics processes, is a common topic in data
analytics [1]. For the field of Learning Analytics (LA), trust is a critical factor
for its acceptance by the stakeholders [2]. Building trust has many layers: from
the integrity and quality of the data sources, through secure storage and
processing, to the effectiveness of the analytics results, the stakeholders have
to trust that what happens is in their best interest. In addition to this moral
and ethical viewpoint, the requirement to be transparent to users is further
substantiated by the European General Data Protection Regulation (GDPR),
which grants users far-reaching rights with regard to data analytics processes.
    At our institution, a large German university with primarily live lectures,
there is currently no LA environment in place, and the task of our group is to
establish it. Our assumption is that LA can only effectively assist in the learning
process when the learner actually accepts and utilizes the offerings. We therefore
decided to follow an approach to LA that we call "Trusted Learning Analytics"
(TLA). With TLA, we focus strongly on the users and give them the information
and the tools to control what, how and why their data is collected and analyzed.
    As the interface between the learner and the LA approach is often a Learning
Analytics Dashboard (LAD), this is also the location where the trust-building
measures will be implemented and evaluated. The research question that I will
attempt to answer in my PhD project is: "How can we design and evaluate a
trusted learning analytics dashboard?"
2     Fields of Research

From the gathering of the data, through the storage and processing, to the
presentation of the results, there are multiple aspects where trust of the learners
has to be won. In my PhD, the focus is on the user-facing side, while the other
parts (e.g. storage and infrastructure) are handled by other members within the
same research group. My research will depend on and interact with those other
parts.


2.1   Establishing a Learning Analytics Environment

Our working group was established at the same time that I started my PhD, in
November 2017. Thus, there was no LA environment in place, and we have to
create it from the start.
    We see this as a big opportunity, as we can choose which approaches to
implement in our environment. While trying to get an overview of the field, we
noticed that there are very different approaches to LA, and it was not clear which
ones would fit best for our case. There is literature that gives an overview of the
LA landscape in general (e.g. [3]), and especially on the topic of LADs (e.g. [4]
and [5]). However, this literature only points to what approaches exist, in very
general categories and without information on where to find the results.
    We conducted a literature review on the approaches that are researched in
the field of LA. We identified heterogeneous combinations of various components
that form those LA approaches and found that the nature of the results
warrants more than simply listing them. In order to make the results of the
review continually explorable, we created a web-based tool, which we call the
Learning Analytics Indicator Repository (LAIR). In the LAIR, the approaches
and the respective papers are listed, and their components (subjects, platforms,
activities, metrics, indicators) are visualized in directed graphs. With the LAIR,
we hope to achieve two goals with regard to TLA: 1) being able to quickly find
approaches and the related literature for specific combinations of components;
2) a visualization of LA approaches which we plan to use in the LADs to explain
to learners how their data is gathered and for what purpose.
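The idea of storing each approach as a directed graph of components, and querying the repository by component combinations, can be sketched as follows. This is a minimal illustration with invented entries and field names, not the actual LAIR data model:

```python
# Hypothetical sketch of LAIR storage: each LA approach links its components
# (subjects, platforms, activities, metrics, indicators) as edges of a small
# directed graph, so combinations of components can be queried across papers.
# The example papers and component names below are placeholders.

approaches = [
    {
        "paper": "Example et al. 2017",  # placeholder citation
        "edges": [
            ("students", "LMS"),            # subject -> platform
            ("LMS", "forum posts"),         # platform -> activity
            ("forum posts", "post count"),  # activity -> metric
            ("post count", "engagement"),   # metric -> indicator
        ],
    },
    {
        "paper": "Sample et al. 2016",  # placeholder citation
        "edges": [
            ("students", "video platform"),
            ("video platform", "video views"),
            ("video views", "view duration"),
            ("view duration", "engagement"),
        ],
    },
]

def find_approaches(approaches, *components):
    """Return papers whose component graph contains all given components."""
    hits = []
    for approach in approaches:
        # Collect every node that appears in the approach's edge list.
        nodes = {node for edge in approach["edges"] for node in edge}
        if all(component in nodes for component in components):
            hits.append(approach["paper"])
    return hits
```

Querying `find_approaches(approaches, "LMS", "engagement")` would then surface only the papers whose approaches combine those two components, which corresponds to goal 1; the edge lists themselves are what a directed-graph visualization (goal 2) would render.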


2.2   Privacy, Transparency and GDPR Compliance

Ethics and privacy as factors for the acceptance and adoption of LA have been
the subject of numerous publications. There are several policy papers (e.g.
[6], [7]), and user studies show that stakeholders have high expectations
for privacy and are not willing to consent to indiscriminate sharing of their data
[8].
     This need for privacy is underpinned by the GDPR [9]. For GDPR
compliance, the state of research is theoretical in nature (e.g. [10]), and there
is little precedent for how the GDPR can be applied in LA practice (there
exist some commercial variants, e.g. the Microsoft privacy dashboard at
https://account.microsoft.com/account/privacy).
The combination of the self-imposed ethical and an external legal demand for
privacy brings requirements for the user interface components. Especially the
learner-facing LADs should provide an interface that gives the users easy and
understandable control over their data. In this area, I expect my research to
align more with areas such as User Interface and Privacy research than with
TEL.
    We have mapped the GDPR user rights to functionalities and respective
user interface components. I am currently implementing those in a LAD.
Using a design-based research methodology [11], we will first roll out a prototypical
LAD in a small seminar, where we plan to receive feedback through usability
studies (surveys, eye tracking, think-aloud). Based on this initial feedback, we
will improve the LAD and use it in larger live lectures (100-200 students),
where we will further research how the users interact with the privacy settings
by gathering usage statistics.
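A mapping of GDPR user rights to dashboard functionalities could look like the sketch below. The GDPR articles are accurate; the concrete UI component names are assumptions for illustration, not our finished design:

```python
# Illustrative mapping of GDPR data-subject rights (Chapter 3 of the
# regulation) to learner-facing dashboard functionalities. The article
# numbers are from the GDPR itself; the UI component descriptions on the
# right are hypothetical examples, not the actual implemented interface.

GDPR_RIGHTS_TO_UI = {
    "right of access (Art. 15)": "view of all data collected about the learner",
    "right to rectification (Art. 16)": "edit form for stored personal data",
    "right to erasure (Art. 17)": "delete-my-data button",
    "right to data portability (Art. 20)": "export as machine-readable file",
    "right to object (Art. 21)": "per-indicator opt-out toggles",
}

def describe_rights():
    """Render the mapping as lines a settings page could display."""
    return [f"{right}: {ui}" for right, ui in GDPR_RIGHTS_TO_UI.items()]
```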

2.3    Personalization
Interviews of learners have also shown that they expect adaptive and person-
alizable LADs [8], [12]. In combination with the privacy expectations and the
varying degrees of willingness to share data between individuals, we conclude that
LA should offer a high degree of transparency and personalization to gain more
widespread acceptance.
    Dashboards in the TLA environment therefore require widgets that can be
configured by the learners to their individual needs. One particular issue that
we see is the difference in perception with regard to peer-referenced LA [8], [12].
Widgets in the TLA environment should therefore ideally support both those
who are motivated by the aspect of peer reference and those who see it as a
rather demotivating factor. We aim to create widgets that allow individuals to
opt out of peer reference. After opting out, they would only see their own data,
not that of others, and others would no longer see the opted-out individuals,
except in aggregated form.
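The opt-out semantics described above can be sketched as a filtering step over learner records. This is a minimal sketch under our own assumed data shape (a list of per-learner dicts), not the widget's actual implementation:

```python
# Sketch of peer-reference opt-out filtering. Assumed record shape:
# {"learner": id, "progress": float, "peer_opt_in": bool}.
# Opted-out learners see only their own data; everyone else sees only
# opted-in peers; opted-out learners still contribute anonymously to the
# course-wide aggregate.

def dashboard_view(records, viewer_id):
    """Build the data a given learner is allowed to see."""
    own = [r for r in records if r["learner"] == viewer_id]

    # A learner who opted out of peer reference sees no peer data at all.
    viewer_opted_in = any(r["peer_opt_in"] for r in own)
    peers = [
        r for r in records
        if r["learner"] != viewer_id and r["peer_opt_in"] and viewer_opted_in
    ]

    # The aggregate covers everyone, so opted-out individuals appear only
    # in this anonymized, aggregated form.
    course_average = sum(r["progress"] for r in records) / len(records)

    return {"own": own, "peers": peers, "course_average": course_average}
```

Under this scheme the opt-out is symmetric: leaving peer reference removes a learner from others' views and removes others from that learner's view, while the course average remains available to all.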
    In a first iteration, we plan to create an essay-writing widget to apply
this concept in smaller seminars. There, the learners can a) plan their intended
progress on the various parts of an essay and b) track their total progress. In
addition to their own values, they will also see the planning and progress of the
other learners in the course who have opted in to being peer-referenced. We want
to investigate how and why learners choose to opt out of this and how they
progress in comparison to those who leave the peer-referenced aspects turned on.


3     Current Progress
I have completed the first half-year of the PhD program, which is currently
set to three years. At the time of this writing, I have completed the literature
review on the LA approaches and implemented the web-based repository. For the
TLA dashboard, several tasks are completed and a privacy interface has been
implemented on the frontend, which needs to be connected with the rest of the
infrastructure. The first usability experiments are set for the fall of this year.
References
  [1]   KPMG International Data & Analytics: Building Trust in Analytics.
        Company Whitepaper.
  [2]   Greller, W. and Drachsler, H.: “Translating Learning into Numbers: A
        Generic Framework for Learning Analytics”. In: Educational Technology
        & Society 15 (2012), pp. 42–57.
  [3]   Moissa, B., Gasparini, I., and Kemczinski, A.: “A Systematic Mapping
        on the Learning Analytics Field and Its Analysis in the Massive Open
        Online Courses Context”. In: Int. J. Distance Educ. Technol. 13.3 (July
        2015), pp. 1–24.
  [4]   Schwendimann, B. A., Rodriguez-Triana, M. J., Vozniuk, A., Prieto,
        L. P., Boroujeni, M. S., Holzer, A., Gillet, D., and Dillenbourg, P.: “Per-
        ceiving Learning at a Glance: A Systematic Literature Review of Learn-
        ing Dashboard Research”. In: IEEE Transactions on Learning Technolo-
        gies 10.1 (2017), pp. 30–41.
  [5]   Bodily, R. and Verbert, K.: “Review of Research on Student-Facing
        Learning Analytics Dashboards and Educational Recommender Sys-
        tems”. In: IEEE Transactions on Learning Technologies 10.4 (2017),
        pp. 405–418.
  [6]   Drachsler, H. and Greller, W.: “Privacy and Analytics: It’s a DELICATE
        Issue. A Checklist for Trusted Learning Analytics”. In: Proceedings of
        the Sixth International Conference on Learning Analytics & Knowledge
        (LAK ’16). ACM Press, 2016, pp. 89–98.
  [7]   Sclater, N.: “Developing a code of practice for learning analytics”. In:
        Journal of Learning Analytics (Apr. 2016), pp. 16–42.
  [8]   Ifenthaler, D. and Schumacher, C.: “Student perceptions of privacy prin-
        ciples for learning analytics”. In: Educational Technology Research and
        Development 64.5 (Oct. 2016), pp. 923–938.
  [9]   European Parliament, C. o. t. E. U.: “Regulation (EU) 2016/679 of the
        European Parliament and of the Council of 27 April 2016 on the protec-
        tion of natural persons with regard to the processing of personal data and
        on the free movement of such data, and repealing Directive 95/46/EC
        (General Data Protection Regulation)”. In: Official Journal of the Eu-
        ropean Union L119 (May 2016), pp. 1–88.
 [10]   Hoel, T., Griffiths, D., and Chen, W.: “The Influence of Data Protection
        and Privacy Frameworks on the Design of Learning Analytics Systems”.
        In: Proceedings of the Seventh International Conference on Learning
        Analytics & Knowledge (LAK ’17). ACM Press, 2017, pp. 243–252.
 [11]   Wang, F. and Hannafin, M. J.: “Design-based research and technology-
        enhanced learning environments”. In: Educational Technology Research
        and Development 53.4 (Dec. 2005), pp. 5–23.
 [12]   Tan, J. P.-L., Koh, E., Jonathan, C., and Yang, S.: “Learner dashboards
        a double-edged sword? Students’ sense-making of a collaborative critical
        reading and learning analytics environment for fostering 21st century
        literacies”. In: Journal of Learning Analytics 4.1 (2017), pp. 117–140.