    Visualizing workplace learning data with the SSS Dashboard
                             Adolfo Ruiz-Calleja, Tallinn University, adolfo@tlu.ee
               Sebastian Dennerlein, Graz University of Technology, sdennerlein@know-center.at
                                  Tobias Ley, Tallinn University, tley@tlu.ee
                      Elisabeth Lex, Graz University of Technology, elex@know-center.at

         Abstract: This paper reports the design and development of a visual Dashboard, called the
         SSS Dashboard, which visualizes data from informal workplace learning processes from
         different viewpoints. The SSS Dashboard retrieves its data from the Social Semantic Server
         (SSS), an infrastructure that integrates data from several workplace learning applications into
         a semantically-enriched Artifact-Actor Network. A first evaluation with end users in a course
         for professional teachers gave promising results. Both a trainer and a learner could understand
         the learning process from different perspectives using the SSS Dashboard. The results
         obtained will pave the way for the development of future Learning Analytics applications that
         exploit the data collected by the SSS.

         Keywords: workplace learning analytics, data visualization, social semantic data

Introduction
Workplace learning (Eraut, 2004) often represents an informal way of gaining knowledge that is tightly connected to work processes and typically driven by personal interest or by problems at work. Self-reflection, expertise seeking or the co-creation of guidelines are examples of common learning practices that happen at the workplace (Ley et al., 2014). The lack of the structure typical of formal learning settings makes many Learning Analytics (LA) solutions inapplicable in workplace learning contexts.

To analyze such learning processes, we first designed and developed an infrastructure called the Social Semantic Server (SSS) (Dennerlein et al., 2015), whose aim is to enhance the development and integration of applications that support workplace learning, as well as to log the usage traces that emerge. The SSS collects data from the applications integrated into it and coherently combines them into a semantically-enriched Artifact-Actor Network (AAN) (Reinhardt, Moi & Varlemann, 2009). Our plan is to exploit this AAN using Learning Analytics applications that will provide learners and workplace trainers with different ways to analyze their own learning data and, hence, with useful information to improve the learning process.

As a first attempt we explored the possibility of developing a dashboard, called the SSS Dashboard, that represents the information contained in the AAN in order to make workplace learners and trainers aware of the characteristics of an informal learning process. The present paper reports the design and development of the SSS Dashboard. It focuses especially on its user interface, since it is particularly challenging to graphically represent data collected from informal learning processes using abstractions that non-technical users are able to manipulate. A first assessment alongside a professional teacher training course enables us to discuss the suitability of the abstractions managed by the SSS Dashboard and paves the way for further developments.

The rest of the paper is structured as follows: the next section briefly describes the state of the art regarding dashboards for workplace learning. Then the design and development of the SSS Dashboard are reported and evaluated in the subsequent sections. Finally, the most important conclusions of the paper are summarized.

Dashboards to visualize learning processes at the workplace
In the last few years we have witnessed the rise of LA, and many research works have been proposed that focus on formal education. Although some examples related to informal learning can be found (Klamma, 2013), they are still a clear minority (Nistor, Derntl & Klamma, 2015). The number of research works on LA is even smaller when it comes to workplace learning. In this regard, we can find several proposals whose intention is to raise awareness of informal workplace learning processes. Interestingly, many of these proposals exploit the informal social relationships that arise when learning at the workplace, as they understand workplace learning as a social process where learning is seen as a culture of shared experiences. An example can be seen in (de Laat & Schreurs, 2013), where the authors detect informal networks in organizations with the aim of connecting the isolated ones and enhancing their value creation.

On the other hand, there are several proposals in the Computer Supported Cooperative Work (CSCW) domain that deal with workplace awareness and propose dashboards to make workmates aware of what their colleagues have done. One example is FASTDash (Biehl et al., 2007), a visual dashboard for fostering awareness in software teams. Other CSCW dashboards intend to persuade workers to change some behavior or to promote those workers who behave properly. See (Yun et al., 2014) for an example where a dashboard is used to promote environmentally sustainable behavior at the workplace. Nonetheless, these CSCW proposals do not pursue learning purposes, so they do not represent data using learning abstractions that would support the understanding of a learning process.

A few examples of dashboards for workplace LA can still be found. One of them is Learn-B (Siadaty et al., 2012), a tool for self-regulated workplace learning built on top of an infrastructure that collects data from workplace learning processes. It represents a good example of how data from different workplace learning applications can be integrated and visualized in a combined way. Nonetheless, the information offered to the users is restricted to some statistics about resource usage and sharing. Another interesting dashboard is MyExperiences (Kump et al., 2012), which was developed within the APOSDLE project. In this case the APOSDLE system creates a learner model out of the topics of interest of each learner, and MyExperiences graphically represents this model so that learners can reflect on their learning process. Both MyExperiences and Learn-B are good examples of how visual dashboards can enhance workplace learning; however, both of them focus on specific aspects of the learning process, which hinders its understanding as a complex process that can be analyzed from different perspectives.

The SSS Dashboard
Exploiting a Social Semantic Infrastructure for workplace LA
The Social Semantic Server (SSS) (Dennerlein et al., 2015) is a service-based infrastructure especially targeted at supporting informal workplace learning scenarios that has already been employed in several pilot studies (Ley et al., 2014). The SSS offers services to create, structure and enrich content, as well as services that enable discussions and provide guidance and recommendations. The basis for these services is a semantically-enriched Artifact-Actor Network (AAN) (Reinhardt, Moi & Varlemann, 2009), which combines the social-network (e.g. Facebook) and the artifact-network (e.g. Wikipedia) approaches to describe the relationships among actors and artifacts in different contexts. The interactions between actors and artifacts, as well as the contexts in which these interactions take place, can be logged via dedicated services and stored in the SSS. As such, the SSS can serve as a technical basis for end-user workplace learning applications, such as Attacher and Bookmarker (Ruiz-Calleja et al., 2015) in the example of Figure 1. Such applications benefit from the SSS and its service-based architecture, which facilitates application development and technical integration across tools.

The SSS can log all the events of the applications that make use of its services. From these logs it creates an AAN that consists of different elements: entities, which can either be actors (typically learners) or artifacts (either learning documents or conceptual artifacts); a structure of semantic relationships between the entities; information about the activities that build up the relationships among entities; and some additional information about these activities, such as when they happened or in which context. The entities, relationships and contexts managed by the AAN may have different levels of formality, ranging from simple keywords to formal ontologies.
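To make this structure more concrete, the following sketch models the AAN elements described above as TypeScript types. The type and field names are illustrative assumptions and do not reflect the actual SSS data schema.

// Illustrative sketch of the AAN elements described above; type and field
// names are assumptions, not the actual SSS schema.

type EntityKind = "actor" | "document" | "concept";

interface Entity {
  id: string;
  kind: EntityKind;
  label: string;             // e.g. a learner's name, a document title or a tag
}

interface Activity {
  actorId: string;           // who performed the action
  action: string;            // e.g. "bookmark", "tag", "cite"
  artifactIds: string[];     // documents or concepts involved
  timestamp: Date;           // when the interaction happened
  context?: string;          // optional contextual information (e.g. a course id)
}

// A semantic relationship between two entities, built up from activities.
interface Relationship {
  sourceId: string;
  targetId: string;
  type: string;              // from simple keywords up to ontology terms
  activities: Activity[];    // the interactions that produced this relationship
}

// The Artifact-Actor Network as a whole.
interface AAN {
  entities: Entity[];
  relationships: Relationship[];
}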

The AAN managed by the SSS is a flexible data model that makes it possible to capture the informal learning interactions that happen at the workplace, the relationships between learners and learning artifacts, and the context in which these interactions happen. However, the AAN is a very complex data structure that would be very difficult for an end user to manipulate directly in order to extract meaningful conclusions about the learning process. For that reason, the SSS services can expose the data contained in the AAN using different abstractions, each of which highlights specific aspects of the network. For example, an SSS service could provide a social network of all workers that have interacted with each other during a particular period concerning a particular topic.
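As an illustration of such an abstraction, the sketch below derives, from the logged activities, the set of actors who interacted with artifacts related to a given topic within a given time window; a full social network would then connect these actors through the artifacts they share. It reuses the hypothetical types from the previous sketch and is not the actual SSS service implementation.

// Sketch: derive one possible abstraction (actors active on a topic in a time
// window) from the hypothetical AAN types above.

function actorsActiveOnTopic(
  aan: AAN,
  topicId: string,            // id of a concept entity (e.g. a tag)
  from: Date,
  to: Date
): Set<string> {
  const actorIds = new Set<string>();
  for (const rel of aan.relationships) {
    for (const act of rel.activities) {
      const inWindow =
        act.timestamp.getTime() >= from.getTime() &&
        act.timestamp.getTime() <= to.getTime();
      if (inWindow && act.artifactIds.includes(topicId)) {
        actorIds.add(act.actorId);
      }
    }
  }
  return actorIds;
}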


                               Figure 1. Architecture of the Social Semantic Server.

Design and implementation of the SSS Dashboard
As Figure 1 represents, the SSS Dashboard is an application that gathers data from the SSS. Its aim is to exploit
the AAN created by the SSS in order to inform trainers and apprentices about the learning process. Hence, it
should offer some visualizations that allow trainers and apprentices to browse the data contained in the AAN in
a meaningful way. However, due to the informal nature of workplace learning it is very difficult to foresee
which are the relevant aspects to understand the learning process. For this reason, we based the design of the
SSS Dashboard on the three learning metaphors considered in (Paavola & Hakkarainen, 2015). According to
(Paavola & Hakkarainen, 2015), the three metaphors are complementary ways of understanding a learning
process and all of them should be taken into account.

The first one is the knowledge acquisition metaphor, which understands learning as a process of acquiring
knowledge by a learner; hence, this metaphor mainly focuses on individual learning or learner-oriented
information. The second is the participation metaphor, which understands learning as a process of participating
in cultural practices; consequently, this metaphor focuses on the social relationships of the learning community.
Finally, the knowledge creation metaphor focuses on the innovation processes and understands that the
relationships among learners are mediated by shared artifacts; hence, this metaphor deals with the collaborative
development of learning materials and conceptual artifacts.

Taking these metaphors into account, we proposed three visualizations for the SSS Dashboard:

Filter Events Visualization (see Figure 2). This visualization represents the list of events collected in the AAN. Each event is described by the date when it happened, the actor, the action, and the artifacts involved (either documents or concepts). It is also explicitly stated whether in this event the actor reused an artifact introduced by another actor. The events can be filtered by their actors, by the artifacts involved or by the actions performed.
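A minimal sketch of the kind of filtering described above, assuming the hypothetical Activity type from the earlier sketch; the filter and field names are illustrative, not the Dashboard's actual code.

// Sketch of event filtering by actor, artifact or action.
interface EventFilter {
  actorIds?: string[];
  artifactIds?: string[];
  actions?: string[];
}

function filterEvents(events: Activity[], f: EventFilter): Activity[] {
  return events.filter(e =>
    (!f.actorIds || f.actorIds.includes(e.actorId)) &&
    (!f.actions || f.actions.includes(e.action)) &&
    (!f.artifactIds || e.artifactIds.some(id => f.artifactIds!.includes(id)))
  );
}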

Social Network Visualization (see Figure 3). This visualization represents a social network of the actors registered in the AAN. The social network is based on the artifacts shared among the actors: two actors are connected if one of them introduced an artifact that was reused by the other. Each actor is represented by a circle; the bigger the circle, the more of the artifacts introduced by that actor were reused by others. When clicking on a circle the Dashboard shows the artifacts (both documents and tags) managed by the corresponding actor, and when clicking on a link between two circles, the Dashboard shows the artifacts shared between them. Finally, it is possible to filter the relationships represented in the social network by the type of event (e.g. ``only link the learners that reused artifacts introduced by others when tagging a learning resource'') and by the number of events involved (e.g. ``only represent the relationships where at least three events were involved'').
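The sketch below illustrates how such an artifact-mediated social network could be derived from reuse events: two actors are linked when one reused an artifact introduced by the other, and each node's size grows with the number of that actor's artifacts reused by others. The ReuseEvent shape and helper names are assumptions; this is a simplified reading of the description above, not the Dashboard's actual code.

// Sketch: build an artifact-mediated social network from reuse events.
interface ReuseEvent {
  introducingActorId: string;   // who introduced the artifact
  reusingActorId: string;       // who reused it
  artifactId: string;
}

interface SocialNetwork {
  nodes: Map<string, number>;   // actor id -> number of their artifacts reused by others
  links: Map<string, number>;   // "introducer->reuser" -> number of reuse events
}

function buildNetwork(events: ReuseEvent[], minEvents = 1): SocialNetwork {
  const nodes = new Map<string, number>();
  const links = new Map<string, number>();
  for (const e of events) {
    nodes.set(e.introducingActorId, (nodes.get(e.introducingActorId) ?? 0) + 1);
    if (!nodes.has(e.reusingActorId)) nodes.set(e.reusingActorId, 0);
    const key = `${e.introducingActorId}->${e.reusingActorId}`;
    links.set(key, (links.get(key) ?? 0) + 1);
  }
  // keep only relationships backed by at least `minEvents` events
  const filteredLinks = new Map([...links].filter(([, count]) => count >= minEvents));
  return { nodes, links: filteredLinks };
}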


 Figure 2. Filter Events Visualization in the SSS Dashboard.




Figure 3. Social Network Visualization in the SSS Dashboard.




  Figure 4. Tag Cloud Visualization in the SSS Dashboard.



Tag Cloud Visualization (see Figure 4). This visualization represents a tag cloud that includes all the conceptual artifacts registered in the AAN. When clicking on any of the tags, the documents and the actors related to it are shown. Furthermore, the relationships between the documents and the actors are also made explicit: when clicking on a document the actors related to it turn red, and when clicking on an actor all the documents related to that actor turn red.
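The aggregation behind such a tag cloud can be sketched as follows, again assuming the hypothetical Activity type: for each concept (tag) it counts occurrences and records the related documents and actors that are highlighted on click. The helper isTag is an assumption introduced for illustration.

// Sketch: aggregate activities into tag-cloud entries (count, related
// documents and actors). Not the actual SSS Dashboard code.
interface TagEntry {
  tag: string;
  count: number;              // drives the font size in the cloud
  documents: Set<string>;
  actors: Set<string>;
}

function buildTagCloud(
  events: Activity[],
  isTag: (artifactId: string) => boolean   // tells concepts (tags) apart from documents
): Map<string, TagEntry> {
  const cloud = new Map<string, TagEntry>();
  for (const e of events) {
    const tags = e.artifactIds.filter(isTag);
    const docs = e.artifactIds.filter(id => !isTag(id));
    for (const tag of tags) {
      const entry = cloud.get(tag) ??
        { tag, count: 0, documents: new Set<string>(), actors: new Set<string>() };
      entry.count += 1;
      docs.forEach(d => entry.documents.add(d));
      entry.actors.add(e.actorId);
      cloud.set(tag, entry);
    }
  }
  return cloud;
}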

Note that there is not a direct correspondence between the intended use of the three visualizations and the three
learning metaphors. For example, the Social Network Visualization represents a configurable social network,
but it also shows the artifacts managed by each user and the ones shared among users. Hence, this visualization
offers data abstractions that are intended to be used for all three metaphors.

The SSS Dashboard is developed in JavaScript. It exploits D3.js (http://d3js.org), a well-known open-source library used to create data visualizations in web browsers.
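As a rough illustration of how D3's data join can turn such data into graphics (here, one circle per actor, sized by a count, in the spirit of the Social Network Visualization's nodes), the sketch below shows a generic D3 pattern in TypeScript. It is not the Dashboard's actual rendering code and assumes a recent D3 version (v6 or later) with its TypeScript typings.

import * as d3 from "d3";

// Generic D3 data-join sketch: one circle per actor, radius derived from a
// reuse count. Layout and sizes are arbitrary for illustration.
interface ActorNode { id: string; reuseCount: number; }

function renderActors(nodes: ActorNode[]): void {
  const svg = d3.select("body").append("svg")
    .attr("width", 600)
    .attr("height", 200);

  svg.selectAll("circle")
    .data(nodes)
    .enter()
    .append("circle")
    .attr("cx", (_d, i) => 40 + i * 60)              // simple horizontal placement
    .attr("cy", 100)
    .attr("r", d => 5 + Math.sqrt(d.reuseCount) * 4) // bigger circle = more reuse
    .on("click", (_event, d) => {
      // a real dashboard would show the artifacts managed by this actor here
      console.log("clicked actor", d.id);
    });
}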

Experience using the SSS Dashboard

Study design
For this pilot study we collected data from a teacher training course in Estonia where participants had to reflect on how to introduce new technologies and pedagogical techniques into their classes, which they taught alongside their participation in the course. Ten Estonian teachers (``learners'' from now on) attended the course and were guided by a trainer. During the course they had to collect and share bookmarks that they stumbled upon during their daily informal learning and to write 10 individual blog posts in which they reflected on what they learned and cited the bookmarks collected. A Chrome plugin, called Bookmarker, and a WordPress extension, called Attacher, were developed to integrate these two applications into the SSS. Bookmarker allows learners to create bookmarks, tag them and submit them to the SSS. Attacher allows the learners to browse the bookmarks contained in the SSS from the blog editing interface, to access their corresponding URLs and to cite them in their blog posts. Furthermore, Attacher also registers in the SSS the blog posts written by the learners (see (Ruiz-Calleja et al., 2015) for more information about Attacher and Bookmarker).
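To make the integration concrete, the following sketch shows the kind of payload Bookmarker might send when a learner submits a tagged bookmark. The endpoint path and field names are purely hypothetical and introduced for illustration; the actual SSS API may differ.

// Hypothetical sketch of a bookmark submission from Bookmarker to the SSS.
// The endpoint and fields are invented for illustration only.
interface BookmarkSubmission {
  userId: string;
  url: string;
  title: string;
  tags: string[];
  createdAt: string;          // ISO timestamp
}

async function submitBookmark(sssBaseUrl: string, b: BookmarkSubmission): Promise<void> {
  const response = await fetch(`${sssBaseUrl}/bookmarks`, {   // hypothetical endpoint
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(b),
  });
  if (!response.ok) {
    throw new Error(`SSS rejected the bookmark: ${response.status}`);
  }
}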

Dashboard support | Trainer task | Learner task
Individual learning achievements | (T1) Find out which topics have been understood by each learner | (L1) Find out the topics you learned and the artifacts you used
Learning interests of the community | (T2) Find out which are the interesting topics for the learner group | (L2) Find out the most important topics for your colleagues
Social relationships and participation | (T3) Detect which learners shared information and which ones are isolated | (L3) Find out the learners you collaborated with
Cultural relevance of specific artifacts | (T4) Detect which interests two learners share in common | (L4) Find out topics interesting for you to collaborate with others
Dynamics of student learning situations | (T5) Identify artifacts that were introduced by a learner and reused by others | (L5) Find out learners that introduced documents that you reused
Emerging topics in the student community | (T6) Identify learning topics that are surprising or unexpected to you | (L6) Find some learning topics you were not aware of

Table 1. Evaluation tasks proposed to the trainer and the learners.

Note that this learning scenario occurs in a professional training course and includes some guidance from a trainer. However, it can still be considered an example of workplace learning where the learners reflect on problems that appear in their work context. It is also true that some topics were suggested by the trainer, but the learners were free to choose the main topics of their reflection.

Once the training course was over, one of the learners who attended the course and the trainer visualized the data collected by the SSS using the SSS Dashboard. The functionality of the SSS Dashboard was explained to them and then a set of six tasks was proposed for them to accomplish using the Dashboard (see Table 1). The six proposed tasks emerged from the learning procedures defined by the three learning metaphors considered in (Paavola & Hakkarainen, 2005). As the intention was to assess whether the users understood the learning process, these tasks were not as specific as the ones typically employed in software usability studies. We observed how they accomplished the proposed tasks and, once they had finished all of them, conducted a semi-structured interview to understand their opinions.

Results
Both participants could accomplish the six proposed tasks using the SSS Dashboard. Both of them agreed that they obtained relevant information to understand the learning process, and both understood the collaboration among learners as a process mediated by artifacts (trainer: ``as a trainer I would be worried because these two learners did not reuse artifacts from others and they do not share information to others''), which is an assumption of the knowledge creation metaphor. However, after completing the tasks they found the Dashboard useful for different purposes: the trainer found it interesting for understanding the learning process (trainer: ``an average trainer would use it to understand what is going on in the course''), while the learner restricted its potential use to identifying relevant learning artifacts or finding potential collaborators (learner: ``I would use the dashboard to find out if there are learners that use the same resources as me and to get an overview of the resources used by others''). This is not a surprising result, since in informal learning settings the utility of learning dashboards is typically higher for trainers than for learners (Verbert et al., 2013).

Interestingly, both users were already familiar with the data abstractions presented by the SSS Dashboard, since they had already used applications that presented social networks or tag clouds. This previous experience reduced their learning curve when using the SSS Dashboard. It was not difficult for them to manipulate the SSS Dashboard, nor to understand the data displayed. Their main problem was to understand how to configure the social network, since they had never manipulated a semantically-enriched social network before. It took them some time to understand the meaning of the possible relationships among the actors and to learn how to extract meaning from the different social networks represented. As a consequence, it took the learner some time to accomplish task L5. It is known that semantic data requires a significant learning effort from end users (Baeza-Yates & Ribeiro-Neto, 1999, chapter 10).

Although both participants could accomplish the six tasks, the trainer had some problems when doing T5 and the learner had some problems with L4. The reason is that the SSS Dashboard lacks an abstraction that aggregates the learning resources and somehow represents their relevance (trainer: ``probably a network of documents may help''). The Tag Cloud Visualization was understood as an aggregation of conceptual artifacts and was extensively used, especially by the trainer (tasks T1, T2, T4 and T6). Nonetheless, the trainer realized that some of the tags contained in the tag cloud were not learning topics, as was the case with “video”, which was understood as the medium of the learning resource. This tag was unexpected for the trainer, but she did not consider it something positive (trainer: ``if I detected it before I would ask the learners to use as tags learning topics, not types of media as “video”''). A technical solution for this problem would be to restrict the tags to a controlled vocabulary or even an ontology. The SSS supports this option, but in that case it would restrict the learners' possibility to introduce unexpected learning artifacts. The learner also used this visualization and found learning topics he was not aware of (learner: ``it is surprising how many tags are related to “motivation”. I am not sure why it is so interesting for other colleagues'').

An interesting difference between the behavior of the trainer and that of the learner when finding information for similar tasks (T2-L2 and T6-L6) is that the trainer seeks a global overview of the learning process while the learner does not. Instead, in tasks L2 and L6 he just looked up the activities done and the artifacts managed by some other learners he considered close to himself. For this purpose the learner mainly used the Filter Events Visualization, using the filters to visualize the events performed by certain users or related to certain tags.

Although it was not the main purpose of the evaluation, we also collected some feedback about the usability of the SSS Dashboard. For example, it was not natural for the users to find information by clicking on the links between the nodes of the social network. This functionality is typically not included in data visualization applications and represents a rather new affordance that requires some learning effort. The learner also suggested providing sorting mechanisms in the Filter Events Visualization, so that it would be easier for him to extract meaning out of the learning event list. Another suggestion was to explicitly state the quantity and kind of events done by each user related to a tag in the Tag Cloud Visualization.



Conclusions and future work
In the last few years many LA research projects have been proposed, but very few of them deal with the informal nature of workplace learning processes. We contribute to this research by proposing the SSS Dashboard, which allows learners and trainers to browse the data collected by the SSS from different perspectives on learning, thus providing a holistic picture of the learning process. A first prototype of the SSS Dashboard was developed and assessed with end users. The evaluation showed that both a trainer and a learner could manipulate the data contained in the SSS and use it to satisfactorily understand a learning process.

These are encouraging results, since they show the technical viability of collecting data from a workplace learning scenario, structuring it, and offering it back in such a way that the learning process can be understood by different stakeholders. It is remarkable that the participants in the study could quickly learn how to use the SSS Dashboard and how to extract meaningful information out of it, because they were already used to most of its data representations. However, there is still a price to pay for this approach: the SSS needs to provide a flexible data structure and an extensible business logic, which increases the technological complexity of the infrastructure.

After this exploratory research, we plan to use the feedback obtained to develop a new version of the SSS Dashboard. A new evaluation will entail using the Dashboard during the learning process, thus assessing its impact on that process. Additionally, we plan to exploit the data collected by the SSS by developing other LA applications, such as recommender systems for learning artifacts.

References
Baeza-Yates, R. & Ribeiro-Neto, B. (1999). Modern Information Retrieval. Addison-Wesley, Harlow, UK.
Biehl, J.T., Czerwinski, M., Smith, G. & Robertson, G.G. (2007). FASTDash: a visual dashboard for fostering
       awareness in software teams. In: Proceedings of the SIGCHI Conference on Human Factors in Computing
       Systems (pp. 1313-1322), San Jose, California, USA, ACM.
Dennerlein, S., Kowald, D., Lex, E., Theiler, D., Lacic, E., Ley, T., Tomberg, V. & Ruiz-Calleja, A. (2015). The
       Social Semantic Server: A Flexible Framework to Support Informal Learning at the Workplace. In:
       Proceedings of the 15th International Conference on Knowledge Technologies and Data-driven
       Business. Graz, Austria. ACM.
Eraut, M. (2004). Informal learning in the workplace. Studies in Continuing Education, 26(2), 247-273.
Klamma, R. (2013). Community learning analytics: challenges and opportunities. In: Proceedings of Advances
       in Web-Based Learning (pp. 284-293). Kenting, Taiwan, Springer.
Kump, B., Seifert, C., Beham, G., Lindstaedt, S. & Ley, T. (2012). Seeing What the System Thinks You Know -
       Visualizing Evidence in an Open Learner Model. In: Proceedings of the 2nd International Conference on
       Learning Analytics and Knowledge (pp. 153-157), Vancouver, Canada, ACM.
de Laat, M. & Schreurs, B. (2013). Visualizing informal professional development networks building a case for
       learning analytics in the workplace. American Behavioral Scientist, 57(10), 1421-1438.
Ley, T., Cook, J., Dennerlein, S., Kravcik, M., Kunzmann, C., Pata, K., Purma, J., Sandars, J., Santos, P.,
       Schmidt, A., Al-Smadi, M. & Trattner, C. (2014). Scaling informal learning at the workplace: A model
       and four designs from a large-scale design-based research effort. British Journal of Educational
       Technology, 45(6), 1036-1048.
Nistor, N., Derntl, M. & Klamma, R. (2015). Learning Analytics: Trends and Issues of the Empirical Research
       of the Years 2011-2014. In: Proceedings of the 10th European Conference on Technology Enhanced
       Learning (pp. 453-459), Toledo, Spain, Springer.
Paavola, S. & Hakkarainen, K. (2005). The knowledge creation metaphor: an emergent epistemological
       approach to learning. Science & Education, 14(6), 535-557.
Reinhardt, W., Moi, M. & Varlemann, T. (2009). Artefact-Actor-Networks as tie between social networks and
       artefact networks. In: Proceedings of the 5th International Conference on Collaborative Computing:
       Networking, Applications and Worksharing (pp. 1-10). Washington, USA, IEEE.
Ruiz-Calleja, A., Dennerlein, S., Tomberg, V., Ley, T., Theiler, D. & Lex, E. (2015). Integrating data across
       workplace learning applications with a social semantic infrastructure. In: Proceedings of the 14th
       International Conference on Web-based Learning (pp. 208-217), Guangzhou, China, Springer.



Siadaty, M., Gasevic, D., Jovanovic, J., Milikic, N., Jeremic, Z., Ali, L., Giljanovic, A. & Hatala, M. (2012).
       Learn-B: A social analytics-enabled tool for self-regulated work-place learning. In: Proceedings of the
       2nd International Conference on Learning Analytics and Knowledge (pp. 115-119), Vancouver, Canada,
       ACM.
Verbert, K., Duval, E., Klerkx, J., Govaerts, S. & Santos, J. (2013). Learning analytics dashboard applications.
       American Behavioral Scientist, 57(10), 1500-1509.
Yun, R., Aziz, A., Lasternas, B., Zhang, C., Loftness, V., Scupelli, P., Mo, Y., Zhao, J. & Wilberforce, N.
       (2014). The design and evaluation of intelligent energy dashboard for sustainability in the workplace. In:
       Design, User Experience, and Usability. User Experience Design for Everyday Life Applications and
       Services (pp. 605-615). Volume 8519 of Lecture Notes in Computer Science. Springer International
       Publishing.

Acknowledgments
This research has been funded by the FP7 ICT Workprogramme of the European Community: ``Learning Layers
- Scaling up Technologies for Informal Learning in SME Clusters'' (grant no. 318209). The authors also want to
thank the trainer and the learner who participated in the evaluation.



