         Exploring Inquiry-Based Learning Analytics through
                         Interactive Surfaces

                                   Sven Charleer, Joris Klerkx, and Erik Duval
                                                 Dept. of Computer Science
                                                         KU Leuven
                                                   Celestijnenlaan 200A
                                                   3001 Leuven, Belgium
           sven.charleer@kuleuven.be, joris.klerkx@kuleuven.be, erik.duval@kuleuven.be

ABSTRACT
Learning Analytics is about collecting traces that learners leave behind and using those traces to improve learning. Dashboard applications can visualize these traces to present learners and teachers with useful information. The work in this paper is based on traces from an inquiry-based learning (IBL) environment, where learners create hypotheses, discuss findings and collect data in the field using mobile devices. We present a work-in-progress that enables teachers and learners to gather around an interactive tabletop to explore the abundance of learning traces an IBL environment generates and to collaboratively make sense of them, so as to facilitate insights.

Figure 1: Students gathering around an interactive tabletop, exploring learner traces of a Human-Computer Interaction course.

Categories and Subject Descriptors
H.5.2 [Information interfaces and presentation]: User Interfaces; H.5.m [Information interfaces and presentation]: Miscellaneous

General Terms
Design, Human Factors, Experimentation

Keywords
interactive surfaces, learning analytics, learning dashboards, collaboration, reflection, awareness, information visualization, sense-making, inquiry-based learning

1. INTRODUCTION
Similar to the Quantified Self movement (http://quantifiedself.com), which focuses on collecting user traces and using the data for self-improvement, Learning Analytics can help to understand and optimize (human) learning and the environments in which it occurs [12]. However, capturing learner traces can generate an abundance of data, especially in the context of Massive Open Online Courses (MOOCs) that involve tens of thousands of learners whose activities can be tracked in detail. Reflecting on those traces can help learners understand the optimal setting and context in which they learn best. Teachers can, among other things, use the same traces to find out where learners struggle with what content or activity. Dashboards help present this abundance of data in a way that supports both teachers and learners [14].

Teachers show interest in using dashboards collaboratively with learners to discuss their activities, progress and results [3]. Interactive tabletops can facilitate and capture collaboration activities in the classroom [8]. In previous work [2] we explored this platform to visualize learning analytics data (see Figure 1), using the affordances (e.g. large display size, multi-user interaction) of interactive tabletops to create a collaborative sense-making environment [6].

This paper describes our work-in-progress on an interactive tabletop visualization for learner traces that are generated by students in an inquiry-based learning (IBL) environment. Section 2 briefly presents the learning environment and the data it generates. Section 3 discusses development details, and Section 4 explains the design of the tabletop visualization. We discuss our findings and future work in Section 5.


2. IBL LEARNING TRACES
Contrary to the traditional passive role in a classroom, learners in Inquiry-Based Learning (IBL) assume an active role as explorer and scientist, with a focus on learning "how to learn". Teachers try to stimulate learners to pose questions and create hypotheses regarding a specific topic, perform independent investigations, gather data to confirm and discuss their findings, and generate conclusions. Six phases of learning activities are often discerned in an IBL process model: problem identification, operationalization, data collection, data analysis, interpretation and communication [9]. As each learner can follow their own route through the IBL process, the sequence and length of these phases differ among students. Individual and collaborative reflection is furthermore vital in every phase. Indeed, "even at the very beginning when students need to develop a question or a hypothesis, they need to reflect upon the question, and evaluate it before they decide to proceed. They also need to reflect while deciding what kind of data they need to collect, how to proceed to data analysis, and how to communicate their results" [10].

Figure 2: weSPOT Inquiry Environment, presenting 6 phases and 2 active widgets in phase 5 (Interpretation).

In the weSPOT Inquiry Environment (http://inquiry.wespot.net/), a teacher can set up an inquiry regarding a specific research topic. For each phase, learners can use a set of widgets (see Figure 2) to e.g. create hypotheses, ask questions, rate and comment on activities, generate mind-maps, etc. By taking pictures, recording videos, and entering text and measurement data through a mobile application, students collect data in the field to support their hypotheses. All activities in the learning environment are logged, stored in a data store, and exposed as learning traces through REST services. Teachers and students can access the learning analytics data of a specific inquiry through a web-based dashboard (see Figure 3) integrated in the weSPOT Inquiry Environment, and through the tabletop application.

Figure 3: A web-based dashboard for teachers and students providing access to learning analytics data per inquiry.
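The logged activities are exposed as learning traces through REST services and fetched by our Node.js back-end before the dashboards are rendered (see Section 3). The following is a minimal, illustrative sketch of that fetching step; the endpoint URL, query parameter and response format are hypothetical placeholders, not the documented weSPOT API.

    // Illustrative sketch only: the endpoint and field names below are
    // hypothetical placeholders, not the actual weSPOT REST interface.
    var http = require('http');

    var TRACES_ENDPOINT = 'http://example.org/wespot/traces';  // placeholder URL

    function fetchTraces(inquiryId, callback) {
      http.get(TRACES_ENDPOINT + '?inquiry=' + inquiryId, function(res) {
        var body = '';
        res.on('data', function(chunk) { body += chunk; });
        res.on('end', function() { callback(null, JSON.parse(body)); });
      }).on('error', callback);
    }

    // Usage: load all traces of one inquiry before building the visualization.
    fetchTraces(42, function(err, traces) {
      if (err) { return console.error(err); }
      console.log(traces.length + ' learning traces loaded');
    });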
3. ITERATIVE DEVELOPMENT
Following a user-centered rapid prototyping approach, we started from paper prototypes to gather initial feedback on early ideas, and gradually developed more functional digital prototypes, which have been deployed and evaluated with learners regarding usability.

Web technologies (HTML, CSS3 and JavaScript) facilitate quick prototype development and allow us to deploy on most school infrastructures. Interaction is supported through both native browser mouse/touch events and the npTuioClient plug-in (https://github.com/fajran/npTuioClient), allowing the application to run on interactive tabletops, interactive white-boards, tablets, phones and desktop computers. Our interactive tabletop setup currently facilitates up to 5 users.

A centralized filter system using Crossfilter (http://square.github.io/crossfilter/) and a modular, event-based architecture facilitate easy creation of new widgets. D3.js (http://d3js.org) and Processing.js (http://processingjs.org) help visualize the data. A Node.js (http://nodejs.org) back-end generates the web pages while fetching the learning traces from the weSPOT environment.
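As a rough sketch of how such a centralized, event-based filter system can be wired up with Crossfilter: the trace field names (student, phase) and the widget callbacks below are assumptions for illustration, not the project's actual code.

    // Minimal sketch of a centralized filter system around Crossfilter.
    var crossfilter = require('crossfilter');   // in the browser: a global from a <script> tag

    var cf        = crossfilter(traces);        // `traces` fetched from the back-end
    var byStudent = cf.dimension(function(d) { return d.student; });
    var byPhase   = cf.dimension(function(d) { return d.phase; });

    // Tiny event bus: every widget registers a redraw callback once.
    var listeners = [];
    function onFilterChange(redraw) { listeners.push(redraw); }
    function notify() {
      var visible = byStudent.top(Infinity);    // records passing all active filters
      listeners.forEach(function(redraw) { redraw(visible); });
    }

    // Filter widgets only touch the dimensions and then fire the event.
    function selectPhase(phase)    { byPhase.filterExact(phase); notify(); }
    function selectStudents(names) {
      byStudent.filter(function(s) { return names.indexOf(s) >= 0; });
      notify();
    }

    // Example widget: the overview simply re-renders the filtered activities.
    onFilterChange(function(visible) { console.log(visible.length + ' activities shown'); });
    selectPhase(5);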
4. DESIGN
Flexible visual analysis tools must provide appropriate controls for specifying the data and views of interest [5]. Filtering out unrelated information to focus on relevant items is the key control in our learning dashboards, due to the abundance of traces learners leave behind. Previous work [3] has shown that there is also a need for context and content to complement the visualized data. We therefore follow the visual information-seeking mantra of "Overview first, zoom and filter, then details-on-demand" [11]: our tabletop visualization presents users with a coordinated set of widgets which contain (i) a complete overview of all activities (Figure 4.A), (ii) data filters (Figure 4.B/D) and (iii) the content view (Figure 4.C).

Figure 4: A. The overview of all activities. B. The list of students participating in the inquiry (with student filter options). C. The content behind selected activities. D. Phase filter options.
4.1 Visualizing IBL Traces
The visualization displays a time-line per activity thread (see Figure 5). For instance, the creation of a hypothesis by a learner is followed by every comment on, rating of, and edit of that hypothesis. Squares represent create and edit events, while circles represent comment events. Stars represent rating activities; triangles are data collection events. Activities within a single thread are connected by a horizontal line. This enables teachers and learners to see the evolution of an activity thread, the comments that may have impacted edits of e.g. the original hypothesis, and the rating trend.

Figure 5: Time-lines per activity thread. The highlighted thread consists of a hypothesis creation followed by 2 edits, a user rating and 2 comments.
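This shape encoding can be expressed compactly with D3's symbol generators. The sketch below uses the D3 v4+ API and assumed trace fields (type, time, thread) and pre-existing variables (traces, threads, svg); it illustrates the encoding described above rather than reproducing the paper's own D3.js/Processing.js code.

    // Sketch of the shape encoding for activity events (D3 v4+; field names assumed).
    var shapeFor = {
      create:  d3.symbolSquare,    // create and edit events are squares
      edit:    d3.symbolSquare,
      comment: d3.symbolCircle,    // comments are circles
      rating:  d3.symbolStar,      // ratings are stars
      data:    d3.symbolTriangle   // data collection events are triangles
    };

    // One horizontal time-line per activity thread, positioned on a shared time axis.
    var x = d3.scaleTime()
              .domain(d3.extent(traces, function(d) { return d.time; }))
              .range([0, 900]);
    var y = d3.scaleBand().domain(threads).range([0, 400]);

    svg.selectAll('path.activity')
       .data(traces)
       .enter().append('path')
       .attr('class', 'activity')
       .attr('d', d3.symbol().size(80).type(function(d) { return shapeFor[d.type]; }))
       .attr('transform', function(d) {
         return 'translate(' + x(d.time) + ',' + y(d.thread) + ')';
       });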

Activities in other activity threads can enrich the context of a specific thread. A discussion in one thread might influence the creation of a new hypothesis, or an edit of an existing one. Therefore, every activity is positioned relative in time to all other activities displayed, allowing users to backtrack through time across multiple threads at once (see Figure 6.A).

The IBL phases (see Section 2) in which an activity occurs are indicated by different background colors, matching the colors used in the web dashboard (see Figure 4). The visualization can be panned and zoomed using standard multi-touch interactions.

Figure 6: A. The blue path indicates the steps taken by a student. In this case, the student learned something which he then rated; this then led to the creation of a new hypothesis. B. Visualization limited to a group of 2 students. Individual paths are highlighted. The student indicated by the yellow line has been more active with both commenting and rating activities. The student has also been more active in phase 6 (purple).

4.2 Filtering the Data
Using the filter widgets, users can focus on activities by drilling down on one or more phases (see Figure 4.D), or on one or more learners (see Figure 4.B). When multiple learners are selected (e.g. a group that works together), the path of each learner can be individually highlighted (see Figure 6.B), in order to provide an overview of work distribution. This can help teachers find struggling learners in a group. It can also help learners become aware of uneven work distribution and redivide the work. The path can also shed light on the methodology a learner uses to reach a certain result (e.g. Figure 4.A).

The interface of Figure 4 is limited to one person driving the navigation and only supports global filters. To fully use the affordances of the tabletop and create a collaborative sense-making environment, the application must support both individual and group work [4]. Figure 7 shows an early prototype that presents 5 participants with individual filtering tools. Global filters result in more tightly coupled collaboration [13], but can disturb individual work: one participant's filter activity could remove data from the visualization another participant is working with. To allow participants to simultaneously filter the data presented on the tabletop, we use the multivariate attributes of a glyph-based visualization [1]. The filter result of each participant is highlighted in the color corresponding to that participant's filtering tools.

Figure 7: A prototype with 5 filter "drop zones". Dropping a filter value into the blue (top-left) drop zone highlights data points matching the filter result by coloring the top-left part of the glyph.
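As a rough illustration of this per-participant highlighting, the sketch below colors one segment of each data glyph per participant whose filter matches the activity; the participant colors, filter predicates and the drawGlyphSegment helper are assumptions made for this example, not the prototype's actual code.

    // Sketch: per-participant filter results highlighted on a multivariate glyph.
    // Each of up to 5 participants owns a color and a predicate over the traces.
    var participants = [
      { color: 'blue',   filter: function(d) { return d.phase === 5; } },
      { color: 'yellow', filter: function(d) { return d.student === 'An'; } }
      // ... up to 5 participants, one per drop zone
    ];

    // One boolean per participant: does this activity match their filter?
    function glyphParts(trace) {
      return participants.map(function(p) {
        return { color: p.color, match: p.filter(trace) };
      });
    }

    // Rendering idea: tint the k-th segment of the glyph in the participant's
    // color when their filter matches, leave it grey otherwise.
    traces.forEach(function(trace) {
      glyphParts(trace).forEach(function(part, k) {
        drawGlyphSegment(trace, k, part.match ? part.color : '#ccc');  // assumed helper
      });
    });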
5. CONCLUSION AND FUTURE WORK
Our interactive visualization will be deployed in multiple secondary school pilots (http://portal.ou.nl/web/wespot/pilots) across Europe, both on interactive tabletop devices and on interactive white-boards. Questionnaires regarding usefulness for both teachers and students will help evaluate our design choices, while interaction logging and video recordings of collaboration sessions can provide insights into whether the application is useful as a sense-making environment.

Our application lets users retrace the individual steps taken by (groups of) learners, i.e. they can collaboratively (i) reflect on the rationale of a learner's decisions and actions, (ii) (re-)examine past explanations and conclusions, and (iii) (re-)evaluate past evidence data. Students can learn from peers' activities through exploration, discovery and discussion. The application can also be used for evaluation purposes, allowing (groups of) learners and teachers to iterate together over every step performed from hypothesis to conclusion. Pilot data can also help IBL researchers with the discussion and refinement of the IBL model.

Enabling multiple learners and teachers to interact with the visualization simultaneously remains the biggest challenge. We shall further explore the possibilities of glyph-based visualizations to provide unobtrusive global filters, use user position tracking through technology such as Kinect to support the dynamic nature of collaborators around a tabletop, and explore data lenses (e.g. GeoLens [15, 7]) to facilitate individual exploration of the data on a shared visualization.

6. ACKNOWLEDGMENTS
The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement No 318499 (weSPOT project).

7. REFERENCES
[1] R. Borgo, J. Kehrer, D. H. S. Chung, E. Maguire, R. S. Laramee, H. Hauser, M. Ward, and M. Chen. Glyph-based visualization: Foundations, design guidelines, techniques and applications. In M. Sbert and L. Szirmay-Kalos, editors, Eurographics 2013 - State of the Art Reports. The Eurographics Association, 2012.
[2] S. Charleer, J. Klerkx, J. L. Santos, and E. Duval. Improving awareness and reflection through collaborative, interactive visualizations of badges. In M. Kravcik, B. R. Krogstie, A. Moore, V. Pammer, L. Pannese, M. Prilla, W. Reinhardt, and T. D. Ullmann, editors, ARTEL@EC-TEL, volume 1103 of CEUR Workshop Proceedings, pages 69-81. CEUR-WS.org, 2013.
[3] S. Charleer, J. Santos, J. Klerkx, and E. Duval. Improving teacher awareness through activity, badge and content visualizations. In Y. Cao, T. Valjataga, J. K. Tang, H. Leung, and M. Laanpere, editors, New Horizons in Web Based Learning, Lecture Notes in Computer Science, pages 143-152. Springer International Publishing, 2014.
[4] C. Gutwin and S. Greenberg. Design for individuals, design for groups: Tradeoffs between power and workspace awareness. In Proceedings of the 1998 ACM Conference on Computer Supported Cooperative Work, CSCW '98, pages 207-216, New York, NY, USA, 1998. ACM.
[5] J. Heer and B. Shneiderman. Interactive dynamics for visual analysis. Queue, 10(2):30:30-30:55, Feb. 2012.
[6] P. Isenberg, N. Elmqvist, J. Scholtz, D. Cernea, K.-L. Ma, and H. Hagen. Collaborative visualization: Definition, challenges, and research agenda. Information Visualization, 10(4):310-326, 2011.
[7] F. Marinho Rodrigues, T. Seyed, F. Maurer, and S. Carpendale. Bancada: Using mobile zoomable lenses for geospatial exploration. In Proceedings of the Ninth ACM International Conference on Interactive Tabletops and Surfaces, ITS '14, pages 409-414, New York, NY, USA, 2014. ACM.
[8] R. Martinez-Maldonado, K. Yacef, Y. Dimitriadis, M. Edbauer, and J. Kay. MTClassroom and MTDashboard: Supporting analysis of teacher attention in an orchestrated multi-tabletop classroom. In International Conference on Computer-Supported Collaborative Learning, CSCL 2013, pages 119-128, 2013.
[9] A. Mikroyannidis, A. Okada, P. Scott, E. Rusman, M. Specht, K. Stefanov, P. Boytchev, A. Protopsaltis, P. Held, S. Hetzner, K. Kikis-Papadakis, and F. Chaimala. weSPOT: A personal and social approach to inquiry-based learning. Journal of Universal Computer Science, 19(14):2093-2111, 2013.
[10] A. Protopsaltis, P. Seitlinger, F. Chaimala, O. Firssova, S. Hetzner, K. Kikis-Papadakis, and P. Boytchev. Working environment with social and personal open tools for inquiry based learning: Pedagogic and diagnostic frameworks. The International Journal of Science, Mathematics and Technology Learning, 20(4):51-63, 2014.
[11] B. Shneiderman. The eyes have it: A task by data type taxonomy for information visualizations. In IEEE Symposium on Visual Languages, pages 336-343. IEEE, 1996.
[12] G. Siemens and P. Long. Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46:30-32, Boulder, CO, USA, 2011.
[13] A. Tang, M. Tory, B. Po, P. Neumann, and S. Carpendale. Collaborative coupling over tabletop displays. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '06, pages 1181-1190, New York, NY, USA, 2006. ACM.
[14] K. Verbert, S. Govaerts, E. Duval, J. Santos, F. Van Assche, G. Parra, and J. Klerkx. Learning dashboards: An overview and future research opportunities. Personal and Ubiquitous Computing, 18(6):1499-1514, 2014.
[15] U. von Zadow, F. Daiber, J. Schöning, and A. Krüger. GeoLens: Multi-user interaction with rich geographic information. In Proc. DEXIS 2011, pages 16-19, 2012.