Learning Analytics for Learner Awareness in Remote Laboratories Dedicated to Computer Education

Rémi Venant (remi.venant@irit.fr), Philippe Vidal (philippe.vidal@irit.fr), Julien Broisin (julien.broisin@irit.fr)
Université Toulouse III Paul Sabatier
+33 561 558 296 / +33 561 557 402 / +33 561 557 402

Copyright © 2016 for the individual papers by the papers' authors. Copying permitted only for private and academic purposes. This volume is published and copyrighted by its editors. LAL 2016 workshop at LAK '16, April 26, 2016, Edinburgh, Scotland.

ABSTRACT
This paper addresses learning analytics for learner awareness in remote laboratories. The objectives we identified to provide self- and social awareness while learners are practicing in their virtual learning environment are threefold: (1) the definition of a performance metric that requires no assessment tests, (2) the tracking of data to infer that metric in real time, and (3) the visualization of the performance metric to provide learners with awareness without increasing their cognitive load. To meet these needs, we propose a metric suited to our context of computer education, a generic tracking framework, and visualization tools. All of these proposals have been implemented in Lab4CE, a remote laboratory management system for computer education currently in use within our university.

CCS Concepts
• Human-centered computing~Visual analytics • Applied computing~Interactive learning environments • Applied computing~Collaborative learning • Applied computing~Distance learning • Information systems~Data analytics • Social and professional topics~Student assessment

Keywords
Remote laboratory, awareness, learning analytics, computer education

1. INTRODUCTION
Remote laboratories rely on inquiry-based learning that leads, among other outcomes, to knowledge building, deep learning and reflection [9]. In Technology Enhanced Learning (TEL), self-awareness appears to be an important factor in supporting these outcomes [11], and has thus become the focus of several studies in computer-mediated research [23, 24, 38]. Moreover, inquiry learning relies on social constructivism, a theory that defines learning as shared and constructed in the context of relationships with others [35]. Social awareness is also an important factor in learners' development [13] that should be provided to learners [2, 5]. To support learners' cognition, remote laboratory environments should therefore provide both self- and social awareness. Learning analytics (LA) processes that track, analyze and report learners' data are able to support such awareness [33] and to improve learning [34]. Efforts have already been made to apply learning analytics tools to remote laboratories [27, 30]. However, none of them addresses both issues together: awareness for learners and learning analytics for remote laboratories.

We address in this article the question of providing learners with a remote laboratory featuring self- and social awareness through learning analytics. The presented work was achieved in the context of Lab4CE, a remote laboratory for computer education [4]. In the next section, we review recent works on learning analytics for learner awareness. The third section deals with the measurement of learners' performance during practical sessions, a critical piece of information that must be provided to users [1] and that cannot be retrieved from traditional assessment techniques such as quizzes or evaluations of learning paths; indeed, practical activity in a remote laboratory happens before any achievement test. We propose a performance metric that fits our pedagogical and technical contexts, and a client-side XAPI framework to track learners' activity and infer metrics. Based on this metric, we then introduce two learning analytics tools that report on actions carried out by learners, as well as on their level of performance, while they are practicing in the remote laboratory.

2. LEARNING ANALYTICS FOR LEARNER AWARENESS
Learning analytics was defined during the 1st International Conference on Learning Analytics in 2011 as "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs". It targets two main goals: recommendation and visualization [10]. The former relies on the system to carry out the analysis and make decisions, whereas visualization, among other goals [34], can provide users with awareness about the learning environment they are involved in, and thus lets them make their own decisions. Hence, visualizations can be used as cognitive tools for learners or teachers [33].

SocialLearn is a social media space designed to support social learning [12]. It exploits data from different sources (i.e., the SocialLearn website, the Open University, and social media sites used by learners such as LinkedIn or Twitter) to provide visualization and recommendation through different analytics. SocialLearn is built on a three-tier architecture composed of (1) an identity server that supplies data from the sources to (2) a recommendation engine, which processes data for further analysis, and (3) a delivery channel. The delivery channel includes the SocialLearn website, a browser toolbar and various applications embedded into external websites. These delivery systems act as both providers (i.e., they send data to the identity server) and consumers (i.e., they expose to users recommendations and/or visualizations of information supplied by the recommendation engine). Many individual and group analytics are proposed through an LA dashboard. For instance, the view "Learning dialogue analytics" summarizes the learner's usage of different types of dialogue based on sociocultural discourse analysis [21], while the view "My learning network" exposes the different interactions the learner had with others.

In a similar approach, Govaerts et al. [14] proposed the Student Activity Meter (SAM) to support awareness for teachers and learners through visualizations based on various metrics. SAM analyses data expressed in the Contextualized Attention Metadata (CAM) format and presents them in a detailed dashboard that allows dimension drilling and filtering: it acts as a multi-dimensional analysis tool such as ClickView©.

FORGE is a European initiative for online learning and experimentation via interactive learning resources. It proposes learning analytics to support awareness and reflection for teachers and learners [23]. Widgets inside e-books and online courses track learners' interactions with the course materials and with each other using the Tin Can API (XAPI). Learning analytics aim at evaluating learners (besides surveys and questionnaires) and at providing them (as well as teachers) with awareness about the results of the evaluation process. As in SocialLearn or SAM, awareness is supported by visualizations within dashboards offered by the Learning Locker LRS (Learning Record Store).

Rich dashboards offering various visualization and exploration features appear to be a common exploitation of learning analytics for awareness. Visualization of learners' performance is also frequent, as all of these tools offer analytics on performance metrics. However, the context of a practical session in a remote laboratory raises several issues. Awareness about learners' performance during a session cannot be achieved through traditional performance measurements based on assessments, as in FORGE or other studies that rely on achievement tests [18, 29]. Since a practical session mostly happens before any assessment, the performance metric needs to be computed live, based on the actions performed by learners during the session. For instance, the metric based on the number of programming errors a student made, proposed in [14], could be used in a programming remote laboratory.

Awareness in the context of a practical learning situation also requires synchronization between learners' actions and the information returned to them: it should reflect what just happened, which implies near real-time processing and automatic updates of visualizations. In SAM, data seem to be loaded asynchronously (i.e., on demand); SocialLearn does not clearly state whether its architecture can meet such a real-time requirement. Learning Locker LRS, used in FORGE, addresses this issue through an architecture based on Web Socket; other works also adopt this learning analytics infrastructure to return immediate feedback to learners [15].

Finally, the use of dashboards such as SocialLearn, SAM or the FORGE LRS forces learners to switch between the visualization GUI and the interface used to operate the laboratory. Simpler visualizations, embedded as lightweight components as in the ROLE context [31], would prevent such additional cognitive load. Widgets can be easily integrated into existing applications and provide portability and reusability of visualizations [36]. However, adding widgets to an existing interface can lead to a busy screen design that may become irrelevant and cause learning problems [6]. Kirsh [17] thus suggests avoiding the split attention of learners that happens when different components on a screen require multitasking to integrate disparate sources of information. Also, the choice of the visualization itself matters, since it depends on the targeted goal [10]: we must avoid raising learners' cognitive load with visualizations that require time and cognition to be understood and analyzed.

These approaches cannot be reused directly in our context. Unlike them, we must propose fitted metrics and lightweight visualizations usable during a practical session. However, we do follow a web architecture based on full-duplex communication such as Web Socket, which addresses the real-time requirement.
3. LEARNER PERFORMANCE IN THE LAB4CE ENVIRONMENT
We propose here a metric relying on the actions performed by learners during a practical session. We also suggest a trace model to record this metric (among other data) in accordance with the XAPI specification. Finally, we describe a tracking framework that addresses the technical requirements identified previously to collect these traces and compute the level of performance.

3.1 Pedagogical Context
Our field of education is remote practical learning in Computer Science Education (CSE). Practical activities, referred to as "any learning and teaching activity that engages learners in manipulating and analyzing real and physical objects", involve learners and teachers in a laboratory (lab), a spatial and temporal space hosting devices used for experiments [3]. CSE, within that context, presents a large variety of experiments depending on the field of teaching. For instance, programming language learning requires at least a text editor and a compiler, while database learning needs at least a database server and a client to access it. However, the resources for practical experiments are all computers configured for a given experiment (i.e., providing the hardware and software required to achieve the pedagogical objectives), and interactions between learners and these resources are based either on a CLI (Command Line Interface) or a GUI (Graphical User Interface). The evaluation of these interactions can be used to study how learners cope with the system(s) they must manipulate to perform the experiment. We therefore propose to focus on these interactions to infer a performance metric.

3.2 Technological Support: Lab4CE
3.2.1 Functional Presentation
Lab4CE is a remote laboratory environment for computer education standing on existing virtualization tools to benefit from their advanced computational features, and integrating original scaffolding tools and services to improve the user experience and to increase students' engagement in practical activities.

Lab4CE offers different features through several web applications. The design of an experiment (i.e., the different machines, their configuration and their topology) is achieved through a WYSIWYG interface that offers the opportunity to draw the experiment and to configure each of its components (e.g., adding software to, or choosing the operating system of, a given machine). For all learners, a set of virtual machines and networks properly configured according to the experiment's design is automatically created in a cloud manager as soon as they start their practical session.

Figure 1. The Lab4CE Rich Learning Interface (RLI)

Learners can then control (i.e., start, shut down or put in sleep mode) their machine(s) and send instructions through a web terminal included in a Rich Learning Interface (RLI), shown in Figure 1. In addition to these control capabilities, the RLI integrates communication functions through an instant messaging system: all participants in an experiment are able to talk to each other using the public chat room, or a dedicated private room restricted to the participants of the same practical session.

Moreover, a collaborative system lets learners invite each other to their practical session so they can work together on the same virtual resources. Learners are able to work together as if they were side by side in a hands-on laboratory, thanks to a set of awareness tools already implemented. For instance, learners working on the same machine can see each other's terminal thanks to a terminal streaming system.

3.2.2 Architecture and Data Channels
Lab4CE relies on three scalable, distributed components. The laboratory layer, built on a cloud manager, hosts the virtual resources as well as some pedagogical objects (e.g., experiment descriptions, user accounts), and provides low-level management (e.g., hosting, planning, authorization). The middleware layer includes two distinct and isolated components. The pedagogical middleware offers a pedagogical REST interface and acts as a broker between the cloud manager and the Rich Learning Interface to manage experiments, practical sessions, resources and users. It also embeds Web Socket endpoints for real-time streaming requirements (e.g., terminal streaming). The web-based terminal middleware acts as an SSH (Secure Shell, the protocol used to interact with a remote machine) proxy between the virtual machines and the web terminal interface (see Figure 1). Finally, the learning layer exposes different web-based end-user interfaces, such as a web terminal based on ShellInABox [25].

Figure 2 represents the interactions that happen between these three layers and users, and focuses on the three channels transporting data. The blue channel carries the endogenous Lab4CE data, such as collaborative invitation events or instant messages. The green channel carries pedagogical and resource management data: information on experiments, practical sessions, resources and users, exchanged between the cloud manager and the RLI through the middleware. For instance, when a user starts a machine, the RLI sends an order to change the state of the matching resource to the cloud manager, which in return sends the new state back. Finally, the red channel contains the data exchanged between users and resources (e.g., communications between the web terminals of the RLI and the virtual machines).

Figure 2. Architecture and data streams of Lab4CE

3.3 Rightness of Instruction as a Performance Metric
Within the last (red) channel, data contain, among other things, the instructions sent by learners to the resources, as well as the responses returned by the remote system. Instructions provide an interesting granularity. An instruction is a textual message composed of a command name that may be followed by arguments and/or options (e.g., "ls -l", where "ls" is the command name and "-l" is an option). Being the basis of the experiment's path, the sequence of instructions reflects the learner's progression within the experiment. Thus, instructions can be analyzed to provide relevant information at different scales: a single instruction gives information about the performance on the command itself, whereas a whole sequence of instructions reflects how the learner achieved the objective of the practical session.

Instructions also present another advantage. When executed, they can be automatically evaluated as right or wrong: the response returned by the resource gives information that can be used to infer the state of execution. In the rest of the paper, we refer to that state as the technical rightness. An instruction that is technically right has been properly executed on the resource. However, this rightness has to be contextualized to the activity: a command execution might be successful, yet wrong with respect to the goal the learner is trying to achieve, or irrelevant for that goal. That "pedagogical" view of rightness can be used to evaluate the learner's progression through their learning path. It can be defined as the result of the interaction between the instruction, the resource it has been executed on, and the current position of the learner in the learning path. In this article, however, the definition is restricted to the technical rightness. Indeed, computing the pedagogical rightness requires solutions to the practical activities to be produced, either automatically or manually by tutors and teachers; this process goes beyond the scope of this article and will be the focus of future research. Nevertheless, this information is taken into account in our trace model, presented in the next section.
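To make the notion of technical rightness concrete, the sketch below illustrates one way such an evaluation could be implemented. It is a minimal example under our own assumptions, not the actual Lab4CE code: the `Instruction` type, the `evaluateRightness` function and the exit-status heuristic are ours, and they assume the SSH proxy exposes the command's exit status and textual response.

```typescript
// Minimal sketch (not the actual Lab4CE implementation) of how the
// technical rightness of an instruction could be inferred from the
// response returned by the remote resource.

interface Instruction {
  command: string;        // e.g., "rm"
  args: string[];         // e.g., ["-v", "myfile"]
  response: string;       // textual output returned by the resource
  exitStatus?: number;    // exit code, if the SSH proxy exposes it
}

// Returns 1 when the instruction is evaluated as technically right,
// 0 otherwise (the score convention used by the enriching engine).
function evaluateRightness(instr: Instruction): 0 | 1 {
  // When the exit status is available, a zero status is the usual
  // Unix convention for successful execution.
  if (instr.exitStatus !== undefined) {
    return instr.exitStatus === 0 ? 1 : 0;
  }
  // Fallback heuristic (assumption): look for typical shell error
  // patterns in the response; real rules would come from the rule store.
  const errorPatterns = [
    /command not found/i,
    /no such file or directory/i,
    /permission denied/i,
  ];
  return errorPatterns.some((p) => p.test(instr.response)) ? 0 : 1;
}

// Example from the paper: "rm -v myfile" answered by
// "rm: myfile: No such file or directory" is technically wrong.
const score = evaluateRightness({
  command: "rm",
  args: ["-v", "myfile"],
  response: "rm: myfile: No such file or directory",
});
console.log(score); // 0
```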
3.4 Trace Model
While this article focuses on learners' performance, much more information can be recorded from the Lab4CE environment for further analytics. This includes instant messages, invitations between learners, log-in/log-out and lab start/stop actions, operations on machines besides instructions (e.g., start, stop, suspend and resume), and navigation within the platform (e.g., switching from one lab to another, opening help popups). We chose to adopt the XAPI specification to design our trace model for the following reasons: (i) to reuse existing interoperable tools, (ii) to share our collected data, and (iii) to reuse the analytics tools we designed in other contexts [36]. In addition, XAPI is becoming widely used [20, 26, 28] and proposes a flexible main data structure able to represent any action, called a statement, composed of a verb, an actor and an object. A statement might also include the time when the action was performed (i.e., a timestamp), its context, its result (success, completion, score, etc.), the authority asserting the action, or attachments (e.g., a file attached to the statement).

To represent an instruction executed on a remote machine, we created a custom XAPI Activity object. For instance, the instruction "rm -v myfile", for which the response returned by the resource is "rm: myfile: No such file or directory", is represented in Figure 3 as a pair made of a computer instruction object and its related result. The learner information is set in the actor element, while the resource, the practical session and the experiment define the context of the statement. The timestamp is also recorded within the statement. The tracking framework explained below is in charge of generating these complete statements.

Figure 3. Example of instruction and result XAPI elements
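Since Figure 3 is not reproduced here, the following sketch gives a plausible shape for such a statement as a TypeScript object literal. The verb and activity-type URIs and the extension keys are illustrative assumptions, not the exact vocabulary of Figure 3; only the overall structure (actor, verb, custom instruction object, result, context, timestamp) is taken from the paper.

```typescript
// Illustrative XAPI statement for the instruction "rm -v myfile".
// The URIs and extension keys below are assumptions for this example;
// the actual vocabulary is the one defined in Figure 3 of the paper.
const statement = {
  actor: { // the learner
    account: { homePage: "https://lab4ce.example.org", name: "student42" },
  },
  verb: {
    id: "http://example.org/xapi/verbs/executed",
    display: { "en-US": "executed" },
  },
  object: { // custom "computer instruction" Activity
    id: "http://example.org/xapi/activities/instruction/rm",
    definition: {
      type: "http://example.org/xapi/activity-types/computer-instruction",
      extensions: {
        "http://example.org/xapi/ext/command": "rm",
        "http://example.org/xapi/ext/arguments": ["-v", "myfile"],
      },
    },
  },
  result: {
    success: false, // technical rightness inferred by the enriching engine
    score: { raw: 0 },
    response: "rm: myfile: No such file or directory",
  },
  context: { // resource, practical session and experiment
    extensions: {
      "http://example.org/xapi/ext/resource": "vm-07",
      "http://example.org/xapi/ext/session": "session-2016-03-14",
      "http://example.org/xapi/ext/experiment": "unix-file-management",
    },
  },
  timestamp: "2016-03-14T10:12:42Z",
};
```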
3.5 Learning Analytics Framework
We propose a tracking framework (see Figure 4) inspired by existing infrastructures such as the MiGen project [32] and the flexible and extensible approach proposed by Hecking et al. [15]. Our framework aims at generating XAPI statements from the different data sources of the Lab4CE environment, and also supports the enrichment of statements with inferred indicators. Unlike other approaches, our framework resides essentially on the client side, so as to benefit from distributed computation. Also, since it enables indicator inference on that side, the framework can avoid sending sensitive data while still computing the related metrics.

The framework includes three loosely coupled, layered components on the client side, and two remote stores. The sensors monitor data on a specific component of the Lab4CE environment (i.e., (1) in Figure 4), transform these data into an XAPI element, and send them to the trace forger as an event (2). Figure 4 shows two examples of sensors used in Lab4CE: sensor 1 monitors the web terminal and sends an event each time an instruction is carried out by a learner; this event includes three elements: the verb, the object and the timestamp. Sensor 2 monitors artifacts of the RLI that give information about the identity of the user, the resource (s)he is working on, and the lab it belongs to.

The trace forger merges these data in order to build the matching statement, and adds a timestamp if this field is missing. The statement is then routed either to the enriching engine (3) or directly to the LRS (5).

The enriching engine enhances statements with inferred indicators. This inference engine receives its rules from a store (stream labeled "A" in Figure 4). Once it has received rules, it subscribes to the forger to receive the statements it can infer indicators on (i.e., (B) in Figure 4). For instance, the enriching engine subscribes to the forger to enrich instruction statements with a rightness indicator. The forger then sets its routing table according to that subscription. Afterwards, each time the forger builds such a statement, it sends that trace to the enriching engine. The enriching engine then infers the rightness, adds it to the statement and sends it back to the trace forger (4). Eventually, the forger sends the enriched statement to the LRS.

Figure 4. The Learning Analytics framework

Finally, our LRS proposes both a REST interface and full-duplex communication endpoints (i.e., Web Socket, Ajax/XHR streaming). The REST interface is used by the trace forger to send statements, while the components that exploit statements (such as visualization tools) can subscribe to one or several streams of statements through the full-duplex endpoints.
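The sketch below illustrates this sensor/forger/enriching-engine pipeline. It is a simplified model under our own naming (`TraceForger`, `Enricher` and their methods are assumptions, not the Lab4CE API), intended only to show how partial events are merged into a statement and routed for enrichment before reaching the LRS.

```typescript
// Simplified sketch of the client-side pipeline: sensors emit partial
// XAPI elements, the trace forger merges them into a statement and
// routes it through subscribed enrichers before sending it to the LRS.
// All names here are illustrative assumptions.

type PartialStatement = Record<string, unknown>;

interface Enricher {
  canEnrich(stmt: PartialStatement): boolean;
  enrich(stmt: PartialStatement): PartialStatement;
}

class TraceForger {
  // Routing table built from enricher subscriptions (stream B in Figure 4).
  private enrichers: Enricher[] = [];

  subscribe(enricher: Enricher): void {
    this.enrichers.push(enricher);
  }

  // Called by sensors; merges fragments and adds a timestamp if missing.
  onEvent(...fragments: PartialStatement[]): void {
    let stmt: PartialStatement = Object.assign({}, ...fragments);
    if (!("timestamp" in stmt)) {
      stmt.timestamp = new Date().toISOString();
    }
    for (const e of this.enrichers) {
      if (e.canEnrich(stmt)) stmt = e.enrich(stmt); // e.g., rightness score
    }
    this.sendToLrs(stmt);
  }

  private sendToLrs(stmt: PartialStatement): void {
    // In Lab4CE this is a call to the LRS REST interface.
    console.log("POST /statements", JSON.stringify(stmt));
  }
}

// A rightness enricher could implement Enricher by reusing
// evaluateRightness() from the earlier sketch to set
// { result: { success, score: { raw: 0 | 1 } } } on instruction statements.
```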
We implemented the three client-side layers in Lab4CE with AngularJS, a Model-View-Controller (MVC) framework. This kind of framework facilitates the creation of sensors to monitor data, since it provides automatic bindings between the DOM structure of the web page and the trace model. However, environments that do not use such a framework can still support sensors by parsing the DOM structure of the page itself. Our stores were implemented with a stack composed of a NoSQL database (i.e., MongoDB), to allow the integration of new statements or modifications of the statement structure, and a Spring Java EE layer exposing both the REST interfaces and the full-duplex endpoints. An enriching engine was implemented to infer the technical rightness of a command, setting a score of 1 when an instruction is evaluated as right, and 0 otherwise.

We detail in the following section different learning analytics tools dedicated to learner awareness that exploit our LRS.

4. AWARENESS TOOLS
We present in this section two learning scenarios that led to the design and creation of visualization tools based on the instructions carried out by learners. These tools aim at making learners able to analyze why and what they are doing, in order to support metacognitive processes [16].

4.1 Social Comparison Tool
Objectives. Recent research shows that learners who become engaged in a social analysis process might enhance their reflection [37]. Comparative tools aim at identifying each learner's performance and allowing learners to compare themselves with each other. These tools consist in providing social comparison feedback [22] and giving students the feeling of being connected with and supported by their peers [19].

Learning Scenario. While learners are working in Lab4CE, they are aware of both their own and their peers' performance. They should be able to keep their attention on their activity and get feedback about their own and others' performance with the least cognitive overhead possible, as required in Section 2.
Design and Implementation. The visualization tool is composed of a set of progress bars reflecting learners' level of performance (according to the rightness of instruction defined in Section 3.3), based on a simple color code (green if the value is 1, red if it is 0). A progress bar is a lightweight component that subscribes to our LRS for instruction statements and then applies filters on these data in order to display information about a particular learner or a group, and about the current session or the whole practical activity. For each statement, the tool draws a colored gradation according to its score, scaled on time.

The visualization tool integrated into the Lab4CE environment is illustrated at the top of Figure 5. It provides three different progress bars. The first one (i.e., "My current session" in Figure 5) relates the individual performance of the logged-on learner during the current session; the second one (i.e., "My experiment" in Figure 5) reflects the performance since the logged-on learner started working on the current practical activity; and the third one (i.e., "All participants" in Figure 5) exposes the level of performance of the group of learners enrolled in the practical activity. The progress bars are automatically updated each time a command is executed by a learner. In addition, we enriched the existing social presence tool included in the RLI to display the current level of performance of each connected learner using a smaller progress bar; the resulting visualization is illustrated at the bottom of Figure 5. A sketch of how such a component can subscribe to the LRS is given below.

Figure 5. The social comparison tool
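As a rough illustration of the component just described, the following sketch shows how a progress bar might subscribe to an LRS statement stream over a Web Socket and update itself on each instruction statement. The endpoint URL, message format and filter fields are assumptions made for this example; the paper describes the LRS full-duplex API only abstractly.

```typescript
// Illustrative sketch: a progress-bar component subscribing to the LRS
// full-duplex endpoint for instruction statements. The URL, subscription
// message and filter fields are assumptions, not the documented Lab4CE API.

interface StatementFilter {
  actor?: string;      // a particular learner, or undefined for the group
  session?: string;    // current session, or undefined for the whole activity
}

class PerformanceProgressBar {
  private scores: number[] = [];

  constructor(private filter: StatementFilter, lrsWsUrl: string) {
    const ws = new WebSocket(lrsWsUrl);
    // Subscribe to the instruction-statement stream with our filters.
    ws.onopen = () =>
      ws.send(JSON.stringify({ subscribe: "instructions", filter: this.filter }));
    ws.onmessage = (msg) => {
      const stmt = JSON.parse(msg.data);
      const raw = stmt?.result?.score?.raw;
      if (raw === 0 || raw === 1) {
        this.scores.push(raw); // time-ordered gradation: green (1) / red (0)
        this.render();
      }
    };
  }

  private render(): void {
    // A real implementation would redraw the colored gradation in the DOM;
    // here we only log the ratio of technically right instructions.
    const right = this.scores.filter((s) => s === 1).length;
    console.log(`performance: ${right}/${this.scores.length}`);
  }
}

// Example: the "My current session" bar of Figure 5 (hypothetical values).
new PerformanceProgressBar(
  { actor: "student42", session: "session-2016-03-14" },
  "wss://lab4ce.example.org/lrs/stream",
);
```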
With the different progress bars, learners become aware of the progression of their level of performance and are able to compare their own and their partners' levels. They also have the opportunity to identify peers who seem to perform better, and who could thus help them when they encounter difficulty; conversely, they can also identify peers who could receive support from them. Furthermore, this tool is available to tutors, who become aware of the group's level of performance and are thus able to adjust the objectives and/or learning paths of the practical activity. Finally, the individual learners' progress bars help tutors identify learners who are in difficulty and need support.

This social comparison tool provides learners with awareness of their own and their peers' performance, while requiring insignificant cognitive effort and thus letting them keep working on, and focusing on, the learning activities. However, this simplicity prevents them from deeply analyzing their own actions, as well as those of their peers; the following tool aims at achieving this objective.

4.2 Reflection-on-Action Tool
Objectives. Davis et al. [8] defined reflection-on-action as the analysis of processes after the actions are completed. Collins et al. [7] recommended various strategies to engage learners in reflection-on-action, such as the imitation by learners of performance especially modeled for them, or the replay of students' activities and performance by the teacher. Providing reflection-on-action would thus overcome the limits of the social comparison tool.

Learning Scenario. Users review the instructions they carried out during a work session, or since the beginning of a given practical activity, on a particular resource or on all of them. They analyze their own work, but also discover how peers have proceeded to achieve the pedagogical objectives.

Design and Implementation. To review a work session, we propose a dashboard, illustrated in Figure 6, that lets learners drill down into a deeper analysis of their own or their peers' work. The form at the top of the interface allows filtering the information to visualize (i.e., instructions) according to a given user, session or resource. For each selected resource, a timeline of instructions is exposed in the main panel. Each node of the timeline represents an instruction, colored according to its rightness (using the color code of the social comparison tool). Details of an instruction appear on mouse-over, in a terminal-like area containing the command, its argument(s), and the output returned by the machine (see Figure 6). Finally, a last feature enables merging several timelines to visualize all the instructions carried out by a learner on any machine of a given practical activity.

Figure 6. The reflection-on-action tool

This tool adopts a more traditional client-server architecture and relies on the REST interface of the LRS to retrieve data (a possible query shape is sketched at the end of this section). As a dashboard, it cannot be used concurrently with the RLI (i.e., learners cannot work on a resource and visualize this tool at the same time). However, this tool could be useful during a practical session to engage learners in the analysis of peers' actions in order to better understand, for instance, the solutions proposed by others. To promote that usage, a connection exists between this tool and the social comparison tool: when a user clicks on an individual progress bar, the reflection-on-action tool appears and allows the visualization of the instructions carried out by the matching learner during the current session.

The aim of this tool is to engage learners in a reflective process and to make them analyze their work in detail. The timeline also visually highlights the difficulties they experienced. Moreover, tutors can review learners' actions and evaluate how they performed. Thanks to the connection with the social comparison tool, learners can easily seek help from peers (or offer help) by analyzing the instructions executed in the current session.

We have designed and implemented both tools in the existing Lab4CE environment.
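To make the retrieval step concrete, here is a minimal sketch of how the dashboard could query the LRS REST interface for the statements behind a timeline. The endpoint path and query parameters are assumptions; the paper only states that the LRS exposes a REST interface.

```typescript
// Illustrative sketch of the reflection-on-action dashboard fetching
// instruction statements from the LRS REST interface. The endpoint and
// query parameters are assumptions made for this example.
async function fetchTimeline(
  lrsBaseUrl: string,
  filter: { user?: string; session?: string; resource?: string },
): Promise<unknown[]> {
  const params = new URLSearchParams({ type: "computer-instruction" });
  if (filter.user) params.set("actor", filter.user);
  if (filter.session) params.set("session", filter.session);
  if (filter.resource) params.set("resource", filter.resource);

  const res = await fetch(`${lrsBaseUrl}/statements?${params}`);
  if (!res.ok) throw new Error(`LRS request failed: ${res.status}`);
  const statements: unknown[] = await res.json();
  // Statements come back time-ordered, so each node of the timeline can
  // be drawn and colored according to its rightness score (green/red).
  return statements;
}

// Example: all instructions of student42 on vm-07 in the current session
// (hypothetical identifiers):
// fetchTimeline("https://lab4ce.example.org/lrs",
//               { user: "student42", session: "session-2016-03-14", resource: "vm-07" });
```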
5. CONCLUSION
We proposed in this paper two awareness tools aiming at engaging learners in a deep learning process while they are practicing in a remote laboratory. A social awareness tool reveals to learners their current and overall levels of performance, and lets them compare each other's levels. This tool relies on a simple visualization technique so as to be usable during a practical session, while users perform their learning activity, without requiring specific attention. The reflection-on-action tool, implemented as timelines, allows learners to deeply analyze both their own work and their peers' activity.

Both tools rely on a generic and modular learning analytics framework based on the XAPI specification and integrating an enriching engine able to infer different indicators from the collected data. These tools and the framework have been successfully integrated into the Lab4CE system, our remote laboratory dedicated to computer education.

The performance metric we defined is currently restricted to the technical aspect of a computer command. However, it should also reflect the pedagogical relevance of an instruction, to allow the comparison of learners' progression within the course. Also, the exploitation of the whole set of traces our system currently generates (e.g., instructions, but also instant messages, collaboration invitations, etc.) leads us toward learner profiling and pattern mining, which represent other areas of investigation in line with our objective. For instance, such analytics could be used to promote help-seeking or help-offering processes between peers.
6. REFERENCES
[1] Arnold, K.E. and Pistilli, M.D. 2012. Course Signals at Purdue: using learning analytics to increase student success. LAK. (2012), 267–270.
[2] Bodemer, D. and Dehler, J. 2011. Group awareness in CSCL environments. Computers in Human Behavior. 27, 3 (2011), 1043–1045.
[3] Bouabid, M.E.A. 2012. De la Conception à l'Exploitation des Travaux Pratiques en ligne.
[4] Broisin, J., Venant, R. and Vidal, P. 2015. Lab4CE: a Remote Laboratory for Computer Education. International Journal of Artificial Intelligence in Education. (2015), 1–27.
[5] Buder, J. 2011. Group awareness tools for learning: Current and future directions. Computers in Human Behavior. 27, 3 (2011), 1114–1117.
[6] Chandler, P. and Sweller, J. 1996. Cognitive load while learning to use a computer program. Applied Cognitive Psychology. (1996).
[7] Collins, A. and Brown, J.S. 1988. The Computer as a Tool for Learning Through Reflection. Learning Issues for Intelligent Tutoring Systems. Springer US. 1–18.
[8] Davis, D., Trevisan, M., Leiffer, P., McCormack, J., Beyerlein, S., Khan, M.J. and Brackin, P. 2005. Reflection and Metacognition in Engineering Practice. Using Reflection and Metacognition to Improve Student Learning. Stylus Publishing, LLC. 78–103.
[9] de Jong, T., Linn, M.C. and Zacharia, Z.C. 2013. Physical and Virtual Laboratories in Science and Engineering Education. Science. 340, 6130 (Apr. 2013), 305–308.
[10] Duval, E. 2011. Attention please!: learning analytics for visualization and recommendation. ACM.
[11] Ellis, R.A., Marcus, G. and Taylor, R. 2005. Learning through inquiry: student difficulties with online course-based material. Journal of Computer Assisted Learning. 21, 4 (Aug. 2005), 239–252.
[12] Ferguson, R. and Shum, S.B. 2012. Social learning analytics: five approaches. (Apr. 2012), 23–33.
[13] Goldman, S.V. 1992. Computer resources for supporting student conversations about science concepts. ACM SIGCUE Outlook. 21, 3 (Feb. 1992), 4–7.
[14] Govaerts, S., Verbert, K., Duval, E. and Pardo, A. 2012. The student activity meter for awareness and self-reflection. Conference for Human-Computer Interaction. (May 2012), 869–884.
[15] Hecking, T., Manske, S., Bollen, L., Govaerts, S., Vozniuk, A. and Hoppe, H.U. 2014. A Flexible and Extendable Learning Analytics Infrastructure. (Aug. 2014), 123–132.
[16] Jonassen, D. 1999. Designing constructivist learning environments. Instructional Design Theories and Models: A New Paradigm of Instructional Theory. 215–239.
[17] Kirsh, D. 2000. A Few Thoughts on Cognitive Overload. (2000).
[18] Lau, W.W.F. and Yuen, A.H.K. 2011. Modelling programming performance: Beyond the influence of learner characteristics. Computers & Education. 57, 1 (2011), 1202–1213.
[19] Lowe, D.B., Murray, S., Lindsay, E. and Liu, D. 2009. Evolving Remote Laboratory Architectures to Leverage Emerging Internet Technologies. TLT. 2, 4 (2009), 289–294.
[20] Megliola, M., De Vito, G., Sanguini, R., Wild, F. and Lefrere, P. 2014. Creating awareness of kinaesthetic learning using the Experience API: current practices, emerging challenges, possible solutions. (2014), 11–22.
[21] Mercer, N. 2004. Sociocultural discourse analysis: analysing classroom talk as a social mode of thinking. Journal of Applied Linguistics and Professional Practice. 1, 2 (2004), 137–168.
[22] Michinov, N. and Primois, C. 2005. Improving productivity and creativity in online groups through social comparison process: New evidence for asynchronous electronic brainstorming. Computers in Human Behavior. 21, 1 (2005), 11–28.
[23] Mikroyannidis, A. and Gomez-Goiri, A. 2015. Deploying learning analytics for awareness and reflection in online scientific experimentation. 5th Workshop on Awareness and Reflection in Technology Enhanced Learning. (2015), 105–111.
[24] Morais, A.M., Marenzi, I. and Kantz, D. 2015. The LearnWeb formative assessment extension: Supporting awareness and reflection in blended courses. ARTEL@EC-TEL. (2015), 97–103.
[25] Morell, L. and Jiang, C. 2015. Using ShellInABox to improve web interaction in computing courses. Journal of Computing Sciences in Colleges. 30, 5 (May 2015), 61–66.
[26] Murray, K., Berking, P. and Haag, J. 2012. Mobile Learning and ADL's Experience API. Connections: The Quarterly Journal. 12, (2012), 45.
[27] Orduna, P., Almeida, A., Lopez-de-Ipina, D. and Garcia-Zubia, J. 2014. Learning Analytics on federated remote laboratories: Tips and techniques. (2014), 299–305.
[28] Rabelo, T., Lama, M., Amorim, R.R. and Vidal, J.C. 2015. SmartLAK: A big data architecture for supporting learning analytics services. (2015), 1–5.
[29] Ritzhaupt, A.D., Pastore, R. and Davis, R. 2015. Effects of captions and time-compressed video on learner performance and satisfaction. Computers in Human Behavior. 45, (2015), 222–227.
[30] Romero, S., Guenaga, M., Garcia-Zubia, J. and Orduna, P. 2014. New challenges in the Bologna Process using Remote Laboratories and Learning Analytics to support teachers in continuous assessment. (2014), 227–230.
[31] Santos, J.L., Verbert, K., Govaerts, S. and Duval, E. 2011. Visualizing PLE usage. (2011).
[32] Santos, S.G., Mavrikis, M. and Magoulas, G.D. 2010. Layered Development and Evaluation for Intelligent Support in Exploratory Environments: The Case of Microworlds. Intelligent Tutoring Systems. (2010), 105–114.
[33] Schneider, D., Class, B., Benetos, K. and Lange, M. 2012. Requirements for learning scenario and learning process analytics. (Chesapeake, VA, 2012), 1632–1641.
[34] Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Buckingham Shum, S., Ferguson, R., Duval, E., Verbert, K. and Baker, R.S. 2011. Open Learning Analytics: an integrated & modularized platform.
[35] Vygotsky, L.S. 1978. Mind in Society: The Development of Higher Psychological Processes.
[36] Vozniuk, A., Govaerts, S. and Gillet, D. 2013. Towards Portable Learning Analytics Dashboards. (2013), 412–416.
[37] Wilson, J. and Wing, J.L. 2008. Smart Thinking: Developing Reflection and Metacognition. Curriculum Press.
[38] Zhong, K. and Li, N. 2014. A Correlative Research on the Relationship Between Awareness in Collaborative Learning and its Impact on Academic Performance. ICSLE. Chapter 2 (2014), 9–22.