Empowering students to reflect on their activity
with StepUp!: Two case studies with engineering
                  students

              Jose Luis Santos, Katrien Verbert, and Erik Duval

           Dept. of Computer Science, KU Leuven, Celestijnenlaan 200A,
                            B-3001 Leuven, Belgium
        {JoseLuis.Santos,Katrien.Verbert,Erik.Duval}@cs.kuleuven.be




      Abstract. This paper reports on our ongoing research around the use of
      learning analytics technology for awareness and self-reflection by teachers
      and learners. We compare two case studies. Both rely on an open learning
      methodology where learners engage in authentic problems, in dialogue
      with the outside world. In this context, learners are encouraged to share
      results of their work, opinions and experiences and to enrich the learn-
      ing experiences of their peers through comments that promote reflection
      and awareness of their activity. In order to support this open learning
      process, we provided the students with StepUp!, a student activity visu-
      alization tool. In this paper, we focus on the evaluation by students of
      this tool, and on the comparison of the results of the two case studies.
      Results indicate that StepUp! is a useful tool that enriches student
      experiences by providing transparency to social interactions. The case
      studies also show how time spent on predefined high-level activities
      strongly influences the perceived usefulness of our tool.

      Keywords: human computer interaction, technology enhanced learn-
      ing, reflection, awareness



1   Introduction

This paper reports on a comparison of two recent experiments with learning an-
alytics. In our view, learning analytics focuses on collecting traces that learners
leave behind and using those traces to improve learning [1]. Educational Data
Mining can process the traces algorithmically and point out patterns or com-
pute indicators [2, 3]. Our interest lies more in visualizing traces in order to help
learners and teachers reflect on their activity and, consequently, draw con-
clusions. We focus on building dashboards that visualize the traces in ways that
help learners or teachers steer the learning process [4].
    Our courses follow an open learning approach where engineering students
work individually or in groups of three or four on realistic project assignments in
an open way. Students use twitter (with course hash tags), wikis, blogs and other



  web 2.0 tools such as Toggl1 and TiNYARM2, to report and communicate about
  their work with each other and the outside world in a community of practice
  kind of way [5, 6].
      Students share their reports, problems and solutions, enabling peer students
  to learn from them and to contribute as well. However, teachers, assistants
  and students themselves can get overwhelmed and feel lost in the abundance
  of tweets, blog posts, blog comments, wiki changes, etc. Moreover, most stu-
  dents are not used to such a community-based approach and have difficulties in
  understanding this process. Therefore, reflection on the activity of the com-
  munity can help users understand what is going on and what is expected of
  them.
      In this paper, we present two follow-up studies to our earlier work [7], where
  we documented the user-centered design of an earlier version of StepUp!: the
  new version we present here is geared towards an open learning approach.
      In our courses, we encourage students to be responsible for their own learn-
  ing activities, much in the same way as we expect them to be responsible for
  their professional activities later on. In order to support them in this process,
  our studies focus on how learning dashboards can promote reflection and self
  awareness by students. To this end, we consider different ways to capture traces
  and to identify which traces are relevant to visualize for the users. Finally, we
  analyze how visualizing these traces affects the perception and actions of the
  learner.
      These experiments rely on the design, implementation, deployment and eval-
  uation of dashboards with real users in ongoing courses. We evaluated our proto-
  types in two elaborate case studies: in the first case study, we introduced StepUp!
  to the students at the beginning of the course, visualizing blog and twitter ac-
  tivity and time reported on the different activities of the course using Toggl.
  They could access the tool, but its use was not mandatory. After a period of time,
  we evaluated the tool with students using a questionnaire and Google Analytics3
  to track the actual use of the tool.
      In the second case study, StepUp! visualized student activities from blogs,
  twitter and TiNYARM, a tool to track read, skimmed and suggested papers in
  a social context [8]. Students used the tool at the end of the course, after which
  they completed an evaluation questionnaire. The idea behind evaluating the
  tool only at the end of the course was to analyze how normal, ongoing use of
  the tool (or the lack of it) affected its perceived usefulness.
      As time tracking is so prominent in what we visualize, we also discuss the
  importance of tracking time on high-level definitions of activities and the potential
  differences between automatic and manual tracking of the data.
      The remainder of this text is structured as follows: the next section presents
  our first case study, in a human-computer interaction course. Section 3 describes

   1
     http://toggl.com
   2
     http://atinyarm.appspot.com/
   3
     http://analytics.google.com




  the second case study, in a master thesis student group. Results are discussed in
  Section 4. Section 5 presents conclusions and plans on future work.


  2     First case study
  2.1    Data tracked
  One of the main challenges with learning analytics is to collect data that reflect
  relevant learner and teacher activities [4].
      Some activities are tracked automatically: this is obviously a more secure and
  scalable way to collect traces of learning activities. Much of our work in this area
  is inspired by “quantified self” applications [9], where users often carry sensors,
  either as apps on mobile devices, or as specific devices, such as for instance
  Fitbit4 or Nike Fuel5 .
      We rely on software trackers that collect relevant traces from the Web in the
  form of digital student deliverables: the learners post reports on group blogs,
  comment on the blogs of other groups and tweet about activities with a course
  hash tag. Those activities are all tracked automatically: we basically process RSS
  feeds of the blogs and the blog comments every hour and collect the relevant
  information (the identity of the person who posted the blog post or comment
  and the timestamp) into a database with activity traces. Similarly, we use the
  twitter Application Programming Interface (API) to retrieve the identity and
  timestamp of every tweet with the hash tag of the course.
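      To make this harvesting step concrete, the following minimal sketch shows how
  an hourly run over the group-blog feeds could collect such traces. It is an
  illustrative Python example (using the third-party feedparser library and an
  SQLite store, with placeholder feed URLs and table schema), not our production
  implementation.

  import sqlite3
  import feedparser  # third-party library: pip install feedparser

  GROUP_FEEDS = [
      "http://example-group-blog-1/feed",  # placeholder feed URLs
      "http://example-group-blog-2/feed",
  ]

  def harvest(db_path="traces.db"):
      """Run hourly: store one (user, kind, timestamp) row per blog post."""
      con = sqlite3.connect(db_path)
      con.execute("""CREATE TABLE IF NOT EXISTS trace (
                         user TEXT, kind TEXT, ts TEXT,
                         UNIQUE(user, kind, ts))""")
      for url in GROUP_FEEDS:
          for entry in feedparser.parse(url).entries:
              author = entry.get("author", "unknown")
              published = entry.get("published", "")
              # INSERT OR IGNORE keeps repeated hourly runs idempotent.
              con.execute("INSERT OR IGNORE INTO trace VALUES (?, ?, ?)",
                          (author, "blog_post", published))
      con.commit()
      con.close()

  if __name__ == "__main__":
      harvest()

  The same trace table can then be fed by the twitter harvester, using a different
  value for the kind column.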
      Moreover, we track learner activities that may or may not produce a digital
  outcome with a tool called Toggl: this is basically a time tracking application
  that can be configured with a specific set of activities. In our HCI course, we
  make a distinction between the activities reported on in this way, based on the
  different tasks that the students carry out in the course:
   1. evaluation of google plus;
   2. brainstorming;
   3. scenario development;
   4. design and implementation of paper prototype;
   5. evaluation of paper prototype;
   6. design and implementation of digital prototype;
   7. evaluation of digital prototype;
   8. mini-lectures;
   9. reading and commenting on blogs by other groups;
  10. blogging on own group blog.
  The first seven items above correspond to course topics: the students started with
  the evaluation of an existing tool (Google Plus6) and then went through one
  cycle of user-centered design of their own application, from brainstorming over
   4
     http://www.fitbit.com/
   5
     http://www.nike.com/fuelband/
   6
     http://plus.google.com/




  scenario development to the design, implementation and evaluation of first a
  paper and then (a series of) digital prototype(s) [10]. The last three items above
  correspond to more generic activities that happen throughout the course: mini-
  lectures during working sessions, and blogging activities, both on their own blog
  and on that of their peers. For all these activities, we track the start time, the
  end time and the time span in between, as well as the learner identity.
      When students use Toggl, they can do so in semi-automatic mode or man-
  ually. Semi-automatic mode means that, when they start an activity, they can
  select it and click on a start button. When they finish the activity, they click
  on a stop button. Manually means that the students have to specify activity,
  time, and duration to Toggl. In this way, students can add activities that they
  forgot to report or edit them manually. Of course, on the one hand, this kind
  of tracking is tedious and error prone - hence the manual option. On the other
  hand, requiring students to log time may make them more aware of their time
  investment and may trigger more conscious decisions about what to focus on or
  how much time to spend on a specific activity.
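      The sketch below illustrates, with made-up entries, how such start/end records
  can be aggregated into hours per student and per activity. The entry format is
  an assumption for illustration and does not reflect the actual Toggl export.

  from collections import defaultdict
  from datetime import datetime

  # Hypothetical reported entries: (student, activity, start, end) in ISO format.
  entries = [
      ("anneeverars", "brainstorming", "2012-03-05T10:00", "2012-03-05T11:30"),
      ("anneeverars", "blogging on own group blog",
       "2012-03-06T14:00", "2012-03-06T14:45"),
      ("ganji", "brainstorming", "2012-03-05T10:00", "2012-03-05T11:00"),
  ]

  def hours_per_activity(entries):
      """Sum the time span of each entry per (student, activity), in hours."""
      totals = defaultdict(float)
      for student, activity, start, end in entries:
          span = datetime.fromisoformat(end) - datetime.fromisoformat(start)
          totals[(student, activity)] += span.total_seconds() / 3600.0
      return totals

  for (student, activity), hours in sorted(hours_per_activity(entries).items()):
      print(f"{student:12s} {activity:30s} {hours:5.2f} h")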
      The main course objective is to change the perspective of how students look at
  software applications, from a code-centric view to a more user-centric view. That
  is an additional reason why self-reflection is important in this context.

  2.2   Description of the interface
  Figure 1 illustrates how the data are made available in their complete detail in
  our StepUp! tool: this is a “Big Table” overview where each row corresponds
  with a student. The students are clustered in the groups that they belong to.
  For instance: rows 1-3 contain the details of the students ‘anneeverars’, ‘ganji’
  and ‘greetrobijns’ (see marker 1 at Figure 1). These three students work together
  in a group called ‘chigirlpower’, the second column in the table (marker 2). The
  green cells in that second column indicate that these students made 8, 9 and 13
  posts in their group blog respectively (marker 3). Rows 4-6 contain the details
  of the second group, called ‘chikulua12’: they made 1, 4 and 18 comments on
  the blog of the first group (column 2) and 9, 6 and 9 posts in their own blog
  (column 3) respectively (marker 4). The rightmost columns (marker 5) in the
  table indicate the total number of posts, the total number of hours spent on the
  course (Toggl) and the total number of tweets.
      The two rightmost columns are sparklines [9] that provide a quick glance of
  the overall evolution of the activity for a particular student (marker 6). They
  can be activated to reveal more details of student activity (markers 7 and 8).
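      As an illustration of the data behind such a sparkline, the sketch below bins
  timestamped traces into per-week counts for each student; the trace values are
  invented for the example.

  from collections import Counter
  from datetime import date

  # Hypothetical traces: (student, date of a blog post, comment or tweet).
  traces = [
      ("anneeverars", date(2012, 3, 5)),
      ("anneeverars", date(2012, 3, 6)),
      ("anneeverars", date(2012, 3, 14)),
      ("ganji", date(2012, 3, 7)),
  ]

  def weekly_counts(traces):
      """Count actions per student per ISO week, the series a sparkline renders."""
      counts = Counter()
      for student, day in traces:
          year, week, _ = day.isocalendar()
          counts[(student, year, week)] += 1
      return counts

  for (student, year, week), n in sorted(weekly_counts(traces).items()):
      print(f"{student:12s} {year}-W{week:02d}: {n} actions")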
      As is obvious from Figure 1, this is a somewhat complex tool. Originally, the
  idea was that this would mainly be useful for the teacher - who can indeed provide
  very personal feedback to the students, based on the in-depth data provided by
  the table. However, somewhat to our surprise, and as illustrated by Figure 2
  and Figure 3, this overview is used by almost all students once per week, for an
  average of about 10 minutes.
      Nevertheless, in order to provide a more personalized and easy to understand
  view that students can consult more frequently, which is important for awareness







                         Fig. 1. First case study - Big table View




                        Fig. 2. Analytics of Big Table use (daily)




                          Fig. 3. Analytics of Big Table (week)




  support, we have developed a mobile application for these data (see Figure 4)
  that we released recently, as discussed in the future work section below.




                        Fig. 4. Profile view in Mobile Application




  2.3   Evaluation

  We carried out a rather detailed evaluation six weeks into the course, based on
  online surveys. In the evaluation, we used five instruments, in order to obtain a
  broad view of all the positive and negative issues that these could bring up:

   1. open questions about student opinions of the course;
   2. questions related to their awareness of their own activities, those of their
      group and those of other groups;
   3. opinions about the importance of the social media used in the course;
   4. questions about how StepUp! supports awareness of their own activity, that
      of their group and of other groups;
   5. a System Usability Scale (SUS) evaluation focused on the tool [11].

     Another goal of our evaluations is to gather new requirements to improve the
  course and the deployed tools. This task becomes complex because sometimes
  students are not aware of the goals of the course.
     Below, we summarize the main outcomes of this evaluation.


  Demographics In total, 27 students participated in the evaluation; they are
  between 20 and 23 years old and include 23 males and 4 females. All the partic-
  ipants are students of the Human Computer Interaction course.



  Open Questions For the open questions, the students were asked about pos-
  itive and negative aspects of the course, and they were asked how they would
  improve the course.
      Overall, the use of the learning analytics seems to be well received, as il-
  lustrated by the following quotes: “I like the interactive courses. As professor
  Duval said himself, it allows him to adjust us faster. We (the students) keep
  on the right track. Otherwise, we might do a lot of worthless work and thus lose
  valuable time we could invest better in other ways in this course.” or “The course
  is different from any courses I taken before as there is class participation, imme-
  diate feedback etc.”. Neither the negative aspects mentioned nor the suggestions
  to improve the course were related to the use of learning analytics.




                   Fig. 5. Evaluation first case study - Awareness part


  Awareness We asked students questions on whether they think they are aware
  of how they, their group and the other students in class spend efforts and time
  in the course, and whether they consider this kind of information important.
      Overall, the students think that they are very aware of their own efforts, just
  a little bit less aware of the efforts of the other members in their group, and
  less aware of the efforts by members of other groups - Figure 5 (left box plot)
  provides more details.

  StepUp! support As illustrated by Figure 5 (right box plot), students evaluate
  the support by StepUp! for increased awareness rather positively: the students
  agree that the tool reinforces transparency and that it helps them understand
  how peers and other students invest efforts in the course. This is important
  because these data suggest that the tool does achieve its main goal.



  SUS questionnaire Overall, the SUS usability questionnaire rating of StepUp!
  is 77 points on a scale of 100. This score rates the dashboard as good [11].
  Compared with our previous design, the score has increased by 5 points on this
  scale [7], which is encouraging.
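      For reference, SUS scores follow Brooke's standard scoring scheme. The sketch
  below shows that computation for one hypothetical respondent (the ratings are
  invented; a class-level score averages the individual scores). It is not part of
  the study materials.

  def sus_score(ratings):
      """Standard SUS scoring: 10 items rated 1-5; odd-numbered items contribute
      (rating - 1), even-numbered items contribute (5 - rating); the sum times
      2.5 yields a 0-100 score."""
      assert len(ratings) == 10
      contribution = sum((r - 1) if i % 2 == 0 else (5 - r)  # i == 0 is item 1
                         for i, r in enumerate(ratings))
      return contribution * 2.5

  # One hypothetical respondent's ratings; prints 80.0.
  print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))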


  3     Second case study
  3.1   Tracked data
  The second case study ran with 13 master students working on their master
  thesis. All of them work on HCI topics such as music visualization and augmented
  reality. In this case study, most students work individually on their thesis topics,
  except for two students who work together on one topic.
      As in the previous case study, they report their progress on blogs, share
  opinions and communicate with their supervisors and each other on twitter. In
  addition, they use TiNYARM. The use of this tool is intended to increase the
  awareness of supervisors and students. They can suggest papers to each other,
  see what others have read and read papers that are suggested to them.
      In our previous experiment [7], we tracked the time spent using RescueTime,
  a completely automatic time tracking tool. In the first case study (section 2),
  students reported the time spent on activities using Toggl. In this case study,
  students do not report time spent. The goal behind this setup is to figure out
  how important time-spent traces are for our students.

  3.2   Description of the interface




                       Fig. 6. Second case study - Big table View


      Figure 6 illustrates how the data are made available in their complete detail
  in our StepUp! tool.
      The students are ordered alphabetically and in the groups that they belong
  to, as is the case for ‘annivdb’ and ‘mendouksai’ (marker 1 at Figure 6). For
  instance: rows 1-2 contain the details of the students already mentioned before.



  These two students work together on a thesis topic (augmented reality). The
  green cells in that second column indicate that these students made 17 and 15
  posts in their blog respectively (marker 2). Row 3 contains the details of another
  student who is working individually on his thesis: he made 2 comments on the
  blog of the group working on augmented reality (column 2) and 43 posts in his
  own blog (column 3) (marker 3). The rightmost columns in the table indicate
  the total number of tweets and read, skimmed, suggested and to read papers
  (marker 4).
      The rightmost column is a sparkline that provides a quick glance of the overall
  evolution of the twitter, blog and TiNYARM activity for a particular student.
  It can be activated to reveal more details of student activity (marker 5).


  3.3   Evaluation

  We carried out the same detailed evaluation as in the previous case study. How-
  ever, in this case study, students had not accessed the tool before. The idea
  behind this evaluation setup was to analyze how prior use, or lack of use, of the
  tool influenced its perceived usefulness.


  Demographics In total, 12 students participated in the evaluation; they are
  between 21 and 25 years old and include 10 males and 2 females.


  Open Questions For the open questions, the students were asked about pos-
  itive and negative aspects of the course, and they were asked how they would
  improve the course.
      Overall, the use of social networks seems to be well received, as illustrated
  by the following quotes: “The blogs are a good way to get an overview of what
  everyone is doing. ” or “Having a blog is also a good thing for myself, because
  now I have most of the information I processed in one place.”


  Awareness We asked students questions on whether they think they are aware
  of how they, and the other students in class spend efforts in the course, and
  whether they consider this kind of information important.
      Overall, the students think that they are very aware of their own efforts and
  less aware of the efforts by other members of the course - Figure 7 (left box plot)
  provides the details. These results are similar to the previous case study.


  StepUp! support As illustrated by Figure 7 (right box plot), students evaluate
  the support by StepUp! differently from the previous case study. They consider
  that StepUp! provides better transparency, but indicate that the tool is less
  useful for understanding how others spend their efforts. As we discuss in the
  next section, time seems to be a very useful indicator for understanding how
  others are behaving; this is the main difference with the previous case study.







                  Fig. 7. Evaluation second case study - Awareness part


     One of the students remarked that he would have liked to realize earlier how
  low his activity on commenting blogs was, and all the rest agreed that they
  should have been more active in their use of social networks.


  SUS questionnaire Overall, the SUS usability questionnaire rating of StepUp!
  is 84 points on a scale of 100. This score rates the dashboard as almost excellent
  [11], 7 points higher than in the first case study.
  The main difference from the previous case study is that we replaced Toggl data
  with data tracked by TiNYARM. We could say that the complexity of the
  visualization decreases by removing the Toggl data. In the first case study, we
  visualized two units, time (Toggl) and number of actions (Twitter and blog).
  In the second case study we focus on number of actions only (Twitter, blog and
  TiNYARM). In the second case study, the number of users is also smaller, hence
  the size of the table is smaller - which may also affect the usability results.
      Although the usability results are encouraging, the results of this case study
  indicate that StepUp! is less useful for understanding the efforts of peer students.
  As Toggl data was not included in the visualizations of this case study, this
  may have affected the perceived usefulness. These results indicate that further
  evaluation studies are required to assess the impact of the visualized data on
  awareness support.


  4    Discussion and open issues

  The field of learning analytics has seen explosive growth and interest recently.
  Siemens et al. [12] present an overview of ongoing research in this area. Some



  of that recent work focuses more on Educational Data Mining, where user traces
  power recommendation algorithms [2, 3]. When learning analytics research ap-
  plies visualizations, it typically focuses less on dashboards, and fewer systematic
  evaluations of the usability and usefulness of the tools are conducted.
      In this paper, we have presented two case studies. The first study focuses
  on visualizing social network activity and, complementarily, time reported on
  predefined activities in a course that follows an open learning approach. The
  second case study focuses exclusively on social network activity.
      Time is a commonly used indicator for planning. Based on the European
  Qualification Framework of higher education, degrees and courses are assigned
  a number of credits in the European Credit Transfer System (ECTS). Each of
  these credits corresponds to an estimated time investment of approximately 30
  hours. Therefore, time spent seems to be a good indicator to take into account
  for reflection and to check whether the time spent by the student on the course
  is properly distributed. Time is also used in empirical studies [13]. In addition,
  our results support this idea: students seem to understand better how others
  spend their efforts when time spent is visualized.
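  As an illustrative calculation based on this figure (our example, not a number
  from the case studies): a course worth 6 ECTS credits corresponds to roughly
  6 × 30 = 180 hours of work, a budget against which the time students report in
  StepUp! can be checked.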
      However, time tracking is not an easy task. Manual tracking systems and
  applications such as Trac [14], Toggl as described in this paper, and twitter [15]
  are used in learning experiments for this purpose. These systems rely on the user
  to report time. They require such explicit action as well as the implicit process
  of reflection, but they also enable users to game the system by overestimating
  the time spent on the course. On the other hand, the deployment of automatic
  trackers such as RescueTime [7] and the logging systems of learning management
  systems [15] releases the user from such manual reporting tasks. These trackers
  are able to categorize the tools used by the activity that they are intended for,
  although the resulting categories are usually less abstract than predefined course
  activities. Moreover, they are not able to track time on tasks done offline, such
  as reading a book or having a meeting. Nevertheless, time tracking has influenced
  the results of the evaluations: in the second case study, students reported a worse
  understanding of how others spend their efforts.
      From the evaluations and discussion above, it is clear that many open research
  issues remain. We briefly discuss some of them below.

   1. What are relevant learner actions? We track tweets and blog posts and ask
      students to track their efforts on specific course topics and activities. How-
      ever, we track quantitative data that tells us little or nothing about the
      quality of what students do. Obviously, these data provide in some sense
      information about necessary conditions: if the students spend no time on
      particular topics, then they will probably not learn a lot about them either.
      However, they may spend a lot of time on topics and not learn a lot. Or they
      may be quite efficient and learn a lot with little investment of time. It is
      clear, that we need to be quite careful with the interpretation of these data.
   2. How can we capture learner actions? We rely on software trackers for laptop
      or desktop interactions, and social media for learner interactions (through
      twitter hash tags and blog posts and comments). We could further augment



      the scope of the data through physical sensors for mobile devices. However,
      capturing all relevant actions in an open environment in a scalable way is
      challenging.
   3. How can we evaluate the usability, usefulness and learning impact of dash-
      boards? Whereas usability is relatively easy to evaluate (and we have done
      many such evaluations of our tools), usefulness, for instance in the form of
      learning impact, is much harder to evaluate, as this requires longer-term and
      larger-scale evaluations.
   4. How can we enable goal setting and connect it with the visualizations, so as to
      close the feedback loop and enable learners and teachers to react to what they
      observe and then track the effect of their reactions? We are experimenting
      with playful gamification approaches that present their own challenges [16],
      for instance around trivialization and control.
   5. There are obvious issues around privacy and control - yet, as public attitudes
      and technical affordances evolve [17], it is unclear how we can strike a good
      balance in this area.


  5    Conclusions and future work

  Our main goal with StepUp! is to provide students with a useful tool and to
  empower them to become better students. From our point of view, they should
  work in an open way, sharing their knowledge with the world and having some
  impact on the opinions of others.
  StepUp! supports our open learning approach by providing more transparency
  in the social interactions. It gives students an opportunity to reflect on their
  activity, to take a look at these quantitative data and to see how others are
  performing within the community.
  Time tracking seems to be a useful indicator that helps students understand how
  others spend their efforts and increases awareness of the course activity.
  Furthermore, the usefulness of a tool is not only based on conclusions drawn
  from visualizations; how we collect the traces also influences this factor. To this
  end, manual and automatic tracking require more research. Design is also a factor
  that influences the use of our application. To this end, we are currently experi-
  menting with other approaches. For instance, we have recently deployed a mobile
  web application (see Figure 4) that provides a quick overview and indicators of
  student activity. We expect this to reduce the cognitive effort and make these
  tools more attractive to use.
  In conclusion, we believe that a sustained research effort on learning analytics
  dashboards, with systematic evaluation of both usability and usefulness, can
  help to make sure that the current research hype around learning analytics
  leads to real progress. As we already mentioned in section 2, we propose to
  deploy new versions of StepUp! on different devices to research how devices can
  influence the reflection process from a Human Computer Interaction perspective,
  for instance by evaluating the profile view (Figure 4) on mobile devices. Further-
  more, as explained in section 4, we are mainly interested in figuring out the relevant



  traces for the students, in involving sensors to track external data and in enabling
  goal setting.


  6    Acknowledgements

  This work is supported by the STELLAR Network of Excellence (grant agree-
  ment no. 231913). Katrien Verbert is a Postdoctoral Fellow of the Research
  Foundation - Flanders (FWO). The work of Jose Luis Santos has received fund-
  ing from the EC Seventh Framework Programme (FP7/2007-2013) under grant
  agreement no 231396 (ROLE).


  References
   1. Duval, E.: Attention please! learning analytics for visualization and recommenda-
      tion. In: Proceedings of LAK11: 1st International Conference on Learning Analytics
      and Knowledge, ACM (2011) 9–17
   2. Pechenizkiy, M., Calders, T., Conati, C., Ventura, S., Romero, C., Stamper, J.,
      eds.: Proceedings of EDM11: 4th International Conference on Educational Data
      Mining. (2011)
   3. Verbert, K., Manouselis, N., Drachsler, H., Duval, E.: Dataset-driven research to
      support learning and knowledge analytics. Educational Technology and Society
      (2012) 1–21
   4. Duval, E., Klerkx, J., Verbert, K., Nagel, T., Govaerts, S., Parra Chico, G.A., San-
      tos, J.L., Vandeputte, B.: Learning dashboards and learnscapes. In: Educational
      Interfaces, Software, and Technology. (May 2012) 1–5
   5. Fischer, G.: Understanding, fostering, and supporting cultures of participation.
      interactions 18(3) (May 2011) 42–53
   6. Wenger, E.: Communities of Practice: Learning, Meaning, and Identity (Learning
      in Doing: Social, Cognitive and Computational Perspectives). 1 edn. Cambridge
      University Press (September 1999)
   7. Santos, J.L., Govaerts, S., Verbert, K., Duval, E.: Goal-oriented visualizations of
      activity tracking: a case study with engineering students. In: LAK12: International
      Conference on Learning Analytics and Knowledge, Vancouver, Canada, 29 April -
      2 May 2012, ACM (May 2012) Accepted.
   8. Parra, G., Klerkx, J., Duval, E.: Tinyarm: Awareness of relevant research papers
      through your community of practice. In: Proceedings of the ACM 2013 conference
      on Computer Supported Cooperative Work. (2013) under review.
   9. Tufte, E.R.: Beautiful Evidence. Graphics Press (2006)
  10. Rogers, Y., Sharp, H., Preece, J.: Interaction Design: Beyond Human-Computer
      Interaction. John Wiley and Sons Ltd (2002)
  11. Bangor, A., Kortum, P.T., Miller, J.T.: An empirical evaluation of the system
      usability scale. Int. J. Hum. Comput. Interaction (2008) 574–594
  12. Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Shum, S., Ferguson,
      R., Duval, E., Verbert, K., Baker, R.: Open learning analytics: an integrated and
      modularized platform: Proposal to design, implement and evaluate an open plat-
      form to integrate heterogeneous learning analytics techniques. Society for Learning
      Analytics Research (2011)




  13. Keith, T.Z.: Time spent on homework and high school grades: A large-sample path
      analysis. Journal of Educational Psychology 74(2) (1982) 248–253
  14. Upton, K., Kay, J.: Narcissus: Group and individual models to support small
      group work. In: Proceedings of the 17th International Conference on User Mod-
      eling, Adaptation, and Personalization: formerly UM and AH. UMAP ’09, Berlin,
      Heidelberg, Springer-Verlag (2009) 54–65
  15. Govaerts, S., Verbert, K., Duval, E., Pardo, A.: The student activity meter for
      awareness and self-reflection. In: CHI EA ’12: Proceedings of the 2012 ACM An-
      nual Conference Extended Abstracts on Human Factors in Computing Systems,
      ACM (May 2012) 869–884
  16. Deterding, S., Sicart, M., Nacke, L., O’Hara, K., Dixon, D.: Gamification: Using
      game-design elements in non-gaming contexts. In: Proceedings of the 2011 annual
      conference extended abstracts on Human factors in computing systems. CHI EA
      ’11, New York, NY, USA, ACM (2011) 2425–2428
  17. Jarvis, J.: Public Parts: How Sharing in the Digital Age Improves the Way We
      Work and Live. Simon & Schuster (2011)



