Exploring the connection between gamification and student engagement in computer-supported collaboration

Antti Knutas1[0000−0002−6953−0021], Timo Hynninen2[0000−0002−1354−001X], Annika Wolff1[0000−0002−6638−6677], and Jayden Khakurel1[0000−0002−1397−5478]

1 LUT University, Yliopistonkatu 34, Lappeenranta 53850, Finland
  (antti.knutas,annika.wolff,jayden.khakurel)@lut.fi
2 South-Eastern Finland University of Applied Sciences, Patteristonkatu 3, 50100 Mikkeli, Finland
  timo.hynninen@xamk.fi



Abstract. Collaboration platforms and online forums often employ gamification in order to produce a better end-user experience. However, designing gamified elements can be challenging, especially in the educational context of learning platforms: in order to support online learning activities, one must carefully consider which types of activities can be enhanced with gamified design. In this paper, we investigate the effect of gamification in relation to activities on a collaborative online platform. Specifically, we use the partial least squares path modelling method to determine the effect of gamification in relation to three different collaborative learning activities: 1) seeking help, 2) helping other students, and 3) receiving help. According to our analysis, gamification has a positive impact on students engaging in beneficial collaborative behaviour, such as asking questions and helping others by answering. Furthermore, gamification has an indirect impact on performance, measured as students receiving help on the platform.

                                Keywords: gamification · computer-supported collaborative learning · student
                                engagement · path modelling


                       1     Introduction
Collaborative learning is a learning method where students have a symmetry of action, knowledge and status, and a low division of labor [11]. Computer-supported collaborative learning (CSCL) facilitates this interaction with software tools and increases the potential for creative activities and social interaction [47]. Furthermore, CSCL is an approach used in online environments such as Massive Open Online Courses (MOOCs) to provide mutual learning opportunities, often implemented through a discussion forum as part of a ‘social learning’ strategy [16].
    However, in order for students to benefit from the collaboration, they first have to be engaged as users of the platform. Recent studies have shown that students can be guided towards educational goals such as collaboration by using gamification [35], the intentional use of game elements for a gameful experience of non-game tasks and contexts [9,27,45]. Furthermore, gamification approaches have had success




                      in engaging users of online programs [30] and engaging students in online discussions
                      [12].
    Points, badges, and leaderboards are among the most often used gamification elements [45] and are used in successful online collaboration platforms such as Stack Overflow and Quora. These elements have faced critique for being rather simple and overused [34], with authors calling for more holistic design of gamified systems [7,8,34]. However, according to earlier research [32], these gamification elements have had an impact on performance, if not on motivation. Our research goal is to further explore the connection between users engaging in collaborative behaviour and the most commonly used gamification features of online collaboration forums.
                          To realize our goals, we explored a previously collected survey dataset [29] from
                      an introduction to programming course using the Partial Least Squares Path Modelling
                      (PLS-PM) research method [20], which has also been used in previous gamification
research [3,22]. We used the method to create a path model to evaluate hypotheses about the relationships between students engaging with gamification features, other engagement behaviours, and collaborative outcomes.
                          In the next section, we review theory and present our hypotheses. In Section 3, we
                      present the research context and data analysis. In Section 4, we present our findings. The
                      paper concludes with Section 5.


                       2     Theory and hypotheses
                       2.1   Gamification in collaborative learning
Collaborative learning fosters the development of critical thinking through interactions such as discussion, clarification of ideas, and evaluation of others’ ideas, both in traditional classrooms and in online programs [14,19]. Gamification, the inclusion of game-like features such as points and badges in non-game contexts, has been considered one possible approach to stimulating engagement and influencing learners’ activities in collaborative learning [30]. However, Looyestyn et al. [30, p. 1] point out that ”engagement in online programs is difficult to maintain” and perform a systematic review to answer the question “Are gamification strategies effective in increasing engagement in online programs?”. The review [30] examined a variety of forms of gamification: leaderboards, badges, points and rewards, often in combination. The authors claim that leaderboards are a particularly effective gamification feature compared to extrinsic rewards such as badges and points, whose effects tend to wear off after a short period of novelty. The study concludes that gamification can increase engagement and downstream behaviours, and enhance related outcomes (e.g. health behaviour and academic performance) in online programs. Other authors, for example Ding et al. [12], applied a gamification approach and conducted two trial studies in the online discussion tool gEchoLu. The study concludes that gamification is still in its infancy and that students may have limited experience with it; for gamification to be successful, (i) every game element should be meaningful to students, and (ii) instructors should serve as facilitators.
                           Fundamentally, gamification aims to improve both user engagement and user expe-
                       rience [9]. The different empirical studies concerning the effect of gamification were




examined in a literature review by Hamari et al. [21], where the authors create a framework for examining users’ motivational affordances along with the psychological and behavioral outcomes of gamification. The study by Hamari et al. [21] stresses the role of the context in which gamification is applied, as well as the users’ qualities. The authors state
                      that many previous studies do not take into account the complexity of the phenomena
                      they observe: Gamification can be effective, but the effects are ”greatly dependent on the
                      context in which gamification is being implemented.”


                       2.2   Engagement and online discussion

                       Student engagement is often defined by three components: behavioural, as evidenced
                       through active engagement in learning tasks; cognitive, as evidenced through learning;
                       and emotional, as evidenced through the level of interest [1]. Of these, the most com-
                       monly used within online learning environments (and the easiest to capture at scale) is the
                       behavioural component, as indicated by a broad spectrum of online actions [37]. More
                       focused research has looked at how students engage in discussion forums. Once seen as
                       an extra to learning activities, forums are increasingly being recognised as an important
                       part of a ‘social learning’ approach, and are used by MOOCs such as Futurelearn to
support learning steps [16]. Participation in forums has been shown to lead to improved learning outcomes [40], with both student-student and instructor-student communication being important [13].


                       2.3   Research model and hypotheses

In this section, we present our research model and our hypotheses.
    Our research model is based on four constructs, all of which reflect behavioural engagement [1] with different platform features: engagement with (1) gamification features, (2) asking questions, and (3) answering questions, together with (4) performance, which we define as the number of times the forum user has received help. The survey is not based on a pre-published instrument but has similarities to the engagement measures used by Shin [46]. The research model and the hypotheses are presented visually in Fig. 1, and the hypotheses with their rationale are listed below.
                          Gamification has been shown to increase engagement with online platforms [45,30]
and in MOOCs [18,48]. From this, we derive hypotheses H1 and H2: forum users’ engagement with gamification features increases their behavioural engagement with collaborative features, operationalized as initiating and responding to discussions.


                       H1. Engaging with gamification features positively influences asking questions on the
                       platform.


                       H2. Engaging with gamification features positively influences answering questions on
                       the platform.




                        Participation in course forums has been shown to lead to improved learning outcomes
                       [40] and increased collaboration [29,30]. In this study, we evaluate if the increased
                       collaborative activity on the forum increases collaborative outcomes, defined as instances
                       of received help on the forum (H3a, H3b). Furthermore, we evaluate if the impact of
                       gamification on outcomes (H4) is mediated by collaborative activity in the form of asking
                       or answering questions.

H3a. Asking questions and seeking answers on the platform has a positive influence
                       on receiving help on the platform.

                       H3b. Answering questions on the platform has a positive influence on receiving help on
                       the platform.

                       H4. Engaging with gamification features indirectly increases the chances of positive
                       collaborative outcomes, or receiving help on the forum.
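Restating the research model compactly (our own notation; GAM = engaging with gamification features, ASK = asking questions, ANS = answering questions, REC = received help), the hypothesized structural paths are:

\[
  \mathrm{GAM} \to \mathrm{ASK}\ (\mathrm{H1}), \quad
  \mathrm{GAM} \to \mathrm{ANS}\ (\mathrm{H2}), \quad
  \mathrm{ASK} \to \mathrm{REC}\ (\mathrm{H3a}), \quad
  \mathrm{ANS} \to \mathrm{REC}\ (\mathrm{H3b}),
\]

with H4 corresponding to the indirect path GAM → ASK → REC; a direct GAM → REC path is also estimated when testing mediation in Section 4.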




                                                        Fig. 1. Research model.




                       3   Research context and methods

The study was conducted over the duration of a fourteen-week introduction to programming course that had 249 participating students, four assistants helping with the exercises, and one lecturer ultimately responsible for the course. Most of the participants were first-year university students. The course had weekly two-hour lectures, exercise sessions of the same duration, and weekly assignments that had to be returned through an online grading system. The grading depended on the returned weekly exercises, one larger exercise project, and a final exam. Additionally, lecture and online activity was rewarded with activity bonus points.
    The course had online support materials such as a programming guide, lecture notes, and video lectures. In the studied year, an online asynchronous collaboration system with gamification




elements was added to the course 3 . The use of the system was taught in the introductory lecture of the course, and the assistants also encouraged its use instead of questions by email, so that other course participants could benefit from the discussion.
                           The discussion system’s primary feature is the possibility for participants to post
                       questions or issues regarding the course and for others to participate in discussions.
                       The somewhat novel element that separates this discussion system from basic online
forums is the aspect of gamification. The system recognizes and rewards people who contribute constructively with answers, comments, or simply good questions. Other users of the system can upvote or downvote content, further rewarding and encouraging contributors. The users can also publicly display the points and achievement badges collected in the system. In essence, the system works much like Stack Overflow.
    An earlier study analyzed the same course’s student behaviour from a social network analysis perspective [29] using recorded system interactions. That study also collected a survey about users’ engagement behaviour, and in this paper we present a further, in-depth analysis of that survey. The students were surveyed about their use of the system, including how often they had engaged with gamification features, how often they had asked and answered questions, and from how many other students they had received help. Of the 73 users who replied to the survey, 61 had used the Q2A discussion system and 12 had not.
    Survey items are presented in Table 1. Survey items in the categories GAM, ANS, and ASK asked ”how often” and the answers were on a five-step Likert scale from ”never” to ”more than once per day.” Survey item REC1 asked ”how many” on a scale of one to six, with one denoting one and six denoting more than five. Engaging with the gamification system’s features is operationalized by asking participants how involved they were in checking their point status and voting, which is essentially like awarding points to other users. The archival nature of the data imposed some limitations on how the constructs were operationalized, such as measuring some constructs with only one variable and not having more diverse variables for the ”engaging with gamification” construct.


                       3.1     Data analysis

We analyzed the research model with a path modelling approach. Path modelling allows specifying a visual research model with multiple theory-based constructs and defining paths between them that represent hypothesized relationships between variables or constructs [20]. We selected the PLS-PM statistical modelling method with non-parametric bootstrapping to cope with the slight non-normality of the observed variables and the low sample size (for a path modelling approach) [20,23,24]. Furthermore, PLS-PM is an appropriate method because the sample size is limited, the research is based on secondary (archival) data, and the path model includes formatively measured constructs [44]. We applied the SmartPLS 3 software package for data analysis [39]. Our sample size was 61, which is sufficient for PLS-PM both according to the rule of thumb of ten times the largest number of paths pointing at any dependent construct [20] and
                        3
                            http://www.question2answer.org




Table 1. Survey items

Engaging with gamification features
GAM1.  During the weeks I worked on the course, I checked my point and badge status.
GAM2.  During the weeks I worked on the course, I voted on other users’ questions and comments.

Asking questions
ASK1.  During the weeks I worked on the course, I searched for and read useful message threads.
ASK2.  During the weeks I worked on the course, I asked questions on the platform.

Answering questions
ANS1.  During the weeks I worked on the course, I answered questions (on the QETA system).

Received help
REC1.  On the QETA platform, I received help from this many users.



according to power analysis guidelines adapted by Hair et al. [20] from [4] at a 5% significance level.
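As a worked instance of the ten-times rule (our own check): at most three structural paths point into any single construct, namely the received-help construct when the direct gamification path estimated for the mediation test is counted, so

\[
  n_{\min} = 10 \times 3 = 30 \;\le\; 61 = n_{\text{sample}} .
\]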



                      Specifying and validating the measurement model
In our model, we used both reflective and formative constructs [20]. This requires some consideration, as not all validity measures apply to formative constructs [20]. Assessing
                      convergent and discriminant validity using criteria similar to those associated with
                      reflective measurement models is not meaningful when formative indicators and their
                      weights are involved [2,20]. For formative and reflective constructs we validated the (1)
                      measurement reliability and (2) model structure. For reflective constructs, we additionally
                      validated the (3) discriminant validity. Furthermore, we assessed nomological validity,
                      according to which the relationships between the constructs in the path model, sufficiently
                      well-known through prior research, should be strong and significant [26]. The survey was
                      not based on existing measurement scales or instruments, which is its main weakness,
                      but after assessment, we decided that it is of sufficient nomological quality to proceed
                      with analysis.
                          The measurement reliability was assessed with composite reliability (CR) and aver-
                      age variance extracted (AVE) indicating convergent validity. The measurement reliabili-
                      ties are reported in Table 2. All of the convergent validity metrics were greater than the
                      thresholds recommended in literature (CR > 0.7; AVE > 0.5) [17,24]. Some constructs
                      are composed of only one variable because of the archival nature of the data, which
                      prevents the evaluation of their convergent validity. The fields where evaluation was
                      not possible are marked with N/A. However, other quality metrics were high enough
                      that we proceeded with analysis. Furthermore, single variable constructs are possible
                      in PLS-PM, especially in cases where the variable is a concrete, observable item, even
though their use decreases construct reliability [20]. It should be noted that because of this issue the analysis should be considered only an initial, exploratory investigation rather than full-fledged theory testing.
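    For reference, the standard definitions of AVE and CR, applied to the Gamification loadings in Table 2, reproduce the tabulated values (a worked check using the usual formulas, not taken from the original paper):

\[
  \mathrm{AVE} = \frac{\sum_i \lambda_i^2}{k}, \qquad
  \mathrm{CR} = \frac{\bigl(\sum_i \lambda_i\bigr)^2}{\bigl(\sum_i \lambda_i\bigr)^2 + \sum_i \bigl(1 - \lambda_i^2\bigr)} .
\]

With the Gamification loadings \(\lambda_1 = 0.907\) and \(\lambda_2 = 0.934\):

\[
  \mathrm{AVE} = \frac{0.907^2 + 0.934^2}{2} \approx 0.847, \qquad
  \mathrm{CR} = \frac{(0.907 + 0.934)^2}{(0.907 + 0.934)^2 + (1 - 0.907^2) + (1 - 0.934^2)} \approx 0.917 .
\]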




Table 2. Measurement reliabilities

                       Loading  t-value  p-value  Mean   SD      AVE    CR
Gamification                                                     0.847  0.917
  GAM1.                0.907    27.819   ***      1.590  0.875
  GAM2.                0.934    51.364   ***      1.721  0.889
Asking questions                                                 0.591  0.728
  ASK1.                0.946    28.115   ***      2.705  0.554
  ASK2.                0.534     2.216   *        1.492  0.617
Answering questions                                              N/A    N/A
  ANS1.                1        N/A      N/A      1.377  0.705
Received help                                                    N/A    N/A
  REC1.                1        N/A      N/A      3.492  1.743

*) Statistically significant at p<0.05, **) Statistically significant at p<0.01, ***) Statistically significant at p<0.001




    We analyzed the structure of the measurement model, where applicable, by the significance and weight of the factor loadings and by checking for cross-loadings between the latent factors. All loadings in the outer (measurement) model were significant, varying from an acceptable .534 to a good .946, indicating a valid model structure.
    We assessed the discriminant validity of the measurement model, firstly, by the square root of AVE (i.e. the Fornell-Larcker criterion [17]), according to which the square root of each construct’s AVE should be greater than that construct’s correlations with the other latent variables. Secondly, we verified that all item loadings exceed their cross-loadings [24]. The results indicate good discriminant validity of the measurement model. We did not use the heterotrait-monotrait ratio of correlations (HTMT) method, as the use of some of the validity methods is still unclear for formative or single-variable constructs [20].
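Stated compactly, the Fornell-Larcker condition checked above is (standard formulation, not reproduced from the paper):

\[
  \sqrt{\mathrm{AVE}_i} > \left|\operatorname{corr}(\xi_i, \xi_j)\right| \quad \text{for all latent constructs } \xi_j \neq \xi_i .
\]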


                      Validating the structural model
                      We used a bootstrapping method to evaluate the coefficients for their significance [5],
                      which is the recommended method for PLS-PM [20]. The bootstrap sample size was
n = 61, and resampling was performed 5000 times, parameters that follow best practices in the literature and are sufficient to evaluate the model [20,24]. We tested and
                      validated the quality of the structural model representing our hypotheses by evaluating
                      (1) collinearity issues and overall fit, (2) explanatory power, and (3) path significances.
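    To illustrate the resampling principle behind these significance tests (a minimal sketch only: ordinary least squares on synthetic data stands in for the PLS-PM inner-model estimation that SmartPLS 3 performs, and all variable names and data are hypothetical):

import numpy as np

def bootstrap_path(x, y, n_boot=5000, seed=1):
    """Non-parametric bootstrap of a standardized path coefficient.

    Illustration only: OLS stands in for the PLS-PM inner-model
    estimation performed by SmartPLS 3 in the actual analysis.
    """
    rng = np.random.default_rng(seed)
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    beta = float(np.polyfit(x, y, 1)[0])          # coefficient on the full sample
    n = len(x)
    boots = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)               # resample cases with replacement
        boots[b] = np.polyfit(x[idx], y[idx], 1)[0]
    return beta, beta / boots.std(ddof=1)         # coefficient and bootstrap t-value

# Synthetic example mirroring the study's sample size (n = 61)
rng = np.random.default_rng(0)
gam = rng.normal(size=61)                          # hypothetical "gamification" scores
ask = 0.6 * gam + rng.normal(scale=0.8, size=61)   # hypothetical "asking" scores
beta, t = bootstrap_path(gam, ask)
print(f"beta = {beta:.3f}, bootstrap t = {t:.2f}")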
                          We assessed the collinearity and the model fit in order to validate the structural model
and identify misspecification problems. The variance inflation factors (VIF) of the latent constructs did not indicate collinearity issues, with values clearly within the recommended range (0.2 < VIF < 5) [20]. Because of the hypothesis-testing objective of the paper,
                      we further assessed the overall fit of the structural model to the data in order to analyze
                      whether the model was specified correctly. For that purpose, we used the standardized
                      root mean square residual quality metric (SRMR < .08) [24] to evaluate estimation
                      error and misspecification of the model [20]. In our case, the model fit of SRMR = .106
                      indicates that serious misspecification of the structural model does not occur.
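    For reference, these two diagnostics have the following standard definitions (not taken from the paper); here \(R_j^2\) is obtained by regressing predictor construct \(j\) on the remaining predictor constructs, and \(s_{ij}\) and \(\hat{\sigma}_{ij}\) are the observed and model-implied correlations of the \(p\) indicators:

\[
  \mathrm{VIF}_j = \frac{1}{1 - R_j^2}, \qquad
  \mathrm{SRMR} = \sqrt{\frac{2}{p(p+1)} \sum_{i \le j} \bigl(s_{ij} - \hat{\sigma}_{ij}\bigr)^2} .
\]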




    The R² values for the latent variables in the path model range from 0.53 for ”answering” to 0.35 for ”received help”, i.e. from substantial to moderate according to guidelines interpreted by Henseler et al. [25] from [2], indicating acceptable explanatory power of the inner path model structure. If a good proportion of the variance is explained, the explanatory power of the model can be high regardless of the relatively low sample size [38].
                          We present the path significances and hypotheses testing results in the next section.


                       4     Findings

We analyzed the direct and indirect effects in the model, which are defined by hypotheses one to four and summarized in Table 3. The path model accounted for 36% of the variance in receiving help, 34% of the variance in asking questions, and 54% of the variance in answering questions.
                      The model shows that engaging with gamification features has a strong influence on
                      engaging both in asking and answering behaviour (H1 and H2). Furthermore, engaging
                      in asking behaviour affects receiving help (H3a). Engaging in answering behaviour does
                      not increase the chances of receiving help on the platform (H3b).
                          In order to test H4, we evaluated the mediation effects following a process by Zhao
                      et al. [49], which is also proposed by Hair et al. [20] for use with PLS-PM. First, we
established that at least one of the mediating effects is significant, with a path from
                      the gamification (GAM) construct to the received help (REC) through asking questions
                      (ASK), as displayed in Table 3. The second mediating effect through answering questions
                      (ANS) is not significant. Furthermore, the direct effect of GAM on REC is not significant
                      (β = 0.237, p = 0.054). After testing for and excluding direct and alternative mediating
                      effects, we conclude that GAM has an indirect effect on REC through only one construct,
ASK and is fully mediated by it (H4). The path coefficient β and significance p presented for H4 are the specific indirect effect through the ASK construct. To summarize, engaging
                      with gamification has an impact on receiving help only when it affects asking questions
                      behaviour.
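    The specific indirect effect reported for H4 is the product of the two path coefficients along the mediated route; as a check against Table 3 (small differences are due to rounding of the displayed coefficients):

\[
  \beta_{\mathrm{GAM} \to \mathrm{ASK} \to \mathrm{REC}} = \beta_{\mathrm{GAM} \to \mathrm{ASK}} \times \beta_{\mathrm{ASK} \to \mathrm{REC}} = 0.585 \times 0.626 \approx 0.37 .
\]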


Table 3. Effects and hypothesis support

Hypothesis  Path                                    Effect (β)  t-value  p-value  Supported
H1.         Gamification → Asking                    0.585       5.047   ***      Yes
H2.         Gamification → Answering                 0.731      11.499   ***      Yes
H3a.        Asking → Received help                   0.626       5.248   ***      Yes
H3b.        Answering → Received help               -0.236       1.514   n        No
H4.         Gamification → Asking → Received help    0.367       4.008   ***      Yes

n) not significant, *) Statistically significant at p<0.05, **) Statistically significant at p<0.01, ***) Statistically significant at p<0.001




                       5   Discussion and conclusion
In this paper, we explored the connection between students engaging with gamification features and engaging in computer-supported collaborative learning. The results indicate that engaging with gamification is positively associated both with desired behaviour and with positive outcomes on the collaborative discussion platform. Previous research has established that gamification can increase engagement with software systems [36,45,30] and activity in online discussions [12]. In this study, we additionally demonstrate that gamification increases engagement and that this is connected to participants receiving help in the forum. To summarize, students asking for help receive it and students who do not need help are engaged in providing answers, with both activities supported by gamification features.
                          The connection between gamification and increased collaboration in forum-based
                      CSCL systems suggests that asynchronous discussion systems structured in the manner
                      of Stack Overflow provide a valuable addition to courses on technical subjects. Here our
                      findings match the findings in other contexts on social learning, including the studies
                      by Ferguson et al. [16] on social learning in MOOCs and Romero et al. [40] on online
                      discussion forums. Furthermore, our findings imply that gamification is not only a useful
                      feature in supporting engagement in online learning environments [30], but can also
                      support beneficial outcomes, such as increased collaboration. Our research provides
                      some insight into mechanisms that can foster greater collaboration and social learning
                      opportunities. This is of value because gamification is a desirable feature in MOOC
                      platforms [28], but one that can be misunderstood or misapplied [10,15]. By providing
                      new evidence on how gamification affects phenomena present in collaborative and social
                      learning, the findings of this study can assist in the design of future social learning
                      features in MOOCs and other online educational environments.
    Our main contribution to the state of the art is confirming a connection between engaging with a gamification system and collaborative outcomes. While this study did not explore the underlying causes of effective gamification, the forum system follows design reflections found essential in other studies: 1) intrinsic integration, where the gamification system (e.g. content voting and points) should be an intrinsic part of the main activities [8]; 2) facilitation [12]; and 3) general support, from an education perspective [41], for the competence, relatedness, and autonomy factors of Deci and Ryan’s [6] self-determination theory. The connection between self-determination theory and the design of the case study is discussed further in [29].
    Another contribution of our study is confirming the findings by Mekler et al. [32] with a different research method and in a new context, providing evidence on the effectiveness of points in the context of CSCL. While not a replication study, it is still valuable, as more authors are calling for important findings in education research and beyond to be replicated [31,33] and for more in-the-wild studies of gamification [42]. Mekler et al. did take their study to a more advanced level by also exploring the effects of gamification elements at an individual level and by including motivation, as defined in Deci and Ryan’s self-determination theory [43], in addition to performance.
                          While we provide new additions to the scientific body of knowledge on gamification,
                      engagement and CSCL, we acknowledge some limitations. The main limitation is that
the model has minor imperfections because of the use of archival data, which has also




                       limited the number and granularity of constructs in our research model. However, the
                       constructs and our hypotheses were solidly based on earlier literature, and the results
supported the hypotheses. As future work, we agree with earlier publications [32,42] that point out the need for more in-the-wild studies with detailed, theory-driven models that include motivation-based constructs, providing more understanding of the connections between diverse types of gamification, motivation, and outcomes.

                       Acknowledgements
                      We thank our colleague Mika Immonen for sharing his knowledge on the PLS-PM
                      research method.

                       References
                        1. Axelson, R.D., Flick, A.: Defining student engagement. Change: The magazine of higher
                           learning 43(1), 38–43 (2010)
                        2. Chin, W.W.: The partial least squares approach to structural equation modeling. Modern
                           methods for business research 295(2), 295–336 (1998)
                        3. Codish, D., Ravid, G.: Academic course gamification: The art of perceived playfulness.
                           Interdisciplinary Journal of E-Learning and Learning Objects 10(1), 131–151 (2014)
                        4. Cohen, J.: A power primer. Psychological bulletin 112(1), 155 (1992)
                        5. Davison, A.C., Hinkley, D.V., et al.: Bootstrap methods and their application, vol. 1. Cam-
                           bridge university press (1997)
                        6. Deci, E.L., Ryan, R.M.: Motivation, personality, and development within embedded social con-
                           texts: An overview of self-determination theory. The Oxford handbook of human motivation
                           pp. 85–107 (2012)
                        7. Deterding, S.: Eudaimonic Design, or: Six Invitations to Rethink Gamification. SSRN Schol-
                           arly Paper ID 2466374, Social Science Research Network, Rochester, NY (Jul 2014)
                        8. Deterding, S.: The Lens of Intrinsic Skill Atoms: A Method for Gameful Design. Hu-
                           man–Computer Interaction 30(3-4), 294–335 (May 2015)
                        9. Deterding, S., Dixon, D., Khaled, R., Nacke, L.: From Game Design Elements to Gamefulness:
                           Defining ”Gamification”. In: Proceedings of the 15th International Academic MindTrek
                           Conference: Envisioning Future Media Environments. pp. 9–15. MindTrek ’11, ACM, New
                           York, NY, USA (2011)
                       10. Dichev, C., Dicheva, D.: Gamifying education: what is known, what is believed and what
                           remains uncertain: a critical review. International Journal of Educational Technology in Higher
                           Education 14(1), 9 (2017)
                       11. Dillenbourg, P.: What do you mean by collaborative learning? Collaborative-learning: Cogni-
                           tive and computational approaches 1, 1–15 (1999)
                       12. Ding, L., Kim, C., Orey, M.: Studies of student engagement in gamified online discussions.
                           Computers & Education 115, 126–142 (2017)
                       13. Dixson, M.D.: Creating effective student engagement in online courses: What do students
                           find engaging? Journal of the Scholarship of Teaching and Learning pp. 1–13 (2010)
                       14. Faja, S.: Collaborative learning in online courses: Exploring students perceptions. Information
                           Systems Education Journal 11(3), 42 (2013)
                       15. Falkner, N.J., Falkner, K.E.: Whither, badges? or wither, badges!: a metastudy of badges in
                           computer science education to clarify effects, significance and influence. In: Proceedings
                           of the 14th Koli Calling International Conference on Computing Education Research. pp.
                           127–135. ACM (2014)




                       16. Ferguson, R., Sharples, M.: Innovative pedagogy at massive scale: teaching and learning in
                           moocs. In: European Conference on Technology Enhanced Learning. pp. 98–111. Springer
                           (2014)
                       17. Fornell, C., Larcker, D.F.: Evaluating structural equation models with unobservable variables
                           and measurement error. Journal of marketing research pp. 39–50 (1981)
                       18. Gené, O.B., Núñez, M.M., Blanco, Á.F.: Gamification in mooc: challenges, opportunities and
                           proposals for advancing mooc model. In: Proceedings of the Second International Conference
                           on Technological Ecosystems for Enhancing Multiculturality. pp. 215–220. ACM (2014)
19. Gokhale, A.A.: Collaborative learning enhances critical thinking. Journal of Technology Education 7(1) (1995)
                       20. Hair Jr, J.F., Hult, G.T.M., Ringle, C., Sarstedt, M.: A primer on partial least squares structural
                           equation modeling (PLS-SEM). Sage Publications (2016)
                       21. Hamari, J., Koivisto, J., Sarsa, H.: Does Gamification Work? – A Literature Review of
                           Empirical Studies on Gamification. In: 2014 47th Hawaii International Conference on System
                           Sciences (HICSS). pp. 3025–3034 (Jan 2014)
                       22. Hamari, J., Shernoff, D.J., Rowe, E., Coller, B., Asbell-Clarke, J., Edwards, T.: Challenging
                           games help students learn: An empirical study on engagement, flow and immersion in game-
                           based learning. Computers in Human Behavior 54, 170–179 (2016)
                       23. Henseler, J., Dijkstra, T.K., Sarstedt, M., Ringle, C.M., Diamantopoulos, A., Straub, D.W.,
                           Ketchen Jr, D.J., Hair, J.F., Hult, G.T.M., Calantone, R.J.: Common beliefs and reality about
    PLS: Comments on Rönkkö and Evermann (2013). Organizational Research Methods 17(2),
                           182–209 (2014)
24. Henseler, J., Hubona, G., Ray, P.A.: Using PLS path modeling in new technology research:
                           updated guidelines. Industrial management & data systems 116(1), 2–20 (2016)
                       25. Henseler, J., Ringle, C.M., Sinkovics, R.R.: The use of partial least squares path modeling in
                           international marketing. In: New challenges to international marketing, pp. 277–319. Emerald
                           Group Publishing Limited (2009)
                       26. Henseler, J., Ringle, C.M., Sinkovics, R.R.: The use of partial least squares path modeling in
                           international marketing. In: New challenges to international marketing, pp. 277–319. Emerald
                           Group Publishing Limited (2009)
                       27. Huotari, K., Hamari, J.: Defining Gamification: A Service Marketing Perspective. In: Pro-
                           ceeding of the 16th International Academic MindTrek Conference. pp. 17–22. MindTrek ’12,
                           ACM, New York, NY, USA (2012)
                       28. Kasurinen, J., Knutas, A.: Publication trends in gamification: A systematic mapping study.
                           Computer Science Review 27, 33–44 (2018)
                       29. Knutas, A., Ikonen, J., Nikula, U., Porras, J.: Increasing Collaborative Communications
                           in a Programming Course with Gamification: A Case Study. In: Proceedings of the 15th
                           International Conference on Computer Systems and Technologies. pp. 370–377. ACM (2014)
                       30. Looyestyn, J., Kernot, J., Boshoff, K., Ryan, J., Edney, S., Maher, C.: Does gamification
                           increase engagement with online programs? a systematic review. PloS one 12(3), e0173403
                           (2017)
                       31. Makel, M.C., Plucker, J.A.: Facts are more important than novelty: Replication in the education
                           sciences. Educational Researcher 43(6), 304–316 (2014)
                       32. Mekler, E.D., Brühlmann, F., Tuch, A.N., Opwis, K.: Towards understanding the effects of
                           individual gamification elements on intrinsic motivation and performance. Computers in
                           Human Behavior 71, 525–534 (Jun 2017)
                       33. Moonesinghe, R., Khoury, M.J., Janssens, A.C.J.: Most published research findings are
                           false—but a little replication goes a long way. PLoS medicine 4(2), e28 (2007)
                       34. Nacke, L.E., Deterding, S.: The maturing of gamification research. Computers in Human
                           Behavior 71, 450–454 (Jun 2017)




                       35. Nah, F.F.H., Zeng, Q., Telaprolu, V.R., Ayyappa, A.P., Eschenbrenner, B.: Gamification
                           of education: a review of literature. In: International Conference on HCI in Business. pp.
                           401–409. Springer (2014)
                       36. Nah, F.F.H., Zeng, Q., Telaprolu, V.R., Ayyappa, A.P., Eschenbrenner, B.: Gamification of
                           education: a review of literature. In: International conference on hci in business. pp. 401–409.
                           Springer (2014)
                       37. Onah, D.F., Sinclair, J., Boyatt, R.: Dropout rates of massive open online courses: behavioural
                           patterns. EDULEARN14 proceedings pp. 5825–5834 (2014)
                       38. Prentice, D.A., Miller, D.T.: When small effects are impressive. Psychological bulletin 112(1),
                           160 (1992)
39. Ringle, C.M., Wende, S., Becker, J.M.: SmartPLS 3. Boenningstedt: SmartPLS GmbH (2015)
                       40. Romero, C., López, M.I., Luna, J.M., Ventura, S.: Predicting students’ final performance from
                           participation in on-line discussion forums. Computers & Education 68, 458–472 (2013)
                       41. van Roy, R., Zaman, B.: Why gamification fails in education and how to make it successful:
                           introducing nine gamification heuristics based on self-determination theory. In: Serious Games
                           and edutainment applications, pp. 485–509. Springer (2017)
                       42. van Roy, R., Zaman, B.: Unravelling the ambivalent motivational power of gamification: A
                           basic psychological needs perspective. International Journal of Human-Computer Studies
                           (2018)
                       43. Ryan, R.M., Deci, E.L.: Self-determination theory and the facilitation of intrinsic motivation,
                           social development, and well-being. American psychologist 55(1), 68 (2000)
                       44. Sarstedt, M., Ringle, C.M., Hair, J.F.: Partial least squares structural equation modeling. In:
                           Handbook of market research, pp. 1–40. Springer (2017)
                       45. Seaborn, K., Fels, D.I.: Gamification in theory and action: A survey. International Journal of
                           Human-Computer Studies 74, 14–31 (Feb 2015)
                       46. Shin, N.: Online learner’s ‘flow’experience: an empirical study. British Journal of Educational
                           Technology 37(5), 705–720 (2006)
                       47. Stahl, G., Koschmann, T., Suthers, D.: Computer-supported collaborative learning: An histori-
                           cal perspective. Cambridge handbook of the learning sciences 2006, 409–426 (2006)
                       48. Vaibhav, A., Gupta, P.: Gamification of moocs for increasing user engagement. In: MOOC,
                           Innovation and Technology in Education (MITE), 2014 IEEE International Conference on.
                           pp. 290–295. IEEE (2014)
                       49. Zhao, X., Lynch Jr, J.G., Chen, Q.: Reconsidering baron and kenny: Myths and truths about
                           mediation analysis. Journal of consumer research 37(2), 197–206 (2010)



