                                       Copyright © 2020 for this paper by its authors.
                  Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).




                         Gamification of an open access quiz with badges and
                         progress bars: An experimental study with scientists

                                             Athanasios Mazarakis1 and Paula Bräuer1
                                                1 Kiel University, Kiel 24118, Germany

                                                       a.mazarakis@zbw.eu
                                                        p.braeuer@zbw.eu



                            Abstract. Gamification is the use of game-design elements in non-game con-
                            texts. It has been used in many different contexts to motivate different audi-
                            ences, but its effect on scientists has not yet been sufficiently investigated. In a
                            field study with 28 scientists, the effects of the game design elements badge
                            and progress bar are examined for their potential to raise scientists' motivation
                            to answer questions in an open access quiz. Previous findings show that engag-
                            ing with the topic of open access can be perceived by scientists as “boring”. By
                            gamifying a quiz about open access with badges or a progress bar, we aim to
                            create incentives for scientists to engage with the topic. In our study, both game
                            design elements provide a statistically significant increase in the number of
                            questions answered compared to a control group.

                            Keywords: Gamification, Open Access, Motivation, Badge, Progress Bar,
                            Scientists


                     1      Introduction

                     Gamification refers to the use of game design elements in a context other than that of
                     a game [6]. The concept of gamification has meanwhile established itself in the field
                     of human-computer interaction [31]. In various application areas, from fitness and
                     health [19, 28] to working environments [18, 32, 36] to education and training [4, 17],
                     gamification has been shown to have a positive influence on motivation and perfor-
                     mance. But how gamification can be used to motivate scientists, who mostly work
                     under special conditions and are evaluated by special performance criteria, has not yet
                     been sufficiently researched [10].
                        Usually, work itself can have the potential to motivate, especially if a state of flow
                     is achieved [5]. Gamification is a method to create or promote such a flow state [15].
                     But in contrast to flow theory, which requires full concentration to enter a flow state
                     [5], gamification works with game design elements that are perceived both conscious-
                     ly and unconsciously.
                        Gamification studies with scientists are rare [10] and usually not conducted in an
                     experimental setting [11]. This is also the case for gamification studies in the context
                     of open access. In addition, scientists want to focus on other things than open access,
                     because they consider this topic complex and boring [20]. However, this topic is of
                     the highest importance for all scientists who publish their publicly funded results [8].

GamiFIN Conference 2020, Levi, Finland, April 1-3, 2020 (organized online)                                        62
                        The aim of this article is to examine the impact of the game design elements badge
                     and progress bar on scientists in the context of an open access quiz. In addition, it is a
                     first contribution towards researching the effect of gamification on scientists with an
                     experimental approach.
                        The rest of our article is structured as follows: An overview of the current state of
                     research is given in the following section. Section three describes the study in detail,
                     giving information about the procedure of our experiment, the measurements, and the
                     subjects who participated. The results follow in the fourth section. In the last section,
                     the results are discussed, and limitations and possible future research are pointed out.


                     2      Related work

                     Gamification is used in science to promote participation in citizen science projects
                     such as Galaxy Zoo or Foldit [2, 9, 30]. Through the use of game design elements,
                     incentives are created so that users generate new solutions or classify images, not just
                     once but preferably on a long-term basis.
                        Kidwell and colleagues provide empirical evidence of how badges can motivate
                     scientists to publish scientific results according to open science principles. The au-
                     thors showed that the award of open science badges motivates scientists to make their
                     research data freely available [21].
                        Scientific platforms such as ImpactStory and ResearchGate also use game design
                     elements such as badges or progress indicators to motivate scientists. Hammarfelt and
                     colleagues [16] look at the impact of gamification on these platforms. The authors
                     suspect that the use of game design elements creates a motivating feeling when publi-
                     cations and online interactions with other users are transformed into points. They also
                     assume that one's own position within the academic community can be more easily
                     determined through gamification. Scientific progress and comparison with other users
                     are thus facilitated by converting publications into points [16].
                        Feger and colleagues address the question of how gamification can be used in sci-
                     ence workplaces [10]. They present problems and challenges, such as the fact that
                     progress in scientific work is difficult to quantify. In a follow-up study, based on an
                     extensive survey of high energy physicists, they developed various gamified ap-
                     proaches to motivate scientists to provide reproducible data [11].
                        Mekler et al. [27] have already shown in a study on points, levels, and leaderboards
                     that individually applied game design elements can have an effect on behaviour. For
                     the present study, the two game design elements badge and progress bar were selected
                     in order to examine their effects individually and thus assess their potential to encour-
                     age scientists to answer questions in a quiz about open access. Our study is a prelimi-
                     nary study for a larger one, with the aim of improving the impact of gamification as a
                     tool for disseminating and improving knowledge about open access. Both game de-
                     sign elements are used to build nonfiction gamified experiences [37] and are easier to
                     reuse in other contexts than, for example, narratives.




                        Badges in the context of games and gamification are digital artefacts which are
                     given to the user for the fulfilment of certain tasks and therefore serve as a visual
                     representation [1]. They can be used to create a comparison with others or to chal-
                     lenge oneself [13]. The game design element badge has already proven itself in other
                     studies as an effective instrument for increasing user activity [14, 26]. Morris et al.
                     [29] have already shown in an experimental study that the use of badges and goals has
                     no influence on the acquired knowledge. Building on this, the present study will not
                     examine any learning effect, but only the motivation of the subjects to engage with a
                     specific topic for a longer period of time. In our context, this topic is open access.
                        In contrast to badges, the progress bar is a less frequently studied game design el-
                     ement in gamification [22, 34]. It is a simple visual way of informing users about
                     their progress. Initial studies have shown that even this simple form of feedback can
                     motivate users [26]. Besides games, this easily implemented element is also used in
                     many other areas, for example in software to show the progress of an installation
                     process, or in surveys to give subjects an overview of how many questions they still
                     have to answer. Despite the general assumption that progress bars should increase the
                     completion rate of surveys, studies have not been able to prove this effect [7] or have
                     even produced opposite results [23]. However, we have shown in a previous study
                     that progress indicators can have a positive motivational effect when a general audi-
                     ence answers a general quiz [26]. Therefore, we assume that this positive effect can
                     be expected with scientists as well and that the game design element progress bar can
                     motivate them to answer more questions in an open access quiz.


                     3      Method

                     The aim of our study is to investigate whether scientists can be motivated by a gami-
                     fied quiz to answer more questions on the topic of open access. To this end, a field
                     study in the form of an online quiz on the topic of open access was developed. A
                     between-subjects design was chosen to determine the effect of the individual game
                     design elements. The following three experimental conditions were used: a control
                     group (CG) without gamification, an experimental group with a progress bar (PB),
                     and an experimental group with badges (BA).

                     3.1    Hypotheses
                     In line with previous findings [14, 25, 26], the following two hypotheses were formu-
                     lated:
                        H1: Subjects in the group with a progress bar (PB) answer on average more open
                     access questions than the control group (CG).
                        H2: Subjects in the group with badges (BA) answer on average more open access
                     questions than the control group (CG).




                     3.2    Procedure
                     To test our hypotheses, we developed an experimental setting with a quiz. The quiz is
                     based on the idea that questions are only answered for as long as the subjects enjoy it.
                     This was explicitly stated to the subjects in the invitation e-mail and on the first page
                     of the experiment. There is no minimum number of questions to answer; instead, the
                     quiz can be finished at any time. Subjects were randomly and permanently assigned
                     to one of the three test conditions (CG, BA, or PB). All subjects reached the online
                     quiz through the same URL. Due to the random division into groups, the subjects
                     were not aware that there was more than one test condition. At the end of the experi-
                     ment, the mean number of questions answered is compared between the three groups.
                        The quiz developed for the field study consists of a total of 29 multiple-choice
                     questions on the topic of open access. Each question is provided with four possible
                     answers, of which exactly one is correct. In all three test conditions, the subjects are
                     immediately informed whether their answer to a question was correct or incorrect.
                     The correct answer is highlighted in green and wrong answers are displayed in red
                     (see Fig. 1). All questions are always asked in the same order, so that we can compare
                     the results between the different groups.




                       Fig. 1. Correctly answered question in the progress bar condition (left) and incorrectly an-
                                             swered question in the badge condition (right).

                        All questions address different aspects of open access. In a pre-test before conduct-
                     ing the experimental field study, five subjects were asked to give feedback on our
                     online quiz. All questions were described as "interesting" and "important". At this
                     point, the questions were tested exclusively with the non-gamified version of the quiz.
                     This form of presenting the questions in the pre-test was found to be "monotonous"
                     by the subjects. This supports the findings of Kelty [20] and provides the prerequi-
                     sites for using gamification to increase motivation to engage users with the topic for a
                     significantly longer period of time.




                         The badges are displayed to the user above the questions during the quiz (see Fig. 1,
                     right). Before a badge is unlocked, only a grey circle is visible in the overview, so the
                     subjects can see that further badges can be achieved. This can appeal to the collecting
                     instinct [37] and thus motivate users to continue with an activity. The badges were
                     designed in such a way that they can be unlocked as steadily as possible over the
                     course of the experiment. The subjects are not informed in advance about the criteria
                     for awarding the badges. This was done deliberately, so as not to encourage the sub-
                     jects to just unlock the badges instead of engaging thoroughly with the questions. In
                     addition, unexpected unlocking can have a motivating effect on people who react
                     positively to surprise effects [37]. Only after a badge is unlocked is the condition for
                     its award disclosed and the associated visual badge displayed.
                         Eight different badges were designed for the quiz. One badge is bound to a time
                     condition and is unlocked by answering a question within five seconds. Three badges
                     are awarded for answering a certain number of questions correctly in a row. Three
                     more badges are given for correctly answering specific individual questions, and the
                     last badge is awarded for answering all questions in the quiz.
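                         The badge criteria described above could be encoded, for example, as simple predi-
                     cates over the quiz state. The following Python sketch is purely illustrative: the badge
                     names, the streak thresholds, and the set of special question IDs are our assumptions
                     and are not taken from the study's implementation.

```python
# Illustrative sketch of badge-award logic; badge names, streak
# thresholds, and special question IDs are hypothetical assumptions.
from dataclasses import dataclass, field

@dataclass
class QuizState:
    # Each answer is a tuple: (question_id, answered_correctly, seconds_taken)
    answers: list = field(default_factory=list)
    total_questions: int = 29

def earned_badges(state: QuizState, special_questions=frozenset({4, 12, 21})):
    badges = set()
    streak = 0
    for qid, correct, seconds in state.answers:
        # Time-based badge: any question answered within five seconds.
        if seconds <= 5:
            badges.add("quick_thinker")
        # Streak badges for consecutive correct answers.
        streak = streak + 1 if correct else 0
        for threshold, name in [(3, "streak_3"), (5, "streak_5"), (10, "streak_10")]:
            if streak >= threshold:
                badges.add(name)
        # Badges tied to correctly answering specific individual questions.
        if correct and qid in special_questions:
            badges.add(f"question_{qid}")
    # Completion badge for answering every question in the quiz.
    if len(state.answers) == state.total_questions:
        badges.add("completionist")
    return badges
```

                     A state with four consecutive correct answers, the first within five seconds, would for
                     instance yield the time badge and the first streak badge, but not the completion badge.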
                         A simple horizontal progress bar was used for the other game design element (see
                     Fig. 1, left). However, the progress bar is not filled equally, but in uneven proportions.
                     This procedure is based on the assumption that the maximum number of questions
                     then cannot be predicted easily and that this vague approach might motivate the sub-
                     jects [24]. If the maximum number of questions were known, the subjects might be
                     additionally motivated to answer more questions, which would confound the results,
                     as this would set an implicit goal. It would then be impossible to attribute our results
                     solely to the game design elements, because the ceteris paribus assumption would be
                     violated by the intervening variable of the implicit goal. It is also known from re-
                     search on the influence of progress indicators on the completion rate of surveys that
                     they can be demotivating if they give participants the feeling that they are not pro-
                     gressing fast enough [4]. The variation in progress should add a random factor that is
                     more motivating than a monotonous increase in the display while still reflecting pro-
                     gress.
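                         The uneven filling could be realised, for instance, by drawing random positive step
                     weights and normalising them so the bar still ends at 100 %. This is a minimal sketch
                     under our own assumptions (the weight range and seeding are hypothetical; the study
                     does not describe its exact increments):

```python
# Illustrative sketch of an unevenly filling progress bar; the step
# weights and their range are hypothetical, not the study's values.
import random

def uneven_progress(n_questions=29, seed=42):
    """Return monotonically increasing progress values (0-100 %)
    with uneven step sizes, so the total cannot be inferred early."""
    rng = random.Random(seed)
    # Draw random positive weights, then normalise so they sum to 100 %.
    weights = [rng.uniform(0.5, 3.0) for _ in range(n_questions)]
    total = sum(weights)
    progress, cumulative = [], 0.0
    for w in weights:
        cumulative += 100.0 * w / total
        progress.append(round(cumulative, 1))
    return progress

steps = uneven_progress()
```

                     Normalising the weights keeps the bar strictly increasing while hiding the number of
                     remaining questions, which matches the design intention described above.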
                         At the beginning of the quiz, the subjects were asked to indicate their age and gen-
                     der, and after completing the quiz they were given the opportunity to leave a comment
                     in a text field. The subjects did not receive any compensation for their participation in
                     the experiment.


                     4      Results

                     Over a period of one week, 28 subjects were recruited at three German institutes that
                     are also project partners of the forthcoming larger study. Only employees of the insti-
                     tutes were contacted, no students. They received an e-mail with the request to partici-
                     pate in a study and could access the quiz anonymously by clicking a general link to
                     the page where the quiz was provided. Both in the e-mail and on the website, the
                     subjects were informed that they only had to answer questions for as long as they
                     enjoyed it. A total of three volunteers did not report their gender. Of the other sub-
                     jects, 12 were male and 13 female. The mean age was 38.57 years (standard deviation
                     13.90). The experiment presented here is a preliminary study of a larger study, which
                     explains the short runtime and the relatively small number of subjects.
                        On average, 23.32 of the 29 possible questions were answered (standard deviation
                     9.24). 18 subjects (64 %) answered all 29 questions, of which three were in the con-
                     trol group (corresponding to 27 % of this group), six in the group with badges
                     (100 %), and nine in the group with the progress bar (81 %). The evaluation thus
                     shows a ceiling effect that is particularly pronounced in the two experimental condi-
                     tions. Anticipating the discussion, additional questions should be developed for fur-
                     ther use of the open access quiz. This is also evident from the text field comments,
                     which the subjects could submit on a voluntary basis at the end of the experiment.
                     Several respondents from all three groups noted that they would have liked to answer
                     further questions.
                        A list of the number of subjects per test condition, as well as the number of those
                     who answered all questions, the mean number of answered questions, and the associ-
                     ated standard deviation, is given in Table 1.

                       Table 1. Total number of respondents, number of respondents with all answered questions,
                                          mean value and standard deviation per condition.

                                                                               Condition
                                                             CG                   PB         BA
                                   N                         11                   11          6
                                   All answered               3                    9          6
                                   Mean                     16.82               26.73*     29.00**
                                   SD                       10.00                7.21        0.00
                                                          *= p < .05; ** = p < .01


                        An analysis of variance was performed for the statistical evaluation. The Levene
                     test for homogeneity of variances of the number of answered questions yields a statis-
                     tically significant result with p = .001; the Levene statistic is 8.78. All following re-
                     sults are therefore based on unequal variances, and the corrected results are reported
                     accordingly.
                        The analysis of variance yields a statistically significant difference between the in-
                     dividual test conditions, F(2, 25) = 6.47, p = .005. Since homogeneity of variances is
                     not given, the Welch correction is applied. The comparison of the mean values of the
                     control group with the progress bar group yields a statistically significant result,
                     t(18.19) = 2.67, p = .008. Consequently, hypothesis H1 can be supported, and it can
                     be assumed that the progress bar motivated the subjects to answer more questions.
                        The comparison of the mean values between the control group and the experi-
                     mental group with badges also shows a statistically significant result, t(10.00) = 4.04,
                     p = .001. Hypothesis H2 can thus also be supported: the game design element badge
                     also motivated the subjects to answer further questions.
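                        The analysis pipeline reported above (Levene test, then Welch-corrected compari-
                     sons) can be sketched as follows. Since the raw per-subject counts were not published,
                     the data below are hypothetical and only illustrate the procedure, not the study's
                     numbers:

```python
# Sketch of the reported analysis on hypothetical data: a Levene test
# for homogeneity of variances, followed by Welch's t-test, which does
# not assume equal variances. The per-subject counts are invented.
from scipy import stats

cg = [29, 29, 29, 5, 8, 10, 12, 15, 18, 20, 10]   # control group (hypothetical)
pb = [29] * 9 + [20, 16]                           # progress bar group (hypothetical)

# Levene test: a small p-value indicates unequal variances.
levene_stat, levene_p = stats.levene(cg, pb)

# Welch's t-test (equal_var=False) applies the correction used in the paper.
t_stat, t_p = stats.ttest_ind(pb, cg, equal_var=False)

print(f"Levene p = {levene_p:.3f}, Welch t = {t_stat:.2f}, p = {t_p:.3f}")
```

                     The same pattern extends to the badge group; with the real data, the Welch correction
                     also adjusts the degrees of freedom, which is why non-integer values such as
                     t(18.19) appear in the results.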
                        The number of subjects in the individual test conditions differs considerably and,
                     in addition, is very low. This can lead to conservative results in terms of statistical
                     significance due to low test power [12]. Therefore, it is all the more remarkable that
                     statistically significant results were obtained. This is also shown by the effect size of
                     Δ = .99 for the comparison between the control group and the progress bar group.
                     The effect size for the comparison between the control group and the group with
                     badges is Δ = 1.22. Thus, a strong effect can be demonstrated for both comparisons.
                     For such a small sample size and the problems normally associated with it [12], the
                     results are particularly remarkable.
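                        The reported Δ values are consistent with Glass's Δ, which scales the mean differ-
                     ence by the control group's standard deviation; this reading of the Δ notation is our
                     interpretation, as the paper does not name the measure explicitly. Using the summary
                     statistics from Table 1:

```python
# Glass's delta: mean difference scaled by the control group's SD.
# The values below are the summary statistics reported in Table 1.
def glass_delta(mean_exp, mean_ctrl, sd_ctrl):
    return (mean_exp - mean_ctrl) / sd_ctrl

delta_pb = glass_delta(26.73, 16.82, 10.00)  # progress bar vs. control
delta_ba = glass_delta(29.00, 16.82, 10.00)  # badges vs. control

print(round(delta_pb, 2), round(delta_ba, 2))  # -> 0.99 1.22
```

                     Scaling by the control group's SD is the natural choice here, since the badge group's
                     SD is 0.00 and a pooled estimate would be distorted by the ceiling effect.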


                     5      Discussion and future work

                     The present study was able to show that the game design elements progress bar and
                     badge had a motivating effect in the context of an open access quiz. Based on the
                     results of Kidwell et al. [21], the assumption can thus be supported that these game
                     design elements can also be used in a different context to generate or increase motiva-
                     tion, here for a gamified quiz about open access.
                        Following the results of Mekler et al. [27], we were also able to observe an effect
                     of individual game design elements on the behaviour of the subjects in our study. We
                     could show for the two game design elements badge and progress bar that they can
                     individually influence behaviour in comparison to a control group. These results are
                     also consistent with those of Hamari [14], who showed positive effects for badges in
                     the context of a sharing community.
                        Our finding of a positive effect on the number of questions answered, in contrast to
                     the studies on the use of progress bars in surveys [7, 23], could be attributed to vari-
                     ous factors. For example, it could be due to the relatively small number of questions:
                     despite the manipulation of the progress bar, the subjects could see that they were
                     progressing relatively quickly. Another point could be the quiz setting, which differs
                     from a classical survey in giving immediate feedback on whether an answer is correct
                     or wrong. It is possible that this additional element positively influenced the effect of
                     the progress bar, as it is assumed that game elements influence each other [32].
                        The assumption of Feger et al. [10, 11] that gamification is also suitable for ad-
                     dressing the very special target group of scientists can likewise be supported on the
                     basis of the results of our preliminary study. Nevertheless, the practical feasibility of
                     this idea should be investigated in further experiments.
                        Despite the successful implementation of the field study, there are some limita-
                     tions. First of all, the small number of subjects is a problem for almost every field
                     study. In the present context, it can be explained by the one-week runtime and by the
                     fact that this is a preliminary study of a larger one.
                        The restriction that the data were collected exclusively in Germany could also have
                     had an influence on the results. The level of knowledge about and implementation of
                     open access may differ between Germany and other countries. Cultural factors may
                     also have an influence on the impact of gamification [35].
                        In addition, for further investigations of different game design elements with the
                     same research design, more questions need to be developed to avoid the ceiling effect
                     observed in this study, which occurred especially in the experimental conditions. This
                     is also in line with the comments from our subjects.
                        Another point that should be further examined is whether a cooperative design of
                     the game design elements would be preferable in the scientific context. According to
                     Feger et al. [11], the design of gamified scientific work environments should rather be
                     based on a cooperative approach, as the scientists surveyed seemed to prefer such a
                     design. Although the present study did not include a cooperative element, we achieved
                     very positive results. In order to demonstrate possible advantages of cooperative de-
                     sign methods, an experimental investigation of cooperative gamification approaches
                     would have to be carried out in the scientific context.
                        It should also be examined whether an online quiz with additional questions is too
                     artificial an approach to the question of how open access can be made more motivat-
                     ing. Instead, another way could be to let subjects make fictitious selection decisions in
                     an experiment (conjoint analysis), similar to a procedure described by Schöbel and
                     colleagues [33]. In such an experiment, for example, scientific articles could be sorted
                     into the order in which they would be prioritized for download. Different aspects of
                     open access could be visualised and highlighted by gamification, which is not possible
                     with a quiz or a survey. A further possibility for experimental research could be to
                     offer different publication options for one's own publication and to have the subjects
                     evaluate these according to the probability of submission. The variants could differ
                     with regard to the number, design, and quality of game design elements and biblio-
                     metric data.


                     Acknowledgments
                     This research was funded by the Federal Ministry of Education and Research (BMBF) in Ger-
                     many as part of the research project OA-FWM (16OA044C).


                     References
                      1. Antin, J., Churchill, E.F.: Badges in Social Media: A Social Psychological Perspective. In:
                         CHI 2011 Gamification Workshop Proceedings. pp. 1–4 , Vancouver, British Columbia,
                         Canada (2011).
                      2. Bowser, A. et al.: Using Gamification to Inspire New Citizen Science Volunteers. In: Pro-
                         ceedings of the First International Conference on Gameful Design, Research, and Applica-
                         tions - Gamification ’13. pp. 18–25 ACM Press, Toronto, Ontario, Canada (2013).
                       3. Buckley, P., Doyle, E.: Individualising Gamification: An Investigation of the Impact of
                          Learning Styles and Personality Traits on the Efficacy of Gamification Using a Prediction
                          Market. Computers & Education. 106, 43–55 (2017).
                          https://doi.org/10.1016/j.compedu.2016.11.009.
                      4. Conrad, F.G. et al.: The Impact of Progress Indicators on Task Completion. Interacting
                         with Computers. 22, 5, 417–427 (2010). https://doi.org/10.1016/j.intcom.2010.03.001.
                      5. Csikszentmihalyi, M.: Flow: The Psychology of Optimal Experience. Harper Collins, New
                         York, NY, USA (2009).
                       6. Deterding, S. et al.: From Game Design Elements to Gamefulness: Defining “Gamifica-
                          tion.” In: Proceedings of the 15th International Academic MindTrek Conference: Envisioning
                          Future Media Environments. pp. 9–15 ACM Press (2011).
                          https://doi.org/10.1145/2181037.2181040.

GamiFIN Conference 2020, Levi, Finland, April 1-3, 2020 (organized online)
                       7. Heerwegh, D., Loosveldt, G.: An Experimental Study on the Effects of Personalization,
                          Survey Length Statements, Progress Indicators, and Survey Sponsor Logos in Web
                          Surveys. Journal of Official Statistics. 22, 2, 191–210 (2006).
                       8. European Commission: Open access - H2020 Online Manual,
                          https://ec.europa.eu/research/participants/docs/h2020-funding-guide/cross-cutting-
                          issues/open-access-data-management/open-access_en.htm, last accessed 2019/10/09.
                      9. Eveleigh, A. et al.: “I Want to Be a Captain! I Want to Be a Captain!”: Gamification in the
                         old Weather Citizen Science Project. In: Proceedings of the First International Conference
                         on Gameful Design, Research, and Applications - Gamification ’13. pp. 79–82 ACM
                         Press, Toronto, Ontario, Canada (2013). https://doi.org/10.1145/2583008.2583019.
                     10. Feger, S. et al.: Just Not The Usual Workplace: Meaningful Gamification in Science. Ge-
                         sellschaft für Informatik e.V. 113–118 (2018). https://doi.org/10.18420/muc2018-ws03-
                         0366.
                     11. Feger, S.S. et al.: Gamification in Science: A Study of Requirements in the Context of Re-
                         producible Research. In: Proceedings of the 2019 CHI Conference on Human Factors in
                         Computing Systems - CHI ’19. pp. 1–14 ACM Press, Glasgow, Scotland UK (2019).
                         https://doi.org/10.1145/3290605.3300690.
                      12. Field, A.P.: Discovering Statistics Using IBM SPSS Statistics: And Sex and Drugs and
                          Rock 'n' Roll. SAGE, London, UK (2013).
                     13. Gibson, D. et al.: Digital Badges in Education. Educ Inf Technol. 20, 2, 403–410 (2015).
                         https://doi.org/10.1007/s10639-013-9291-7.
                      14. Hamari, J.: Do Badges Increase User Activity? A Field Experiment on the Effects of Gam-
                          ification. Computers in Human Behavior. 71, 469–478 (2015).
                          https://doi.org/10.1016/j.chb.2015.03.036.
                      15. Hamari, J., Koivisto, J.: Measuring Flow in Gamification: Dispositional Flow Scale-2.
                          Computers in Human Behavior. 40, 133–143 (2014).
                          https://doi.org/10.1016/j.chb.2014.07.048.
                     16. Hammarfelt, B.M.S. et al.: Quantified Academic Selves: The Gamification of Science
                         Through Social Networking Services. Information Research. 21, 2, (2016).
                      17. Hanus, M.D., Fox, J.: Assessing the Effects of Gamification in the Classroom: A Longitu-
                          dinal Study on Intrinsic Motivation, Social Comparison, Satisfaction, Effort, and Academic
                          Performance. Computers & Education. 80, 152–161 (2015).
                          https://doi.org/10.1016/j.compedu.2014.08.019.
                     18. Huschens, M. et al.: On the Role of Social Comparison Processes in Gamified Work Situa-
                         tions. In: Proceedings of the 52nd Hawaii International Conference on System Sciences -
                         HICSS 52. pp. 1446–1455 (2019).
                      19. Johnson, D. et al.: Gamification for Health and Wellbeing: A Systematic Review of the
                          Literature. Internet Interventions. 6, 89–106 (2016).
                          https://doi.org/10.1016/j.invent.2016.10.002.
                     20. Kelty, C.: BEYOND COPYRIGHT AND TECHNOLOGY: What Open Access Can Tell
                         Us about Precarity, Authority, Innovation, and Automation in the University Today. Cul-
                         tural Anthropology. 29, 2, 203–215 (2014). https://doi.org/10.14506/ca29.2.02.
                     21. Kidwell, M.C. et al.: Badges to Acknowledge Open Practices: A Simple, Low-Cost, Effec-
                         tive Method for Increasing Transparency. PLoS Biol. 14, 5, e1002456 (2016).
                         https://doi.org/10.1371/journal.pbio.1002456.
                     22. Koivisto, J., Hamari, J.: The Rise of Motivational Information Systems: A Review of
                         Gamification Research. International Journal of Information Management. 45, 191–210
                         (2019). https://doi.org/10.1016/j.ijinfomgt.2018.10.013.
                     23. Liu, M., Wronski, L.: Examining Completion Rates in Web Surveys Via Over 25,000 Re-
                         al-World Surveys. Social Science Computer Review. 36, 1, 116–124 (2018).
                         https://doi.org/10.1177/0894439317695581.
                      24. Marczewski, A.: Gamification: A Simple Introduction. Lulu.com (2013).
                     25. Mazarakis, A., Bräuer, P.: Badges or a Leaderboard? How to Gamify an Augmented Reali-
                         ty Warehouse Setting. In: Proceedings of the 3rd International GamiFIN Conference -
                         GamiFIN 2019. pp. 229–240 (2019).
                     26. Mazarakis, A., Bräuer, P.: Gamification is Working, but Which One Exactly?: Results
                         from an Experiment with Four Game Design Elements. In: Proceedings of the Technology,
                         Mind, and Society. p. 22:1 ACM, New York, NY, USA (2018).
                         https://doi.org/10.1145/3183654.3183667.
                     27. Mekler, E.D. et al.: Towards Understanding the Effects of Individual Gamification Ele-
                         ments on Intrinsic Motivation and Performance. Computers in Human Behavior. 71, 525–
                         534 (2015). https://doi.org/10.1016/j.chb.2015.08.048.
                      28. Miller, A.S. et al.: A Game Plan: Gamification Design Principles in mHealth Applications
                          for Chronic Disease Management. Health Informatics Journal. 22, 2, 184–193 (2016).
                          https://doi.org/10.1177/1460458214537511.
                     29. Morris, B.J. et al.: Comparing Badges and Learning Goals in Low- and High-Stakes
                         Learning Contexts. J Comput High Educ. 31, 3, 573–603 (2019).
                         https://doi.org/10.1007/s12528-019-09228-9.
                      30. Ponti, M. et al.: Science and Gamification: The Odd Couple? In: Proceedings of the 2015
                          Annual Symposium on Computer-Human Interaction in Play - CHI PLAY ’15. pp. 679–
                          684 ACM Press, London, United Kingdom (2015).
                          https://doi.org/10.1145/2793107.2810293.
                     31. Rapp, A. et al.: Strengthening Gamification Studies: Current Trends and Future Opportuni-
                         ties of Gamification Research. International Journal of Human-Computer Studies. 127, 1–6
                         (2018). https://doi.org/10.1016/j.ijhcs.2018.11.007.
                     32. Sailer, M. et al.: Fostering Development of Work Competencies and Motivation via Gami-
                         fication. In: Mulder, M. (ed.) Competence-based Vocational and Professional Education.
                         pp. 795–818 Springer International Publishing, Cham (2017). https://doi.org/10.1007/978-
                         3-319-41713-4_37.
                      33. Schöbel, S. et al.: More Than the Sum of Its Parts – Towards Identifying Preferred Game
                          Design Element Combinations. In: Learning Management Systems. pp. 1–12, Seoul,
                          South Korea (2017).
                     34. Sümer, M., Aydın, C.H.: Gamification in Open and Distance Learning: A Systematic Re-
                         view. In: Spector, M.J. et al. (eds.) Learning, Design, and Technology. pp. 1–16 Springer
                         International Publishing, Cham (2018). https://doi.org/10.1007/978-3-319-17727-4_115-1.
                      35. Thom, J. et al.: Removing Gamification from an Enterprise SNS. In: Proceedings of the
                          ACM 2012 Conference on Computer Supported Cooperative Work - CSCW ’12. pp. 1067–
                          1070 ACM Press, Seattle, Washington, USA (2012).
                          https://doi.org/10.1145/2145204.2145362.
                      36. Warmelink, H. et al.: Gamification of the Work Floor: A Literature Review of Gamifying
                          Production and Logistics Operations. In: Proceedings of the 51st Hawaii International
                          Conference on System Sciences. pp. 1108–1117 (2018).
                      37. Zichermann, G., Cunningham, C.: Gamification by Design: Implementing Game Mechan-
                          ics in Web and Mobile Apps. O’Reilly Media, Sebastopol, CA, USA (2011).