=Paper= {{Paper |id=Vol-1642/paper2 |storemode=property |title=Learning Gamification Design - An Usability First Approach for the Enterprise Infoboard Experiment |pdfUrl=https://ceur-ws.org/Vol-1642/paper2.pdf |volume=Vol-1642 |authors=Michael Meder,Till Plumbaum,Sahin Albayrak |dblpUrl=https://dblp.org/rec/conf/sigir/MederPA16 }} ==Learning Gamification Design - An Usability First Approach for the Enterprise Infoboard Experiment== https://ceur-ws.org/Vol-1642/paper2.pdf
      Learning Gamification Design - An Usability First
      Approach for the Enterprise Infoboard Experiment

             Michael Meder                         Till Plumbaum                       Sahin Albayrak
         DAI-Labor, TU Berlin                   DAI-Labor, TU Berlin               DAI-Labor, TU Berlin
          Ernst-Reuter-Platz 7                   Ernst-Reuter-Platz 7               Ernst-Reuter-Platz 7
         10587 Berlin, Germany                  10587 Berlin, Germany              10587 Berlin, Germany
           meder@dai-lab.de                         till@dai-lab.de                   sahin@dai-lab.de



                                                                 users with more space and LinkedIn3 is motivating
                                                                 users to complete their profiles by presenting progress
                        Abstract                                 bars.
                                                                     Coming along with the adaption of gamification in
    Gamification or gameful design attempts to                   di↵erent domains, new insights about advantages and
    raise participation through the application of               problems about the right usage are gained. Finding
    game design patterns and principles in non-                  the right means to increase motivation is a non-trivial
    game environments. It has successfully been                  task since motivation is mainly driven by human-
    applied but in many cases gamification fails                 centric factors [Yee07]. Taking a deeper look into
    due to di↵erent kind of design phase pitfalls.               e↵ects coming with the increased usage of gamifica-
    Several game and gamification design tax-                    tion becomes unavoidable in the enterprise domain.
    onomies and guides exists. But it is hard to                 As enterprises starting to adapt gamification to en-
    select the right one for a specific application              hance employee engagement and participation, the
    of gamification. One of the causes is probably               question arises what motivates people being on top of
    the fact that engineers try to implement what                the leaderboard while others seem to completely ignor-
    experienced game designer should do. We pro-                 ing it? The goal of this research is to examine factors
    pose to apply data mining on user interaction                motivating people to participate. We argue that gam-
    data of gamified applications to extract in-                 ification design must be user specific to successfully
    sights to support and adapt the application                  apply gamification. We also argue that especially in
    of gamification. Therefore we started the In-                enterprises, it is even more important using the right
    foboard experiment – a two phase gamification                mechanisms. While there are works about gamifica-
    study of a cutting-edge enterprise information               tion design and which elements to apply when, we
    sharing system.                                              believe that by using data mining methods to deter-
                                                                 mine the types of users existing in a company and to
1    Introduction                                                learn what elements best suits them would be a major
Gamification, the use of game-mechanics in non-                  leap in successfully introducing gamification. We will
gaming contexts [DDKN11, HH12], has been widely                  not discuss questions regarding implications of di↵er-
adapted by di↵erent services during the last years. We           ent gamification methods, such as leaderboards, and
see online services like Stackoverflow1 using a reputa-          what a user’s position on the leaderboard says about
tion leaderboard where users get points for helpful an-          the work performance, for example.
swers. Dropbox2 rewards users helping acquire new                    In this paper, we describe an experiment to col-
                                                                 lect data needed to learn user types and correspond-
    Copyright c by the paper’s authors. Copying permitted for    ing mechanisms. We present an enterprise informa-
private and academic purposes.                                   tion system we build, the Infoboard application, and
     In: F. Hopfgartner, G. Kazai, U. Kruschwitz, and M. Meder   the experiment design to collect the data needed for
(eds.): Proceedings of the GamifIR 2016 Workshop, Pisa, Italy,   the machine learning approach. The experiment is de-
21-July-2016, published at http://ceur-ws.org
   1 http://stackoverflow.com/
   2 http://dropbox.com/                                           3 http://linkedin.com/
signed to be conducted in di↵erent phases, and will         ther, Farzan et al. [FD08] also studied if there is any
be explained in detail in section 4. We also present        noticeable e↵ect on the usage when the points sys-
the needed basics and fundamentals to conduct enter-        tem is explicitly explained to the users. Therefore,
prise gamification research. Thus, we summarize the         they provided further details via email and repeated
contributions of this work as follows:                      the experiment. They conclude that points systems
                                                            can successfully be employed to motivate users to con-
    1. We give an overview of the current state of the      tribute more in an enterprise social network system,
       art of enterprise gamification (section 2).          especially if combined with email notifications. Fur-
    2. We describe user-centric gamification design tak-    ther, they conclude that the type of contribution can
       ing into account user types and cultures (section    directly be controlled by the type of gamification ap-
       3).                                                  plied, i.e., increasing the points for certain types of
                                                            contributions will indeed result in an increase of contri-
    3. We describe an experimental set-up to gather user    butions of this type. In a follow-up experiment, Farzan
       data and outline machine learning approaches to      et al. [FDB09] increase the social interaction and di-
       learn and match users and gamification elements      versity of content even further by introducing a badge
       (section 4 and 4.2.3).                               based approach on promoting content. Although they
                                                            observe an increased activity due to the introduction
   The goal of the experiment is to collect a dataset
                                                            of gamification methods, the authors argue that they
that can be used for the previously mentioned machine
                                                            cannot make any statement about the quality of the
learning approach to gamification. As to the best of
                                                            contributions. Further studies are needed to examine
our knowledge no dataset for this topic exists, we be-
                                                            this in detail. Overall they show that points and sta-
lieve that this work is an essential step to collect the
                                                            tus levels motivates more activity by IBM employees
data and enable further research. We therefore ex-
                                                            within Beehive and this also inspires further activities
plicitly invite others to give feedback regarding the
                                                            by other users. It is also important that the incen-
experimental setup to be able to collect the best data
                                                            tive mechanisms incent continually to bring a return,
possible.
                                                            which was the weakness of their static points [FD08]
                                                            system.
2     Current State of
                                                                Evaluating the e↵ect of gamification methods from
      Enterprise Gamification                               a di↵erent perspective, Thom et al. [TMD12] study
Various studies indicate that gamification has a posi-      whether the removal of gamification features from an
tive e↵ect on the use of enterprise systems. In [Cas07],    enterprise social media system has any measurable ef-
Dugan et al. describe the transformation of an enter-       fect on user activity. They report a significant decline
prise bookmarking system into a guessing game called        of user activities after removing gamification features.
Dogear. In this game, bookmarks and their tags are          Interestingly, the authors also noticed some relation
displayed on screen and the players have to guess,          between user activity and their geographical location.
who created this bookmark. If they guess the correct        Further the authors conclude that the organizational
creator of the bookmark, the players can gain points        culture and the local culture should play a role in gam-
which is inspired by von Ahn’s ESP game [vAD04].            ification design.
They report that within the first month of the release          Hamari [Ham13] evaluates the use of badges in a
of the system, they had 87 active players from 10 dif-      peer-to-peer trading service. He observes that the in-
ferent countries. A detailed analysis is missing though.    troduction of gamification mechanisms does not auto-
   Farzan et al. [FDM08] examine the impact of game         matically result in an increased use of the system by all
mechanics, more precisely the introduction of a points      users, but that those users, who actively inspect their
system, on a social enterprise network system (Bee-         own badges become more active. This supports our
hive, IBM). They evaluate the impact of this points         assumption that individual behavior plays an impor-
system by performing A/B testing, i.e., one half of all     tant role in the successful application of gamification
users are made aware of the points system, while the        methods in an office scenario.
other half (i.e., the control group) cannot see this fea-       Stanculescu et al. [SBSH16] examined which game
ture. They observe that overall, the introduction of        design element is more e↵ective for a predefined goal.
the points system increased the activity level of users     They applied points, badges and leaderboard to an
within the system. However, they also report that 72%       enterprise learning and social interaction Web appli-
of the users in the experimental group never visited        cation. In total they compared four treatment groups
the page which describes how to earn points. More-          with either enabled a leaderboard or badges or both
over, they argue that a large portion did not even no-      or none of them. Whereas points were enabled for all
tice the existence of points. Addressing this issue fur-    groups. The results of the study indicates that there
is no di↵erence if only the leaderboard or only badges       nition the goal is rather geared towards the (improved)
are visible to the user. Whereas the combination of          user experience itself, in Huotari and Hamari’s defini-
both, leaderboard and badges visible, will result in an      tion it is the outcome driven by the user experience.
even greater e↵ect.                                          We agree more with Deterding’s definition aiming on
   Summarizing, previous research reports an increase        the “improvement of the user experience” achieved by
of users’ activity in an enterprise due to diverse game      gamification.
design elements. But only for some users and for a
short period of time [FDM08]. Remarkable is that             3.1   Player Types
we could not find statements about the usability or
                                                             Game designers take advantage of player types [HT14]
user experience of the systems before or without the
                                                             or play-personas [CD09] to set some boundaries for
application of gamification. We think to understand
                                                             the game design element selection process towards a
gamification we should aim to measure the pure e↵ect
                                                             user centered game design. Designing gamification is
of gamification by minimizing disruptive factors such
                                                             also always a user-oriented process. This is due to the
as bad usability of the system itself. We also notice
                                                             fact that users are all individuals driven by di↵erent
that the existing work is hardly comparable as stud-
                                                             input factors like age, gender, education, social skills
ies are usually conducted in closed systems, and no
                                                             and cross-cultural influences [HK13, Kha11, YMT+ 11,
data is publicly available. However, these studies also
                                                             Yee07, YDN12]. In the game world this is considered
indicate that individual behavior has a significant in-
                                                             by several player typologies developed on user obser-
fluence on the success of gamification. Up to now most
                                                             vations and in-game behavior. The evolution seems to
studies recommend an examination beyond question-
                                                             went from Bartles 4 and later 8 player types to Yee’s
naires to understand users’ or employees’ actual behav-
                                                             3 motivation components or 10 motivation subcompo-
ior with gamified application. Therefore, we attempt
                                                             nents [Yee07]. Hamari et al. [HT14] list existing game
to better understand employees’ behavior in more de-
                                                             player typologies and state that player types have their
tail by gathering users’ interaction data and applying
                                                             legitimation because of the di↵erent behavior and mo-
machine learning techniques. The interaction dataset
                                                             tivation of players. It is a wide-spread assumption that
should meet demands regarding reproducibility of re-
                                                             also for the gamification scenario such types of players
sults and the collected data should also be ’clean’, e.g.
                                                             or users can be applied. Although many player typolo-
the influence of a bad user experience should be mini-
                                                             gies exist we argue that it is hard to map them to one
mized. Thus, the experiment we conduct has the goal
                                                             or more specific game design elements. Beyond that,
to produce ’clean’ data and the data should also be
                                                             such types could change over time which seems to be
available for research.
                                                             a central criticism on player typologies [HT14].

3    Gamification Design                                     3.2   Game Design Elements
Before explaining the experiment itself, we will intro-      An important aspect of successful gamification is the
duce the game design elements and approaches that            selection of game design elements. Game design el-
mainly influenced our Infoboard approach.                    ements determine what type of gameful experiences
   In 2011, two definitions of gamification were pub-        are generated for the users. In [DDKN11], Deterd-
lished. Deterding et al. [DDKN11] define gamification        ing et al. provide five levels of game design elements.
as “the use of game design elements in non-game con-         They distinguish between game interface design pat-
texts”. Huotari and Hamari [HH12] define it as “a pro-       terns, game design patterns and mechanics, game de-
cess of enhancing a service with a↵ordances for game-        sign principles and heuristics, game models and game
ful experiences in order to support user’s overall value     design methods. Robinson et al. [Rob13] propose a
creation”. Hamari et al. summarized in [HKS14] both          taxonomy built on levels of expected engagement and
definitions “as a process of enhancing services with         the required commitment of the user. This taxonomy
(motivational) a↵ordances in order to invoke gameful         has been conceived as a decision support for game el-
experiences and further behavioral outcomes.” We in-         ement selection.
terpret both definitions as implying a goal as the utility      Motivational a↵ordances, interface design patterns
of gamification. Both describe elements of the game          with a stimuli to action or “properties of an object
design world which could change a user’s experience          that determine whether and how it can support one’s
in a di↵erent context (non-game [DDKN11], service            motivational needs” [Zha08], were found by Hamari et
[HH12]). Interestingly, for Deterding [DDKN11] “[...]        al. [HKS14] in 10 di↵erent forms in 24 examined stud-
the term ‘gameful design’ – design for gameful experi-       ies on gamification. Jia et al. [JXKV16] examined the
ences – was also introduced as a potential alternative       relation between the Big Five collections of personality
to ’gamification’.” Summarizing, in Deterding’s defi-        traits and motivational a↵ordances through a survey
which, among other things, asked for opinions to ex-       allows us to test and compare results. With this in
ample interactions shown with videos. The results of       mind, and the knowledge from the previous section,
their survey (N=248, mostly AMT4 ) indicate that con-      about the various game design aspects, we have build
sidering personality traits helps to make gamification     an application and designed an experiment to start
design choices. They plan to analyze the interaction       solving those problems. The path taken, starting with
with motivational a↵ordances on a real application.        this experiment, will allow us to apply di↵erent gam-
   Previous studies have shown that an improvement         ification elements to di↵erent users within the same
towards user activity and user experience is possible      application, resulting in an overall increase of motiva-
[HKS14]. Those studies also showed that the constel-       tion and participation.
lation of users (player motivations [Yee07] and player
types [Bar96, Bar03, HT14]) and motivational a↵or-         4.1   Infoboard Application
dances (interface design patterns) and game design el-
                                                           The Infoboard application is a modern enterprise in-
ements seems to be important for a successful appli-
                                                           formation system. The main goal of the system is
cation of gamification.
                                                           to provide users with relevant information and to en-
                                                           able knowledge exchange across enterprise department
3.3    Gamification Design Problem                         borders. It is build upon a distributed search engine
We argue that it is critical to measure challenges and     which provides information in the form of di↵erent
risks that occur due to di↵erent types of users before     kind of items from indexed data of enterprise sources
introducing gamification methods. Applying the right       (enterprise wiki, internal file server) and public sources
gamification element to the right user will increase the   (news articles, websites, scientific publications, confer-
motivation and participation while applying the wrong      ence calls and funding calls).
element can on the other hand have negative e↵ects.            Users can define own topics of interest and the In-
More importantly, due to the usually diverse set of        foboard will continuously search for new information
employees, and the accompanying set of diverse char-       and present results ordered by date allowing users to
acters coming from di↵erent cultures, finding the one      quickly find the latest information. All items found are
gamification element satisfying them all is almost im-     arranged as tiles with topic specific background colors.
possible.                                                  Whenever a new item has been found the tiles on the
   Finding an optimal user and game design elements        board will be updated and re-arranged. Fig. 1 shows
relation implies a goal or outcome we want to achieve      an exemplary tile. The tile itself contains information
with that relation. Thus, regarding the predefined goal    about the topic it belongs to (see upper right corner),
of a gamification implementation extends the user and      the source of information and date found (bottom right
game design element relation to a goal, user and game      corner above and under the line). The knowledge shar-
design elements relation. In this relation the right se-   ing is supported by the elements one can find in the
lection of game design elements is crucial to reach the    upper left corner and the bottom right corner. Users
goal.                                                      can up or down vote an information item. This vote
   Under the assumptions that (i) gamification targets     information a↵ects the ordering of the Infoboard of
various types of users that experience game design el-     the user who voted, but may also influence boards of
ements di↵erently; and (ii) gamification is deployed       other users with the same item on their board. In the
in order to achieve some goal in the broadest sense:       end, users sharing the same interests, will see the most
We consider the gamification design problem                interesting items. The tile also shows who voted the
as the problem of assigning each user (at least)           tile last, so that users can see and connect with others
one game design element that maximizes their               sharing the same interest.
expected contribution to achieve some goal.                    Fig. 2a shows an Infoboard of a user with di↵erent
                                                           topics, marked by the di↵erent colors. As explained,
4     Infoboard Experiment                                 users can vote items and also read the information.
                                                           Therefore, we included a reader view for usability rea-
Summarizing the previous sections, we have seen that       sons, see 2b, allowing users to read and then to vote
gamification in enterprises is a hot topic but comes       the article directly, without forcing users to switch be-
with several aspects that needs more research. We          tween di↵erent browser tabs.
have also seen that current research only analyzes cer-
tain gamification aspects and not a set of di↵erent        4.2   Experiment setup
methods for di↵erent users. And, especially from a
machine learning perspective, there is no dataset that     Our goal is not to compare a non-gamified with a
                                                           gamified version of an enterprise application simulta-
    4 Amazon Mechanical Turk                               neously. This has been done before and results have
(a) The Infoboard showing information for di↵erent topics.          (b) The reader view of the Infoboard allowing users to easily
Each topic gets a unique color assigned allowing users to easily    read and interact with information items.
distinguish between the topics.

Figure 2: Experiment phase I: Gamification is disabled. Only the Infoboard, the Reader, Settings and a FAQ
page is available. No game design elements are visibile to the users.
shown that gamification has positive e↵ects at least on    announcement email to all 120 employees at our lab,
some users for a short time [FDM08, FD08, FDB09,           we received a number of questions and feedback. Most
TMD12]. As explained in Section 3.3 our main goal          of them concerning the underlying data sources, the
is to understand which game design elements are pre-       sorting rules or how to change the topics. Surprisingly
ferred by which kind of user.                              a number of users also asked how to change the colors
    To be able to measure pure e↵ect of gamification,      assigned to each topic. Privacy was also a big issue,
of di↵erent gamification elements, we designed the ex-     because we show an Infoboard with general topics of
periment to minimize disruptive factors such as low        the enterprise on a screen inside the enterprises’ co↵ee
usability of the system itself, explained in Phase I. To   kitchen. And people wanted to know if we consider ac-
measure gamification e↵ects, users need to get di↵erent    cess rights restriction (of course we do). Based on this
gamification elements at di↵erent times. A/B tests,        feedback, we fixed some issues and decided to add a
usually a good choice to test such e↵ects are rather       page containing the frequently asked questions (FAQ)
difficult to realize in the enterprise because a random    with detailed answers one week after the application
assignment of treatment groups also bears a lot of         start.
risks due to discussions among participants [SBSH16]          Another week later we launched a first survey to
or “During the experiment, a few users in the control      measure the usability of the application. As a usability
group asked us why they couldn’t see their points and      metric we used the widely accepted and rather simple
submitted bugs [...]” [FDM08]. For those reasons we        System Usability Scale (SUS) [Bro96]. The score is cal-
designed a two phase experiment with an usability first    culated from 5 point Likert scale answers given on 10
approach, to make the system has a good usability and      standardized questions. A SUS score below 50 is not
users are less a↵ected by usability problems.              acceptable, between 50 and 70 is marginal and above
    The Infoboard application itself logs all user inter-  70 indicates that the usability is good or even excel-
actions with the system. This allows us to later process   lent [BKM09]. The Infoboard application achieved a
a dataset where we can learn interaction patterns of       score of 62.96 (N=16) which is rather lower marginal
users and the game design elements they correspond         usability. Unfortunately, this confirms our presump-
to.                                                        tion, that the usability of the application needs to be
                                                           improved.
4.2.1    Phase I: Warmup
                                                                   4.2.2   Phase II: Gamification enabled
We started with a basic version of the Infoboard with-
out any gamification elements. In this warmup phase                In the second phase of our experiment we will enable
we want to make sure that the core functionality is                gamification which results in visible game design el-
usable, understandable and free from major bugs. Be-               ements for the users. The current Infoboard system
cause we do not want to apply gamification to dis-                 already contains a set of di↵erent game design ele-
tract from deficiencies of a useless or faulty applica-            ments. They were carefully selected to ensure coverage
tion. This allows us to better analyze the a↵ects on               of Yee’s three main components of player motivations
di↵erent users achieved with gamification.                         [Yee07], Achievement, Social and Immersion, in order
   After we started the Infoboard experiment with an               to address the broad range of user types which allows
                                                              would exactly look like this to the users (same user be-
                                                              havior assumed) except for the booster bar (red) which
                                                              is zero because gamification is disabled users can not
                                                              redeem points for boosters until now.
                                                                 As we are currently in the first phase and trying
                                                              to maximize the usability score, the second phase has
                                                              not yet started. We are currently discussing to fur-
                                                              ther split the second phase into more sub-phases. In
                                                              each sub-phase, only game design elements for a cer-
                                                              tain player type will be enabled. We currently review
                                                              this approach to see if this is necessary or if it intro-
                                                              duces negative side e↵ects.

                                                              4.2.3   Interaction Data Mining
                                                              Very similar to the future work stated by Jia et al.
                                                              [JXKV16], we plan to analyze the collected user inter-
                                                              action data with motivational a↵ordances represented
                                                              by interface design elements on the Infoboard appli-
                                                              cation. We aim to find similar behaving user groups
                                                              based on their interactions. In [MJ14] we provide fur-
                                                              ther details on how to select (learn) a model to predict
                                                              appropriate game design elements. One approach is
                                                              based on the application of a support vector machine.
Figure 1: An Infoboard tile showing information about         As we assume sparse user behavior data we treat this
the item, the topic and knowledge sharing supporting          as a problem of regression learning for which a plethora
elements.                                                     of powerful mathematical methods are available. Fur-
                                                              ther we regard the gamification design problem as a
us to later detect and learn typical interaction patterns     special case of a recommendation problem for which
of these user types. Currently the following game de-         matrix factorization constitutes a state-of-the-art so-
sign elements are implemented and are ready to be             lution.
activated:
Achievement: points; badges; leaderboard; progress
                                                              5   Conclusion
   bar; levels                                                Research on enterprise gamification is still in the early
                                                              stages. Especially given the influence and e↵ects of di-
Social: feedback messages; user activities stream;            verse personnel, regarding character and culture. We
    group achievements (total points)                         think that machine learning approaches can help us
Immersion: customization (color themes);  re-                 determine the best game design element for a user,
   deemable points to buy a booster (a points                 based on the users’ interaction patterns. Unfortu-
   multiplier, positive feedback loop)                        nately, there is currently no dataset available that al-
                                                              lows us to apply machine learning methods. In this pa-
Fig. 3 shows a few examples of these game design ele-         per, we introduced the Infoboard and an experiment,
ments which are currently not visible for the users. On       which is an essential step to collect the data and en-
the left side of Fig. 3a one can see the leaderboards         able further research. The experiment is subdivided
(overall and monthly), in the middle the achieved             into di↵erent phases, where users interact with di↵er-
points of all users and on the right the latest user activ-   ent functional characteristics. In the first phase, which
ities. Fig. 3a shows the points detail view of a specific     is currently running, users only interact with a non-
user with information about the current level status.         gamified version to detect and fix influences of errors
We already count points and badges in the background          and usability flaws. In the following phases, we will
on the currently deployed application were gamifica-          enable gamification to see which user respond to what
tion is disabled yet. This give us similar insights to        element. The game design elements we used in the In-
A/B testing because we can compare e.g. the achieved          foboard were selected based on research about user
points and badges of both phases. The illustrations           and player types and previous gamification studies.
in Fig. 3 reflect the actual status and numbers after         We present these works and discuss the current state
two weeks. If gamification would have been enabled it         of the art to provide a comprehensive overview about
(a) Gamification statistics over all users of the system (leader-      (b) Points and level information view of a specific user.
boards, points and recent activities).

  Figure 3: Experiment phase II: Example game design elements of the application with gamification enabled.
the domain of enterprise gamification and game design  [CD09]      Alessandro Canossa and Anders Drachen.
research. As explained, the work presented in this pa-             Patterns of Play : Play-Personas in User-
per is work in progress. Nevertheless, it is a needed              Centred Game Development. In DiGRA
step to advance gamification research. An experiment,              ’09. Brunel University, 2009.
particularly in the domain of enterprise gamification,
                                                       [DDKN11] Sebastian       Deterding,      Dan    Dixon,
needs to be carefully conducted to minimize negative
                                                                   R Khaled, and L Nacke. From game
influences and unforeseen e↵ects.
                                                                   design elements to gamefulness: defining
                                                                   gamification. Proceeding of the 15th Inter-
6 Acknowledgment                                                   national Academic MindTrek Conference,
The authors would like to thank Tom Nick5 for pro-                 pages 9–15, 2011.
gramming and interface design support. The re-
                                                       [FD08]      Rosta Farzan and JM DiMicco. When the
search leading to these results was performed in the
                                                                   experiment is over: Deploying an incen-
CrowdRec project, which has received funding from
                                                                   tive system to all the users. In Persuasive
the European Union Seventh Framework Programme
                                                                   Technology. ACM, 2008.
FP7/2007-2013 under grant agreement N 610594.
                                                       [FDB09] Rosta Farzan, JM DiMicco, and Beth
References                                                         Brownholtz. Spreading the honey: a sys-
                                                                   tem for maintaining an online commu-
[Bar96]     Richard Bartle. Hearts, Clubs, Diamonds,
                                                                   nity. Proceedings of the ACM 2009 inter-
            Spades: Players Who Suit MUDs, 1996.
                                                                   national conference on Supporting group
[Bar03]     Richard A Bartle. A Self of Sense, 2003.               work, pages 31–40, 2009.

[BKM09]       Aaron Bangor, Philip Kortum, and James                [FDM08]    Rosta Farzan,       JM DiMicco,      and
              Miller. Determining what individual SUS                          DR Millen. Results from deploying a
              scores mean: Adding an adjective rat-                            participation incentive mechanism within
              ing scale. Journal of usability studies,                         the enterprise.    In Proceedings of the
              4(3):114–123, 2009.                                              SIGCHI conference on Human factors in
                                                                               computing systems, pages 563–572, 2008.
[Bro96]       John Brooke. SUS - A quick and dirty
              usability scale. Usability evaluation in in-          [Ham13]    Juho Hamari. Transforming Homo Eco-
              dustry, 189(194):4–7, 1996.                                      nomicus into Homo Ludens: A Field Ex-
                                                                               periment on Gamification in A Utilitar-
[Cas07]       Marty Moore Casey Dugan, Michael                                 ian Peer-to-Peer Trading Service. Elec-
              Muller, David R. Millen, Werner Geyer,                           tronic Commerce Research and Applica-
              Beth Brownholtz.     The dogear game:                            tions, 12(4):236–245, 2013.
              a social bookmark recommender system.
                                                                    [HH12]     Kai Huotari and J Hamari. Defining gam-
              In Proceedings of the 2007 international
                                                                               ification: a service marketing perspective.
              ACM conference on Supporting group
                                                                               Proceeding of the 16th International Aca-
              work, pages 387–390, 2007.
                                                                               demic MindTrek Conference, pages 17–22,
   5 github: https://github.com/TN1ck                                          2012.
[HK13]    Juho Hamari and Jonna Koivisto. Social       [vAD04]   Luis von Ahn and Laura Dabbish. Label-
          motivations to use gamification: an empir-             ing images with a computer game. CHI
          ical study of gamifying exercise. Proceed-             ’04, 6(1):319–326, 2004.
          ings of the 21st European Conference on
          Information Systems SOCIAL, (JUNE):1–        [YDN12]   Nick Yee, Nicolas Ducheneaut, and Les
          12, 2013.                                              Nelson. Online gaming motivations scale:
                                                                 development and validation. CHI ’12,
[HKS14]   Juho Hamari, J Koivisto, and Harri Sarsa.              pages 2803–2806, 2012.
          Does Gamification Work? A Literature
                                                       [Yee07]   Nick Yee. Motivations for play in online
          Review of Empirical Studies on gamifica-
                                                                 games. Journal of CyberPsychology and
          tion. In In proceedings of the 47th Hawaii
                                                                 Behavior, 9:772–775, 2007.
          International Conference on System Sci-
          ences, 2014.                                 [YMT+ 11] Jiang Yang, Meredith Ringel Morris,
                                                                 Jaime Teevan, Lada A Adamic, Mark S
[HT14]    Juho Hamari and J Tuunanen. Player                     Ackerman, and One Microsoft Way. Cul-
          Types: A Metasynthesis. In Transactions                ture Matters : A Survey Study of Social
          of the Digital Games Research Associa-                 Q & A Behavior. In International AAAI
          tion, 2014.                                            Conference on Weblogs and Social Media,
                                                                 pages 409–416, 2011.
[JXKV16] Yuan Jia, Bin Xu, Yamini Karanam, and
         Stephen Voida. Personality-targeted Gam-      [Zha08]   Ping Zhang. Motivational A↵ordances:
         ification: A Survey Study on Personality                Reasons for ICT Design and Use. Com-
         Traits and Motivational A↵ordances. In                  munications of the ACM, 51(11):145–147,
         Proceedings of the 34th Annual ACM Con-                 2008.
         ference on Human Factors in Computing
         Systems - CHI ’16, 2016.

[Kha11]   Rilla Khaled. It’s Not Just Whether You
          Win or Lose : Thoughts on Gamifica-
          tion and Culture. In Workshop on Gam-
          ification: Using Game Design Elements
          in Non-Gaming Contexts, co-located with
          CHI2011, pages 1–4, 2011.

[MJ14]    Michael Meder and Brijnesh-Johannes
          Jain. The Gamification Design Problem.
          arXiv:1407.0843v1 [cs.HC], page 5, 2014.

[Rob13]   David Robinson. A Preliminary Taxon-
          omy of Gamification Elements for Varying
          Anticipated Commitment. In Proc. ACM
          CHI 2013 Workshop on Designing Gamifi-
          cation: Creating Gameful and Playful Ex-
          periences, 2013.

[SBSH16] Laurentiu Catalin Stanculescu, Alessandro
         Bozzon, Robert-Jan Sips, and Geert-jan
         Houben. Work and Play: An Experiment
         in Enterprise Gamification. CSCW ’16,
         pages 345–357, 2016.

[TMD12]   Jennifer Thom, D Millen, and Joan DiM-
          icco. Removing gamification from an en-
          terprise SNS. Proceedings of the ACM
          2012 conference on Computer Supported
          Cooperative Work, pages 1067–1070, 2012.