<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>CoFeel: Using Emotions for Social Interaction in Group Recommender Systems</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Yu Chen</string-name>
          <email>yu.chen@epfl.ch</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Pearl Pu</string-name>
          <email>pearl.pu@epfl.ch</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff1">
          <label>1</label>
          <institution>Human Computer Interaction Group, Swiss Federal Institute of Technology</institution>
          ,
          <addr-line>CH-1015, Lausanne</addr-line>
          ,
          <country country="CH">Switzerland</country>
        </aff>
      </contrib-group>
      <fpage>48</fpage>
      <lpage>55</lpage>
      <abstract>
        <p>Group and social recommender systems aim to suggest items of interest to a group or a community of people. One important issue in such environments is to understand each individual's preferences and attitudes within the group. Social and behavioral scientists have evidenced the role of emotions in group work and social communication. This paper examines the role of emotion in social interaction in group recommenders. We implemented CoFeel, an interface that allows users to provide emotional input in group recommenders. We further applied CoFeel in GroupFun, a mobile group music recommender system. Results of an in-depth field study show that, by exchanging feelings with other users, CoFeel motivates users to provide feedback on recommended items in a natural and enjoyable way. The results also show that emotions serve as an effective and promising means to elicit users' attitudes, and that they have the potential to increase user engagement in a group. Based on suggestions collected from users, we propose other potential recommendation domains for CoFeel. Paper presented at the Workshop on Interfaces for Recommender Systems 2012, in conjunction with the 6th ACM Conference on Recommender Systems. Copyright © 2012 for the individual papers is held by the papers' authors. This volume is published and copyrighted by its editors.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. INTRODUCTION</title>
      <p>Nowadays, sharing, coordination, cooperation and communication
among group members are becoming indispensable in online
environments. Such groups can be families selecting a recipe together,
colleagues working on the same project, or social club members planning
a cultural event. These are examples of small groups, normally fewer
than a hundred people. In a group environment, decision-making becomes
a problem due to information overload. Group recommender systems (GRSs)
aim to alleviate information overload by suggesting items to a group
of people.</p>
      <p>
        The group recommendation problem is not simply “the sum of the
members”
        <xref ref-type="bibr" rid="ref11">(Jameson, 2004)</xref>
        . As the audience moves from individuals to groups of people,
challenges arise such as aggregating preferences and reaching an
equilibrium of expectations. Picture yourself sitting together with your
friends and selecting a music playlist for a birthday party. The selection
process depends not only on verbal indications of preferences and choices,
but also on various non-verbal channels such as individuals’ emotions
within the group. Social and behavioral scientists have long been studying
the social role of emotion in group environments. Our goal is to establish
a basic understanding of using emotion for social interaction in group
recommenders, with a particular focus on the following two questions.
      </p>
      <p>1) What are the roles of emotional information in group
recommender systems?</p>
      <p>2) How can we design an interface that is useful, easy to use
and playful?</p>
      <p>To answer these questions, we introduce CoFeel, an affective
interface that allows users to provide emotional input in
recommender systems. We further implemented CoFeel in
GroupFun, a mobile group music recommender system. The rest
of the paper is organized as follows. Section 2 discusses existing
work, how it relates to our work, and, in particular, why
emotions play an important role in group and social environments.
This is followed by the design and usage of the CoFeel interface in
Section 3 and how CoFeel is applied in GroupFun in Section 4.
After reporting the results of a small-scale qualitative user study
in Section 5, the paper discusses further application scenarios in
Section 6 and concludes with limitations and future work.</p>
    </sec>
    <sec id="sec-2">
      <title>2. RELATED WORK</title>
    </sec>
    <sec id="sec-4">
      <title>2.1 Social Interaction in Group Recommender Systems</title>
      <p>
        Jameson studied some of the key user issues for group
recommender systems
        <xref ref-type="bibr" rid="ref11">(Jameson, 2004)</xref>
        and investigated several measures for promoting collaboration
and coordination. These measures mainly aim at designing user
interfaces to enhance mutual awareness. Mutual awareness in group
recommender systems includes membership awareness, preference
awareness and decision awareness.
      </p>
      <p>
        Membership awareness allows users to check which users are in
the group. Being aware of the members of a group helps users
decide how to behave and thus enhances trust in a group
recommender
        <xref ref-type="bibr" rid="ref26">(Yu, Zhou, Hao, &amp; Gu, 2006)</xref>
        .
      </p>
      <p>
        Preference awareness enables users to be aware of the preferences
of other members. A user study on PolyLens reveals that users
would like to see each other’s preference information, even at the
expense of some degree of privacy loss. Preference awareness in
group recommender systems is categorized into three levels:
zero awareness, partial awareness and full awareness. Zero
preference awareness means that users only know their own
preferences, as in MusicFX
        <xref ref-type="bibr" rid="ref17 ref18">(J. F. Mccarthy, 1998a)</xref>
        . Zero preference awareness systems are simple but do not inspire
user trust. Partial awareness in group recommenders allows users to
apply preference information from other group members
        <xref ref-type="bibr" rid="ref15">(Kudenko, Bauer, &amp; Dengler, 2003)</xref>
        . However, it is prone to social loafing, a phenomenon in which
people contribute less in a social environment than when they work
individually. In full preference awareness, users are aware of other
members’ preferences. One typical technique is Collaborative Preference
Specification (CPS)
        <xref ref-type="bibr" rid="ref11">(Jameson, 2004)</xref>
        , as presented in CATS, PocketRestaurantFinder
        <xref ref-type="bibr" rid="ref17 ref18">(J. F. Mccarthy, 1998b)</xref>
        and the Travel Decision Forum. CPS in group recommender systems
enables persuasion, supports preference explanation and justification,
and reduces conflict. Decision awareness is important in helping users
arrive at a final decision. Decision awareness is a status in which
users are aware of the decision-making processes of other members.
Existing group recommender systems include the following
decision-making styles: (1) zero awareness - simply translating the
most highly rated solution into action without the consent of any
user (e.g. in MusicFX), (2) partial awareness - one or a selected
set of representatives of the group are responsible for making the
final decisions (e.g. INTRIGUE and PolyLens), and (3) full
awareness - arriving at the final decision through face-to-face
discussions (e.g., CATS) or mediated discussions (e.g., MIAU
        <xref ref-type="bibr" rid="ref15">(Kudenko et al., 2003)</xref>
        and the Travel Decision Forum). However, none of this work
addresses the role of emotion in decision-making or group interaction.
      </p>
    </sec>
    <sec id="sec-6">
      <title>2.2 Interface in Group Recommender Systems</title>
      <p>
        “Group interfaces differ from single-user interfaces in that they
depict group activity and are controlled by multiple users rather
than a single user”
        <xref ref-type="bibr" rid="ref6">(Ellis, J. Gibbs, &amp; Rein, 1991)</xref>
        . Therefore, interface adequacy imposes more requirements in group
recommenders than in individual recommenders. Flytrap
        <xref ref-type="bibr" rid="ref5">(Crossen,
Budzik, &amp; Hammond, 2002)</xref>
        visualizes recommended items using colors and locations. Songs
personalized for different users are displayed in different colors, and
the closer the songs are to the center, the more likely they are to be
played. PolyLens
        <xref ref-type="bibr" rid="ref4">(Connor,
Cosley, Konstan, &amp; Riedl, 2001)</xref>
        supports three models of recommendation visualization. The
group-only interface displays only movies from the group
recommendation. The composite interface displays a list of recommended
movies with both group and individual member predictions. The
individual-focused interface shows items matching other individual
users’ preferences. CATS
        <xref ref-type="bibr" rid="ref19">(K.
Mccarthy et al., 2006)</xref>
        offers users a personal space and a group space. In the group
space, each user has a snowflake of a different color, and the size of
the snowflake indicates the preference of the individual user. This
allows users to check the interest of other users in a particular
resort. Additionally, each icon represents a resort, and its size grows
or shrinks in accordance with the preference of the whole group.
      </p>
      <p>
        The Travel Decision Forum
        <xref ref-type="bibr" rid="ref24">(Taylor, Ardissono, Goy, &amp; Petrone,
2003)</xref>
        introduces an animated character for each group member who is
currently not available for communication. By responding with
speech, facial expressions and gestures to proposed solutions, such a
representative conveys to the current online users some key
aspects of its corresponding offline user’s responses to a proposed
solution. This is one of the few works that employ non-verbal
channels in a group environment.
      </p>
    </sec>
    <sec id="sec-7">
      <title>2.3 Emotion in Recommender Systems</title>
      <p>
        Musicovery (<ext-link ext-link-type="uri" xlink:href="http://musicovery.com/">http://musicovery.com/</ext-link>) and Stereomood (<ext-link ext-link-type="uri" xlink:href="http://www.stereomood.com/">http://www.stereomood.com/</ext-link>) have developed interactive
interfaces for users to select music categories based on their mood.
Musicovery classifies mood along two dimensions: dark-positive and
energetic-calm. It uses a highly interactive interface for users to
explore different emotion categories and their corresponding
music. However, such recommenders do not support interaction
in a social group environment. The main goal of studying
recommender systems is to improve user satisfaction. However,
satisfaction is a highly subjective metric. Masthoff and Gatt
        <xref ref-type="bibr" rid="ref16">(Masthoff, 2005)</xref>
        have considered satisfaction as an affective state
or mood based on the following aspects of socio- and
psycho-theories: 1) mood impacts judgement; 2) retrospective feelings
can differ from feelings experienced; 3) expectation can influence
emotion; and 4) emotions wear off over time. However, they did
not propose feasible methods to apply these psychological
theories. They also showed that in group recommender systems
members’ emotions can influence each other, a phenomenon called
emotional contagion.
      </p>
    </sec>
    <sec id="sec-8">
      <title>2.4 Emotions and Decision Making</title>
      <p>
        Our everyday experiences leave little doubt that our emotions can
influence the decisions we make. For instance, experimental results
        <xref ref-type="bibr" rid="ref21">(Raghunathan &amp; Pham, 1999)</xref>
        showed that in gambling decisions, as well as job-selection
decisions, sad individuals are biased in favor of high-risk,
high-reward options, whereas anxious individuals are biased in favor of
the opposite. On the other hand,
        <xref ref-type="bibr" rid="ref10">(Isen, 2001)</xref>
        presents evidence that in most circumstances positive affect
enhances problem solving and decision making, leading to cognitive
processing that is not only flexible and innovative, but also thorough
and efficient.
        <xref ref-type="bibr" rid="ref23">(Schwarz, 2000)</xref>
        has addressed the influence of moods and emotions experienced at
the time of decision making, the affective consequences of decisions,
and the role of anticipated and remembered affect in decision making.
        <xref ref-type="bibr" rid="ref2">(Bechara, 2004)</xref>
        further demonstrates the influence of emotions on decision-making
from a neurological perspective. (Velásquez, 1997) and
        <xref ref-type="bibr" rid="ref7">(Gratch
&amp; Rey, 2000)</xref>
        also modeled emotion-based decision making.
      </p>
    </sec>
    <sec id="sec-8-1">
      <title>2.5 Social Role of Emotions</title>
      <p>
        <xref ref-type="bibr" rid="ref12">(Keltner, 1999)</xref>
        integrates claims and findings concerning the social functions of
emotions at the individual, dyadic, group and cultural levels of
analysis. On the dyadic level (a group of two), emotional expressions
help individuals know others’ emotions, beliefs and intentions, and
thus rapidly coordinate social interactions. Emotional communication
also evokes complementary and reciprocal emotions in others that help
individuals respond to significant social events. Emotions serve as
incentives or deterrents for other individuals’ social behavior. On the
group level, emotions have been claimed to help individuals solve the
problem of identifying group members. Displaying emotions may help
individuals define and negotiate group-related roles and status.
Collective emotional behavior may also help group members negotiate
group-related problems. Study results from
        <xref ref-type="bibr" rid="ref13">(Ketelaar &amp; Tung Au, 2003)</xref>
        are discussed in terms of an “affect-as-information” model, which
suggests that non-cooperating individuals who experience the negative
state associated with guilt in a social bargaining game may be using
this feeling state as “information” about the future costs of pursuing
an uncooperative strategy.
        <xref ref-type="bibr" rid="ref3">(Bowles &amp; Gintis, 2002)</xref>
        suggest that prosocial emotions, such as shame, guilt, pride,
regret and joy, play a central role in sustaining cooperative
relations, including successful transactions in the absence of complete
contracting.
        <xref ref-type="bibr" rid="ref9">(Hareli &amp; Rafaeli, 2008)</xref>
        propose that organizational dyads and groups inhabit emotion
cycles: the emotions of an individual influence the emotions, thoughts
and behaviors of others; others’ reactions can then influence their
future interactions with the individual expressing the original
emotion, as well as that individual’s future emotions and behaviors.
        <xref ref-type="bibr" rid="ref1">(Barsade, 2001)</xref>
        showed that leaders transfer their moods to group members and
that leaders’ moods impact the effort and coordination of groups.
        <xref ref-type="bibr" rid="ref8">(Hancock et al., 2008)</xref>
        have investigated emotional contagion and showed that emotions
can be sensed in text-based computer-mediated communications.
      </p>
    </sec>
    <sec id="sec-9">
      <title>3. CoFeel: Providing Emotional Input</title>
    </sec>
    <sec id="sec-10">
      <title>3.1 Design Goals</title>
      <p>
        As a first step in investigating the social role of emotions, we
design an interface that helps users provide emotional input.
Since this input also constitutes users’ feedback, we use “emotional
input” and “emotional feedback” interchangeably in this paper. We refer
to the guidelines for designing recommender systems proposed by
        <xref ref-type="bibr" rid="ref20">(Pu, Chen, &amp; Hu, 2011)</xref>
        . CoFeel should meet the following design principles.
      </p>
      <p>Usefulness. Users are able to provide emotional
feedback using CoFeel.</p>
      <p>Ease of use. Users find CoFeel easy to use and easy to
learn.</p>
      <p>Playfulness. Users find it fun, playful and entertaining
to use CoFeel.</p>
    </sec>
    <sec id="sec-11">
      <title>3.2 What is it?</title>
      <p>CoFeel aims to enhance the group experience by enhancing
self-presence and mutual awareness within a group. By exchanging
feelings with other users, CoFeel aims to motivate users to
provide feedback on recommended items in a natural and easy
way. It is implemented as an infrastructure that can easily be
extended to various group recommendation domains.</p>
      <p>
        We choose the Geneva Emotion Wheel (GEW) introduced by Scherer
        <xref ref-type="bibr" rid="ref22">(Scherer, 2005)</xref>
        for users to label emotions, i.e., their attitude to
recommended items; see Figure 1. Using GEW to label emotions
has two advantages: natural tagging with discrete categorical words
and the possible mapping of these labels to a two-dimensional
space (valence-arousal). For each emotion, users can choose circles
of different sizes, and can thus assign different intensity
values to the emotion they choose.
We adopt Scherer’s color wheel style and choose 8 emotions for the
CoFeel Emotion Plate: excited, joyful, surprised, calm, sad, fear,
distressed and aroused, as shown in Figure 2. Each emotion class
provides a scale from 1 to 5 indicating the intensity of the emotion.
In order to enhance user engagement in interacting with
CoFeel, we design each emotional position as a hole, with a ball
rolling on the surface of the emotion plate. Users interact with the
plate by placing the ball in the hole that corresponds to their
emotional state. The aim of the plate-hole-ball metaphor is
to enhance the affordance of interacting with the interface.
      </p>
    </sec>
    <sec id="sec-11-1">
      <title>3.3 How to use it?</title>
      <p>We implement the CoFeel emotion plate on mobile phones. Since we
have chosen the metaphor of a plate, it is natural that a ball can
roll around its surface. Users can select an emotion, i.e., place
the ball, by tilting the plate surface. Once users confirm an
emotion, they can simply click a ‘track’ button, which is placed around
the emotion plate; see Figure 3. The phone detects user
movement and the direction of the plate surface using mobile phone
sensors, i.e., the accelerometer and gyroscope. We designed it this way
to make the process more fun and engaging. We also filter out the
constant accelerometer noise produced when users are walking,
travelling, etc., so that users can input their emotions in a
stable way.
      </p>
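      <p>The filtering step just described has to separate deliberate tilting from the ambient motion of walking or travelling. The paper does not specify which filter is used; the sketch below illustrates one common option, an exponential low-pass filter over raw accelerometer samples (all names and the smoothing factor are illustrative assumptions, not GroupFun's actual code):</p>

```python
def make_low_pass(alpha=0.2):
    """Exponential low-pass filter for raw accelerometer samples.

    alpha is a hypothetical smoothing factor: smaller values suppress
    more of the high-frequency noise produced by walking or travelling,
    at the cost of a slower response to deliberate tilting.
    """
    state = {"value": None}

    def step(sample):
        if state["value"] is None:
            state["value"] = sample  # initialize with the first reading
        else:
            state["value"] = alpha * sample + (1 - alpha) * state["value"]
        return state["value"]

    return step

# Smooth a noisy x-axis accelerometer stream before mapping it to the
# ball position on the emotion plate.
smooth = make_low_pass(alpha=0.2)
readings = [0.0, 1.0, 0.0, 1.0, 0.0]
smoothed = [smooth(r) for r in readings]
```

      <p>A real deployment would apply one such filter per axis and feed the smoothed vector to the ball physics, so that the ball stays stable while the user walks.</p>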
    </sec>
    <sec id="sec-12">
      <title>4. PROTOTYPE</title>
    </sec>
    <sec id="sec-13">
      <title>4.1 GroupFun: a music recommender system</title>
      <p>To test the applicability of CoFeel, we implemented
GroupFun, a mobile group music recommender system. Its
function is to generate common playlists for user-created
groups. Users can create groups and share their music tastes with
their group members by rating songs in GroupFun. When
GroupFun generates a common playlist for a group, the criterion
is to take into account the music tastes of all of its contributing
members. Figure 4 shows the group function of GroupFun. Users
can use CoFeel for two purposes: 1) providing emotional
feedback on a song and 2) leaving mood traces on the timeline of a
song.</p>
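      <p>The paper describes the playlist criterion only as taking all contributing members' tastes into account. One common group-recommendation strategy consistent with that description is average-rating aggregation, sketched below with hypothetical data (the function name, data shape and averaging strategy are illustrative assumptions, not GroupFun's actual algorithm):</p>

```python
def group_playlist(ratings, size=3):
    """Rank songs for a group by average member rating.

    `ratings` maps member name to a dict of {song: rating}. Averaging
    is one simple way to account for every member's taste; other
    strategies (e.g., least misery) are equally plausible here.
    """
    totals = {}
    counts = {}
    for member_ratings in ratings.values():
        for song, r in member_ratings.items():
            totals[song] = totals.get(song, 0) + r
            counts[song] = counts.get(song, 0) + 1
    avg = {song: totals[song] / counts[song] for song in totals}
    # Highest average rating first; keep the top `size` songs.
    return sorted(avg, key=avg.get, reverse=True)[:size]

ratings = {
    "alice": {"Paradise": 5, "We Will Rock You": 3},
    "bob": {"Paradise": 4, "We Will Rock You": 5, "Yesterday": 2},
}
playlist = group_playlist(ratings, size=2)
```
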
    </sec>
    <sec id="sec-14">
      <title>4.2 Providing Emotional Feedback to a Song</title>
      <p>Emotional feedback can be used as an explanation interface for
ratings. Users choose the emotion category and its intensity
using CoFeel; see Figure 5. As introduced in Section 3.3, users
hold the phone and roll the indicator ball around the surface of the
emotion plate, as shown in Figure 6.
After selection, the emotional feedback is recorded with the song, as
shown in Figure 7. The colored dots to the right of a song's title
indicate the type and intensity of the emotion users have chosen,
corresponding to the colors in CoFeel. The intensity of an emotion is
visualized by the transparency of its circle. For example, the song ‘We
will rock you’ is rated as an ‘exciting’ song, with a level of 3
out of 5.</p>
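      <p>The transparency encoding can be sketched as a mapping from the 1-5 intensity scale to an opacity value. The linear form below is an assumption; the paper does not give the exact function used by CoFeel:</p>

```python
def intensity_to_alpha(intensity, levels=5):
    """Map a 1..levels emotion intensity to an opacity in (0, 1].

    CoFeel renders stronger emotions as more opaque circles; this
    linear mapping is an illustrative assumption.
    """
    if intensity not in range(1, levels + 1):
        raise ValueError("intensity must be between 1 and %d" % levels)
    return intensity / levels

# 'We will rock you', rated 'exciting' at level 3 of 5, would be
# drawn with a 60%-opaque circle under this mapping.
alpha = intensity_to_alpha(3)
```
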
    </sec>
    <sec id="sec-15">
      <title>4.3 Emotional Traces in Timeline of a Song</title>
      <p>Users can also leave emotional traces along the timeline of a
song. Figure 8 shows an example that visualizes the traces as a music
score. User emotions are distinguished by different colors,
corresponding to the colors in CoFeel. The intensities of the emotions
correspond to the lines, and the position of a dot on the lines
represents the relative moment at which the user left the emotional
comment. For example, a user is listening to “Paradise” by
Coldplay. The last two red dots represent the user's emotion towards
the end of the song: aroused, with different levels of intensity.</p>
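      <p>A trace of this kind combines an emotion label, an intensity, and the relative position at which the comment was left. A minimal sketch of such a record, with hypothetical field names (the paper does not give CoFeel's data model):</p>

```python
from dataclasses import dataclass

@dataclass
class EmotionTrace:
    """One emotional comment left at a moment in a song."""
    emotion: str       # one of the 8 CoFeel emotion labels
    intensity: int     # 1..5 scale
    position: float    # fraction of the song's duration, 0..1

def make_trace(emotion, intensity, elapsed_s, duration_s):
    # Store the moment as a fraction of the song's duration, so the dot
    # can be placed at the same relative position on any timeline width.
    return EmotionTrace(emotion, intensity, elapsed_s / duration_s)

# A listener feels 'aroused' (level 4) near the end of a 278-second song.
trace = make_trace("aroused", 4, elapsed_s=260, duration_s=278)
```
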
    </sec>
    <sec id="sec-16">
      <title>5. Experiment</title>
    </sec>
    <sec id="sec-17">
      <title>5.1 Goals</title>
      <p>To the best of our knowledge, our work is the first to propose
providing emotional feedback in group recommender systems.
Therefore, the main purpose of the evaluation is not to prove its
superiority over other means of feedback or to replace them. Rather,
we aim to understand users' opinions of emotional feedback
and of the design of the CoFeel interface, including their degree of
acceptance and their suggestions. More specifically, we aim to
investigate the two research questions raised in Section 1.</p>
    </sec>
    <sec id="sec-18">
      <title>5.2 Design and Procedure</title>
      <p>To answer the above questions, we carried out a
small-scale qualitative user experiment, with an emphasis on learning
from users through active listening, inspection and observation. In
addition to normal users, we also showed GroupFun to domain
experts. Based on these two types of interviewed users, we
divide the experiment into two steps.</p>
      <sec id="sec-18-1">
        <title>Step 1: Evaluate with normal users</title>
        <p>The goal of evaluating with normal users is to observe how they
interact with CoFeel, and particularly whether they encounter
any usability problems. We evaluated the CoFeel interface
through GroupFun, without explicitly telling users what we were
evaluating and observing.</p>
        <p>Four users participated in the experiment. Each user was given
an Android phone with GroupFun installed. Before the
experiment, we assigned each participant to a specific group
with 11 members drawn from his/her Facebook friends. Each group was
recommended a music playlist. Since the accuracy of recommendation is
out of the scope of this paper, we chose the most popular songs, i.e.,
the top 40 songs of the experiment week.</p>
        <p>Before exposing users to the application and system, we asked the
following warm-up questions.</p>
        <p>1) How often do you listen to music?</p>
        <p>2) In which context do you listen to music?</p>
        <p>3) Which kind of device do you use to listen to music?</p>
        <p>4) What do you think about the relation between music and emotion?</p>
        <p>5) Do you share music among friends?</p>
        <p>During the experiment, the participants explored and experienced
GroupFun freely, with a particular focus on the CoFeel interface. We
observed how they interacted with GroupFun and CoFeel, recording the
whole process. In the meantime, they could ask any questions and raise
their concerns. After the experiment, we asked for the users' comments.</p>
      </sec>
      <sec id="sec-18-2">
        <title>Step 2: Interview domain expert</title>
        <p>Unlike the experiment with normal users, the goal of
interviewing a domain expert is to understand the role of emotions
in social and group environments and whether CoFeel contributes
to this purpose. Additionally, the focus shifts from observation to
listening for feedback and suggestions. We invited a doctor
in the field of social psychiatry and interviewed him for feedback
on the emotional design. He first briefly played with GroupFun
and CoFeel and then commented on the design from a theoretical
point of view.</p>
      </sec>
    </sec>
    <sec id="sec-19">
      <title>5.3 Results</title>
      <sec id="sec-19-1">
        <title>Step 1: Evaluate with normal users</title>
        <p>We summarize the demographic information of the participants in Table 1.</p>
        <table-wrap id="tbl1">
          <label>Table 1</label>
          <table>
            <thead>
              <tr><th>ID</th><th>Occupation</th><th>Gender</th><th>Age</th><th>Music exp. (App.)</th><th>Devices for listening</th><th>Music context</th><th>Sharing music with friends</th></tr>
            </thead>
            <tbody>
              <tr><td>User 1</td><td>Student</td><td>Male</td><td>22</td><td>&gt;12 h/day</td><td>Mobile phones and laptop</td><td>Studying, designing, walking</td><td>Spotify, Facebook</td></tr>
              <tr><td>User 2</td><td>Student</td><td>Female</td><td>26</td><td>&gt;8 h/day</td><td>Computer, car, MP3 player</td><td>Working, cooking, driving, before sleep</td><td>Facebook, Twitter, Google+</td></tr>
              <tr><td>User 3</td><td>Student</td><td>Male</td><td>25</td><td>2 h/day</td><td>Computer</td><td>Relaxing</td><td>Google+, Facebook, Email</td></tr>
              <tr><td>User 4</td><td>Consultant</td><td>Female</td><td>32</td><td>2-4 h/day</td><td>Mobile phones</td><td>Travelling, meditation, music lessons</td><td>CDs, DVDs</td></tr>
            </tbody>
          </table>
        </table-wrap>
        <p>1. Users hardly notice that music is more frequent in their lives
than they perceive. When asked how often they listen to music, 3 out
of the 4 interviewed users answered: not very often. However, when we
asked them to recall the last song they had listened to, they
discovered many more scenarios and times in which they listen to music.
This implies that users tend to treat listening to music as a
background task.</p>
        <p>2. The ways they listen to music tend to be mobile and
pervasive. From the user evaluation, we found that 3 out of 4 users
listen to music on the go, on devices such as smartphones, MP3 players,
laptops, in-car entertainment systems, etc.</p>
        <p>3. They choose music based on context. When asked what types of
music they listen to, their answers usually start with “er” or “well,
it depends…”; they then elaborate on how they choose music in different
contexts, e.g., studying, driving, cooking, etc.</p>
        <p>4. They are intrinsically willing to share music with friends.
Surprisingly, all interviewed users share and discuss songs with their
friends. As one user mentioned, “I share a song with friends, either
because I like it, or I think my friend may like it, or it includes our
shared memory, or it suits the current context.”</p>
        <p>We further observed users while they played with the CoFeel
emotion plate in GroupFun. Not surprisingly, we observed some common
phenomena during their interaction with the system.</p>
        <p>1. During the whole process of interacting with GroupFun, they
spent the majority of the time exploring CoFeel, out of curiosity and
fun.</p>
        <p>2. The first time they saw the interface, their mental model of
choosing an emotion was clicking. After a few seconds, they realized
how the ball moves.</p>
        <p>3. They learnt to use CoFeel to keep track of their mood in a
very short time.</p>
        <p>This implies that, although CoFeel is a novel interface,
users enjoy playing with it and can learn how to use it in a short
time.</p>
        <p>After interacting with the system, we further interviewed the
participants for feedback on the design of CoFeel and its usage in
GroupFun. We received many encouraging and promising
comments as well as suggestions.</p>
        <p>Overall, users were excited to talk about the CoFeel emotion
plate. As users commented: “The plate reminded me of a game I played
when I was young, very intuitive and entertaining.” “It is simply
artistic and charming.” “I like the visual effect. It is beautiful.”
From these comments, users are generally impressed by the visual
effect of CoFeel.</p>
        <p>When asked whether CoFeel, i.e., emotional feedback, is useful in
GroupFun, all of them agreed that it is. “It is an interesting way to
comment on a song.” “In this way, my friends understand why I
like this song and I also know their styles and favorite songs
better.” “I used the emotional re-tweeting function in one
microblogging system, which is a fast and convenient way to express
multi-dimensional meanings.” “Sometimes I don’t know how to
express my feelings and comments for a song. They are abstract
and I’m a person of few words. Emotional feedback looks like I’m
choosing my comments from a set of words. It is a take-away
style. Everything is predefined and very quick.”</p>
        <p>In the meantime, they suggested further application scenarios for
using CoFeel in social interaction. “It will be interesting to see a
music messaging system where people communicate emotions via
music.” “What about an interface for mixed emotions?”
“Retweeting a song attached with emotions would be cool!”</p>
        <p>From the qualitative analysis above, we conclude that CoFeel has
fulfilled the goals we set in Section 3.1: usefulness, ease of
use and playfulness.</p>
      </sec>
      <sec id="sec-19-2">
        <title>Step 2: Interview domain expert</title>
        <p>Furthermore, we interviewed a doctor in children and adolescent
psychiatry. From a mental health perspective, he pointed out that
discussing music with friends is also used to enhance people’s mental
health, a process called music therapy. Music and mood are connected
by nature. Meanwhile, encouraging discussion about mood within a social
group also helps enhance users’ mental state, under the condition that
the process is fun. This method is also known as social therapy. He
also commented on GroupFun with CoFeel as follows. “Your
software, I find it very interesting, especially the idea of
self-regulation by the music and the group's involvement, even if it is
a virtual interaction. In short, fun and social group are two
very important elements, not just for people with depression, but
also for everyone who is interested in this type of language. Every
day, we all have moments of frustration and we all seek
self-solutions and are content with a group that gives us support and a
sense of belonging.”</p>
        <p>From the interview results, we find that, theoretically,
providing emotional feedback has a positive effect on encouraging group
interaction and engagement. A further discovery is that social
interaction within a group also enhances users’ mood and mental
state.</p>
      </sec>
    </sec>
    <sec id="sec-20">
      <title>5.4 Implications</title>
      <p>We summarize the findings from the above user study on
providing emotional feedback in group recommender systems.</p>
      <p>1. Providing emotional feedback enhances mutual awareness of
user preferences within a group. Users know the reasons their
friends like a song.</p>
      <p>2. A well-designed interface for emotional feedback offers social
affordance and invites user engagement with the system. When
users know the items their friends like and the reasons for liking
them, they are more likely to experience the recommended items, i.e.,
the music. This encourages users to be more engaged in the system.</p>
      <p>3. Social interaction in turn strengthens users’ sense of social
belonging and enhances their emotional state.</p>
    </sec>
    <sec id="sec-21">
      <title>6. LIMITATIONS AND DISCUSSIONS</title>
<p>This work has some limitations that we would like to address in
future work. First of all, CoFeel collects explicit emotions
reported by users. Sometimes, however, users are not aware of their
own emotional attitudes; we therefore also aim to consider users’
implicit emotional feedback. Additionally, the study was limited to
individuals with simulated friend groups rather than real user
groups. Furthermore, as an in-depth qualitative user study, it
involved only a few users and domain experts. To further validate
our hypotheses, we need to recruit more groups and users and
conduct larger-scale studies for quantitative analysis. It would
also be interesting to let users use GroupFun with their friends in
real life and observe their behavior and attitudes.</p>
      <p>Despite of the limitations, using emotions for social interaction
implies a much broader usage context. CoFeel not only applies in
music recommender systems but also various other domains.
Based on feedback received from interviewed users, we propose
the following example domains where emotional feedback can be
useful: movies, tourists, product, hotels, food and etc. One thing
in common in the above domain is the capability for the items to
elicit emotions. This has been cross validated by social and
behavioral scientists.</p>
    </sec>
    <sec id="sec-22">
      <title>7. CONCLUSIONS</title>
<p>We hypothesize that emotions can be used to enhance social
interaction in group recommenders. We have implemented CoFeel with
the goal of designing an interface that makes it easy and enjoyable
for users to express their emotional attitudes. We further applied
CoFeel in GroupFun, a group music recommender system on mobile
phones. CoFeel can be used in two modes in GroupFun: eliciting an
emotional attitude towards a whole song, or recording emotional
traces along the timeline of a song. We then conducted an in-depth
qualitative experiment with users, observing their interaction with
GroupFun and CoFeel, followed by interviews. Besides regular users,
we also showed our prototype to domain experts and received
positive feedback from them, both theoretical and practical.
Results show that providing emotional feedback not only enhances
mutual awareness of user preferences, but also encourages social
interaction. In essence, providing such social affordance through
emotions in a group environment in turn promotes users’ enthusiasm
for interacting with the system. Based on discussions with users,
we are further convinced that emotional feedback interfaces such as
CoFeel apply not only to the music domain, but also to many others,
such as travel, movie, and product recommendations.</p>
    </sec>
    <sec id="sec-23">
      <title>8. ACKNOWLEDGEMENT</title>
<p>We thank the Swiss National Science Foundation for supporting
this work. We also thank all participants for their interest in our
project and for their valuable time and suggestions.</p>
    </sec>
  </body>
  <back>
  </back>
</article>