<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>“It is better than working with a person” - Affective cues and responses to robots at work</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Anna Lampi</string-name>
          <email>anna.k.lampi@student.jyu.fi</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Kaisa Venermo</string-name>
          <email>kaisa.e.venermo@student.jyu.fi</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Markus Salo</string-name>
          <email>markus.t.salo@jyu.fi</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Henri Pirkkalainen</string-name>
          <email>henri.pirkkalainen@tuni.fi</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Faculty of Information Technology, University of Jyväskylä</institution>
          ,
          <addr-line>P.O. Box 35, FI-40014 Jyväskylä</addr-line>
          ,
          <country country="FI">FINLAND</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Unit of Information and Knowledge Management, Faculty of Management and Business, Tampere University</institution>
          ,
          <addr-line>P.O. Box 527, 33101 Tampere</addr-line>
          ,
          <country country="FI">FINLAND</country>
        </aff>
      </contrib-group>
      <fpage>208</fpage>
      <lpage>221</lpage>
      <abstract>
        <p>Robotic technologies are gaining popularity in various industries, and, by implication, academic research anticipates changes in occupations and work tasks, as well as in the labor market in general. In the future, human employees and robots will increasingly work side by side as coworkers. Despite growing interest, little is known about human employees' experiences of working with robots, and particularly, their emotions toward robots. However, research has shown that emotions - that is, affective responses - play an important role in the use and adoption of technologies. To address this research gap, we conducted a qualitative questionnaire study examining the emotions experienced by employees working with robots and the antecedents of those emotions. We found seven affective cues (i.e., triggers of emotions) related to robots that employees responded to with various affective responses: robot as a technological advancement, technical errors and restrictions, robot's autonomy, robot user's self-efficacy, emotional connection with a robot, decrease in human-human interaction, and fear of losing jobs. The findings of our study provide insights into employees' experiences and help organizations smooth the collaboration between human employees and robots.</p>
      </abstract>
      <kwd-group>
        <kwd>Robots</kwd>
        <kwd>affective responses</kwd>
        <kwd>affective cues</kwd>
        <kwd>emotions</kwd>
        <kwd>work</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Millions of robots are already in use in various industries throughout the world, and they are
becoming increasingly common [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Advancing vision and motion technologies, along with software
programming are leading to more autonomous and adaptable robots that can work side by side with
human employees [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. Naturally, this development has implications for work. Although some
researchers predict that robots may reduce low-skilled work and so-called routine work [
        <xref ref-type="bibr" rid="ref3 ref4">3, 4</xref>
        ] or
decrease labor and wages in some industrial robot-intensive fields [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], the most significant changes are
likely to occur in the contents of work [
        <xref ref-type="bibr" rid="ref4 ref6 ref7">4, 6, 7</xref>
        ]. Instead of mass unemployment, robots and other related
technologies are more likely to change occupations and individual work tasks in multiple fields [
        <xref ref-type="bibr" rid="ref6 ref8 ref9">6, 8, 9</xref>
        ]. These changes will affect employer-employee relations and set new requirements for workers [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ].
Most notably, robots in organizations are becoming active team members instead of just helping humans
[
        <xref ref-type="bibr" rid="ref3">3</xref>
        ].
      </p>
      <p>
        The definition of the term robot varies in different fields. Following [10, p. 377], we used the term
to refer to “technology with both virtual- and physical-embodied actions.” Typically, robots are
movable machines with sensors, tools, and some degree of autonomy, and are capable of executing
programmed tasks. The most common are industrial robots used in the automotive and electronics
industries, but service robots used in the service industry, such as delivery robots, are becoming more
common [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. As such, robotic technologies are part of the technological transformation toward automated, partly
autonomous, connected, and intelligent production processes that are often referred to as the “Fourth
Industrial Revolution” or “Industry 4.0” [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ].
      </p>
      <p>
        In this article, the experiences of employees working with robots are explored through the lens of
emotions. Emotions can affect employees’ job performance and job satisfaction [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] and change
readiness [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. Furthermore, Stein et al. [14, p. 2; 15] argued that studying emotions may produce a
much-needed “more nuanced understanding” of IT use. Previous studies in information systems (IS) have
explored the role of emotions in, for example, new IT implementation processes [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ], in initial use of
IT and how emotions affect usage intentions [
        <xref ref-type="bibr" rid="ref17 ref18">17, 18</xref>
        ], and in post-adoption use behavior [
        <xref ref-type="bibr" rid="ref14 ref15 ref19">14, 15, 19</xref>
        ].
Apart from these studies, our research draws on a socio-cultural understanding of emotions [see, e.g.,
20, 21, 22]: despite their physio-biological elements, the representation and meaning-making of
emotions are socially and culturally influenced. Therefore, rather than focusing on an individual’s
specific feelings or emotional states in a psychological sense, we are interested in the meanings
individuals give to emotions [see, e.g., 23]. In the research literature, the terms emotion and affective
response are often used as synonyms [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ], and we followed this approach in this study. In addition, we
adopted the term affective cue to refer to the characteristics, aspects, or elements of an object or event
evoking affective responses [
        <xref ref-type="bibr" rid="ref24">24</xref>
        ]. That is, affective cues are antecedents, triggers, or causes of affective
responses.
      </p>
      <p>
        Despite the growing interest in robots in both literature and practice, research on employees’
affective responses to robots is limited. As robots become integrated into multiple levels of work and
society, studies on emotions triggered by robots are needed to develop more seamless human-robot
interactions [
        <xref ref-type="bibr" rid="ref25">25</xref>
        ], and to understand the needs of employees working with robots. Furthermore, studying
the triggering of positive emotions may help turn resisters of robots into supporters [
        <xref ref-type="bibr" rid="ref26">26</xref>
        ]. To address
this gap, our study aims to answer the following research questions:
1. What kinds of affective cues related to robots do employees identify as antecedents of positive
and negative affective responses at work?
2. How do employees respond to affective cues related to robots?
      </p>
      <p>To address the research questions, we conducted a qualitative questionnaire study of 208 participants
working with robots. We examined elements of robots at work that trigger an array of positive and
negative affective responses among employees. Our study contributes to the existing literature on
affective responses to IT [e.g., 14, 15] by extending the research context to robots. It identifies specific
elements of robots at work that evoke various affective responses among employees. Furthermore, the
findings of our study have practical value for organizations that use or implement robots.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Background</title>
    </sec>
    <sec id="sec-3">
      <title>2.1. Robots at work: Toward joint optimization between humans and robots</title>
      <p>
        Different types of robots, traditionally classified by their intended use as industrial robots,
service robots, and social robots, are starting to converge as they become more autonomous and
interactive. Industrial robots, for example, have long been physically separated from human
employees, often for safety reasons. This is changing as robotic technology advances and so-called
collaborative robots become more common [
        <xref ref-type="bibr" rid="ref27 ref28">27, 28</xref>
        ]. A collaborative robot is one designed to
undertake certain tasks at least partly autonomously, but in collaboration with or in the same work
environment as humans [
        <xref ref-type="bibr" rid="ref26 ref27">26, 27</xref>
        ]. Besides industrial settings, collaborative robots or collaborative
service robots are becoming more common in the service industry. Social robots, which are designed
for interaction with humans or other robots, are evolving from toy-like curiosities to potential human
replacements in the service industry. Therefore, categorizing robots is becoming more challenging and
non-exclusive. As even industrial robots are increasingly freed from assembly lines or restricted areas
in factory halls to work autonomously among human employees, organizations are facing new demands
concerning, for example, workspaces and safety standards [29].
      </p>
      <p>
        Apart from physical conditions, the implementation of robots changes psychosocial work conditions
and affects people as well as organizations. Workforces need to adapt to new technology. Changes vary
in different fields of industry, and in some fields, the transformation of work may be more striking. In
manufacturing, the increasing use of robots may represent a natural continuum of automation since the
Industrial Revolution [30], whereas in the service industry, work conditions may change more
markedly. For example, in welfare services, robotic technologies carry great promise for solving
various challenges in the field. However, research to date suggests that robots do not necessarily
reduce healthcare workers’ overall workload but rather transform individual work tasks [
        <xref ref-type="bibr" rid="ref7 ref31">7, 31</xref>
        ].
      </p>
      <p>
        Acceptance of robots and people’s willingness to work alongside them are essential for robot
implementation. Similarly, the change readiness of organizations is a defining factor, especially in the
fields where robots are yet to be used [32]. Acceptance of robots depends on whether they are perceived
as a threat or as an opportunity for employees [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. On an individual level, an employee’s characteristics
and personal experiences in particular are associated with higher acceptance rates of robots [32]. A
typical outcome of automation is the loss of employee autonomy, which might reduce the desirability
of a given occupation. Similarly, the implementation of robots in the workplace and increasing
automation may result in job insecurity, which may cause anxiety and resistance toward robots [33, 34].
On a positive note, robots may also increase both employee satisfaction and productivity [34] by, for
example, reducing their physical workload and taking over unpleasant routine tasks.
      </p>
      <p>An ideal workplace utilizing robotic technologies would comprise joint optimization of social and
technical systems, that is, of human employees and robots. In this context, joint optimization refers to
the idea of combining differing competences of people and robots: human employees’ flexibility,
intelligence, and socio-emotional capabilities with robots’ efficiency and physical strength [35]. This is
achieved through successful human-robot interaction including communication, physical contact, and
possibly social interactions composed of social, emotional, moral, or cognitive elements [36]. The
relationships between social and technical elements do not always need to be equal, but rather, involve
varying dynamics [37]. Still, understanding human employees and robots in terms of socio-technical
relationships or joint optimization directs attention toward robots complementing and
assisting human employees rather than replacing them.</p>
    </sec>
    <sec id="sec-4">
      <title>2.2. Affective responses, emotions, and IT at work</title>
      <p>
        To date, various studies in the IS field have examined the role of affective responses or emotions in
IT use [e.g., 15, 16]. This research draws on psychology, but the understanding of the term emotion
differs across bodies of academic literature. For example, so-called basic emotions theory
approaches emotions as psychobiological, evolutionary, and universal, whereas appraisal theory
considers emotions in a broader sense as processes [for a comparison, see 38]. The appraisal theory of
emotions recognizes the physiological aspect of emotions but emphasizes the role of an evaluation of
the subjective significance of a situation experienced. Consequently, emotions are seen as responses
resulting from an individual’s situational assessment processes, in which the values, goals, and
knowledge of an individual play an important part [39]. In the same vein, social and cultural
understanding of emotions emphasizes emotions not only as mental states but also as meanings that
subjects attach to certain emotions and their representations [e.g., 23, 20]. Furthermore, some
researchers [see, e.g., 20, 40, 21] have adopted the term affect (or an affective response) when referring
to a broader concept of a sensation; affects can be understood as involving emotions, moods, and
feelings but also partly subconscious bodily sensations, emerging in relationships between entities
affecting each other. The term affective cue has been used to describe elements, aspects, or
characteristics of an event or an object that evoke affective responses [
        <xref ref-type="bibr" rid="ref14 ref15 ref24">14, 15, 24</xref>
        ].
      </p>
      <p>
        Emotions play an important role in the use and adoption of technologies. Emotions experienced in
the adoption phase of new IT affect its later use, either directly or indirectly [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ]. According to Beaudry
and Pinsonneault [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ], positive emotions of excitement and happiness are related to more frequent use
of IT through task adaptation and seeking instrumental support, whereas negative emotions of anxiety
have more complex repercussions depending on whether users perform psychological distancing or
seek social support.
      </p>
      <p>
        Emotions are also important in the use of IT after its initial adoption. They affect intentions to use
IT: positive emotions of satisfaction or happiness support intentions to continue its use [
        <xref ref-type="bibr" rid="ref16 ref17">16, 17</xref>
        ], whereas
negative emotions such as anxiety affect the perceived ease of use and therefore decrease intentions to
use [
        <xref ref-type="bibr" rid="ref16 ref18">16, 18</xref>
        ]. Emotions have a role in how use patterns emerge: users form evaluations of IT based on
various material, social, and personal cues they respond to emotionally, with either uniform or mixed
affective responses. Evaluations then lead to specific use patterns and coping strategies [
        <xref ref-type="bibr" rid="ref14 ref15">14, 15</xref>
        ].
Emotions experienced when using IT may affect a user’s continuing use behavior, even unconsciously
[
        <xref ref-type="bibr" rid="ref19">19</xref>
        ]. Additionally, it has been found that emotions evoked by IT project control affect the controlees’
behavioral responses [41].
      </p>
      <p>
        In computing, affective responses to robots and other artificial agents have been studied for decades
[42] to develop human-mimicking software systems and artificial intelligence. However, few studies
have investigated how human employees respond to robots in their actual work settings and how robots
shape their work experiences. What we do know is that robots differ from other technologies in their
physical embodiment and varying degrees of anthropomorphic, that is, human-like, features and
social interaction capabilities. As a result, affective responses to robots may resemble those to humans
[
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. For example, van der Woerdt and Haselager [43] found that if a robot’s failure in a task seemed
to be caused by its own lack of effort, people tended to ascribe agency and responsibility to the robot
and to respond with disappointment. Additionally, a robot’s anthropomorphic features
increase ascriptions of agency and emotional capabilities, which generally evokes more positive
attitudes toward them [44]. In the work context, You and Robert [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] noted that employees performed
better and operated more efficiently when they were emotionally attached to robots they worked with.
      </p>
      <p>A literature review by Persson et al. [31] suggested that robots used in care settings may reduce
caregivers’ emotional demands, such as stress, by monitoring patients and reducing the need for
constant human presence. However, robots also evoked negative responses and increased emotional
demands due to technical difficulties and low stability. Consequently, trust between an employee and a
robot is important for robot acceptance and efficient teamwork [45].</p>
    </sec>
    <sec id="sec-5">
      <title>3. Methods</title>
    </sec>
    <sec id="sec-6">
      <title>3.1. Data collection</title>
      <p>The use of robots at work is not yet widespread, and prior research on robots in the
work context has been primarily based on perceptions and expectations of robots rather than on
empirical data of actual use experiences [see, e.g., 26]. Therefore, we decided to target a wide online
audience in order to reach actual robot users. The population of our research consisted of a database of
participants in Amazon Mechanical Turk (MTurk), an online platform where registered users are able
to participate in various tasks, such as surveys, for reasonable rewards. Within MTurk, we targeted
participants who were located in the United States because the MTurk U.S. population has been found
to be comparable with other online and offline research settings [46]. This approach enabled us to reach
a nationwide audience of potential robot users efficiently.</p>
      <p>In MTurk, we administered a questionnaire of open-ended questions complemented by closed-ended
questions on background information, such as age, gender, nationality, education, and
employment. The questionnaire targeted employees who had worked with physical robots, either using
or working next to them (beyond merely trying them out). With the term physical robot, we refer to any robot
technology with a physical embodiment (excluding, e.g., software robots and chatbots). We conducted
a pre-test and a pilot study in December 2021, which helped us to evaluate the questions asked and
make minor wording changes. The pilot study was included in the final dataset. The rest of the final
dataset was collected between January and March 2022. Besides the location, several criteria typically
used in MTurk [46] were applied for assuring data quality: the number of respondents’ approved HITs
was set to 1000–5000, the HIT approval rate was set to 95–99%, and we used an attention check
question. We excluded answers that, for example, clearly did not answer the questions asked, had text
copied from the internet, or seemed like other kinds of spam. The attention check question helped in
identifying these answers, although it was never the only criterion used for exclusion. The respondents
were rewarded with 2.90 U.S. dollars; the expected time for completing the questionnaire was 15–20
minutes. The final dataset consisted of 208 respondents who had worked with physical robots.</p>
      <p>The respondents’ average age was 40 years. Of the respondents, 60% were male and 38% were
female, while 2% chose not to answer. Respondents represented various industries, the most commonly
mentioned of which were manufacturing, ICT, finance and banking, healthcare and medical sector, and
retail. Accordingly, the respondents had worked with various kinds of robots, including robotic arms in
industrial settings, delivery robots, cleaning robots, and service robots greeting customers in hotels or
malls.</p>
      <p>As pointed out by Hökkä et al. [47], research on emotions in organizational settings tends to
emphasize negative emotions and even ignore positive ones. Therefore, we decided to group the
questions into two distinct categories – positive and negative emotions – as this allowed us to also
examine positive emotional experiences in more detail. The open-ended questions were:
1. What kinds of positive emotions have you experienced when working with a physical robot?
2. What especially has caused positive emotions when working with a physical robot?
3. What kinds of negative emotions have you experienced when working with a physical
robot?
4. What especially has caused negative emotions when working with a physical robot?
Participants’ answers to questions ranged from 11 to 320 words. The questionnaire also contained
questions concerning other technologies, but they were not considered in this article.</p>
    </sec>
    <sec id="sec-7">
      <title>3.2. Data analysis</title>
      <p>The analysis was conducted using qualitative content analysis, following the guidelines by Lune and
Berg [48]. After inductive coding and categorization, themes and relationships in the data were
identified and examined, and they were compared to previous research. In other words, the analysis
started with an exploratory bottom-up approach [49] but shifted toward a more top-down
process in the later stages.</p>
      <p>
        The first author began the coding process by creating data-driven subcategories. To identify affective
cues, that is, antecedents of affective responses, the coding process was conducted based on the
identification of specific aspects of working with robots. In this stage, there were 23 subcategories of
affective cues (e.g., technical errors and sense of achievement). To identify affective responses, a
similar coding process was conducted based on the identification of specific emotions or affects. In this
stage, there were 40 subcategories of positive affective responses (e.g., happy and relaxed) and 31
subcategories of negative affective responses (e.g., frustration and worry). Several of the affective
responses had only one mention. Next, subcategories were discussed among the authors, then
subcategories of affective cues were combined by the first author to create seven broader main
categories. In the final stage, the original data-driven categories of affective cues were classified as
material, social, and personal affective cues, following Stein et al. [
        <xref ref-type="bibr" rid="ref14 ref15">14, 15</xref>
        ] in adopting the terms in
reference to specific elements or characteristics of a situation evoking emotions. The term affective cue
allows consideration of the various triggers of emotions as fluid and interrelated; certain conditions or
technical features, for example, do not always trigger certain emotions; rather, these conditions may
involve varying affective cues and cognitive sensemaking.
      </p>
      <p>Nineteen of the respondents described positive emotions and their antecedents but indicated not
having any negative emotions toward the robots they worked with.</p>
    </sec>
    <sec id="sec-8">
      <title>4. Findings: Affective cues and responses to robots at work</title>
      <p>
        The findings of the study are presented in the following pages. We identified seven affective cues
in total and classified them into categories of material cues (robot as a technological advancement,
technical errors and restrictions, robot’s autonomy), personal cues (robot user’s self-efficacy), and
social cues (emotional connection with a robot, decrease in human–human interaction, fear of losing
jobs). The identified cues partly overlap, since some of them can be explored from multiple perspectives.
The cues evoked various affective responses, the most common of which are reported here. However, it
is worth emphasizing that affective responses are the results of individual appraisals in a given situation,
and none of the affective cues can be generalized as sole triggers of specific affective responses [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ].
      </p>
    </sec>
    <sec id="sec-9">
      <title>4.1. Material cues associated with robotic features</title>
      <p>Our analysis revealed that employees responded affectively to various cues related to the materiality
of a robot. We identified three material cues associated with robotic features: robot as a technological
advancement (180 mentions), technical errors and restrictions (161 mentions), and robot’s autonomy
(88 mentions). Technical features, functionality and the design of the robot evoked both positive and
negative affective responses. Moreover, it turned out that some material features could be a root cause
of both positive and negative affective responses, although it was not clear whether the respondents
themselves identified the connection.</p>
      <p>The most commonly mentioned cue evoking affective responses to robots was perceiving them as a
positive technological advancement at work. In particular, getting work done more efficiently due to the
speed and help of a robot was identified as eliciting positive affective responses of happiness and
satisfaction:</p>
      <p>“When you work with these robots, there is a feeling of satisfaction on how
they complete this task faster, without getting tired, it’s just so satisfying.”</p>
      <p>For some employees, the robot’s efficiency meant that time was freed up for other tasks, and others
responded positively to increased capabilities and being more productive. Moreover, it was evident that
some employees were intrigued by the pure novelty of robotic technology. Working with robots can
create a sense of modernity and progress, to which employees may respond with, for example, happiness,
excitement, or enthusiasm:</p>
      <p>“I am attracted to technology, so any opportunity I have had to work with a
new robot just makes me happy. I like seeing how much more advanced each new
model is compared to the model or unit that came before it.”</p>
      <p>Working with robots may have very practical and concrete positive outcomes for employees. For
some, robots have decreased routine manual work and/or physically demanding tasks and thus also
decreased cognitive and/or physical workloads. Delegation of unpleasant work tasks to a robot evoked
affective responses of happiness and relief. One employee described feeling “--- much better and
--- happier at work,” elaborating that:</p>
      <p>“I no longer have to do all of the heavy lifting, it has increased my mood while
working.”</p>
      <p>More specifically, having a robot perform tasks that previously belonged to a human employee is
related to the autonomy of a robot, which is discussed in more detail later. Moreover, the accuracy and
predictability of robots were noted to decrease stress.</p>
      <p>Unsurprisingly, technical errors and restrictions were the most commonly mentioned triggers of
negative affective responses among employees. In particular, the malfunctioning, technical errors, and
glitches of robots evoked negative affective responses of frustration and anxiety. According to the data,
repairing a malfunctioning robot was often a challenging task and could take a long time. In addition,
external support was often needed as employees working with robots may not have the ability to repair
the robots themselves:</p>
      <p>“I get frustrated because I can’t fix it and I have to wait on someone else to do
this.”</p>
      <p>Consequently, employees may perceive robots as hindering and delaying their work. A feeling of
not having control over the malfunction leads to frustration and impatience. For some employees,
stronger responses of anger and hatred were also present, as stated by an employee who described
having felt hatred:</p>
      <p>“When a robot just won’t work correctly even though everything is set
correctly and nothing is physically wrong.”</p>
      <p>Apart from technical errors, robots have indisputable material restrictions. The low
adaptability of a robot means that it cannot adjust to the changing needs of the robot user. Most robots
are programmed to execute a single task repeatedly. Accordingly, employees working with robots may
be compelled to engage in similar repetitiveness. This is in contrast to human work teams, among which
behavioral adaptations are practical everyday skills. Thus, a lack of flexibility may evoke affective
responses of frustration and boredom. Or, as one employee described himself,
“Feeling a little lonely and isolated” because “When I’m hearing certain
prompts and commands over and over on a bad day [it] can get kind of
irritating.”</p>
      <p>A specific material cue evoking affective responses was the robot’s autonomy. Autonomy itself was not
mentioned explicitly in the responses but was evident from the material cues described above. Notably, the
robot’s autonomy as a technical feature elicited mixed affective responses. The same technological
advancements that lessen an employee’s cognitive or physical workload and evoke positive affective
responses may sooner or later threaten the employee’s job altogether. Whether this threat is realistic
matters less than the fact that employees feel it. Therefore, apart from positive affective responses,
robot autonomy evoked negative affective responses of fear and worry:
“Though the robot is helpful, I sometimes fear that the robot will eventually
make my job unneeded. So sometimes I get a little worried about that.”
</p>
    </sec>
    <sec id="sec-10">
      <title>4.2. Personal cues associated with robot user’s characteristics</title>
      <p>Our analysis revealed that, apart from material cues and technical robotic features, employees
responded affectively to personal cues associated with their own characteristics. We identified one
personal cue from the data: robot user’s self-efficacy (100 mentions). As expected, a user’s self-efficacy
evoked both positive and negative affective responses.</p>
      <p>The majority of affective responses to the robot user’s self-efficacy were positive affective responses
of satisfaction and excitement. Working with a robot elicited a sense of accomplishment in employees,
especially after learning something new or being able to solve a problem:</p>
      <p>“Fixing it myself, seeing it at work again because of my knowledge.”</p>
      <p>Controlling and maintaining the robot boosted self-efficacy and positive responses, as with an
employee who enjoyed “being the go-to guy for how it works.” Employees could perceive a work task
as rewarding and be proud of it even though the task was executed by a robot if they could maintain a
sense of being responsible for the robot’s work and having control:
“I was proud working with the robots. I just sounded impressive when I told
people I “fixed robots”. They thought it was more complicated than it was. Plus,
the robots did tasks that would have been difficult for a human to do efficiently.”</p>
      <p>Apart from positive affective responses, the robot user’s self-efficacy was represented by negative
affective responses of frustration, worry, and anxiety. Having to learn new skills was sometimes
considered challenging:</p>
      <p>“The frustration in the beginning of learning something completely new at my
age. I seriously had doubts if I would ever master the skills needed to interact with
robots.”</p>
      <p>Then again, not being fully trained and having to work with a robot were triggers for confusion and
self-doubt. Furthermore, self-efficacy as a personal cue is related to the material cues discussed above.
For example, the inability to fix a robot’s technical malfunctions and errors affected users’
perceptions of their own performance.</p>
    </sec>
    <sec id="sec-11">
      <title>4.3. Social cues associated with robots changing workplace dynamics</title>
      <p>Our analysis revealed that apart from material and personal cues, employees responded affectively
to social cues associated with robots changing the workplace dynamics. We identified three social cues
from the data: emotional connection with a robot (51 mentions), decrease in human–human interaction
(24 mentions), and fear of losing jobs (30 mentions). Employees responded to social cues with both
positive and negative affective responses. The cues of emotional connection with a robot and a decrease
in human–human interaction evoked both positive and negative affective responses depending on the
employees’ appraisal and preferences.</p>
      <p>The social cue most commonly evoking affective responses among employees was
emotional connection with a robot. Depending on the employee, either the perceived lack of emotional
connection with a robot caused negative affective responses, or the perceived connection with a robot
caused positive ones. For some, the emotional connection evoked mixed affective responses. Positive
and negative responses were more or less equally distributed in the data. Regardless of the nature of the
affective responses, employees seemed to consider the emotional connections and social dynamics of
the workplace important.</p>
      <p>The perceived lack of emotional connection with the robot evoked negative affective responses of
frustration and loneliness. Some employees missed human coworkers and everyday social connection,
represented by, for example, small talk or occasional smiles. As one employee described their
relationship with a robot:
“It is emotionless and performs only set tasks, which makes me feel like having a one-sided relationship.”</p>
      <p>The lack of emotional connection, and the insufficient interaction it entails, was related to some of the
material cues discussed above. Inadaptable, inflexible robots lack the spontaneity that is natural to humans
and do not adapt to changing situations as human employees would. This itself underlines the
mechanical, unemotional nature of robots, but in some cases may even cause safety hazards. In such
cases, employees feel the robot “doesn’t care”:
“When something goes wrong. The robot does not care and will do the same action over and over again.”</p>
      <p>For some employees, the perceived emotional connection with a robot evoked positive affective
responses of happiness and relaxation. A few employees reported enjoying the human-like
characteristics of a robot, such as its daily greetings or dressing it up. Often, the robot was seen as a
coworker or part of a team:
“--- I think the robot is also a good colleague in that they are willing to help all the time.”</p>
      <p>For some, the robot appeared to be even like a friend, or someone with whom they could share an
emotional bond. One employee described an emotional attachment to a robot they were working with:
“--- It doesn’t matter that it’s not real, just that it appears to be real.”</p>
      <p>However, some employees reported having both positive affective responses of emotional
connections with robots and negative affective responses of not having “enough” of it.</p>
      <p>Robots may also change workplace dynamics in other ways. Some employees identified their fear
of losing jobs as a primary trigger for evoking negative affective responses at work. The fear of losing
one’s job could be classified as a personal cue, but the data reveal that the fear was often directed not
solely at respondents’ own jobs but also at those of their coworkers. Therefore, fear of losing jobs may be
perceived as a broader issue affecting workplace dynamics. Naturally, the fear of losing jobs evoked
purely negative affective responses of fear, worry, and anxiety. A few of the employees had
already experienced the replacement of a coworker with a robot:
“The fact that up until two years ago the robot’s job had been done by a human who I knew and liked.”</p>
      <p>The fear of losing jobs is related to material cues of technological advancement and the robot’s
autonomy – the same employees enjoyed working with modern technology, but simultaneously feared
it would advance “too far” and replace them.</p>
      <p>The last social cue we identified from the data was decrease in human–human interaction. Although
it is close to the cue of emotional connection with a robot, within this cue the trigger of affective responses
was not the robot itself but the lack of social dynamics it symbolized – less human–human interaction
compared to a workplace without robots. From this perspective, the decrease in human–human
interaction evoked positive affective responses of relief and relaxation. Some employees appeared to
either have had negative experiences with human coworkers in the past, or they were anticipating those
based on their presumptions:</p>
      <p>“I don’t have to worry about a coworkers stupid bullshit.”</p>
      <p>Therefore, they perceived working with a robot as preferable to working with humans. With a robot,
social norms are not important:
“It is better than working with a person. A robot doesn’t have feelings and so
you can direct it to do what you want without worrying about hurting it’s feelings.”</p>
      <p>Some employees considered the workplace dynamics with other humans stressful and competitive,
whereas a robot was seen as nonjudgmental or easier to work with:
“I think the lack of drama and competition will create a more positive work
environment. The robot also won't get mad or upset if I make a mistake.”</p>
      <p>In some cases, robots were considered more reliable and consistent than humans, which again
decreased general stress.</p>
    </sec>
    <sec id="sec-12">
      <title>5. Discussion</title>
      <p>
        As robots become increasingly common in various industries, a large part of prior research has
either emphasized employees’ negative attitudes toward robots or speculated about robots’ potential to
cause job loss. Our study extends these approaches by demonstrating how employees working with
robots experience an array of both negative and positive affective responses to the technology. By so
doing, it provides a more nuanced understanding of IT (and specifically robot) use, as called for by
Stein et al. [
        <xref ref-type="bibr" rid="ref14 ref15">14, 15</xref>
        ]. How employees respond affectively to working with robots is related not only to the
technology itself but also to multiple social and personal dimensions. Furthermore, employees’ affective
responses are not always uniform but sometimes mixed. Based on our analysis, employees experiencing
negative affective responses to robots, such as fear or frustration, can still identify various triggers of
positive affective responses. This is in line with a study by Hoorn et al. [42, p. 2], who noted that “a
robot may be appreciated for working effectively but simultaneously that effectiveness may induce fear
of job loss.”
      </p>
    </sec>
    <sec id="sec-13">
      <title>5.1. Implications for research and practice</title>
      <p>
        Our study contributes to the existing knowledge about affective responses in IT use by extending
the study context to robots at work. Our analysis followed research by Stein et al. [
        <xref ref-type="bibr" rid="ref14 ref15">14, 15</xref>
        ] in classifying
identified data-driven affective cues into three categories: material, personal, and social cues. We
identified seven affective cues in total: the material cues of robot as a technological advancement, technical
errors and restrictions, and robot’s autonomy; a personal cue of robot user’s self-efficacy; and social
cues of emotional connection with a robot, decrease in human–human interaction, and fear of losing
jobs. In line with Stein et al. [
        <xref ref-type="bibr" rid="ref14 ref15">14, 15</xref>
        ], our findings describe the importance of material cues such as
technical features and restrictions, and the possibilities of technology in eliciting affective responses.
However, robots and physical robots in particular differ from other IT systems and may have some
characteristic features that prove to be salient in evoking affective responses. Our analysis reveals that a
large part of affective responses to robots relate to robots being associated with humans, such as the
emotional connection and interaction with a robot, or lack thereof. Robots are often perceived as
coworkers and, accordingly, seen as friends or rivals.
      </p>
      <p>
        Our study has several practical implications. Our findings can be utilized by
organizations planning to implement robots, but they also have value for organizations
already using them. Paying attention to employees’ affective responses and their antecedents can help
smooth the implementation process and increase overall job satisfaction. Experiencing positive
emotions at work improves performance at the individual as well as organizational levels, while
negative emotions may degrade organizational culture and, for example, lead to employee stress and
burnout [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. An understanding of emotions or affective responses may be used to motivate employees
and improve their change readiness.
      </p>
      <p>
        Emotions are building blocks of attitudes [50], and an understanding of employees’ affective
responses may help in supporting the formation of positive attitudes toward robots. Apart from
technology itself, attitudes toward robots are affected by attitudes toward organizational change, which
in turn is affected by emotions such as insecurity and loss of control [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. The findings of our study
suggest that employees value a sense of control, and respond more positively to, for example, delegation
of work tasks to robots if their sense of control is maintained. Accordingly, a sense of control could
help prevent the fear of job loss. Employers could put effort into dispelling employees’ worries by, for
example, sharing the organization’s plans for robotic technologies and employee retraining. On the other
hand, negative emotions in general can also act as important indicators of injustice or wrongful conduct
in organizations [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. Therefore, emotions that are considered negative as such may have positive
outcomes for individuals [51].
      </p>
    </sec>
    <sec id="sec-14">
      <title>5.2. Limitations and future research</title>
      <p>
        Our research has certain limitations that need to be acknowledged. First, affective responses and
their antecedents may differ in different contexts. They may be evoked by organizational context [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ],
which is a factor that our study did not focus on. It is also possible that affective responses and affective
cues differ depending on robot type and features such as appearance. In addition, prior research [32]
showed that acceptance and motives for acceptance of robots varied across different countries, perhaps
due to disparities in the labor market and job security, and this might also be the case with affective
responses. Second, studying affective responses with a questionnaire presents its own challenges.
Affective responses were studied retrospectively, which may create recall bias. Using a written
questionnaire also measures respondents’ abilities to express themselves in writing, and
so differs from interview studies. The division of emotions into categories of positive and negative may
restrict the representation of more controversial emotions and does not allow evaluations of employees’
general moods or attitudes toward robots. However, it did provide room for both kinds of emotions and
resulted in data on the antecedents and characteristics of positive emotions, as called for by Hökkä et
al. [47]. Finally, the analysis of the study was principally conducted by the first author, and the results
may have been affected by their interpretation.
      </p>
      <p>
        Our findings provide possible topics for future research. In future studies, it could be important to
investigate how affective responses affect robot use and emerging behavioral patterns. Furthermore, it
would be of interest to explore what kinds of coping strategies employees apply when experiencing
negative affective responses related to robot use. Our findings suggest that physical robots tend to evoke
mixed or ambivalent affective responses. Previous research [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ] has shown that in such situations, IT
users mix different coping strategies as well. Whether similar user strategies are applied when working
with robots remains to be investigated.
      </p>
    </sec>
    <sec id="sec-15">
      <title>6. Conclusion</title>
      <p>As robots become increasingly common in various industries, we need a more comprehensive
understanding of employees’ experiences working with them. Our study suggests that employees have
various affective responses to working with robots triggered by multiple affective cues. Paying attention
to cues triggering positive affective responses can help reinforce employees’ job satisfaction and
performance, or even turn robot resisters into supporters. On the other hand, cues triggering negative
affective responses may indicate a need for adjustments in an organization.</p>
    </sec>
    <sec id="sec-16">
      <title>7. Acknowledgements</title>
      <p>This research has been funded by the Emil Aaltonen Foundation and the Academy of Finland
(341359).</p>
    </sec>
    <sec id="sec-17">
      <title>8. References</title>
      <p>[28] A. Hentout, M. Aouache, A. Maoudj, I. Akli, Human-robot interaction in industrial collaborative
robotics: A literature review of the decade 2008-2017, Advanced Robotics 33 (2019) 764–799.
doi:10.1080/01691864.2019.1636714</p>
      <p>[29] A. Rega, F. Vitolo, C. Di Marino, S. Patalano, A knowledge-based approach to the layout
optimization of human–robot collaborative workplace, International Journal on Interactive Design
and Manufacturing 15 (2021) 133–135. doi:10.1007/s12008-020-00742-0</p>
      <p>[30] E. Fernández-Macías, D. Klenert, J. Antón, Not so disruptive yet? Characteristics, distribution and
determinants of robots in Europe, Structural Change and Economic Dynamics 58 (2021) 76–89.
doi:10.1016/j.strueco.2021.03.010</p>
      <p>[31] M. Persson, D. Redmalm, C. Iversen, Caregivers’ use of robots and their effect on work
environment – a scoping review, Journal of Technology in Human Services (2021) 1–27.
doi:10.1080/15228835.2021.2000554</p>
      <p>[32] T. Turja, A. Oksanen, Robot acceptance at work: A multilevel analysis based on 27 EU
countries, International Journal of Social Robotics 11 (2019) 679–689.
doi:10.1007/s12369-019-00526-x</p>
      <p>[33] S. Erebak, T. Turgut, Anxiety about the speed of technological development: Effects on job
insecurity, time estimation, and automation level preference, Journal of High Technology
Management Research 32 (2021) 100419. doi:10.1016/j.hitech.2021.100419</p>
      <p>[34] V. N. Lu, J. Wirtz, W. H. Kunz, S. Paluch, T. Gruber, A. Martins, P. G. Patterson, Service robots,
customers and service employees: What can we learn from the academic literature and where are
the gaps?, Journal of Service Theory and Practice 30 (2020) 361–391.
doi:10.1108/JSTP-04-2019-0088</p>
      <p>[35] T. Turja, Accepting robots as assistants: A social, personal, and principled matter, Ph.D. thesis,
Tampere University, Tampere, Finland, 2019.</p>
      <p>[36] S. K. Ötting, L. Masjutin, J. J. Steil, G. W. Maier, Let's Work Together: A Meta-Analysis on Robot
Design Features That Enable Successful Human-Robot Interaction at Work, Human
Factors (2020) 1–24. doi:10.1177/0018720820966433</p>
      <p>[37] S. Sarker, S. Chatterjee, X. Xiao, A. Elbanna, The Sociotechnical Axis of Cohesion for the IS
Discipline: Its Historical Legacy and its Continued Relevance, MIS Quarterly 43 (2019) 695–719.
doi:10.25300/MISQ/2019/13747</p>
      <p>[38] J. P. P. Jokinen, J. Silvennoinen, The appraisal theory of emotion in human-computer interaction,
in: R. Rousi, J. Leikas, P. Saariluoma (eds.), Emotions in Technology Design: From Experience to Ethics,
Springer International Publishing AG, 2020.</p>
      <p>[39] R. S. Lazarus, S. Folkman, Stress, Appraisal, and Coping, Springer Publishing Company, Inc.,
New York, 1984.</p>
      <p>[40] T. Juvonen, M. Kolehmainen, Affective inequalities in intimate relationships, Routledge,
Abingdon, New York, 2018.</p>
      <p>[41] D. Murungi, M. Wiener, M. Marabelli, Control and emotions: Understanding the dynamics of
controllee behaviours in a health care information systems project, Information Systems Journal
29 (2019) 1058–1082. doi:10.1111/isj.12235</p>
      <p>[42] J. F. Hoorn, T. Baier, J. A. N. Van Maanen, J. Wester, Silicon Coppelia and the Formalization of
the Affective Process, IEEE Transactions on Affective Computing, 2021.
doi:10.1109/TAFFC.2020.3048587</p>
      <p>[43] S. van der Woerdt, P. Haselager, When robots appear to have a mind: The human perception
of machine agency and responsibility, New Ideas in Psychology 54 (2019) 93–100.
doi:10.1016/j.newideapsych.2017.11.001</p>
      <p>[44] K. C. Yam, Y. E. Bigman, P. M. Tang, R. Ilies, D. De Cremer, H. Soh, K. Gray, Robots at work:
People prefer—and forgive—service robots with perceived feelings, Journal of Applied
Psychology 106 (2021) 1557–1572. doi:10.1037/apl0000834</p>
      <p>[45] T. Law, M. Chita-Tegmark, M. Scheutz, The Interplay Between Emotional Intelligence, Trust, and
Gender in Human–Robot Interaction: A Vignette-Based Study, International Journal of Social
Robotics 13 (2020) 297–309. doi:10.1007/s12369-020-00624-1</p>
      <p>[46] W. Mason, S. Suri, Conducting behavioral research on Amazon’s Mechanical Turk, Behavior
Research Methods 44 (2012) 1–23. doi:10.3758/s13428-011-0124-6</p>
      <p>[47] P. Hökkä, K. Vähäsantanen, S. Paloniemi, Emotions in Learning at Work: A Literature
Review, Vocations and Learning 13 (2019) 1–25. doi:10.1007/s12186-019-09226-z</p>
      <p>[48] H. Lune, B. L. Berg, Qualitative Research Methods for the Social Sciences, Pearson, Edinburgh,
Harlow, Essex, 2017.</p>
      <p>[49] M. D. Myers, Qualitative Research in Business and Management, SAGE, London, Thousand Oaks,
Calif., New Delhi, Singapore, 2020.</p>
      <p>[50] P. Saariluoma, J. Cañas, J. Leikas, Designing for Life: A Human Perspective on Technology
Development, Palgrave Macmillan, London, 2016.</p>
      <p>[51] F. Meijers, Career Learning in A Changing World: The Role of Emotions, International Journal
for the Advancement of Counselling 24 (2002) 149–167. doi:10.1023/A:1022970404517</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1] IFR. International Federation of Robotics, Presentation of World Robotics Press Conference,
          <year>2021</year>
          . URL: https://ifr.org/downloads/press2018/2021_10_28_WR_PK_Presentation_long_version.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2] IFR.
          <source>International Federation of Robotics, Robots and the Workplace of the Future</source>
          ,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>R.</given-names>
            <surname>Kurt</surname>
          </string-name>
          ,
          <article-title>Industry 4.0 in Terms of Industrial Relations and Its Impacts on Labour Life</article-title>
          ,
          <source>Procedia Computer Science</source>
          <volume>158</volume>
          (
          <year>2019</year>
          )
          <fpage>590</fpage>
          -
          <lpage>601</lpage>
          . doi:10.1016/j.procs.2019.09.093
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>G.</given-names>
            <surname>Graetz</surname>
          </string-name>
          , G. Michaels, Robots at Work,
          <source>The Review of Economics and Statistics</source>
          <volume>100</volume>
          (
          <year>2018</year>
          )
          <fpage>753</fpage>
          -
          <lpage>768</lpage>
          . doi:10.1162/rest_a_00754
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>D.</given-names>
            <surname>Acemoglu</surname>
          </string-name>
          , P. Restrepo,
          <article-title>Robots and Jobs: Evidence from US Labor Markets</article-title>
          ,
          <source>The Journal of Political Economy</source>
          <volume>128</volume>
          (
          <year>2020</year>
          )
          <fpage>2188</fpage>
          -
          <lpage>2244</lpage>
          . doi:10.1086/705716
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>P.</given-names>
            <surname>Fleming</surname>
          </string-name>
          , Robots and Organization Studies: Why Robots Might Not Want to Steal Your Job,
          <source>Organization Studies</source>
          <volume>40</volume>
          (
          <year>2019</year>
          )
          <fpage>23</fpage>
          -
          <lpage>38</lpage>
          . doi:10.1177/0170840618765568
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>C.</given-names>
            <surname>Lloyd</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Payne</surname>
          </string-name>
          ,
          <article-title>Fewer jobs, better jobs? An international comparative study of robots and 'routine' work in the public sector</article-title>
          ,
          <source>Industrial Relations Journal</source>
          <volume>52</volume>
          (
          <year>2021</year>
          )
          <fpage>109</fpage>
          -
          <lpage>124</lpage>
          . doi:10.1111/irj.12323
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>E.</given-names>
            <surname>Brynjolfsson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>McAfee</surname>
          </string-name>
          ,
          <article-title>The second machine age: Work, progress, and prosperity in a time of brilliant technologies</article-title>
          , W. W. Norton, New York, London,
          <year>2014</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>M.</given-names>
            <surname>Ford</surname>
          </string-name>
          ,
          <article-title>The rise of the robots: Technology and the threat of mass unemployment</article-title>
          , Oneworld, London,
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>S.</given-names>
            <surname>You</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. P.</given-names>
            <surname>Robert Jr.</surname>
          </string-name>
          ,
          <article-title>Emotional attachment, performance, and viability in teams collaborating with embodied physical action (EPA) robots</article-title>
          ,
          <source>Journal of the Association for Information Systems</source>
          <volume>19</volume>
          (
          <year>2018</year>
          )
          <fpage>377</fpage>
          -
          <lpage>407</lpage>
          . doi:10.17705/1jais.00496
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>M.</given-names>
            <surname>Piccarozzi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Aquilani</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Gatti</surname>
          </string-name>
          ,
          <article-title>Industry 4.0 in Management Studies: A Systematic Literature Review</article-title>
          ,
          <source>Sustainability</source>
          <volume>10</volume>
          (
          <year>2018</year>
          )
          <fpage>1</fpage>
          -
          <lpage>24</lpage>
          . doi:10.3390/su10103821
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>S. G.</given-names>
            <surname>Barsade</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. E.</given-names>
            <surname>Gibson</surname>
          </string-name>
          , Why Does Affect Matter in Organizations?,
          <source>Academy of Management Perspectives</source>
          <volume>21</volume>
          (
          <year>2007</year>
          )
          <fpage>36</fpage>
          -
          <lpage>59</lpage>
          . doi:10.5465/AMP.2007.24286163
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>A.</given-names>
            <surname>Meissner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Trübswetter</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Conti-Kufner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Schmidtler</surname>
          </string-name>
          ,
          <article-title>Friend or Foe? Understanding Assembly Workers' Acceptance of Human-robot Collaboration</article-title>
          ,
          <source>ACM Transactions on Human-Robot Interaction</source>
          <volume>10</volume>
          (
          <year>2020</year>
          )
          <fpage>1</fpage>
          -
          <lpage>30</lpage>
          . doi:10.1145/3399433
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>M-K.</given-names>
            <surname>Stein</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Newell</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. L.</given-names>
            <surname>Wagner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. D.</given-names>
            <surname>Galliers</surname>
          </string-name>
          ,
          <article-title>Continued Use of IT: An Emotional Choice</article-title>
          ,
          <source>Thirty Third International Conference on Information Systems</source>
          , Orlando,
          <year>2012</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>19</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>M-K.</given-names>
            <surname>Stein</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Newell</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. L.</given-names>
            <surname>Wagner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. D.</given-names>
            <surname>Galliers</surname>
          </string-name>
          ,
          <article-title>Coping with information technology: Mixed emotions, vacillation, and nonconforming use patterns</article-title>
          ,
          <source>MIS Quarterly</source>
          <volume>39</volume>
          (
          <year>2015</year>
          )
          <fpage>367</fpage>
          -
          <lpage>392</lpage>
          . doi:
          <volume>10</volume>
          .25300/MISQ/
          <year>2015</year>
          /39.2.
          <fpage>05</fpage>
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>A.</given-names>
            <surname>Beaudry</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Pinsonneault</surname>
          </string-name>
          ,
          <article-title>The Other Side of Acceptance: Studying the Direct and Indirect Effects of Emotions on Information Technology Use</article-title>
          ,
          <source>MIS Quarterly</source>
          <volume>34</volume>
          (
          <year>2010</year>
          )
          <fpage>689</fpage>
          -
          <lpage>710</lpage>
          . doi: 10.2307/25750701
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>A.</given-names>
            <surname>Bhattacherjee</surname>
          </string-name>
          ,
          <article-title>Understanding Information Systems Continuance: An Expectation-Confirmation Model</article-title>
          ,
          <source>MIS Quarterly</source>
          <volume>25</volume>
          (
          <year>2001</year>
          )
          <fpage>351</fpage>
          -
          <lpage>370</lpage>
          . doi: 10.2307/3250921
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <given-names>V.</given-names>
            <surname>Venkatesh</surname>
          </string-name>
          ,
          <article-title>Determinants of Perceived Ease of Use: Integrating Control, Intrinsic Motivation, and Emotion into the Technology Acceptance Model</article-title>
          ,
          <source>Information Systems Research</source>
          <volume>11</volume>
          (
          <year>2000</year>
          )
          <fpage>342</fpage>
          -
          <lpage>365</lpage>
          . doi: 10.1287/isre.11.4.342.11872
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <given-names>A. O.</given-names>
            <surname>de Guinea</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. L.</given-names>
            <surname>Markus</surname>
          </string-name>
          ,
          <article-title>Why Break the Habit of a Lifetime? Rethinking the Roles of Intention, Habit, and Emotion in Continuing Information Technology Use</article-title>
          ,
          <source>MIS Quarterly</source>
          <volume>33</volume>
          (
          <year>2009</year>
          )
          <fpage>433</fpage>
          -
          <lpage>444</lpage>
          . doi: 10.2307/20650303
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <given-names>S.</given-names>
            <surname>Ahmed</surname>
          </string-name>
          ,
          <source>The Cultural Politics of Emotion</source>
          , 2nd ed., Edinburgh University Press, Edinburgh,
          <year>2014</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <given-names>M.</given-names>
            <surname>Wetherell</surname>
          </string-name>
          ,
          <source>Affect and emotion: A new social science understanding</source>
          , SAGE, Los Angeles, London,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <given-names>C.</given-names>
            <surname>Ratner</surname>
          </string-name>
          ,
          <article-title>A Cultural-Psychological Analysis of Emotions</article-title>
          ,
          <source>Culture &amp; Psychology</source>
          <volume>6</volume>
          (
          <year>2000</year>
          )
          <fpage>5</fpage>
          -
          <lpage>39</lpage>
          . doi: 10.1177/1354067X0061001
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [23]
          <string-name>
            <given-names>A.</given-names>
            <surname>Ducey</surname>
          </string-name>
          ,
          <article-title>More Than a Job: Meaning, Affect, and Training Health Care Workers</article-title>
          , in: P. T. Clough (ed.),
          <source>The Affective Turn: Theorizing the Social</source>
          , Duke University Press, Durham [N. C.],
          <year>2007</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [24]
          <string-name>
            <given-names>P.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <article-title>The Affective Response Model: A Theoretical Framework of Affective Concepts and Their Relationships in the ICT Context</article-title>
          ,
          <source>MIS Quarterly</source>
          <volume>37</volume>
          (
          <year>2013</year>
          )
          <fpage>247</fpage>
          -
          <lpage>274</lpage>
          . doi: 10.25300/MISQ/2013/37.1.11
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [25]
          <string-name>
            <given-names>R.</given-names>
            <surname>Kirby</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Forlizzi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Simmons</surname>
          </string-name>
          ,
          <article-title>Affective social robots</article-title>
          ,
          <source>Robotics and Autonomous Systems</source>
          <volume>58</volume>
          (
          <year>2010</year>
          )
          <fpage>322</fpage>
          -
          <lpage>332</lpage>
          . doi: 10.1016/j.robot.2009.09.015
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [26]
          <string-name>
            <given-names>S.</given-names>
            <surname>Paluch</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Tuzovic</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. F.</given-names>
            <surname>Holz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Kies</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Jörling</surname>
          </string-name>
          ,
          <article-title>'My colleague is a robot': Exploring frontline employees' willingness to work with collaborative service robots</article-title>
          ,
          <source>Journal of Service Management</source>
          <volume>33</volume>
          (
          <year>2022</year>
          )
          <fpage>363</fpage>
          -
          <lpage>388</lpage>
          . doi: 10.1108/JOSM-11-2020-0406
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          [27]
          <string-name>
            <given-names>C.S.</given-names>
            <surname>Franklin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. G.</given-names>
            <surname>Dominguez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. D.</given-names>
            <surname>Fryman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. L.</given-names>
            <surname>Lewandowski</surname>
          </string-name>
          ,
          <article-title>Collaborative robotics: New era of human-robot cooperation in the workplace</article-title>
          ,
          <source>Journal of Safety Research</source>
          <volume>74</volume>
          (
          <year>2020</year>
          ),
          <fpage>153</fpage>
          -
          <lpage>160</lpage>
          . doi: 10.1016/j.jsr.2020.06.013
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>