“It is better than working with a person” – Affective cues and responses to robots at work

Anna Lampi 1, Kaisa Venermo 1, Markus Salo 1 and Henri Pirkkalainen 2

1 Faculty of Information Technology, University of Jyväskylä, P.O. Box 35, FI-40014 Jyväskylä, FINLAND
2 Unit of Information and Knowledge Management, Faculty of Management and Business, Tampere University, P.O. Box 527, 33101 Tampere, FINLAND

Abstract
Robotic technologies are gaining popularity in various industries, and, by implication, academic research anticipates changes in occupations and work tasks, as well as in the labor market in general. In the future, human employees and robots will increasingly work side by side as coworkers. Despite growing interest, little is known about human employees’ experiences of working with robots and, particularly, their emotions toward robots. However, research has shown that emotions – that is, affective responses – play an important role in the use and adoption of technologies. To address this research gap, we conducted a qualitative questionnaire study examining the emotions experienced by employees working with robots and the antecedents of those emotions. We found seven affective cues (i.e., triggers of emotions) related to robots that employees responded to with various affective responses: robot as a technological advancement, technical errors and restrictions, robot’s autonomy, robot user’s self-efficacy, emotional connection with a robot, decrease in human–human interaction, and fear of losing jobs. The findings of our study provide insights into employees’ experiences and help organizations smooth the collaboration between human employees and robots.

Keywords
Robots, affective responses, affective cues, emotions, work

The 8th International Conference on Socio-Technical Perspectives in IS Development (STPIS ‘22), August 19–21, 2022, Reykjavík, Iceland
EMAIL: anna.k.lampi@student.jyu.fi; kaisa.e.venermo@student.jyu.fi; markus.t.salo@jyu.fi; henri.pirkkalainen@tuni.fi
ORCID: 0000-0001-9249-9619 (A. Lampi); 0000-0002-5409-7042 (K. Venermo); 0000-0001-5229-0300 (M. Salo); 0000-0002-5389-7363 (H. Pirkkalainen)
© 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0). CEUR Workshop Proceedings (CEUR-WS.org), ISSN 1613-0073.

1. Introduction

Millions of robots are already in use in various industries throughout the world, and they are becoming increasingly common [1]. Advancing vision and motion technologies, along with software programming, are leading to more autonomous and adaptable robots that can work side by side with human employees [2]. Naturally, this development has implications for work. Although some researchers predict that robots may reduce low-skilled work and so-called routine work [3, 4] or decrease labor and wages in some robot-intensive industrial fields [5], the most significant changes are likely to occur in the contents of work [6, 4, 7]. Instead of mass unemployment, robots and other related technologies are more likely to change occupations and individual work tasks in multiple fields [6, 8, 9]. These changes will affect employer–employee relations and set new requirements for workers [3]. Most notably, robots in organizations are becoming active team members instead of just helping humans [3].

The definition of the term robot varies across fields. Following [10, p. 377], we used the term to refer to “technology with both virtual- and physical-embodied actions.” Typically, robots are movable machines with sensors, tools, and some degree of autonomy that are capable of executing programmed tasks.
The most common are industrial robots used in the automotive and electronics industries, but service robots, such as delivery robots, are becoming more common in the service industry [1]. As such, robotic technologies are part of the technological transformation toward automated, partly autonomous, connected, and intelligent production processes that is often referred to as the “Fourth Industrial Revolution” or “Industry 4.0” [11].

In this article, the experiences of employees working with robots are explored through the lens of emotions. Emotions can affect employees’ job performance and job satisfaction [12] as well as change readiness [13]. Furthermore, Stein et al. [14, p. 2; 15] argued that studying emotions may produce a much-needed “more nuanced understanding” of IT use. Previous studies in information systems (IS) have explored the role of emotions in, for example, new IT implementation processes [16], in the initial use of IT and how emotions affect usage intentions [17, 18], and in post-adoption use behavior [14, 15, 19]. Beyond these studies, our research draws on a socio-cultural understanding of emotions [see, e.g., 20, 21, 22]: despite their physio-biological elements, the representation and meaning-making of emotions are socially and culturally influenced. Therefore, rather than focusing on an individual’s specific feelings or emotional states in a psychological sense, we are interested in the meanings individuals give to emotions [see, e.g., 23]. In the research literature, the terms emotion and affective response are often used as synonyms [15], and we followed this approach in this study. In addition, we adopted the term affective cue to refer to the characteristics, aspects, or elements of an object or event evoking affective responses [24]. That is, affective cues are antecedents, triggers, or causes of affective responses.

Despite the growing interest in robots in both literature and practice, research on employees’ affective responses to robots is limited.
As robots become integrated into multiple levels of work and society, studies on emotions triggered by robots are needed to develop more seamless human–robot interaction [25] and to understand the needs of employees working with robots. Furthermore, studying the triggering of positive emotions may help turn resisters of robots into supporters [26]. To address this gap, our study aims to answer the following research questions:

1. What kinds of affective cues related to robots do employees identify as antecedents of positive and negative affective responses at work?
2. How do employees respond to affective cues related to robots?

To address the research questions, we conducted a qualitative questionnaire study of 208 participants working with robots. We examined elements of robots at work that trigger an array of positive and negative affective responses among employees. Our study contributes to the existing literature on affective responses to IT [e.g., 14, 15] by extending the research context to robots. It identifies specific elements of robots at work that evoke various affective responses among employees. Furthermore, the findings of our study have practical value for organizations that use or implement robots.

2. Background

2.1. Robots at work: Toward joint optimization between humans and robots

Different types of robots, traditionally classified by their intended use as industrial robots, service robots, and social robots, are starting to converge as they become more autonomous and interactive. For example, industrial robots have traditionally been physically separated from human employees, often for safety reasons. This is changing as robotic technology advances and so-called collaborative robots become more common [27, 28]. A collaborative robot is one designed to undertake certain tasks at least partly autonomously, but in collaboration with, or in the same work environment as, humans [26, 27].
Besides industrial settings, collaborative robots, or collaborative service robots, are becoming more common in the service industry. Social robots, which are designed for interaction with humans or other robots, are evolving from toy-like curiosities to potential human replacements in the service industry. Therefore, categorizing robots is becoming more challenging and non-exclusive. As even industrial robots are increasingly freed from assembly lines or restricted areas in factory halls to work autonomously among human employees, organizations face new demands concerning, for example, workspaces and safety standards [29].

Apart from physical conditions, the implementation of robots changes psychosocial work conditions and affects people as well as organizations. Workforces need to adapt to new technology. Changes vary across fields of industry, and in some fields the transformation of work may be more striking. In manufacturing, the increasing use of robots may represent a natural continuum of automation since the Industrial Revolution [30], whereas in the service industry, work conditions may change more markedly. For example, in welfare services, robotic technologies carry great promise for solving various challenges in the field. However, research to date suggests that robots do not necessarily reduce healthcare workers’ overall workload but rather transform individual work tasks [31, 7].

Acceptance of robots and people’s willingness to work alongside them are essential for robot implementation. Similarly, the change readiness of organizations is a defining factor, especially in fields where robots are yet to be used [32]. Acceptance of robots depends on whether they are perceived as a threat or as an opportunity for employees [13]. On an individual level, an employee’s characteristics and personal experiences in particular are associated with higher acceptance of robots [32].
A typical outcome of automation is the loss of employee autonomy, which may reduce the desirability of a given occupation. Similarly, the implementation of robots in the workplace and increasing automation may result in job insecurity, which may cause anxiety and resistance toward robots [33, 34]. On a positive note, robots may also increase both employee satisfaction and productivity [34] by, for example, reducing employees’ physical workload and taking over unpleasant routine tasks.

An ideal workplace utilizing robotic technologies would comprise joint optimization of social and technical systems, that is, of human employees and robots. In this context, joint optimization refers to combining the differing competences of people and robots: human employees’ flexibility, intelligence, and socio-emotional capabilities with robots’ efficiency and physical strength [35]. This is achieved through successful human–robot interaction, including communication, physical contact, and possibly social interactions composed of social, emotional, moral, or cognitive elements [36]. The relationships between social and technical elements do not always need to be equal but may involve varying dynamics [37]. Still, understanding human employees and robots in terms of socio-technical relationships or joint optimization directs attention to robots complementing and assisting human employees rather than replacing them.

2.2. Affective responses, emotions, and IT at work

To date, various studies in the IS field have examined the role of affective responses or emotions in IT use [e.g., 16, 15]. This research draws on psychology, but the understanding of the term emotion differs according to the applied academic literature. For example, so-called basic emotions theory approaches emotions as psychobiological, evolutionary, and universal, whereas appraisal theory considers emotions in a broader sense as processes [for a comparison, see 38].
The appraisal theory of emotions recognizes the physiological aspect of emotions but emphasizes the role of an evaluation of the subjective significance of an experienced situation. Consequently, emotions are seen as responses resulting from an individual’s situational assessment processes, in which the individual’s values, goals, and knowledge play an important part [39]. In the same vein, a social and cultural understanding of emotions emphasizes emotions not only as mental states but also as meanings that subjects attach to certain emotions and their representations [e.g., 23, 20]. Furthermore, some researchers [see, e.g., 20, 40, 21] have adopted the term affect (or affective response) when referring to a broader concept of sensation: affects can be understood as involving emotions, moods, and feelings, but also partly subconscious bodily sensations, emerging in the relationships between entities affecting each other. The term affective cue has been used to describe the elements, aspects, or characteristics of an event or an object that evoke affective responses [14, 15, 24].

Emotions play an important role in the use and adoption of technologies. Emotions experienced in the adoption phase of new IT affect its later use, either directly or indirectly [16]. According to Beaudry and Pinsonneault [16], the positive emotions of excitement and happiness are related to more frequent use of IT through task adaptation and seeking instrumental support, whereas the negative emotion of anxiety has more complex repercussions depending on whether users perform psychological distancing or seek social support. Emotions are also important in the use of IT after its initial adoption. They affect intentions to use IT: positive emotions of satisfaction or happiness support intentions to continue use [17, 16], whereas negative emotions such as anxiety affect the perceived ease of use and therefore decrease intentions to use [18, 16].
Emotions also have a role in how use patterns emerge: users form evaluations of IT based on various material, social, and personal cues to which they respond emotionally, with either uniform or mixed affective responses. Evaluations then lead to specific use patterns and coping strategies [14, 15]. Emotions experienced when using IT may affect a user’s continuing use behavior, even unconsciously [19]. Additionally, it has been found that emotions evoked by IT project control affect controlees’ behavioral responses [41].

In computing, affective responses to robots and other artificial agents have been studied for decades [42] to develop human-mimicking software systems and artificial intelligence. However, few studies have investigated how human employees respond to robots in their actual work settings and how robots shape their work experiences. What we do know is that robots differ from other technologies in their physical embodiment and varying degrees of anthropomorphic, that is, human-like, features and social interaction capabilities. As a result, affective responses to robots may resemble those to humans [10]. For example, van der Woerdt and Haselager [43] found that if a robot’s failure in a task seemed to be caused by its own lack of effort, people tended to ascribe agency and responsibility to the robot and experience affective responses of disappointment. Additionally, a robot’s anthropomorphic features increase ascriptions of agency and emotional capabilities, which generally evokes more positive attitudes toward robots [44]. In the work context, You and Robert [10] noted that employees performed better and operated more efficiently when they were emotionally attached to the robots they worked with. A literature review by Persson et al. [31] suggested that robots used in care settings may reduce caregivers’ emotional demands, such as stress, by monitoring patients and reducing the need for constant human presence.
However, robots also evoked negative responses and increased emotional demands due to technical difficulties and low stability. Consequently, trust between an employee and a robot is important for robot acceptance and efficient teamwork [45].

3. Methods

3.1. Data collection

Working with robots is not yet common, and prior research on robots in the work context has been based primarily on perceptions and expectations of robots rather than on empirical data of actual use experiences [see, e.g., 26]. Therefore, we decided to target a wide online audience in order to reach actual robot users. The population of our research consisted of a database of participants in Amazon Mechanical Turk (MTurk), an online platform where registered users can participate in various tasks, such as surveys, for reasonable rewards. Within MTurk, we targeted participants located in the United States because the MTurk U.S. population has been found to be comparable with other online and offline research settings [46]. This approach enabled us to reach a nationwide audience of potential robot users efficiently.

In MTurk, we conducted a questionnaire of open-ended questions complemented by closed-ended questions for background information, such as age, gender, nationality, education, and employment. The questionnaire targeted employees who had worked with physical robots, either using or working next to them (more than tried/trialed). With the term physical robot, we refer to any robot technology with a physical embodiment (excluding, e.g., software robots and chatbots). We conducted a pre-test and a pilot study in December 2021, which helped us evaluate the questions asked and make minor wording changes. The pilot study was included in the final dataset. The rest of the final dataset was collected between January and March 2022.
Besides the location criterion, several criteria typically used in MTurk [46] were applied to assure data quality: the number of respondents’ approved HITs was set to 1000–5000, the HIT approval rate was set to 95–99%, and we used an attention check question. We excluded answers that, for example, clearly did not answer the questions asked, contained text copied from the internet, or otherwise seemed like spam. The attention check question helped in identifying such answers, although it was never the only criterion used for exclusion. The respondents were rewarded with 2.90 U.S. dollars, the expected time for answering the questionnaire being 15–20 minutes.

The final dataset consisted of 208 respondents who had worked with physical robots. The respondents’ average age was 40 years. Of the respondents, 60% were male and 38% were female, while 2% chose not to answer. Respondents represented various industries, the most commonly mentioned of which were manufacturing, ICT, finance and banking, healthcare and the medical sector, and retail. Accordingly, the respondents had worked with various kinds of robots, including robotic arms in industrial settings, delivery robots, cleaning robots, and service robots greeting customers in hotels or malls.

As pointed out by Hökkä et al. [47], research on emotions in organizational settings tends to emphasize negative emotions and even ignore positive ones. Therefore, we grouped the questions into two distinct categories – positive and negative emotions – as this allowed us to also examine positive emotional experiences in more detail. The open-ended questions were:

1. What kinds of positive emotions have you experienced when working with a physical robot?
2. What especially has caused positive emotions when working with a physical robot?
3. What kinds of negative emotions have you experienced when working with a physical robot?
4. What especially has caused negative emotions when working with a physical robot?
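The data-quality screening described at the start of this subsection can be sketched as a simple filter. This is an illustrative sketch only, not the authors' actual tooling: the field names (location, approved_hits, approval_rate, attention_check_passed) are hypothetical, MTurk applies worker-qualification criteria server-side before workers can accept a HIT, and the attention check was only one of several exclusion criteria.

```python
# Hypothetical sketch of the screening criteria reported in the paper.
# All field names are illustrative; in practice MTurk enforces the
# worker qualifications (approved HITs, approval rate, location)
# server-side, and content-based exclusions were applied manually.

def passes_screening(response: dict) -> bool:
    """Return True if a response meets the reported quality criteria."""
    return (
        response["location"] == "US"                      # U.S.-based workers only
        and 1000 <= response["approved_hits"] <= 5000     # 1000-5000 approved HITs
        and 95 <= response["approval_rate"] <= 99         # 95-99% HIT approval rate
        and response["attention_check_passed"]            # attention check question
    )

# Two mock responses: the second fails the approved-HITs threshold.
responses = [
    {"location": "US", "approved_hits": 1500, "approval_rate": 97,
     "attention_check_passed": True},
    {"location": "US", "approved_hits": 800, "approval_rate": 97,
     "attention_check_passed": True},
]
screened = [r for r in responses if passes_screening(r)]
```

Note that such a filter can only encode the mechanical criteria; the content-based exclusions (off-topic answers, copied text, spam) required human judgment.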
Participants’ answers to the questions ranged from 11 to 320 words. The questionnaire also contained questions concerning other technologies, but these were not considered in this article.

3.2. Data analysis

The analysis was conducted using qualitative content analysis, following the guidelines by Lune and Berg [48]. After inductive coding and categorization, themes and relationships in the data were identified, examined, and compared to previous research. In other words, the analysis started with an exploratory bottom-up approach [49] but iterated toward the end into more of a top-down process.

The first author began the coding process by creating data-driven subcategories. To identify affective cues, that is, antecedents of affective responses, the coding was based on identifying specific aspects of working with robots. At this stage, there were 23 subcategories of affective cues (e.g., technical errors and sense of achievement). To identify affective responses, a similar coding process was conducted based on identifying specific emotions or affects. At this stage, there were 40 subcategories of positive affective responses (e.g., happy and relaxed) and 31 subcategories of negative affective responses (e.g., frustration and worry). Several of the affective responses had only one mention. Next, the subcategories were discussed among the authors, and the subcategories of affective cues were then combined by the first author into seven broader main categories. In the final stage, the original data-driven categories of affective cues were classified as material, social, and personal affective cues, following Stein et al. [14, 15] in using these terms to refer to specific elements or characteristics of a situation evoking emotions.
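The final classification step can be represented as a simple mapping from main categories to cue types. The seven main categories and their material/personal/social grouping below are taken from the paper's findings; the data structure itself is an illustrative sketch, not part of the authors' analysis workflow (the full coding scheme also contained the 23 underlying subcategories, which are not reproduced here).

```python
# Illustrative mapping of the seven main categories of affective cues
# (as reported in the findings) to their cue types. The mapping format
# is a sketch; the original analysis was conducted manually.

MAIN_CATEGORIES = {
    "robot as a technological advancement": "material",
    "technical errors and restrictions": "material",
    "robot's autonomy": "material",
    "robot user's self-efficacy": "personal",
    "emotional connection with a robot": "social",
    "decrease in human-human interaction": "social",
    "fear of losing jobs": "social",
}

def cue_type(main_category: str) -> str:
    """Return the cue type (material, personal, or social) for a main category."""
    return MAIN_CATEGORIES[main_category]
```

Such a lookup makes the two-level structure of the coding scheme explicit: seven main categories, each assigned to exactly one of three cue types.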
The term affective cue allows the various triggers of emotions to be considered fluid and interrelated; certain conditions or technical features, for example, do not always trigger certain emotions; rather, these conditions may involve varying affective cues and cognitive sensemaking. Nineteen of the respondents described positive emotions and their antecedents but indicated not having any negative emotions toward the robots they worked with.

4. Findings: Affective cues and responses to robots at work

We identified seven affective cues in total and classified them into categories of material cues (robot as a technological advancement, technical errors and restrictions, robot’s autonomy), personal cues (robot user’s self-efficacy), and social cues (emotional connection with a robot, decrease in human–human interaction, fear of losing jobs). The identified cues partly overlap, since some of them can be explored from multiple perspectives. The cues evoked various affective responses, of which the most common are reported below. However, it is worth emphasizing that affective responses result from individual appraisals in a given situation, and none of the affective cues can be generalized as the sole trigger of a specific affective response [16].

4.1. Material cues associated with robotic features

Our analysis revealed that employees responded affectively to various cues related to the materiality of a robot. We identified three material cues associated with robotic features: robot as a technological advancement (180 mentions), technical errors and restrictions (161 mentions), and robot’s autonomy (88 mentions). The technical features, functionality, and design of the robot evoked both positive and negative affective responses.
Moreover, it turned out that some material features could be a root cause of both positive and negative affective responses, although it was not always clear whether the respondents themselves identified the connection.

The most common cue evoking affective responses to robots was perceiving them as a positive technological advancement at work. In particular, getting work done more efficiently thanks to the speed and help of a robot was identified as eliciting positive affective responses of happiness and satisfaction: “When you work with these robots, there is a feeling of satisfaction on how they complete this task faster, without getting tired, it’s just so satisfying.” For some employees, the robot’s efficiency meant that time was freed up for other tasks, and others responded positively to increased capabilities and being more productive. Moreover, it was evident that some employees were intrigued by the pure novelty of robotic technology. Working with robots can create a sense of modernity and progress, to which employees may respond with, for example, happiness, excitement, or enthusiasm: “I am attracted to technology, so any opportunity I have had to work with a new robot just makes me happy. I like seeing how much more advanced each new model is compared to the model or unit that came before it.”

Working with robots may have very practical and concrete positive outcomes for employees. For some, robots have decreased routine manual work and/or physically demanding tasks and thus also decreased cognitive and/or physical workloads. Delegation of unpleasant work tasks to a robot evoked affective responses of happiness and relief.
One employee described feeling “--- much better and --- happier at work,” elaborating: “I no longer have to do all of the heavy lifting, it has increased my mood while working.” More specifically, having a robot perform tasks that previously belonged to a human employee is related to the autonomy of a robot, which is discussed in more detail later. Moreover, the accuracy and predictability of robots were noted to decrease stress.

Unsurprisingly, technical errors and restrictions were the most commonly mentioned triggers of negative affective responses among employees. In particular, the malfunctioning, technical errors, and glitches of robots evoked negative affective responses of frustration and anxiety. According to the data, repairing a malfunctioning robot was often a challenging task and could take a long time. In addition, external support was often needed, as employees working with robots may not have the ability to repair the robots themselves: “I get frustrated because I can’t fix it and I have to wait on someone else to do this.” Consequently, employees may perceive robots as hindering and delaying their work. A feeling of not having control over the malfunction leads to frustration and impatience. For some employees, stronger responses of anger and hatred were also present, as stated by an employee who described having felt hatred: “When a robot just won’t work correctly even though everything is set correctly and nothing is physically wrong.”

Apart from technical errors, robots have indisputable material restrictions. The low adaptability of a robot signifies its inability to adapt to the changing needs of the robot user. Most robots are programmed to execute a single task repeatedly. Accordingly, employees working with robots may be compelled to engage in similar repetitiveness. This contrasts with human work teams, among which behavioral adaptations are practical everyday skills.
Thus, a lack of flexibility may evoke affective responses of frustration and boredom. As one employee described, “Feeling a little lonely and isolated” because “When I’m hearing certain prompts and commands over and over on a bad day [it] can get kind of irritating.”

A specific material cue evoking affective responses was the robot’s autonomy. Autonomy itself was not mentioned in the responses but was evident from the material cues mentioned above. Notably, the robot’s autonomy as a technical feature elicited mixed affective responses. The same technological advancements that lessen an employee’s cognitive or physical workload and evoke positive affective responses may sooner or later threaten the employee’s job altogether. Whether this threat is realistic is less relevant than the fact that employees feel it. Therefore, apart from positive affective responses, robot autonomy evoked negative affective responses of fear and worry: “Though the robot is helpful, I sometimes fear that the robot will eventually make my job unneeded. So sometimes I get a little worried about that.”

4.2. Personal cues associated with robot user’s characteristics

Our analysis revealed that, apart from material cues and technical robotic features, employees responded affectively to personal cues associated with their own characteristics. We identified one personal cue from the data: robot user’s self-efficacy (100 mentions). As expected, a user’s self-efficacy evoked both positive and negative affective responses. The majority of affective responses to the robot user’s self-efficacy were positive responses of satisfaction and excitement.
Working with a robot elicited a sense of accomplishment in employees, especially after learning something new or being able to solve a problem: “Fixing it myself, seeing it at work again because of my knowledge.” Controlling and maintaining the robot boosted self-efficacy and positive responses, as with an employee who enjoyed “being the go-to guy for how it works.” Employees could perceive a work task as rewarding and be proud of it, even though the task was executed by a robot, if they could maintain a sense of being responsible for the robot’s work and having control: “I was proud working with the robots. I just sounded impressive when I told people I “fixed robots”. They thought it was more complicated than it was. Plus, the robots did tasks that would have been difficult for a human to do efficiently.”

Apart from positive affective responses, the robot user’s self-efficacy also evoked negative affective responses of frustration, worry, and anxiety. Having to learn new skills was sometimes considered challenging: “The frustration in the beginning of learning something completely new at my age. I seriously had doubts if I would ever master the skills needed to interact with robots.” Then again, not being fully trained yet having to work with a robot was a trigger for confusion and self-doubt. Furthermore, self-efficacy as a personal cue is related to the material cues discussed above. For example, the inability to fix a robot’s technical malfunctions and errors affected users’ perceptions of their own performance.

4.3. Social cues associated with robots changing workplace dynamics

Our analysis revealed that, apart from material and personal cues, employees responded affectively to social cues associated with robots changing workplace dynamics. We identified three social cues from the data: emotional connection with a robot (51 mentions), decrease in human–human interaction (24 mentions), and fear of losing jobs (30 mentions).
Employees responded to social cues with both positive and negative affective responses. The cues of emotional connection with a robot and decrease in human–human interaction evoked both positive and negative affective responses depending on the employees’ appraisals and preferences.

The most common social cue evoking affective responses among employees was emotional connection with a robot. Depending on the employee, either the perceived lack of emotional connection with a robot caused negative affective responses, or the perceived connection with a robot caused positive ones. For some, the emotional connection evoked mixed affective responses. Positive and negative responses were more or less equally distributed in the data. Regardless of the nature of the affective responses, employees seemed to consider the emotional connections and social dynamics of the workplace important.

The perceived lack of emotional connection with the robot evoked negative affective responses of frustration and loneliness. Some employees missed human coworkers and everyday social connection, represented by, for example, small talk or occasional smiles. As one employee described their relationship with a robot: “It is emotionless and performs only set tasks, which makes me feel like having a one-sided relationship.” The lack of emotional connection and the insufficient interaction it involved were related to some of the material cues discussed above. Inadaptable, inflexible robots lack the spontaneity natural to humans and do not adapt to changing situations like human employees would. This underlines the mechanical, unemotional nature of robots, but in some cases it may even cause safety hazards. In such cases, employees feel the robot “doesn’t care”: “When something goes wrong.
The robot does not care and will do the same action over and over again.” For some employees, the perceived emotional connection with a robot evoked positive affective responses of happiness and relaxation. A few employees reported enjoying the human-like characteristics of a robot, such as its daily greetings or dressing it up. Often, the robot was seen as a coworker or part of a team: “--- I think the robot is also a good colleague in that they are willing to help all the time.” For some, the robot appeared to be even like a friend, or someone with whom they could share an emotional bond. One employee described an emotional attachment to a robot they were working with: “--- It doesn’t matter that it’s not real, just that it appears to be real.” However, some employees reported both positive affective responses of emotional connection with robots and negative affective responses of not having “enough” of it. Robots may also change workplace dynamics in other ways. Some employees identified their fear of losing jobs as a primary trigger of negative affective responses at work. The fear of losing one’s job could be classified as a personal cue, but the data reveal that the fear was often directed not solely at respondents’ own jobs but also at those of coworkers. Therefore, fear of losing jobs may be perceived as a broader issue affecting workplace dynamics. Naturally, the fear of losing jobs evoked purely negative affective responses of fear, worry, and anxiety. A few of the employees had already experienced the replacement of a coworker with a robot: “The fact that up until two years ago the robot’s job had been done by a human who I knew and liked.” The fear of losing jobs is related to the material cues of technological advancement and the robot’s autonomy – the same employees enjoyed working with modern technology but simultaneously feared it would advance “too far” and replace them.
The last social cue we identified from the data was decrease in human–human interaction. Although it is close to the cue of emotional connection with a robot, within this cue the trigger of affective responses was not the robot itself but the lack of social dynamics it symbolized – less human–human interaction compared to a workplace without robots. From this perspective, the decrease in human–human interaction evoked positive affective responses of relief and relaxation. Some employees appeared either to have had negative experiences with human coworkers in the past or to anticipate them based on their presumptions: “I don’t have to worry about a coworkers stupid bullshit.” Therefore, they perceived working with a robot as preferable to working with humans. With a robot, social norms are not important: “It is better than working with a person. A robot doesn’t have feelings and so you can direct it to do what you want without worrying about hurting it’s feelings.” Some employees considered the workplace dynamics with other humans stressful and competitive, whereas a robot was seen as nonjudgmental or easier to work with: “I think the lack of drama and competition will create a more positive work environment. The robot also won't get mad or upset if I make a mistake.” In some cases, robots were considered more reliable and consistent than humans, which further decreased general stress.

5. Discussion

A large part of prior research on robots becoming increasingly common in various industries has either emphasized employees’ negative attitudes toward robots or speculated about robots’ potential to cause job loss. Our study extends these approaches by demonstrating how employees working with robots experience an array of both negative and positive affective responses to the technology. By so doing, it provides a more nuanced understanding of IT (and specifically robot) use, as called for by Stein et al. [14, 15].
How employees respond affectively to working with robots is related not only to the technology itself but also to multiple social and personal dimensions. Furthermore, employees’ affective responses are not always uniform but sometimes mixed. Based on our analysis, employees experiencing negative affective responses to robots, such as fear or frustration, can still identify various triggers of positive affective responses. This is in line with a study by Hoorn et al. [42, p. 2], who noted that “a robot may be appreciated for working effectively but simultaneously that effectiveness may induce fear of job loss.”

5.1. Implications for research and practice

Our study contributes to the existing knowledge about affective responses in IT use by extending the study context to robots at work. Our analysis followed research by Stein et al. [14, 15] in classifying identified data-driven affective cues into three categories: material, personal, and social cues. We identified seven affective cues in total: the material cues of robot as a technological advancement, technical errors and restrictions, and robot’s autonomy; the personal cue of robot user’s self-efficacy; and the social cues of emotional connection with a robot, decrease in human–human interaction, and fear of losing jobs. In line with Stein et al. [14, 15], our findings underscore the importance of material cues, such as technical features and restrictions, and the possibilities of technology in eliciting affective responses. However, robots, and physical robots in particular, differ from other IT systems and may have characteristic features that prove to be salient in evoking affective responses. Our analysis reveals that a large part of affective responses to robots relate to robots being associated with humans, such as the emotional connection and interaction with a robot, or lack thereof. Robots are often perceived as coworkers and, accordingly, seen as friends or rivals. Our study has several practical implications.
The findings of our study can be utilized by organizations planning the implementation of robots, but they also have value for organizations already using them. Paying attention to employees’ affective responses and their antecedents can help to smoothen the implementation process and increase overall job satisfaction. Experiencing positive emotions at work improves performance at both the individual and organizational levels, while negative emotions may degrade organizational culture and, for example, lead to employee stress and burnout [12]. An understanding of emotions or affective responses may be used to motivate employees and improve their change readiness. Emotions are building blocks of attitudes [50], and an understanding of employees’ affective responses may help support the formation of positive attitudes toward robots. Apart from the technology itself, attitudes toward robots are affected by attitudes toward organizational change, which in turn is affected by emotions such as insecurity and loss of control [13]. The findings of our study suggest that employees value a sense of control and respond more positively to, for example, the delegation of work tasks to robots if their sense of control is maintained. Accordingly, a sense of control could help prevent the fear of job loss. Employers could put effort into dispelling employees’ worries by, for example, sharing the organization’s plans for robotic technologies and employee retraining. On the other hand, negative emotions in general can also act as important indicators of injustice or wrongful conduct in organizations [12]. Therefore, emotions that are considered negative as such may have positive outcomes for individuals [51].

5.2. Limitations and future research

Our research has certain limitations that need to be acknowledged. First, affective responses and their antecedents may differ in different contexts.
They may be evoked by the organizational context [16], a factor that our study did not focus on. It is also possible that affective responses and affective cues differ depending on the robot type and features such as its appearance. In addition, prior research [32] showed that acceptance of robots, and the motives for acceptance, varied across different countries, perhaps due to disparities in the labor market and job security, and this might also be the case with affective responses. Second, studying affective responses with a questionnaire presents its own challenges. Affective responses were studied retrospectively, which may create recall bias. A written questionnaire also measures respondents’ ability to express themselves in writing, and thus differs from interview studies. The division of emotions into categories of positive and negative may restrict the representation of more ambivalent emotions and does not allow evaluations of employees’ general moods or attitudes toward robots. However, it did provide room for both kinds of emotions and resulted in data on the antecedents and characteristics of positive emotions, as called for by Hökkä et al. [47]. Finally, the analysis of the study was principally conducted by the first author, and the results may have been affected by their interpretation. Our findings provide possible topics for future research. In future studies, it could be important to investigate how affective responses affect robot use and emerging behavioral patterns. Furthermore, it would be of interest to explore what kinds of coping strategies employees apply when experiencing negative affective responses related to robot use. Our findings suggest that physical robots tend to evoke mixed or ambivalent affective responses. Previous research [15] has shown that in such situations, IT users mix different coping strategies as well.
Whether similar user strategies are applied when working with robots remains to be investigated.

6. Conclusion

As robots become increasingly common in various industries, we need a more comprehensive understanding of employees’ experiences of working with them. Our study suggests that employees have various affective responses to working with robots, triggered by multiple affective cues. Paying attention to cues triggering positive affective responses can help reinforce employees’ job satisfaction and performance, or even turn robot resisters into supporters. On the other hand, cues triggering negative affective responses may indicate a need for adjustments in an organization.

7. Acknowledgements

This research has been funded by the Emil Aaltonen Foundation and the Academy of Finland (341359).

8. References

[1] IFR. International Federation of Robotics, Presentation of World Robotics Press Conference, 2021. URL: https://ifr.org/downloads/press2018/2021_10_28_WR_PK_Presentation_long_version.pdf.
[2] IFR. International Federation of Robotics, Robots and the Workplace of the Future, 2018.
[3] R. Kurt, Industry 4.0 in Terms of Industrial Relations and Its Impacts on Labour Life, Procedia Computer Science 158 (2019) 590–601. doi:10.1016/j.procs.2019.09.093
[4] G. Graetz, G. Michaels, Robots at Work, The Review of Economics and Statistics 100 (2018) 753–768. doi:10.1162/rest_a_00754
[5] D. Acemoglu, P. Restrepo, Robots and Jobs: Evidence from US Labor Markets, The Journal of Political Economy 128 (2020) 2188–2244. doi:10.1086/705716
[6] P. Fleming, Robots and Organization Studies: Why Robots Might Not Want to Steal Your Job, Organization Studies 40 (2019) 23–38. doi:10.1177/0170840618765568
[7] C. Lloyd, J. Payne, Fewer jobs, better jobs?
An international comparative study of robots and ‘routine’ work in the public sector, Industrial Relations Journal 52 (2021) 109–124. doi:10.1111/irj.12323
[8] E. Brynjolfsson, A. McAfee, The second machine age: Work, progress, and prosperity in a time of brilliant technologies, W. W. Norton, New York, London, 2014.
[9] M. Ford, The rise of the robots: Technology and the threat of mass unemployment, Oneworld, London, 2016.
[10] S. You, L. P. Robert Jr., Emotional attachment, performance, and viability in teams collaborating with embodied physical action (EPA) robots, Journal of the Association for Information Systems 19 (2018) 377–407. doi:10.17705/1jais.00496
[11] M. Piccarozzi, B. Aquilani, C. Gatti, Industry 4.0 in Management Studies: A Systematic Literature Review, Sustainability 10 (2018) 1–24. doi:10.3390/su10103821
[12] S. G. Barsade, D. E. Gibson, Why Does Affect Matter in Organizations?, Academy of Management Perspectives 21 (2007) 36–59. doi:10.5465/AMP.2007.24286163
[13] A. Meissner, A. Trübswetter, A. Conti-Kufner, J. Schmidtler, Friend or Foe? Understanding Assembly Workers' Acceptance of Human-robot Collaboration, ACM Transactions on Human-Robotic Interaction 10 (2020) 1–30. doi:10.1145/3399433
[14] M-K. Stein, S. Newell, E. L. Wagner, R. D. Galliers, Continued Use of IT: An Emotional Choice, in: Thirty Third International Conference on Information Systems, Orlando, 2012, pp. 1–19.
[15] M-K. Stein, S. Newell, E. L. Wagner, R. D. Galliers, Coping with information technology: Mixed emotions, vacillation, and nonconforming use patterns, MIS Quarterly 39 (2015) 367–392. doi:10.25300/MISQ/2015/39.2.05
[16] A. Beaudry, A. Pinsonneault, The Other Side of Acceptance: Studying the Direct and Indirect Effects of Emotions on Information Technology Use, MIS Quarterly 34 (2010) 689–710. doi:10.2307/25750701
[17] A. Bhattacherjee, Understanding Information Systems Continuance: An Expectation-Confirmation Model, MIS Quarterly 25 (2001) 351–370.
doi:10.2307/3250921
[18] V. Venkatesh, Determinants of Perceived Ease of Use: Integrating Control, Intrinsic Motivation, and Emotion into the Technology Acceptance Model, Information Systems Research 11 (2000) 342–365. doi:10.1287/isre.11.4.342.11872
[19] A. O. De Guinea, M. L. Markus, Why Break the Habit of a Lifetime? Rethinking the Roles of Intention, Habit, and Emotion in Continuing Information Technology Use, MIS Quarterly 33 (2009) 433–444. doi:10.2307/20650303
[20] S. Ahmed, The Cultural Politics of Emotion, 2nd ed., Edinburgh University Press, Edinburgh, 2014.
[21] M. Wetherell, Affect and emotion: A new social science understanding, SAGE, Los Angeles, London, 2012.
[22] C. Ratner, A Cultural-Psychological Analysis of Emotions, Culture & Psychology 6 (2000) 5–39. doi:10.1177/1354067X0061001
[23] A. Ducey, More Than a Job: Meaning, Affect, and Training Health Care Workers, in: P. T. Clough (ed.), The Affective Turn: Theorizing the Social, Duke University Press, Durham, N.C., 2007.
[24] P. Zhang, The Affective Response Model: A Theoretical Framework of Affective Concepts and Their Relationships in the ICT Context, MIS Quarterly 37 (2013) 247–274. doi:10.25300/MISQ/2013/37.1.11
[25] R. Kirby, J. Forlizzi, R. Simmons, Affective social robots, Robotics and Autonomous Systems 58 (2010) 322–332. doi:10.1016/j.robot.2009.09.015
[26] S. Paluch, S. Tuzovic, H. F. Holz, A. Kies, M. Jörling, ‘My colleague is a robot’: Exploring frontline employees’ willingness to work with collaborative service robots, Journal of Service Management 33 (2022) 363–388. doi:10.1108/JOSM-11-2020-0406
[27] C. S. Franklin, E. G. Dominguez, J. D. Fryman, M. L. Lewandowski, Collaborative robotics: New era of human–robot cooperation in the workplace, Journal of Safety Research 74 (2020) 153–160. doi:10.1016/j.jsr.2020.06.013
[28] A. Hentout, M. Aouache, A. Maoudj, I.
Akli, Human-robot interaction in industrial collaborative robotics: A literature review of the decade 2008–2017, Advanced Robotics 33 (2019) 764–799. doi:10.1080/01691864.2019.1636714
[29] A. Rega, F. Vitolo, C. Di Marino, S. Patalano, A knowledge-based approach to the layout optimization of human–robot collaborative workplace, International Journal on Interactive Design and Manufacturing 15 (2021) 133–135. doi:10.1007/s12008-020-00742-0
[30] E. Fernández-Macías, D. Klenert, J. Antón, Not so disruptive yet? Characteristics, distribution and determinants of robots in Europe, Structural Change and Economic Dynamics 58 (2021) 76–89. doi:10.1016/j.strueco.2021.03.010
[31] M. Persson, D. Redmalm, C. Iversen, Caregivers’ use of robots and their effect on work environment – a scoping review, Journal of Technology in Human Services (2021) 1–27. doi:10.1080/15228835.2021.2000554
[32] T. Turja, A. Oksanen, Robot acceptance at work: A multilevel analysis based on 27 EU countries, International Journal of Social Robotics 11 (2019) 679–689. doi:10.1007/s12369-019-00526-x
[33] S. Erebak, T. Turgut, Anxiety about the speed of technological development: Effects on job insecurity, time estimation, and automation level preference, Journal of High Technology Management Research 32 (2021) 100419. doi:10.1016/j.hitech.2021.100419
[34] V. N. Lu, J. Wirtz, W. H. Kunz, S. Paluch, T. Gruber, A. Martins, P. G. Patterson, Service robots, customers and service employees: What can we learn from the academic literature and where are the gaps?, Journal of Service Theory and Practice 30 (2020) 361–391. doi:10.1108/JSTP-04-2019-0088
[35] T. Turja, Accepting robots as assistants: A social, personal, and principled matter, Ph.D. thesis, Tampere University, Tampere, Finland, 2019.
[36] S. K. Ötting, L. Masjutin, J. J. Steil, G. W. Maier, Let's Work Together: A Meta-Analysis on Robot Design Features That Enable Successful Human-Robot Interaction at Work, Human Factors (2020) 1–24.
doi:10.1177/0018720820966433
[37] S. Sarker, S. Chatterjee, X. Xiao, A. Elbanna, The Sociotechnical Axis of Cohesion for the IS Discipline: Its Historical Legacy and its Continued Relevance, MIS Quarterly 43 (2019) 695–719. doi:10.25300/MISQ/2019/13747
[38] J. P. P. Jokinen, J. Silvennoinen, The appraisal theory of emotion in human-computer interaction, in: R. Rousi, J. Leikas, P. Saariluoma (eds.), Emotions in Technology Design: From Experience to Ethics, Springer International Publishing AG, 2020.
[39] R. S. Lazarus, S. Folkman, Stress, Appraisal, and Coping, Springer Publishing Company, New York, 1984.
[40] T. Juvonen, M. Kolehmainen, Affective inequalities in intimate relationships, Routledge, Abingdon, New York, 2018.
[41] D. Murungi, M. Wiener, M. Marabelli, Control and emotions: Understanding the dynamics of controllee behaviours in a health care information systems project, Information Systems Journal 29 (2019) 1058–1082. doi:10.1111/isj.12235
[42] J. F. Hoorn, T. Baier, J. A. N. Van Maanen, J. Wester, Silicon Coppelia and the Formalization of the Affective Process, IEEE Transactions on Affective Computing, 2021. doi:10.1109/TAFFC.2020.3048587
[43] S. van der Woerdt, P. Haselager, When robots appear to have a mind: The human perception of machine agency and responsibility, New Ideas in Psychology 54 (2019) 93–100. doi:10.1016/j.newideapsych.2017.11.001
[44] K. C. Yam, Y. E. Bigman, P. M. Tang, R. Ilies, D. De Cremer, H. Soh, K. Gray, Robots at work: People prefer—and forgive—service robots with perceived feelings, Journal of Applied Psychology 106 (2021) 1557–1572. doi:10.1037/apl0000834
[45] T. Law, M. Chita-Tegmark, M. Scheutz, The Interplay Between Emotional Intelligence, Trust, and Gender in Human–Robot Interaction: A Vignette-Based Study, International Journal of Social Robotics 13 (2020) 297–309. doi:10.1007/s12369-020-00624-1
[46] W. Mason, S.
Suri, Conducting behavioral research on Amazon’s Mechanical Turk, Behavior Research Methods 44 (2012) 1–23. doi:10.3758/s13428-011-0124-6
[47] P. Hökkä, K. Vähäsantanen, S. Paloniemi, Emotions in Learning at Work: A Literature Review, Vocations and Learning 13 (2019) 1–25. doi:10.1007/s12186-019-09226-z
[48] H. Lune, B. L. Berg, Qualitative Research Methods for the Social Sciences, Pearson, Harlow, Essex, 2017.
[49] M. D. Myers, Qualitative Research in Business and Management, SAGE, London, Thousand Oaks, Calif., New Delhi, Singapore, 2020.
[50] P. Saariluoma, J. Cañas, J. Leikas, Designing for Life: A Human Perspective on Technology Development, Palgrave Macmillan, London, 2016.
[51] F. Meijers, Career Learning in A Changing World: The Role of Emotions, International Journal for the Advancement of Counselling 24 (2002) 149–167. doi:10.1023/A:1022970404517