Utilizing Organizational Communication Theory for Community Embedded Robotics

Emily Norman1,*, Ryan Gupta2, Luis Sentis2 and Keri K. Stephens1
1 Department of Communications Studies, University of Texas at Austin, Austin, Texas, USA
2 Department of Aerospace Engineering and Engineering Mechanics, University of Texas at Austin, Austin, Texas, USA

Abstract
Our team of researchers from communications studies, engineering, computer science, and data informatics has worked towards transdisciplinary research in community embedded robotics for the past two years. This paper highlights how Adaptive Structuration Theory, a seminal social science framework, can inform trust when exploring the complex problem of deploying autonomous robots in our community at the University of Texas at Austin. More information about our full team and ongoing research can be found at https://sites.utexas.edu/nsf-gcr/.

Keywords
Community Embedded Robotics, Organizational Communications Theory, Adaptive Structuration Theory

ALTRUIST, BAILAR, SCRITA, WARN 2024: Workshop on sociAL roboTs for peRsonalized, continUous and adaptIve aSsisTance, Workshop on Behavior Adaptation and Learning for Assistive Robotics, Workshop on Trust, Acceptance and Social Cues in Human-Robot Interaction, and Workshop on Weighing the benefits of Autonomous Robot persoNalisation. August 26, 2024, Pasadena, USA
emilynorman@utexas.edu (E. Norman); ryan.gupta@utexas.edu (R. Gupta)
ORCID: 0009-0001-9112-760X (E. Norman); 0000-0002-3324-6600 (R. Gupta); 0000-0003-2856-4863 (L. Sentis); 0000-0002-9526-2331 (K. K. Stephens)
© 2024 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).

1. Introduction

Robot deployments in communities represent a highly complex task, as pedestrian spaces are rapidly changing, crowded, and present various hazards. Furthermore, robots share space with humans who maintain varying worldviews, attitudes, tolerances, and baseline trust levels. Lab studies have shown that robot malfunctions can alter participants' trust [1], impacting their perceived safety [2].

Over the past two years, our team has approached the challenge of deploying trustworthy robots in community settings. We aim to converge methodologies, frameworks, and mental models from human-robot interaction (HRI), social science, engineering, computer science, and data informatics. Topics of interest include perceived safety, social navigation, human-robot teaming, and users' perceptions of community robots. Our ultimate goal is to lead the next generation of transdisciplinary research into solving the highly complex and consequential problem of trust in community embedded robotics. All fields intersecting community embedded robotics can benefit from a better understanding of social interaction between people and robots, ultimately leading to the development of autonomous robots that are more trustworthy.

This paper argues for the use of Adaptive Structuration Theory (AST) as an analytical method to increase understanding of community attitudes, trust, and decision making in an embedded robotics context. Adaptive Structuration Theory is designed to examine the social structure of interactions between people and technology, offering a new perspective from which to study community embedded robotics. Furthermore, AST accounts for how people, technologies, and organizations change over time. To that end, we propose Adaptive Structuration Theory as a framework to explore community and organizationally embedded robotics.

2. Adaptive Structuration Theory in Robotics

To ground the team in our community, the project began with a series of nearly 70 qualitative interviews regarding participants' ideas, concerns, knowledge, and fears of community embedded robots. The primary thrust was to ascertain community members' general feelings of trust, acceptance, and understanding of robots. We believe this background study was an imperative step before successful deployments would be possible. The ongoing data analysis includes characterizing a theoretical deployment of community embedded robots, anticipating issues, successes, and unintended findings prior to deployment. To this end, we are performing a sensitized, constant comparative analysis [3] using Adaptive Structuration Theory on the interview transcripts.

Adaptive Structuration Theory is frequently used in organizational studies to examine changes in social interaction as advanced information technologies (AIT) are implemented [4]. Field studies suggest that deploying robots into organizations affects workflow, social, and political structures following implementation [5]. The benefit of using AST in such studies is that researchers can analyze the aforementioned structural and social changes through a systematic method. Furthermore, as many roboticists would agree, the intended use of a technology does not always match its actual use (e.g., https://spectrum.ieee.org/children-beating-up-robot). Traditionally, AST examines technology use in organizational decision making, such as video conferencing platforms, voting tools, and virtual schedulers. These tools are referred to as group decision support systems (GDSSs). We argue that extending the model to technology that makes decisions is a valuable endeavor. Indeed, this extension to investigate autonomous robots in communities provides significant value to roboticists seeking to understand trust, adoption, and social dynamics. Simultaneously, it enables social scientists to understand the complexities of human-robot interaction dynamics as technological change in organizations.

2.1. Structural Features

AST relies heavily upon a social perspective of technologies because the original technologies studied (GDSSs) were primarily social-organizational tools. The theory assumes that all technology provides a series of structures, referred to as structural features, that govern how people use systems during social interactions. For robots, structural features include the robots' communication abilities, their ability to manipulate the world around them, and their mobility. For instance, one cannot have a conversation with a robot that lacks a microphone and speaker. Depending on the robot's level of interaction complexity, its structural features are more loosely or tightly bound, which represents its adaptability in different contexts. To continue the example, a robot equipped with a large language model to respond to conversation with a person is loosely bound in its communication abilities, while a robot with a fixed database of phrases would be tightly bound. As users encounter a robot and interact with its structural features, they adapt their own communication (interactive) practices and social structures. Such examination provides valuable insights that enable researchers to adapt a technology's features to suit community needs. Users can also leverage the robot's structural features to appropriate it in unexpected ways [6].
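To make the loosely versus tightly bound distinction more concrete for roboticist readers, the short sketch below contrasts a tightly bound, scripted communication feature with a loosely bound, generative one. The sketch is our illustration only, under the assumption that a conversational feature can be reduced to a text-in, text-out interface; the class and function names are hypothetical and do not come from AST or from any deployed system.

from abc import ABC, abstractmethod

class CommunicationFeature(ABC):
    """A structural feature governing how the robot converses with people."""

    @abstractmethod
    def respond(self, utterance: str) -> str:
        ...

class ScriptedSpeech(CommunicationFeature):
    """Tightly bound: replies come from a fixed database of phrases."""

    def __init__(self, phrases: dict, fallback: str = "Sorry, I cannot help with that."):
        self.phrases = phrases
        self.fallback = fallback

    def respond(self, utterance: str) -> str:
        # Only utterances matching the scripted keys receive a tailored reply.
        return self.phrases.get(utterance.lower().strip(), self.fallback)

class GenerativeSpeech(CommunicationFeature):
    """Loosely bound: replies are produced by a free-form generator, e.g., an LLM client."""

    def __init__(self, generate):
        self.generate = generate  # any callable mapping str -> str

    def respond(self, utterance: str) -> str:
        return self.generate(utterance)

if __name__ == "__main__":
    scripted = ScriptedSpeech({"where is the library?": "The library is two blocks north."})
    print(scripted.respond("Where is the library?"))  # inside the script
    print(scripted.respond("Can you carry my bag?"))  # outside the script: fixed fallback

    flexible = GenerativeSpeech(lambda text: f"(free-form reply to: {text})")
    print(flexible.respond("Can you carry my bag?"))  # adapts to novel requests

In a real deployment the generator would be backed by an actual language model; here a stand-in callable keeps the sketch self-contained. In AST terms, the scripted design constrains users' interactive practices far more tightly than the generative one.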
2.2. Spirit

According to AST, the most iterative component of a technology is its spirit, also thought of as its intended use. Interpreting the spirit of a robot lies in one's ability to anticipate (or examine post-deployment) resistance to or acceptance of the robot in context. Spirit is the meeting place between a technology's technical and non-technical features. Some scholars refer to spirit as a technology's philosophy, intentions, or values [7]. However, the designers' intentions often do not equate with the spirit of a technology. Instead, spirit is the combination of perspectives from many sources (e.g., designer, user, bystander, media) and is shaped through subsequent interactions over time.

For example, researchers tested personalized HRI with a snack delivery robot [8], and although they did not use AST, their study provides a clear example of spirit. The researchers initially described the robot as offering a "holistic service" and set out to mirror how service workers interact with customers. The original spirit of the robot, based on the designers' intent, could be labeled as routine, personalized service. However, the spirit changed over time as participants interacted with the robot based on the conditions they encountered. For instance, in the personalized condition, the robot would converse with participants based on previous snack orders and interactions. In the depersonalized condition, the robot had a pre-determined script that did not change based on interactions. Due to participants' reported lackluster attitudes toward the robot in the depersonalized condition, we might consider the spirit of the robot at the end of the experiment to be routine, impersonal service. On the other hand, reports from participants in the personalized condition would describe the robot's spirit along the lines of friendly, personalized service. This highlights that describing spirit captures how users' and designers' attitudes and experiences evolve through interactions with a technology over time.

2.3. Levels of Analysis

After the identification of structural features and spirit, the next step involves appropriation process analysis [4]. The appropriation analysis uses an analytical strategy to illustrate how new social structures and technologies affect human interaction more generally. The analytical strategy examines three different levels of social interaction: micro, global, and institutional. (See [9], [10], and [11] for step-by-step examples of appropriation analysis.) The various levels of appropriation analysis are described below, and the analysis components are shown in parentheses.

The micro level of analysis focuses on individual-level speech acts, such as pedestrian conversations, user experience interviews, or designer conversations. Analyzing these data produces evidence of adaptive social structures related to technology appropriation. Researchers should be sensitive to appropriations both within and outside the bounds of the designers' original intentions for the technology. Is the robot being used for its intended purpose, e.g., delivery (faithful appropriation)? Are pedestrians using the robot for entertainment due to repeated malfunctions (unfaithful appropriation)? Researchers may also find resistance to technology, phrases that challenge implementation, and overall skepticism (attitudes).
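As one concrete, deliberately simplified illustration of how micro-level evidence might be recorded, the sketch below codes invented transcript excerpts into the categories named above: faithful appropriation, unfaithful appropriation, and attitudes. The data structure and tallying function are our hypothetical operationalization, not a procedure prescribed by AST or by the appropriation-analysis examples cited in [9], [10], and [11].

from collections import Counter
from dataclasses import dataclass
from enum import Enum

class Code(Enum):
    FAITHFUL_APPROPRIATION = "faithful appropriation"      # used as designers intended
    UNFAITHFUL_APPROPRIATION = "unfaithful appropriation"  # used outside designers' intent
    ATTITUDE = "attitude"                                   # resistance, challenge, skepticism

@dataclass
class CodedExcerpt:
    source: str    # e.g., "pedestrian conversation", "user experience interview"
    excerpt: str   # the speech act being coded
    code: Code
    note: str = "" # analyst memo, e.g., which structural feature is implicated

def summarize(excerpts: list) -> Counter:
    """Tally codes so patterns can later be compared across groups and over time."""
    return Counter(e.code for e in excerpts)

if __name__ == "__main__":
    data = [
        CodedExcerpt("user experience interview",
                     "I had it bring coffee to the lab every morning.",
                     Code.FAITHFUL_APPROPRIATION),
        CodedExcerpt("pedestrian conversation",
                     "We kept blocking its path just to watch it reroute.",
                     Code.UNFAITHFUL_APPROPRIATION),
        CodedExcerpt("user experience interview",
                     "I don't see why campus needs these things at all.",
                     Code.ATTITUDE, note="skepticism toward deployment"),
    ]
    for code, count in summarize(data).items():
        print(f"{code.value}: {count}")

Tallies of this kind could then be aggregated across groups and across time to support the global and institutional levels described next.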
The global level of analysis identifies patterns of appropriation and social structure changes across groups. Data at this level may include meeting transcripts, revised user manuals, or media. Micro level texts may extend to the global level as they are gathered together and analyzed as a whole. The global level enables researchers to see the formation of larger changes in social interaction across time and multiple groups.

Finally, the institutional level of analysis focuses on discourse about technology occurring across a longitudinal span. Social structure changes at this level include new discourse surrounding the technology, differences among users across the years, and policy decisions. Other examples may include the implementation of robotics in new settings. Researchers interested in examining changes at the level of an entire organization will likely require an institutional level analysis; on the other hand, some researchers may find what they need at the micro and global levels. The ability to generalize should be taken into account.

3. Conclusions & Future Directions

This paper provides a brief introduction to Adaptive Structuration Theory for analytical use in understanding community perceptions of embedded robotics. Our hope is that further utilization of social science theory in HRI research will yield a new understanding of trust and social response to autonomous robots. Whether organizational change is municipal, corporate, or non-governmental, the introduction of new technology is typically met with uncertainty. This paper serves as an introduction for roboticists and social scientists alike who seek to understand community perceptions, increase trust, and foster acceptance of embedded robot deployments.

ACKNOWLEDGMENT

This research was supported in part by NSF Award #2219236 (GCR: Community Embedded Robotics: Understanding Sociotechnical Interactions with Long-term Autonomous Deployments) and Living and Working with Robots, a core research project of Good Systems, a UT Austin Grand Challenge. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References

[1] Neziha Akalin, Annica Kristoffersson, and Amy Loutfi. "Do you feel safe with your robot? Factors influencing perceived safety in human-robot interaction based on subjective and objective measures". In: International Journal of Human-Computer Studies 158 (2022), p. 102744.
[2] Neziha Akalin et al. "A taxonomy of factors influencing perceived safety in human–robot interaction". In: International Journal of Social Robotics 15.12 (2023), pp. 1993–2004.
[3] Barney Glaser and Anselm Strauss. Discovery of Grounded Theory: Strategies for Qualitative Research. Routledge, 2017.
[4] Gerardine DeSanctis and Marshall Scott Poole. "Capturing the complexity in advanced technology use: Adaptive structuration theory". In: Organization Science 5.2 (1994), pp. 121–147.
[5] Bilge Mutlu and Jodi Forlizzi. "Robots in organizations: The role of workflow, social, and environmental factors in human-robot interaction". In: Proceedings of the 3rd ACM/IEEE International Conference on Human-Robot Interaction. 2008, pp. 287–294.
[6] Dhaval Vyas, Cristina M. Chisalita, and Alan Dix. "Organizational affordances: A structuration theory approach to affordances". In: Interacting with Computers 29.2 (2017), pp. 117–131.
[7] Huub J. M. Ruel. "Stressing office technology's non-technical side: Applying concepts from adaptive structuration theory". In: Issues of Human Computer Interaction. IGI Global, 2004, pp. 225–262.
[8] Min Kyung Lee et al. "Personalization in HRI: A longitudinal field experiment". In: Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction. 2012, pp. 319–326.
[9] Gerardine DeSanctis and Marshall S. Poole. "Understanding the differences in collaborative system use through appropriation analysis". In: Proceedings of the Twenty-Fourth Annual Hawaii International Conference on System Sciences. Vol. 3. IEEE, 1991, pp. 547–553.
[10] David Salisbury and Matthew Stollak. "Process restricted AST: An assessment of group support systems appropriation and meeting outcomes using participant perceptions". In: ICIS 1999 Proceedings (1999), p. 3.
[11] Andrew M. Hardin, Clayton A. Looney, and Mark A. Fuller. "Self-efficacy, learning method appropriation and software skills acquisition in learner-controlled CSSTS environments". In: Information Systems Journal 24.1 (2014), pp. 3–27.