Proceedings of the Conference on Technology Ethics 2021 - Tethics 2021

Teaching of Technology IS Teaching of Ethics. But How?
Short paper

Norberto Patrignani1 and Iordanis Kavathatzopoulos2
1 Politecnico di Torino, Torino, Italy
2 Uppsala University, Uppsala, Sweden
1 norberto.patrignani@polito.it

Copyright © 2021 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).

Abstract. Computer experts, or computer professionals, know how a system is made and how it works. It is time to ask how it should be designed, who will use it, and for what purposes, up to the crucial question: whether to design it at all. To reach this awareness they need a strong ethical competence, but how can they acquire it? Introducing an analysis of the stakeholders' network in computer ethics courses for future designers can be a useful starting point.

Keywords: Computer Ethics, Teaching Ethics, Slow Tech

1 Introduction

Information and Communication Technologies (ICT) are now strictly intertwined with society and also have a critical impact on the environment. In the last twenty years the importance of adding an ethics component to the computer science curriculum has grown and is now largely recognized. The question still open is: while the technical part of the curriculum is quite easy to delineate, what is the best way to introduce the "computer ethics" part, where the social and ethical implications of design choices are addressed? This part now has to face the enormous challenges of the entire ICT supply chain (from design to development, manufacturing, usage, deployment, and disposal). The Slow Tech concept, an ICT that is good, clean, and fair, could provide some guidance in this direction (Patrignani, 2020). Computer professionals should also be able to "enlarge" their view by including the most important stakeholders involved in a system's design and deployment, identifying the relationships among them and, most importantly, the quality of these relationships: are they equal? Are they based on reciprocity? Since technology is not neutral, and technology and society co-shape each other (Johnson, 1985), the designers of computer systems and digital services should enlarge their view from the technical details to the social complexity around them. This is the background of the main claim of this short paper: teaching about technology means teaching about ethics. In the twenty-first century, cultivating technical competences while considering science and technology as neutral no longer makes sense.

2 Teaching of technology is teaching of ethics

The refusal of techno-determinism has been proposed since the first half of the twentieth century (Mumford, 1934; Ellul, 1954). The dawn of the computer age was also seen as the starting point of a controversial relationship between technology and society (Wiener, 1950). It was Weizenbaum, a computer science professor at MIT in the 1960s, who raised the critique of the use of computers in military applications and the risks of their abuse in any field of society (Weizenbaum, 1976). Despite these early warnings, ICT and the digital world have been seen as an "innovation", with society considered always late in this race.
Even the very definition of computer ethics provided by Moor describes it as an attempt to fill a "policy vacuum" about how computer technology should be used (Moor, 1985); technology in itself, and how it is designed, developed, and deployed, was never questioned. It is with the works of Johnson that the possibility of designers shaping technology is introduced: technology is not neutral, technology and society co-shape each other, and systems should be understood as "socio-technical systems" (Johnson, 1985). Then an even more difficult question arises: if computer professionals can steer socio-technical systems, what is the "right" direction? Since then, several scholars have proposed interesting approaches. Just to mention a few of them: "value-sensitive design" (Friedman, 1996; Nissenbaum, 1998), where human values are embedded in the design of tools and technology to support human "flourishing"; "participatory design" (Nygaard, 1996; Bødker et al., 1987), where the users who are going to use the systems under development participate as co-designers in the design process; and the view of design as "ethics by other means" (Verbeek, 2011; 2017), where design is seen as an inherently moral activity and designers as technological "mediators".

In line with this transparent and cooperative view of design, the Slow Tech approach proposes to focus on the "process of decision making" rather than on outcomes, a "philosophizing" approach to ethics. Several studies focus on the importance of the process of ethical decision making, where the path taken to reach conclusions is more important than the conclusions themselves (Kavathatzopoulos, 2012). It is a form of "proactive ethics", where designers are prompted to ask three fundamental questions. The first: is the system under development "good"? That is, is it designed with a human-centered approach and for "desirable" goals? The second: is it "clean"? That is, is it "sustainable", minimizing the environmental impact? The third: is it "fair"? That is, is it "ethically acceptable", taking into account the working conditions of all the people involved along the entire ICT supply chain (from mines in Africa to manufacturing plants in South Asia)?

Alongside these three questions, the core question is: how can the ethical competences and skills of computer professionals and designers be improved? The Slow Tech approach suggests drawing the stakeholders' network. This is an attempt to enlarge the view of the context of technology, and it is fruitful when used for facing a complex case study or a real scenario. The stakeholders' network is a graph with nodes (the stakeholders) and connections (the relationships). The analysis of this network, and in particular of the relationships, can be useful for identifying potential ethical issues and for cultivating the "ars interrogandi" of students.

Fig. 1. A typical stakeholders' network (Patrignani, 2020).

3 Analysis of the stakeholders' network

The ICT stakeholders' network related to a digital system is a complex one. Usually it is a value chain, that is, a global network of companies and stakeholders (see Fig. 1).
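To make the idea concrete, the following minimal sketch (in Python, purely illustrative and not part of the original Slow Tech material; the stakeholder names and attributes are placeholders, not taken from Fig. 1) shows how such a network could be represented as a graph of stakeholders and annotated relationships:

# Minimal sketch of a stakeholders' network: nodes are stakeholders,
# edges are the relationships to be analysed. Names are placeholders.
from dataclasses import dataclass, field

@dataclass
class Relationship:
    a: str                     # first stakeholder
    b: str                     # second stakeholder
    symmetrical: bool = False  # is the relationship reciprocal?
    notes: str = ""            # power, values, roles embedded in the link

@dataclass
class StakeholdersNetwork:
    stakeholders: set = field(default_factory=set)
    relationships: list = field(default_factory=list)

    def connect(self, a, b, symmetrical=False, notes=""):
        """Register both stakeholders and the relationship between them."""
        self.stakeholders.update({a, b})
        self.relationships.append(Relationship(a, b, symmetrical, notes))

# The three recurring core nodes plus one example service provider.
net = StakeholdersNetwork()
net.connect("technology developers", "users", notes="design choices reach users")
net.connect("policy makers", "technology developers", notes="soft/hard law")
net.connect("online service provider", "users", notes="data, profiling")

for r in net.relationships:
    print(f"{r.a} <-> {r.b} | symmetrical={r.symmetrical} | {r.notes}")

The point of the sketch is not the data structure itself but the explicit list of relationships that it forces the designer to write down and then question.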
At the center of the network there are always three nodes: the "technology developers" (usually the ICT vendors, the entities dedicated to developing ICT), the "users", and the "policy makers". This "core" network can then be enlarged by including all the connected entities related to the case under analysis or to the system that is going to be developed. When enough stakeholders and connections have been collected, a reflection time is required for looking at all these relationships. The focus here is on the arcs, the lines connecting the nodes (e.g. the arc connecting the provider of an online service and the user of the service). The fundamental question is: what kind of relationship is there? In order to qualify this relationship, it is necessary to analyse it in depth: is it "symmetrical"? What kinds of meanings and values are embedded in it in terms of power, dimension, role, values, and desires? In this reflection time the students, the computer professionals (or the future systems' designers), can identify possible conflicts and ethical dilemmas. They can look at the relationship from many points of view, at least those of the developers, the users, and the policy makers.

When comparing two entities, such as the two connected stakeholders under analysis, the first approach is usually to evaluate the "quantitative" side ("equal", "greater than", or "less than"), but in this case a "qualitative" approach can be more fruitful. Since the analysis of the two connected entities has the goal of identifying possible ethical dilemmas, it is preferable to use the so-called "comparative" approach that the human mind uses when comparing two entities in general. This is also the method suggested by psychologists and researchers of the mind (Baussano, 1985).

Fig. 2. A "comparative" approach to the relationships of the stakeholders' network.

With this approach, the two entities can be seen as two sets (X and Y) that, in general, may or may not exist and can overlap in several ways. In order to identify all the possible configurations and relationships, it is helpful to consider three instrumental subsets: α (alpha, the intersection of the two sets X and Y), δX (delta X, the part of X not intersecting Y), and δY (delta Y, the part of Y not intersecting X). It is easy to see that there are eight possible kinds of relationships, corresponding to all the configurations of three binary values, where each digit indicates whether α, δX, and δY, respectively, are non-empty (see Fig. 2). When α is zero (the first four configurations) the situations are quite "different": X and Y do not exist at all (000), only Y exists (001), only X exists (010), or X and Y both exist but have no relationship at all (011). When α is one (the last four configurations) the relationship becomes interesting: α = X = Y (the two stakeholders merge into the same entity: 100), α is X (the set Y "dominates" X: 101), α is Y (the set X "dominates" Y: 110), or α is a proper part of both X and Y (the more "equilibrated" relationship: 111). Looking at real cases, it is easy to recognize the corresponding configuration.
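Since each of the three subsets is simply either empty or non-empty, the eight configurations can also be enumerated mechanically. The following sketch (illustrative only; the labels paraphrase the reading given above) prints each triple (α, δX, δY) with its interpretation:

# Enumerate the eight configurations of the triple (alpha, deltaX, deltaY),
# where alpha = X ∩ Y, deltaX = X \ Y, deltaY = Y \ X, and each digit
# records whether the corresponding subset is non-empty (1) or empty (0).
from itertools import product

READINGS = {
    (0, 0, 0): "neither X nor Y exists",
    (0, 0, 1): "only Y exists",
    (0, 1, 0): "only X exists",
    (0, 1, 1): "X and Y exist but have no relationship",
    (1, 0, 0): "X and Y merge into the same entity (alpha = X = Y)",
    (1, 0, 1): "Y 'dominates' X (X is entirely contained in Y)",
    (1, 1, 0): "X 'dominates' Y (Y is entirely contained in X)",
    (1, 1, 1): "partial overlap: the more 'equilibrated' relationship",
}

for alpha, dx, dy in product((0, 1), repeat=3):
    print(f"{alpha}{dx}{dy} -> {READINGS[(alpha, dx, dy)]}")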
For example, when two stakeholders have no relationship (011), even though they are both present in the stakeholders' network, this can be the case of a "remote" connection: there is no direct relationship, and reaching one node from the other requires crossing intermediary nodes (a user of an online service has no direct relationship with the designer of the software, but the computer professional is part of the company providing the service). Also interesting is the case where the two stakeholders merge into the same entity (100); a simple real case could be a merger and acquisition between two companies, or a "one person company" providing online services to users. The two symmetrical cases where one stakeholder "dominates" the other (101, 110) correspond to a very common situation of a "power" relationship: one stakeholder is "stronger" than the other and, if not regulated by some kind of "soft" or "hard" law, the relationship risks being unfair or carrying subtle ethical issues. A well-known situation of this kind is the computer professional working for a large tech company. The last case, where the relationship looks "equilibrated" (111), is indeed the most common. Often the intersection between the two stakeholders requires further investigation: is the relationship based on a "fair" contract? Can the two stakeholders really exercise their theoretical autonomy? What level of reciprocity is present in the relationship? For example, a user and an online service provider have a clear relationship, but is it reciprocal? Does the user have the possibility to check what kind of personal data and information the online company is managing? Is the online service wrapping the user in a kind of "information bubble"? Is it monitoring the user's behavior by means of artificial intelligence applications in order to push her toward purchasing activities or toward politically interested entities (CHT, 2021)? What kind of autonomy does the user have in front of one of the "titans" of the Web?

4 Case study: the artificial retina stakeholders' network

For many researchers, the main obstacle to a deep ethical reflection around their work is the lack of time. Most of them are pressured to publish continuously and are concentrated on the narrow field of science and technology they are involved with. An example is the recent area of research around the new material called graphene, a two-dimensional material with unique properties. One of its many interesting potential applications is as an "artificial retina", a futuristic eye implant (Choi et al., 2017). Many young researchers are excited by this potential future application for people with visual impairments and start a deep dive into the physical properties of the material, losing sight of the application field, of the users, and of the policy makers; in general, they devote very little time to the relationship with society. The stakeholders' network gives them the opportunity to enlarge their view to the very large landscape of entities involved in the artificial retina application (Fig. 3).

Fig. 3. Artificial retina - stakeholders' network (Patrignani, 2020).

This simple exercise enables them to dedicate time to applying the comparative approach proposed in this short paper to the collection of relationships identified.
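As a purely illustrative sketch (the stakeholder names below are assumptions drawn from the case description and from the questions that follow, not a transcription of Fig. 3), the relationships to be examined can first be listed explicitly, ready to be qualified one by one with the comparative approach:

# Illustrative only: a partial list of relationships in the artificial
# retina case. Each pair is a candidate for the comparative questions:
# which of the eight configurations fits, and is the relationship reciprocal?
pairs_to_examine = [
    ("graphene technology researchers", "users (people with visual impairments)"),
    ("graphene technology researchers", "graphene manufacturers"),
    ("graphene manufacturers", "environment"),
    ("users (people with visual impairments)", "national health systems"),
    ("policy makers", "graphene technology researchers"),
]

for a, b in pairs_to_examine:
    print(f"to examine: {a} <-> {b}")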
For example: what kind of relationship is the one between graphene technology and the users? With the graphene manufacturers? With the environment? With the national health systems? Are they "equilibrated" relationships (of the kind "111" in the comparative approach described above)?

5 Conclusions

This short paper describes a simple approach for introducing tools for analysing the ICT stakeholders' network. This analysis is useful for reflecting on the relationships among the stakeholders and for identifying ethical issues. In a "computer ethics" course, this methodology, together with the Slow Tech questions (is it good, clean, and fair?), could be helpful for improving the ethical competences and skills of future computer professionals.

References

Baussano, G. (1985). Psicanalisi in fabbrica. Bollati Boringhieri.
Bødker, S., Ehn, P., Kammersgaard, J., Kyng, M., & Sundblad, Y. (1987). A utopian experience. In G. Bjerknes, P. Ehn & M. Kyng (eds.), Computers and democracy: a Scandinavian challenge (pp. 251-278). Gower Publishing.
CHT (2021). Center for Humane Technology. https://www.humanetech.com
Choi, C., Choi, M.K., Liu, S. et al. (2017). Human eye-inspired soft optoelectronic device using high-density MoS2-graphene curved image sensor array. Nature Communications, 8, 1664.
Ellul, J. (1954). La technique ou l'enjeu du siècle. Armand Colin.
Friedman, B. (1996). Value Sensitive Design. Interactions, 3(6). ACM Digital Library.
Johnson, D. (1985). Computer Ethics. Pearson.
Kavathatzopoulos, I. (2012). Assessing and acquiring ethical leadership competence. In G.P. Prastacos, F. Wang & K.E. Soderquist (eds.), Leadership through the classics: leadership and management in a changing world, lessons from ancient eastern and western. Springer.
Moor, J.H. (1985). What is computer ethics? Metaphilosophy, 16(4), 266-275.
Mumford, L. (1934). Technics and civilization. Harcourt.
Nissenbaum, H. (1998). Values in the design of computer systems. Computers in Society, 28(1), 38-39.
Nygaard, K. (1996). Those were the days. Scandinavian Journal of Information Systems, 8(2), 91-108.
Patrignani, N. (2020). Teaching Computer Ethics. Steps towards Slow Tech, a Good, Clean, and Fair ICT. Uppsala: Acta Universitatis Upsaliensis.
Verbeek, P.P. (2011). Moralizing technology: understanding and designing the morality of things. University of Chicago Press.
Verbeek, P.P. (2017). Designing the morality of things: the ethics of behaviour guiding technology. In J. van den Hoven, S. Miller & T. Pogge (eds.), Designing in ethics (pp. 78-94). Cambridge University Press.
Weizenbaum, J. (1976). Computer power and human reason: from judgment to calculation. Freeman.
Wiener, N. (1950). The human use of human beings. The Riverside Press.