Principal and Helper: Notes on Reflex Responsibility in MAS

Clara Smith¹

Abstract. What justifies, in the head of an agent different from the one acting, the obligation to compensate is the fact that the principal agent has extended its own action through the implementation of a foreign activity for its own interests. We present two basic modal operators for representing, respectively, intentions in the interest of another agent and agency in the interest of another agent. They appear useful enough for characterizing the notion of reflex responsibility in a multi-modal multi-agent system (MAS) context.

1 MOTIVATION AND AIMS

As pointed out by Chopra and White [1], theorizing in domains such as the legal and cognitive status of agents is crucial for designers of agents, especially for the design of "on demand" MAS. Within such an engineering account, a legal question arises: are the designed agents to be autonomous enough to have rights and responsibilities? (By being autonomous we at least mean that agents act to achieve their own goals, cf. Conte and Castelfranchi [2].)

Most works on the topic are centered on "contractual issues" (see e.g. [3,4]). Chopra and White point out four approaches as one moves up the sophistication scale of agents: three "weak" positions, based on (i) the idea of agents as mere tools of their operators, (ii) the unilateral offer doctrine (a contract formed by a party's offer plus an acceptance, stipulated in the offer), and (iii) the objective theory of contractual intention (a contract, usually words, is an obligation which is law to the parties, who have the intention to agree), plus a fourth, radical one, which involves treating artificial agents as the legal agents of their operators. There is also a fifth position, which postulates legal systems treating agents as legal persons.

In this work we focus on the legal binding between a principal agent and a dependent agent. In particular, we are interested in the dependent's performance that has its origin in extra-contractual situations, e.g. factual and/or occasional situations, trust, or courtesy. Examples of such bindings occur between the owner of a car (or any other device) and the one who drives it with the owner's authorization (and without a proper title for using it), in blog activities such as tweeting in the name of another, or in bidding in an auction in the interest of another: all performed in the interest of a principal agent. In these situations it is enough that the principal wills to be bound to third parties through the helper's or dependent's performance.

We therefore keep apart from our analysis situations in which performance in the interest of another has a contractual basis. This is because, in contracts, function for another one and subordination may be rather straightforward to identify, mainly because there is a notion of obligation involved. If an agent gives explicit orders or instructions to another agent which acts as his helper, or if an agent is obliged through a contract in the interest of another agent (even when agents voluntarily engage in contracts because of their own utility), or if an agent h forms part of agent p's business organization, subordination is somehow established. We therefore exclude here cases such as mandates and any conferral of a power of representation accompanied by an obligation of representation in certain ways (e.g. a cheque is a mandate from the customer to its bank to pay the sum in question).

We give a definition for the concept of reflex responsibility between a principal agent and a helper agent, mainly inspired by general provisions settled in Italian and Argentinean law for persons. We indeed use the terms "does", "performance", and "action" as referring to persons, although it is not entirely clear to us whether it is meaningful to speak of the actions of devices and artificial agents within highly automated systems; possibly a term such as "executes" sounds more suitable. In what follows, "does" has the usual expected anthropomorphic meaning. The definitions we give may be useful as a step towards a specific notion of reflex responsibility of artificial agents.

Article 1113 of the Argentinean Civil Code states that "The obligation of the one who caused damage is extended to the damages caused by those under his dependence." In its turn, art. 1228 of the Italian Civil Code settles that "except a different will of the parties, the debtor who profits from the work of a third party for fulfilling the obligation is responsible for malicious or negligent facts carried out by that third party."

According to general doctrine and jurisprudence related to such articles, reflex responsibility has a subjective basis. This is one reason that makes reflex responsibility challenging to represent: if it had an objective basis, checking the standard legal extremes would be sufficient. The requisites for reflex responsibility are: i) the existence of a dependence relationship between principal and helper (or dependent); ii) the successful performance of an illegal action carried out by the helper; iii) that such performance was carried out while exercising a subordinate incumbency; iv) that such performance provoked a damage or injury to a third party; and v) that there be an efficient causal relation between the helper's act and the damage caused.
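The five requisites above amount to a conjunction test. The following sketch (not part of the paper's formalism; the record fields and names are invented for illustration) encodes them as a boolean check over a toy case:

```python
from dataclasses import dataclass

# Toy record of the facts relevant to reflex responsibility.
# Field names are illustrative, not part of the paper's formal language.
@dataclass
class Case:
    dependence: bool           # i)   dependence relation between principal and helper
    illegal_act_done: bool     # ii)  the helper successfully performed an illegal action
    within_incumbency: bool    # iii) performed while exercising the subordinate incumbency
    third_party_damage: bool   # iv)  the performance injured a third party
    efficient_causation: bool  # v)   efficient causal relation between act and damage

def reflex_responsibility(c: Case) -> bool:
    """All five requisites must hold jointly."""
    return (c.dependence and c.illegal_act_done and c.within_incumbency
            and c.third_party_damage and c.efficient_causation)

# A harmful act performed outside the subordinate incumbency (cf. the
# truck-driver example later in the paper) fails requisite iii):
print(reflex_responsibility(Case(True, True, False, True, True)))  # False
```

The subjective basis discussed above lives inside requisites i) and iii), which the paper unfolds as modal formulas rather than atomic booleans; the sketch only fixes the top-level conjunction.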
¹ Universidad Nacional de La Plata, and FACEI, Universidad Católica de La Plata, Argentina. csmith@info.unlp.edu.ar.

Regarding the formal framework, we use as a basis a BDI multi-agent context for dealing with agents' attitudes, extended with generic obligations, as in [5]. A = {x, y, z, ...} is a finite set of agents, and P = {p, q, r, ...} is a countable set of propositions. Complex expressions are formed syntactically from these, plus the following unary modalities, in the usual way. Goalx A is used to mean that "agent x has goal A", where A is a proposition; propositions reflect particular states of affairs, cf. B. Dunin-Kęplicz and R. Verbrugge [6]. Intx A stands for "agent x has the intention to make A true". The doxastic (or epistemic) modality Belx A represents that "agent x has the belief that A". The deontic operator O represents generic (legal/lawful) obligations, meaning "it is obligatory that" [7]. The operator Doesx A represents successful agency in the sense given by D. Elgesem, i.e. agent x indeed brings about A [8]. For simplicity, we assume that in expressions like Doesx A, A denotes behavioral actions concerning only single conducts of agents, such as withdrawal, inform, purchase, payment, etc. (i.e. no modalized formulas occur in the scope of a Does). As classically established, Goal is a Kn operator, while Int and Bel are, respectively, KDn and KD45n. O is taken to be a classical KD operator. These are all normal modalities. The logic of Does, instead, is non-normal [8,9].

The rest of the paper is organized as follows. Section 2 addresses one possible characterization of a certain notion of dependence, which happens to be complex enough and central to the lawful concept of reflex responsibility we deal with. We attempt four subsequent definitions, each of which improves on the previous one, and go through them using several examples. A relativized modality is introduced for dealing with oriented, coordinated intentions: an agent intends to make a state of affairs A true in the interest of another agent. In Section 3 we introduce another modality, a directed agency operator that binds a helper agent h to the principal agent p and to the "oriented" action or situation A that h carries out in the interest of p. In Section 4 we formally define reflex responsibility of p regarding h with respect to an action or state of affairs A when: there is dependence between p and h w.r.t. A; h succeeds in carrying out A on account of p; such action constitutes an illegal act; and there is a damage suffered by a third agent t which is attributable to h's performance of A on account of p. Section 5 presents the underlying logical structure and the corresponding semantics. Conclusions end the paper.

2 DEPENDENCE

A requisite for reflex responsibility to hold is the dependence between the author of the harmful act and the agent to whom the responsibility is attributed by reflex, i.e. the principal. Such relation has two constitutive elements: 1) there is a function the helper carries out, on the principal's utility; and 2) the helper is a subordinate of the principal w.r.t. the performance of such function, i.e. there is a subordinate incumbency.

Examples and non-examples of dependence relations. There is dependence between the owner of a car (or other device) and the one who drives it with the owner's authorization (and without a proper title for using it), in some blog activities such as tweeting in the interest of another, or in bidding in an auction in the interest of another. There is dependence between an artificial helper agent that occasionally accesses my email account profile and transfers part of its content to its (also artificial) principal agent, which performs some data mining and later shows me tuned web ads. There is no impediment for dependence when the son works under the orders of his father, or if the daughter drives the car of her mother, who is being transported in it (there is occasional dependence); neither parental relationships nor marriage is an impediment for the configuration of a dependence relationship. Here, then, dependence excludes delegation or mandate. There is no dependence between the car owner and the car shop where the car is left to be repaired, except if the owner has authorized its use; neither between a student of a public school and the State, nor between the owner of a field and the firm in charge of its fumigation (all examples according to jurisprudence in [10]).

Definition 1. Dependence is a relation that holds according to certain internal states of agents. Let p be the principal agent and h the helper agent. Let A be a single behavioral action (e.g. pay, bid, tweet, etc.). A plausible initial characterization of dependence between p and h regarding A is: A is one of p's goals, p has the intention that h indeed carries out A, and h intends to make A true believing (knowing) that A is one of p's goals:

Depph A ≡ Goalp A ∧ Intp(Doesh A) ∧ Inth A ∧ Belh(Goalp A). (1)

Discussion. h adopts p's goal (A) as its own intention (Inth A) in the exercise of, e.g., courtesy. Based on this fact, h will carry out A. Note that the last two conjuncts in (1) are meant to capture the idea of "function for another one" (h intends to make A true because he knows it is p's goal).

Nonetheless, (1) also holds when there happens to be no subordination, or there is merely a coincidence, or p would like h to do A and h does it for other reasons. For instance, (1) holds in a situation where p and h are, rather than principal and helper, rivals involved in a competitive scenario, i.e. both effectively having the same goal and aiming to fulfill it.

Example 1. The Bach Double Concerto. Consider the two violinists' example in [6], where two violinists intend to perform the two solo parts of the Bach Double Concerto. (The Concerto for 2 Violins, Strings and Continuo in D Minor is characterized by the subtle yet expressive relationship between the violins throughout the work.) Let us revisit the example: suppose Peter is a violinist who has as goal being one of the soloists. Moreover, Peter also has the intention that Helen, his past fiancée, who is also a violinist, plays as the other soloist (he would like that). But as far as Helen goes, she intends to become one of the chosen soloists without caring who the other soloist is (and whatsoever part she plays); nonetheless, she surely knows that Peter aims to play as a soloist too. We get that Goalpeter play ∧ Intpeter(Doeshelen play) ∧ Belhelen(Goalpeter play) ∧ Inthelen play holds, although there is no dependence between Peter and Helen (assume that they currently have no relationship at all!): Helen is in competition with Peter.

Let us attempt an improvement of our definition.

Definition 2. There is dependence between p and h regarding A when p has A as goal, h believes this, and such p's goal is what induces h to have the intention to carry out A:

Depph A ≡ Goalp A ∧ Belh(Goalp A) ∧ (Goalp A → Inth(Doesh A)). (2)

Discussion. The conditional here is meant to specify that p's goal is the motive for h's intention. Expression (2) may even hold in a rivalry scenario such as the Bach Double example: suppose that Helen, knowing that Peter has as goal being one of the soloists, triggers her own interest in being a soloist due to her competitive personality (and not based on any interest in Peter). Note also that in (2) it is sufficient that p has a goal; it is not necessary that p wants h to be engaged. Then (2) also holds in a scenario where p does not want to be helped by h.

Example 2. The unwanted helper. I want my netbook to be fixed, but not by Harry, who is incompetent; Harry, who does the job, satisfies my goal and qualifies as a helper.

Let us attempt a further improvement.

Definition 3. h's action is to be triggered on the basis of p's intention that h does A (with h aware of this), and not merely on the basis of p's goal:

Depph A ≡ Goalp A ∧ Intp(Doesh A) ∧ Belh(Intp(Doesh A)). (3)

Discussion. Harry would not qualify as a helper under this definition, because I do not have the intention that he repairs my netbook (he will not carry out the task on my utility; I do not want him to).

Unfortunately, (3) still holds under rivalry between p and h w.r.t. goal A. (For a scenario closer to artificial agents, assume any state of affairs in which automatic allocation of resources is in permanent dispute, and devices are not necessarily dependent on one another.)

We next attempt a new definition that excludes rivalry situations by introducing a primitive, relativized operator that coordinates two agents on an intention with regard to A. Binding p with h through an "oriented" intention is what we need to exclude competitive situations.

Definition 4. Intention in the interest of another. We define a relativized operator Intph A, meaning "h intends A to be true in the interest of p". This way, we model dependence as a coordinated relation, as follows. The principal must indeed have the intention that the helper performs the task, while the helper is aware of it. The helper will somehow be "activated" not only by the belief that the principal intends that s/he does the task, but also by his own "oriented" intention, in the interest of p, to carry out A. Formally:

Depph A ≡ Goalp A ∧ Intp(Doesh A) ∧ Belh(Intp(Doesh A)) ∧ Intph(Doesh A) (4)

which stands for "A is one of agent p's goals, and p intends that h performs A; h is aware of this, and intends to make A true in the interest of p".

Intph A allows capturing custom or courtesy behavior: h may be an altruistic agent not expecting any reward, merely intending to fulfill p's expectations, even occasionally. Observe that (4) indeed reflects the power of the intention in the interest of another, as such "directed" intention defines dependence as an oriented, coordinated, non-competitive relation.

Improvements regarding the intensional basic operators have already been addressed through e.g. the concept of deadline intentions and deadline beliefs [11,12]. For example, suppose that agent y does not believe that agent x is travelling, and says "I won't believe he is travelling until he shows the ticket to me": we write a deadline belief using the until operator as U(DoesxShowsTicket, ¬BelyTravels) [12]. Moreover, collective intention operators for mutual and common intentions have been designed based on the basic Int operator in [6]. Relativised obligations to bearers and counterparties are defined in [13].

3 ACTION ON ACCOUNT OF ANOTHER

Another requisite for the emergence of reflex responsibility is that the law is violated (a legal aspect has now emerged). The illegal act must be imputable to the helper, who is the one who materially and effectively acts; therefore he becomes materially responsible for the forbidden act. For reflex responsibility to arise it is essential that the helper agent carries out the harmful activity on account of the principal.

We have gone through the discussion on directed intentions. It must be clear at this point that we also need an oriented/directed agency operator for coordinating h, p and the proper "oriented" action h carries out in the interest of p. Let us illustrate with an example.

Example 3. The truck driver. d, the occasional driver of p's truck, takes the truck out of p's garage on Sunday afternoon with a view to having a ride with his friends. Due to his misguidance, his friends are injured on the occasion of this Sunday drive (Doesd drive → injurefriends).

Discussion. p has as goal that d drives his truck, and intends him to drive it; d believes this, and d has the intention to drive the truck in the interest of p. So we get dependence between p and h regarding A (i.e. (4) holds). Now, note that given the general obligation that states that we should not harm others (O¬injuret), p's reflex responsibility is about to arise. But d drove in his own interest. What justifies, in the head of an agent different from the one acting, an obligation to compensate is that the principal agent has extended its own action through the implementation of a foreign activity for its own interests. Here that is not the case: d drove on his own account when he provoked the accident.

We know that it is essential for reflex responsibility to hold that the performing agent carries out the task on account of another agent. We should be able, then, to distinguish those directed intentions and actions that we carry out in our own interest from those we carry out in the interest of another.

Definition 5. Agency in the interest of another. We introduce a relativized operator Doesph A to represent agency in the interest of another, meaning "h carries out A in the interest of p". This non-normal operator is meant to capture performance for another one, i.e. directed material performance in the head (and/or hands, or executable code) of h, but on account of p. This way, we establish oriented agency as a basic type of event, the same way as Does is. This relativised agency operator leads us to a more precise definition of dependence:

Depph A ≡ Goalp A ∧ Intp(Doesph A) ∧ Belh(Intp(Doesph A)) ∧ Intph A. (5)

Back to the truck example, we have that, that Sunday, Doesdd drive holds and also ¬Intp(Doespd drive) holds, making (5) false. (Note that, intuitively, Doesdd A collapses to Doesd A.)

4 REFLEX RESPONSIBILITY

We saw that another requisite for the emergence of reflex responsibility is that the helper's harmful performance provokes a damage or injury to a third party, let us say t, and that there must be an efficient causal relation between h's performance, on account of p, and the damage caused to t: Doesph A → Damaget, with t ≠ h ≠ p. We are now in a position to define reflex responsibility.

Definition 6. Reflex Responsibility. There is reflex responsibility of agent p regarding agent h w.r.t. the action or state of affairs A when there is dependence between p and h w.r.t. A, h succeeds regarding A on account of p, such performance is an illegal act, and there is a damage t suffers which is attributable to h's performance:

Reflexph A ≡ Depph A ∧ Doesph A ∧ O¬A ∧ (Doesph A → Damaget). (7)

Discussion. According to the analysis done in [14], reflex responsibility belongs to the category of (i) blameworthiness responsibility, meaning that the principal failed to comply with the demands of the system, i.e. is faulty according to the system (because Doesph A and O¬A hold); and also to the category of (ii) accountability responsibility, because the principal has a particular connection to the harm (the harm can be linked to the principal), so that he has to give an explanation (an account) of why the harm happened, and, of course, he may possibly be sued. According to [14], when (7) holds we can say that p is legally liable for the harmful event because all conditions for connecting the harm to that person are realized: note that both dependence and directed action connect p to the harm and thus lead to p's liability.

Another relevant issue is that the responsibility of the dependent must be established before declaring the principal's responsibility by reflex; only in a second moment can the reflex be settled. Consequently, we cannot conceive a case where the principal is responsible but the dependent is not. The exclusion of the dependent's responsibility excludes the principal's responsibility:

¬(Doesph A ∧ Damaget) → ¬Reflexph A. (8)

An important ingredient for delimiting the application of reflex responsibility is the consciousness (awareness) that the injured third party has w.r.t. the fact that the helper acted beyond the subordinate incumbency. In this case, we may consider that the comitent has no responsibility, even when the injuries have possibly been inflicted with devices entrusted to the helper precisely for being so. For example, if d's friends know it is p's truck (and not d's truck), p is not to be held liable. We write this limit as:

(Belt(¬Intp(Doesph A)) ∧ (Doesph A ∧ Damaget)) → ¬Reflexph A. (9)

Also, recall that if it happens that d is the injured party (i.e. suppose for a moment that d = t in (7)), general provisions regarding negligence and incompetence exclude any attempt by d to sue p.

If the harmed third party t is bound to the principal by means of a contract (e.g. it holds that it is obligatory for p in the interest of t that A: Otp A), and the dependent's harmful performance imports the non-execution of obligations assumed by the principal w.r.t. the third party (Reflexph A ∧ ¬A), then such non-execution is imputable to the principal (contract beats reflex): here we have entered the contractual arena, in which any faulty act of the subordinate is imputed to his principal, p. The solution is thus beyond the reflex responsibility approach. Formally we may write:

(Otp A ∧ (Reflexph A ∧ ¬A)) → Otp Compensate. (10)

Finally, if the harmed party is the principal, the dependence relationship becomes irrelevant and cannot be used as d's excuse or exception: d is to be sued according to general rules.

One more remark. G. Sartor et al. briefly outline in [14] the notion of vicarious liability in tort law. "Vicarious" refers to the idea of one person being liable for the harm caused by another. In that work, it is pointed out that Anglo/American law does not provide a general formula to deal with the requirement that the liability of the principal p be based on whether the servant committed the tort in the course of his duty; moreover, an "inner connection" is needed between the harmful act and the task asked by p.

Complex situations can be designed with the aid of a definition such as the one given here for reflex responsibility, when we use it as a building block. It may lead us to an interesting and high level of sophistication in the devising and outline of the lawful support of a system.

Example 4. Reflex responsibility and trust deception. Paul lends me his user name and password so that I can use the wireless connection at his university, which I am visiting. I make wrong use of some contents, a database is damaged, and I, under Paul's user name, get blacklisted. Paul trusted me; now he is responsible by reflex for my misuse. His trust in me is connected to his responsibility, and that trust is for sure now deceived, independently of whether he manages to give an adequate explanation to whom he has to respond in order to be erased from the blacklist.

5 SEMANTICS

The semantics for this logic of reflex responsibility is based on a multi-relational frame F, with the following structure [5]:

F = <A, W, {Bi}i∈A, {Gi}i∈A, {Ii}i∈A, {Di}i∈A, O>

where:

- A is the finite set of agents;
- W is a set of possible worlds;
- {Bi}i∈A is a set of accessibility relations w.r.t. beliefs, which are transitive, euclidean and serial;
- {Gi}i∈A is a set of accessibility relations w.r.t. goals, with standard Kn semantics;
- {Ii}i∈A is a set of accessibility relations w.r.t. intentions, which are serial;
- {Di}i∈A is a family of sets of accessibility relations Di w.r.t. Does, which are pointwise closed under intersection, reflexive and serial [5];
- O is the accessibility relation for the deontic modality O for obligations, which is serial (standard KD semantics).

Recall that we want to be able to represent directed intentions and directed actions; we should also be able to represent generic obligations. Therefore we introduce slight modifications extending F: the underlying structure for supporting reflex responsibility is a variant of F, call it R:

R = <A, W, {Bi}i∈A, {Gi}i∈A, {Iji}i,j∈A, {Dji}i,j∈A, O>

where:

- {Iji}i,j∈A is a set of accessibility relations w.r.t. the notion of relativized intention, meaning that there is an I relation for each combination of i's and j's (all serial); and
- {Dji}i,j∈A is a family of sets of accessibility relations Dji w.r.t. oriented actions, meaning that there is a set for each combination of i's and j's, which are pointwise closed under intersection, reflexive and serial.

Note that if we are to represent formulas such as (10) we also need to include modalities for relativised obligations (standard KDn semantics).

In its turn, a multi-relational model is a structure M = <R, V>, where R is a multi-relational frame as above, and V is a valuation function defined as follows:

1. standard Boolean conditions;
2. V(w, Beli A) = 1 iff ∀v (if w Bi v then V(v, A) = 1);
3. V(w, Goali A) = 1 iff ∀v (if w Gi v then V(v, A) = 1);
4. V(w, Intji A) = 1 iff ∀v (if w Iji v then V(v, A) = 1);
5. V(w, Doesji A) = 1 iff ∃Dji ∈ Dji such that ∀v (w Dji v iff V(v, A) = 1);
6. V(w, O A) = 1 iff ∀v (if w O v then V(v, A) = 1).

Decidability for the logic of R follows directly from [15,16]. The logic for F was reorganized as a fibring in [15]; this is a particular combination of logics which amounts to placing one logic on top of another. In the case of F, the normal logic was put on top of the non-normal one. By exploiting results regarding techniques for combining logics, it was proved in [15] that that fibred logic is complete and decidable. Therefore, we only have to extend the proofs in [15] for the new modalities in R.

In its turn, [16] gives a new presentation of existing theorems, generalizing to neighbourhood structures the well-known results regarding decidability through filtrations for Kripke structures. F is a special case of [16, Def. 5] because its semantics can be outlined within a neighbourhood approach. Therefore it is straightforward to prove decidability for its extension R.

6 FINAL REMARKS

In this work we attempt to provide one step towards the issue of 'rational automatic allocation of liability' [14] within MAS. In particular, we focus on a possible logical formalization of situations and states of affairs where a principal agent wills to be bound to a helper for achieving his goals.

Clearly, whether one decides to include in the system the automatic detection of reflex responsibility depends on the interest in lawfully distinguishing between principals' and helpers' separate responsibilities. Such a distinction has an impact on the concept of liability underlying the system and, possibly induced by this fact, on the issue of efficient distribution of available resources among agents, due to sanctions such as obligations to repair harm. Moreover, distinguishing between helpers and principals allows the system's users and other agents to e.g. recognize which agent is to be sued for wrongdoing.

In the words of M. Sergot [17], it has been suggested, over let us say the last twenty years, that interactions among multiple, independently acting artificial agents can be effectively regulated and managed by norms (or 'social laws') which, if respected, allow the agents to co-exist in a shared environment. This article attempts an answer to his question of what happens to the system behavior when 'social laws' are not respected. In our present outline, trust, altruism, and courtesy can be seen as social predispositions that may induce occasional dependence between agents, generating a bond between them and possibly establishing a reflex responsibility. The usual expected behavior is that the entrusted agent should behave according to accepted standards, acting well. When this principle is broken, there is a need for lawfully repairing the wrongdoing.

From the logical viewpoint, the structure of the systems outlined in this work is a simple combination of normal and non-normal modalities. Nonetheless, the structure is suitable for representing sophisticated relationships such as occasional dependence, bridges between trust and responsibility, and bindings between agreements (such as contracts) and dependence. The logical simplicity is also a support for their usefulness and robustness, and it keeps systems manageable and suitable for further extensions.

At least two issues are left open. First, whether it can be argued that artificial agents act in the same sense humans do; in particular, whether they can will to be bound by another agent's performance, have directed intentions, and perform actions in the interest of another. Second, provided that reflex responsibility is, in this paper, allocated by the system, what are its consequences or impact on the agents' reactions? For example, what will Paul do and how will he behave from now on, now that he has been proved responsible by reflex? Will he reconsider his beliefs? If so, with regard to everyone, or just with regard to me? This topic leads to the study of what G. Sartor et al. call the social consequences induced by allocating liabilities [14]. Finally, we are to explore more in depth the relationship between reflex responsibility and trust.

REFERENCES

[1] S. Chopra, L. White. Artificial Agents – Personhood in Law and Philosophy. ECAI, pages 635–639, 2004.
[2] R. Conte, C. Castelfranchi. Cognitive and Social Action. UCL Press Ltd., 1995.
[3] T. Allan, R. Widdison. Can computers make contracts? Harvard Journal of Law and Technology, 9, 25–52, 1996.
[4] I. Kerr. Ensuring the success of contract formation in agent-mediated electronic commerce. Electronic Commerce Research, 1(1/2), 183–202, 2001.
[5] C. Smith, A. Rotolo. Collective trust and normative agents. Logic Journal of the IGPL, 18(1), 195–213, 2010.
[6] B. Dunin-Kęplicz, R. Verbrugge. Collective intentions. Fundamenta Informaticae, 271–295, 2002.
[7] A. Jones, M. Sergot. A logical framework. In Open Agent Societies: Normative Specification in Multi-Agent Systems, 2007.
[8] D. Elgesem. The modal logic of agency. Nordic Journal of Philosophical Logic, 2, 1–46, 1997.
[9] G. Governatori, A. Rotolo. On the axiomatization of Elgesem's logic of agency and ability. Journal of Philosophical Logic, 34(4), 403–431, 2005.
[10] J.J. Llambías. Civil Code with Annotations. Buenos Aires.
[11] J. Broersen, F. Dignum, V. Dignum, J.-J. Meyer. Designing a deontic logic of deadlines. LNCS 3065, 43–56. Springer, 2004.
[12] C. Smith, A. Rotolo, G. Sartor. Representations of time within normative MAS. In Legal Knowledge and Information Systems: JURIX 2010: The Twenty-Third Annual Conference, 107–116. IOS Press, Amsterdam, The Netherlands, 2010. ISBN: 978-1-60750-681-2.
[13] H. Herrestad, C. Krogh. Deontic logic relativised to bearers and counterparties. In J. Bing, O. Torvund (eds.), 453–522, 1995.
[14] G. Sartor et al. Framework for addressing the introduction of automated technologies in socio-technical systems, in particular with regard to legal liability. E.02.13-ALIAS-D1.1. EUI, Firenze, 2011. http://dl.dropbox.com/u/10505513/Alias/E0213ALIASD11FramingtheProblemV015.pdf
[15] C. Smith, A. Ambrossio, L. Mendoza, A. Rotolo. Combinations of normal and non-normal modal logics for modeling collective trust in normative MAS. AICOL XXV IVR, forthcoming, Springer LNAI, 2012.
[16] C. Smith, L. Mendoza, A. Ambrossio. Decidability via filtration of neighbourhood models for multi-agent systems. Forthcoming in Proceedings of SNAMAS 2012 @ AISB/IACAP World Congress, UK.
[17] M. Sergot. Norms, action and agency in multi-agent systems. In Deontic Logic in Computer Science, LNCS 6181, 2010. DOI: 10.1007/978-3-642-14183-6_2.
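Closing illustration. The valuation clauses of Section 5 can be prototyped as a small model checker. The sketch below is not part of the paper; the worlds, relations and valuation are invented for illustration, and only the shape of the clauses follows the paper: universal (Kripke-style) evaluation for Bel, Goal, Int and O, and an existential clause over a family of relations for Does.

```python
# Toy multi-relational model: worlds, binary relations, atomic valuation.
W = {'w', 'v1', 'v2'}
B = {('w', 'v1')}                # belief relation of one fixed agent
D_family = [{('w', 'v1')}]       # family of Does-relations D^j_i
val = {('v1', 'p'): True}        # atomic facts; anything absent is false

def atom(v, a):
    return val.get((v, a), False)

def box(R, w, phi):
    # Clauses 2-4 and 6: phi holds in every world reachable from w via R.
    return all(phi(v) for (u, v) in R if u == w)

def does(family, w, phi):
    # Clause 5: some relation D in the family reaches from w exactly
    # the worlds where phi holds.
    return any(all(((w, v) in D) == phi(v) for v in W) for D in family)

p = lambda v: atom(v, 'p')
print(box(B, 'w', p))          # Bel p at w: True (v1 is the only B-successor)
print(does(D_family, 'w', p))  # Does p at w: True
```

Note how the Does clause is stricter than the universal one: the chosen relation must reach all and only the p-worlds, reflecting the non-normal character of the agency operator.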