Tenth International Workshop Modelling and Reasoning in Context (MRC) – 13.07.2018 – Stockholm, Sweden

What should I do? Deriving norms from actions, values and context

Myrthe L. Tielman¹, Catholijn M. Jonker¹,², M. Birna van Riemsdijk¹
¹ Interactive Intelligence Group, TU Delft, The Netherlands
{m.l.tielman, c.m.jonker, m.b.vanriemsdijk}@tudelft.nl
² Leiden Institute of Advanced Computer Science, Leiden University
c.m.jonker@liacs.leidenuniv.nl

Abstract

Behavior support technology is increasingly used to assist people in daily life activities. To do this properly, it is important that the technology understands what really motivates people: what values underlie their actions, but also the influence of context, and how this can be translated into norms which govern behavior. In this paper, we expand a framework describing action hierarchies and values to include the role of context. Moreover, we present a method to derive specific norms for behavior from this information on actions, values and context. Behavior support technology can use this framework to reason about people's ideal behavior, and so better offer personalized assistance.

1 Introduction

Behavior support technology is used increasingly in daily life. We have technology which reminds us of meetings or to take our medicine [Milić et al., 2018; Zhou et al., 2012], to help us eat healthier [Schoffman et al., 2013], and even with diverse medical problems such as dementia [Carrillo et al., 2009], depression [Karyotaki et al., 2017] or kidney transplants [Wang et al., 2017]. As the role that technology plays in our daily lives increases, it becomes more important that technology supports us in a flexible way [van Riemsdijk et al., 2015]. Ideally, you would want your daily behavior support app to understand that it only needs to remind you to take your umbrella when it is going to rain, to remind you of job interviews earlier than of meetings with friends, and to call your doctor only if the medicine you forgot was absolutely crucial. You want it to take into account the context you are in, and to understand how this would affect your ideal choices.

Although all these flexibilities could be incorporated into technology separately, a more sustainable approach seems to be to ensure that the technology itself understands our motivations as humans. To realize this, many systems introduce values [van de Poel, 2015; Cranefield et al., 2017], concepts which refer to what a person or group of people consider important in life [Friedman et al., 2006]. Values are used to identify the underlying reasons for our actions, and are particularly suitable for this purpose due to their generalizability and stability over time [Schwartz, 1992].

However, current systems do not always explicitly take into account the role of context. This can be problematic, as a single action might support different values in different ways, depending on the situation. For instance, biking can promote the values health and sustainability. However, when it is snowing, this affects how much health is promoted, as biking through snow is decidedly less healthy. It has no effect on the sustainability of the action, though.

This example shows that context cannot be ignored when reasoning about how actions promote values. We therefore propose a framework which does not just include a representation of how values relate to our actions, but which also models the role of context. Our focus in this paper lies on reasoning about context, not necessarily on modeling context itself, as done, for example, in [Kola et al., 2018].

Aside from understanding our values in context, technology is also required to reason about what this means concretely. We want it to understand what choices are best, in other words: what norms we wish to adhere to, given our values, the actions we can take, and the context we are in. Norms "regulate the interactions between an individual and the society" [Balke et al., 2013], and are often used in agent systems to reason about the behavior of artificial agents [Santos et al., 2017], but they can be used similarly to reason about what behavior to support in users [van Riemsdijk et al., 2015]. Most multi-agent systems employing norms either derive them beforehand based on goals, or look at how norms emerge in a society based on what actions agents perform [Balke et al., 2013].
These systems look at how norms govern the behavior of groups. However, such approaches are less suitable when considering personal norms. For this reason, [Criado et al., 2013] use a more human-inspired model, but they do not explicitly consider an individual's values. Other work takes a different approach, and looks instead at how values can govern behavior directly [Cranefield et al., 2017]. [Bench-Capon and Modgil, 2017] do consider how norms, actions and values work together, but they employ societal norms instead of norms based on the individual's own values. Finally, work from the angle of value-sensitive design considers how norms relate to underlying values, but often does not translate this into specific options for behavior [van de Poel, 2013]. In our framework, we propose to bring actions, values, context and norms together, as shown in Figure 1.

In section 2 we describe the action framework, as well as how values relate to these actions. Section 3 discusses the role of context. Finally, in section 4 we describe how specific norms can automatically be derived from this framework. These norms allow a behavior support agent to reason about what the ideal behavior of the user would be.

Figure 1: Schematic representation of our framework, including actions, values and context, and norms derived from this information.

2 Actions and Values

2.1 Action hierarchies

In order to support people in their daily behavior, it is important to understand how they themselves conceptualize their actions and the relations between them. To this end, Pasotti et al. developed a knowledge representation capable of describing Action Identification Hierarchies (AIH) [Pasotti et al., 2017]. The core concepts of the action framework described in our paper originate in this work.

At its core, an AIH describes relationships between actions. For this paper, we only consider part-of relations. A part-of relation from action A to action B describes that doing action B is a part of doing action A. So one can do B while doing A, but doing A entails more than just doing B.

The original AIHs can include a multitude of relations describing full behavior trees. For the framework presented in this paper, however, we need only consider two layers at a time. So our AIHs will only consist of one top action, which has a part-of relationship with at least two child actions. Figure 2 shows an example of such a tree.

Figure 2: Example of a tree with part-of relationships, describing making pizza. The black diamond links indicate necessity.
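To make the shape of such a two-layer tree concrete, the following minimal Python sketch encodes a parent action with its part-of children and a necessity flag. The encoding and the child-action names for the pizza example are our own illustrative assumptions; the paper itself does not prescribe a data format.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ChildAction:
    name: str
    necessary: bool = False  # a black diamond link in Figure 2: the part is required

@dataclass
class PartOfTree:
    """Two-layer AIH: one top action with part-of links to at least two child actions."""
    parent: str
    children: List[ChildAction] = field(default_factory=list)

# Illustrative encoding of the 'making pizza' tree of Figure 2
# (the child actions and necessity flags are assumed for the example).
make_pizza = PartOfTree(
    parent="makePizza",
    children=[
        ChildAction("prepareDough", necessary=True),
        ChildAction("addToppings"),
        ChildAction("setTable"),
    ],
)
```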
2.2 Values

The question of how to formalize the relationship between our actions and values is a complex one, which is dealt with in different ways across frameworks. [van der Weide et al., 2009] relate values to state changes, and define how a value can either be demoted or promoted by such a change. A similar approach is taken by [?], and by [Pasotti et al., 2016], who link values to postconditions of actions. [Sartor, 2010] takes a slightly different approach, adding how much a certain choice affects a value.

For this paper, our focus is on the role of context and norm derivation. Therefore, we will employ a relatively simple formalization of values. However, this could be expanded to be more complex without consequences for the rest of the framework, as long as the following criteria are met.

First, we assume there is a relationship between an action and a value which denotes how much this action demotes or promotes the value. Secondly, we assume commensurability in this relationship, so we can explicitly compare how much different actions promote a value. This second assumption is not a trivial one [van de Poel, 2015], but it is important for the computability of the impact of values for an agent.

For this framework, we propose a simple number which expresses how much an action demotes (negative numbers) or promotes (positive numbers) a value. If no explicit relationship between an action and a value is given, we assume the action does not affect the value.

A distinction which is nearly always made in the literature is between the impact an action has on a value and the importance an individual gives to a value. For the purpose of this paper we only describe the first relation, and leave individual ordering out of the picture. However, as long as commensurability of values is maintained, this impact of individual preference could easily be added to the framework.
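A minimal sketch of this numeric action-value relation is given below, assuming a simple lookup table keyed by (action, value) pairs. The umbrella/health number is the default relation used in the example of Section 3; the biking numbers are purely illustrative, and the names are our own.

```python
from typing import Dict, Tuple

# How much an action promotes (positive) or demotes (negative) a value;
# pairs that are not listed default to 0, i.e. the action does not affect the value.
ActionValueMap = Dict[Tuple[str, str], float]

action_value: ActionValueMap = {
    ("biking", "health"): +2.0,          # illustrative numbers, not taken from the paper
    ("biking", "sustainability"): +1.0,
    ("takingUmbrella", "health"): -1.0,  # the default relation used in Section 3's example
}

def promotes(action: str, value: str, av: ActionValueMap) -> float:
    """Return the default (context-free) promotion of `value` by `action`."""
    return av.get((action, value), 0.0)

assert promotes("biking", "comfort", action_value) == 0.0  # unspecified pair: no effect
```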
3 The role of Context

Consider the following scenario:

John has a behavior support agent to help him live healthier, as he has problems with his back. One way it does this is by encouraging John not to carry around unnecessary things such as an umbrella. However, one day John gets caught in the rain, and catches a cold because he did not have his umbrella with him.

This example illustrates the role context can play. Although in general not taking an umbrella is good for the value health, this does not hold when it is raining. Ideally, John's behavior support agent understands this, and will advise him to take the umbrella only if it is going to rain.

In this example, the context is the type of weather. However, context can be any situational circumstance which is not captured in the definition of the actions themselves. Other examples of circumstances which can affect the value-action relationship are time (of day or year), social situation or location.

In our framework, we define a contextual factor as a tuple ⟨s, a, v, w⟩, where s is a situational property which, when present, affects the numerical relationship between an action a and a value v with weight w. The weight can be either positive or negative. So we have a 'default' relationship between an action and a value, and the contextual factor modifies this relationship. For instance, given that the action taking an umbrella promotes the value health by -1, the context of rain might influence this by weight +3. The contextual factor would be defined as ⟨rain, takingUmbrella, health, +3⟩, and this would mean that when it is raining, taking the umbrella actually promotes health by 2. This means the assumption of commensurability of value-action relationships is extended to the influence of context.

Information about context does not need to be present in our framework. If no contextual factors are defined for an action-value relationship, we simply assume this relation is always the same.
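The following self-contained sketch shows one possible encoding of such contextual factors and of the context-adjusted action-value relation; it reproduces the umbrella example above (-1 + 3 = 2). The field and function names are our own assumptions; the paper only specifies the tuple ⟨s, a, v, w⟩.

```python
from dataclasses import dataclass
from typing import Dict, Iterable, Set, Tuple

@dataclass(frozen=True)
class ContextualFactor:
    situation: str  # s, e.g. "rain"
    action: str     # a
    value: str      # v
    weight: float   # w, added to the default action-value number when s holds

# Default action-value numbers as in Section 2.2; the umbrella/health figure is the paper's example.
default: Dict[Tuple[str, str], float] = {("takingUmbrella", "health"): -1.0}

factors = [ContextualFactor("rain", "takingUmbrella", "health", +3.0)]

def effective_promotion(action: str, value: str, situations: Set[str],
                        factors: Iterable[ContextualFactor]) -> float:
    """Default number plus the weights of all contextual factors whose situation currently holds."""
    total = default.get((action, value), 0.0)
    for f in factors:
        if f.action == action and f.value == value and f.situation in situations:
            total += f.weight
    return total

# When it is raining, taking the umbrella promotes health by -1 + 3 = 2.
assert effective_promotion("takingUmbrella", "health", {"rain"}, factors) == 2.0
```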
4 Deriving Norms

Norms are often used instead of values because they give explicit rules for behavior. However, norms are not as general or stable as values. For this reason, some frameworks introduce values as underlying motivations for norms [Kayal et al., 2014]. In our framework, we take the opposite approach, and instead propose to define norms based on value-action-context information. This not only gives us a clear insight into the relation between norms and values, but also the additional flexibility to automatically change norms if context, actions or values change.

Formally, we define a norm as a tuple ⟨deoc, e, ac, a, C⟩, where deoc is the deontic modality: obligation, prohibition or permission. e is the entity, i.e. the person whose actions the tree describes. ac is the parent action, which can be seen as the action context. This describes during which action the norm is relevant. a is the child action, so the behavior the norm describes. Finally, C is the set of situational properties c for which the norm holds. C can be empty, if no specific context is specified. This representation is inspired by existing normative frameworks, e.g. [Balke et al., 2013; Singh, 1999].

In our normative system, we do not define norms for 'negative' situations. For instance, we might define that one needs to take an umbrella when it is raining, but not that one should not take an umbrella when it is not raining. This is done to avoid having to check for the absence of a situational property. Instead, we introduce the rule that a more specific norm will always overrule a more general norm about the same ac and a, if all situational properties C in the more specific norm hold. A norm A is defined as more specific than a norm B if C in norm A includes more situational properties c than C in norm B, i.e. if C_B ⊂ C_A. If we wish to express that one should take the umbrella only when it is raining, one would have one norm expressing do not take the umbrella, and one expressing take the umbrella when it is raining. When it is raining, the second norm overrides the first. This rule follows the concept of lex specialis, specifying that the more specific norm has priority [Balke et al., 2013].
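A minimal sketch of this norm representation and of the lex specialis override rule is given below. The Norm fields mirror the tuple ⟨deoc, e, ac, a, C⟩; the function name and the example norms about the umbrella are our own illustrative assumptions.

```python
from dataclasses import dataclass
from typing import FrozenSet, Iterable, Optional

@dataclass(frozen=True)
class Norm:
    deoc: str                   # "obligation" | "prohibition" | "permission"
    entity: str                 # e: the person whose actions the tree describes
    action_context: str         # ac: the parent action
    action: str                 # a: the child action the norm is about
    situations: FrozenSet[str]  # C: situational properties; empty set = default norm

def applicable(norms: Iterable[Norm], ac: str, a: str,
               current: FrozenSet[str]) -> Optional[Norm]:
    """Return the most specific norm for (ac, a) whose situations all hold (lex specialis)."""
    candidates = [n for n in norms
                  if n.action_context == ac and n.action == a and n.situations <= current]
    return max(candidates, key=lambda n: len(n.situations), default=None)

norms = [
    Norm("prohibition", "john", "leaveHouse", "takeUmbrella", frozenset()),
    Norm("obligation",  "john", "leaveHouse", "takeUmbrella", frozenset({"rain"})),
]
# On a rainy day the more specific norm overrides the general one: take the umbrella.
assert applicable(norms, "leaveHouse", "takeUmbrella", frozenset({"rain"})).deoc == "obligation"
```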
Figure 3 is a graphical representation of a part-of structure, describing the actions and values for a user, and the situational properties which are of influence. From Figure 3, we can derive the following norms:

1. When preparing for driving, find car keys.
2. When preparing for driving, you may take sunglasses.
3. When preparing for driving in the bright sun, take sunglasses.
4. When preparing for driving and running late, do not take sunglasses.
5. When preparing for driving in the bright sun and running late, take sunglasses.

Figure 3: Part-of tree for 'prepare for driving', with values 'punctuality' and 'safety', and situations 'running late' and 'bright sun'. The black part-of relation indicates a necessity.

For deriving all norms from such a tree, the following procedure is followed, where entity e always refers to the person whose actions the tree describes.

First, obligations can be derived for all necessary part-of child actions a of parent action ac. These take the form ⟨obligation, e, ac, a, C⟩ where C = ∅. No other norms are derived for these necessary actions. This means that whether these necessary actions support the user's values given the situation does not affect the norm, as these actions always need to be included to perform the parent action.

Applying this to Figure 3, we get the formal notation for norm 1:

• ⟨obl, e, prepForDriving, findCarKeys, ∅⟩¹

¹ We abbreviate obligation to obl, prohibition to pro, permission to per, prepare for driving to prepForDriving, Running late to late, Bright sun to sun, find car keys to findCarKeys and Take sunglasses to sunglasses in the formal norms.

For all other part-of child actions a of parent action ac, values and context are relevant. We use N_a^i to denote the set of norms derived for action a with sets of situational properties C such that |C| ≤ i. Below we inductively define N_a^i.

1. Derivation of N_a^0. Let s_a = Σ{v | v ∈ v(a)}, where v(a) is the set of value numericals of a. The norm for e, a with C = ∅ is ⟨deoc(s_a), e, ac, a, ∅⟩, where deoc depends on the sign of s_a as follows: for any number n, deoc(n) is defined to be obligation if n > 0, permission if n = 0, and prohibition if n < 0.

Following this step for Figure 3, we can derive norm 2, formally:

• ⟨per, e, prepForDriving, sunglasses, ∅⟩

2. Derivation of N_a^1. Next, for each (single) situational property f in the total set F of situational properties present in the tree, we conditionally decide to add a norm. Let f ∈ F. We define s_a^f = Σ{v + n_a^f | v ∈ v(a)}, where n_a^f denotes the numerical value associated with situational property f with respect to action a. If sign(s_a^f) ≠ sign(s_a), then the following norm is added: ⟨deoc(s_a^f), e, ac, a, {f}⟩.

Following this step for Figure 3, we can derive norms 3 and 4, formally:

• ⟨obl, e, prepForDriving, sunglasses, {sun}⟩
• ⟨pro, e, prepForDriving, sunglasses, {late}⟩

3. Derivation of N_a^{i+1}. Following this, norms are added depending on whether or not progressively complex combinations of situational properties change the deoc. By progressively complex combinations of situational properties we mean that we consider P_i(F) = {t ∈ P(F) | |t| = i} for increasing cardinality i, until i = |F|. Similarly, P_≤i(F) = {t ∈ P(F) | |t| ≤ i}. A norm is added at step i+1 only if this larger set of situational properties changes the sign compared to the signs of previously added norms at step i which partly include the same situations.

More formally, the base case is P_1(F), as described in point 2. Iteratively, when we have calculated N_a^i for P_i(F), we derive N_a^{i+1} by considering the following for any element t ∈ P_{i+1}(F). We define s_a^t = Σ{v + n_a^t | v ∈ v(a)}, where n_a^t is the sum of the context numericals associated with the situational properties in set t with respect to action a. We add a norm ⟨deoc(s_a^t), e, ac, a, t⟩ for action a with t ∈ P_{i+1}(F), if there is a t′ ∈ P_≤i(F) such that:

• t′ ⊂ t;
• there exists a norm n′ ∈ N_a^i of the form ⟨deoc, e, ac, a, t′⟩ such that there is no norm n″ ∈ N_a^i of the form ⟨deoc, e, ac, a, t″⟩ where |t″| > |t′|;
• sign(s_a^t) ≠ sign(s_a^{t′}).

Applied to Figure 3, we can formally derive the final norm, norm 5:

• ⟨obl, e, prepForDriving, sunglasses, {sun, late}⟩
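To illustrate how the three derivation steps interact, here is a self-contained Python sketch of the procedure, re-using the Norm encoding from the previous sketch (repeated so the block runs on its own). The paper does not give concrete value or context numericals for Figure 3, so the numbers below, like all identifier names, are our own assumptions, chosen so that the derived norms coincide with norms 1-5 above; the deoc(n) mapping follows step 1.

```python
from dataclasses import dataclass
from itertools import combinations
from typing import Dict, FrozenSet, List, Tuple

@dataclass(frozen=True)
class Norm:
    deoc: str                   # "obl" | "per" | "pro"
    entity: str                 # e
    action_context: str         # ac (parent action)
    action: str                 # a (child action)
    situations: FrozenSet[str]  # C

def sign(x: float) -> int:
    return (x > 0) - (x < 0)

def deoc(n: float) -> str:
    """Deontic modality from the sign of a score: >0 obligation, =0 permission, <0 prohibition."""
    return "obl" if n > 0 else ("pro" if n < 0 else "per")

def derive_norms(entity: str, parent: str,
                 children: Dict[str, bool],              # child action -> necessary?
                 values: Dict[str, List[float]],         # child action -> value numericals v(a)
                 context: Dict[Tuple[str, str], float],  # (child action, property) -> n_a^f
                 properties: List[str]) -> List[Norm]:
    norms: List[Norm] = []
    for a, necessary in children.items():
        if necessary:
            # Necessary parts are always obliged; no other norms are derived for them.
            norms.append(Norm("obl", entity, parent, a, frozenset()))
            continue
        v_a = values.get(a, [])
        score: Dict[FrozenSet[str], float] = {frozenset(): sum(v_a)}  # s_a
        norms.append(Norm(deoc(score[frozenset()]), entity, parent, a, frozenset()))
        for i in range(1, len(properties) + 1):          # consider P_i(F) for increasing i
            for t in map(frozenset, combinations(properties, i)):
                n_t = sum(context.get((a, f), 0.0) for f in t)
                score[t] = sum(v + n_t for v in v_a)     # s_a^t
                # Most specific strict subsets t' of t that already carry a norm for a.
                subsets = [n.situations for n in norms
                           if n.action == a and n.situations < t]
                if not subsets:
                    continue
                best = max(len(s) for s in subsets)
                if any(sign(score[t]) != sign(score[s])
                       for s in subsets if len(s) == best):
                    norms.append(Norm(deoc(score[t]), entity, parent, a, t))
    return norms

# Figure 3 with assumed numbers: sunglasses demote punctuality (-1) and promote safety (+1);
# bright sun adds +3 and running late adds -2 to the sunglasses scores.
derived = derive_norms(
    entity="e", parent="prepForDriving",
    children={"findCarKeys": True, "sunglasses": False},
    values={"sunglasses": [-1.0, +1.0]},
    context={("sunglasses", "sun"): +3.0, ("sunglasses", "late"): -2.0},
    properties=["sun", "late"],
)
for n in derived:
    print(n.deoc, n.action_context, n.action, set(n.situations) or "{}")
# obl prepForDriving findCarKeys {}              (norm 1)
# per prepForDriving sunglasses {}               (norm 2)
# obl prepForDriving sunglasses {'sun'}          (norm 3)
# pro prepForDriving sunglasses {'late'}         (norm 4)
# obl prepForDriving sunglasses {'sun', 'late'}  (norm 5; set ordering may vary)
```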
5 Discussion

The method for deriving norms from action-value-context information presented in this paper generates norms with several specific characteristics which are interesting to note. Firstly, we only consider trees which define what actions are a part of another action. This means all norms describe whether or not to include action A while doing action B. However, [Pasotti et al., 2017] describe another type of relationship between actions, namely concretisations. These define action A as a more concrete way of doing action B. One of the directions for future research would be to define how to derive norms from this other type of tree.

Our framework currently does not include any way to derive social norms, for instance where one person has an obligation towards another [Singh, 1999]. This is due to the nature of the action description, which does not have a notion of actions performed with or for someone else. The framework can describe an action such as sending a text, but it leaves implicit who it is sent to. This lack of explicit representation of other people means they can also not be made explicit in the norms derived. This also implies that the types of norms we derive are slightly different from those used in many multi-agent systems, where norms govern the social behavior of agents. Instead, our norms can be seen as personal preferences for an individual's own behavior. Although a social aspect would be a very useful extension of our framework, it makes sense to start with individual behavior in the context of a behavior support agent. After all, this agent will need to understand the behavior and wishes of this particular individual first.

When expressing choices for actions in norms, the observation can be made that some norms somehow seem 'stronger' than others. Using the norms generated from Figure 3, the norm find the car keys when driving to work seems of a different order than do not take sunglasses when running late. They both make sense, but while ignoring the second norm might just make you a minute late, the first will cause you not to arrive at all. While our action framework partly makes this distinction with the necessary part-of link, this is not yet translated into the norm. Moreover, some non-necessary actions might still be more important than others. Some normative frameworks include the notion of sanction, which could be used to express the difference between these norms. Another option might be to rank norms based on priority, for instance based on how much an action promotes or demotes values. Our framework, however, as yet does not include a way to express the effect of choosing to include an action.

Another difference between the norms we derive and those in some other frameworks is that we include two actions instead of one. This is an indirect result of expressing actions in hierarchies. In a way, what we denote as our parent action is also a form of context, giving our norms two explicit contexts. The first is the action context, expressing what the user is doing at the time at a higher level. The other type is the situational context as described in section 3.

With regard to the way in which actions and values relate, we currently assume commensurability. This means we can compare how actions relate to values on a set scale. This assumption is not trivial, however, and further research might therefore shed light on whether it can be relaxed in any way. The key point of our framework is that different actions can be compared based on how well they promote or demote values. Further research might reveal a method to do this without assuming commensurability.

Finally, our framework does not currently include any explicit preference ordering of values. Because individual differences in value preferences are an important advantage of values, this might be one of the first additions to the framework. One simple method to do this would be to take the weighted sum of values given their ordering. Inspiration could also be taken from [Cranefield et al., 2017], who include a threshold for values. If a value has already reached this threshold, it temporarily becomes less important. In whatever way this is done, as long as one can 'calculate' a score for an action given the values it promotes and the context, the method for deriving norms will still work.

6 Conclusion

In this paper, we present a framework which represents hierarchical trees of actions, including how these promote and demote values, and the influence of context. Moreover, we present a method for automatically deriving norms from this information, capable of generating obligations, permissions and prohibitions for behavior. These norms could serve as a starting point for behavior support technology, which could use them to better take into account both the user's values and the context they are in while offering support.

Acknowledgement

This work is part of the research programme CoreSAEP, with project number 639.022.416, which is financed by the Netherlands Organisation for Scientific Research (NWO).

References

[Balke et al., 2013] Tina Balke, Celia da Costa Pereira, Frank Dignum, Emiliano Lorini, Antonino Rotolo, Wamberto Vasconcelos, and Serena Villata. Normative Multi-Agent Systems. Schloss Dagstuhl, 2013.

[Bench-Capon and Modgil, 2017] Trevor Bench-Capon and Sanjay Modgil. Norms and value based reasoning: justifying compliance and violation. Artificial Intelligence and Law, 25:29–64, 2017.

[Carrillo et al., 2009] Maria C. Carrillo, Eric Dishman, and Tim Plowman. Everyday technologies for Alzheimer's disease care: Research findings, directions, and challenges. Alzheimer's & Dementia, 5(6):479–488, 2009.

[Cranefield et al., 2017] S. Cranefield, M. Winikoff, V. Dignum, and F. Dignum. No pizza for you: Value-based plan selection in BDI agents. In International Joint Conference on Artificial Intelligence, 2017.

[Criado et al., 2013] N. Criado, E. Argente, P. Noriega, and V. Botti. Human-inspired model for norm compliance decision making. Information Sciences, 245:218–239, 2013.

[Friedman et al., 2006] Batya Friedman, Peter H. Kahn Jr., and Alan Borning. Value sensitive design and information systems. In Human-Computer Interaction and Management Information Systems: Foundations (Advances in Management Information Systems, Volume 5), pages 348–372. M.E. Sharpe, 2006.
[Karyotaki et al., 2017] E. Karyotaki, H. Riper, J. Twisk, Adriaan Hoogendoorn, Annet Kleiboer, Adriana Mira, Andrew Mackinnon, Bjorn Meyer, Cristina Botella, Elizabeth Littlewood, Gerhard Andersson, Helen Christensen, Jan P. Klein, Johanna Schroder, Juana Breton-Lopez, Justine Scheider, Kathy Griffiths, Louise Farrer, Marcus J. H. Huibers, Rachel Phillips, Simon Gilbody, Steffen Moritz, Thomas Berger, Victor Pop, Viola Spek, and Pim Cuijpers. Efficacy of self-guided internet-based cognitive behavioral therapy in the treatment of depressive symptoms: A meta-analysis of individual participant data. JAMA Psychiatry, 74(4):351–359, 2017.

[Kayal et al., 2014] Alex Kayal, Willem-Paul Brinkman, Rianne Gouman, Mark A. Neerincx, and M. Birna van Riemsdijk. A value-centric model to ground norms and requirements for epartners of children. In Coordination, Organizations, Institutions, and Norms in Agent Systems, 2014.

[Kola et al., 2018] Ilir Kola, Catholijn M. Jonker, and M. Birna van Riemsdijk. Modelling the social environment: Towards socially adaptive electronic partners. In MRC – Tenth International Workshop Modelling and Reasoning in Context, held at FAIM, 2018. Under revision at the AAMAS/IJCAI Workshop on Modeling and Reasoning in Context.

[Milić et al., 2018] Eleonora Milić, Dragan Janković, and Aleksandar Milenković. Health care domain mobile reminder for taking prescribed medications. In Georgi Stojanov and Andrea Kulakov, editors, ICT Innovations 2016, pages 173–181, Cham, 2018. Springer International Publishing.

[Pasotti et al., 2016] Pietro Pasotti, M. Birna van Riemsdijk, and Catholijn M. Jonker. Representing human habits: towards a habit support agent. In European Conference on Artificial Intelligence, 2016.

[Pasotti et al., 2017] Pietro Pasotti, Catholijn M. Jonker, and M. Birna van Riemsdijk. Action identification hierarchies for behaviour support agents. In Workshop on Cognitive Knowledge Acquisition and Applications, 2017.

[Santos et al., 2017] J.S. Santos, J.O. Zahn, E.A. Silvestre, V.T. Silva, and W.W. Vasconcelos. Detection and resolution of normative conflicts in multi-agent systems: a literature survey. Journal of Autonomous Agents and Multi-Agent Systems, 31:1236–1282, 2017.

[Sartor, 2010] G. Sartor. Doing justice to rights and values: teleological reasoning and proportionality. Artificial Intelligence and Law, 18:175–215, 2010.

[Schoffman et al., 2013] Danielle E. Schoffman, Gabrielle Turner-McGrievy, Sonya J. Jones, and Sara Wilcox. Mobile apps for pediatric obesity prevention and treatment, healthy eating, and physical activity promotion: just fun and games? Translational Behavioral Medicine, 3(3):320–325, 2013.

[Schwartz, 1992] S.H. Schwartz. Universals in the content and structure of values: theoretical advances and empirical tests in 20 countries. Advances in Experimental Social Psychology, 25:1–65, 1992.

[Singh, 1999] Munindar P. Singh. An ontology for commitments in multiagent systems: Toward a unification of normative concepts. Artificial Intelligence and Law, 7:97–113, 1999.
[van de Poel, 2013] Ibo van de Poel. Translating values into design requirements. In Philosophy and Engineering: Reflections on Practice, Principles and Process. Springer, 2013.

[van de Poel, 2015] Ibo van de Poel. Conflicting values in design for values. In Handbook of Ethics, Values and Technological Design, pages 89–115. Springer, 2015.

[van der Weide et al., 2009] T.L. van der Weide, F. Dignum, J.-J. Ch. Meyer, H. Prakken, and G.A.W. Vreeswijk. Practical reasoning using values: Giving meaning to values. In Proceedings of the 6th International Conference on Argumentation in Multi-Agent Systems, 2009.

[van Riemsdijk et al., 2015] M. Birna van Riemsdijk, Catholijn M. Jonker, and Victor Lesser. Creating socially adaptive electronic partners. In International Conference on Autonomous Agents and Multiagent Systems, 2015.

[Wang et al., 2017] Wenxin Wang, Céline L. van Lint, Willem-Paul Brinkman, Ton J. M. Rövekamp, Sandra van Dijk, Paul J. M. van der Boog, and Mark A. Neerincx. Renal transplant patient acceptance of a self-management support system. BMC Medical Informatics and Decision Making, 17(1):58, May 2017.

[Zhou et al., 2012] Shandan Zhou, Chao-Hisen Chu, Zhiwen Yu, and Jungyoon Kim. A context-aware reminder system for elders based on fuzzy linguistic approach. Expert Systems with Applications, 39(10):9411–9419, 2012.