CEUR Workshop Proceedings Vol-2261, paper 1. https://ceur-ws.org/Vol-2261/paper1.pdf
              Ethical Decision Making under the Weak Completion Semantics

                                            Steffen Hölldobler
          International Center for Computational Logic, Technische Universität Dresden, Germany
                    and North-Caucasus Federal University, Stavropol, Russian Federation
                                          sh@iccl.tu-dresden.de


                          Abstract

    The weak completion semantics is a novel computational theory based
    on logic programs. It is extended to deal with equalities, which is
    a prerequisite to represent and reason about actions and causality
    as in the fluent calculus. This is discussed in the context of
    ethical decision making. In order to decide questions about the
    moral permissibility of actions, counterfactuals need to be
    considered. Somewhat surprisingly, this can be straightforwardly
    done in the extended approach.

1   Introduction

The weak completion semantics (WCS) is a novel cognitive theory. Its original idea is based on [Stenning and van Lambalgen2008], who proposed to model human reasoning tasks by, firstly, reasoning towards a normal logic program representing the reasoning task and, secondly, reasoning with respect to the least model of this program. Unfortunately, Stenning and van Lambalgen's approach contained a technical bug, which was corrected in [Hölldobler and Kencana Ramli2009].

The WCS builds on many techniques and methods from logic programming and computational logic. However, these techniques and methods are usually tweaked a little in order to model human reasoning tasks adequately. For example, programs are not completed in the sense of [Clark1978], but only weakly completed. Instead of the semantic operator introduced in [Fitting1985], a modified operator introduced in [Stenning and van Lambalgen2008] is used. Instead of the three-valued Kripke-Kleene logic used in [Fitting1985, Stenning and van Lambalgen2008], the three-valued Łukasiewicz logic [Łukasiewicz1920] is used. Because of the latter, normal logic programs admit a least model, and reasoning is performed with respect to this model (see [Hölldobler and Kencana Ramli2009]).

The approach has been applied to various human reasoning tasks like the suppression task [Byrne1989, Dietz et al.2012], the selection task [Wason1968, Dietz et al.2013], and human syllogistic reasoning [Khemlani and Johnson-Laird2012, Oliviera da Costa et al.2017]. In fact, the WCS performed better on the human syllogistic reasoning tasks than all twelve cognitive theories discussed in [Khemlani and Johnson-Laird2012]. As all human reasoning tasks are solved within one framework, the WCS is an integrated and computational cognitive theory. We are unaware of any other theory of this kind and with such a wide variety of applications.

Recently, ethical decision making has received much attention as autonomous agents become part of our daily life. In particular, we were inspired by [Pereira and Saptawijaya2016], who studied computational models of machine ethics. Various ethical problems are implemented as logic programs, and these programs can be queried for moral permissibility. Unfortunately, their approach does not provide a general method to account for ethical dilemmas and is not integrated into a cognitive theory about human reasoning.

The problems studied in [Pereira and Saptawijaya2016] were trolley problems or variants thereof like the bystander case. In these problems, actions with direct and indirect effects must be considered. Hence, in order to model and reason about these problems within the WCS, the WCS must be extended to deal with actions and causality. We have chosen the fluent calculus [Hölldobler and Schneeberger1990] for modeling actions and causality because it treats fluents as resources which can be consumed and produced. This property is shared with Petri nets [Hölldobler and Jovan2014], which have already been used in computational models for human reasoning [Barrett2010].

In the fluent calculus [Hölldobler and Schneeberger1990], states are represented as multisets of fluents. Multisets are represented with the help of a binary function symbol ◦ written infix and a constant 1 such that ◦ is commutative and associative, and 1 is its unit element. For example, the multisets {} and {a, b, b} are represented by the fluent terms 1 and a ◦ b ◦ b, respectively. In order to deal with function symbols like ◦ in the WCS, we need to extend the WCS to handle equality. Luckily, as shown in [Dietz Saldanha et al.2018], the key properties of the WCS, viz. the existence of a least model and the fact that this model can be computed as the least fixed point of an appropriate semantic operator, hold also for logic programs with equality.

In this paper, we focus on the representation of the bystander case and show how to represent this problem in the extended approach. In particular, we formalize a purely utilitarian view [Bentham2009] and the doctrine of double effect [Aquinas1988]. In order to decide which action is morally permissible in the bystander case, we need to reason about a counterfactual [Nickerson2015]. It turns out that this can be straightforwardly done in the extended approach.
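The representation of multisets by fluent terms modulo the AC1 equations can be made concrete with a small sketch. The following Python fragment is an illustration only, not part of the formal development: the helper `fluent_term` is our own name, and `collections.Counter` serves as a canonical form for the congruence class of a fluent term.

```python
from collections import Counter

# A ground fluent term built from ◦ and the unit 1 is, up to the AC1
# equations, just a multiset of fluents.  We model the congruence class
# [t] by a canonical form: a Counter mapping each fluent to its
# multiplicity (the unit element 1 contributes nothing).

def fluent_term(*fluents):
    """Congruence class of f1 ◦ f2 ◦ ... ◦ fn (no arguments = the unit 1)."""
    return Counter(f for f in fluents if f != "1")

# [a ◦ b ◦ b] = [b ◦ a ◦ b] = [b ◦ b ◦ a ◦ 1]
assert fluent_term("a", "b", "b") == fluent_term("b", "a", "b") \
       == fluent_term("b", "b", "a", "1")

# The empty multiset {} is represented by the unit element 1.
assert fluent_term() == fluent_term("1")
```

Because congruent terms normalize to the same Counter, equality of congruence classes becomes plain equality of canonical forms.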
                                                                                                     2
2   The Weak Completion Semantics with Equality

We assume the reader to be familiar with the WCS as presented in [Hölldobler2015, Dietz Saldanha et al.2017]. In the weak completion semantics with equality (WCSE), a logic program P is considered together with a set E of equations. As shown in [Jaffar et al.1984], E defines a finest congruence relation on the set of ground terms. Let [t] denote the congruence class defined by the ground term t. For example, [a ◦ b ◦ b] = [b ◦ a ◦ b] = [b ◦ b ◦ a ◦ 1]. Furthermore, let [p(t1, ..., tn)] be an abbreviation for p([t1], ..., [tn]), where p is an n-ary relation symbol and all ti, 1 ≤ i ≤ n, are ground terms. [p(t1, ..., tn)] = [q(s1, ..., sm)] if and only if p = q, n = m, and [ti] = [si] for all 1 ≤ i ≤ n. For example, [p(a ◦ b ◦ b, 1)] = [p(b ◦ a ◦ b, 1 ◦ 1)]. We consider E-interpretations and E-models as usual (see e.g. [Jaffar et al.1984]).

As shown in [Dietz Saldanha et al.2018], a logic program P together with a set E of equations has a least E-model under the three-valued Łukasiewicz logic [Łukasiewicz1920]. This model is the least fixed point of the following semantic operator: Let I be an E-interpretation. We define Φ_P^E(I) = ⟨J^⊤, J^⊥⟩, where

    J^⊤ = {[A] | there exists A ← Body ∈ gP and I(Body) = ⊤},
    J^⊥ = {[A] | there exists A ← Body ∈ gP
                 and for all A′ ← Body′ ∈ gP with [A] = [A′]
                 we find I(Body′) = ⊥},

and gP denotes the set of all ground instances of clauses occurring in P.

One should observe that the set E of equations is built into the computation of the Φ_P^E operator: In the computation of J^⊤, if a ground atom A is mapped to true because it is the head of a rule whose body is true, then all members of the congruence class containing A are mapped to true. Likewise, in the computation of J^⊥ we do not only have to consider all rules with head A, but all rules whose head A′ is in the same congruence class as A, and if A is mapped to false, then all members of the congruence class containing A are mapped to false.

3   The Bystander Case

A trolley, whose conductor has fainted, is headed towards two people walking on the main track.¹ The banks of the track are so steep that these two people will not be able to get off the track in time. Hank is standing next to a switch, which can turn the trolley onto a side track, thereby preventing it from killing the two people. However, there is a man standing on the side track. Hank can change the switch, killing him. Or he can refrain from doing so, letting the two die. Is it morally permissible for Hank to change the switch?

    ¹ Note that in the original trolley problem, five people are on the main track. For the sake of simplicity, we assume that only two people are on the main track.

Figure 1: The bystander case (initial state) and its ramifications if Hank decides to do nothing, where ↓ denotes that no further action is applicable.

The case is illustrated in Figure 1 (initial state). The tracks are divided into segments 0, 1, and 2, the arrow represents that the trolley t is moving forward and that the track is clear (c), the switch is in position m (main) but can be changed into position s (side), and a bullet above a track segment represents a human (h) on this track. t, c, and h may be indexed to denote the track to which they apply. In addition, we need a fluent d denoting a dead human.

We choose to represent a state by a pair of multisets consisting of the casualties in its second element and all other fluents in its first element. Multisets are represented by so-called fluent terms in the fluent calculus, i.e., the initial state of the bystander case is the pair

    (t0 ◦ c0 ◦ m ◦ h1 ◦ h1 ◦ h2, 1)                                    (1)

of fluent terms. The casualties are represented in the second element of (1) by the constant 1 encoding the empty multiset. Initially, there are no casualties, but casualties will play a special role when preferring one action over another, as will be discussed later in this section. The first element of (1) encodes the multiset {t0, c0, m, h1, h1, h2}.

There are two kinds of actions, the ones which can be performed by Hank (the direct actions donothing and change), and the actions which are performed by the trolley (the indirect actions downhill and kill). We will represent the actions by the trolley explicitly with the help of a five-place relation symbol action specifying the preconditions, the name, and the immediate effects of an action. As a state is represented by two multisets, the preconditions and the immediate effects
have also two parts:

    action(t0 ◦ c0 ◦ m, 1, downhill, t1 ◦ c0 ◦ m, 1) ← ⊤
    action(t0 ◦ c0 ◦ s, 1, downhill, t2 ◦ c0 ◦ s, 1) ← ⊤
    action(t1 ◦ h1, 1, kill, t1, d) ← ⊤
    action(t2 ◦ h2, 1, kill, t2, d) ← ⊤

If the trolley is on track 0, this track is clear, and the switch is in position m, then it will run downhill onto track 1, whereas track 0 remains clear and the switch will remain in position m; if, however, the switch is in position s, the trolley will run downhill onto track 2. If the trolley is on either track 1 or 2 and there is a human on this track, it will kill the human, leading to a casualty.

The possible actions of Hank are the base cases in the definition of causality:²

    causes(donothing, t0 ◦ c0 ◦ m ◦ h1 ◦ h1 ◦ h2, 1) ← ⊤
    causes(change, t0 ◦ c0 ◦ s ◦ h1 ◦ h1 ◦ h2, 1) ← ⊤                  (2)

    ² In the original version of the fluent calculus, causes is a ternary predicate stating that the execution of a plan transforms an initial into a goal state. Its base case is of the form causes(X, [ ], X), i.e., the empty plan transforms an arbitrary state X into X. Generating models bottom-up using a semantic operator, one has to consider all ground instances of this atom, a set which is usually too large to consider as a base case for human reasoning episodes. The solution presented in this paper overcomes this problem in that we only have a small number of base cases depending on the number of options an agent like Hank may consider.

The recursive case of the definition of causality is given as

    causes(A, E1 ◦ Z1, E2 ◦ Z2) ←
        action(P1, P2, A′, E1, E2) ∧
        causes(A, P1 ◦ Z1, P2 ◦ Z2) ∧                                  (3)
        ¬ab(A′).

It checks whether in a given state (P1 ◦ Z1, P2 ◦ Z2) an action A′ is applicable, which is the case if the preconditions (P1, P2) are contained in the given state. If this holds, then the action is executed, leading to the successor state (E1 ◦ Z1, E2 ◦ Z2), where (E1, E2) are the direct effects of the action A′. In other words, if an action is applied, then its preconditions are consumed and its direct effects are produced. Such an action application is considered to be a ramification [Thielscher2003] with respect to the initial, direct action performed by Hank. Hence, the first argument A of causes is not changed. The execution of an action is also conditioned by ¬ab(A′), where ab is an abnormality predicate. Such abnormalities were introduced in [Stenning and van Lambalgen2008] to represent conditionals as licenses for inference. In this example, nothing abnormal is known with respect to the actions downhill and kill and, consequently, the assumptions

    ab(downhill) ← ⊥
    ab(kill) ← ⊥

are added to the program. But we can imagine situations where the trolley will only cross the switch if the switch is not broken.³

    ³ If the switch is broken, the trolley may derail. Such a scenario can be modeled in WCSE as well, but it is beyond the scope of this paper to discuss it in detail.

Figure 2: The bystander case (initial state) and its ramifications if Hank decides to change the switch. One should observe that now the switch points to the side track.

Let P be the program consisting of the clauses mentioned in this section so far and let E be the set of equations specifying that ◦ is associative and commutative, and that 1 is its unit element. Hank has the choice to do nothing or to change the switch. Depending on his decision, the trolley will execute its actions, which are computed as ramifications in the fluent calculus [Thielscher2003]. If Hank is doing nothing, then the least E-model of P – which is equal to the least fixed point of Φ_P^E – is computed by iterating Φ_P^E starting with the empty interpretation ⟨∅, ∅⟩. The following equivalence classes will be mapped to true in subsequent iterations:⁴

    [causes(donothing, t0 ◦ c0 ◦ m ◦ h1 ◦ h1 ◦ h2, 1)]
    [causes(donothing, t1 ◦ c0 ◦ m ◦ h1 ◦ h1 ◦ h2, 1)]
    [causes(donothing, t1 ◦ c0 ◦ m ◦ h1 ◦ h2, d)]
    [causes(donothing, t1 ◦ c0 ◦ m ◦ h2, d ◦ d)]

    ⁴ The first two iterations of Φ_P^E are shown in detail in the Appendix.

They correspond precisely to the four states shown in Figure 1. No further action is applicable to the elements of the final congruence class. The two people on the main track will be killed.

On the other hand, if Hank is changing the switch, then the least fixed point of Φ_P^E contains

    [causes(change, t2 ◦ c0 ◦ s ◦ h1 ◦ h1, d)].

The two people on the main track will be saved, but the person on the side track will be killed. This case is illustrated in Figure 2.

The two cases can be compared by means of a prefer clause:

    prefer(A1, A2) ←
        causes(A1, Z1, D1) ∧
        causes(A2, Z2, D1 ◦ d ◦ D2) ∧
        ¬ab_prefer(A1)
    ab_prefer(change) ← ⊥
    ab_prefer(donothing) ← ⊥

Comparing D1 and D1 ◦ d ◦ D2, action A2 leads to at least one more dead person than action A1. Hence, A1 is preferred over A2 if nothing abnormal is known about A1.
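The ramification computation just described can be sketched in a few lines of Python. This is an illustration only, not the WCSE machinery: the names `ACTIONS`, `contained`, and `ramify` are ours, `Counter` multisets stand in for fluent terms modulo the AC1 equations, and a simple loop suffices here because at most one action is applicable in each reachable state.

```python
from collections import Counter

# The four action clauses, written as (pre1, pre2, eff1, eff2):
# preconditions and effects are pairs of multisets (fluents, casualties).
ACTIONS = [
    (Counter(["t0", "c0", "m"]), Counter(), Counter(["t1", "c0", "m"]), Counter()),
    (Counter(["t0", "c0", "s"]), Counter(), Counter(["t2", "c0", "s"]), Counter()),
    (Counter(["t1", "h1"]),      Counter(), Counter(["t1"]),            Counter(["d"])),
    (Counter(["t2", "h2"]),      Counter(), Counter(["t2"]),            Counter(["d"])),
]

def contained(small, big):
    """Multiset inclusion: are the preconditions part of the state?"""
    return all(big[f] >= n for f, n in small.items())

def ramify(fluents, casualties):
    """Fire applicable actions until none applies; return all visited states."""
    trace = [(fluents, casualties)]
    fired = True
    while fired:
        fired = False
        for p1, p2, e1, e2 in ACTIONS:
            if contained(p1, fluents) and contained(p2, casualties):
                fluents = fluents - p1 + e1        # consume preconditions,
                casualties = casualties - p2 + e2  # produce direct effects
                trace.append((fluents, casualties))
                fired = True
                break
    return trace

# donothing: four states are visited and two casualties result (cf. Figure 1).
states = ramify(Counter(["t0", "c0", "m", "h1", "h1", "h2"]), Counter())
assert len(states) == 4 and states[-1][1]["d"] == 2

# change: the trolley takes the side track and one casualty results (cf. Figure 2).
states = ramify(Counter(["t0", "c0", "s", "h1", "h1", "h2"]), Counter())
assert states[-1][1]["d"] == 1
```

The final casualty counts (two versus one dead) are exactly what the prefer clause compares.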
Figure 3: The bystander case (initial state) and its ramifications if Hank is considering the counterfactual.

Under a utilitarian point of view [Bentham2009], the change action is preferable to the donothing action as it will kill fewer humans. On the other hand, we know that a purely utilitarian view is not allowed in case of human casualties. Hank may ask himself: Would I still save the humans on the main track if there were no human on the side track and I changed the switch? This is a counterfactual. But we can easily deal with it in WCSE by starting a new computation with the additional fact

    causes(change, t0 ◦ c0 ◦ s ◦ h1 ◦ h1 ◦ c2, 1) ← ⊤.                 (4)

Comparing (2) and (4), h2 has been replaced by c2. There is no human on track 2 anymore and, hence, this track is clear. This is a minimal change necessary to satisfy the precondition of the counterfactual. In this case, the least E-model of the extended program will contain

    [causes(change, t2 ◦ c0 ◦ s ◦ h1 ◦ h1 ◦ c2, 1)].

This case is illustrated in Figure 3. Using

    permissible(change) ←
        prefer(change, donothing) ∧
        causes(change, t2 ◦ c0 ◦ s ◦ h1 ◦ h1 ◦ c2, 1) ∧
        ¬ab_permissible(change)
    ab_permissible(change) ← ⊥

allows Hank to conclude that changing the switch is permissible within the doctrine of double effect [Aquinas1988].

4   Discussion

We have extended the WCS to WCSE and we have shown how the bystander case can be modeled in the extended approach. We believe that the methods and techniques can be applied to all ethical decision problems discussed in [Pereira and Saptawijaya2016]. In [Dietz Saldanha et al.2018] we have already considered the footbridge and the loop case. Moreover, we have applied the doctrine of triple effect [Kamm2006] to distinguish between direct and indirect intentional killings. Currently, we are working out the details for all problems. For us it is important that all these problems can be discussed within the presented framework and are compatible with our solutions for other human reasoning tasks like the suppression and the selection task.

On the other hand, there are many open questions. The examples discussed in this paper are hand-crafted, and we would like to develop an extension where examples taken from the moral machine project (moralmachine.mit.edu) can be automatically treated under WCSE. We also would like to generalize the reasoning such that if an action does something good and nothing abnormal is known, then it is permissible. This, however, requires a formalization of 'something good' and very likely a formalization of 'something bad'. And we should have a closer look at counterfactuals and minimal change.

Acknowledgements   I'd like to thank Dominic Deckert, Emmanuelle-Anna Dietz Saldanha, Sybille Schwarz, and Lim Yohanes Stefanus for jointly developing the weak completion semantics with equality.

Appendix

Let P be the program developed in Section 3 and let E be the set of equations specifying that ◦ is associative and commutative, and that 1 is its unit element. Let I0 = ⟨∅, ∅⟩ be the empty interpretation. Suppose Hank has decided to do nothing. Then,

    Φ_P^E(I0) = I1 = ⟨I1^⊤, I1^⊥⟩,

where

    I1^⊤ = { [causes(donothing, t0 ◦ c0 ◦ m ◦ h1 ◦ h1 ◦ h2, 1)],
             [action(t0 ◦ c0 ◦ m, 1, downhill, t1 ◦ c0 ◦ m, 1)],
             [action(t0 ◦ c0 ◦ s, 1, downhill, t2 ◦ c0 ◦ s, 1)],
             [action(t1 ◦ h1, 1, kill, t1, d)],
             [action(t2 ◦ h2, 1, kill, t2, d)] },
    I1^⊥ = { [ab(downhill)],
             [ab(kill)] }.

Considering the body of (3), we find that both possible ground instances of ab(A′), viz. ab(downhill) and ab(kill), are false under I1 and, consequently, their negations are true under I1. The only ground instance of

    causes(A, P1 ◦ Z1, P2 ◦ Z2)                                        (5)

being true under I1 is

    causes(donothing, t0 ◦ c0 ◦ m ◦ h1 ◦ h1 ◦ h2, 1).                  (6)

Hence, we are searching for a ground instance of

    action(P1, P2, A′, E1, E2)

being true under I1 such that the ground instance of P1 is contained in t0 ◦ c0 ◦ m ◦ h1 ◦ h1 ◦ h2 and the ground instance of P2 is contained in 1. There are four candidates in I1. The only possible ground instance of an action meeting the conditions is

    action(t0 ◦ c0 ◦ m, 1, downhill, t1 ◦ c0 ◦ m, 1).                  (7)

Comparing the second arguments of (5) and (6) with the first argument of (7), we find that P1 = t0 ◦ c0 ◦ m and Z1 = h1 ◦ h1 ◦ h2. Likewise, comparing the third arguments of (5) and (6) with the second argument of (7), we find that P2 = 1 and Z2 = 1. Combining Z1 with the fourth argument of (7) and, likewise, combining Z2 with the fifth argument of (7), we learn that

    causes(donothing, t1 ◦ c0 ◦ m ◦ h1 ◦ h1 ◦ h2, 1)

must be true under Φ_P^E(I1).
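The fixed-point construction walked through above can also be illustrated on a small propositional program, where every congruence class is a singleton. The following sketch is our own rendering, not the full Φ_P^E operator: the names `phi` and the reserved atom `"FALSUM"` (encoding a body that is ⊥, as in ab(downhill) ← ⊥) are hypothetical, and the closure of J^⊤ and J^⊥ under the congruence defined by E is omitted.

```python
# An interpretation is a pair (true_atoms, false_atoms); every other
# atom is unknown.  A rule is (head, body), where the body is a list of
# literals (sign, atom); sign False marks a negated literal.

def phi(program, interp):
    true, false = interp

    def val(body):  # three-valued Łukasiewicz value of a conjunction
        vs = []
        for sign, atom in body:
            if atom == "FALSUM":  v = "F"   # the body ⊥
            elif atom in true:    v = "T"
            elif atom in false:   v = "F"
            else:                 v = "U"
            if not sign:
                v = {"T": "F", "F": "T", "U": "U"}[v]
            vs.append(v)
        if "F" in vs: return "F"
        if "U" in vs: return "U"
        return "T"                           # an empty body is ⊤

    heads = {h for h, _ in program}
    # J^⊤: heads of at least one rule with a true body.
    j_true = {h for h, b in program if val(b) == "T"}
    # J^⊥: heads all of whose rule bodies are false.
    j_false = {h for h in heads
               if all(val(b) == "F" for hh, b in program if hh == h)}
    return (j_true, j_false)

program = [
    ("p", [(True, "q"), (False, "ab")]),  # p ← q ∧ ¬ab
    ("q", []),                            # q ← ⊤
    ("ab", [(True, "FALSUM")]),           # ab ← ⊥
]

i1 = phi(program, (set(), set()))  # q becomes true, ab becomes false
i2 = phi(program, i1)              # now ¬ab holds, so p becomes true
assert i1 == ({"q"}, {"ab"})
assert i2 == ({"p", "q"}, {"ab"})
assert phi(program, i2) == i2      # least fixed point reached
```

Iterating from the empty interpretation mirrors the Appendix: facts and negative assumptions settle in the first step, and rules conditioned on ¬ab fire in the second.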
References

[Aquinas, 1988] T. Aquinas. Summa Theologica II-II, q. 64, art. 7, "Of Killing". In W. P. Baumgarth and R. J. Regan, editors, On Law, Morality, and Politics, pages 226–227. Hackett Publishing Co., Indianapolis, 1988.

[Barrett, 2010] L. Barrett. An Architecture for Structured, Concurrent, Real-time Action. PhD thesis, Computer Science Division, University of California at Berkeley, 2010.

[Bentham, 2009] J. Bentham. An Introduction to the Principles of Morals and Legislation. Dover Publications Inc., 2009.

[Byrne, 1989] R. M. J. Byrne. Suppressing valid inferences with conditionals. Cognition, 31:61–83, 1989.

[Clark, 1978] K. L. Clark. Negation as failure. In H. Gallaire and J. Minker, editors, Logic and Databases, pages 293–322. Plenum, New York, 1978.

[Dietz et al., 2012] E.-A. Dietz, S. Hölldobler, and M. Ragni. A computational logic approach to the suppression task. In N. Miyake, D. Peebles, and R. P. Cooper, editors, Proceedings of the 34th Annual Conference of the Cognitive Science Society, pages 1500–1505. Cognitive Science Society, 2012.

[Dietz et al., 2013] E.-A. Dietz, S. Hölldobler, and M. Ragni. A computational logic approach to the abstract and the social case of the selection task. In Proceedings of the Eleventh International Symposium on Logical Formalizations of Commonsense Reasoning, 2013. commonsensereasoning.org/2013/proceedings.html.

[Dietz Saldanha et al., 2017] E.-A. Dietz Saldanha, S. Hölldobler, and I. Lourêdo Rocha. The weak completion semantics. In C. Schon and U. Furbach, editors, Proceedings of the Workshop on Bridging the Gap between Human and Automated Reasoning – Is Logic and Automated Reasoning a Foundation for Human Reasoning?, volume 1994, pages 18–30. CEUR-WS.org, 2017. http://ceur-ws.org/Vol-1994/.

[Dietz Saldanha et al., 2018] E.-A. Dietz Saldanha, S. Hölldobler, S. Schwarz, and L. Y. Stefanus. The weak completion semantics and equality. In Proceedings of the 22nd International Conference on Logic for Programming, Artificial Intelligence, and Reasoning (LPAR-22). EPiC Series in Computing, 2018. (to appear).

[Fitting, 1985] M. Fitting. A Kripke–Kleene semantics for logic programs. Journal of Logic Programming, 2(4):295–312, 1985.

[Hölldobler and Jovan, 2014] S. Hölldobler and F. Jovan. Advanced Petri nets and the fluent calculus. In S. Hölldobler, A. Malikov, and C. Wernhard, editors, Proceedings of the Young Scientists' International Workshop on Trends in Information Processing, volume 1145 of CEUR Workshop Proceedings, pages 15–24. CEUR-WS.org, 2014. http://ceur-ws.org/Vol-1145/.

[Hölldobler and Kencana Ramli, 2009] S. Hölldobler and C. D. P. Kencana Ramli. Logic programs under three-valued Łukasiewicz's semantics. In P. M. Hill and D. S. Warren, editors, Logic Programming, volume 5649 of Lecture Notes in Computer Science, pages 464–478. Springer-Verlag Berlin Heidelberg, 2009.

[Hölldobler and Schneeberger, 1990] S. Hölldobler and J. Schneeberger. A new deductive approach to planning. New Generation Computing, 8:225–244, 1990.

[Hölldobler, 2015] S. Hölldobler. Weak completion semantics and its applications in human reasoning. In U. Furbach and C. Schon, editors, Bridging 2015 – Bridging the Gap between Human and Automated Reasoning, volume 1412 of CEUR Workshop Proceedings, pages 2–16. CEUR-WS.org, 2015. http://ceur-ws.org/Vol-1412/.

[Jaffar et al., 1984] J. Jaffar, J.-L. Lassez, and M. J. Maher. A theory of complete logic programs with equality. In Proceedings of the International Conference on Fifth Generation Computer Systems, pages 175–184. ICOT, 1984.

[Kamm, 2006] F. M. Kamm. Intricate Ethics: Rights, Responsibilities, and Permissible Harm. Oxford University Press, Oxford, 2006.

[Khemlani and Johnson-Laird, 2012] S. Khemlani and P. N. Johnson-Laird. Theories of the syllogism: A meta-analysis. Psychological Bulletin, 138(3):427–457, 2012.

[Łukasiewicz, 1920] J. Łukasiewicz. O logice trójwartościowej. Ruch Filozoficzny, 5:169–171, 1920. English translation: On three-valued logic. In L. Borkowski, editor, Jan Łukasiewicz: Selected Works, pages 87–88. North Holland, 1990.

[Nickerson, 2015] R. S. Nickerson. Conditional Reasoning. Oxford University Press, 2015.

[Oliviera da Costa et al., 2017] A. Oliviera da Costa, E.-A. Dietz Saldanha, S. Hölldobler, and M. Ragni. A computational logic approach to human syllogistic reasoning. In G. Gunzelmann, A. Howes, T. Tenbrink, and E. J. Davelaar, editors, Proceedings of the 39th Annual Conference of the Cognitive Science Society, pages 883–888, Austin, TX, 2017. Cognitive Science Society.

[Pereira and Saptawijaya, 2016] L. M. Pereira and A. Saptawijaya. Programming Machine Ethics. Springer, Berlin, Heidelberg, 2016.

[Stenning and van Lambalgen, 2008] K. Stenning and M. van Lambalgen. Human Reasoning and Cognitive Science. MIT Press, 2008.

[Thielscher, 2003] M. Thielscher. Controlling semi-automatic systems with FLUX (extended abstract). In C. Palamidessi, editor, Logic Programming, volume 2916 of Lecture Notes in Computer Science, pages 515–516. Springer, Berlin, Heidelberg, 2003.

[Wason, 1968] P. C. Wason. Reasoning about a rule. The Quarterly Journal of Experimental Psychology, 20:273–281, 1968.