<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>The Structure and Complexity of Credal Semantics</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Fabio G. Cozman</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Denis D. Mauá</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Universidade de São Paulo</institution>
          ,
          <country country="BR">Brazil</country>
        </aff>
      </contrib-group>
      <fpage>3</fpage>
      <lpage>14</lpage>
      <abstract>
        <p>A flexible semantics has been proposed by Lukasiewicz for probabilistic logic programs, where a normal logic program is augmented with a set of independent probabilistic facts. That semantics, which we call the credal semantics, is the set of all probability measures (over stable models) that are consistent with a total choice of probabilistic facts. When each total choice produces a definite program, the credal semantics is identical to Sato's distribution semantics. However, the credal semantics is also defined for programs with cycles and negations. We show that the credal semantics always defines a set containing the probability measures that dominate an infinitely monotone Choquet capacity (also known as a belief function). We also show how this result leads to inference algorithms and to an analysis of the complexity of inferences.</p>
      </abstract>
      <kwd-group>
        <kwd>Probabilistic logic programming</kwd>
        <kwd>answer sets</kwd>
        <kwd>stable model semantics</kwd>
        <kwd>complexity theory</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        The term “probabilistic logic program” encompasses a significant number of
distinct syntactic proposals [
        <xref ref-type="bibr" rid="ref17 ref22 ref23 ref9">9,17,22,23</xref>
        ]. One particularly successful proposal
is jointly represented by Poole's probabilistic Horn abduction [
        <xref ref-type="bibr" rid="ref25">25</xref>
        ] and Sato's
distribution semantics [
        <xref ref-type="bibr" rid="ref28">28</xref>
        ]. There the idea is that a logic program is enlarged
with probabilistic facts that are associated with probabilities and that are usually
assumed stochastically independent. For instance, we may have a rule:
up :- actionUp, not disturbanceUp.
with P(disturbanceUp = true) = 0.1.
Depending on disturbanceUp, actionUp may succeed or not in leading to up.
      </p>
      <p>
        Poole and Sato emphasized acyclic logic programs [
        <xref ref-type="bibr" rid="ref25 ref26 ref28 ref29">25,26,28,29</xref>
        ], even though
Sato did handle cyclic ones. Since then, there has been work on cyclic
probabilistic logic programs under variants of the distribution semantics [
        <xref ref-type="bibr" rid="ref15 ref18 ref27 ref30">15,18,27,30</xref>
        ].
In particular, a semantics for (acyclic and cyclic) probabilistic logic programs
can be extracted from the work of Lukasiewicz on probabilistic description
logics [
        <xref ref-type="bibr" rid="ref18 ref19">18,19</xref>
        ]. His proposal is that a probabilistic logic program defines a set of
probability measures: for a fixed truth assignment over probabilistic facts, we
take the set of all probabilities over stable models of the resulting normal logic
program. This approach may seem a little complex, but it stays within
two-valued logic and is directly related to answer set programming languages.
      </p>
      <p>Lukasiewicz refers to his approach as the “answer set semantics” for
probabilistic logic programs. However, as the approach is certainly distinct from the
usual answer set semantics (which does not employ probabilities), and as the
approach certainly applies in the presence of functions (usually not allowed in
answer set programming), we adopt the name credal semantics to refer to it.</p>
      <p>
        In this paper we study the structure and complexity of the credal
semantics. We show that the semantics of a consistent probabilistic logic program is,
surprisingly, the set of dominating measures of an infinitely monotone Choquet
capacity. Such capacities are relatively simple objects that appear in several
fields [
        <xref ref-type="bibr" rid="ref21 ref31">21,31</xref>
        ]. We obtain a straightforward derivation for inference algorithms,
and we study the complexity of inferences, obtaining interesting novel results.
      </p>
      <p>The paper is organized as follows. Basic terminology is reviewed in Section 2,
and the credal semantics is presented in Section 3. The structure of the credal
semantics is derived in Section 4, and its consequences concerning inference
algorithms and complexity are discussed in Section 5.
</p>
    </sec>
    <sec id="sec-2">
      <title>A very short review: some syntax, some semantics</title>
      <p>An atom is written as r(t1, …, tk), where r is a predicate and each ti is a term;
that is, each ti is a constant or a logical variable. A normal logic program is a
finite set of rules such as A0 :- A1, …, Am, not Am+1, …, not An., where each
Ai is an atom. Atom A0 is the head of the rule; the right-hand side is the body.
The body consists of a set of subgoals separated by commas (A1 is a subgoal,
not An is a subgoal). If there are no subgoals, the rule is called a fact and
written simply as A0.. Programs without not are definite. An atom is ground if
it does not contain any logical variable; a program without logical variables is
propositional. The Herbrand base of a program is the set of all ground atoms built
from constants and predicates mentioned in the program. By grounding every
rule with atoms in the Herbrand base, we obtain the (propositional) grounding
of a program. The grounded dependency graph of a normal logic program is a
directed graph where each grounded predicate is a node, and where there is an
edge from node B to node C if there is a grounded rule where C appears in
the head and B appears in the body; if B is preceded by not in the grounded
rule, then the edge between B and C is negative. A program is acyclic when
its grounded dependency graph is acyclic. An interpretation for a normal logic
program P is a subset of the Herbrand base of P; the idea is that atoms in the
interpretation are true, and atoms not in the interpretation are false. A model is
an interpretation such that every grounded rule is satisfied (that is, either the
head is true or there is a subgoal that is false in the interpretation).</p>
      <p>In this paper we do not allow functions to be used in programs, so as to stay
within finite Herbrand bases. We leave the study of functions to future work.</p>
      <p>
        One strategy to define the semantics of normal logic programs is to translate
programs into some well-known logic, using a completion, and to take the models
of the completion as the semantics. Another strategy is to select some “canonical”
models as “the” semantics. Two canonical models have attracted most attention
due to their favorable properties: the well-founded [
        <xref ref-type="bibr" rid="ref33">33</xref>
        ] and the stable model
semantics [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ]. In this paper we are concerned with the latter (stable model)
semantics. To define it, take a normal logic program P. For an interpretation I, define
the reduct P^I to be the definite program obtained by (i) grounding P, (ii)
removing all rules that contain a subgoal not A in the body such that A is in
I, (iii) removing all remaining subgoals of the form not A from the remaining
rules. An interpretation I is a stable model if I is a minimal model of P^I
(minimal according to set inclusion). Note that a definite program always has a
single minimal model, so the whole construction is well defined. An answer set
is exactly a stable model.
      </p>
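      <p>
        To make the reduct construction concrete, here is a minimal brute-force sketch
in Python (our own illustration, not code from the paper): for a propositional
program it computes the reduct P^I of each candidate interpretation I, takes the
least model of the resulting definite program by fixpoint iteration, and keeps I
exactly when the two coincide.
      </p>
      <p>
```python
from itertools import combinations

# A propositional normal logic program as (head, positive_body, negative_body)
# triples; this one is the cyclic pair  a :- not b.   b :- not a.
program = [("a", [], ["b"]), ("b", [], ["a"])]
atoms = {"a", "b"}

def reduct(program, interp):
    """Gelfond-Lifschitz reduct: drop rules with a subgoal 'not A' where A is in
    interp; drop the remaining negative subgoals from the surviving rules."""
    return [(head, pos) for (head, pos, neg) in program
            if not any(a in interp for a in neg)]

def least_model(definite):
    """Least model of a definite program, by iterating rule applications."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos in definite:
            if head not in model and all(a in model for a in pos):
                model.add(head)
                changed = True
    return model

def stable_models(program, atoms):
    """Enumerate stable models by testing every interpretation."""
    return [set(interp)
            for r in range(len(atoms) + 1)
            for interp in combinations(sorted(atoms), r)
            if least_model(reduct(program, set(interp))) == set(interp)]

print(stable_models(program, atoms))  # → [{'a'}, {'b'}]
```
      </p>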
    </sec>
    <sec id="sec-3">
      <title>Probabilistic logic programs</title>
      <p>
        A probabilistic logic program is a pair ⟨P, PF⟩ where P is a normal logic program
and PF is a set of probabilistic facts (we abbreviate “probabilistic logic program”
by plp). A probabilistic fact is an atom A that does not unify with any rule head
(hence with any fact either), and that is associated with a probability assessment
P(A) = α. We write a probabilistic fact as α :: A., where we have adopted the
syntax of the ProbLog system [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. If A is a ground atom, then α :: A. is a
ground probabilistic fact. A probabilistic fact with logical variables is interpreted
through its groundings (for instance, α :: r(X1, …, Xm). is interpreted as the set
of α :: r(a1, …, am). for all groundings of r). A truth assignment for all grounded
probabilistic facts in PF is a total choice. If θ is a total choice for PF, we denote
by PF↓θ the set of atoms assigned true by θ.
      </p>
      <p>We assume that all probabilistic facts are independent, so the probability of
a total choice θ = {A1 = t1, …, Ak = tk}, where each ti is true or false, is
P(θ) = P(A1 = t1, …, Ak = tk) = P(A1 = t1) ⋯ P(Ak = tk),   (1)
where P(Ai = true) = αi and P(Ai = false) = 1 − αi given αi :: Ai.</p>
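      <p>
        Equation (1) is just a product of independent terms; as a trivial sketch of ours
(the second fact name and its assessment are hypothetical):
      </p>
      <p>
```python
# Probabilistic facts alpha_i :: A_i, stored as atom -> probability of truth.
prob_facts = {"disturbanceUp": 0.1, "up_sensor_ok": 0.9}  # hypothetical assessments

def total_choice_probability(choice, prob_facts):
    """Equation (1): product of the independent fact probabilities under the
    truth assignment `choice` over all probabilistic facts."""
    p = 1.0
    for atom, truth in choice.items():
        alpha = prob_facts[atom]
        p *= alpha if truth else 1.0 - alpha
    return p

print(total_choice_probability({"disturbanceUp": False, "up_sensor_ok": True},
                               prob_facts))  # (1 - 0.1) * 0.9
```
      </p>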
      <p>
        Once a total choice θ is fixed, we can form a normal logic program consisting
of P ∪ PF↓θ. Poole's and Sato's original work focused respectively on acyclic
and definite programs, and for these programs the semantics of P ∪ PF↓θ is
rather uncontroversial, and is given by their unique stable model. Hence, every
such probabilistic logic program induces a unique probability distribution over
all atoms. For cyclic programs, Sato, Kameya and Zhou later adopted Fitting's
three-valued semantics for each fixed total choice [
        <xref ref-type="bibr" rid="ref30">30</xref>
        ], while Hadjichristodoulou
and Warren [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ] and Riguzzi [
        <xref ref-type="bibr" rid="ref27">27</xref>
        ] have explored the well-founded semantics.
      </p>
      <p>
        An alternative semantics can be extracted from the work on probabilistic
description logic programs by Lukasiewicz [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ], as noted in Section 1. To describe
that proposal, a few definitions are needed. A plp ⟨P, PF⟩ is consistent if there
is at least one stable model for each fixed total choice of PF. A probability model
for a consistent plp ⟨P, PF⟩ is a probability measure P over interpretations of
P, such that:
(i) every interpretation I with P(I) &gt; 0 is a stable model of P ∪ PF↓θ for the
total choice θ that agrees with I on the probabilistic facts; and
(ii) for every total choice θ = {A1 = t1, …, Ak = tk}, where each ti is true or false,
we have P(θ) = ∏_{i=1}^{k} P(Ai = ti).
(The ProbLog system is available at https://dtai.cs.kuleuven.be/problog/index.html.)
      </p>
      <p>The set of all probability models for a plp is the semantics of the program.</p>
      <p>
        Lukasiewicz calls his proposed semantics the answer set semantics for
probabilistic description logic programs. As we mentioned before, we prefer the term
credal semantics, which we adopt. The reason for this latter name is that a set
of probability measures is often called a credal set [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
      </p>
      <p>Now given a consistent plp, we may be interested in the smallest possible
value of P(A) for some ground atom A, with respect to the set K of all probability
models of the plp. This is conveyed by the lower probability of A, P̲(A) =
inf_{P∈K} P(A). Similarly, we have the upper probability of A, P̄(A) = sup_{P∈K} P(A).</p>
      <p>
        It is perhaps time for a few much-needed examples. We start with two
examples where each total choice produces a single stable model, and then move
to two examples where some total choices lead to several stable models.
Example 1. First, consider an acyclic probabilistic logic program ⟨P, PF⟩. When
P is acyclic, each total choice for PF yields an acyclic program (with a unique
stable model that is also its well-founded semantics [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]), hence a single probability
model is obtained (a singleton credal set).
      </p>
      <p>
        For example, consider a fragment of the famous Bayesian network Asia
[
        <xref ref-type="bibr" rid="ref16">16</xref>
        ] with three atoms, smoking, cancer, and bronchitis, with P(smoking) = 1/2,
P(cancer | smoking) = 1/10, P(cancer | ¬smoking) = 1/100, P(bronchitis | smoking) =
3/5, and P(bronchitis | ¬smoking) = 3/10. These probabilities can be expressed
through the following plp:
0.5 :: smoking.
0.1 :: a1.    0.01 :: a2.    0.6 :: a3.    0.3 :: a4.
cancer :- smoking, a1.        cancer :- not smoking, a2.
bronchitis :- smoking, a3.    bronchitis :- not smoking, a4.
This plp has a single probability model, yielding for instance P̲(smoking | cancer) =
P̄(smoking | cancer) = 10/11. In fact, any Bayesian network over binary variables
can be encoded using an acyclic probabilistic logic program [
        <xref ref-type="bibr" rid="ref25 ref26">25,26</xref>
        ]. □
      </p>
      <p>
Example 2. Sometimes a cyclic probabilistic logic program ⟨P, PF⟩ still has a
single probability model. This happens for instance if P is locally stratified; that
is, if its grounded dependency graph has no cycles containing a negative edge.
If P is a locally stratified program, then any total choice of probabilistic facts
produces a locally stratified normal logic program. And for locally stratified
programs, most existing semantics, and particularly the well-founded and the
stable model semantics, coincide and produce a single stable model.
      </p>
      <p>For example, consider a plp where p means “path” and e means “edge”
(based on an example in the ProbLog distribution):
p(X, Y) :- e(X, Y).    p(X, Y) :- e(X, Z), p(Z, Y).
0.6 :: e(1, 2).    0.1 :: e(1, 3).    0.4 :: e(2, 5).    0.3 :: e(2, 6).
0.3 :: e(3, 4).    0.8 :: e(4, 5).    0.2 :: e(5, 6).
(2)</p>
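      <p>
        Because this program is stratified, each total choice has exactly one stable
model, and the query can be checked by brute-force enumeration of the 2^7 total
choices; a sketch of ours (not the ProbLog algorithm, which is far more clever):
      </p>
      <p>
```python
from itertools import product

# Probabilistic edges of the program in (2): (u, v) -> probability of e(u, v).
edges = {(1, 2): 0.6, (1, 3): 0.1, (2, 5): 0.4, (2, 6): 0.3,
         (3, 4): 0.3, (4, 5): 0.8, (5, 6): 0.2}

def reachable(present, source, target):
    """Is there a path over the chosen edges? (The unique stable model of p/2
    for a total choice is the transitive closure of the chosen e/2 atoms.)"""
    stack, seen = [source], set()
    while stack:
        node = stack.pop()
        if node == target:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(v for (u, v) in present if u == node)
    return False

p_query = 0.0
for truths in product([True, False], repeat=len(edges)):
    choice = dict(zip(edges, truths))
    mass = 1.0
    for e, t in choice.items():
        mass *= edges[e] if t else 1.0 - edges[e]
    if reachable([e for e, t in choice.items() if t], 1, 6):
        p_query += mass

print(round(p_query, 3))  # → 0.217, agreeing with ProbLog
```
      </p>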
      <p>
        That is, we have a random graph with nodes 1, …, 6, and probabilities attached
to edges. The query P(p(1, 6) = true) yields the probability that there is a path
between nodes 1 and 6. Using ProbLog one obtains P(p(1, 6) = true) = 0.217. □
Example 3. A common cyclic pattern in the literature is the pair of rules a :- not b.
and b :- not a. [
        <xref ref-type="bibr" rid="ref33">33</xref>
        ]. Here is a more elaborate version, adapted from
Ref. [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. Take the probabilistic fact 0.9 :: man(dilbert). and the rules:
single(X) :- man(X), not husband(X).    husband(X) :- man(X), not single(X).
When man(dilbert) is false, there is a single stable model s1 = {man(dilbert) =
false, husband(dilbert) = false, single(dilbert) = false}. When man(dilbert) is true,
there are two stable models, s2 = {man(dilbert) = true, husband(dilbert) =
true, single(dilbert) = false}, and s3 = {man(dilbert) = true, husband(dilbert) =
false, single(dilbert) = true}. Now, there is a probability model (that is, a
probability measure over the stable models) such that P(s1) = 0.1 and P(s2) = 0.9
(hence P(s3) = 0), and another probability model such that P(s1) = 0.1 and
P(s3) = 0.9. In fact, any probability measure such that P(s1) = 0.1, P(s2) = 0.9α,
and P(s3) = 0.9(1 − α), for α ∈ [0, 1], is also a probability model. □
Example 4. Consider a graph coloring problem consisting of the rules:
coloredBy(V, red) :- not coloredBy(V, yellow), not coloredBy(V, green), vertex(V).
coloredBy(V, yellow) :- not coloredBy(V, red), not coloredBy(V, green), vertex(V).
coloredBy(V, green) :- not coloredBy(V, red), not coloredBy(V, yellow), vertex(V).
noClash :- not noClash, edge(V, U), coloredBy(V, C), coloredBy(U, C).
and the facts: for i ∈ {1, …, 5}, vertex(i)., and
      </p>
      <p>
        coloredBy(2, red).    coloredBy(5, green).    0.5 :: edge(4, 5).
edge(1, 3).  edge(1, 4).  edge(2, 1).  edge(2, 4).  edge(3, 5).  edge(4, 3).
The facts mentioning vertex and edge encode the graph in Figure 1 (left); the
probabilistic fact is indicated as a dashed edge. For a fixed total choice, the
stable models of the program are the 3-colorings of the graph. Thus, if edge(4, 5)
is true, there is a single stable model; otherwise, there are two stable
models. We obtain P̲(coloredBy(1, yellow)) = 0 and P̄(coloredBy(1, yellow)) = 1/2,
while P̲(coloredBy(4, yellow)) = 1/2 and P̄(coloredBy(4, yellow)) = 1, and finally
P̲(coloredBy(3, red)) = P̄(coloredBy(3, red)) = 1. □
Example 5. Consider a game where one wins whenever the opponent has no
moves [
        <xref ref-type="bibr" rid="ref33">33</xref>
        ], with rule wins(X) :- move(X, Y), not wins(Y). and the following
moves, one of which is probabilistic: move(a, b). move(b, a). move(b, c). 0.3 ::
move(c, d). If move(c, d) is false, there is a single stable model (where wins(b) is
the only winning position); otherwise, there are two stable models (wins(c) is true
in both of them; wins(a) is true in one, while wins(b) is true in the other). Thus
P̲(wins(a)) = 0.0 and P̄(wins(a)) = 0.3; P̲(wins(b)) = 0.7 and P̄(wins(b)) = 1.0;
and P̲(wins(c)) = P̄(wins(c)) = 0.3. □
      </p>
      <p>Finally, consider a program without credal semantics:
Example 6. If the barber shaves every villager who does not shave himself, does
the barber shave himself? The program
shaves(X, Y) :- barber(X), villager(Y), not shaves(X, Y).
0.5 :: barber(bob).    0.5 :: villager(bob).
does not have a stable model when both probabilistic facts are true (we obtain
the pattern p :- not p). Note that even in this case the resulting normal logic
program has a well-founded semantics (assigning undefined to shaves(bob, bob)). □</p>
    </sec>
    <sec id="sec-4">
      <title>The structure of credal semantics</title>
      <p>Given the generality of plps, one might think that credal sets generated by the
credal semantics can have an arbitrarily complex structure. Surprisingly, the
structure of the credal semantics of a plp is a relatively simple object:
Theorem 1. Given a consistent probabilistic logic program, its credal semantics
is a set of probability measures that dominate an infinitely monotone Choquet
capacity.</p>
      <p>
        Before we present a proof of this theorem, let us pause and define a few
terms. An infinitely monotone Choquet capacity is a set function P̲ from an
algebra A on a set Ω to the real interval [0, 1] such that [2, Definition 4.2]:
P̲(Ω) = 1 − P̲(∅) = 1 and, for any A1, …, An in the algebra, P̲(∪_i Ai) ≥
Σ_{∅≠J⊆{1,…,n}} (−1)^{|J|+1} P̲(∩_{j∈J} Aj). Infinitely monotone Choquet capacities appear
in several formalisms; for instance, they are the belief functions of
Dempster–Shafer theory [
        <xref ref-type="bibr" rid="ref31">31</xref>
        ], and summaries of random sets [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ]. Given a set function P̲, we
can construct a set of measures that dominate P̲ as {P : ∀A ∈ A : P(A) ≥ P̲(A)}.
We abuse language and say that this set itself is infinitely monotone. If a
credal set K is infinitely monotone, then the lower probability P̲, defined as
P̲(A) = inf_{P∈K} P(A), is the generating infinitely monotone Choquet capacity.
We also have the upper probability P̄(A) = sup_{P∈K} P(A) = 1 − P̲(A^c).
Proof (Theorem 1). Consider a set Ω containing as states the possible total
choices of the probabilistic facts. Over this space we have a product measure that
is completely specified by the probabilities attached to probabilistic facts. Now
consider a multi-valued mapping Γ between Ω and the space of all possible
models of our probabilistic logic program. For each element θ ∈ Ω, define Γ(θ) to
be the set of stable models associated with the total choice θ of the probabilistic
facts. Now we use the fact that a probability space and a multi-valued mapping
induce an infinitely monotone Choquet capacity over the range of the mapping
(that is, over Γ(Ω)) [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ]. □
      </p>
      <p>Infinitely monotone credal sets have several useful properties; for one thing,
they are closed and convex. Convexity here means that if P1 and P2 are in the
credal set, then αP1 + (1 − α)P2 is also in the credal set for α ∈ [0, 1]. Thus, as
illustrated by Example 3:
Corollary 1. Given a consistent probabilistic logic program, its credal semantics
is a closed and convex set of probability measures.</p>
      <p>
        There are several additional results concerning the representation of infinitely
monotone capacities using their Möbius transforms [
        <xref ref-type="bibr" rid="ref2 ref31">2,31</xref>
        ]; we refrain from
mentioning every possible corollary we might produce by rehashing those results.
Instead, we focus on a few important results that can be used to great effect
in future applications. First, as we have a finite Herbrand base, we can use the
symbols in the proof of Theorem 1 to write, for any event A [2, Section 5.3.2]:
P̲(A) = Σ_{θ∈Ω : Γ(θ)⊆A} P(θ) ;
      </p>
      <p>
        P̄(A) = Σ_{θ∈Ω : Γ(θ)∩A≠∅} P(θ) .   (3)
This property directly leads to an inference algorithm described later. In fact,
for infinitely monotone credal sets we can find easy expressions for lower and
upper conditional probabilities (that is, the infimum and supremum of conditional
probabilities): if A and B are events, then the lower and upper probabilities of A
given B are (where the superscript c denotes complement) [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]:
      </p>
      <p>P̲(A|B) = P̲(A∩B) / (P̲(A∩B) + P̄(A^c∩B))   and   P̄(A|B) = P̄(A∩B) / (P̄(A∩B) + P̲(A^c∩B)) .   (4)
We also note that the computation of lower and upper expected values, and
even conditional expected values, with respect to infinitely monotone Choquet
capacities admits relatively simple expressions [
        <xref ref-type="bibr" rid="ref34">34</xref>
        ].
      </p>
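      <p>
        Expressions (3) and (4) are immediate to implement once the measure over total
choices and the mapping Γ are in hand. A sketch of ours, using the stable models
of Example 3 as data (events are sets of stable models; the conditional
expressions assume positive denominators):
      </p>
      <p>
```python
# Example 3: mass of each total choice together with Gamma(theta),
# where stable models are frozensets of true atoms.
s1 = frozenset()                          # man(dilbert) false
s2 = frozenset({"man", "husband"})        # man true, husband true
s3 = frozenset({"man", "single"})         # man true, single true
gamma = [(0.1, [s1]), (0.9, [s2, s3])]    # pairs (P(theta), Gamma(theta))

def lower(event):
    """Expression (3): mass of choices with Gamma(theta) inside the event."""
    return sum(p for p, models in gamma if all(m in event for m in models))

def upper(event):
    """Expression (3): mass of choices with Gamma(theta) meeting the event."""
    return sum(p for p, models in gamma if any(m in event for m in models))

def lower_cond(a, b):
    """Expression (4): lower conditional probability of event a given event b."""
    return lower(a & b) / (lower(a & b) + upper(b - a))

def upper_cond(a, b):
    """Expression (4): upper conditional probability of event a given event b."""
    return upper(a & b) / (upper(a & b) + lower(b - a))

all_models = {s1, s2, s3}
husband = {m for m in all_models if "husband" in m}
man = {m for m in all_models if "man" in m}
print(lower(husband), upper(husband))                      # → 0 0.9
print(lower_cond(husband, man), upper_cond(husband, man))  # → 0.0 1.0
```
      </p>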
      <p>
        It is convenient to pause here and mention the constraint logic programming
language of Michels et al. [
        <xref ref-type="bibr" rid="ref20">20</xref>
        ], a significant contribution that is based on credal
sets (even though it uses a different syntax and semantics, for instance allowing
continuous variables but not allowing multiple stable models per total choice).
In that work, expressions for (conditional) lower and upper probabilities are
derived; those expressions directly mirror the ones presented in this section.
      </p>
    </sec>
    <sec id="sec-5">
      <title>The complexity of inferences with credal semantics</title>
      <p>In this section we focus on the computation of P̲(Q) and P̄(Q), where Q is a
conjunction of assignments to some ground atoms in the plp's Herbrand base. Recall
that we preclude functions, so as to stay with finite Herbrand bases. Hence a
direct translation of Expression (3) into an algorithm leads to:</p>
      <p>Given a plp ⟨P, PF⟩ and Q, initialize a and b with 0.</p>
      <p>For each total choice θ of the probabilistic facts, compute the set S of all stable
models of P ∪ PF↓θ, and:
if Q is true in every stable model in S, then a ← a + P(θ);
if Q is true in some stable model of S, then b ← b + P(θ).</p>
      <p>
        Return a and b (the value of a is P̲(Q), the value of b is P̄(Q)).
Note that to find whether Q is true in every stable model of a program, we
must run cautious inference, and to find whether Q is true in some stable model
of a program, we must run brave inference; these logical procedures have been
studied in depth in the literature [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ].
      </p>
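      <p>
        The loop above is easy to express in code. A sketch of ours with the
stable-model sets of Example 5 hard-coded (rather than computed by an answer
set solver), for the single probabilistic fact 0.3 :: move(c, d).:
      </p>
      <p>
```python
# Each entry: (probability of the total choice, stable models of the induced
# program), taken from Example 5.
choices = [
    (0.7, [{"wins(b)"}]),                                     # move(c,d) false
    (0.3, [{"wins(c)", "wins(a)"}, {"wins(c)", "wins(b)"}]),  # move(c,d) true
]

def bounds(query, choices):
    """The enumeration algorithm: a cautious check feeds the lower probability
    a, and a brave check feeds the upper probability b."""
    a = b = 0.0
    for p, stable in choices:
        if all(query in s for s in stable):   # cautious: true in every model
            a += p
        if any(query in s for s in stable):   # brave: true in some model
            b += p
    return a, b

print(bounds("wins(a)", choices))  # → (0.0, 0.3)
```
      </p>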
      <p>
        In fact, the algorithm above has already been derived by Calì et al. [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], using
clever optimization techniques (Calì et al. actually obtain lower and upper
conditional probabilities by combining Expressions (4) with (3)). The advantage of
our approach is that the algorithm is a transparent consequence of known facts
about capacities; other than that, Calì et al. have already presented the
algorithm, so we do not need to dwell on it. Rather, we wish to focus on the specific
question of how complex the computation of the lower probability P̲(Q) is.
      </p>
      <p>To be precise: as input, we have a plp whose probabilities are rational
numbers, and an assignment Q to some ground atoms in the (finite) Herbrand base;
as output, we want (the rational number) P̲(Q). We refer to this complexity as
the inferential complexity of plps.</p>
      <p>
        One may also be interested in the complexity of inferences when the plp
is fixed, and the only input is the query Q. This is the query complexity of
the program: the input is Q, the output is P̲(Q). This concept is based on analogous
concepts found in database theory [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]: one may face situations where the program
may be small compared to the query, and then query complexity is the concept of
practical interest.
      </p>
      <p>So, we are ready to state our main results on complexity.</p>
      <p>Theorem 2. In this theorem, assume that input plps are consistent. Inferential
complexity is #NP-equivalent for propositional plps, and #P-equivalent when
restricted to stratified propositional plps. For plps where all predicates have a
bound on arity, inferential complexity is #NP^NP-equivalent, and #NP-equivalent
when restricted to stratified programs. Query complexity is #P-hard; it is in
#NP (up to multiplication by a rational obtained with polynomial effort), and is
#P-equivalent when restricted to stratified programs.</p>
      <p>
        Before we prove this theorem, we must define a number of terms that appear
in it. We deal with usual complexity classes such as NP, and with the concept of
oracles [
        <xref ref-type="bibr" rid="ref24">24</xref>
        ]. The polynomial hierarchy is the class of languages ∪_i Σ_i^P = ∪_i Π_i^P =
∪_i Δ_i^P, where Δ_i^P = P^{Σ_{i−1}^P}, Π_i^P = coΣ_i^P, Σ_i^P = NP^{Σ_{i−1}^P}, and Σ_0^P = P. The
complexity class #P is the class of integer-valued functions computed by a counting
Turing machine in polynomial time; a counting Turing machine is a standard
non-deterministic Turing machine that prints in binary notation, on a separate
tape, the number of accepting computations induced by the input [
        <xref ref-type="bibr" rid="ref32">32</xref>
        ]. Similarly
to the polynomial hierarchy, the counting hierarchy is defined by oracles. The
class #Π_kP contains the integer-valued functions computed by a counting Turing
machine with access to a Π_k^P oracle. The counting hierarchy is the union of all
#Π_kP (over all integers k). The problems with which we are concerned involve
the computation of rational numbers and do not precisely fit into the class of #P
problems and its extensions, so we resort to weighted reductions [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], which are
parsimonious reductions (Karp reductions that preserve the number of accepting
paths) scaled by a positive rational number. The canonical complete problem
for the class #Π_kP via parsimonious reductions is #Π_k sat [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], which gets, as
input, a formula φ(Y1, …, Yn) = ∀X1 ∃X2 … Q_k X_k ψ(X1, …, Xk, Y1, …, Yn),
where ψ is a CNF formula over variables X1, …, Xk, Y1, …, Yn, and produces,
as output, the number of assignments to the variables Y1, …, Yn that make the
formula φ evaluate to true. For k odd, the problem remains complete even if the
formula ψ is specified in disjunctive normal form [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. For a complexity class #C
of integer-valued functions computed by some (oracle) counting Turing machine,
we say that a problem is #C-hard if any problem in #C can be reduced to it
by a weighted reduction. If a problem is #C-hard and can be solved with one
call to a #C oracle and a multiplication by a rational obtained with polynomial
effort, then the problem is said to be #C-equivalent (as inspired by [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]).
Proof (Theorem 2). All results for stratified programs are available in a
companion publication on the complexity of the distribution semantics [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. So the remainder
of this proof focuses on the general (possibly cyclic) case.
      </p>
      <p>For propositional programs: Membership follows as cautious logical reasoning is
coNP-complete [11, Table 2]; inference obtains by counting the result of cautious
inference, so we obtain membership in #NP. Hardness is shown by a reduction
from #Π_1 sat: Count the number of assignments of x1, …, xn such that, for φ a
CNF formula with clauses c1, …, ck, ∀y1, …, ym : φ(x1, …, xn, y1, …, ym). Write
rules ti :- not fi. and fi :- not ti. for every variable yi (thus there are 2^m stable
models running through the assignments of y1, …, ym). For each xi introduce an atom xi and
a probabilistic fact 0.5 :: xi. For every clause cj write cj :- ℓ. for each one of the
literals in cj, where ℓ is xi or not xi or ti or fi respectively if the literal is xi or
¬xi or yi or ¬yi. Finally, write sat :- c1, …, ck. The probability of a total choice
of x1, …, xn adds to P̲(sat) iff every stable model satisfies all clauses. Hence the
desired count is 2^n P̲(sat).</p>
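      <p>
        The counting identity at the end of this reduction is easy to check
mechanically: per total choice of the xi, the stable models range over the
assignments of the yi, and sat is cautiously true iff every such assignment
satisfies φ. A sketch of ours, with the stable-model machinery collapsed into
that observation and a small hypothetical CNF:
      </p>
      <p>
```python
from itertools import product

# Hypothetical CNF over counted x1, x2 and universally quantified y1:
# phi = (x1 or y1) and (x1 or not y1) and (x2 or not x1)
def phi(x1, x2, y1):
    return (x1 or y1) and (x1 or not y1) and (x2 or not x1)

n, m = 2, 1

# Direct #Pi_1 count: assignments of (x1, x2) making phi true for every y1.
direct = sum(1 for xs in product([True, False], repeat=n)
             if all(phi(*xs, y) for y in [True, False]))

# Reduction side: each total choice has mass 2**-n, and sat is cautiously true
# exactly when the universal check succeeds; this sum is the lower P(sat).
lower_sat = sum(2.0 ** -n for xs in product([True, False], repeat=n)
                if all(phi(*xs, y) for y in [True, False]))

print(direct, 2 ** n * lower_sat)  # → 1 1.0 : count equals 2^n * lower P(sat)
```
      </p>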
      <p>
        For programs where predicates have bounded arity: Membership follows as
cautious logical reasoning is Π_2^P-complete [11, Table 5]; inference obtains by
counting the result of cautious inference, so we obtain membership in #NP^NP.
Hardness is shown by a reduction from #Π_2 sat: Count the number of x1, …, xn
s.t. ∀y1, …, ym ∃z1, …, zs : φ(x1, …, xn, y1, …, ym, z1, …, zs), where φ is a
3CNF formula with clauses c1, …, ck. The reduction is a combination of the
previous reduction with the one used by Eiter et al. [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. Introduce, as before,
predicates xi, ti and fi, and probabilistic facts 0.5 :: xi and rules ti :- not fi. and
fi :- not ti.. Introduce predicates cj, but now the arity of cj equals the
number of variables zi in cj. Denote by Zi a logical variable that corresponds to
the propositional variable zi, and by Zj the logical variables corresponding to
the propositional variables zi appearing in clause cj. For each clause cj, go over the
possible assignments zj of the propositional variables associated with Zj. If this
assignment makes the clause true, add the fact cj(zj).. If instead the assignment
leaves the clause dependent on some propositional variables xi or yi, then for
every literal ℓ (over a variable xi or yi) introduce the rule cj(zj) :- ℓ., where ℓ
is xi or not xi or ti or fi exactly as in the previous reduction. Finally, introduce
sat :- c1(Z1), …, ck(Zk). As before, the desired count is obtained as 2^n P̲(sat).
Concerning query complexity, membership follows from the fact that the data
complexity of logical cautious reasoning is coNP [7, Theorem 5.8]. □
      </p>
      <p>We leave to future work a number of questions. First, we conjecture that
the complexity of computing P̄(Q) is the same as that of computing P̲(Q) (as we only
need to change from cautious to brave reasoning). Also, we did not present
results on the complexity of probabilistic logic programs without a bound on
arity (without such a bound, logical cautious reasoning is coNEXP-complete,
so we conjecture that exponentially bounded counting Turing machines will be
needed here). Finally, our complexity results were obtained assuming that plps
were consistent: of course, in practice one must consider the problem of checking
consistency. We just indicate a few facts about consistency checking:
Proposition 1. Consistency checking is in Π_2^P for propositional plps and in
Π_3^P for plps where predicates have a bound on arity.</p>
      <p>Proof. Consistency checking of a propositional plp obtains by verifying whether
logical consistency holds for each total choice of probabilistic facts; that is, by
deciding whether all total choices satisfy consistency. Because logical consistency
checking for this language is NP-complete [11, Table 1], we obtain membership in
Π₂ᵖ. An analogous argument leads to Π₃ᵖ, as logical consistency checking with
bounded arity is Σ₂ᵖ-complete [11, Table 4]. □</p>
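<p>The brute-force procedure in this proof can be sketched as follows: enumerate total choices, and for each choice test stable-model existence by checking every candidate interpretation against its Gelfond-Lifschitz reduct. This is an illustrative exponential-time check, not an algorithm matching the bounds above; the rule encoding (head, positive body, negative body) is our own.</p>
<preformat>
```python
from itertools import chain, combinations

def least_model(definite_rules):
    """Least model of a definite program by fixpoint iteration."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, pos in definite_rules:
            if pos.issubset(model) and head not in model:
                model.add(head)
                changed = True
    return model

def is_stable(rules, candidate):
    """Gelfond-Lifschitz test: drop rules whose negative body meets the
    candidate, erase remaining negative literals, compare least models."""
    reduct = [(h, pos) for (h, pos, neg) in rules
              if not neg.intersection(candidate)]
    return least_model(reduct) == candidate

def subsets(xs):
    xs = sorted(xs)
    return chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))

def has_stable_model(rules, atoms):
    return any(is_stable(rules, set(s)) for s in subsets(atoms))

def consistent(prob_facts, rules):
    """A plp is consistent iff every total choice of the probabilistic
    facts yields a program with at least one stable model."""
    atoms = set(prob_facts).union(h for h, _, _ in rules)
    atoms = atoms.union(a for _, pos, neg in rules for a in pos.union(neg))
    fact_rules = lambda choice: [(f, frozenset(), frozenset()) for f in choice]
    return all(has_stable_model(rules + fact_rules(choice), atoms)
               for choice in subsets(prob_facts))
```
</preformat>
<p>For example, the plp with probabilistic fact a and rules b :- not c. and c :- not b. is consistent, while replacing the rules by p :- a, not p. yields inconsistency for the total choice that selects a.</p>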
    </sec>
    <sec id="sec-6">
      <title>Discussion: moving to answer set programming</title>
      <p>
        We have studied the credal semantics for probabilistic logic programs, as
proposed by Lukasiewicz [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ]. This semantics is quite attractive and is not as well
known as it deserves to be. We have shown that the credal semantics of a plp is an
infinitely monotone credal set, and we have derived novel complexity results (using
non-trivial classes in the counting hierarchy). In future work we plan to look at
consistency checking, and also at the consequences of allowing functions.
Moreover, we must include in the analysis a number of useful constructs adopted
by answer set programming [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. There, classical negation, such as ¬wins(X), is
allowed on top of not. Also, constraints, such as :- φ., are allowed, meaning that
φ is false. More substantial is the presence, in answer set programming, of
disjunctive heads. With such machinery we can, for instance, rewrite the rules in
Example 3 as the single rule single(X) ∨ husband(X) :- man(X)., and the rules in
Example 4 as the pair:
coloredBy(V, red) ∨ coloredBy(V, yellow) ∨ coloredBy(V, green) :- vertex(V).
:- edge(V, U), coloredBy(V, C), coloredBy(U, C).
      </p>
      <p>Now the point to be made is this. Suppose we have a probabilistic logic
program ⟨P, PF⟩ where, as before, we have independent probabilistic facts, but
where P is now a logic program with classical negation, constraints, and disjunctive
heads, and P is consistent in that it has stable models for every total choice of
probabilistic facts. The proof of Theorem 1 can be reproduced in this setting,
and hence the credal semantics (the set of measures over stable models) of these
probabilistic answer set programs is again an infinitely monotone credal set. The
complexity of inference with these constructs is left for future investigation.</p>
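<p>The credal semantics itself can be illustrated by exhaustive enumeration: the lower probability of a query sums the probabilities of the total choices whose every stable model satisfies the query, while the upper probability sums over the choices with at least one such stable model. In the Python sketch below the stable models of each induced program are supplied by a caller-provided function (here assumed, and hard-coded in the usage example), and the plp is assumed consistent.</p>
<preformat>
```python
from itertools import product

def credal_bounds(prob_facts, stable_models_of, query):
    """Lower/upper probability of `query` under the credal semantics.
    `prob_facts` maps each probabilistic fact to its probability;
    `stable_models_of` maps a total choice (frozenset of chosen facts)
    to the stable models of the induced program (assumed nonempty)."""
    facts = sorted(prob_facts)
    lower = upper = 0.0
    for bits in product([False, True], repeat=len(facts)):
        choice = frozenset(f for f, b in zip(facts, bits) if b)
        weight = 1.0
        for f, b in zip(facts, bits):
            weight *= prob_facts[f] if b else 1.0 - prob_facts[f]
        models = stable_models_of(choice)
        if all(query in m for m in models):   # query holds in every stable model
            lower += weight
        if any(query in m for m in models):   # query holds in some stable model
            upper += weight
    return lower, upper

# Usage sketch for the plp:  0.5 :: a.   b :- a, not c.   c :- a, not b.
def example_models(choice):
    if "a" in choice:
        return [{"a", "b"}, {"a", "c"}]
    return [set()]
```
</preformat>
<p>For this plp, the query b gets lower probability 0 and upper probability 0.5: when a is chosen, b holds in one stable model but not in the other, exhibiting the interval behavior captured by the credal semantics.</p>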
    </sec>
    <sec id="sec-7">
      <title>Acknowledgements</title>
      <p>The first author is partially supported by CNPq, grant 308433/2014-9. The
second author received financial support from the São Paulo Research Foundation
(FAPESP), grant 2016/01055-1.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <given-names>K. R.</given-names>
            <surname>Apt</surname>
          </string-name>
          and
          <string-name>
            <given-names>M.</given-names>
            <surname>Bezem</surname>
          </string-name>
          .
          <article-title>Acyclic programs</article-title>
          .
          <source>New Generation Computing</source>
          ,
          <volume>9</volume>
          :
          <fpage>335</fpage>
          –
          <lpage>363</lpage>
          ,
          <year>1991</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <given-names>T.</given-names>
            <surname>Augustin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F. P. A.</given-names>
            <surname>Coolen</surname>
          </string-name>
          , G. de Cooman, and
          <string-name>
            <given-names>M. C. M.</given-names>
            <surname>Troffaes</surname>
          </string-name>
          .
          <source>Introduction to Imprecise Probabilities</source>
          . Wiley,
          <year>2014</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <given-names>A.</given-names>
            <surname>Bulatov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Dyer</surname>
          </string-name>
          , L. A. Goldberg,
          <string-name>
            <given-names>M.</given-names>
            <surname>Jalsenius</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Jerrum</surname>
          </string-name>
          , and
          <string-name>
            <given-names>D.</given-names>
            <surname>Richerby</surname>
          </string-name>
          .
          <article-title>The complexity of weighted and unweighted #CSP</article-title>
          .
          <source>J. of Computer and System Sciences</source>
          ,
          <volume>78</volume>
          :
          <fpage>681</fpage>
          –
          <lpage>688</lpage>
          ,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <given-names>A.</given-names>
            <surname>Calì</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Lukasiewicz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Predoiu</surname>
          </string-name>
          , and
          <string-name>
            <given-names>H.</given-names>
            <surname>Stuckenschmidt</surname>
          </string-name>
          .
          <article-title>Tightly coupled probabilistic description logic programs for the semantic web</article-title>
          .
          <source>In J. on Data Semantics XII</source>
          , pages
          <fpage>95</fpage>
          –
          <lpage>130</lpage>
          . Springer,
          <year>2009</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <given-names>F. G.</given-names>
            <surname>Cozman</surname>
          </string-name>
          and
          <string-name>
            <given-names>D. D.</given-names>
            <surname>Maua</surname>
          </string-name>
          .
          <article-title>Bayesian networks specified using propositional and relational constructs: Combined, data, and domain complexity</article-title>
          .
          <source>In AAAI Conf. on Artificial Intelligence</source>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <given-names>F. G.</given-names>
            <surname>Cozman</surname>
          </string-name>
          and
          <string-name>
            <given-names>D. D.</given-names>
            <surname>Maua</surname>
          </string-name>
          .
          <article-title>Probabilistic graphical models specified by probabilistic logic programs: semantics and complexity</article-title>
          .
          <source>In Int. Conf. on Probabilistic Graphical Models</source>
          ,
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <given-names>E.</given-names>
            <surname>Dantsin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Eiter</surname>
          </string-name>
          ,
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Voronkov</surname>
          </string-name>
          .
          <article-title>Complexity and expressive power of logic programming</article-title>
          .
          <source>ACM Computing Surveys</source>
          ,
          <volume>33</volume>
          (
          <issue>3</issue>
          ):
          <fpage>374</fpage>
          –
          <lpage>425</lpage>
          ,
          <year>2001</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <given-names>C. P.</given-names>
            <surname>de Campos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Stamoulis</surname>
          </string-name>
          , and
          <string-name>
            <given-names>D.</given-names>
            <surname>Weyland</surname>
          </string-name>
          .
          <article-title>A structured view on weighted counting with relations to quantum computation and applications</article-title>
          .
          <source>Technical Report 133, Electronic Colloquium on Computational Complexity</source>
          ,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9. L. de Raedt,
          <string-name>
            <given-names>P.</given-names>
            <surname>Frasconi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Kersting</surname>
          </string-name>
          , and
          <string-name>
            <given-names>S. H.</given-names>
            <surname>Muggleton</surname>
          </string-name>
          .
          <article-title>Probabilistic Inductive Logic Programming</article-title>
          . Springer,
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <given-names>A.</given-names>
            <surname>Durand</surname>
          </string-name>
          , M. Hermann, and
          <string-name>
            <given-names>P. G.</given-names>
            <surname>Kolaitis</surname>
          </string-name>
          .
          <article-title>Subtractive reductions and complete problems for counting complexity classes</article-title>
          .
          <source>Theoretical Computer Science</source>
          ,
          <volume>340</volume>
          (
          <issue>3</issue>
          ):
          <fpage>496</fpage>
          –
          <lpage>513</lpage>
          ,
          <year>2005</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <given-names>T.</given-names>
            <surname>Eiter</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Faber</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Fink</surname>
          </string-name>
          , and
          <string-name>
            <given-names>S.</given-names>
            <surname>Woltran</surname>
          </string-name>
          .
          <article-title>Complexity results for answer set programming with bounded predicate arities and implications</article-title>
          .
          <source>Annals of Mathematics and Artificial Intelligence</source>
          ,
          <volume>51</volume>
          :
          <fpage>123</fpage>
          –
          <lpage>165</lpage>
          ,
          <year>2007</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12. T. Eiter, G. Ianni, and
          <string-name>
            <given-names>T.</given-names>
            <surname>Krennwallner</surname>
          </string-name>
          .
          <article-title>Answer set programming: a primer</article-title>
          .
          <source>In Reasoning Web</source>
          , pages
          <fpage>40</fpage>
          –
          <lpage>110</lpage>
          . Springer-Verlag,
          <year>2009</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <given-names>D.</given-names>
            <surname>Fierens</surname>
          </string-name>
          , G. van den Broeck,
          <string-name>
            <given-names>J.</given-names>
            <surname>Renkens</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Shterionov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Gutmann</surname>
          </string-name>
          , G. Janssens, and L. de Raedt.
          <article-title>Inference and learning in probabilistic logic programs using weighted Boolean formulas</article-title>
          .
          <source>Theory and Practice of Logic Programming</source>
          ,
          <volume>15</volume>
          (
          <issue>3</issue>
          ):
          <fpage>358</fpage>
          –
          <lpage>401</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <given-names>M.</given-names>
            <surname>Gelfond</surname>
          </string-name>
          and
          <string-name>
            <given-names>V.</given-names>
            <surname>Lifschitz</surname>
          </string-name>
          .
          <article-title>The stable model semantics for logic programming</article-title>
          .
          <source>In Int. Logic Programming Conf. and Symp</source>
          ., pages
          <fpage>1070</fpage>
          –
          <lpage>1080</lpage>
          ,
          <year>1988</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <given-names>S.</given-names>
            <surname>Hadjichristodoulou</surname>
          </string-name>
          and
          <string-name>
            <given-names>D. S.</given-names>
            <surname>Warren</surname>
          </string-name>
          .
          <article-title>Probabilistic logic programming with well-founded negation</article-title>
          .
          <source>In Symp. on Multiple-Valued Logic</source>
          , pages
          <fpage>232</fpage>
          –
          <lpage>237</lpage>
          ,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <given-names>S. L.</given-names>
            <surname>Lauritzen</surname>
          </string-name>
          and
          <string-name>
            <given-names>D. J.</given-names>
            <surname>Spiegelhalter</surname>
          </string-name>
          .
          <article-title>Local computations with probabilities on graphical structures and their application to expert systems</article-title>
          .
          <source>J. Royal Statistical Society B</source>
          ,
          <volume>50</volume>
          (
          <issue>2</issue>
          ):
          <fpage>157</fpage>
          –
          <lpage>224</lpage>
          ,
          <year>1988</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17.
          <string-name>
            <given-names>T.</given-names>
            <surname>Lukasiewicz</surname>
          </string-name>
          .
          <article-title>Probabilistic logic programming</article-title>
          .
          <source>In European Conf. on Artificial Intelligence</source>
          , pages
          <fpage>388</fpage>
          –
          <lpage>392</lpage>
          ,
          <year>1998</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          18.
          <string-name>
            <given-names>T.</given-names>
            <surname>Lukasiewicz</surname>
          </string-name>
          .
          <article-title>Probabilistic description logic programs</article-title>
          .
          <source>In European Conf. on Symbolic and Quantitative Approaches to Reasoning with Uncertainty</source>
          , pages
          <fpage>737</fpage>
          –
          <lpage>749</lpage>
          ,
          Barcelona, Spain,
          <year>July 2005</year>
          . Springer.
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          19.
          <string-name>
            <given-names>T.</given-names>
            <surname>Lukasiewicz</surname>
          </string-name>
          .
          <article-title>Probabilistic description logic programs</article-title>
          .
          <source>Int. J. of Approximate Reasoning</source>
          ,
          <volume>45</volume>
          (
          <issue>2</issue>
          ):
          <fpage>288</fpage>
          –
          <lpage>307</lpage>
          ,
          <year>2007</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          20.
          <string-name>
            <given-names>S.</given-names>
            <surname>Michels</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Hommersom</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. J. F.</given-names>
            <surname>Lucas</surname>
          </string-name>
          , and
          <string-name>
            <given-names>M.</given-names>
            <surname>Velikova</surname>
          </string-name>
          .
          <article-title>A new probabilistic constraint logic programming language based on a generalised distribution semantics</article-title>
          .
          <source>Artificial Intelligence</source>
          ,
          <volume>228</volume>
          :
          <fpage>1</fpage>
          –
          <lpage>44</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          21. I. Molchanov.
          <source>Theory of Random Sets</source>
          . Springer,
          <year>2005</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          22.
          <string-name>
            <given-names>R.</given-names>
            <surname>Ng</surname>
          </string-name>
          and
          <string-name>
            <given-names>V. S.</given-names>
            <surname>Subrahmanian</surname>
          </string-name>
          .
          <article-title>Probabilistic logic programming</article-title>
          .
          <source>Information and Computation</source>
          ,
          <volume>101</volume>
          (
          <issue>2</issue>
          ):
          <fpage>150</fpage>
          –
          <lpage>201</lpage>
          ,
          <year>1992</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          23.
          <string-name>
            <given-names>L.</given-names>
            <surname>Ngo</surname>
          </string-name>
          and
          <string-name>
            <given-names>P.</given-names>
            <surname>Haddawy</surname>
          </string-name>
          .
          <article-title>Answering queries from context-sensitive probabilistic knowledge bases</article-title>
          .
          <source>Theoretical Computer Science</source>
          ,
          <volume>171</volume>
          (
          <issue>1–2</issue>
          ):
          <fpage>147</fpage>
          –
          <lpage>177</lpage>
          ,
          <year>1997</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          24.
          <string-name>
            <given-names>C. H.</given-names>
            <surname>Papadimitriou</surname>
          </string-name>
          . Computational Complexity. Addison-Wesley,
          <year>1994</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          25.
          <string-name>
            <given-names>D.</given-names>
            <surname>Poole</surname>
          </string-name>
          .
          <article-title>Probabilistic Horn abduction and Bayesian networks</article-title>
          .
          <source>Artificial Intelligence</source>
          ,
          <volume>64</volume>
          :
          <fpage>81</fpage>
          –
          <lpage>129</lpage>
          ,
          <year>1993</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          26.
          <string-name>
            <given-names>D.</given-names>
            <surname>Poole</surname>
          </string-name>
          .
          <article-title>The Independent Choice Logic and beyond</article-title>
          .
          <source>In Probabilistic Inductive Logic Programming</source>
          , pages
          <fpage>222</fpage>
          –
          <lpage>243</lpage>
          . Springer,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          27.
          <string-name>
            <given-names>F.</given-names>
            <surname>Riguzzi</surname>
          </string-name>
          .
          <article-title>The distribution semantics is well-defined for all normal programs</article-title>
          .
          <source>Int. Workshop on Probabilistic Logic Programming</source>
          , pages
          <fpage>69</fpage>
          –
          <lpage>84</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          28.
          <string-name>
            <given-names>T.</given-names>
            <surname>Sato</surname>
          </string-name>
          .
          <article-title>A statistical learning method for logic programs with distribution semantics</article-title>
          .
          <source>In Int. Conf. on Logic Programming</source>
          , pages
          <fpage>715</fpage>
          –
          <lpage>729</lpage>
          ,
          <year>1995</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          29.
          <string-name>
            <given-names>T.</given-names>
            <surname>Sato</surname>
          </string-name>
          and
          <string-name>
            <given-names>Y.</given-names>
            <surname>Kameya</surname>
          </string-name>
          .
          <article-title>Parameter learning of logic programs for symbolic-statistical modeling</article-title>
          .
          <source>J. of Artificial Intelligence Research</source>
          ,
          <volume>15</volume>
          :
          <fpage>391</fpage>
          –
          <lpage>454</lpage>
          ,
          <year>2001</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          30.
          <string-name>
            <given-names>T.</given-names>
            <surname>Sato</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Kameya</surname>
          </string-name>
          , and
          <string-name>
            <given-names>N.-Fa</given-names>
            <surname>Zhou</surname>
          </string-name>
          .
          <article-title>Generative modeling with failure in PRISM</article-title>
          .
          <source>In Int. Joint Conf. on Artificial Intelligence</source>
          , pages
          <fpage>847</fpage>
          –
          <lpage>852</lpage>
          ,
          <year>2005</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref31">
        <mixed-citation>
          31.
          <string-name>
            <given-names>G.</given-names>
            <surname>Shafer</surname>
          </string-name>
          .
          <source>A Mathematical Theory of Evidence</source>
          . Princeton University Press,
          <year>1976</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref32">
        <mixed-citation>
          32.
          <string-name>
            <given-names>L. G.</given-names>
            <surname>Valiant</surname>
          </string-name>
          .
          <article-title>The complexity of enumeration and reliability problems</article-title>
          .
          <source>SIAM J. of Computing</source>
          ,
          <volume>8</volume>
          (
          <issue>3</issue>
          ):
          <fpage>410</fpage>
          –
          <lpage>421</lpage>
          ,
          <year>1979</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref33">
        <mixed-citation>
          33.
          <string-name>
            <given-names>A.</given-names>
            <surname>van Gelder</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. A.</given-names>
            <surname>Ross</surname>
          </string-name>
          , and
          <string-name>
            <given-names>J. S.</given-names>
            <surname>Schlipf</surname>
          </string-name>
          .
          <article-title>The well-founded semantics for general logic programs</article-title>
          .
          <source>J. of the ACM</source>
          ,
          <volume>38</volume>
          (
          <issue>3</issue>
          ):
          <fpage>620</fpage>
          –
          <lpage>650</lpage>
          ,
          <year>1991</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref34">
        <mixed-citation>
          34.
          <string-name>
            <given-names>L.</given-names>
            <surname>Wasserman</surname>
          </string-name>
          and
          <string-name>
            <given-names>J. B.</given-names>
            <surname>Kadane</surname>
          </string-name>
          .
          <article-title>Computing bounds on expectations</article-title>
          .
          <source>J. of the American Statistical Association</source>
          ,
          <volume>87</volume>
          (
          <issue>418</issue>
          ):
          <fpage>516</fpage>
          –
          <lpage>522</lpage>
          ,
          <year>1992</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>