<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Revising Weighted Knowledge Bases Using FH-Conditioning</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Omar Et-Targuy</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Ahlame Begdouri</string-name>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Salem Benferhat</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Carole Delenne</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>CRIL, CNRS UMR 8188, University of Artois</institution>
          ,
          <country country="FR">France</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>HSM, University of Montpellier</institution>
          ,
          <addr-line>CNRS, IRD, Montpellier</addr-line>
          ,
          <country country="FR">France</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Inria, Team Lemon</institution>
          ,
          <addr-line>Montpellier</addr-line>
          ,
          <country country="FR">France</country>
        </aff>
        <aff id="aff3">
          <label>3</label>
          <institution>Université Sidi Mohamed Ben Abdellah (USMBA)</institution>
          ,
          <addr-line>Fez</addr-line>
          ,
          <country country="MA">Morocco</country>
        </aff>
      </contrib-group>
      <fpage>89</fpage>
      <lpage>92</lpage>
      <abstract>
        <p>Conditioning is an important task for updating and revising uncertain information when new information, often considered reliable, is added. This paper deals with the so-called Fagin and Halpern (FH-)conditioning within the framework of possibility theory. We discuss in particular the computation of FH-conditioning when it is applied to weighted knowledge bases. We also compare FH-conditioning with the two forms of standard possibilistic conditioning (min-based conditioning and product-based conditioning).</p>
      </abstract>
      <kwd-group>
        <kwd>Conditioning</kwd>
        <kwd>possibility theory</kwd>
        <kwd>weighted knowledge bases</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>Belief revision [<xref ref-type="bibr" rid="ref1">1</xref>] is a fundamental problem in knowledge representation. It consists in revising a set of beliefs of an agent in the light of new information, considered completely reliable. This problem has been widely studied in the literature, both from the rational postulates point of view and from a computational point of view. A large number of belief revision operators have been proposed, in particular within the framework of propositional logic (and its extensions).</p>
      <p>Within the frameworks of uncertainty theories, the process of belief revision is realized through the concept of conditioning. A large number of conditioning operators have been defined: Bayesian conditioning (in probability theory), Dempster’s rule of conditioning (in belief function theory [<xref ref-type="bibr" rid="ref2">2</xref>]), min-based and product-based possibilistic conditioning (in possibility theory [<xref ref-type="bibr" rid="ref3">3</xref>]), different forms of conditioning in ordinal conditional functions (OCF) [<xref ref-type="bibr" rid="ref4">4</xref>], etc. These “standard” conditioning modes have been extensively studied in the literature from a semantic point of view but also from a computational point of view, in particular for the propagation of the uncertainty of beliefs in the presence of new observations.</p>
      <p>This paper focuses on Fagin and Halpern conditioning (denoted by FH-conditioning), initially defined within the framework of belief function theory in [5]. This conditioning was proposed in order to have a better characterization of belief functions in terms of particular families of probability distributions (see [5] for more details). We are interested in the study of FH-conditioning within the framework of possibility theory, a particular framework of belief function theory. The possibilistic counterpart of FH-conditioning has so far been discussed only from a semantic point of view [6]. This paper is interested in the revision of weighted belief bases, which is in full agreement with the possibilistic FH-conditioning. A weighted belief base is represented by a set of pairs (φ, α) where φ is a propositional logic formula and α is a degree of certainty (a degree of necessity) attached to the formula φ.</p>
      <p>The rest of the paper is organized as follows. We first recall the basic elements of possibility theory. Next, we summarize the syntactic computation of the FH-conditioning of weighted knowledge bases that we recently developed in [7]. Section 4 briefly positions the computation of possibilistic FH-conditioning in relation to the two standard forms of possibilistic conditioning (min-based and product-based possibilistic conditioning). Section 5 concludes the paper.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Weighted Knowledge Bases and</title>
    </sec>
    <sec id="sec-3">
      <title>Possibility Distributions</title>
      <p>We place ourselves within the framework of propositional logic. We will denote by ℒ the set of propositional logic formulas and by Ω the set of interpretations. A possibility distribution π is a mapping from the set of propositional logic interpretations Ω to the unit interval [0, 1]. π(ω) represents the degree of compatibility or consistency of the interpretation ω with respect to the available knowledge. A possibility distribution is said to be normalized if there exists an interpretation ω which is fully possible (i.e., π(ω) = 1).</p>
      <p>Given a possibility distribution π, we can define two measures over the set of formulas:
• The degree of consistency (or possibility): Π(φ) = max{π(ω) : ω ⊨ φ}, which evaluates to what extent the propositional logic formula φ is consistent with the available knowledge expressed by π.
• The degree of certainty (or of necessity): N(φ) = 1 − Π(¬φ), which measures to what extent the propositional logic formula φ is entailed by the knowledge expressed by π.</p>
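      <p>As a small illustration (not part of the original paper), these two measures can be computed by enumeration when the possibility distribution is given in extension. The following Python sketch assumes interpretations are encoded as dictionaries of truth values and formulas as Boolean predicates over them; the variable names are purely illustrative.</p>
      <preformat>
from itertools import product

# Interpretations over two propositional variables (illustrative encoding).
VARS = ["p", "q"]
OMEGA = [dict(zip(VARS, vals)) for vals in product([False, True], repeat=len(VARS))]

def possibility(pi, formula):
    """Pi(formula) = max of pi(w) over the models w of formula (0 if no model)."""
    return max((pi(w) for w in OMEGA if formula(w)), default=0.0)

def necessity(pi, formula):
    """N(formula) = 1 - Pi(not formula)."""
    return 1.0 - possibility(pi, lambda w: not formula(w))

# Example: a normalized distribution and the formula "p implies q".
pi = lambda w: 1.0 if w["q"] else 0.6
p_implies_q = lambda w: (not w["p"]) or w["q"]
print(possibility(pi, p_implies_q), necessity(pi, p_implies_q))  # 1.0 0.4
      </preformat>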
      <p>A possibilistic weighted knowledge base is a finite set of weighted formulas, denoted as Σ = {(φ_i, α_i), i = 1, ..., n}, where α_i ∈ ]0, 1] serves as the weight assigned to each formula φ_i. This weight is treated as a lower bound for the degree of necessity N(φ_i).</p>
      <p>Each possibilistic weighted knowledge base Σ induces a unique possibility distribution [8], denoted by π_Σ, defined by:
π_Σ(ω) = 1 if ∀(φ_i, α_i) ∈ Σ, ω ⊨ φ_i; π_Σ(ω) = 1 − max{α_i : (φ_i, α_i) ∈ Σ, ω ⊭ φ_i} otherwise.   (1)</p>
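      <p>As an illustration of Equation 1 (again not taken from the paper), the induced distribution can be obtained by scanning, for each interpretation, the weights of the formulas it falsifies. The base below is a hypothetical two-formula example using the same encoding as the previous sketch.</p>
      <preformat>
from itertools import product

VARS = ["p", "q"]
OMEGA = [dict(zip(VARS, vals)) for vals in product([False, True], repeat=len(VARS))]

# Hypothetical weighted base: {("p implies q", 0.7), ("p", 0.4)}.
SIGMA = [
    (lambda w: (not w["p"]) or w["q"], 0.7),
    (lambda w: w["p"], 0.4),
]

def pi_sigma(base, w):
    """Equation 1: 1 if w satisfies every formula of the base,
    otherwise 1 minus the largest weight among the falsified formulas."""
    falsified = [a for (f, a) in base if not f(w)]
    return 1.0 if not falsified else 1.0 - max(falsified)

for w in OMEGA:
    print(w, pi_sigma(SIGMA, w))
      </preformat>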
    </sec>
    <sec id="sec-4">
      <title>3. Syntactic Computation of</title>
    </sec>
    <sec id="sec-5">
      <title>Possibilistic FH-Condiotioning</title>
      <p>At the semantic level, possibilistic conditioning consists in transforming an a priori possibility distribution π and a piece of fully certain information, represented here by a propositional logic formula φ, into a new (a posteriori) possibility distribution denoted by π(· | φ).</p>
      <p>
        Several methods exist to define π(· | φ) (as discussed in [9]). The two major definitions of possibilistic conditioning are [10]:
• Min-based conditioning:
π(ω | φ) = 1 if π(ω) = Π(φ) and ω ⊨ φ; π(ω) if π(ω) &lt; Π(φ) and ω ⊨ φ; 0 otherwise.   (2)
• Product-based conditioning (also known as Dempster rule of conditioning [<xref ref-type="bibr" rid="ref2">2</xref>]), where we assume that Π(φ) &gt; 0:
π(ω | φ) = π(ω) / Π(φ) if ω ⊨ φ; 0 otherwise.   (3)
      </p>
      <p>Possibility measures are a special case of plausibility functions (i.e., they can be represented by consonant belief functions, where elements with positive mass are nested); the FH-conditioning was therefore adapted to the possibility theory framework as follows [6]:
Π(ψ | φ) = Π(ψ ∧ φ) / (Π(ψ ∧ φ) + N(¬ψ ∧ φ)),   (4)
where N(φ) = 1 − Π(¬φ).</p>
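      <p>A minimal Python sketch of the two standard operators recalled above (illustrative only; it reuses the encoding of interpretations as dictionaries and formulas as predicates from the previous sketches):</p>
      <preformat>
def condition_min(pi, phi, interps):
    """Min-based conditioning (Equation 2): models of phi whose degree equals
    Pi(phi) are raised to 1, the other models of phi keep their degree,
    and countermodels of phi drop to 0."""
    pi_phi = max((pi(w) for w in interps if phi(w)), default=0.0)
    def post(w):
        if not phi(w):
            return 0.0
        return 1.0 if pi(w) == pi_phi else pi(w)
    return post

def condition_product(pi, phi, interps):
    """Product-based conditioning (Equation 3), assuming Pi(phi) is positive:
    the degrees of the models of phi are rescaled by Pi(phi)."""
    pi_phi = max((pi(w) for w in interps if phi(w)), default=0.0)
    def post(w):
        return pi(w) / pi_phi if phi(w) else 0.0
    return post
      </preformat>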
      <p>For justifications of FH-conditioning and a discussion of its various interpretations, see for example [11, 12, 13].</p>
      <p>When we restrict the definition of Π(· | φ) (Equation 4) to interpretations, we get the definition of FH-conditioning on possibility distributions:
π(ω | φ) = max(π(ω), π(ω) / (π(ω) + N(φ))) if ω ⊨ φ; 0 if ω ⊭ φ.   (5)</p>
      <p>The interesting question is how to compute the FH-conditioning, in an equivalent way, from the weighted knowledge bases. More specifically, given the initial weighted knowledge base Σ and the fully certain information (φ, 1), how to compute a new weighted knowledge base, denoted by Σ_FH, such that:
∀ω ∈ Ω, π_{Σ_FH}(ω) = π_Σ(ω | φ),   (6)
where π_{Σ_FH} (resp. π_Σ) is the possibility distribution associated with Σ_FH (resp. Σ) as defined by Equation 1.</p>
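      <p>The right-hand side of Equation 6 is the semantic FH-conditioning of Equation 5, which can be computed directly when the distribution is given in extension. A minimal sketch, with the same illustrative encoding as above:</p>
      <preformat>
def condition_fh(pi, phi, interps):
    """FH-conditioning (Equation 5): a model w of phi gets
    max(pi(w), pi(w) / (pi(w) + N(phi))); countermodels of phi get 0."""
    pi_not_phi = max((pi(w) for w in interps if not phi(w)), default=0.0)
    n_phi = 1.0 - pi_not_phi
    def post(w):
        if not phi(w):
            return 0.0
        if pi(w) == 0.0:
            return 0.0  # a priori impossible interpretations stay impossible
        return max(pi(w), pi(w) / (pi(w) + n_phi))
    return post
      </preformat>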
      <p>In [7], a positive answer to this question was obtained. To compute the knowledge base Σ_FH, a reformulation of the semantic definition of FH-conditioning as a sequence of three transformation operations on possibility distributions was first proposed. For each of these semantic transformation operations, an equivalent characterization on the weighted belief bases has been defined. At the end of the third operation, the following final weighted knowledge base is obtained (see [7] for more details):
Σ_FH = {(φ, 1)} ∪ Σ_2.   (7)</p>
      <p>Example 1. Let Σ = {(φ_1, 0.72), (φ_2, 0.65), (φ_3, 0.03), (φ_4, 0.41)} be a weighted knowledge base, and assume that the new piece of information is a propositional formula φ. The interpretation that satisfies all the formulas of Σ (the only one consistent with Σ) is the most preferred one; hence its possibility degree given by π_Σ is 1. An interpretation that only falsifies the least certain belief in Σ, namely (φ_3, 0.03), gets the possibility degree π_Σ = 1 − 0.03 = 0.97.</p>
      <p>At the syntactic level, using Equation 7, we get:
Σ_FH = {(φ, 1)} ∪ {(φ_1, 0.594), (φ_2, 0.540), (φ_3, 0.03), (φ_4, 0.41)}.</p>
      <p>Finally, one can check that computing the possibility distribution π_{Σ_FH} associated with the weighted knowledge base Σ_FH, using Equation 1, gives exactly the same distribution as the one obtained by applying the semantic FH-conditioning π_Σ(· | φ) of Equation 5.</p>
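      <p>This agreement can be checked mechanically for small examples: build the two induced distributions with Equation 1, apply the semantic FH-conditioning of Equation 5 to the first one, and compare interpretation by interpretation (Equation 6). A hedged sketch, assuming the helper functions pi_sigma and condition_fh defined in the previous snippets (this check is an illustration, not the transformation algorithm of [7]):</p>
      <preformat>
def fh_revision_agrees(sigma, sigma_fh, phi, interps, tol=1e-9):
    """Check Equation 6: for every interpretation w, the distribution induced by
    sigma_fh (Equation 1) equals the FH-conditioning of the distribution induced
    by sigma (Equation 5). pi_sigma and condition_fh are as sketched above."""
    prior = lambda w: pi_sigma(sigma, w)
    posterior = condition_fh(prior, phi, interps)
    revised = lambda w: pi_sigma(sigma_fh, w)
    return not any(abs(revised(w) - posterior(w)) > tol for w in interps)
      </preformat>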
    </sec>
    <sec id="sec-6">
      <title>Min-based and Product-based</title>
    </sec>
    <sec id="sec-7">
      <title>Conditioning vs</title>
    </sec>
    <sec id="sec-8">
      <title>FH-conditioning</title>
      <p>This section compares FH-conditioning with the two standard forms of possibilistic conditioning: min-based conditioning and product-based conditioning. At the semantic level, FH-conditioning shares the following four properties with min-based and product-based conditioning (where |◇ stands for a possibilistic conditioning operator):
• π(· |◇ φ) is normalized (or consistent).
• ∀ω ∈ Ω, if ω ⊭ φ then π(ω |◇ φ) = 0. This property confirms that the new information φ is completely certain and therefore any countermodel of φ is considered impossible after the conditioning operation.
• ∀ω ∈ Ω, ∀ω′ ∈ Ω such that ω ⊨ φ and ω′ ⊨ φ, we have: π(ω) &gt; π(ω′) if π(ω |◇ φ) &gt; π(ω′ |◇ φ). This property means that the conditioning does not alter the relative order between the models of the new information φ.
• ∀ω ∈ Ω, if π(ω) = 0 then π(ω |◇ φ) = 0. This property means that a priori impossible interpretations remain impossible after conditioning.</p>
      <p>The three possibilistic conditioning operators (min-based conditioning, product-based conditioning and FH-conditioning) satisfy the above four properties.</p>
      <p>There remains however the following property, which is satisfied by the two standard possibilistic conditionings (min-based and product-based conditioning) but which is not satisfied by FH-conditioning:
• if N(φ) &gt; 0 then ∀ω ∈ Ω such that ω ⊨ φ, we have: π(ω) = π(ω |◇ φ).
This property means that if φ is a priori accepted (expressed by N(φ) &gt; 0 or by Π(φ) &gt; Π(¬φ)), then the degrees of possibility of the models of φ remain unchanged. This property is not satisfied by possibilistic FH-conditioning, where the possibility degree of a model of φ can be modified, depending on the a priori degree of belief in φ. The sketch below gives a small numerical illustration.</p>
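      <p>A small numerical illustration (the values are chosen for this note and are not taken from the paper): φ is a priori accepted with N(φ) = 0.4, and two models of φ have a priori degrees 1.0 and 0.3. Min-based and product-based conditioning leave both degrees unchanged, while FH-conditioning raises the second one.</p>
      <preformat>
# Two models of phi with their a priori possibility degrees (hypothetical values).
pi_models = {"w1": 1.0, "w2": 0.3}
pi_phi = max(pi_models.values())   # Pi(phi) = 1.0, so phi is accepted
n_phi = 0.4                        # assumed necessity degree of phi

for w, p in pi_models.items():
    post_min = 1.0 if p == pi_phi else p    # min-based: unchanged here
    post_prod = p / pi_phi                  # product-based: unchanged here
    post_fh = max(p, p / (p + n_phi))       # FH-conditioning
    print(w, post_min, post_prod, round(post_fh, 3))
# w1: 1.0 1.0 1.0 ; w2: 0.3 0.3 0.429
      </preformat>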
      <p>Moreover, it is easy to see in Equation 7 that when N(φ) = 0, the revised base is simply equal to the new information {(φ, 1)}. This behavior is different from that of min-based conditioning and product-based conditioning, which retain part of the initial information even if the new information was not initially accepted.</p>
      <p>At the syntactic level, the computational complexity of performing the FH-conditioning of a weighted knowledge base is the same as that of the standard possibilistic conditionings (min-based and product-based possibilistic conditioning) given in [14]. For the three possibilistic conditioning operators, the spatial complexity is linear with respect to the size of the initial base Σ. For min-based and product-based possibilistic conditioning, the most difficult task concerns the computation of the inconsistency degree of a weighted knowledge base, while the difficult task when performing FH-conditioning is to compute the necessity degree N(φ) from the initial weighted knowledge base. These two tasks (computing the degree of necessity of φ or computing the degree of inconsistency of a weighted knowledge base) have the same level of computational time complexity. Both need log2(k) calls to the propositional logic satisfiability test, where k is the number of different degrees in the weighted knowledge base Σ.</p>
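      <p>To illustrate the log2(k) bound, the sketch below (an illustration under the stated assumptions, not the algorithm of [14]) computes N_Σ(φ) by binary search over the distinct weights of Σ, using the standard possibilistic-logic fact that N_Σ(φ) is at least α exactly when the classical cut of Σ at level α (the formulas of weight at least α) entails φ. Each probe of the search is one entailment test, i.e., one propositional satisfiability call; here a brute-force check over the interpretations stands in for a SAT solver.</p>
      <preformat>
def necessity_from_base(sigma, phi, interps):
    """Compute N_Sigma(phi) with a binary search over the distinct weights of sigma.
    entails(alpha) tests whether the cut of sigma at level alpha entails phi;
    if a cut entails phi, then every cut at a smaller level (a superset) does too,
    so the levels where entailment holds form a prefix of the sorted weights."""
    def entails(alpha):
        cut = [f for (f, a) in sigma if a >= alpha]
        # One satisfiability call in practice: is cut together with "not phi" unsatisfiable?
        return not any(all(f(w) for f in cut) and not phi(w) for w in interps)

    levels = sorted({a for (_, a) in sigma} | {1.0})  # k distinct degrees (plus 1.0)
    if not entails(levels[0]):
        return 0.0
    lo, hi = 0, len(levels) - 1      # invariant: entails(levels[lo]) holds
    while lo != hi:                  # about log2(k) probes
        mid = (lo + hi + 1) // 2
        if entails(levels[mid]):
            lo = mid
        else:
            hi = mid - 1
    return levels[lo]
      </preformat>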
    </sec>
    <sec id="sec-7">
      <title>5. Conclusions</title>
      <p>In this paper, we presented the computation of FH-conditioning when it is defined on weighted knowledge bases. This syntactic computation is in full agreement with the semantics of FH-conditioning defined at the level of possibility distributions.</p>
      <p>Possibilistic FH-conditioning shares several properties with the two standard forms of possibilistic conditioning. They also differ on the revision to adopt depending on whether the new information is already accepted or not. The combination of product-based conditioning with FH-conditioning, to take into account the a priori status of the new information, will be studied in future work. We also plan to apply different forms of conditioning for the revision of geographic information systems associated with wastewater networks.</p>
    </sec>
    <sec id="sec-8">
      <title>6. Acknowledgments</title>
      <p>This research has received support from the European Union’s Horizon research and innovation programme under the MSCA-SE (Marie Skłodowska-Curie Actions Staff Exchange) grant agreement 101086252; Call: HORIZON-MSCA-2021-SE-01; Project title: STARWARS (STormwAteR and WastewAteR networkS heterogeneous data AI-driven management).</p>
      <p>This research has also received support from the French national project ANR (Agence Nationale de la Recherche) CROQUIS (Collecte, représentation, complétion, fusion et interrogation de données hétérogènes et incertaines de réseaux d’eaux urbains).</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>P.</given-names>
            <surname>Gärdenfors</surname>
          </string-name>
          , Belief Revision, Cambridge Tracts in Theoretical Computer Science, Cambridge University Press,
          <year>1992</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>G.</given-names>
            <surname>Shafer</surname>
          </string-name>
          ,
          <source>A Mathematical theory of evidence</source>
          , Princeton University Press, NJ, USA,
          <year>1976</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>D.</given-names>
            <surname>Dubois</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Prade</surname>
          </string-name>
          ,
          <article-title>Possibility theory: qualitative and quantitative aspects</article-title>
          ,
          <source>in: Quantified representation of uncertainty and imprecision</source>
          , Springer,
          <year>1998</year>
          , pp.
          <fpage>169</fpage>
          -
          <lpage>226</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>W.</given-names>
            <surname>Spohn</surname>
          </string-name>
          ,
          <article-title>Ordinal conditional functions: A dynamic theory of epistemic states, in: Causation in decision, belief change</article-title>
          ,
          <source>and statistics: Proceedings of the Irvine Conference on Probability and Causation</source>
          , Springer,
          <year>1988</year>
          , pp.
          <fpage>105</fpage>
          -
          <lpage>134</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[5] R. Fagin, J. Y. Halpern, A new approach to updating beliefs, in: P. P. Bonissone, M. Henrion, L. N. Kanal, J. F. Lemmer (Eds.), UAI ’90: Proceedings of the Sixth Annual Conference on Uncertainty in Artificial Intelligence, MIT, Cambridge, MA, USA, July 27-29, 1990, Elsevier, 1990, pp. 347–374.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>[6] D. Dubois, H. Prade, Bayesian conditioning in possibility theory, Fuzzy Sets and Systems 92 (1997) 223–240.</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>[7] O. Ettarguy, A. Begdouri, S. Benferhat, C. Delenne, Syntactic computation of Fagin-Halpern conditioning in possibility theory, in: R. Piskac, A. Voronkov (Eds.), LPAR 2023: Proceedings of 24th International Conference on Logic for Programming, Artificial Intelligence and Reasoning, Manizales, Colombia, 4-9th June 2023, volume 94 of EPiC Series in Computing, 2023, pp. 164–180.</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] D. Dubois, J. Lang, H. Prade, Handbook of Logic in Artificial Intelligence and Logic Programming, volume 3: Nonmonotonic Reasoning and Uncertain Reasoning, chapter Possibilistic logic, 1994.</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[9] B. Bouchon-Meunier, G. Coletti, C. Marsala, Independence and possibilistic conditioning, Ann. Math. Artif. Intell. 35 (2002) 107–123.</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>[10] D. Dubois, H. Prade, Possibilistic logic: a retrospective and prospective view, Fuzzy Sets and Systems 144 (2004) 3–23. Possibilistic Logic and Related Issues.</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>[11] R. Fagin, J. Y. Halpern, A new approach to updating beliefs, arXiv preprint arXiv:1304.1119 (2013).</mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>[12] T. Denoeux, D. Dubois, H. Prade, Representations of Uncertainty in AI: Beyond Probability and Possibility, Springer International Publishing, Cham, 2020, pp. 119–150.</mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>[13] J. Dezert, A. Tchamova, D. Han, Total belief theorem and conditional belief functions, Int. J. Intell. Syst. 33 (2018) 2314–2340.</mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>[14] S. Benferhat, D. Dubois, H. Prade, M. Williams, A practical approach to revising prioritized knowledge bases, Studia Logica 70 (2002) 105–130.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>