<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <contrib-group>
        <contrib contrib-type="author">
<string-name>Estevão F. Aguiar</string-name>
          <email>estevaofaguiar@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Marcelo Ladeira</string-name>
          <email>mladeira@unb.br</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Rommel N. Carvalho</string-name>
          <email>rommel.carvalho@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Shou Matsumoto</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
<institution>Department of Computer Science, University of Brasília, Campus Universitário Darcy Ribeiro, Brasília</institution>
          ,
          <addr-line>Distrito Federal</addr-line>
          ,
          <country country="BR">Brazil</country>
        </aff>
      </contrib-group>
      <abstract>
<p>This paper focuses on incorporating the Markov Logic Network (MLN) formalism as a plug-in for UnBBayes, a Java framework for probabilistic reasoning based on graphical models. MLN is a formalism for probabilistic reasoning that combines the capacity of Markov Networks (MN) to deal with uncertainty, tolerating imperfect and contradictory knowledge, with the expressiveness of First-Order Logic. An MLN provides a compact language for specifying very large MNs and the ability to incorporate, in modular form, large domains of knowledge (expressed as First-Order Logic sentences). A Graphical User Interface for the software Tuffy, a Java open source MLN engine, was implemented in UnBBayes to facilitate the creation and editing of MLN models and inference with them.</p>
      </abstract>
      <kwd-group>
        <kwd>Markov Logic Network</kwd>
        <kwd>MLN</kwd>
<kwd>Tuffy</kwd>
        <kwd>UnBBayes</kwd>
        <kwd>Markov Network</kwd>
        <kwd>probabilistic reasoning</kwd>
        <kwd>probabilistic graphical models</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
<title>Introduction</title>
      <p>
        In the past decade, several languages have been proposed to deal with complex
knowledge representation problems that also need to deal with uncertainty. A
frequent approach is to combine both logic and probabilistic formalisms resulting
in a powerful model for knowledge representation and treatment of uncertainty.
Some examples of these approaches have been built and are continually
improved, such as Markov Logic Networks (MLN) [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ], Multi-Entity Bayesian Networks
(MEBN) [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], Probabilistic Relational Models (PRM) [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], Relational Markov
Networks (RMN) [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], and Structural Logistic Regression (SLR) [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ].
      </p>
      <p>
Markov Logic Network (MLN) is a principled formalism that combines
First-Order Logic (FOL) with Markov networks (MN). An MLN is, basically,
a first-order knowledge base in which a weight is assigned to each formula. The
weight of a formula indicates how strong that formula is as a constraint. Together
with a finite set of constants, an MLN can be grounded into a Markov network.
In this way, an MLN can be seen as a template for building Markov networks [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
      </p>
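      <p>As an illustration (our own example, in the Alchemy/Tuffy-style syntax, using the classic smoking domain from [1]), an MLN file lists predicate declarations followed by weighted first-order formulas:</p>

```
// Predicate declarations
Smokes(person)
Cancer(person)
Friends(person, person)

// Weighted formulas: the larger the weight,
// the stronger the formula acts as a constraint
1.5  Smokes(x) => Cancer(x)
1.1  Friends(x, y) => (Smokes(x) <=> Smokes(y))
```

      <p>Grounding these formulas with a set of constants, e.g. {Anna, Bob}, yields a Markov network with one node per ground predicate and one feature per ground formula.</p>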
      <p>
        There are a few implementations for MLN like Alchemy [
        <xref ref-type="bibr" rid="ref1">1</xref>
] in C++, Tuffy
[
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] in Java, ProbCog [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ] in both Python and Java, and Markov TheBeast [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ] in
Java. Some of them provide no graphical user interface (GUI), and the one that
does offers only a quite simple interface with little real ease of use. In general,
since these tools are not very user-friendly, they are hard to use for people
without previous experience with their programming tasks and command lines.
      </p>
      <p>
This paper presents an implementation of a Java tool that consists of a GUI to
facilitate the tasks of creating, editing, and making inferences with MLN models.
This tool was developed at the University of Brasília (UnB) and uses the software
Tuffy as a library. Its current features include GUIs for modeling terms of a
knowledge base in a tree structure and for searching them, in order to help the
user find terms easily in large models. Moreover, it is also possible to edit and to
persist these structures as a standard MLN file (compatible with both Tuffy and
Alchemy). Besides that, every parameter that can be set on Tuffy can be easily
set in the GUI. It even supports adding to the GUI new parameters that
might be present in future versions of Tuffy, using only a configuration file. This
tool was implemented as a plug-in for UnBBayes, a Java open source software
developed at UnB that is capable of modeling, making inferences with, and learning
probabilistic networks [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ].
      </p>
<p>This paper is structured as follows. Section 2 presents MLNs. Section 3
overviews some implementations of MLN and presents the major reasons for
choosing Tuffy as the application programming interface (API) behind this
plug-in. Section 4 introduces the GUI developed as a plug-in for UnBBayes.</p>
    </sec>
    <sec id="sec-2">
      <title>Markov Logic Networks</title>
      <p>
A knowledge base of First-Order Logic (FOL) can be viewed as a set of
constraints on possible worlds. Each formula has an associated weight that reflects
how strong that formula is as a constraint [
        <xref ref-type="bibr" rid="ref1">1</xref>
]. An MLN is a set of FOL
formulas, each assigned a real-valued weight. Together with a finite set
of constants, it defines a Markov Network (MN). Each state of the generated MN
represents a possible world of the generic MLN representation. A possible world
determines the truth value of each ground (i.e. instantiated) predicate. Thus,
an MLN can be seen as a template for constructing MNs. Given different
sets of constants, it will produce different MNs with different values and sizes;
however, they have the same parameters and regularities in structure: different
instantiations of a formula still have the same weight. So, in MLN it is possible to use
inference methods generally used for MNs, since the network used is a grounded
one. However, because the grounded network is usually large
and complex, using these methods can be infeasible. Therefore, approximate
and lifted inference algorithms have been proposed [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
      </p>
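      <p>Concretely, the probability distribution that the grounded MN defines over possible worlds x is (following [1]; the formula is not stated explicitly in this paper):</p>

```latex
P(X = x) = \frac{1}{Z} \exp\!\Big( \sum_{i} w_i \, n_i(x) \Big)
```

      <p>where w_i is the weight of formula i, n_i(x) is the number of true groundings of formula i in world x, and Z is the normalizing partition function.</p>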
<p>Maximum a Posteriori (MAP) inference (i.e. finding the most likely state of
the world consistent with some evidence) and marginal inference (i.e. computing
arbitrary conditional probabilities) are common approaches to making inferences
in MLN. Learning algorithms are used to build, from historical data, models that
represent the problem to be treated. For this formalism, learning methods are used
to construct or refine an MLN. Two types of learning are specified: weight learning
(i.e. trying to find the weights of the specified formulas that maximize
the likelihood or conditional likelihood of a relational database) and the harder
technique of structure learning (i.e. trying to learn the formulas themselves).</p>
      <p>
        More details on MLN can be found in [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] and will not be covered in this
paper.
      </p>
    </sec>
    <sec id="sec-3">
      <title>The choice of an implementation</title>
      <p>
With the intent of building a GUI for MLN, the first step was to implement or
find an existing implementation of the formalism. So, the pros and cons of some
implementations had to be analyzed. If no implementation were compatible with
UnBBayes, it would be necessary to create a new one. Fortunately, that was not
the case. The pros and cons of the most common implementations are presented
below. As our goal was to build a plug-in for UnBBayes, the programming
language weighed more than the features available in each tool. UnBBayes [
        <xref ref-type="bibr" rid="ref4">4</xref>
]
is an open source application developed in Java that provides a framework for
building and reasoning with probabilistic graphical models. Since version 1.5.1,
it works with the Java Plugin Framework (JPF). JPF allows the construction of
scalable projects, loading plug-ins at runtime. The MLN GUI has been built as
a plug-in for UnBBayes. The software analyzed were Alchemy [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ], ProbCog [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ],
TheBeast [
        <xref ref-type="bibr" rid="ref9">9</xref>
] and Tuffy [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ].
      </p>
      <sec id="sec-3-1">
        <title>Alchemy</title>
<p>Alchemy is the reference for other implementations of MLN and the most
complete of them. It covers MAP inference, marginal inference, weight learning,
structure learning, and other features related to each of these topics. Alchemy
is an open source software developed in C/C++. It has no GUI and
works only in Linux or a Linux shell emulator. Alchemy was the first option to
extend, but its programming language is not easily integrated with Java.</p>
      </sec>
      <sec id="sec-3-2">
        <title>TheBeast</title>
        <p>
          TheBeast [
          <xref ref-type="bibr" rid="ref8">8</xref>
] is an open source Statistical Relational Learning software
based on MLN. Although it is developed in Java, it has little
documentation and does not work similarly to Alchemy, which would make it
harder to work with. TheBeast has no GUI implemented either.
        </p>
      </sec>
      <sec id="sec-3-3">
        <title>ProbCog</title>
<p>ProbCog is an open source software for Statistical Relational Learning that
supports learning and inference for relational domains. Merged into ProbCog is
PyMLN, a toolbox and software library for learning and reasoning with MLN.
It has a GUI for MLN, but it seemingly only exposes the files needed for inference
and the main parameters, making them easier to select, and nothing beyond the
basics. Most of the code of ProbCog is developed in Java, although its MLN tool
is developed in Python.</p>
        <sec id="sec-3-4">
          <title>Tuffy</title>
          <p>Tuffy is an open source Markov Logic Network engine. It is developed in Java
and makes use of a PostgreSQL database. Tuffy is currently at version 0.3 and is capable
of MRF partitioning, MAP inference, marginal inference, and weight learning.
Since Tuffy has many features similar to Alchemy's, as well as the same structure
for input files, has no GUI, and is implemented in the same programming
language as UnBBayes, it ended up being the most suitable MLN implementation
to use in the MLN GUI plug-in.</p>
        </sec>
        <sec id="sec-gui">
          <title>The GUI for MLN</title>
          <p>There are several helpful, easy-to-use GUI tools for Bayesian networks. However,
this is not yet true for MLN. For most MLN tools, the only way to work is
to set command-line parameters and then enter commands through a console.
Users must often memorize many commands to perform a task quickly, when
they could instead press buttons and choose options with a few clicks
in an easier-to-use GUI. Creating a GUI to simplify this process
of designing and using MLNs was the main motivation of this research. The
following paragraphs describe the main features of the proposed GUI for MLN.</p>
        </sec>
<p>This GUI for MLN in UnBBayes was built as a JPF plug-in.
The plug-in structure provides a way to run a new project inside the running
environment of UnBBayes. The binding between the new plug-in and the core of
UnBBayes happens in a way that requires no changes to the core structure.</p>
        <p>
Basically, building new plug-in implementations for UnBBayes is quite
simple, since a stub implementation is available in [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ].
        </p>
<p>Figure 1 presents the GUI divided into numbered parts. Each part is described
below.</p>
<p>The Tuffy input files are: an MLN file (.mln), an evidence file (.db), and a query
file (.db). The last can be replaced by passing its content through the command line.
Figure 1 Part 1 shows the option to load these three files and the option
to send the query predicates through a text field.</p>
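      <p>For reference, a typical Tuffy command-line run over these three files looks like the following (flag names as in the Tuffy documentation; file paths are illustrative). This is the kind of invocation that Part 1 of the GUI spares the user from typing:</p>

```
# MAP inference with an MLN program, an evidence file, and a query file
java -jar tuffy.jar -i prog.mln -e evidence.db -queryFile query.db -r out.txt

# Marginal inference instead of MAP
java -jar tuffy.jar -marginal -i prog.mln -e evidence.db -queryFile query.db -r out.txt
```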
<p>When the MLN file and the evidence file are loaded, their terms (i.e.
predicates, weighted formulas, and evidence) are separated and organized in a tree
structure, as shown in Figure 1 Part 5. This tree structure greatly improves the
visualization of, and distinction between, the structures in the MLN.</p>
<p>Figure 1 Part 2 presents a very useful search tool. It dynamically searches for
predicates, formulas, and evidence that match the input string. This feature
is useful when working with very large MLNs.</p>
<p>The GUI also provides a way to add and remove predicates, formulas, and
evidence, as shown in Figure 1 Part 3. Terms can be entered directly
under the correct classification. Deletion is made from a drop-down
list that presents all the existing terms to the user. Every change made through
this feature is persisted in the original file. This feature makes it easier for the
user to include or remove terms in an MLN model.</p>
<p>Figure 1 Part 4 allows the user to choose which inference method to use and
provides the button that triggers the inference process, which is executed by Tuffy in
the background. Tuffy is embedded in UnBBayes and used as a library through
its API.</p>
<p>Figure 1 Part 7 is displayed when the "inference" tab is chosen. It presents
the output in a text area, in the same way that it is presented in the output file by
Tuffy.</p>
<p>Figure 1 Part 6 presents the parameters of Tuffy in a way that is easy to set and
save. The parameters of Tuffy were parameterized by the type they represent
(e.g. integer, float, boolean, and string). This allows the parameters to be loaded
into the interface from a configuration file, so new parameters added in future
versions of Tuffy can easily be incorporated into UnBBayes without changing
any programming code. The dynamic values of the parameters are defined
in another configuration file.</p>
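      <p>As a sketch of this type-driven approach (illustrative code, not the plug-in's actual implementation; the configuration format and parameter names shown are hypothetical), parameters declared as "name:type" pairs in a configuration file can be parsed generically, so a new Tuffy parameter only requires a new configuration line:</p>

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch: convert user-supplied parameter values according to
// type declarations ("name:type") read from a configuration file.
public class ParamLoader {
    public static Map<String, Object> parse(List<String> typeDecls,
                                            Map<String, String> userValues) {
        Map<String, Object> params = new HashMap<>();
        for (String decl : typeDecls) {
            String[] parts = decl.split(":");
            String name = parts[0].trim();
            String type = parts[1].trim();
            String raw = userValues.get(name);
            if (raw == null) continue; // parameter not set by the user
            switch (type) {
                case "integer": params.put(name, Integer.parseInt(raw));     break;
                case "float":   params.put(name, Float.parseFloat(raw));     break;
                case "boolean": params.put(name, Boolean.parseBoolean(raw)); break;
                default:        params.put(name, raw); // treat as string
            }
        }
        return params;
    }
}
```

      <p>Under this scheme, a parameter introduced by a future Tuffy version is supported by appending one "name:type" line to the configuration file, with no code changes.</p>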
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Conclusion</title>
<p>This paper presented a GUI for Tuffy, a Java Markov Logic Network inference
engine. As shown, this GUI facilitates the tasks of creating MLN models and
reasoning with them. The GUI was implemented as a JPF plug-in for the
UnBBayes software. UnBBayes and this plug-in are available from http://
unbbayes.sourceforge.net/ under the GPL license. Currently the plug-in can only be
downloaded from the SVN repository; a distribution for simple download and
installation will be released soon.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <given-names>P.</given-names>
            <surname>Domingos</surname>
          </string-name>
          and
          <string-name>
            <given-names>D.</given-names>
            <surname>Lowd</surname>
          </string-name>
          .
          <article-title>Markov Logic: An Interface Layer for AI</article-title>
          . Morgan and Claypool,
          <year>2009</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <given-names>Nir</given-names>
            <surname>Friedman</surname>
          </string-name>
          , Lise Getoor, Daphne Koller, and Avi Pfeffer.
          <article-title>Learning probabilistic relational models</article-title>
          .
          <source>In International Joint Conference on Artificial Intelligence</source>
          , volume
          <volume>16</volume>
          , pages
          <fpage>1300</fpage>
          -
          <lpage>1309</lpage>
          . LAWRENCE ERLBAUM ASSOCIATES LTD,
          <year>1999</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>K.B. Laskey</surname>
          </string-name>
          .
          <article-title>MEBN: A Language for First-Order Bayesian Knowledge Bases</article-title>
          .
          <source>Artificial Intelligence</source>
          ,
          <volume>172</volume>
          (
          <issue>2-3</issue>
          ):
          <fpage>140</fpage>
          -
          <lpage>178</lpage>
          ,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>M. Vieira, M. Onishi, R.N. Carvalho, M. Ladeira</surname>
          </string-name>
          , D. da Silva and W. da Silva.
          <article-title>Platform independent and open tool for probabilistic networks</article-title>
          .
          <source>Proceedings of the IV Artificial Intelligence National Meeting (ENIA</source>
          <year>2003</year>
          )
          <article-title>on the XXIII Congress of the Brazilian Computer Society</article-title>
          (SBC
          <year>2003</year>
          ), Unicamp, Campinas, Brazil,
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <given-names>Shou</given-names>
            <surname>Matsumoto</surname>
          </string-name>
          , Rommel Novaes Carvalho, Marcelo Ladeira, Paulo Cesar G. da Costa, Laecio Lima Santos, Danilo Silva, Michael Onishi, Emerson Machado, and Ke Cai.
          <article-title>UnBBayes: a Java Framework for Probabilistic Models in AI</article-title>
          .
          <source>In Java in Academia and Research</source>
          . iConcept Press,
          <year>2011</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <given-names>F.</given-names>
            <surname>Niu</surname>
          </string-name>
          , C. Ré, A. Doan, and J. Shavlik.
          <article-title>Tuffy: Scaling up statistical inference in Markov logic networks using an RDBMS</article-title>
          .
          <source>Proceedings of the VLDB Endowment</source>
          ,
          <volume>4</volume>
          (
          <issue>6</issue>
          ),
          <fpage>373</fpage>
          -
          <lpage>384</lpage>
          ,
          <year>2011</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <given-names>Alexandrin</given-names>
            <surname>Popescul</surname>
          </string-name>
          and
          <string-name>
            <surname>Lyle H Ungar</surname>
          </string-name>
          .
          <article-title>Structural logistic regression for link analysis</article-title>
          .
          <source>Departmental Papers (CIS)</source>
          ,
          <source>page 133</source>
          ,
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <given-names>Sebastian</given-names>
            <surname>Riedel</surname>
          </string-name>
          .
          <article-title>Improving the accuracy and efficiency of MAP inference for Markov Logic</article-title>
          .
          <source>In Proceedings of the 24th Annual Conference on Uncertainty in AI (UAI '08)</source>
          , pages
          <fpage>468</fpage>
          -
          <lpage>475</lpage>
          ,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <given-names>Sebastian</given-names>
            <surname>Riedel</surname>
          </string-name>
          .
          <article-title>Markov thebeast user manual</article-title>
          .
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <given-names>Ben</given-names>
            <surname>Taskar</surname>
          </string-name>
          , Pieter Abbeel, and
          <string-name>
            <given-names>Daphne</given-names>
            <surname>Koller</surname>
          </string-name>
          .
          <article-title>Discriminative probabilistic models for relational data</article-title>
          .
          <source>In Proceedings of the Eighteenth conference on Uncertainty in artificial intelligence</source>
          , pages
          <fpage>485</fpage>
          -
          <lpage>492</lpage>
          . Morgan Kaufmann Publishers Inc.,
          <year>2002</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <given-names>M.</given-names>
            <surname>Tenorth</surname>
          </string-name>
          , D. Jain, and M. Beetz.
          <article-title>Knowledge representation for cognitive robots</article-title>
          .
          <source>Künstliche Intelligenz</source>
          , Springer, volume
          <volume>24</volume>
          ,
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>