<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Modern Deductive Database Systems Can Enhance Data Integration</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Francesco Calimeri</string-name>
          <email>calimeri@dlvsystem.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Simona Perri</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Giorgio Terracina</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jessica Zangari</string-name>
          <email>zangarig@mat.unical.it</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>DLVSystem Srl</institution>
          ,
          <addr-line>Rende</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Department of Mathematics and Computer Science, University of Calabria</institution>
          ,
          <addr-line>Rende</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2018</year>
      </pub-date>
      <fpage>24</fpage>
      <lpage>27</lpage>
      <abstract>
        <p>Data integration systems provide transparent access to heterogeneous, possibly distributed, sources; deductive databases and their extensions make it easy to address complex issues arising in data integration. However, the gap between state-of-the-art deductive databases and data integration systems is still to be closed. In this paper we focus on some recent advancements implemented in the I-DLV system, and point out how these can facilitate the development of advanced data integration systems.</p>
      </abstract>
      <kwd-group>
        <kwd>Deductive Database</kwd>
        <kwd>Data Integration</kwd>
        <kwd>Instantiation</kwd>
        <kwd>Answer Set Programming</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>The task of an information integration system is to combine data residing at
different sources, providing the user with a unified view called the global schema.
Users can formulate queries in a transparent and declarative way over the global
schema: they do not need to be aware of details about the sources, since the
information integration system automatically retrieves relevant data from the sources
and suitably combines it to provide answers to user queries [16]. The global
schema may also contain integrity constraints (such as key dependencies,
inclusion dependencies, etc.).</p>
      <p>
        Recent developments in IT have made available a huge number of
information sources, typically autonomous, heterogeneous, and widely distributed. As a
consequence, information integration has emerged as a crucial issue in several
application domains, e.g., distributed databases, cooperative information
systems, data warehousing, ontology-based data access, and on-demand computing.
Deductive database systems in general, and Answer Set Programming (ASP)
in particular, are powerful tools in this context, as demonstrated, for instance,
by the approaches formalized in [
        <xref ref-type="bibr" rid="ref3 ref4">3, 4, 17</xref>
        ]. More generally, the adoption of
logic-based systems makes it easy to address complex problems like Consistent Query
Answering (CQA) [22] and querying ontologies under inconsistencies [15]. The
database community has spent many efforts in this area, and relevant research
results have been obtained to clarify the semantics, decidability, and complexity of
data-integration systems under different assumptions. However, filling the gap
between deductive database systems and database integration tools is still an
open challenge, and continuous improvements and extensions in ASP systems [
        <xref ref-type="bibr" rid="ref9">9,
14</xref>
        ] are certainly important contributions to reach this goal.
      </p>
      <p>In this paper we discuss some of the most recent database-oriented
innovations in ASP as implemented in the I-DLV system, and we point out how such
improvements may enhance advanced data integration systems.</p>
      <p>
        The I-DLV System. The I-DLV system [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] is a modern, stand-alone ASP instantiator and deductive
database engine that has also been integrated as the grounding module of the
renewed version of the popular system DLV [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. A description of all the features
of I-DLV is beyond the scope of this paper. In the following, we outline the major
features that have an important impact on I-DLV as a deductive database engine. For
a comprehensive list of customizations and options, along with further details,
we refer the reader to [
        <xref ref-type="bibr" rid="ref6 ref7">6, 7</xref>
        ] and to the online documentation [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ].
      </p>
    </sec>
    <sec id="sec-2">
      <title>Overview of Evaluation Features</title>
      <p>
        I-DLV supports the ASP-Core-2 [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] standard language; its high flexibility and
extensible design ease the incorporation of optimization techniques, language
updates, and customizations. We next provide a brief overview of its instantiation
process, focusing on peculiar optimizations whose synergic work, which can be
driven at a fine-grained level by the user, is the key to I-DLV's efficiency.
Optimizations. The system adopts a bottom-up evaluation strategy based on a
semi-naive approach [27]. One of the most crucial and computationally expensive
tasks is the grounding of each rule; it resembles the evaluation of relational
joins among the positive body literals, and I-DLV adopts a bunch of techniques to
optimize it, many of which are inspired by the database field and properly enhanced
and readapted to I-DLV's purposes. Here we mention body-reordering criteria,
indexing strategies, and decomposition rewritings [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], along with additional
fine-tuning optimizations acting to different extents on the evaluation process, with
the general common aim of reducing the search space and improving overall
performance [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ].
      </p>
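      <p>The semi-naive strategy mentioned above can be illustrated with a minimal, self-contained sketch (a plain Python rendition for illustration, not I-DLV's actual implementation) on the classic transitive-closure program reach(X,Y) :- edge(X,Y). reach(X,Z) :- reach(X,Y), edge(Y,Z).</p>

```python
# Semi-naive bottom-up evaluation of transitive closure: at each iteration
# only the "delta" (facts derived in the previous round) is joined with
# edge/2, so no rule instantiation is re-derived.

def seminaive_reach(edges):
    reach = set(edges)          # facts from the exit rule reach(X,Y) :- edge(X,Y)
    delta = set(edges)          # facts newly derived in the last round
    while delta:
        new = {(x, z)
               for (x, y) in delta
               for (y2, z) in edges if y2 == y} - reach
        reach |= new
        delta = new             # the next round joins only the new facts
    return reach
```

      <p>For instance, on the edges {(1,2), (2,3), (3,4)} the first round derives (1,3) and (2,4), the second derives (1,4), and the third derives nothing, so evaluation stops.</p>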
      <p>
        – Body-reordering techniques aim at finding an optimal execution ordering
for the join operations by varying the order of literals in the rule bodies.
Different orderings have been defined for I-DLV; the one adopted by default
has been specifically designed by considering the effects of each literal on the
binding of variables [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ].
– Indexing techniques, instead, are intended to optimize the retrieval of
matching instances from the predicate extensions. I-DLV defines a flexible indexing
schema: any predicate argument can be indexed, allowing both single- and
multiple-argument indices, and for each predicate different indexing data
structures can be built "on-demand", only if needed, while instantiating a
rule containing that predicate in its body.
– I-DLV also exploits a heuristic-guided decomposition rewriting technique,
relying on hypertree decompositions, that replaces long rules with sets of
smaller ones, with the aim of transforming the input program into an
equivalent one that can possibly be evaluated more efficiently.
– Finally, we mention a series of techniques falling into the category of join
optimizations, such as "pushing down selections" and other join rewritings;
they have diverse aims, such as decreasing the number of matches considered
during rule instantiation, early detection of inconsistencies in the input
program, or syntactically rewriting the input program with the twofold intent
of easing the instantiation and improving performance.
      </p>
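      <p>The on-demand, multiple-argument indexing described above can be sketched as follows (a simplified Python illustration, not I-DLV's actual data structures): an index over chosen argument positions of a predicate extension is built lazily, the first time a rule instantiation needs it.</p>

```python
# On-demand indexing sketch: indices over argument positions are built lazily
# and cached per position tuple.

class Extension:
    def __init__(self, tuples):
        self.tuples = list(tuples)
        self.indices = {}                      # positions -> hash index

    def match(self, positions, values):
        """Retrieve tuples whose arguments at `positions` equal `values`."""
        key = tuple(positions)
        if key not in self.indices:            # build the index only when needed
            idx = {}
            for t in self.tuples:
                idx.setdefault(tuple(t[p] for p in positions), []).append(t)
            self.indices[key] = idx
        return self.indices[key].get(tuple(values), [])

# e.g. an extension of a ternary predicate a/3, indexed on the first and
# third arguments while grounding a rule that binds X=1 and Z=3:
a_ext = Extension([(1, "u", 3), (1, "v", 3), (2, "w", 4)])
hits = a_ext.match([0, 2], [1, 3])
```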
      <p>
        Query answering in I-DLV is empowered with the magic-sets technique [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]: when
the input program features a query, it simulates a top-down computation by
rewriting the input program so as to identify the relevant subset of the
instantiation that is sufficient for answering the query. The restriction of the instantiation
is obtained by means of additional "magic" predicates, whose extensions
represent the atoms that are relevant to the query.
      </p>
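      <p>The effect of the restriction (not the rewriting itself) can be illustrated in Python on a reachability query reach(a, X)? over a simple linear program: the rewritten program derives only tuples of the form reach(a, _), instead of the full transitive closure.</p>

```python
# Illustration of the magic-sets effect for the query reach(a, X)?: the magic
# seed (the query constant) restricts derivation to the relevant subset.

def reach_from(edges, a):
    found, frontier = set(), {a}
    while frontier:
        frontier = {y for (x, y) in edges if x in frontier} - found
        found |= frontier
    return {(a, x) for x in found}
```

      <p>On the edges {(1,2), (2,3), (4,5)} with query constant 1, only reach(1,2) and reach(1,3) are derived; the component around node 4 is never touched.</p>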
      <p>Customizability. I-DLV provides fine-grained control over the whole
computational process, allowing each one of the many optimization
techniques to be enabled/disabled both via command-line options and via inline annotations. In more
detail, I-DLV programs can be enriched with global and local annotations (i.e., on
a per-rule basis) for customizing machineries such as body ordering and
indexing. For instance, the indexing schema of a specific atom in a rule can be
constrained to satisfy some specific conditions by annotating the rule as follows:
%@rule_atom_indexed(@atom=a(X,Y,Z), @arguments={0,2}). When instantiating
the annotated rule, the atom a(X,Y,Z) will be indexed, if possible, with a
double index on the first and third arguments.</p>
      <p>
        Since its release, I-DLV has proved its reliability and efficiency as both an ASP
grounder and a deductive database engine. Recently, in the latest ASP
Competition [14], I-DLV ranked first and second when combined with an automatic
solver selector [12], which inductively chooses the best solver depending on some
inherent features of the produced instantiation, and with the state-of-the-art
solver clasp [13], respectively. Moreover, I-DLV performance results are
promising also as a deductive database system [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. The system has been tested on the
query-based set of problems from OpenRuleBench [20], an open set of resources
featuring a suite of benchmarks for analyzing the performance and scalability of
different rule engines, and compared with the former DLV version and XSB [24],
which was among the clear winners of the official OpenRuleBench runs [20]
and is currently one of the most widespread logic programming and deductive
database systems. Results show that not only does I-DLV behave better than DLV,
but it is definitely competitive against XSB. For a detailed description of these
experiments we refer the reader to [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ].
      </p>
    </sec>
    <sec id="sec-3">
      <title>Interoperability Features</title>
      <p>In this section we briefly illustrate the mechanisms and tools of I-DLV for (i)
interoperability with external systems, (ii) accommodation of external sources
of computation, and (iii) value invention/modification within logic programs.
In particular, I-DLV supports direct connection with relational databases and
SPARQL-enabled ontologies via explicit import/export directives, and access to
external data via calls to Python scripts with external atoms.</p>
      <p>RDBMS Data Access. I-DLV can import relations from an RDBMS by means of
an #import_sql directive. For instance, #import_sql(DB, "user", "pass", "SELECT
* FROM t", p) accesses database DB and imports all tuples from table t into facts
with predicate name p. Similarly, #export_sql directives are used to populate
specific tables with the extension of a predicate.</p>
      <p>Ontology-Based Data Access. Data can also be imported from local RDF/XML
files and from remote endpoints via SPARQL queries, by means of directives of the
form #import_local_sparql("file","query",pred_name,pred_arity[,types]) or
#import_remote_sparql("endpoint_url","query",pred_name,pred_arity[,types]),
where query is a SPARQL statement defining the data to be imported and the
optional types specifies the conversion for mapping data types to
ASP-Core-2 terms. For the local import, file can be either a local or a remote URL pointing
to an RDF/XML file: in the latter case, the file is downloaded and treated as a
local RDF/XML file; in any case, the ontology graph is built in memory. As for
the remote import, endpoint_url refers to a remote endpoint and building
the graph is up to the remote server; this second option might be very convenient
in the case of large datasets.</p>
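      <p>The conversion step these directives perform can be sketched as follows (a Python illustration, not I-DLV's implementation): a result set in the standard SPARQL JSON bindings format is mapped to one fact per row, with each bound term rendered as a quoted ASP constant. Here bindings is a hypothetical, already-fetched result; the endpoint query itself is omitted.</p>

```python
# Mapping SPARQL result bindings (W3C SPARQL JSON results format) to ASP facts.

def sparql_to_facts(bindings, variables, pred_name):
    facts = []
    for row in bindings:
        # render each IRI/literal value as a quoted ASP string term
        terms = ['"%s"' % row[v]["value"] for v in variables]
        facts.append(f"{pred_name}({','.join(terms)}).")
    return facts

bindings = [{"s": {"type": "uri", "value": "ex:alice"},
             "o": {"type": "literal", "value": "30"}}]
facts = sparql_to_facts(bindings, ["s", "o"], "age")
```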
      <p>Generic Data Access via Python Scripts. Input programs can be enriched with
external atoms of the form &amp;p(i1,...,in;o1,...,om), where p is the name of
a Python function, and i1,...,in and o1,...,om (n, m &gt;= 0) are input and output
terms, respectively. For each instantiation i'1,...,i'n of the input terms,
function p is called with arguments i'1,...,i'n, and returns a set of instantiations for
o1,...,om. For instance, a single line of Python, def rev(s): return s[::-1], is
sufficient to define a function rev that reverses strings, and which can be used within
a rule of the following form: revWord(Y) :- word(X), &amp;rev(X;Y). External atoms
give the user a powerful tool for significantly extending interoperability, granting
access to virtually unlimited external data sources. Hence, additional
import/export features for specific semi-structured or unstructured data sources can be
externally defined by suitable Python scripts. Obviously, "native" support for
interoperability should be preferred whenever available. In fact, it is intuitive to
understand that native support allows much better performance.</p>
      <p>[Figure 1: General architecture of a modern integration system based on I-DLV. Recoverable labels: Optimizer (Magic Sets, Hypertree Decompositions, Body Orderings, Indexing Strategies, Join Optimizations); Data Processor (ETL, Skolemizer, SQL, SPARQL, Python).]</p>
      <p>
        In fact, experiments reported in [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] give an idea of the effective gain in performance obtainable with
native support for the SQL/SPARQL local import directives against the same
directives implemented via Python scripts.
      </p>
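      <p>The one-line external atom from the text can be made explicit: &amp;rev(X;Y) calls the Python function below once per instantiation of the input term X, and each returned value provides a binding for the output term Y. The grounding step is sketched here in plain Python for illustration.</p>

```python
# The external function behind &rev(X;Y): note the `return`, which the
# one-liner needs in order to actually produce the output term.
def rev(s):
    return s[::-1]          # reverse the input string

# Sketch of how the grounder uses it for: revWord(Y) :- word(X), &rev(X;Y).
words = ["asp", "dlv"]                  # extension of word/1
rev_words = [rev(x) for x in words]     # bindings derived for revWord/1
```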
      <p>Value invention/modification. The availability of both external atoms and
function symbols, included in the ASP-Core-2 compliance, makes it possible to address very
interesting issues from a database perspective. First of all, it is well known that
function symbols allow value invention to be implemented via skolemization. This
turns out to be a very useful feature when dealing with ontologies. Moreover,
the generality of external atoms allows data modification
processes, typical of ETL workflows, to be included in logic rules. In [26] we already described how external
atoms may help data cleaning processes in a logic-based scenario.</p>
      <p>Application of I-DLV Features for Data Integration.
The adoption of deductive database technology for data integration solutions is
not new [16, 17, 25, 19, 23]; however, the recent developments in ASP described
in this paper allow a more concrete application of deductive systems in
real-world applications requiring the integration of heterogeneous data such as RDBMSs,
ontologies, and semi-structured information sources. A general architecture for
a modern integration system based on I-DLV is presented in Figure 1, where
both the main architectural elements and the specific I-DLV functionalities oriented to
data integration are highlighted; these will be described next, layer by layer.</p>
      <p>The Data Layer, which comprises the set of input information sources, can
handle several kinds of data sources: (i) standard databases can be directly accessed
through the #import_sql directives included in I-DLV; (ii) graph databases, RDF
ontologies, and, more generally, SPARQL-enabled ontologies can be accessed by
the #import_local_sparql and #import_remote_sparql directives; (iii)
interoperability with any other kind of input format can be granted by external atoms relying
on suitable Python scripts.</p>
      <p>
        The Schema Layer includes everything that describes the data integration
context from a conceptual point of view, namely source and global schemas,
mappings, and constraints, in a way similar to what has been widely studied in
the literature [16]. Support for this design phase could be provided by already
available external graphical tools, such as the one presented in [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ].
      </p>
      <p>The Evaluation Layer includes everything that allows input
data, schemas, and queries to be transformed into answers in an effective way. The core role is
played by I-DLV which, as previously pointed out, has been incorporated as
the grounding module of the DLV system. Here we concentrate our attention on
three main logical portions: the Data Processor, the Optimizer, and the CQA
Rewriter.</p>
      <p>The Data Processor highlights some of the advanced functionalities included
in I-DLV; in more detail, the general capabilities of Python-based external atoms
make it possible to include ETL processes inside the ASP engine. This
is a particularly interesting innovation, since reasoning on deductive databases
usually excluded ETL processes, which were confined to external workflows.
Moreover, the ASP-Core-2 compliance of the I-DLV language implies the possibility of
exploiting function symbols as predicate arguments; in a database-oriented setting,
this makes it easy to simulate skolemization. This is a particularly interesting
feature when ontologies are among the inputs; in fact, it is well known that, in
particular cases, value invention in ontologies can be handled via skolemization.
This opportunity significantly expands the data integration potential of the
system with respect to existing proposals. It is worth observing that, in a parallel project
involving DLV, a more general extension of ASP supporting existentially
quantified rule heads, and consequently more complex axioms in ontologies, named
DLV∃, has been proposed [18]; however, this language extension and the
corresponding evaluation engine are not included in I-DLV yet.</p>
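      <p>Value invention via skolemization, as enabled by function symbols, can be sketched as follows. A rule such as person(sk(X)) :- employee(X) (a hypothetical example, not taken from the text) invents one fresh term sk(X) per employee; the Python illustration below represents function terms as nested tuples.</p>

```python
# Sketch of the ground instantiation of  person(sk(X)) :- employee(X).
# Each employee constant e yields a fresh skolem term sk(e), here modeled
# as the nested tuple ("sk", e).

def skolemize(employees):
    return [("person", ("sk", e)) for e in employees]

facts = skolemize(["ann", "bob"])
```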
      <p>The Optimizer applies to the resulting ASP program all the database-oriented
optimizations previously outlined and included in I-DLV. In more detail, magic
sets, join optimizations, hypertree decompositions, body orderings, and indexing
strategies may altogether provide a crucial speedup in query answering processes,
thus allowing the adoption of the system in real application scenarios.</p>
      <p>To complete the picture of the Evaluation Layer, it is worth
observing that, if the global schema is equipped with constraints that must be
satisfied during data integration, Consistent Query Answering techniques and
optimizations such as the ones presented in [21, 22] can be applied. In Figure
1, this is represented as a functionality external to DLV, since it is not included
inside the engine yet. However, it would be straightforward to incorporate these techniques
into the system, since they are based on rewritings of ASP programs.</p>
      <p>
        Finally, the Presentation Layer is devoted to allowing users to compose queries
and get the corresponding results. Again, available external graphical tools [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]
can support these phases.
      </p>
      <p>Future Work. In this paper we briefly reported on the most recent advancements in the
deductive system I-DLV concerning database-oriented features, and we have shown their
application to a data integration setting. In particular, the reported features clearly
show that data integration is still a very active and promising research area,
which is kept strongly alive by new challenges arising from ontologies and from
semi-structured and unstructured information sources. Given the positive results in
terms of efficiency and extensibility we obtained for the I-DLV system, first of
all we plan to incorporate in I-DLV the features already developed in parallel
projects, such as the CQA rewriting and optimization techniques and the
support for existential rules introduced in DLV∃ for ontology querying. Moreover, we
plan to explicitly implement connectors to different data formats. As a matter of
fact, reasoning on top of big data is also part of ongoing projects in the research
group.</p>
      <p>
12. Fusca, D., Calimeri, F., Zangari, J., Perri, S.: I-DLV+MS: preliminary report on an
automatic ASP solver selector. In: RCRA@AI*IA. CEUR Workshop Proceedings,
vol. 2011, pp. 26–32. CEUR-WS.org (2017)
13. Gebser, M., Kaminski, R., Kaufmann, B., Romero, J., Schaub, T.: Progress in clasp
series 3. In: LPNMR. Lecture Notes in Computer Science, vol. 9345, pp. 368–383.</p>
      <p>Springer (2015)
14. Gebser, M., Maratea, M., Ricca, F.: The sixth answer set programming
competition. J. Artif. Intell. Res. 60, 41–95 (2017)
15. Lembo, D., Lenzerini, M., Rosati, R., Ruzzi, M., Savo, D.F.: Inconsistency-tolerant
query answering in ontology-based data access. J. Web Sem. 33, 3–29 (2015)
16. Lenzerini, M.: Data integration: A theoretical perspective. In: PODS. pp. 233–246.</p>
      <p>ACM (2002)
17. Leone, N., Gottlob, G., Rosati, R., Eiter, T., Faber, W., Fink, M., Greco, G., Ianni,
G., Kalka, E., Lembo, D., Lenzerini, M., Lio, V., Nowicki, B., Ruzzi, M., Staniszkis,
W., Terracina, G.: The INFOMIX System for Advanced Integration of Incomplete
and Inconsistent Data. In: Proceedings of the 24th ACM SIGMOD International
Conference on Management of Data (SIGMOD 2005). pp. 915–917. ACM Press,
Baltimore, Maryland, USA (Jun 2005)
18. Leone, N., Manna, M., Terracina, G., Veltri, P.: Efficiently computable Datalog∃
programs. In: KR. AAAI Press (2012)
19. Leone, N., Ricca, F., Rubino, L.A., Terracina, G.: Efficient application of answer set
programming for advanced data integration. In: PADL. Lecture Notes in Computer
Science, vol. 5937, pp. 10–24. Springer (2010)
20. Liang, S., Fodor, P., Wan, H., Kifer, M.: OpenRuleBench: An Analysis of the
Performance of Rule Engines. In: Proceedings of the 18th International Conference
on World Wide Web, WWW 2009, Madrid, Spain, April 20-24, 2009. pp. 601–610.
ACM (2009). https://doi.org/10.1145/1526709.1526790
21. Manna, M., Ricca, F., Terracina, G.: Consistent query answering via ASP from
different perspectives: Theory and practice. Theory and Practice of Logic
Programming 13(2), 227–252 (2013)
22. Manna, M., Ricca, F., Terracina, G.: Taming primary key violations to
query large inconsistent data via ASP. TPLP 15(4-5), 696–710 (2015).
https://doi.org/10.1017/S1471068415000320
23. Nardi, B., Reale, K., Ricca, F., Terracina, G.: An integrated environment for
reasoning over ontologies via logic programming. In: RR. Lecture Notes in Computer
Science, vol. 7994, pp. 253–258. Springer (2013)
24. Swift, T., Warren, D.S.: XSB: Extending Prolog with Tabled Logic
Programming. Theory and Practice of Logic Programming 12(1-2), 157–187
(2012). https://doi.org/10.1017/S1471068411000500
25. Terracina, G., Leone, N., Lio, V., Panetta, C.: Experimenting with recursive queries
in database and logic programming systems. Theory and Practice of Logic
Programming 8, 129–165 (2008)
26. Terracina, G., Martello, A., Leone, N.: Logic-based techniques for data cleaning:
An application to the Italian national healthcare system. In: LPNMR. Lecture
Notes in Computer Science, vol. 8148, pp. 524–529. Springer (2013)
27. Ullman, J.D.: Principles of Database and Knowledge-Base Systems, Volume I.</p>
      <p>Computer Science Press (1988)</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Alviano</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Calimeri</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dodaro</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fusca</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Leone</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Perri</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ricca</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Veltri</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zangari</surname>
            ,
            <given-names>J.:</given-names>
          </string-name>
          <article-title>The ASP system DLV2</article-title>
          .
          <source>In: LPNMR. Lecture Notes in Computer Science</source>
          , vol.
          <volume>10377</volume>
          , pp.
          <volume>215</volume>
          –
          <fpage>221</fpage>
          . Springer (
          <year>2017</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Alviano</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Faber</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Greco</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Leone</surname>
          </string-name>
          , N.:
          <article-title>Magic sets for disjunctive datalog programs</article-title>
          .
          <source>Artif. Intell</source>
          .
          <volume>187</volume>
          ,
          <issue>156</issue>
          –
          <fpage>192</fpage>
          (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Arenas</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bertossi</surname>
            ,
            <given-names>L.E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chomicki</surname>
          </string-name>
          , J.:
          <article-title>Specifying and Querying Database Repairs using Logic Programs with Exceptions</article-title>
          . In: Larsen,
          <string-name>
            <given-names>H.L.</given-names>
            ,
            <surname>Kacprzyk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            ,
            <surname>Zadrozny</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            ,
            <surname>Andreasen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            ,
            <surname>Christiansen</surname>
          </string-name>
          , H. (eds.)
          <source>Proceedings of the Fourth International Conference on Flexible Query Answering Systems (FQAS</source>
          <year>2000</year>
          )
          (
          <year>2000</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Calì</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lembo</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rosati</surname>
          </string-name>
          , R.:
          <article-title>Query rewriting and answering under constraints in data integration systems</article-title>
          .
          <source>In: IJCAI</source>
          . pp.
          <volume>16</volume>
          –
          <fpage>21</fpage>
          . Morgan Kaufmann (
          <year>2003</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Calimeri</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Faber</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gebser</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ianni</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kaminski</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Krennwallner</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Leone</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ricca</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Schaub</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          :
          <article-title>Asp-core-2: Input language format</article-title>
          . ASP Standardization Working Group,
          <source>Tech. Rep</source>
          (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Calimeri</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fusca</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Perri</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zangari</surname>
          </string-name>
          , J.:
          <article-title>External computations and interoperability in the new DLV grounder</article-title>
          .
          <source>In: AI*IA. Lecture Notes in Computer Science</source>
          , vol.
          <volume>10640</volume>
          , pp.
          <volume>172</volume>
          –
          <fpage>185</fpage>
          . Springer (
          <year>2017</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Calimeri</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fusca</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Perri</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zangari</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>I-DLV: the new intelligent grounder of DLV</article-title>
          .
          <source>Intelligenza Artificiale</source>
          <volume>11</volume>
          (
          <issue>1</issue>
          ),
          <fpage>5</fpage>
          –
          <lpage>20</lpage>
          (
          <year>2017</year>
          ). https://doi.org/10.3233/IA-170104
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Calimeri</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fusca</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Perri</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zangari</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>Optimizing answer set computation via heuristic-based decomposition</article-title>
          .
          <source>In: PADL. Lecture Notes in Computer Science</source>
          , vol.
          <volume>10702</volume>
          , pp.
          <fpage>135</fpage>
          –
          <lpage>151</lpage>
          . Springer (
          <year>2018</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Calimeri</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gebser</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Maratea</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ricca</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          :
          <article-title>Design and results of the fifth answer set programming competition</article-title>
          .
          <source>Artif. Intell</source>
          .
          <volume>231</volume>
          ,
          <fpage>151</fpage>
          –
          <lpage>181</lpage>
          (
          <year>2016</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Calimeri</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Perri</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fusca</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zangari</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          :
          <article-title>I-DLV homepage</article-title>
          (since
          <year>2016</year>
          ), https://github.com/DeMaCS-UNICAL/I-DLV/wiki
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Febbraro</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Grasso</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Leone</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Reale</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ricca</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          :
          <article-title>Datalog development tools - (extended abstract)</article-title>
          .
          <source>In: Datalog. Lecture Notes in Computer Science</source>
          , vol.
          <volume>7494</volume>
          , pp.
          <fpage>81</fpage>
          –
          <lpage>85</lpage>
          . Springer (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>