<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>Hagenberg, Austria</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Towards an Implementation in LambdaProlog of the Two Level Minimalist Foundation</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Claudio Sacerdoti Coen</string-name>
          <email>sacerdot@cs.unibo.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>University of Bologna</institution>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2018</year>
      </pub-date>
      <volume>1</volume>
      <fpage>3</fpage>
      <lpage>08</lpage>
      <abstract>
        <p>In this work we will present an implementation of the Minimalist Foundation [MS05; Mai09] using the λProlog [NM88] implementation ELPI [Dun+15]. The Minimalist Foundation is a two-level constructive type theory with both an extensional level with quotients and sets and an intensional level satisfying the proofs-as-programs interpretation. Linking the two levels there is an interpretation of the extensional level in a dependent setoid model built upon the intensional level, where extensionality is modelled by intensional propositions. In our work we have implemented type checkers for both levels and the translation to well-typed intensional terms and types for a significant subset of well-typed extensional terms and types. Future work includes: formal verification in Abella of our implementation, code extraction at the intensional level and finally a proof assistant at the extensional level (using the constraint programming functionality offered by ELPI).</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>In 2005 M. Maietti and G. Sambin [MS05] argued for the necessity of building a foundation for constructive mathematics
to be taken as a common core among relevant existing foundations in axiomatic set theory, such as Aczel-Myhill’s CZF
and Mizar’s Tarski-Grothendieck set theory, or in category theory, such as the internal theory of a topos, or in type theory,
such as Martin-Löf’s type theory and Coquand’s Calculus of Inductive Constructions. Moreover, they asked the foundation
to satisfy the “proofs-as-programs” paradigm, namely the existence of a realizability model from which to extract programs
from proofs. Finally, the authors wanted the theory to be appealing to mainstream mathematicians, and therefore they wanted
extensionality in the theory, e.g. to reason with quotient types and to avoid the intricacies of dependent types in intensional
theories.</p>
    </sec>
    <sec id="sec-2">
      <title>In the same paper they noticed that theories satisfying extensional properties, like extensionality of functions, cannot satisfy the proofs-as-programs requirement. Therefore they concluded that in a proofs-as-programs theory one can only represent extensional concepts by modelling them via intensional ones in a suitable way, as already partially shown in some categorical contexts.</title>
      <p>Finally, they ended up proposing a constructive foundation for mathematics equipped with two levels: an intensional
level that acts as a programming language and is the actual proofs-as-programs theory, and an extensional level that acts as
the set theory in which to formalize mathematical proofs. The constructivity of the whole foundation then relies on the fact
that the extensional level must be implemented over the intensional level, but not only this. Indeed, following Sambin’s
forget-restore principle, they also required that extensional concepts must be abstractions of intensional ones, obtained by
forgetting irrelevant computational information. Such information is then restored when extensional concepts are translated
back to the intensional level.</p>
    </sec>
    <sec id="sec-3">
      <title>In 2009 M. Maietti [Mai09] presented the syntax and judgements of the two levels, together with a proof that a</title>
      <p>suitable completion of the intensional level provides a model of the extensional one. The proof is constructive and based on
a sequence of categorical constructions, the most important being the construction of a quotient model within the
intensional level, where setoids are used to encode extensional equality in an intensional language, together with a notion of
canonical isomorphism between intensional dependent setoids.</p>
      <p>In this work we will present an implementation of the following software components:</p>
      <list list-type="order">
        <list-item><p>a type checker for the intensional level;</p></list-item>
        <list-item><p>a reformulation of the extensional level that allows proof objects to be stored syntactically; these are later used to provide the information to be restored when going from the extensional to the intensional level, so as to avoid general proof search during the interpretation;</p></list-item>
        <list-item><p>a type checker for the resulting extensional level;</p></list-item>
        <list-item><p>a translator from well-typed extensional terms and types to well-typed intensional terms and types.</p></list-item>
      </list>
      <p>As a future extension we hope to implement further software components:</p>
      <list list-type="order">
        <list-item><p>a formal validation (in Abella) of our reformulation of the extensional level and of the correctness of the implementation;</p></list-item>
        <list-item><p>code extraction at the intensional level (in Haskell/Lisp);</p></list-item>
        <list-item><p>a proof assistant at the extensional level (using the constraint programming functionality offered by ELPI).</p></list-item>
      </list>
      <p>Combining the translator with a proof extraction component it will be possible to extract programs from proofs written at the extensional level.</p>
    </sec>
    <sec id="sec-10">
      <title>We have chosen λProlog [NM88] (and in particular its recent implementation ELPI [Dun+15]) as the programming</title>
      <p>language to write the two type checkers and the interpreter. The benefits are that λProlog takes care of the intricacies of
dealing with binders and alpha-conversion; moreover a λProlog implementation of a syntax-directed judgement consists
of simple clauses that are almost literal translations of the judgemental rules. This allows humans (and logicians in
particular) to easily inspect the code to spot possible errors.</p>
      <sec id="sec-10-1">
        <title>Higher Order Logic Programming</title>
        <p>At the beginning of our work it was decided that this project should be implemented using λProlog and ELPI, one of
its implementations, which allows for more flexibility (e.g. optional type declarations), more functionality (e.g. constraint
programming, which will be useful in later phases of the project) and also improved performance compared to other major
implementations of λProlog.</p>
      </sec>
    </sec>
    <sec id="sec-11">
      <title>As a programming language λProlog is naturally a good fit for a binder-heavy project like ours, as it natively offers all</title>
      <p>the functionality required to handle α-equivalence, capture-avoiding substitution and higher-order unification; moreover
the high-level abstractions and backtracking allow for an almost 1-1 encoding of inference rules. As an example, below we
compare a natural deduction inference rule and a λProlog clause in the case of Π-Formation. In
natural deduction style we have the following inference rule</p>
      <p>B set    C(x) set [x ∈ B]
--------------------------- Π-F
Π(x∈B) C(x) set</p>
      <p>while in λProlog we would write
ofType (setPi B C) KIND3 IE
  :- ofType B KIND1 IE
   , (pi x\ locDecl x B =&gt; ofType (C x) KIND2 IE)
   , pts_fun KIND1 KIND2 KIND3
  .
where ofType represents the set judgement, locDecl is used to locally introduce hypotheses into the context, and pts_fun is
used to decide whether the produced type is a set or a collection.</p>
    </sec>
    <sec id="sec-12">
      <title>While the λProlog version is more verbose, it should be noted that this one rule encodes the Π-Formation rule for both the extensional and the intensional level, and for all combinations of argument kinds among sets, collections, propositions and small propositions.</title>
      <sec id="sec-12-1">
        <title>Changes to the calculus</title>
        <p>As pointed out in the introduction, before we could begin the implementation we had to apply some changes to the calculi
presented in [Mai09].</p>
      </sec>
      <sec id="sec-12-2">
        <title>Syntax Directed Rules</title>
      </sec>
    </sec>
    <sec id="sec-13">
      <title>Among the rules of mTT and emTT there is a rule which states that equal types have the same elements, that is:</title>
      <p>a ∈ A    A = B
---------------- conv
a ∈ B</p>
      <p>This rule is problematic from an implementation point of view since it can be triggered at any moment and (at the
extensional level) would start a proof search trying to prove A = B for an unknown A. A simple solution is to preemptively
compose this rule with every other rule that might need it; for example the elimination rule for dependent
products changes from</p>
      <p>f ∈ Π(x∈B) C(x)    t ∈ B
-------------------------- Π-E
apply(f, t) ∈ C(t)</p>
      <p>to</p>
      <p>f ∈ Π(x∈B) C(x)    t ∈ B′    B = B′
------------------------------------- Π-E
apply(f, t) ∈ C(t)</p>
      <p>the main difference being that in the syntax-directed version we check that the domain type of f and the inferred type of
t are convertible, instead of requiring syntactic equality as in the original rule. In this way the power of the calculus is
surely not increased, as the syntax-directed rule is admissible in the calculus with the two previous rules; indeed</p>
      <p>                    t ∈ B′    B = B′
                    ----------------- conv
f ∈ Π(x∈B) C(x)        t ∈ B
------------------------------------- Π-E
apply(f, t) ∈ C(t)</p>
      <p>is a derivation of the same judgement from the same hypotheses.</p>
      <p>For this change to be effective we also need to reformulate the conversion rules given by the calculi (e.g. apply(λx.f, t) =
f[t/x]) as directed rules describing a reduction predicate ▷ (e.g. apply(λx.f, t) ▷ f[t/x]), from which we define
equality as reducibility to a common term; this is equivalent to the original equality, given that both calculi of the
Minimalist Foundation enjoy both Church-Rosser confluence and strong normalization. This new relation is
implemented by the predicate conv.</p>
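      <p>As a first approximation, equality as reducibility to a common term can be sketched in λProlog as follows (a simplified sketch: the constructor names apply and lambda below are illustrative, and the actual conv, described later, also performs context lookups and propagates conversion to subterms):
hnf A A' :- hstep A B, !, hnf B A'.
hnf A A.
conv A B :- hnf A N, hnf B N.
where hstep performs a single head reduction step, e.g. a clause like hstep (apply (lambda B F) T) (F T) for β-reduction.</p>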
    </sec>
    <sec id="sec-14">
      <title>The full modifications to the calculi in our implementation, with respect to making the rules syntax directed, mostly amount to checking for convertibility/equality where mTT/emTT require syntactic equality. We plan to provide a full proof in Abella of the validity of this modification, and also of the following ones, in the future.</title>
      <sec id="sec-14-1">
        <title>1.1.2 Informative Proof-terms at the Extensional Level and Extensional Equality</title>
        <p>b ∈ B    true ∈ C(b)    C(x) prop [x ∈ B]
-------------------------------------------- ∃-I
true ∈ ∃(x∈B) C(x)</p>
        <p>true ∈ Eq(A, a, b)
-------------------- Eq-E
a = b ∈ A</p>
        <p>In the calculus emTT a token proof term true is used to represent all proofs of propositions and small propositions; this poses
problems during the interpretation, when we need to construct intensional proof terms given only the token true. Moreover
the strong elimination rule for the propositional extensional equality turns every conversion into an undecidable problem of
general proof search for true ∈ Eq(A, a, b). To reasonably implement the extensional calculus and the interpretation we
need a way to construct intensional proofs from extensional proofs, and a way to contain the undecidability of the strong
elimination rule for the extensional propositional equality.</p>
      </sec>
    </sec>
    <sec id="sec-15">
      <title>A first approach to the problem could have been to have both the extensional type checker and the interpretation work</title>
      <p>on full derivations; this is the method chosen by Maietti in [Mai09] and would solve the problems regarding the missing
information in the extensional calculus and language, but it was discarded as a solution because, from an implementation
standpoint, the code to encode a derivation would be almost a copy of the code needed to encode terms, yet with subtle, but
non-trivial, differences.</p>
    </sec>
    <sec id="sec-16">
      <title>The chosen solution was to both add informative proof terms to the extensional level and weaken the elimination rule of the propositional extensional equality. The first part of the solution makes the extensional rules for propositions more similar to the intensional ones</title>
      <p>b ∈ B    p ∈ C(b)    C(x) prop [x ∈ B]
----------------------------------------- ∃-I
⟨b, p⟩ ∈ ∃(x∈B) C(x)</p>
      <p>so that during the interpretation more information is available.</p>
    </sec>
    <sec id="sec-17">
      <title>The second part modifies the extensional propositional equality elimination rule to a weaker but equivalent form; the rule</title>
      <p>true ∈ Eq(A, a, b)
-------------------- Eq-E
a = b ∈ A</p>
      <p>becomes</p>
      <p>
------------------------------ Eq-E
a = b ∈ A [h ∈ Eq(A, a, b)]</p>
      <p>This weaker form has nicer computational properties, the most important one being that checking whether it can be applied
in a given specific context is always deterministic and terminating, while it still allows definitions to be introduced locally
in a context via implications or dependent types.</p>
    </sec>
    <sec id="sec-18">
      <title>This solution can be seen as an extension of simply encoding proofs in λ-terms; we have already noted that at the</title>
      <p>extensional level there is no satisfying such encoding because of the propositional equality elimination rule, whose occurrences
in the λ-terms would be confusing at the extensional level. From this point of view our solution can be thought of as
realizing that adding the needed equalities to the context is enough, and that the locally scoped nature of the context provides
exactly the functionality we need to use equalities. Compared to using whole derivation trees, we now use λ-terms plus
some context.</p>
    </sec>
    <sec id="sec-19">
      <title>A drawback of this solution is that it is complex to use even simple lemmas; to this problem we offer two solutions: to</title>
      <p>manually use the cut elimination of the calculus, or to introduce a new let-in construct
let x := t1 ∈ T in t2
used to introduce locally typed definitions.</p>
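      <p>One possible implementation path is to treat this construct as a typed redex (a hypothetical sketch: the constructor letIn and the clause below are not part of our implementation):
of (letIn T1 T F) Type Level
  :- isa T1 T Level
   , of (F T1) Type Level
  .
where F is the meta-level function x\ t2; substituting t1 for x directly in the body mirrors the manual cut elimination mentioned above.</p>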
    </sec>
    <sec id="sec-20">
      <title>As of this writing this let-in is not yet implemented, as there are many different ways the needed functionality can be obtained</title>
      <p>each with different compromises, and it is not entirely clear which implementation method is the most adequate for our project.
In our simple cases it is acceptable to manually perform the cut elimination to locally include the desired lemma, so we plan
to implement a proper let-in once there is a strong need for it.</p>
      <sec id="sec-20-1">
        <title>The Interpretation</title>
        <p>Another difference between the calculus in Maietti’s work and ours is that our model validates the ξ-rule at the intensional
level; removing it would require a refactoring of the conversion to include a list of forbidden variables that block conversion.
Until now it has not been a priority to implement this functionality, but we have some feasible implementation paths.
One of the main results of [Mai09] is a categorical proof of the validity of an interpretation of an extensional type theory
into an intensional one.</p>
        <p>In particular, first a quotient model Q(mTT) is built on the theory mTT; this model is given as a category where objects
are pairs (A, =A) (abbreviated to A/= when =A is clear from the context) where A is an intensional type called support
and</p>
        <p>x =A y props [x ∈ A, y ∈ A]</p>
        <p>is an equivalence relation on A, which means that there are terms rflA, symmA and traA such that the following judgements
hold in mTT</p>
        <p>rflA(x) ∈ x =A x [x ∈ A]
symmA(x, y, u) ∈ y =A x [x ∈ A, y ∈ A, u ∈ x =A y]
traA(x, y, z, u, v) ∈ x =A z [x ∈ A, y ∈ A, z ∈ A, u ∈ x =A y, v ∈ y =A z]</p>
      </sec>
    </sec>
    <sec id="sec-21">
      <title>In the following we will refer to the objects of Q(mTT) as setoids.</title>
      <p>The morphisms of Q(mTT) from (A, =A) to (B, =B) are all the well-typed terms of mTT</p>
      <p>f(x) ∈ B [x ∈ A]</p>
      <p>that preserve the setoid equality, that is for which there exists a proof-term</p>
      <p>pr1(x, y, z) ∈ f(x) =B f(y) [x ∈ A, y ∈ A, z ∈ x =A y]</p>
      <p>Equality between two morphisms f, g : (A, =A) → (B, =B) holds if they map equal elements of A/= to equal elements of
B/=, that is if there is a proof-term</p>
      <p>pr2(x) ∈ f(x) =B g(x) [x ∈ A]</p>
      <p>Starting from the category Q(mTT) we can also define dependent setoids: given a setoid A/= we say that</p>
      <p>B/=(x) [x ∈ A/=]</p>
      <p>is a dependent setoid if its support is a dependent type B(x) type [x ∈ A], if its relation is a dependent proposition</p>
      <p>y =B(x) y′ [x ∈ A, y ∈ B(x), y′ ∈ B(x)]</p>
      <p>and if there is a substitution morphism</p>
      <p>σ(x1,x2)(d, y) ∈ B(x2) [x1 ∈ A, x2 ∈ A, d ∈ x1 =A x2, y ∈ B(x1)]</p>
      <p>that satisfies the properties given by Definition 4.9 in [Mai09], that is: it needs to preserve the equality of B/=, to be
proof-irrelevant by not depending on the particular choice of the proof-term d ∈ x1 =A x2 (so that we can write σ(x1,x2)(y) in place of
σ(x1,x2)(d, y)), there needs to always exist the identity substitution¹ σ(x1,x1)(rflA(x1), –), and lastly substitution morphisms need
to be composable, that is we require</p>
      <p>σ(x2,x3)(σ(x1,x2)(y)) =B(x3) σ(x1,x3)(y)</p>
    </sec>
    <sec id="sec-22">
      <title>¹ Which in general does not need to be the identity morphism; for example</title>
      <p>σ(x,x)(rflA(x), w) ≡ elim+(w, (z).inl(σ(x,x)(rflA(x), z)), (z).inr(σ(x,x)(rflA(x), z)))
if w ∈ B + C [x ∈ A];
in the appropriate context².</p>
      <p>Over these substitutions we can define the canonical isomorphism between two types B1 and B2 as the unique morphism between them.</p>
    </sec>
    <sec id="sec-23">
      <title>In our code we have implemented these isomorphisms with the predicate tau B B’ P.</title>
      <p>The reason these constructions are needed is that the strong elimination rule of the extensional propositional equality
can derive a judgemental equality from just a hypothetical assumption. As an example, a judgement like
true ∈ Eq(C, c, d) [c ∈ C, d ∈ C] is a well-formed but not derivable judgement in emTT, but if we add one more assumption
to the context we obtain</p>
      <p>true ∈ Eq(C + C, inl(c), inl(d)) [c ∈ C, d ∈ C, h ∈ Eq(C, c, d)]</p>
      <p>which is derivable, since we can use the rules prop-true and Eq-E to derive the judgement c = d ∈ C from the given
context.</p>
    </sec>
    <sec id="sec-24">
      <title>This can be seen also in the fact that this judgement</title>
      <p>∀x∈1 ∀f∈(Eq(1, x, ⋆) → 1) (Eq(1, ⋆, x) ⇒
Eq(1, apply(f, eq(1, ⋆)), apply(f, eq(1, ⋆)))) prop</p>
      <p>is derivable in the empty context even if there is a mismatch between the types of f and eq(1, ⋆). Here for example we
would want to apply a canonical isomorphism to ⟦eq(1, ⋆)⟧⁴ so that its type matches the domain of ⟦f⟧; after the
correction we obtain apply(⟦f⟧, τ(⟦Eq(1, ⋆, ⋆)⟧, ⟦Eq(1, x, ⋆)⟧)(⟦eq(1, ⋆)⟧)), which is well-typed in mTT. From this
introduction to the structure of the interpretation we can see that the method used to force the semantics of
emTT on mTT (via Q(mTT)) is to require the creation of suitable proof terms, thus restricting the available language to a
subset that preserves the extensional semantics.</p>
    </sec>
    <sec id="sec-25">
      <title>We can see another more explicit application of this method in the interpretation of dependent products, indeed in this</title>
      <p>case a function term of emTT is interpreted as a pair whose first projection is an intensional function and whose second
projection is a proof that the first projection preserves the setoid equalities of Q(mTT).</p>
    </sec>
    <sec id="sec-26">
      <title>From the point of view of the implementation this means that we have to produce various proofs for each of the type constructors we wish to include in our implementation.</title>
    </sec>
    <sec id="sec-27">
      <title>As of now we have implemented all that is needed to fully interpret the subset of emTT comprising singletons, dependent products, implications, universal quantifications and propositional equality. We believe that this subset exhibits enough of the complexity of the interpretation, and we feel confident that the structure of our implementation will also be able to cover the needs of the remaining constructors.</title>
      <sec id="sec-27-1">
        <title>The Code</title>
        <p>All the code was written in λProlog (in particular it was written for the ELPI implementation [Dun+15] of λProlog) and
makes extensive use of many high-level features that would not be available in other languages. Still, our code does not use
many of the more advanced features offered by ELPI, like constraint programming, which we plan to use in the next phase
of the project, when we will start to implement a kernel for the planned proof checker.</p>
      </sec>
    </sec>
    <sec id="sec-28">
      <title>In this section we will show some extracts of the code and explain how we obtained the desired functionality.</title>
      <sec id="sec-28-1">
        <title>Project Structure and Extensibility</title>
        <p>We have divided the code into 4 categories: main, calc, test, debug. The first, main, holds the type declarations for the other
modules and defines the general functionality of the theory shared by all type constructors of both levels. The second is a
folder containing a .elpi file for each type constructor; in each of these files we implement the typing and conversion
rules for that particular type constructor. For example in calc/setPi.elpi we keep all definitions related to dependent
products, like the implementation of Π-I or the interpretation of a function term.
of (lambda B F) (setPi B C) IE
  :- ofType B _ IE
   , (pi x\ locDecl x B =&gt; isa (F x) (C x) IE)
  .</p>
        <p>² It is worth noting that in the Minimalist Foundation the only types that directly take terms as arguments are the propositional equalities (written Eq
and Id in emTT and mTT respectively). For any B(x) type [x ∈ A] the only possible occurrences of x are as arguments of Eq (resp. Id if in mTT);
together with the absence of strong elimination of terms towards types, this means that conversion of types always reduces to conversion of terms
and that types containing no propositions cannot be truly dependent. For example the canonical way to construct a type of lists of elements of C of length n is via a
dependent sum Σ(l∈List(C)) P(l, n), where P(l, n) is the proposition “the length of l equals n”.</p>
        <p>⁴ In this work we use the double bracket notation ⟦·⟧ for the interpretation instead of the I used in [Mai09].</p>
      </sec>
    </sec>
    <sec id="sec-29">
      <title>The last two, test and debug, hold utilities to quickly test the predicates from main and calc in addition to useful</title>
      <p>predicates needed to inspect the runtime behaviour of the type checking and of the interpretation.</p>
      <sec id="sec-29-1">
        <title>3.2 Explanation of the Predicates</title>
        <p>We can divide the predicates defined in our implementation into three groups: judgement predicates, conversion predicates
and interpretation predicates.
3.2.1 of, isa, ofType, isaType and locDecl
In the first group are the predicates corresponding to the judgements A type [Γ] and a ∈ A [Γ]. We use the
of Term Type Level predicate to represent the judgement Term ∈ Type [Γ] at the level indicated by the third
argument; we also define another predicate isa Term Type Level to check whether the inferred type of Term is convertible with
Type at level Level. A major difference between these two predicates is that for of the first and last arguments are inputs
and the middle one is an output, while for isa all three arguments are inputs. Similarly ofType Type Kind Level and
isaType Type Kind Level implement the corresponding functionality at the type level.</p>
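        <p>For concreteness, the relation between of and isa can be sketched with a single clause (a simplified sketch: the actual clause in our code may differ, e.g. in how it threads auxiliary information):
isa Term Type Level
  :- of Term Type' Level
   , conv Type' Type
  .
so that checking mode (isa) is obtained from inference mode (of) followed by a conversion test.</p>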
        <p>locDecl is used to form the context where of will look up the types of variables, via the clause
of X Y _ :- locDecl X Y. We have no need for a parameter indicating the level for locDecl because
extensional and intensional variables are disjoint and we can assume that we will never try to compute the extensional type of
an intensional variable. Moreover, as of now our implementation validates the ξ-rule at the intensional level, so that the
conversion too can be level agnostic.
3.2.2 pts_* Predicates
pts_leq A A.
pts_leq set col.
pts_leq props set.
pts_leq propc col.
pts_leq props propc.
pts_leq props col.</p>
        <p>In the Minimalist Foundation the kind of a type is in general not unique, but it admits a most general kind; this most general kind is
the intended output of ofType, and the logic to find it is implemented in the various pts_* predicates. As an example, a
dependent product Π(B, C) is never a proposition, and it is a set if both B set and C(x) set [x ∈ B] hold.</p>
      </sec>
    </sec>
    <sec id="sec-30">
      <title>This can be expressed with the following code:</title>
      <p>pts_fun A B set :- pts_leq A set, pts_leq B set, !.
pts_fun _ _ col.</p>
    </sec>
    <sec id="sec-31">
      <title>Other variations allow to decide the most general kind of universal/existential quantifiers, connectives or lists.</title>
      <p>3.2.3 conv, dconv, hnf and hstep
While conversion at the intensional level is almost straightforward, at the extensional level we need to be more careful, as a
naive implementation would be extremely inefficient due to the rule Eq-E</p>
      <p>true ∈ Eq(C, c, d) [c ∈ C, d ∈ C]
----------------------------------- Eq-E
c = d ∈ C</p>
    </sec>
    <sec id="sec-32">
      <title>The solution we have chosen to implement consists of two steps: first we weaken the Eq-E rule to perform a context lookup instead of trying to prove a judgement, and then we divide the conversion into three different operations.</title>
      <p>conv A A :- !.
conv A B :- locDecl _ (propEq _ A B).
conv A B :- (hnf A A'), (hnf B B'), (dconv A' B').</p>
      <p>The predicate that actually implements the conversion judgement is conv; for this predicate both arguments are
inputs. When called on two terms the predicate first checks whether the two terms are unifiable at the meta-level; if
this fails, it searches for a context declaration that states the equality it wants; if this also fails, it reduces both
terms to head normal form and applies a dconv step to the head normal forms. dconv’s role is to propagate the conversion to
subterms; it assumes it receives two terms in head normal form (as it will only ever be called on the output of hnf) and
checks for conversion on the corresponding subterms of its arguments.
dconv (setSigma B C) (setSigma B' C')
  :- (conv B B')
   , (pi x\ locDecl x B =&gt; conv (C x) (C' x))
  .
dconv (pair B C BB CC) (pair B' C' BB' CC')
  :- (conv B B')
   , (pi x\ locDecl x B =&gt; conv (C x) (C' x))
   , (conv BB BB')
   , (conv CC CC')
  .
dconv (elim_setSigma Pair M MM) (elim_setSigma Pair' M' MM')
  :- (conv Pair Pair')
   , (of Pair (setSigma B C) _)
   , (pi z\ locDecl z (setSigma B C) =&gt; conv (M z) (M' z))
   , (pi x\ pi y\ locDecl x B
       =&gt; locDecl y (C x)
       =&gt; conv (MM x y) (MM' x y))
  .</p>
      <p>In general it is necessary to declare a dconv clause for each new non-constant constructor (constant constructors are
all handled by a dconv A A :- !. clause); the main reason to use dconv is that it does not trigger the context lookup
associated with the Eq-E rule.
3.2.4 tau, setoid_*, proof_eq and interp</p>
    </sec>
    <sec id="sec-33">
      <title>The main entry point of the interpretation is the predicate interp; for example the typical structure of our tests is:</title>
      <list list-type="order">
        <list-item><p>construct a suitable extensional type;</p></list-item>
        <list-item><p>typecheck the type at the extensional level;</p></list-item>
        <list-item><p>interpret the type into the intensional level;</p></list-item>
        <list-item><p>typecheck this last result.</p></list-item>
      </list>
      <p>During the interpretation many other auxiliary predicates are needed. Here we will give a description of the most important ones.</p>
    </sec>
    <sec id="sec-38">
      <title>As explained above during the interpretation we need to correct the types of interpreted terms to compensate for the</title>
      <p>different rules at the two levels; in our code we have chosen to implement this functionality with the predicate tau T1 T2 P.
tau takes in input 2 mutually convertible extensional types and returns a λProlog function which converts elements of type
T1 to elements of type T2. It is important to note that P is a pure λ-function at the meta-level, and thus in the conversion
between the two types T1 and T2 we cannot use backtracking or pattern matching; thankfully a pure function is all we need,
because canonical isomorphisms only apply η-expansions and do not actually inspect the term they are applied to, only its
type, and that is completely known to tau. As an example we show what this type correction looks like for a disjoint
sum, for a variable w ∈ ⟦B(x) + C(x)⟧ [⟦x ∈ A⟧] when transitioning to the type ⟦B(x′) + C(x′)⟧:</p>
      <p>σ(x,x′)(w) ≡ elim+(w, (y1).σ(x,x′)(y1), (y2).σ(x,x′)(y2))</p>
    </sec>
    <sec id="sec-39">
      <title>We can see that we have applied structural recursion over the types even when computing a function over terms, this is</title>
      <p>helpful when treating types with more than one canonical constructor and when dealing with variables, but in general it is
necessary since we cannot assume that all terms will have a canonical head normal form.</p>
    </sec>
    <sec id="sec-40">
      <title>It is important to note that all predicates working at the interpretation level will always assume to handle well formed</title>
      <p>and well typed terms and types so that we do not need to fill our code with an enormous amount of runtime checks.</p>
      <p>A related predicate is tau_eq, which at the moment is conceptually the most complex predicate in the code; its aim
is to implement the second and third properties required of substitution morphisms: preserving the equalities of Q(mTT). In
formulae this means that the following hypothetical judgement should be derivable for some choice of d2</p>
      <p>d2 ∈ σ(x1,x2)(y) =B(x2) σ(x1,x2)(y′) [x1 ∈ A, x2 ∈ A, d1 ∈ x1 =A x2,
y ∈ B(x1), y′ ∈ B(x1), w ∈ y =B(x1) y′] (1)</p>
      <p>tau_eq’s role is exactly to find a constructive and functional (after taking as inputs B(x1) and B(x2)) transformation that
produces such a d2 from y, y′ and d1.</p>
    </sec>
    <sec id="sec-41">
      <title>This predicate becomes most useful during the interpretation of dependent products, since in Q(mTT) a dependent</title>
      <p>function is represented as a pair containing an mTT function f and a proof that f preserves the extensionality of Q(mTT),
similarly to how it is done in [Nec97].</p>
    </sec>
    <sec id="sec-42">
      <title>Another problem in dealing with dependent products is that we need to automatically generate a proof that the</title>
      <p>interpretations of emTT functions will preserve the equalities of Q(mTT). The generation of these proofs is handled by the
predicate proof_eq.</p>
    </sec>
    <sec id="sec-43">
      <title>A family of related predicates is built around the setoids of Q(mTT); the principal one, setoid_eq, takes an extensional</title>
      <p>type in input and returns an mTT proposition representing the setoid equality of the interpretation of the given type;
three related predicates setoid_refl, setoid_symm and setoid_trans produce proofs that the proposition returned by
setoid_eq is indeed an equivalence relation.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref0">
        <mixed-citation>
          [NM88]
          <string-name>
            <given-names>Gopalan</given-names>
            <surname>Nadathur</surname>
          </string-name>
          and
          <string-name>
            <given-names>Dale</given-names>
            <surname>Miller</surname>
          </string-name>
          . “
          <article-title>An overview of λProlog</article-title>
          ”. (
          <year>1988</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref1">
        <mixed-citation>
          [Dun+15]
          <string-name>
            <given-names>Cvetan</given-names>
            <surname>Dunchev</surname>
          </string-name>
          et al. “
          <article-title>ELPI: Fast, Embeddable, λProlog Interpreter</article-title>
          ”.
          <source>In: Logic for Programming</source>
          ,
          <source>Artificial Intelligence, and Reasoning</source>
          . Springer.
          <year>2015</year>
          , pp.
          <fpage>460</fpage>
          -
          <lpage>468</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [Mai09]
          <string-name>
            <given-names>Maria Emilia</given-names>
            <surname>Maietti</surname>
          </string-name>
          . “
          <article-title>A minimalist two-level foundation for constructive mathematics</article-title>
          ”.
          <source>In: Annals of Pure and Applied Logic</source>
          <volume>160</volume>
          .3 (
          <year>2009</year>
          ), pp.
          <fpage>319</fpage>
          -
          <lpage>354</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [MS05]
          <string-name>
            <given-names>Maria Emilia</given-names>
            <surname>Maietti</surname>
          </string-name>
          and
          <string-name>
            <given-names>Giovanni</given-names>
            <surname>Sambin</surname>
          </string-name>
          . “
          <article-title>Toward a minimalist foundation for constructive mathematics</article-title>
          ”.
          <source>In: From Sets and Types to Topology and Analysis: Practicable Foundations for Constructive Mathematics</source>
          <volume>48</volume>
          (
          <year>2005</year>
          ), pp.
          <fpage>91</fpage>
          -
          <lpage>114</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [Nec97]
          <string-name>
            <given-names>George C.</given-names>
            <surname>Necula</surname>
          </string-name>
          . “
          <article-title>Proof-carrying code</article-title>
          ”.
          <source>In: Proceedings of the 24th ACM SIGPLAN-SIGACT symposium on Principles of programming languages. ACM</source>
          .
          <year>1997</year>
          , pp.
          <fpage>106</fpage>
          -
          <lpage>119</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>