   Enumerating Pseudo-Intents in a Partial Order

                    Alexandre Bazin and Jean-Gabriel Ganascia

        Université Pierre et Marie Curie, Laboratoire d’Informatique de Paris 6
                                      Paris, France
                                Alexandre.Bazin@lip6.fr
                              Jean-Gabriel@Ganascia.name


        Abstract. The enumeration of all the pseudo-intents of a formal context
        is usually based on a linear order on attribute sets, the lectic order.
        We propose an algorithm that uses the lattice structure of the set of
        intents and pseudo-intents to compute the Duquenne-Guigues basis. We
        argue that this method allows for efficient optimizations that reduce the
        required number of logical closures. We then show how it can be easily
        modified to also compute the Luxenburger basis.


  1   Introduction
  Various fields, such as artificial intelligence, data mining and databases, are
  concerned with the problem of finding implications between sets of attributes
  describing objects. Formal concept analysis offers a sound mathematical frame-
  work to help develop algorithms that compute implications. As it has been shown
  in [3], the set of implications of minimum cardinality needed to obtain all the
  interesting information can be found by enumerating attribute sets known as
pseudo-intents. The best-known algorithm for the computation of pseudo-intents
is Next Closure [4], in which pseudo-intents are enumerated following a total
order on attribute sets called the lectic order. The problem of enumerating
pseudo-intents in lectic order has been studied in [2] and found to be
coNP-complete. The same work was unable to establish a lower bound on the
complexity of this problem once the restriction on the order is removed. This
indicates that the order plays an important role and that using a weaker order could potentially
  yield interesting results. This prompted us to study an algorithm based on the
  same principles as Next Closure but with a partial order on the attribute sets
  instead of a linear order. We are inspired by the multiple algorithms for the
  related problem of enumerating formal concepts that make use of the lattice
structure of the problem. There already exist approaches that do not use the
lectic order for pseudo-intents, such as the attribute-incremental algorithm
introduced in [8] or the divide-and-conquer approach in [9], but we will not
compare them to our own in this work as their natures differ.
      After an overview of the notations in formal concept analysis and a brief
  recall of Next Closure, we will present our algorithm. In a second part, we will
  show that it can be modified to compute not only the Duquenne-Guigues basis
  but also the Luxenburger basis. The functioning of the algorithm will then be
  illustrated by means of a small example.

© paper author(s), 2013. Published in Manuel Ojeda-Aciego, Jan Outrata (Eds.): CLA
  2013, pp. 45–56, ISBN 978–2–7466–6566–8, Laboratory L3i, University of La
  Rochelle, 2013. Copying permitted only for private and academic purposes.


2     Notations

In formal concept analysis, data is represented by a triple C = (O, A, R), called
formal context, where O is a set of objects, A a set of binary attributes and
R ⊆ O × A a relation assigning a set of attributes to each object. An object o
is said to be described by an attribute set A iff (o, a) ∈ R for every a ∈ A.

     Two derivation operators are commonly employed, defined as follows:

                                  ·′ : 2^A → 2^O


                        A′ = {o ∈ O | ∀a ∈ A, (o, a) ∈ R}                     (1)


                                  ·′ : 2^O → 2^A


                        O′ = {a ∈ A | ∀o ∈ O, (o, a) ∈ R}                     (2)

    The compositions of these two operators are closure operators and we will
use the notation ·′′ for both. A set A is said to be closed if A = A′′. A closed
attribute set is called an intent and a closed object set an extent. A pair (E, I)
where E ⊆ O and I ⊆ A is called a formal concept of the context if I = E′ and
E = I′. A formal concept (A, B) is said to be more general than a concept (C, D)
if C ⊂ A or, equivalently, B ⊂ D. The set of all concepts of a given context
ordered in this way is a complete lattice called the concept lattice of the context.

   An implication is a relation between two attribute sets A and B, denoted
by A → B. An implication A → B is said to hold in a context if every object
described by A is also described by B. We write A ≡ B if A → B and B → A.
The implication A → A′′ always holds in the context.

    An attribute set A is a pseudo-intent if it is not an intent and P′′ ⊂ A
for all pseudo-intents P ⊂ A. The set B = {A → A′′ | A is a pseudo-intent},
called the canonical basis, is a set of implications of minimum cardinality such
that every implication that holds in the context can be inferred from it through
the application of the Armstrong rules. That is, it is a minimum-size set of
implications containing all the information on the relations between attribute
sets in the context.

     Computing the canonical basis of a context thus requires the enumeration of
its pseudo-intents. This presents some challenges as the number of pseudo-intents
can be exponential in the size of the context (the number of attributes), as shown
in [5]. Moreover, whether pseudo-intents can be enumerated without intents is
still an open problem and the number of intents can itself be exponential in the
number of pseudo-intents.


2.1   Next Closure
The best-known algorithm for computing the canonical basis is Ganter's Next
Closure [4]. For an arbitrary linear order < on A, a linear order ≤lec on
attribute sets, called the lectic order, is defined as follows:

                          A ≤lec B ⇔ min(A∆B) ∈ B
    ∆ denotes the symmetric difference. The implicational closure of an attribute
set A with respect to a set of implications L, usually noted L(A), is the smallest
superset of A such that, for every X ⊆ L(A), we have X → Y ∈ L ⇒ Y ⊆ L(A).
The implicational closure is a closure operator. The implicational pseudo-closure
of an attribute set A, noted L−(A), is the smallest superset of A such that, for
every X ⊂ L−(A), we have X → Y ∈ L ⇒ Y ⊆ L−(A).
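To make these two notions concrete, here is a minimal Python sketch under our own naming conventions (ATTRS, leq_lectic and impl_closure are illustrative names, not from the paper); impl_closure realizes L(·) as a fixpoint, and the pseudo-closure L−(·) would be obtained by restricting premises to strict subsets of the growing set.

```python
ATTRS = "abcde"  # a fixed linear order a < b < c < d < e

def leq_lectic(A, B):
    """A <=_lec B iff A == B or min(A delta B) belongs to B."""
    if A == B:
        return True
    diff = set(A) ^ set(B)               # symmetric difference
    return min(diff, key=ATTRS.index) in B

def impl_closure(A, L):
    """Smallest superset of A closed under the implications in L.

    L is a list of (premise, conclusion) pairs of frozensets."""
    closed = set(A)
    changed = True
    while changed:
        changed = False
        for X, Y in L:
            if X <= closed and not Y <= closed:
                closed |= Y
                changed = True
    return frozenset(closed)
```

For instance, with the single implication a → ab, the implicational closure of {a} is {a, b}.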
    It can be shown that both intents and pseudo-intents are closed under B−. Next
Closure, in its original form, computes closed sets for a given closure operator.
Applied to the computation of pseudo-intents, it goes through implicationally
pseudo-closed subsets of A in lectic order and checks whether each result is an
intent or a pseudo-intent. The lectic order being linear, every set is computed
once and only once. For an attribute set A and an attribute i, the operator ⊕ is
defined as follows:

                       A ⊕ i = B−({a ∈ A | a ≤ i} ∪ {i})
    The closed set that immediately follows A in the lectic order is A ⊕ i with i
maximal such that i = min((A ⊕ i) \ A). For each set closed under B−, Next Closure
tests whether it is an intent or a pseudo-intent and then constructs the next set
with the operator ⊕ until it reaches A, the full attribute set. Numerous
optimizations can be added to reduce the number of implicational closures. For
example, as proposed in [8], if Next(A) = A ⊕ i = B and min(B′′ \ A) < max(A),
then we can continue as if A ⊕ i had been rejected by Next Closure. It is the
only optimization we will consider here as it is specific to Next Closure and
has no impact on our algorithm.
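The ⊕ operator and the "maximal i" rule can be sketched as follows (plus and next_closed are our own names; closure stands for an arbitrary closure operator, instantiated below with the identity so that every subset is closed).

```python
ATTRS = "abcde"  # a fixed linear order a < b < c < d < e

def plus(A, i, closure):
    """The operator A (+) i: close the part of A up to i, together with i."""
    kept = {a for a in A if ATTRS.index(a) <= ATTRS.index(i)}
    return frozenset(closure(kept | {i}))

def next_closed(A, closure):
    """Lectically smallest closed set greater than A (None if A is the last)."""
    for i in reversed(ATTRS):            # try the maximal attribute first
        if i in A:
            continue
        B = plus(A, i, closure)
        if min(B - A, key=ATTRS.index) == i:
            return B
    return None
```

With the identity as closure, next_closed enumerates all subsets of {a, ..., e} in lectic order: ∅, then e, then d, then de, and so on.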


3     Algorithm
3.1   Principles
We consider the lattice Φ = ({A ⊆ A | A = B−(A)}, ⊆) of attribute sets closed
under B−, ordered by the inclusion relation. Enumerating pseudo-intents amounts
to enumerating all the elements of Φ and testing whether they are intents or
pseudo-intents. As with other enumeration problems, we want to make sure every
element is found once and only once. The other great FCA problem, computing the
formal concepts of a context, can be viewed as the enumeration of the elements
of ({B(A) | A ⊆ A}, ⊆). It has been extensively studied (see [6] for a compari-
son of different algorithms) and various approaches have been proposed both for
generating elements from previous ones and for checking whether an attribute set
has already been found (or even preventing the generation of redundant sets).


Surprisingly, it seems only Next Closure is commonly used for the problem of
pseudo-intents, non-batch algorithms excluded. Indeed, it is very efficient in that
the lectic order on attribute sets allows for the generation of the next set with-
out storing more than the current set, and the total order ensures we do not
generate sets more than once. However, we feel that the fact that sets are not
always generated from one of their subsets is a weakness for some applications,
such as data mining. We propose to modify Next Closure with techniques from
algorithms for formal concepts that make use of the lattice structure.
    The lectic order has the interesting property that a set B is greater than all
its subsets. If B ∈ Φ is to be generated by a single set, it should be
one that is easily identifiable, such as its lectically greatest subset A ∈ Φ. For any
two A and B in Φ, we shall use A ≺ B to denote the fact that A is the lectically
greatest subset of B closed under B− or, equivalently, that B is generated from
A. We say that A is the predecessor of B and B is a successor of A.
Proposition 1. The relation ≺ defines a spanning tree of the neighbouring
graph of Φ.
Proof. Any non-empty set B has at least one strict subset in Φ. We call A its
lectically greatest subset in Φ. The lectic order respects the inclusion relation so
there is no C such that A ⊂ C ⊂ B, and A is a lower cover of B. Since every
non-empty B ∈ Φ has a single lower cover A such that A ≺ B, the successor
relation defines a spanning tree of the neighbouring graph of Φ. □
Proposition 2. For any two A, B ∈ Φ, the attribute set B is an upper cover
of A if and only if B = B−(A ∪ {i}) for every i ∈ B \ A.
Proof. If B is an upper cover of A, then there is no C such that A ⊂ B−(C) ⊂ B.
We have B−(A ∪ {i}) ⊆ B when i ∈ B \ A, so B = B−(A ∪ {i}) for all i ∈ B \ A.
    If B = B−(A ∪ {i}) for every i ∈ B \ A, given that A ∪ {i} is the smallest
subset of B that contains both A and i, then Φ contains no superset of A that is a
strict subset of B. □
    Attribute sets are generated according to the tree, starting from ∅. The re-
cursive nature of the definition of a pseudo-intent prevents us from computing
a set before the computation of all its subsets. The lectic order solves this prob-
lem efficiently but our tree contains only the knowledge of the lectically greatest
subset. In order to make sure that every subset has been found first, we pro-
pose to consider attribute sets in order of increasing cardinality. This way, when
an intent or pseudo-intent of cardinality n is considered, we are sure that all
pseudo-intents of cardinality n − 1 have been found.
Proposition 3. If i < min(A) and A ⊂ A ⊕ i, then A ⊕ i has a subset in Φ that
is lectically greater than A.
Proof. If i < min(A), then B−({a ∈ A | a ≤ i} ∪ {i}) = B−({i}). If A ⊂ A ⊕ i,
then {i} is a pseudo-intent that is a subset of A ⊕ i and is lectically greater than
A. □
                              Enumerating Pseudo-Intents in a Partial Order        49


Algorithm 1 Successors(A)
 1: Successors = ∅
 2: for every attribute i greater than min(A) do
 3:   B = A ⊕ i
 4:   if A ⊂ B then
 5:      if i = min(B \ A) and ∀j ∈ B \ A, A ⊕ j = B then
 6:         Successors = Successors ∪ {B}
 7:      end if
 8:   end if
 9: end for
10: return Successors



Proposition 4. For any two attribute sets A and B such that A ⊂ B, A ≺
B ⇔ ∀i ∈ B \ A, A ⊕ i = B.

Proof. If ∀i ∈ B \ A, A ⊕ i = B, then B has no subset in Φ lectically greater
than A. As such, ∀i ∈ B \ A, A ⊕ i = B ⇒ A ≺ B.
    If A ≺ B and ∃i ∈ B \ A such that A ⊕ i ⊂ B, then B has a subset in Φ
lectically greater than A, which contradicts the hypothesis. As such, A ≺ B ⇒
∀i ∈ B \ A, A ⊕ i = B. □

    So, in order to generate all the successors of A, it suffices to compute A ⊕ i
for every i greater than min(A). If a resulting attribute set is an upper cover of
A it is a successor.
   Testing whether a new attribute set B = A ⊕ i is a successor of A is easy
as we know A ⊕ j for every j ∈ B \ A. Algorithm 1 uses this to compute the
successors of an attribute set.
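Algorithm 1 can be sketched in Python as follows (successors and pclosure are our own names; pclosure plays the role of the pseudo-closure B−):

```python
ATTRS = "abcde"  # a fixed linear order on attributes

def successors(A, pclosure):
    """Sketch of Algorithm 1: upper covers of A in the lattice that A generates.

    pclosure is the implicational pseudo-closure used by the (+) operator."""
    def plus(S, i):
        kept = {a for a in S if ATTRS.index(a) <= ATTRS.index(i)}
        return frozenset(pclosure(kept | {i}))

    result = set()
    for i in ATTRS:
        if A and ATTRS.index(i) <= min(ATTRS.index(a) for a in A):
            continue                     # only attributes greater than min(A)
        B = plus(A, i)
        # Proposition 4: B is a successor iff i = min(B \ A) and
        # every j in B \ A yields the same set.
        if A < B and i == min(B - A, key=ATTRS.index) \
                 and all(plus(A, j) == B for j in B - A):
            result.add(B)
    return result
```

With the identity as pseudo-closure (an empty implication set), the successors of {b} are {b, c}, {b, d} and {b, e}, matching the second step of the example in Section 5.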
   If an attribute set B is a pseudo-intent, it is a ∧-irreducible element of Φ.
That is, it has a single upper cover. If B = B−(A), the set B′′ is the lectically
next set after B if max(A) ≤ min(A′′ \ A). That way, if an attribute set is a
pseudo-intent, we do not have to explicitly compute its successors.
    Algorithm 2 uses Algorithm 1 to go through the intents and pseudo-intents
of the context and compute its canonical basis. It starts with ∅ and computes
the closure of all the attribute sets of the same cardinality simultaneously before
generating their successors. However, when an intent A is considered and its
successors generated, only the pseudo-intents of cardinality less than |A| + 1
have been found, so the A ⊕ i computed at this point might be different from
the final A ⊕ i. For example, if |A ⊕ i| − |A| ≥ 2, an implication B → C with
|B| = |A| + 1 and B ⊂ A ⊕ i would change the result. For this reason, we must
split Algorithm 1 into two parts. First, the sets A ⊕ i are computed for all i once
every intent of cardinality |A| has been found. Then, when A ⊕ i is considered
as a candidate, we must check whether it is still logically closed and, if it is, test
whether it is a successor with Proposition 4.


Algorithm 2 Computing the Duquenne-Guigues Basis
 1: Card = 0
 2: Candidates = {∅}
 3: B = ∅
 4: while Candidates ≠ ∅ do
 5:   Intents = ∅
 6:   for every attribute set A of cardinality Card in Candidates do
 7:     if A = B−(A) and it is a successor of its generator then
 8:        B = A′′
 9:        if A ≠ B then
10:           B = B ∪ {A → B}
11:           if B lectically follows A then
12:              Candidates = Candidates ∪ {B}
13:           end if
14:        else
15:           Intents = Intents ∪ {A}
16:        end if
17:     else
18:        A = B−(A)
19:     end if
20:   end for
21:   for every set C in Intents do
22:     Candidates = Candidates ∪ {C ⊕ i | i ∈ A and i = min((C ⊕ i) \ C)}
23:   end for
24:   Card = Card + 1
25: end while
26: return B



Proposition 5. Algorithm 2 terminates and produces the canonical basis of the
context.

Proof. An attribute set can only be used to generate attribute sets of greater
cardinality. The set of attributes is finite, so the algorithm terminates when it
reaches the full attribute set A.
    Every element of Φ, except ∅, has a lower neighbour that is its predecessor in
the spanning tree. If we generate the attribute sets along the tree, the lattice
Φ being complete, every one of its elements is considered at least once. If an
attribute set A is not equal to B−(A), it is not a pseudo-intent, so all pseudo-
intents are found. □

    During Algorithm 2, Algorithm 1 is applied to every intent to generate
attribute sets. Their closure is then computed. As such, Algorithm 2 is in
O(|Φ| × (X + Y)) where X is the complexity of computing the closure of a
set and Y = |A| × L is the complexity of generating successors, which depends on
the complexity L of the saturation algorithm used. A study of algorithms for
computing the implicational closure and their use in our problem can be found
in [1]. Note that, since every potential pseudo-intent P is constructed from an
intent I with I ⊂ P, computing P′′ can be done on the subcontext containing
the objects of I′.
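To illustrate the level-wise principle behind Algorithm 2 (processing attribute sets by increasing cardinality, so that all smaller pseudo-intents are already known when a set is tested), here is a deliberately naive sketch. It scans all subsets exhaustively, without the spanning-tree generation or any of the optimizations above; the dict-based context encoding and all names are our own.

```python
from itertools import combinations

def closure(A, objects, attrs):
    """A'' : the attributes shared by all objects described by A."""
    ext = [row for row in objects.values() if A <= row]
    return frozenset(a for a in attrs if all(a in row for row in ext))

def canonical_basis(objects, attrs):
    """Duquenne-Guigues basis, naively: by increasing cardinality, keep A
    when A is not closed and P'' is included in A for every smaller
    pseudo-intent P strictly contained in A."""
    pseudo = []
    for k in range(len(attrs) + 1):
        for combo in combinations(sorted(attrs), k):
            A = frozenset(combo)
            if A != closure(A, objects, attrs) and \
               all(closure(P, objects, attrs) <= A for P in pseudo if P < A):
                pseudo.append(A)
    return [(P, closure(P, objects, attrs)) for P in pseudo]
```

On a two-object toy context {o1: ab, o2: b}, this returns the single implication ∅ → b; the candidate {a} is correctly rejected because the smaller pseudo-intent ∅ has its closure {b} outside {a}.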

3.2   Improvements
Additional reductions in the number of implicational closures required to compute
the basis can be achieved by making use of the property that every attribute set
is constructed by adding an attribute to one of its subsets.
Proposition 6. If B = A ⊕ i is a successor of A with i = min(B \ A), then
B ⊕ j = A ⊕ j for every attribute j < i.
Proof. If j < i, then B−({b ∈ B | b < j} ∪ {j}) = B−({a ∈ A | a < j} ∪ {j})
because i = min(B \ A). Thus, we have B ⊕ j = A ⊕ j. □
    This lets us use the knowledge acquired on a set A to reduce the number
of necessary logical closures on its supersets. Indeed, for any B = A ⊕ i with
i = min(B \ A), the set B ⊕ j is already known for every attribute less than i.
This means that, in order to compute the successors of B, we only need to use
the ⊕ operator with attributes greater than i.
    We can also use the fact that there are pseudo-intents P such that P′′ = A,
which means that no object of the context is described by P. Indeed, for an
intent A such that A ≺ A ⊕ i, there are two possibilities for i. If i ∈ ⋃o∈A′ {o}′,
meaning there is at least one object described by A ∪ {i}, the set A ⊕ i is either an
intent or a pseudo-intent. If i ∈ A \ ⋃o∈A′ {o}′, meaning no object is described
by A ∪ {i}, the closure of A ⊕ i is A and, for any superset B of A, the set B ⊕ i
is either equal to A ⊕ i or A. This means that computing B ⊕ i is unnecessary
for every successor B of A in the spanning tree.
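The test on i can be written directly from the definition (useful_attributes is our own name; it returns ⋃o∈A′ {o}′ for the dict-based context encoding used here):

```python
def useful_attributes(A, objects):
    """Union of the descriptions of all objects described by A.

    Only an attribute i in this set can make A (+) i an intent or a
    pseudo-intent; for any other i, no object is described by A with i."""
    rows = [row for row in objects.values() if A <= row]
    return frozenset().union(*rows) if rows else frozenset()
```

For example, in a context {o1: ab, o2: bc}, the only attributes worth adding to {a} are a and b, since the sole object described by {a} carries no others.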


4     Computing a Base for Partial Implications
4.1   Luxenburger Basis
A partial implication between two attribute sets A and B, otherwise called an asso-
ciation rule, is a relation of the form A →s,c B where s is called the support and
c the confidence. It means that s% of the objects of the context are described
by A and that c% of them are also described by B. Implications, as defined
in Section 2, are partial implications with a confidence of 100%. An attribute
set A having the same support as A′′, the confidence and support of a partial
implication A →s,c B can be derived from A′′ →s,c B′′. As such, to obtain a
basis for partial implications, one only needs to know the intents of the formal
context.
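The two measures can be computed directly from their definitions (support_confidence and the dict-based context encoding are our own):

```python
def support_confidence(A, B, objects):
    """Support and confidence of the partial implication A -> B:
    the fraction of objects described by A, and the fraction of those
    also described by B."""
    covered = [row for row in objects.values() if A <= row]
    support = len(covered) / len(objects)
    confidence = (sum(1 for row in covered if B <= row) / len(covered)
                  if covered else 0.0)
    return support, confidence
```

For instance, in a three-object context {o1: ab, o2: b, o3: a}, the rule a → ab has support 2/3 and confidence 1/2.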
    It has been shown in [7] that a minimal basis for partial implications, called
the Luxenburger basis, is obtained by considering a set {A →s,c B | A = A′′ and
B = B′′} that forms a spanning tree of the neighbouring graph of the intent
lattice, in which every intent is the conclusion of a single partial implication.


4.2   Modification of the Algorithm

In Algorithm 2, every attribute set is generated from a single intent, with the
exception of some intents that lectically follow a pseudo-intent. We
would like those intents to be constructed from their lectically greatest subset
that is an intent instead of just their lectically greatest subset. Let us suppose
that we have an intent A and a pseudo-intent B such that A ≺ B and B ≺ B′′.
The lectically greatest intent that is a subset of B′′ is either A or a superset of
A, so it can be constructed by adding attributes of B′′ \ A to A.
   Algorithm 3 computes this intent C for a given A, B and B′′.


Algorithm 3 Lectically Greatest Sub-Intent
1: C = A
2: for every attribute i in B′′ \ A in increasing order do
3:   if B(C ∪ {i}) ≠ B′′ then
4:      C = C ∪ {i}
5:   end if
6: end for
7: return C



Proposition 7. Algorithm 3 terminates and computes the lectically greatest in-
tent that is a subset of B′′.

Proof. There is a finite number of attributes and the loop goes through them
in a total order, so the algorithm terminates. The attribute set it returns, C, is
either A or closed under B(·), so it is an intent. It is the implicational closure of
a subset of B′′ and it is not equal to B′′, so we have C ⊂ B′′. Let us suppose that
there is an intent D ⊂ B′′ lectically greater than C with i = min(C∆D). The
attribute i is such that B(X ∪ {i}) ⊂ B′′ for any A ⊆ X ⊆ D, so the algorithm
must have added it to C and we have C = D.
    Thus, the algorithm returns C, the lectically greatest intent that is a subset
of B′′. □
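Algorithm 3 can be sketched as follows (greatest_subintent is our own name; impl_close stands for the implicational closure B(·) with respect to the current basis):

```python
ATTRS = "abcde"  # a fixed linear order on attributes

def greatest_subintent(A, Bcc, impl_close):
    """Sketch of Algorithm 3: lectically greatest intent below Bcc (= B''),
    grown from the intent A; impl_close plays the role of B(.)."""
    C = set(A)
    for i in sorted(Bcc - A, key=ATTRS.index):   # increasing attribute order
        if impl_close(C | {i}) != Bcc:
            C.add(i)
    return frozenset(C)
```

Using the implicational closure with respect to the basis of the example in Section 5, the call with A = b and B′′ = bcd returns bd: adding c would close to bcd, so c is skipped, while adding d does not, so d is kept. This matches the run described there.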

    If, in Algorithm 2, we maintain the spanning tree we used to generate at-
tribute sets, it is easy to apply Algorithm 3 to every intent that lectically follows
a pseudo-intent. If we change the predecessor of those intents to the attribute
set computed by Algorithm 3 we obtain a spanning tree of Φ in which the prede-
cessor of every intent is an intent. As A also has a unique predecessor, it gives us
the Luxenburger basis of the context along with the Duquenne-Guigues basis.


5     Example

We apply Algorithm 2 on the following context, with a < b < c < d < e:




                                 Fig. 1. Context 1



   The algorithm starts with ∅. It is closed so we generate its successors.

                                    ∅ ⊕ e = e
                                    ∅ ⊕ d = d
                                    ∅ ⊕ c = c
                                    ∅ ⊕ b = b
                                    ∅ ⊕ a = a

    The set of Candidates is then {a, b, c, d, e}. Among them, only a is a pseudo-
intent, with a′′ = ab, so we add a → ab to the basis. Moreover, ab lectically follows
a so we add ab to Candidates. Its lectically greatest subset that is an intent is b.
We then generate the successors of b, c, d and e. There are no attributes greater
than min(e) = e so e has no successors.

                       b ⊕ e = be c ⊕ e = ce d ⊕ e = de
                       b ⊕ d = bd c ⊕ d = cd
                       b ⊕ c = bc

    The set of Candidates is then {ab, bc, bd, be, cd, ce, de}. Among them, bc, be
and cd are pseudo-intents so we add bc → bcd, be → bde and cd → bcd to B. The
set bcd lectically follows bc so we add bcd to Candidates. Its lectically greatest
subset that is an intent is bd. We then generate the successors of ab, bd, ce and
de. The set de has no successors for the same reasons as the set e in the previous
step, and, since we already know that c ⊕ d = cd, the set ce ⊕ d is known and ce
has no successors.

                           ab ⊕ e = abde bd ⊕ e = bde
                           ab ⊕ d = abd
                           ab ⊕ c = abcd

    The set of Candidates is then {abd, bcd, bde}. Among them, abd is a pseudo-
intent so we add abd → abcde to B. The set abcde lectically follows abd so we
add abcde to Candidates. Its lectically greatest subset that is an intent is ab. We
then generate the successors of bcd and bde. We already know bd ⊕ c so we also
know bde ⊕ c. As such, computing bde ⊕ c is not needed and no other attribute
is available, so bde has no successors.


                                  bcd ⊕ e = bcde

    The set of Candidates is then {bcde, abcde}. Only bcde has a cardinality of 4
and it is a pseudo-intent, so we add bcde → abcde to B. There are no new intents
so we continue with the elements of Candidates of cardinality 5. The set abcde
is an intent and has no successors since it is equal to A, the full attribute set.
     The algorithm stops, having produced the following implications:

 – a → ab
 – bc → bcd
 – be → bde
 – cd → bcd
 – abd → abcde
 – bcde → abcde

   This is the Duquenne-Guigues basis of the context. In addition, the following
spanning tree of the intent lattice has been constructed :




             Fig. 2. Spanning Tree of the Lattice of Intents of Context 1



     It corresponds to the following set of partial implications :

 – ∅ →1,0.6 b
 – ∅ →1,0.4 c
 – ∅ →1,0.6 d
 – ∅ →1,0.6 e
 – b →0.6,0.33 ab
 – b →0.6,0.66 bd
 – c →0.4,0.5 ce
 – d →0.6,0.66 de
 – ab →0.2,0 abcde
 – bd →0.4,0.5 bcd
 – bd →0.4,0.6 bde


    To compute it, Algorithm 3 has been used a total of 3 times.
     To compute the Duquenne-Guigues basis of this context, our algorithm has
performed 16 logical closures. Since the version of Next Closure using the op-
timization presented in Section 2.1 would have performed only 15, Next Closure
is more efficient in this case. However, on the context presented in Figure 3, our
algorithm needs only 16 logical closures while Next Closure performs 19 of them.
This is due to the fact that attributes are not considered if they have been used
to generate an implication P → A, and this context has a number of pairs of
attributes that never appear together. This leads us to believe that our algorithm
may be more efficient in those kinds of cases.




                                Fig. 3. Context 2




6    Conclusion
We proposed an algorithm that computes the Duquenne-Guigues basis of a for-
mal context. It makes use of the lattice structure of the set of both intents
and pseudo-intents in a fashion similar to that of algorithms for computing the
concept lattice. The construction of attribute sets is inspired by Next Closure,
the most common batch algorithm for computing pseudo-intents, in that it uses
the lectic order to ensure the uniqueness of generated sets while avoiding go-
ing through what has already been computed. We showed that the algorithm
can easily be modified, without much loss complexity-wise, to produce both the
Duquenne-Guigues and the Luxenburger bases. These two bases together are
sought after in the domain of association rule mining, where it is crucial to ob-
tain a minimal number of rules. They are usually derived from the set of frequent
concepts, which has to be computed first, and, to the best of our knowledge, no
algorithm is able to compute them both directly and at the same time without
a significant increase in complexity.
    Some improvements can be made on the generation of the attribute sets. Most
notably, the inclusion test in Algorithm 1 could be done during the computation
of the implicational closure, and multiple closures could be realized simultaneously.
The fact that attribute sets are effectively generated following a partial
order instead of a total order could also permit some degree of parallelization.
Such optimizations will be the subject of further research on our part.


References
1. Konstantin Bazhanov and Sergei A. Obiedkov. Comparing performance of algo-
   rithms for generating the Duquenne-Guigues basis. In CLA, pages 43–57, 2011.
2. Felix Distel and Barış Sertkaya. On the complexity of enumerating pseudo-intents.
   Discrete Applied Mathematics, 159(6):450 – 466, 2011.
3. V. Duquenne and J.-L. Guigues. Famille minimale d’implications informatives ré-
   sultant d’un tableau de données binaires. Mathématiques et Sciences Humaines,
   24(95):5–18, 1986.
4. Bernhard Ganter. Two basic algorithms in concept analysis. In Proceedings of the
   8th international conference on Formal Concept Analysis, ICFCA’10, pages 312–
   340, Berlin, Heidelberg, 2010. Springer-Verlag.
5. Sergei O. Kuznetsov. On the intractability of computing the Duquenne-Guigues
   base. Journal of Universal Computer Science, 10(8):927–933, 2004.
6. Sergei O. Kuznetsov and Sergei Obiedkov. Comparing performance of algorithms
   for generating concept lattices. Journal of Experimental and Theoretical Artificial
   Intelligence, 14:189–216, 2002.
7. Michael Luxenburger. Implications partielles dans un contexte. Mathématiques et
   Sciences Humaines, 113:35–55, 1991.
8. S. Obiedkov and V. Duquenne. Attribute-incremental construction of the canonical
   implication basis. Annals of Mathematics and Artificial Intelligence, 49(1-4):77–99,
   April 2007.
9. Petko Valtchev and Vincent Duquenne. On the merge of factor canonical bases.
   In Proceedings of the 6th international conference on Formal concept analysis,
   ICFCA’08, pages 182–198, Berlin, Heidelberg, 2008. Springer-Verlag.