<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Computing the Concept Lattice using Dendritical Neural Networks</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>David Ernesto Caro-Contreras</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Andres Mendez-Vazquez</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Centro de Investigación y Estudios Avanzados del Politécnico Nacional, Av. del Bosque 1145, colonia El Bajío</institution>
          ,
          <addr-line>Zapopan, 45019, Jalisco, México</addr-line>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2013</year>
      </pub-date>
      <fpage>141</fpage>
      <lpage>152</lpage>
      <abstract>
        <p>Formal Concept Analysis (FCA) is a rich emerging discipline that provides efficient techniques and methods for data analysis under the idea of attributes. The main tool used in this area is the Concept Lattice, also named the Galois Lattice or Maximal Rectangle Lattice. A naive way to generate the Concept Lattice is by enumeration of each cluster of attributes. Unfortunately, the number of clusters under the attribute inclusion relation has an exponential upper bound. In this work, we present a novel algorithm, PIRA (PIRA Is a Recursive Acronym), for computing Concept Lattices in an elegant way. This task is achieved through the relation between maximal height and width rectangles and maximal anti-chains. Then, using a dendritical neural network, it is possible to identify the maximal anti-chains in the lattice structure by means of maximal height or width rectangles.</p>
      </abstract>
      <kwd-group>
        <kwd>formal concept analysis</kwd>
        <kwd>lattice generation</kwd>
        <kwd>neural networks</kwd>
        <kwd>dendrites</kwd>
        <kwd>maximal rectangles</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        Formal Concept Analysis (FCA) refers to a mathematized, human-centered
way to analyze data. It can be described as an ontology-based Lattice
Theory. This theory describes ways of visualizing patterns, generating implications,
and representing generalizations and dependencies [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. The theory makes use of
a binary relation between objects and attributes, which plays a fundamental
role in the understanding of human model conceptualization. FCA formalizes all
these ideas, and gives a mathematical framework to work on them [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
      </p>
      <p>
        The Concept Lattice is mainly used to represent and describe hierarchies
between data clusters (formal concepts or classes), which are inherent in the
perceived information. A formal concept can be regarded also as a data cluster,
in which certain attributes are all shared by a set of objects. The Concept Lattice
is the main subject of the theory of FCA, and it was first introduced in the early
eighties by Rudolf Wille [
        <xref ref-type="bibr" rid="ref2 ref3">2, 3</xref>
        ]. Over time, FCA has become a powerful theory
for many interesting applications. Examples of these are in the data analysis
of Frequent Closed Itemsets [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], association rules and Functional/Conditional
hierarchies discovery [
        <xref ref-type="bibr" rid="ref5 ref6 ref7">5, 6, 7</xref>
        ]. For all these reasons, concept lattice generation
is an important topic in FCA research [
        <xref ref-type="bibr" rid="ref10 ref11 ref12 ref13 ref14 ref15 ref8 ref9">8, 9, 10, 11, 12, 13, 14, 15</xref>
        ]. However,
the main drawback of concept lattice generation is the exponential size of the
concept lattice and its generation complexity [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ].
      </p>
      <p>In this work, we present a Morphological Neural Network based method to
generate the Concept Lattice. This batch method is capable of generating the
Hasse diagram, and it is a bottom-up generation algorithm. With that in mind,
this work is organized as follows. Section 2 is an elementary description of FCA
theory. Next, section 3 is a short introduction on how the Single Lattice Layer
Perceptron (SLLP) works. In section 4, a description on how to compute the
Maximal Rectangles is given. In this section, we also define a link between SLLP
and the design of a classifier for maximal anti-chains search. Section 5 shows a
comparison between PIRA and other known algorithms.</p>
    </sec>
    <sec id="sec-2">
      <title>Basic Definitions</title>
      <sec id="sec-2-1">
        <title>Formal Concept Analysis (FCA)</title>
        <p>
          First, it is necessary to define the concept of a formal context [
          <xref ref-type="bibr" rid="ref3">3</xref>
          ].
Definition 1. A binary formal context K is a triple (G, M, I). In this
mathematical structure, G and M are two finite sets, called objects and attributes
respectively, and I ⊆ G × M is a binary relation over G and M, named the
incidence of K.
        </p>
        <p>In order to define the formal concepts of the formal context (G, M, I), it is necessary
to define two derivation operators, named Galois connections.</p>
        <p>Definition 2.</p>
        <sec id="sec-2-1-1">
          <title>For arbitrary subsets A ⊆ G and B ⊆ M :</title>
          <p>A′ := {m ∈ M | (g, m) ∈ I, ∀g ∈ A}</p>
          <p>B′ := {g ∈ G | (g, m) ∈ I, ∀m ∈ B}
These two derivation operators satisfy the following three conditions over
arbitrary subsets A1, A2 ⊆ G and B1, B2 ⊆ M :
1. If A1 ⊆ A2 then A2′ ⊆ A1′; dually, if B1 ⊆ B2 then B2′ ⊆ B1′.
2. A1 ⊆ A1′′ and A1′ = A1′′′; dually, B1 ⊆ B1′′ and B1′ = B1′′′.
3. A1 ⊆ B1′ ⇐⇒ B1 ⊆ A1′.
Next, we give the definition of a formal concept, which represents the building unit
of FCA.</p>
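          <p>As a quick illustration of the two derivation operators (the tiny context below is our own, not from the paper), both can be computed directly from the incidence relation:</p>

```python
# Toy formal context: objects G, attributes M, incidence I (hypothetical example).
G = {"g1", "g2", "g3"}
M = {"a", "b", "c"}
I = {("g1", "a"), ("g1", "b"), ("g2", "b"), ("g2", "c"), ("g3", "b")}

def derive_objects(A):
    """A' := attributes shared by every object in A."""
    return {m for m in M if all((g, m) in I for g in A)}

def derive_attributes(B):
    """B' := objects having every attribute in B."""
    return {g for g in G if all((g, m) in I for m in B)}

print(derive_objects({"g1", "g2"}))   # attributes common to g1 and g2
print(derive_attributes({"b"}))       # objects having attribute b
```

          <p>Applying both operators in sequence illustrates condition 2 above: B ⊆ B′′ for any attribute set B.</p>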
          <p>Definition 3. Let K be a formal context, K := (G, M, I). A formal concept C
of K is defined as a pair C = (A, B), A ⊆ G and B ⊆ M, where the following
conditions are satisfied: A = B′ and A′ = B, where A is named the extent and
B is named the intent of the formal concept (A, B).</p>
          <p>
            Next, we will show some useful notions from Lattice Theory [
            <xref ref-type="bibr" rid="ref17 ref18">17, 18</xref>
            ] to
understand the algebraic structure generated by those derivation operators and the
formal concept idea.
Definition 4. A Partially Ordered Set (Poset) is a set X in which there is a
binary relation ≤ between elements of X, with the following properties:
1. ∀x, x ≤ x (Reflexive).
2. If x ≤ y and y ≤ x, then x = y (Antisymmetric).
3. If x ≤ y and y ≤ z, then x ≤ z (Transitive).
          </p>
          <p>Formal concepts can be partially ordered by inclusion, and any pair Ci,
Cj has a unique greatest lower bound and a unique least upper bound. Then,
the set of all formal concepts in K, ordered by inclusion, is known as a Concept
Lattice. The next definition links these notions with the Formal Concept notion.
Definition 5. (Rectangles in K). Let A ⊆ G and B ⊆ M ; a rectangle in K is
a pair (A, B) such that A × B ⊆ I.</p>
          <p>Given the set of rectangles in K, a special kind of rectangle is defined as
follows:
Definition 6. (Maximal Rectangles). A rectangle (A1, B1) is maximal if and
only if there does not exist another valid rectangle (A2, B2) in K such that A1 ⊂
A2 or B1 ⊂ B2.</p>
          <p>From here, we have the formal concept as a maximal rectangle.</p>
          <p>Theorem 1. (A, B) is a formal concept of K if and only if (A, B) is a maximal
rectangle in K.</p>
          <p>Now, the following denitions are going to be useful for the proposed bottom-up
approach of FCA generation.</p>
          <p>Definition 7. Let K := (G, M, I) and A ⊆ G. A is said to be an object
derivative anti-chain set if and only if A1′ ⊈ A2′ and A2′ ⊈ A1′ for any two distinct
A1, A2 ∈ A. Dually, for B ⊆ M , B1′ ⊈ B2′ and B2′ ⊈ B1′ for any
two distinct B1, B2 ∈ B.</p>
          <p>We will denote a derivative anti-chain by D and the set of all anti-chains
in K by A(K). Any set in which the super-/subconcept order does not hold between
any two distinct elements is called an anti-chain set.</p>
          <p>Definition 8. D+ is a maximal derivative anti-chain iff there does not exist
another D ∈ A(K) such that D ⊃ D+. There exists a maximal derivative
anti-chain for objects and one for attributes.</p>
          <p>Finally, we use a simple upward closed set definition.</p>
          <p>Definition 9. (Upward closed set). Let (L, ≤) be a poset. A set S ⊆ L is said
to be upward closed if for all x ∈ S and y ∈ L, y ≥ x implies that y ∈ S.</p>
          <p>Using these definitions of anti-chains, maximal rectangles and upward closed sets,
it is possible to devise a bottom-up approach by means of dendritic neural
networks.</p>
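          <p>To make the exponential bound concrete, here is a naive enumeration sketch over the same toy context as before (our own, not from the paper): a candidate B ⊆ M is an intent exactly when B = B′′, so brute force inspects all 2^|M| attribute subsets, precisely the cost the anti-chain approach avoids.</p>

```python
from itertools import combinations

# Toy formal context (hypothetical, same as earlier illustration).
G = {"g1", "g2", "g3"}
M = ["a", "b", "c"]
I = {("g1", "a"), ("g1", "b"), ("g2", "b"), ("g2", "c"), ("g3", "b")}

def ext(B):
    """B' : objects having every attribute in B."""
    return {g for g in G if all((g, m) in I for m in B)}

def intent(A):
    """A' : attributes shared by every object in A."""
    return {m for m in M if all((g, m) in I for g in A)}

# Naive enumeration: test all 2^|M| attribute subsets for closure (B = B'').
concepts = set()
for r in range(len(M) + 1):
    for B in combinations(M, r):
        B = frozenset(B)
        if intent(ext(B)) == B:          # B closed, so (B', B) is a maximal rectangle
            concepts.add((frozenset(ext(B)), B))

for A, B in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(A), sorted(B))
```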
        </sec>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>Lattice-Based Neural Networks (LBNN).</title>
      <p>
        Artificial Neural Networks (ANN) are models of learning and automatic
processing inspired by the nervous system. The features of ANN make them quite
suitable for applications where there is no prior pattern that can be identified,
and there is only a basic set of input examples (previously classified or not).
They are also highly robust to noise and to failure of elements in the ANN, and,
finally, they are parallelizable. As we said earlier, our work is related to LBNN,
which are also considered Artificial Neural Networks, and are inspired by recent
advances in the neurobiology and biophysics of neural computation
[
        <xref ref-type="bibr" rid="ref19 ref20">19, 20</xref>
        ].
      </p>
      <p>
        The theory of LBNN is actively used in classication [
        <xref ref-type="bibr" rid="ref20 ref21 ref22">21, 20, 22</xref>
        ], clustering
[
        <xref ref-type="bibr" rid="ref23">23</xref>
        ], associative memories [
        <xref ref-type="bibr" rid="ref24 ref25">24, 25</xref>
        ], fuzzy logic [
        <xref ref-type="bibr" rid="ref26">26</xref>
        ], among others[
        <xref ref-type="bibr" rid="ref27 ref28">27, 28</xref>
        ].
Basically, in the LBNN model, an input layer receives external data, and subsequent
layers perform the necessary functions to generate the desired outputs. Single
Lattice Layer Perceptrons (SLLP), also named Dendritic Single Layer
Perceptrons, are basically classifiers in which there exists a set of input neurons, a
set of output neurons, and a set of dendrites growing from the output neurons.
Those dendrites are connected with the input set by axons from the input
neurons. A training set configures those outputs based on the maximum (∨) and
minimum (∧) operations derived from the morphological algebra (R, +, ∨, ∧). FCA
and Mathematical Morphology share a common algebraic framework: erosion,
dilation, morphological operators, valuations, and many other functions on concept
lattices have been previously studied [
        <xref ref-type="bibr" rid="ref29">29</xref>
        ].
      </p>
      <p>
        In SLLP, a set of n input neurons N1, . . . , Nn accepts input x = (x1, ..., xn) ∈
R^n. An input neuron provides information through axonal terminals to the
dendritic trees of the output neurons. A set of m output neurons is represented by
O1, ..., Om. The weight of an axonal terminal of neuron Ni connected to the kth
dendrite of the output neuron Oj is denoted by w_ijk^ℓ, in which the superscript
ℓ ∈ {0, 1} represents an excitatory (ℓ = 1) or an inhibitory (ℓ = 0) input to the
dendrite. The kth dendrite of Oj will respond to the total value received from
the set of input neurons, and it will accept or reject the given input. Dendrite
computation is the most important operation in LBNN. The following equation
τ_k^j(x), from SLLP, corresponds to the computation of the kth dendrite of the jth
output neuron [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ].
      </p>
      <p>τ_k^j(x) = p_k^j ⋀_{i∈I(k)} ⋀_{ℓ∈L(i)} (−1)^{1−ℓ} (x_i + w_ijk^ℓ)</p>
      <p>
        Here, x is the input vector of neurons N1, ..., Nn and xi is the value of
the input neuron Ni. I(k) ⊆ {1, ..., n} represents the set of all input neurons with
a synaptic connection on the kth dendrite of Oj . The number of terminal axonal
fibers of Ni that synapse on a dendrite of Oj is at most two, since L(i) ⊆ {0, 1}.
Finally, the last element involved is p_k^j ∈ {−1, 1}, which denotes the excitatory
(p_k^j = 1) or inhibitory (p_k^j = −1) response of the kth dendrite of Oj to the
received input. All the values τ_k^j(x) are passed to the neuron cell body. The
value computation of Oj is a function that combines all its dendrite values.
The total value received by Oj is given by [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ]:
τ^j(x) = p^j · ⋁_{k=1}^{K_j} τ_k^j(x)
      </p>
      <p>In this SLLP model, Kj is the number of dendrites of Oj , and p^j = ±1 represents
the response of the cell body to the received input vector. At this point, we
know that p^j = 1 means that the input is accepted and p^j = −1 means that the
cell body rejects the received input vector. The last statement related to Oj
corresponds to an activation function f , namely yj = f [τ^j(x)].</p>
      <p>f [τ^j(x)] = 1 ⇐⇒ τ^j(x) ≥ 0, and f [τ^j(x)] = 0 ⇐⇒ τ^j(x) &lt; 0.</p>
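      <p>As a sketch of how a single SLLP dendrite realizes these formulas (the weights and topology below are made-up illustrative values, not from the paper), one excitatory and one inhibitory terminal on the same input bound an interval on the line:</p>

```python
# SLLP dendrite computation (sketch, made-up weights):
# tau_k(x) = p_k * min over i in I(k), l in L(i) of (-1)^(1-l) * (x[i] + w[i][l])
def dendrite(x, p_k, weights):
    """weights: dict mapping input index i -> dict {l: w} with l in {0, 1}."""
    return p_k * min((-1) ** (1 - l) * (x[i] + w)
                     for i, terms in weights.items()
                     for l, w in terms.items())

def neuron(x, p_j, dendrites):
    """Total value: p_j times the max over all dendrite values, then hard-limit f."""
    tau = p_j * max(dendrite(x, p_k, w) for p_k, w in dendrites)
    return 1 if tau >= 0 else 0   # activation function f

# One excitatory dendrite recognizing the interval [0, 1] on a single input:
# the l=1 term gives (x + 0) >= 0 for x >= 0; the l=0 term gives -(x - 1) >= 0 for x <= 1.
d = (1, {0: {1: 0.0, 0: -1.0}})
print(neuron([0.5], 1, [d]))   # inside the interval
print(neuron([2.0], 1, [d]))   # outside the interval
```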
      <p>
        As we mentioned earlier, the dendrite configuration is computed using a training
set. For this, merge and elimination methods are used [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ].
      </p>
    </sec>
    <sec id="sec-4">
      <title>PIRA Algorithm.</title>
      <p>In the PIRA-LBNN model for finding upward-closed set elements, a set of binary
patterns is represented by G′. Thus, the binary representation of a formal
context is itself a set of patterns, in which the derivative of each object is an
element x ∈ G′. Then, we can define each element x = (x1, ..., xn) ∈ G′ as a
binary vector. This allows us to define a simple classification rule to find
maximal rectangles and, in an equivalent way, the maximal anti-chains.</p>
      <p>In the PIRA algorithm, we search for all the rectangles with maximal width or
height from each formal concept found. There are two ways to achieve our
goal: ℓ = 1 or ℓ = 0, depending on whether a
supremum or an infimum is being calculated, by using excitatory or inhibitory dendrites. Thus, p_k^j
is also a constant, p_k^j = 1 or p_k^j = −1, and it denotes the excitatory or inhibitory
response of the kth dendrite of Mj to the received input. Another remarkable
point is the fact that we only need to connect zeros as axonal branches. The
simplest way to compute the value of the kth dendrite, derived from the SLLP
equations, is:
τ_k^j(x) = ⋁_{i∈I(k)} x_i</p>
      <p>In this equation, τ_k^j(x) is the value of the computation of the kth dendrite
of the jth output neuron given an input x, and I(k) ⊆ {1, ..., n} is the set of input
neurons with terminal fibers that synapse on the kth dendrite of our output neuron.
Note that all weights w_ijk^ℓ are equal to zero; that is, for our upward-closed
set classifier, we only need to store the zero values of the input patterns in the
training step. Our goal is that our classification neuron assigns x to class C1
when the input x ∈ D+.</p>
      <sec id="sec-4-1">
        <title>Algorithm 1 addDendrite</title>
        <p>INPUT: NeuralNetwork P, Pattern x
OUTPUT: Updated P</p>
        <p>Dendrite k = addNewDendrite(P)
FOR EACH element in x
    IF getValue(element) = 0
        i = getPosition(element)
        addAxonalBranch(k, i)
    END
END
END</p>
        <p>The training step is to find the set D+. This is achieved by
processing elements from higher to lower cardinality.</p>
        <p>Specifically, each dendrite k corresponds to one maximal anti-chain intent
to be tested; I(k) is the set of positions where the value of the maximal
rectangle is zero for the pattern represented by the kth dendrite. We get the state
value of Mj by computing the minimum value of all its dendrites. Again, as in the
SLLP, each τ_k^j(x) is computed and passed to the cell body of Mj. Then we
can get the total value received by our output neuron as follows:
τ^j(x) = ⋀_k τ_k^j(x)
(3)
Note that the activation function is not required, since f [τ^j(x)] = τ^j(x),
where τ^j(x) = 1 if x ≰ y for all y ∈ C1 and τ^j(x) = 0 if x ≤ y for some y ∈ C1.
As we mentioned above, x is a maximal rectangle if and only if x ≰ y and y ≰ x
for all y ∈ C1. Using the previous statement, we can ensure that half of the work
is done, and the second test, y ≰ x, will be performed by processing the data in a
particular order. In our case, we use cardinality order, ensuring that each new
computed row is not a superset of the previously computed rows.</p>
        <p>As we said before, the idea is to use our LBNN structure to classify maximal
rectangles. When we start computing a formal concept, our structure is empty;
this means there are no dendrites or axonal connections. So, the first step is to
add each element with the maximal cardinality as a pattern to learn. Algorithm
1 shows how an element is added to our LBNN for maximal rectangle learning.
First, algorithm 1 receives as parameters the LBNN which is being trained and
a binary vector. As we will see below, this binary vector has been proven to be an
anti-chain element. Algorithm 1 first grows a new kth dendrite on our output
neuron Oj. Every column in x is checked; if that property is not contained by
the object x, then an axonal branch grows from the ith position of the input
neuron set to the new dendrite. This operation is represented by the addAxonalBranch
call. We can assure that we will not misclassify formal concepts in the
processed context. Algorithm 2 shows how to compute the Concept Lattice by
computing Maximal Anti-chain Sets recursively.</p>
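        <p>Algorithm 1 and evaluation (3) can be sketched together in Python as follows (the class and method names are our own rendering of the pseudocode, not the authors' Java implementation):</p>

```python
class DendriticNetwork:
    """Output neuron with one dendrite per stored anti-chain pattern."""
    def __init__(self):
        self.dendrites = []          # each dendrite = set of zero positions I(k)

    def add_dendrite(self, x):
        """Algorithm 1: grow a dendrite and connect axonal branches at the zeros of x."""
        self.dendrites.append({i for i, v in enumerate(x) if v == 0})

    def tau(self, x):
        """Evaluation (3): min over dendrites of max over I(k) of x[i].
        Returns 1 iff x is incomparable with (not below) every stored pattern."""
        return min((max((x[i] for i in k), default=0) for k in self.dendrites),
                   default=1)

net = DendriticNetwork()
net.add_dendrite([1, 1, 0, 0])       # stored maximal patterns (hypothetical)
net.add_dendrite([0, 0, 1, 1])
print(net.tau([1, 0, 1, 0]))         # not below either stored pattern: accepted
print(net.tau([1, 0, 0, 0]))         # below the first stored pattern: rejected
```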
      </sec>
      <sec id="sec-4-2">
        <title>Algorithm 2 Compute Maximal Rectangles</title>
        <p>ELSE
    add a Link from previously created Rectangle in
    Global_Maximal_Rectangles to K-Infimum</p>
      </sec>
      <sec id="sec-4-3">
        <title>Algorithm 3 Main Function</title>
        <p>INPUT: BinaryContext (G, M, I)
OUTPUT: Lattice L</p>
        <p>STEP 1: Get Maximum and Infimum elements.
FormalConcept max = getMax(G, M, I)
FormalConcept min = getMin(G, M, I)
addConcept(L, max), addConcept(L, min)</p>
        <p>STEP 2: Get maximal Rectangles from min
MaxRectangles = Compute Maximal Rectangles with:
    G/min.extent,
    min.intent, I-Proj,
    max,
    min,
    L</p>
        <p>In algorithm 2, the first step creates a new dendritic neural network. It is
initialized with one output neuron, n = |intent| and k = 0, where the number
of input neurons is n and each input neuron represents one attribute element
in M . The second step is used to get the G′ set ordered by attribute
cardinality. Next, in a third step, if the maximal cardinality of the elements in G′ is
equal to the Supremum intent cardinality, it adds a link between Supremum and
Infimum, and stops. Otherwise, it adds all the elements in G′ with the
maximal cardinality to the dendritical neural network structure. Those elements are
maximal rectangles in the given binary context. The fourth step is used to check
the remaining elements. If an element derivative does not already exist as an
intent, and evaluation (3) says that it is a maximal rectangle, then it adds the
object and its derivative as a new maximal rectangle. Otherwise, if the element
derivative already exists, the object is added to the previously created formal concept's
extent. In the last step, it processes each maximal rectangle that has been found. If
that element is already contained in the lattice, it adds a link between the Infimum
and the previously created element in the lattice. Otherwise, it adds that link and
processes that formal concept recursively. Then, it adds this new element to the
lattice structure.</p>
        <p>
          At this point, it computes all the concepts given by the binary context, but
the cardinality of B(K) is bounded exponentially. A simple way to
handle this issue is to use a Binary Tree to store and recover all elements in B(K).
Basically, this binary tree works as a hash function over the binary representation
of the concept intent. Then, it searches, finds and adds any intent in
|M | steps. In addition, the processing order enables us to generate the edges of
the Hasse diagram without additional computing steps. Therefore, the process
of generating the concept lattice and Hasse diagram presented in this paper has
an expected polynomial delay time [
          <xref ref-type="bibr" rid="ref16">16</xref>
          ] for maximal anti-chains search.
        </p>
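        <p>The intent store can be sketched as a binary trie over the intent's bit vector (a stand-in for the paper's binary tree; class and method names are ours), giving insertion and lookup in |M| steps:</p>

```python
class IntentTrie:
    """Binary tree over intent bit-vectors: insert/search in |M| steps."""
    def __init__(self):
        self.root = {}

    def add(self, intent, concept):
        node = self.root
        for bit in intent:                 # one branch per attribute bit
            node = node.setdefault(bit, {})
        node["concept"] = concept

    def find(self, intent):
        node = self.root
        for bit in intent:
            if bit not in node:
                return None
            node = node[bit]
        return node.get("concept")

t = IntentTrie()
t.add((1, 0, 1), ("g1", "g3"))             # hypothetical intent/extent pair
print(t.find((1, 0, 1)))                   # stored intent: found
print(t.find((0, 1, 1)))                   # never stored: None
```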
      </sec>
    </sec>
    <sec id="sec-5">
      <title>Computational Experiments.</title>
      <p>
        As shown in [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ], many parameters are involved in the performance and running
time of an algorithm. For our tests, we considered the number of objects, the
number of attributes, the density of the context, and the worst case for contexts
of size M × M , where density is the minimum percentage of binary relations for each
object in K. Those parameters were tested
independently. Four algorithms were selected to compare against the Dendritical
algorithm's performance. Implementations were written in Java.
      </p>
      <p>Figure 1 shows the execution time behavior for a formal context with 10%
density and 100 attributes. Here, the vertical axis represents the running time of each
algorithm in ms, and the horizontal axis the total number of objects under a constant
number of attributes. The test was performed by increasing the number of objects
from one thousand to twenty thousand, with a random distribution and 10%
density.</p>
      <p>We can verify that the Bordat algorithm and our algorithm have the best
performance when the number of objects increases with respect to the attributes.</p>
      <p>Fig. 1: Execution time when the number of objects grows from 1000 to 20000, with 100 attributes and 10% density.</p>
      <p>Fig. 2: Execution time when the number of attributes grows from 10 to 100, with 1000 objects and 10% density.</p>
      <p>Figure 2 shows the increase in execution time when the number of attributes
increases. All datasets for this test have a 10% density and 1000 objects; the
number of attributes increases from ten to one hundred, and the number of
formal concepts grows too. Here we can notice that the Bordat and Dendritical
algorithms are faster than the Godin, Nourine and Valtchev algorithms. We can
also notice that the Dendritical execution time behaves better as the number
of objects grows.</p>
      <p>Figure 3 shows the increase in execution time when the density percentage
becomes higher. All datasets have 1000 objects and 100 attributes, and when the
density grows the number of formal concepts grows too. As we mentioned, density
is mainly the percentage of attributes held by all the objects in the formal context.</p>
      <p>Here, we notice that when the density grows, Dendritical becomes competitive
with, and even faster than, the other algorithms. Figure 4 shows running time for
zero-diagonal contexts where |G| = |M|, which yield the complete lattice, that is,
2^|M| formal concepts. In this kind of formal context, we can see a clear running
time superiority of the dendritical algorithm. Finally, table 1 shows some examples
of the time spent at each algorithm step.
No. Obj  No. Att  Density  Concepts  Query Time  Ordering  Dendritical  Total
1000     36       17%      8377      722 ms      20 ms     49 ms        791 ms
1000     49       14%      14190     924 ms      67 ms     101 ms       1092 ms
1000     81       11%      26065     1853 ms     124 ms    201 ms       2178 ms
Table 1: Algorithm times when the density becomes higher and the number of attributes
becomes lower. These tests show the increase in execution time when the attributes and
I ⊆ G × M are modified.</p>
    </sec>
    <sec id="sec-6">
      <title>Conclusion</title>
      <p>In this paper, we presented an algorithm based on the main idea of maximal
rectangles, using the cardinality notion and the dendritical classifier. We have
also compared it with some known algorithms for the construction of Concept
Lattices.</p>
      <p>From the tests presented in this paper, we can see that as the number of
objects grows, our algorithm's execution time is higher than that of Bordat or other
methods, but it can be a better choice when the density or the number of
attributes is high. Also, the results show a good performance for the M × M
contexts in the worst-case scenario, which demonstrates the feasibility of our
algorithm on some kinds of datasets.</p>
      <sec id="sec-6-1">
        <title>Bibliography</title>
      </sec>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Belohlavek</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Beydoun</surname>
          </string-name>
          , G.:
          <article-title>Formal Concept Analysis With Background Knowledge: Attribute Priorities</article-title>
          .
          <source>IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews</source>
          5465 (
          <year>2009</year>
          )
          <fpage>109</fpage>
          <lpage>117</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <surname>Hereth</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Stumme</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wille</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wille</surname>
            ,
            <given-names>U.</given-names>
          </string-name>
          :
          <article-title>Conceptual knowledge discovery: a human-centered approach</article-title>
          .
          <source>Journal of Applied Artificial Intelligence</source>
          <volume>17</volume>
          (
          <year>2003</year>
          )
          <fpage>281</fpage>
          <lpage>302</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Wille</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          :
          <source>Restructuring Lattice Theory: An Approach Based on Hierarchies of Concepts</source>
          .
          <source>In: ICFCA</source>
          . (
          <year>2009</year>
          )
          <fpage>314</fpage>
          <lpage>339</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>Valtchev</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Missaoui</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Godin</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Meridji</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>Generating frequent itemsets incrementally: two novel approaches based on Galois lattice theory</article-title>
          .
          <source>J. Exp. Theor. Artif. Intell</source>
          .
          <volume>14</volume>
          (
          <issue>2-3</issue>
          ) (
          <year>2002</year>
          )
          <fpage>115</fpage>
          <lpage>142</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <surname>Maddouri</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>A Formal Concept Analysis Approach to Discover Association Rules from Data</article-title>
          . In Belohlavek, R., Snasel, V., eds.:
          <source>CLA</source>
          (
          <year>2005</year>
          )
          <fpage>10</fpage>
          <lpage>21</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Agrawal</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          :
          <article-title>Fast algorithms for mining association rules in large databases</article-title>
          .
          <source>Proceedings of the 20th International Conference on Very Large Data Bases</source>
          ,
          <volume>1</volume>
          (
          <year>1994</year>
          )
          <fpage>487</fpage>
          <lpage>499</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <surname>Lakhal</surname>
            , L.,
            <given-names>S.G.</given-names>
          </string-name>
          :
          <article-title>Efficient Mining of Association Rules Based on Formal Concept Analysis</article-title>
          .
          <source>Ganter et al. Springer. LNAI 3626</source>
          (
          <year>2005</year>
          )
          <fpage>180</fpage>
          <lpage>195</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <surname>Merwe</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Obiedkov</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kourie</surname>
            ,
            <given-names>D.:</given-names>
          </string-name>
          <article-title>AddIntent: A New Incremental Algorithm for Constructing Concept Lattices</article-title>
          . In Eklund, P., ed.:
          <source>Concept Lattices. Volume 2961 of Lecture Notes in Computer Science</source>
          . Springer Berlin Heidelberg (
          <year>2004</year>
          )
          <fpage>372</fpage>
          <lpage>385</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Lv</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jia</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhou</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          :
          <article-title>A Bottom-Up Incremental Algorithm of Building Concept Lattice</article-title>
          . In Wu, Y., ed.:
          <source>Software Engineering and Knowledge Engineering: Theory and Practice. Volume 115 of Advances in Intelligent and Soft Computing</source>
          . Springer Berlin Heidelberg (
          <year>2012</year>
          )
          <fpage>91</fpage>
          <lpage>98</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Valtchev</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Missaoui</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lebrun</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          :
          <article-title>A partition-based approach towards constructing Galois (concept) lattices</article-title>
          .
          <source>Discrete Math.</source>
          <volume>256</volume>
          (
          <issue>3</issue>
          ) (
          <year>2002</year>
          )
          <fpage>801</fpage>
          <lpage>829</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <surname>Nourine</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Raynaud</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          :
          <article-title>A Fast Algorithm for Building Lattices</article-title>
          .
          <source>Inf. Process. Lett.</source>
          <volume>71</volume>
          (
          <issue>5-6</issue>
          ) (
          <year>1999</year>
          )
          <fpage>199</fpage>
          <lpage>204</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <surname>Nourine</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Raynaud</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          :
          <article-title>A fast incremental algorithm for building lattices</article-title>
          .
          <source>J. Exp. Theor. Artif. Intell</source>
          .
          <volume>14</volume>
          (
          <issue>2-3</issue>
          ) (
          <year>2002</year>
          )
          <fpage>217</fpage>
          <lpage>227</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <surname>Farach-Colton</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Huang</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          :
          <article-title>A Linear Delay Algorithm for Building Concept Lattices</article-title>
          . In Ferragina, P., Landau, G., eds.:
          <source>Combinatorial Pattern Matching. Volume 5029 of Lecture Notes in Computer Science</source>
          . Springer Berlin Heidelberg (
          <year>2008</year>
          )
          <fpage>204</fpage>
          <lpage>216</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <surname>Godin</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Missaoui</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Alaoui</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          :
          <article-title>Incremental concept formation algorithms based on Galois (Concept) lattices</article-title>
          .
          <source>Computational Intelligence</source>
          <volume>11</volume>
          (
          <issue>2</issue>
          ) (
          <year>1995</year>
          )
          <fpage>246</fpage>
          <lpage>267</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <surname>Ganter</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          :
          <article-title>Two basic algorithms in concept analysis</article-title>
          .
          <source>In: Proceedings of the 8th international conference on Formal Concept Analysis. ICFCA-10</source>
          , Berlin, Heidelberg, Springer-Verlag (
          <year>2010</year>
          )
          <fpage>312</fpage>
          <lpage>340</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <surname>Kuznetsov</surname>
            ,
            <given-names>S.O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Obiedkov</surname>
            ,
            <given-names>S.A.</given-names>
          </string-name>
          :
          <article-title>Comparing performance of algorithms for generating concept lattices</article-title>
          .
          <source>Journal of Experimental and Theoretical Artificial Intelligence</source>
          <volume>14</volume>
          (
          <issue>2-3</issue>
          ) (
          <year>2002</year>
          )
          <fpage>189</fpage>
          <lpage>216</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <surname>Kaburlasos</surname>
            ,
            <given-names>V.G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ritter</surname>
            ,
            <given-names>G.X.</given-names>
          </string-name>
          :
          <source>Computational Intelligence Based on Lattice Theory. Volume 67 of Studies in Computational Intelligence</source>
          . Springer (
          <year>2007</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <surname>Davey</surname>
            ,
            <given-names>B.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Priestley</surname>
            ,
            <given-names>H.A.</given-names>
          </string-name>
          :
          <source>Introduction to Lattices and Order. 2nd edn</source>
          . Cambridge University Press (
          <year>2002</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <surname>Ritter</surname>
            ,
            <given-names>G.X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Iancu</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Urcid</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Neurons, Dendrites, and Pattern Classification</article-title>
          . In: CIARP. (
          <year>2003</year>
          )
          <fpage>1</fpage>
          <lpage>16</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <surname>Urcid</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ritter</surname>
            ,
            <given-names>G.X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Iancu</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          :
          <article-title>Single Layer Morphological Perceptron Solution to the N-Bit Parity Problem</article-title>
          . In: CIARP. (
          <year>2004</year>
          )
          <fpage>171</fpage>
          <lpage>178</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <surname>Ritter</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Urcid</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Learning in Lattice Neural Networks that Employ Dendritic Computing</article-title>
          . In Kaburlasos, V., Ritter, G., eds.:
          <source>Computational Intelligence Based on Lattice Theory. Volume 67 of Studies in Computational Intelligence</source>
          . Springer Berlin / Heidelberg (
          <year>2007</year>
          )
          <fpage>25</fpage>
          <lpage>44</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <surname>Barmpoutis</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ritter</surname>
            ,
            <given-names>G.X.</given-names>
          </string-name>
          :
          <article-title>Orthonormal Basis Lattice Neural Networks</article-title>
          . In Kaburlasos, V., Ritter, G.X., eds.:
          <source>Computational Intelligence Based on Lattice Theory</source>
          . Springer-Verlag, Heidelberg, Germany (
          <year>2007</year>
          )
          <fpage>43</fpage>
          <lpage>56</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [23]
          <string-name>
            <surname>Kaburlasos</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          :
          <article-title>Granular Enhancement of Fuzzy ART-SOM Neural Classifiers Based on Lattice Theory</article-title>
          . In Kaburlasos, V., Ritter, G., eds.:
          <source>Computational Intelligence Based on Lattice Theory. Volume 67 of Studies in Computational Intelligence</source>
          . Springer Berlin / Heidelberg (
          <year>2007</year>
          )
          <fpage>3</fpage>
          <lpage>23</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [24]
          <string-name>
            <surname>Aldape-Perez</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Yanez-Marquez</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Camacho-Nieto</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Arguelles-Cruz</surname>
            ,
            <given-names>A.J.</given-names>
          </string-name>
          :
          <article-title>An associative memory approach to medical decision support systems</article-title>
          .
          <source>Comput. Methods Prog. Biomed</source>
          .
          <volume>106</volume>
          (
          <issue>3</issue>
          ) (
          <year>2012</year>
          )
          <fpage>287</fpage>
          <lpage>307</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [25]
          <string-name>
            <surname>Ritter</surname>
            ,
            <given-names>G.X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chyzhyk</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Urcid</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Grana</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>A Novel Lattice Associative Memory Based on Dendritic Computing</article-title>
          . In: HAIS. (
          <year>2012</year>
          )
          <fpage>491</fpage>
          <lpage>502</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [26]
          <string-name>
            <surname>Kaburlasos</surname>
            ,
            <given-names>V.G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Petridis</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          :
          <article-title>Fuzzy lattice neurocomputing (FLN) models</article-title>
          .
          <source>Neural Networks</source>
          <volume>13</volume>
          (
          <issue>10</issue>
          ) (
          <year>2000</year>
          )
          <fpage>1145</fpage>
          <lpage>1170</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          [27]
          <string-name>
            <surname>Witte</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Schulte</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nachtegael</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Malange</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kerre</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          :
          <article-title>A Lattice-Based Approach to Mathematical Morphology for Greyscale and Colour Images</article-title>
          . In Kaburlasos, V., Ritter, G., eds.:
          <source>Computational Intelligence Based on Lattice Theory</source>
          . Volume
          <volume>67</volume>
          . Springer Berlin / Heidelberg (
          <year>2007</year>
          )
          <fpage>129</fpage>
          <lpage>148</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          [28]
          <string-name>
            <surname>Urcid</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nieves-Vazquez</surname>
            ,
            <given-names>J.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Garcia-A.</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Valdiviezo-N.</surname>
            ,
            <given-names>J.C.</given-names>
          </string-name>
          :
          <article-title>Robust image retrieval from noisy inputs using lattice associative memories</article-title>
          .
          <source>In: Image Processing: Algorithms and Systems</source>
          . (
          <year>2009</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          [29]
          <string-name>
            <surname>Atif</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bloch</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Distel</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hudelot</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          :
          <article-title>Mathematical morphology operators over concept lattices</article-title>
          . In Cellier, P., Distel, F., Ganter, B., eds.:
          <source>ICFCA 2013. Volume 7880 of Lecture Notes in Artificial Intelligence</source>
          . Springer-Verlag Berlin Heidelberg (
          <year>2013</year>
          )
          <fpage>28</fpage>
          <lpage>43</lpage>
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>