<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>A Neural-Network Like Logical-Combinatorial Structure of Data and Constructing Concept Lattices</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Xenia Naidenova</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Vladimir Parkhomenko</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sergey Curbatov</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>AO NIEVT</institution>
          ,
          <addr-line>Moscow</addr-line>
          ,
          <country country="RU">Russia</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Military Medical Academy</institution>
          ,
          <addr-line>Saint-Petersburg</addr-line>
          ,
          <country country="RU">Russia</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Peter the Great St. Petersburg Polytechnic University</institution>
          ,
          <addr-line>Saint-Petersburg</addr-line>
          ,
          <country country="RU">Russia</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>An algorithm is proposed to implement the well-known effective inductive method of constructing sets of cardinality (q+1) from their previously constructed subsets of cardinality q. A new neural network-like combinatorial data structure supporting this algorithm is advanced. Some algorithms for constructing concept lattices and inferring good maximally redundant and irredundant classification tests are given, using a generalization process based on Galois connections and direct and backward waves of network activity propagation.</p>
      </abstract>
      <kwd-group>
        <kwd>Galois lattice</kwd>
        <kwd>concepts</kwd>
        <kwd>closed sets</kwd>
        <kwd>neural networks</kwd>
        <kwd>classification</kwd>
        <kwd>diagnostic tests</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        There is great interest in computer science in the relation between
Artificial Neural Networks (ANN) and Formal Concept Analysis (FCA). The
first attempts to relate FCA and Neural Networks (NN) were made in
[
        <xref ref-type="bibr" rid="ref28">28</xref>
        ]. The main goals of using FCA with respect to ANNs are the
following: applying concept lattices to constructing ANN architectures and
making ANNs interpretable. In [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], FCA is applied to the interpretation
of neural codes. In [
        <xref ref-type="bibr" rid="ref22 ref33">33,22</xref>
        ], the authors apply concept lattices to constructing
neural network architectures.
      </p>
      <p>
        In [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ], the authors proposed another possible way of building
interpretable NNs using FCA. Their approach to generating an NN architecture is
based on constructing the covering relation of a lattice with the use of two types of
Galois connections: antitone (standard concept lattices) [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] and monotone [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]
ones.
      </p>
      <p>In our paper, we do not propose any approach to improving NN architectures.
We advance a new configuration of a neuron-like logical-combinatorial network
to implement the well-known effective inductive method of constructing sets of
cardinality (q+1) from their subsets of cardinality q. This method is
applicable to many problems of symbolic machine learning, including concept lattice
construction and, in particular, generating good classification tests (maximal
hypotheses in FCA and their minimal generators).</p>
      <p>
        Mining logical rules (dependencies) from datasets in the form of association
rules, implicative and functional dependencies, and key patterns [
        <xref ref-type="bibr" rid="ref3 ref31">3,31</xref>
        ] attracts
great interest because of its potentially useful applications. It has been proven
that the problems of inferring implicative and functional dependencies (with
respect to a classification problem) are algorithmically equivalent [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ]. These problems are
viewed as problems of supervised symbolic machine learning based on classification
and plausible (commonsense) reasoning.
      </p>
      <p>
        A universal algorithm (as the studies demonstrate) for inferring logical
dependencies is one using the effective inductive method of constructing
sets of cardinality (q+1) from their subsets of cardinality q. A (q+1)-set can
be constructed if and only if all its proper q-subsets exist. For example,
the algorithms Apriori, AprioriTid, and AprioriHybrid have been presented in
[
        <xref ref-type="bibr" rid="ref1 ref30">1,30</xref>
        ] for association rule mining. Data mining using the Apriori Algorithm is
described in [
        <xref ref-type="bibr" rid="ref25">25</xref>
        ]. The same principle underlies the algorithm Titanic for generating
key patterns [
        <xref ref-type="bibr" rid="ref31">31</xref>
        ] and the algorithm TANE for discovering functional
dependencies [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]. A level-wise method of (q+1)-sets’ construction has also been proposed
for inferring good diagnostic tests for a given classification or class of objects
[
        <xref ref-type="bibr" rid="ref17 ref19">17,19</xref>
        ]. These tests serve as a basis for extracting functional dependencies,
implications, and association rules from a given dataset. Discovering frequent closed
itemsets and generators (algorithm FCFG) is considered in [
        <xref ref-type="bibr" rid="ref27">27</xref>
        ]. This algorithm
extracts frequent itemsets by a depth-first search, and then extracts frequent
closed itemsets and frequent generator itemsets by a level-wise approach. The
Titanic algorithm is used for computing the iceberg concept lattice [
        <xref ref-type="bibr" rid="ref31">31</xref>
        ].
Also the algorithms of fast iceberg lattice construction are described in [
        <xref ref-type="bibr" rid="ref29 ref32">29,32</xref>
        ].
      </p>
      <p>
        A level-wise procedure is also applied in text mining, for example, for
extracting association dependencies between words and extracting the topic of a
text and the contexts of a topic [
        <xref ref-type="bibr" rid="ref10 ref16">10,16</xref>
        ].
      </p>
      <p>In all enumerated problems, the same algorithm deals with different sets of
elements (items, itemsets, attributes, object descriptions, indices of itemsets)
and checks the different properties of generated subsets. These properties can
be, for example: “to be a frequent (large) itemset”, “to be a key pattern”, “to
be a test for a given class of examples”, “to be a good test for a given class of
examples”, and some others.</p>
      <p>If a constructed subset does not possess a required property, then it is deleted
from consideration. This deletion reduces drastically the number of subsets to be
built at all greater levels. Generally, this algorithm solves the task of inferring all
maximal subsets of a set S (i.e., such subsets that cannot be extended) possessing
a given PROPERTY. The set S can be interpreted depending on the context of
a considered problem. This algorithm implements a level-wise inductive method
of (q +1)-sets’ construction.</p>
      <p>
        A neural network-like combinatorial data structure for constructing
(q+1)-sets from their q-subsets has been proposed in a number of publications;
for details, see [
        <xref ref-type="bibr" rid="ref21">21</xref>
        ]. In this paper, two algorithms based on this structure are
proposed for generating Good Maximally Redundant Tests (GMRT) and Good
Irredundant Tests (GIRT) or generators.
      </p>
      <p>Sec. 2 presents the basic definitions of formal concept analysis (FCA) and
good test analysis (GTA). Sec. 3 is devoted to the idea of a level-wise algorithm
of inferring (q +1)-sets of elements from their previously constructed q-subsets of
elements. Some special combinatorial networks for this algorithm are discussed
in Sec. 4. Some related papers are discussed in Sec. 5.
</p>
    </sec>
    <sec id="sec-2">
      <title>Basic Definitions</title>
      <p>
        Let us recall the main definitions of FCA [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ] and GTA [
        <xref ref-type="bibr" rid="ref20">20</xref>
        ]. Denote by M the
set of attribute values such that M = ∪{dom(attr), attr ∈ U}, where dom(attr)
is the set of all values of attr, and U is the set of all considered attributes. Let
G = G+ ∪ G− be the set of objects, where G+ and G− are the sets of positive
and negative objects, respectively. Denote the description of g ∈ G by δ(g) and
the descriptions of positive and negative objects by D+ = {δ(g) | g ∈ G+} and
D− = {δ(g) | g ∈ G−}, respectively.
      </p>
      <p>
        We use the following Galois mappings 2^G → 2^M and 2^M → 2^G, respectively
[
        <xref ref-type="bibr" rid="ref18">18</xref>
        ]: obj(B) = {g ∈ G | B ⊆ δ(g)}, returning all the objects whose descriptions
include the set B of values, and val(A) = {m ∈ M | m ∈ ∩δ(g), g ∈ A},
returning the intersection of all object descriptions δ(g), g ∈ A.
      </p>
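      <p>As an illustration, the two mappings can be sketched in Python over a small
hypothetical context (the object descriptions below are invented and are not the
paper's data):</p>

```python
# Sketch of the two Galois mappings over a small hypothetical context.
# delta maps each object to its set of attribute values (invented data).
delta = {
    "g1": {"m1", "m2", "m3"},
    "g2": {"m2", "m3", "m4"},
    "g3": {"m1", "m3", "m4"},
}

def obj(B):
    """obj(B) = {g in G | B is a subset of delta(g)}: all objects whose
    description includes the set B of values."""
    return {g for g, d in delta.items() if B <= d}

def val(A):
    """val(A): the intersection of the descriptions delta(g), g in A."""
    sets = [delta[g] for g in A]
    return set.intersection(*sets) if sets else set().union(*delta.values())

print(obj({"m2", "m3"}))   # objects containing both m2 and m3 -> {'g1', 'g2'}
print(val({"g1", "g3"}))   # values common to g1 and g3 -> {'m1', 'm3'}
```

      <p>The composition val(obj(B)) then gives the closure of B used below; for
the toy data, val(obj({m2, m3})) = {m2, m3}, so {m2, m3} is closed.</p>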
      <p>
        Two closure operators [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ] are defined as follows: generalization_of(B) =
val(obj(B)) and generalization_of(A) = obj(val(A)). A set A is closed if A =
obj(val(A)); a set B is closed if B = val(obj(B)). Similar mappings, denoted by
the single symbol (·)′, are given in [
        <xref ref-type="bibr" rid="ref35">35</xref>
        ], see also the notation used in [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] and other notations,
for example, in [
        <xref ref-type="bibr" rid="ref14 ref23 ref6">6,14,23</xref>
        ].
      </p>
      <p>
        The pair (A, B), where A and B are closed, is a formal concept in terms of
Formal Concept Analysis; A is called the concept extent and B the concept
intent. All formal concepts form a Galois lattice (concept lattice) [
        <xref ref-type="bibr" rid="ref24">24</xref>
        ]. A triple
(G, M, I), where I is a binary relation between G and M determined by the functions
obj(B) and val(A), is a formal context K.
      </p>
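      <p>For illustration, all formal concepts of a small hypothetical context can be
enumerated naively by closing every subset of attributes (the context data and
function names here are ours, not the paper's):</p>

```python
from itertools import chain, combinations

# Hypothetical toy context; delta maps objects to their value sets.
delta = {
    "g1": {"m1", "m2"},
    "g2": {"m1", "m3"},
    "g3": {"m1", "m2", "m3"},
}
M = set().union(*delta.values())

def obj(B):
    return frozenset(g for g, d in delta.items() if B <= d)

def val(A):
    sets = [delta[g] for g in A]
    return frozenset(set.intersection(*sets)) if sets else frozenset(M)

def concepts():
    """Enumerate all formal concepts (A, B) with A = obj(B) and B = val(A)
    by closing every subset of attributes (naive, exponential sketch)."""
    found = set()
    for B in chain.from_iterable(combinations(sorted(M), r) for r in range(len(M) + 1)):
        A = obj(set(B))
        found.add((A, val(A)))
    return found

for A, B in sorted(concepts(), key=lambda c: (len(c[0]), sorted(c[1]))):
    print(sorted(A), sorted(B))
```

      <p>For this toy context the enumeration yields four concepts, which ordered by
extent inclusion form the concept lattice.</p>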
      <p>
        According to the goal attribute, we get some possible forms of the formal
context: K_ε := (G_ε, M, I_ε) and I_ε := I ∩ (G_ε × M), where ε ∈ {+, −} (if
necessary, a value for undefined objects can be added) [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. These
contexts form a classification context K .
      </p>
      <p>Definition 1. A Diagnostic Test (DT) for G+ is a pair (A, B) such that B ⊆ M,
A = obj(B) ≠ ∅, A ⊆ G+, and obj(B) ∩ G− = ∅.</p>
      <p>
        In this connection, it is worth noting that DT in Definition 1 is a special kind
of semiconcept in the framework of FCA [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ].
      </p>
      <p>Definition 2. A Diagnostic Test (A, B) for G+ is maximally redundant if
obj(B ∪ m) ⊂ A for all m ∈ M \ B.</p>
      <p>
        It is worth noting that Definition 2 is equivalent to the definition of positive
hypothesis given in [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ].
      </p>
      <p>Definition 3. A Diagnostic Test (A, B) for G+ is good iff any extension A* =
A ∪ i, i ∈ G+ \ A, implies that (A*, val(A*)) is not a test for G+.</p>
      <p>
        It is worth noting that good maximally redundant test (GMRT) is equivalent
to the definition of minimal positive hypothesis given in [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ] and GMRTs are
formal concepts.
      </p>
      <p>Definition 4. A Diagnostic Test (A, B) for G+ is irredundant if for all m ∈ B,
(obj(B \ m), B \ m) is not a test for positive objects (no proper subset of B is
the intent of a test for G+).</p>
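      <p>Definitions 1 and 4 can be checked mechanically. Below is a minimal
executable sketch over a small hypothetical positive/negative context (the data
are ours, not the paper's Tables 1 and 2):</p>

```python
# Executable sketch of Definition 1 (diagnostic test) and Definition 4
# (irredundant test) over a hypothetical context; the data are invented.
D_pos = {"g1": {"m1", "m2", "m3"}, "g2": {"m1", "m2", "m4"}}   # positive objects
D_neg = {"g3": {"m2", "m4"}, "g4": {"m3", "m4"}}               # negative objects
delta = {**D_pos, **D_neg}

def obj(B):
    return {g for g, d in delta.items() if B <= d}

def is_test(B):
    """Definition 1: obj(B) is nonempty and contains no negative objects."""
    A = obj(B)
    return bool(A) and A <= D_pos.keys()

def is_irredundant_test(B):
    """Definition 4: B is a test, and removing any single value breaks it."""
    return is_test(B) and all(not is_test(B - {m}) for m in B)

print(is_test({"m1", "m2"}))           # True: obj = {g1, g2}, no negatives
print(is_test({"m2"}))                 # False: negative g3 also contains m2
print(is_irredundant_test({"m1"}))     # True: {m1} is a test, the empty set is not
```

      <p>Note that {m1, m2} is a test but not irredundant here, since its proper
subset {m1} is already a test.</p>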
      <p>
        Note that a good irredundant test (GIRT) is not generally a formal concept.
If a GIRT is closed, then it is simultaneously a GMRT, and it is unique in its
equivalence class, corresponding to a positive hypothesis in [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. Naturally, a GIRT is a minimal
generator of a minimal positive hypothesis.
      </p>
      <p>We use a method of GIRT construction based on the following consideration:
a GIRT is contained in one and only one GMRT equivalent to it with respect to its
extent (the set of objects covered by it). So we shall search for GIRTs contained
in a given GMRT.</p>
    </sec>
    <sec id="sec-3">
      <title>Idea of Level-Wise Algorithm</title>
      <p>Let S be a set of some entities. By s_q = {i1, i2, ..., iq} we denote a subset of
S containing q elements of S. Let S(prop-q) be the set of subsets s = {i1, i2,
..., iq}, q = 1, 2, ..., nt−1, satisfying the PROPERTY, where nt denotes the
cardinality of S. We use an inductive rule for constructing {i1, i2, ..., iq+1}
from {i1, i2, ..., iq}, q = 1, 2, ..., nt−1. This rule relies on the following
consideration: if the set {i1, i2, ..., iq+1} possesses the PROPERTY, then all
its proper subsets must possess this PROPERTY too. Thus the set {i1, i2, ...,
iq+1} can be included in S(prop-(q+1)) if and only if S(prop-q) contains all
its proper q-subsets.</p>
      <p>Having constructed the set s_{q+1} = {i1, i2, ..., iq+1}, we have to determine
whether it possesses the PROPERTY or not. If not, s_{q+1} is deleted; otherwise
s_{q+1} is inserted into S(prop-(q+1)). The algorithm terminates when it is
impossible to construct any element for S(prop-(q+1)).</p>
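      <p>The level-wise idea can be sketched in Python as follows. The predicate
has_property stands for the abstract PROPERTY and must be anti-monotone
(every subset of a good set is good); the example property and data below are
invented for illustration:</p>

```python
from itertools import combinations

def level_wise(S, has_property):
    """Generic level-wise algorithm: a (q+1)-subset is built only if ALL of
    its proper q-subsets survived level q, and then the PROPERTY is checked."""
    levels = []
    current = [frozenset([i]) for i in sorted(S) if has_property({i})]
    while current:
        levels.append(current)
        prev = set(current)
        candidates = set()
        for a, b in combinations(current, 2):
            c = a | b
            # a and b are q-subsets; their union is a candidate (q+1)-subset
            if len(c) == len(a) + 1 and all(c - {x} in prev for x in c):
                candidates.add(c)
        current = [c for c in candidates if has_property(c)]
    return levels

# Hypothetical anti-monotone PROPERTY: "contained in at least two rows".
rows = [{1, 2, 3}, {1, 2, 4}, {2, 3, 4}]
result = level_wise({1, 2, 3, 4}, lambda s: sum(s <= r for r in rows) >= 2)
for q, level in enumerate(result, start=1):
    print(q, sorted(sorted(s) for s in level))
```

      <p>A candidate that fails the PROPERTY never reaches prev, so none of its
supersets is ever generated, which is exactly the pruning described above.</p>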
      <p>The Background Algorithm has an essential disadvantage: the necessity
to generate all subsets {i1, i2, ..., iq+1} from {i1, i2, ..., iq}, q = 1,
2, ..., nt−1. In the next section, we consider a neuron-like combinatorial
structure that makes this construction more efficient.</p>
    </sec>
    <sec id="sec-4">
      <title>Special Logical-Combinatorial Network for Background Algorithm</title>
      <p>The idea of the following algorithm is based on the functioning of a
combinatorial network structure, whose elements correspond to subsets of a finite set S
generated in the algorithm. These elements are located in the network along the
layers, so that each q-layer consists of the elements corresponding to subsets the
cardinality of which is equal to q. All the elements of q-layer have the same
number q of inputs or connections with the elements of previous (q-1)-layer. Each
element “is excited” if and only if all the elements of previous layer connected
with it are active. The weight of connection going from the excited element is
taken as equal to 1; the weight of connection going from the unexcited element
is taken as equal to 0. An element of q-layer is activated if and only if the sum of
weights of its inputs is equal to q. The possible number N_q of elements (nodes)
at each layer is known in advance as C(|S|, q), the number of q-combinations of S. In
the process of the functioning of the network, the number of its nodes can only
diminish.</p>
      <p>An advantage of this network is that its functioning does not require
complex techniques for changing the weights of the connections, and it is not
necessary to organize a separate process of constructing q-sets from their
(q−1)-subsets. The nodes of the network can be interpreted depending on the
problem to be solved. The assigned properties can be checked via different
attached procedures.</p>
      <p>If an activated node does not possess the assigned property, then it is excluded
from consideration by setting to 0 all connections going from it to the nodes of
the layer above. The work of this combinatorial network consists of the following
steps:
1. The weights of the connections going from the nodes of the first layer are set
to 1. For each layer beginning with the second one:
2. Excitation of nodes, if they were not yet active and all their incoming
connections have weight 1; checking the assigned property for the activated
nodes of this layer;
3. If the assigned property of a node is not satisfied, then all the outgoing
connections of this node are set to 0. If the assigned property of a node is
satisfied, then its outgoing connections are set to 1;
4. Propagation of “excitation” to the nodes of the next higher layer (with
respect to the current one) and passage to analyzing that layer;
5. “Readout” of the nodes not connected with higher-lying nodes. Such nodes
correspond to intents of GIRTs (they cannot be extended).</p>
      <p>The process of excitation stops if it is impossible to generate the next layer
of the network.</p>
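      <p>The layer-by-layer excitation can be simulated directly. In the sketch below,
which is ours, nodes are subsets of S, a node of the (q+1)-layer fires only when
all q+1 of its inputs come from excited nodes and an attached property holds, and
the readout collects excited nodes with no excited node above them (the data and
property are invented):</p>

```python
from itertools import combinations

def run_network(S, has_property):
    """Simulate the combinatorial network: layer q holds the excited q-subsets;
    a node fires iff the weights of all its inputs are 1 (all its (q-1)-subsets
    fired) and the attached property holds. Read out maximal excited nodes."""
    excited_layers = []
    layer = {frozenset([i]) for i in S if has_property({i})}
    q = 1
    while layer:
        excited_layers.append(layer)
        nxt = set()
        for c in combinations(sorted(S), q + 1):
            c = frozenset(c)
            if all(c - {x} in layer for x in c) and has_property(c):
                nxt.add(c)
        layer = nxt
        q += 1
    # readout: excited nodes not contained in any excited node of the layer above
    readout = set()
    for i, lev in enumerate(excited_layers):
        above = excited_layers[i + 1] if i + 1 < len(excited_layers) else set()
        readout |= {n for n in lev if not any(n < m for m in above)}
    return readout

rows = [{1, 2, 3}, {1, 2, 4}, {2, 3, 4}]                 # invented data
maximal = run_network({1, 2, 3, 4}, lambda s: sum(s <= r for r in rows) >= 2)
print(sorted(sorted(n) for n in maximal))
```

      <p>The process stops exactly as in the text: when no node of the next layer can
be excited, the current maximal excited nodes are read out.</p>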
      <p>The work of the network can be performed in several ways: 1) step by step
from lower layers to upper layers; 2) with back propagation: from top to bottom
and from bottom to top simultaneously; 3) in parallel, with decomposition of
the network into fragments connected via some nodes.</p>
      <p>Examples of the Network Functioning</p>
      <p>Example 1. Generating GMRTs with and without back propagation.</p>
      <p>For inferring all GMRTs for G+, let S = G+. We use the level-wise algorithm,
testing the property “to be a test for G+”, in a vertical mode of generating extents
of tests, i.e. generating nodes as subsets of objects. An attached testing procedure
verifies whether the following property holds: PROPERTY(s) = if obj(val(s)) ⊆
G+ then true else false, where s ⊆ G+.</p>
      <p>In Tables 1 and 2, the sets of positive and negative object descriptions are
given for our examples.</p>
      <p>Let X = {m4, m12, m14, m15, m24, m26} be the intent of a GMRT. In Table 3,
we give the initial set of negative examples for extracting GIRTs from X.</p>
      <p>We use the level-wise algorithm, testing the property “not to be a test for
G+”, in a horizontal mode of generating intents of GIRTs, i.e. generating nodes as
subsets of attribute values. An attached procedure verifies whether this
property holds.</p>
      <p>Fig. 1. Fragment of GMRTs Generation</p>
      <p>Definition 5. Let t be a set of values such that (obj(t), t) is a test for G+. The
value m ∈ M, m ∈ t, is essential in t if (obj(t \ m), t \ m) is not a test for G+.</p>
      <p>Generally, we are interested in finding a maximal subset sbmax(t) ⊂ t such
that (obj(t), t) is a test but (obj(sbmax(t)), sbmax(t)) is not a test for G+. Then
sbmin(t) = t \ sbmax(t) is one of the minimal subsets of essential values in t.</p>
      <p>In our example, the GMRT contains only one essential value, m26, because
deleting m26 implies that the remaining part X \ m26 = {m4, m12, m14, m15, m24}
is equal to the description of negative object 46 (see Table 3). This means that the
value m26 will be included in any GIRT. Thus, we need a configuration of the
network containing only the nodes in which m26 appears.</p>
      <p>A quasi minimal subset of essential values in t can be found by the use of
the following procedure.</p>
      <p>We begin with the first value m1 of t, then we take the next value m2 of t and
evaluate the function to_be_test((obj(m1, m2), (m1, m2))). If the value of the
function is false, then we take the next value m3 of t and evaluate the function
to_be_test ((obj(m1, m2, m3), (m1, m2, m3))). If the value of the function
to_be_test((obj(m1, m2), (m1, m2))) is true, then value m2 is skipped and the
function to_be_test((obj(m1, m3), (m1, m3))) is evaluated. We continue this
process until we reach the last value of t.</p>
      <p>The function to_be_test(t′), where t′ ⊆ t, is defined in this procedure as:
if t′ ⊄ δ(g) for all g ∈ G−, then true, else false.</p>
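      <p>The scanning procedure above can be sketched as follows. The description of
negative object 46 is taken from the text; the second negative description and the
fixed scanning order are our assumptions, added to make the sketch self-contained:</p>

```python
# Greedy scan for a quasi-maximal non-test subset sbmax of X; X \ sbmax is
# then a quasi-minimal set of essential values. Negative object 46 is from
# the text; the second negative description is invented.
G_neg = [
    {"m4", "m12", "m14", "m15", "m24"},   # negative object 46
    {"m12", "m15", "m26"},                # hypothetical negative object
]

def to_be_test(t):
    """True iff t is contained in no negative object description."""
    return all(not t <= g for g in G_neg)

def quasi_max_non_test(t):
    """Scan the values of t in a fixed order, keeping a value only while the
    accumulated subset remains a non-test (as in the procedure above)."""
    sb = set()
    for m in sorted(t):
        if not to_be_test(sb | {m}):
            sb.add(m)
    return sb

X = {"m4", "m12", "m14", "m15", "m24", "m26"}
sbmax = quasi_max_non_test(X)
print(sorted(X - sbmax))   # the quasi-minimal set of essential values
```

      <p>For these data the scan recovers exactly the situation described in the text:
sbmax = X \ {m26}, so m26 is the single essential value.</p>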
      <p>As a result, we have one of the quasi-maximal subsets, say sbmax, sbmax ⊂ X,
such that (obj(sbmax), sbmax) is not a test for G+. Then Lev = X \ sbmax is a
quasi-minimal subset of essential values in X. In our illustrative example with
only one essential value m26 in X, we have the following configuration of the
network (Fig. 3), where dashed arrows have weight 1, all other arrows have weight
0, and double circles represent intents of GIRTs.</p>
      <p>Table 4 depicts the number of nodes in the complete network (C(6,5) + C(6,4)
+ C(6,3) + C(6,2) + C(6,1)), where C(n, i) denotes the number of combinations
of n elements taken i at a time.</p>
      <p>As a result, we obtain 6 GIRTs: {m2,m14,m26}, {m4,m15,m26}, {m4,m24,m26},
{m12,m14,m26}, {m12,m15,m26}, {m12,m24,m26}.</p>
      <p>Construction of initial configuration of networks is beyond the scope of this
paper.</p>
      <p>Example 3. The next example: X = {m19, m20, m21, m22, m26}.</p>
      <p>In this case, the only GIRT is ((3,8), {m19, m20, m21, m22, m26}). We can
find essential values in X by means of the procedure described above. Assume
that we have found that m26, m22, m21 are essential in X. Since essential values
must enter simultaneously into an intent of a GIRT, we can obtain only one node
in the network containing m26, m22, and m21, and it is (m19, m21, m22, m26),
which is the GIRT in X.</p>
      <p>Table 4. Number of nodes in the complete network vs. the real network:
C(6,5) = 6, C(6,4) = 15, C(6,3) = 20, C(6,2) = 15, C(6,1) = 6.</p>
      <p>Apparently, the size of the network may be a problem if the data are large.
But the decomposition of the main task into subtasks drastically reduces the
memory requirements of the algorithm. A subtask is determined by a sub-network
generated by a node of the network.</p>
      <p>
        Generally, the main advantages of the combinatorial network are the following:
1) the size of the network is computed in advance; 2) it is possible to decompose
the network into autonomous fragments; 3) different fragments of the network can
be joined via common nodes; 4) the states of nodes can be established by the use
of attached procedures; 5) the network can be used for problems of pattern
recognition based on logical rules [
        <xref ref-type="bibr" rid="ref20">20</xref>
        ].
      </p>
    </sec>
    <sec id="sec-6">
      <title>Special Logical-Combinatorial Network as a Cognitive Structure</title>
      <p>
        Let us note that the model of a perceptron-type neuron with summation of
the weights of the connections, widely utilized in the technical sciences, is
criticized by specialists who study the work of the brain. This model cannot
explain the work of the organism’s functional systems [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
      </p>
      <p>
        A neuron in [
        <xref ref-type="bibr" rid="ref34">34</xref>
        ] is defined as a conversion of the values of predicates P1, P2, ..., Pk,
which designate input excitations arriving via axons at the entrance of the neuron,
into the value of a predicate P0 (the output of the neuron). It is known that each
neuron has a receptive field, whose stimulation excites it unconditionally (without
learning).
      </p>
      <p>In the process of learning, neurons obtain reinforcing or inhibitory
(prohibitive) signals depending on the varying conditions and purposes of the
organism.</p>
      <p>A neuron participates in the work of different functional systems and, thus, a
separate neuron does not have fixed specific semantics. It is assumed that among
all input excitations of each neuron there are motivational and emotional
excitations, and at each moment of time, depending on the purpose and state
of the organism, the neuron works within the framework of only one functional
system.</p>
      <p>It is also assumed that there is a mechanism that includes in the operation of
the neuron the frequency of its excitation, i.e., the ratio of the number of
simultaneous excitations of all its inputs to the number of reinforcements from
conditional signals.</p>
      <p>
        It is important for us that the model of neuron proposed by us largely
coincides with the new formal model of neuron described in [
        <xref ref-type="bibr" rid="ref26 ref34">34,26</xref>
        ]:
– the neuron network has a lattice structure;
– a neuron located in a lattice node obtains excitations from the previous
layers and transfers excitations to the subsequent layers;
– the reinforcement of a neuron is realized within the framework of a certain
functional system and only if some goal is achieved (in our model, if a certain
purposeful property is fulfilled). A neuron cannot transfer excitation with
a forbidden signal, i.e., if a certain necessary condition is not satisfied.
      </p>
      <p>We do not deny the existence of perceptron-type neural networks. However,
a neural logical-combinatorial network can interact with them. While the
perceptron, as a result of learning, recognizes the objects of some class and gives
an answer of the type “yes-no”, the neural logical-combinatorial network gives
the description of this class of objects in terms of the objects’ properties.</p>
    </sec>
    <sec id="sec-8">
      <title>Conclusion</title>
      <p>In this paper, we proposed a neural network-like combinatorial structure of
data and knowledge whose advantage is that its functioning does not require
complex techniques for changing the weights of connections. The nodes of the
network can be interpreted depending on the problem to be solved. The assigned
properties of nodes can be checked via different attached procedures.</p>
      <p>
        Furthermore, the advantages of the combinatorial network are the following:
– the size of the network is computed in advance;
– it is possible to decompose the network into autonomous fragments which can
operate in parallel;
– different fragments of the network can be joined via common nodes;
– the network can be used not only for inferring logical rules from datasets
but also for problems of pattern recognition based on these induced rules
[
        <xref ref-type="bibr" rid="ref20">20</xref>
        ];
– a neuron is activated if and only if its weight reaches a certain value equal
to the number of its inputs (the sum of the weights of its inputs), and
excitation in the network is transmitted in the same way as in an artificial
neural network.
      </p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Agrawal</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mannila</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Srikant</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Toivonen</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Verkamo</surname>
            ,
            <given-names>A.I.</given-names>
          </string-name>
          :
          <article-title>Fast discovery of association rules</article-title>
          .
          <source>Advances in knowledge discovery and data mining</source>
          12(1),
          <fpage>307</fpage>
          -
          <lpage>328</lpage>
          (
          <year>1996</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Anokhin</surname>
            ,
            <given-names>P.K.</given-names>
          </string-name>
          :
          <article-title>Systemic analysis of neural integrative activity</article-title>
          , pp.
          <fpage>347</fpage>
          -
          <lpage>440</lpage>
          .
          <source>Essays on physiological functional systems</source>
          ,
          <source>Medicine</source>
          (
          <year>1975</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Bastide</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Taouil</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pasquier</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Stumme</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lakhal</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          :
          <article-title>Mining frequent patterns with counting inference</article-title>
          .
          <source>SIGKDD Explor. Newsl</source>
          .
          <volume>2</volume>
          (
          <issue>2</issue>
          ),
          <fpage>66</fpage>
          -
          <lpage>75</lpage>
          (
          <year>2000</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Düntsch</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gediga</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Approximation Operators in Qualitative Data Analysis</article-title>
          , pp.
          <fpage>214</fpage>
          -
          <lpage>230</lpage>
          . Springer (
          <year>2003</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Endres</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Foldiak</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          :
          <article-title>Interpreting the neural code with formal concept analysis</article-title>
          .
          <source>In: Advances in Neural Information Processing Systems</source>
          . vol.
          <volume>21</volume>
          , pp.
          <fpage>425</fpage>
          -
          <lpage>432</lpage>
          . MIT Press, Cambridge (
          <year>2009</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Ganascia</surname>
            ,
            <given-names>J.G.</given-names>
          </string-name>
          :
          <article-title>TDIS: an algebraic formalization</article-title>
          .
In:
          <source>Proceedings of the 13th International Joint Conference on Artificial Intelligence</source>
          . vol.
          <volume>2</volume>
          , pp.
          <fpage>1008</fpage>
          -
          <lpage>1015</lpage>
          (
          <year>1993</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Ganter</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
<string-name>
            <surname>Wille</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          :
          <source>Formal concept analysis: mathematical foundations</source>
          . Springer, Berlin (
          <year>1999</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Ganter</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kuznetsov</surname>
            ,
            <given-names>S.O.</given-names>
          </string-name>
          :
          <article-title>Pattern structures and their projections</article-title>
. In:
          <string-name>
            <surname>Delugach</surname>
            ,
            <given-names>H.S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Stumme</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          (eds.)
          <source>Conceptual Structures: Broadening the Base: Proceedings of the 9th International Conference on Conceptual Structures</source>
          . pp.
          <fpage>129</fpage>
          -
          <lpage>142</lpage>
          (
          <year>2001</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Huhtala</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kärkkäinen</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Porkka</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
<string-name>
            <surname>Toivonen</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          :
          <article-title>Tane: An efficient algorithm for discovering functional and approximate dependencies</article-title>
          .
          <source>Comput. J</source>
          .
          <volume>42</volume>
          (
          <issue>2</issue>
          ),
          <fpage>100</fpage>
          -
          <lpage>111</lpage>
          (
          <year>1999</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
<surname>Kulkarni</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
<surname>Kulkarni</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Knowledge discovery in text mining using association rules extraction</article-title>
          .
          <source>International Journal of Computer Applications</source>
          <volume>134</volume>
          (
          <issue>12</issue>
          ),
          <fpage>30</fpage>
          -
          <lpage>36</lpage>
          (
          <year>2016</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Kuznetsov</surname>
            ,
            <given-names>S.O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Makhazhanov</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ushakov</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
<article-title>On Neural Network Architecture Based on Concept Lattices</article-title>
          , pp.
          <fpage>653</fpage>
          -
          <lpage>663</lpage>
          . Springer, Cham (
          <year>2017</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <surname>Kuznetsov</surname>
            ,
<given-names>S.</given-names>
          </string-name>
          :
          <article-title>JSM-method as a Machine Learning System</article-title>
          .
          <source>Itogi Nauki i Tekhniki, ser. Informatika</source>
          <volume>15</volume>
          ,
          <fpage>17</fpage>
          -
          <lpage>50</lpage>
          (
          <year>1991</year>
          ), (in Russian)
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <surname>Kuznetsov</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Mathematical aspects of concept analysis</article-title>
          .
          <source>Journal of Mathematical Sciences</source>
          <volume>80</volume>
          (
          <issue>2</issue>
          ),
          <fpage>1654</fpage>
          -
          <lpage>1698</lpage>
          (
          <year>1996</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <surname>Liquiere</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sallantin</surname>
            ,
<given-names>J.</given-names>
          </string-name>
          :
          <article-title>Structural machine learning with Galois lattice and graphs</article-title>
          .
In:
          <source>Proceedings of the 5th International Conference on Machine Learning</source>
          . pp.
          <fpage>305</fpage>
          -
          <lpage>313</lpage>
          (
          <year>1998</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <surname>Luksch</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wille</surname>
          </string-name>
          , R.:
          <article-title>A Mathematical Model for Conceptual Knowledge Systems</article-title>
. In:
          <string-name>
            <surname>Bock</surname>
            ,
            <given-names>H.H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ihm</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          (eds.)
<source>Proceedings of the 14th Annual Conference of the Gesellschaft für Klassifikation (GfKl 1990)</source>
          . pp.
          <fpage>156</fpage>
          -
          <lpage>162</lpage>
          (
          <year>1991</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          16.
          <string-name>
            <surname>Mahmood</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shahbaz</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Guergachi</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>Negative and positive association rules mining from text using frequent and infrequent itemsets</article-title>
          .
          <source>The Scientific World Journal</source>
<volume>2014</volume>
          (
          <year>2014</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          17.
          <string-name>
            <surname>Megretskaya</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          :
<article-title>Construction of natural classification tests for knowledge base generation</article-title>
          . In:
          <source>The Problem of the Expert System Application in the National Economy: Reports of the Republican Workshop</source>
          , Kishinev, pp.
          <fpage>89</fpage>
          -
          <lpage>93</lpage>
          (
          <year>1988</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          18.
          <string-name>
            <surname>Naidenova</surname>
            ,
            <given-names>X.A.</given-names>
          </string-name>
          :
          <article-title>Searching Good Diagnostic Tests as a Model of Reasoning</article-title>
. In:
          <source>Proceedings of the Scientific-Practical Conference "Knowledge-Dialog-Solution" (KDS-2001)</source>
          , pp.
          <fpage>501</fpage>
          -
          <lpage>506</lpage>
          . Saint-Petersburg (
          <year>2001</year>
          ), (in Russian)
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          19.
          <string-name>
            <surname>Naidenova</surname>
            ,
            <given-names>X.</given-names>
</string-name>
          :
          <article-title>Machine Learning as a diagnostic task</article-title>
          . In:
          <string-name>
            <surname>Arefiev</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          (ed.)
          <source>Knowledge-Dialog-Solution, Materials of the Short-Term Scientific Seminar</source>
          , pp.
          <fpage>26</fpage>
          -
          <lpage>36</lpage>
. State North-West Technical University Press, St. Petersburg, Russia
          (
          <year>1992</year>
          ), (in Russian)
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          20.
          <string-name>
            <surname>Naidenova</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          :
          <article-title>Good classification tests as formal concepts</article-title>
          .
In:
          <string-name>
            <surname>Domenach</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ignatov</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Poelmans</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          (eds.)
          <source>Proceedings of the 10th International Conference on Formal Concept Analysis</source>
          . vol.
          <volume>7278</volume>
          , pp.
          <fpage>211</fpage>
          -
          <lpage>226</lpage>
          (
          <year>2012</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          21.
          <string-name>
            <surname>Naidenova</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Parkhomenko</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          :
          <article-title>An approach to incremental learning good classification tests</article-title>
. In:
          <string-name>
            <surname>Cellier</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Distel</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ganter</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          (eds.) Contributions to the
          <source>11th International Conference on Formal Concept Analysis</source>
          , pp.
          <fpage>51</fpage>
          -
          <lpage>64</lpage>
          . Technische Universitat Dresden (
          <year>2013</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          22.
          <string-name>
            <surname>Nguifo</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tsopze</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
<string-name>
            <surname>Tindo</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>M-CLANN: Multiclass concept lattice-based artificial neural network</article-title>
          .
<source>Constructive Neural Networks</source>
          , pp.
          <fpage>103</fpage>
          -
          <lpage>121</lpage>
          (
          <year>2009</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          23.
          <string-name>
            <surname>Nguifo</surname>
            ,
            <given-names>E.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Njiwoua</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          :
          <article-title>Iglue: A lattice-based constructive induction system</article-title>
          .
<source>Intelligent Data Analysis</source>
          <volume>5</volume>
          (
          <issue>1</issue>
          ),
          <fpage>73</fpage>
          -
          <lpage>91</lpage>
          (
          <year>2001</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          24.
          <string-name>
            <surname>Ore</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          :
          <article-title>Galois connections</article-title>
          .
<source>Trans. Amer. Math. Soc</source>
          .
          <volume>55</volume>
          ,
          <fpage>494</fpage>
          -
          <lpage>513</lpage>
          (
          <year>1944</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          25.
          <string-name>
            <surname>Palekar</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Narwekar</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Manjeshwar</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rane</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          :
          <article-title>Data mining using apriori algorithm</article-title>
          .
          <source>Intern. Journal of Eng. Trends and Technology</source>
          <volume>28</volume>
          (
          <issue>4</issue>
          ),
          <fpage>190</fpage>
          -
          <lpage>192</lpage>
          (
          <year>2015</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          26.
<string-name>
            <surname>Polyakov</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>About the principles of neural brain organization</article-title>
. Moscow State University (
          <year>1965</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          27.
          <string-name>
            <surname>Rani</surname>
            ,
            <given-names>U.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Premchand</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Govardhan</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>A novel algorithm for discovering frequent closures and generators</article-title>
          .
          <source>International Journal on Recent and Innovation Trends in Computing and Communication</source>
          <volume>3</volume>
          (
          <issue>9</issue>
          ),
          <fpage>5484</fpage>
          -
          <lpage>5487</lpage>
          (
          <year>2015</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          28.
          <string-name>
            <surname>Rudolph</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Using FCA for Encoding Closure Operators into Neural Networks</article-title>
          , pp.
          <fpage>321</fpage>
          -
          <lpage>332</lpage>
          . Springer Berlin Heidelberg, Berlin, Heidelberg (
          <year>2007</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          29.
          <string-name>
            <surname>Smith</surname>
            ,
            <given-names>D.T.</given-names>
          </string-name>
          :
          <article-title>A Formal Concept Analysis Approach to Data Mining: The QuICL Algorithm for Fast Iceberg Lattice Construction</article-title>
          .
          <source>Computer and Information Science</source>
          <volume>7</volume>
          (
          <issue>1</issue>
          ),
          <fpage>10</fpage>
          -
          <lpage>32</lpage>
          (
          <year>2014</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          30.
<string-name>
            <surname>Stumme</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>Efficient data mining based on formal concept analysis</article-title>
. In:
          <string-name>
            <surname>Hameurlain</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Cicchetti</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Traunmüller</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          (eds.)
          <source>DEXA. Lecture Notes in Computer Science</source>
          , vol.
          <volume>2453</volume>
          , pp.
          <fpage>534</fpage>
          -
          <lpage>546</lpage>
          . Springer (
          <year>2002</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref31">
        <mixed-citation>
          31.
          <string-name>
            <surname>Stumme</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Taouil</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bastide</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pasquier</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lakhal</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          :
          <article-title>Computing iceberg concept lattices with Titanic</article-title>
          .
          <source>Data &amp; Knowledge Engineering</source>
          <volume>42</volume>
          (
          <issue>2</issue>
          ),
          <fpage>189</fpage>
          -
          <lpage>222</lpage>
          (
          <year>2002</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref32">
        <mixed-citation>
          32.
          <string-name>
            <surname>Szathmary</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Valtchev</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Napoli</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
<string-name>
            <surname>Godin</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          :
          <article-title>Constructing Iceberg Lattices from Frequent Closures Using Generators</article-title>
          .
In:
          <source>Proceedings of International Conference on Discovery Science</source>
          , pp.
          <fpage>136</fpage>
          -
          <lpage>147</lpage>
          . Springer (
          <year>2008</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref33">
        <mixed-citation>
          33.
          <string-name>
            <surname>Tsopzé</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nguifo</surname>
            ,
            <given-names>E.M.</given-names>
          </string-name>
          ,
<string-name>
            <surname>Tindo</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          :
          <article-title>CLANN: Concept lattice-based artificial neural network for supervised classification</article-title>
          .
In:
          <source>Proceedings of the Fifth International Conference on Concept Lattices and Their Applications</source>
          . vol.
          <volume>331</volume>
          , pp.
          <fpage>153</fpage>
          -
          <lpage>164</lpage>
          (
          <year>2007</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref34">
        <mixed-citation>
          34.
<string-name>
            <surname>Vityaev</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          :
          <article-title>Knowledge acquisition from data. Cognition via computers</article-title>
          .
          <source>Models of cognitive processes. Essays on physiological functional systems</source>
          , Novosibirsk State University (
          <year>2006</year>
          )
        </mixed-citation>
      </ref>
      <ref id="ref35">
        <mixed-citation>
          35.
<string-name>
            <surname>Wille</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          :
          <article-title>Restructuring lattice theory: an approach based on hierarchies of concepts</article-title>
. In:
          <string-name>
            <surname>Ferré</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rudolph</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          (eds.)
          <source>Formal Concept Analysis: Proceedings of 7th ICFCA</source>
          . pp.
          <fpage>314</fpage>
          -
          <lpage>339</lpage>
          . Springer (
          <year>2009</year>
          )
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>