=Paper= {{Paper |id=None |storemode=property |title=Extracting Both MofN Rules and if-then Rules from the Training Neural Networks |pdfUrl=https://ceur-ws.org/Vol-764/paper10.pdf |volume=Vol-764 |dblpUrl=https://dblp.org/rec/conf/nesy/TsopzeNT11 }} ==Extracting Both MofN Rules and if-then Rules from the Training Neural Networks== https://ceur-ws.org/Vol-764/paper10.pdf
    Extracting both MofN rules and if-then rules from the training neural networks
                Norbert Tsopze1,4 - Engelbert Mephu Nguifo2,3 - Gilbert Tindo1
1 Department of Computer Science - Faculty of Science - University of Yaounde I, Cameroon
2 Clermont Université, Université Blaise Pascal, LIMOS, BP 10448, F-63000 Clermont-Ferrand, France
3 CNRS, UMR 6158, LIMOS, F-63173 Aubière, France
4 CRIL-CNRS UMR 8188, Université Lille-Nord de France, Artois, SP 16, F-62307 Lens, France

1    Introduction

Artificial Neural Network classifiers have many advantages, such as noise tolerance, possibility of parallelization, and good training with a small quantity of data. Coupling neural networks with an explanation component would increase their usage in such applications. The explanation capacity of neural networks is addressed by extracting the knowledge incorporated in the trained network [Andrews et al., 1995]. We consider a single neuron (or perceptron) with the Heaviside map as activation function (f(x) = 0 if x < 0, else 1). For a given perceptron with connection weights vector W and threshold θ, this means finding the different states in which the neuron is active (which can be reduced to the Knapsack problem).

[Figure 1: Rules extraction process. A trained artificial neural network (inputs a–d, hidden units h1–h3, outputs c1–c3) can be converted (1) directly to if-then rules, (2) directly to MofN rules, or (3) to the MaxSubsets and generators lists, from which (4) if-then rules are generated from the MaxSubsets and (5) MofN rules from the generators.]
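To make the activity condition concrete, the perceptron's behavior can be checked with a short script. This is a minimal sketch: the weight values, threshold, and four-input layout below are hypothetical illustrations, not taken from the paper.

```python
# Heaviside activation: f(x) = 0 if x < 0, else 1
def heaviside(x):
    return 0 if x < 0 else 1

# The neuron with weights W and threshold theta is active on a binary
# input vector exactly when sum(w_i * x_i) - theta >= 0.
def is_active(inputs, weights, theta):
    total = sum(w * x for w, x in zip(weights, inputs))
    return heaviside(total - theta) == 1

# Hypothetical perceptron with four inputs (a, b, c, d)
W = [3.0, 2.0, -1.0, 1.5]
theta = 4.0

print(is_active([1, 1, 0, 0], W, theta))  # 3.0 + 2.0 = 5.0 >= 4.0 -> True
print(is_active([1, 0, 1, 0], W, theta))  # 3.0 - 1.0 = 2.0 <  4.0 -> False
```

Rule extraction then amounts to characterizing all binary input states for which `is_active` returns true.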
   With the existing algorithms, two forms of rules are reported in the literature: the 'If (condition) then conclusion' form, noted 'if-then' rules, and the 'If (m of a set of conditions) then conclusion' form, noted 'M of N' rules.
   The intermediate structures that we introduce are called the MaxSubset list and the generator list. A MaxSubset is a minimal structure used to represent the if-then rules, while the generator list is a selection of MaxSubsets from which we can derive all MaxSubsets and all MofN rules. We introduce heuristics to prune and reduce the candidate search space. These heuristics consist of sorting the incoming links in descending order of weight, and then pruning the search space by bounding the subset cardinality with determined values.
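Under one reading of these heuristics, the search for minimal active input subsets could be sketched as follows. This is only an illustrative sketch, not the paper's actual procedure: the weight values, the cardinality bound, and the interpretation of a MaxSubset as a minimal subset of inputs whose weights alone reach the threshold are all assumptions.

```python
from itertools import combinations

def minimal_active_subsets(weights, theta, max_card):
    """Enumerate minimal input subsets whose summed weights reach the
    threshold, scanning cardinalities in increasing order up to max_card.
    Links are first sorted in descending weight order (pruning heuristic)."""
    order = sorted(range(len(weights)), key=lambda i: weights[i], reverse=True)
    minimal = []
    for k in range(1, max_card + 1):
        for subset in combinations(order, k):
            s = set(subset)
            # Supersets of an already-found minimal subset are pruned
            if any(m <= s for m in minimal):
                continue
            if sum(weights[i] for i in s) >= theta:
                minimal.append(s)
    return minimal

W = [3.0, 2.0, 1.5, -1.0]   # hypothetical incoming weights
print(minimal_active_subsets(W, 4.0, 3))  # -> [{0, 1}, {0, 2}]
```

Scanning cardinalities in increasing order means every subset kept is minimal by construction, and the cardinality bound caps the exponential enumeration.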
2   The MaxSubsets and generators rules extraction approach

The general form of a MofN rule is 'if m1 of N1 ∧ m2 of N2 ∧ ... ∧ mp of Np then conclusion', i.e. 'if ⋀i (mi of Ni) then conclusion'; for each subset Ni of the input set, if at least mi of its elements are verified, the conclusion is true.
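Operationally, such a rule is a conjunction of threshold tests over input subsets. The following minimal sketch evaluates a rule of this form; the example rule and input assignments are invented for illustration.

```python
def mofn_holds(rule, true_inputs):
    """rule: list of (m_i, N_i) pairs. The rule fires when, for every
    pair, at least m_i of the conditions in N_i are satisfied."""
    return all(len(set(Ni) & true_inputs) >= mi for mi, Ni in rule)

# Hypothetical rule: 'if 2 of {a, b, c} and 1 of {d, e} then conclusion'
rule = [(2, {"a", "b", "c"}), (1, {"d", "e"})]

print(mofn_holds(rule, {"a", "c", "d"}))  # 2 of {a,b,c}, 1 of {d,e} -> True
print(mofn_holds(rule, {"a", "b"}))       # 0 of {d,e}              -> False
```

An if-then rule is the special case where each mi equals |Ni|, i.e. every condition in each subset must hold.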
   The common limit of previous approaches is the exclusive form of the extracted rules. We thus introduce a novel approach, called MaxSubset, from which it is possible to generate both forms of rules. The MaxSubset approach follows operations (3), (4) and (5) of figure 1, while the existing algorithms follow path (1) for the if-then rules and path (2) for the MofN rules. The processes (3), (4) and (5) of figure 1 are described as follows: (3) MaxSubsets and generators extraction; (4) generation of if-then rules from the MaxSubsets list; and (5) generation of MofN rules from the generators list. An extended version of this work is described in [Tsopze et al., 2011].

3   Conclusion

This approach consists in extracting a minimal list of elements called the MaxSubset list, and then generating rules in one of the standard forms: if-then or MofN. To our knowledge, it is the first approach able to propose to the user a generic representation of rules from which it is possible to derive both forms of rules.

Acknowledgments

This work is partially supported by the French Embassy in Cameroon, under the first author's PhD grant provided by the French Embassy Service SCAC (Service de Coopération et d'Action Culturelle).

References

[Andrews et al., 1995] R. Andrews, J. Diederich, and A. Tickle. Survey and critique of techniques for extracting rules from trained artificial neural networks. Knowledge-Based Systems, 8(6):373–389, 1995.

[Tsopze et al., 2011] N. Tsopze, E. Mephu Nguifo, and G. Tindo. Towards a generalization of decompositional approach of rules extraction from network. In Proceedings of IJCNN'11, International Joint Conference on Neural Networks, 2011.