<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Research perspective: the role of automated machine learning in fuzzy logic</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Radwa El Shawi</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Stefania Tomasiello</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Henri Liiva</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Institute of Computer Science, University of Tartu</institution>
          ,
          <addr-line>Tartu</addr-line>
          ,
          <country country="EE">Estonia</country>
        </aff>
      </contrib-group>
      <abstract>
<p>In this short paper, we briefly discuss the potential of Automated Machine Learning (AutoML) in the context of fuzzy-based systems. The scarcity of related works in the current literature shows that this line of research is just beginning. We give an overview of AutoML and of its investigated and potential uses for enhancing fuzzy-based systems.</p>
      </abstract>
      <kwd-group>
<kwd>fuzzy-based systems</kwd>
        <kwd>automated machine learning</kwd>
        <kwd>hyperparameter optimization</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>In general, building a high-quality machine learning pipeline is an iterative,
complex, and time-consuming process that requires solid knowledge and understanding of the
various techniques that can be employed in each step of the pipeline. With the continuous and
vast increase in the amount of data in our digital world, it has been acknowledged that the
number of knowledgeable data scientists cannot scale to address these challenges. Thus, there is
a crucial need to automate the process of building good machine learning pipelines, so that the
presence of a human in the loop can be dramatically reduced. Research in Automated Machine
Learning (AutoML) aims to alleviate both the computational cost and the human expertise required
for developing machine learning pipelines, through automation with efficient algorithms. In
particular, AutoML techniques are enabling the widespread use of machine learning
by domain experts and non-technical users.</p>
      <p>
        The use of AutoML in fuzzy logic is just beginning; very few papers in the
literature cover this topic. For instance, in [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ], the joint use of fuzzy logic and AutoML
enabled an enhanced intelligent tiering system for storage optimization. The architecture
proposed in [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] consists of a fuzzy inference system that selects the eligible tiers, satisfying
some business criteria, and an AutoML component, such as Auto-Sklearn, that recommends a
classification algorithm from a previously formed list. The study in [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ] introduces a
fuzzy-based active learning method for predicting students’ academic performance, exploiting
AutoML practices. The authors first considered six auto-configured representative fuzzy
classifiers, performing many experiments to select the most promising ones. Secondly, four active
learning models were constructed, while AutoML was employed to automate the selection of the
fuzzy machine learning models and their respective hyperparameters, using the three prevailing
classifiers of the previous comparative study. AutoML and fuzzy C-means are discussed in [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]
in the context of biomedical data analysis, a field attracting growing interest due
to its complex data sets. To this end, the authors discussed a model based on fuzzy C-means
with meta-heuristic optimization. In this short paper, we briefly discuss the potential of
AutoML in the design of fuzzy systems.
      </p>
    </sec>
    <sec id="sec-2">
      <title>2. AutoML: an overview</title>
      <p>
        In general, the process of building a high-quality machine learning model is an iterative, complex
and time-consuming process (Figure 1). In particular, a data scientist is commonly challenged
with a large number of choices and design decisions that need to be taken. For example, the data
scientist needs to select among a large variety of algorithms, including classification or regression
techniques (e.g. Support Vector Machines, Neural Networks, Bayesian models, Decision Trees,
etc.), in addition to tuning a large number of hyper-parameters of the selected algorithm. The
performance of the model can be optimised for the metric of choice (e.g., accuracy, sensitivity,
specificity, F1-score). Naturally, the decisions of the data scientist in each of these steps affect
the performance and the quality of the developed model [
        <xref ref-type="bibr" rid="ref4 ref5 ref6">4, 5, 6</xref>
        ]. For instance, on the yeast dataset,
different parameter configurations of a Random Forest classifier result in accuracy values
spanning a range of around 5%. Also, using different classifier learning algorithms
leads to widely different performance values for the fitted model, with differences reaching 20% on the
same dataset. Thus, making such decisions requires expert knowledge. However, in practice,
machine learning tools are increasingly used by non-experts, who require off-the-shelf
solutions. Therefore, there has been growing interest in automating the steps of building
machine learning pipelines.
      </p>
      <p>
        Therefore, several frameworks have been designed to support automating the Combined
Algorithm Selection and Hyper-parameter tuning (CASH) problem [
        <xref ref-type="bibr" rid="ref7 ref8">7, 8</xref>
        ]. These techniques
have commonly formulated the problem as an optimization problem that can be solved by a wide
range of techniques. Let A denote a machine learning algorithm with n hyperparameters. We
denote the domain of the i-th hyperparameter by Λ_i and the overall hyperparameter
configuration space by Λ = Λ_1 × Λ_2 × ... × Λ_n. A vector of hyperparameters is denoted by
λ ∈ Λ, and A with its hyperparameters instantiated to λ is denoted by A_λ. Given a set of
candidate algorithms 𝒜 and a dataset D split into D_train and D_valid, the CASH problem aims to find
A*_λ* ∈ argmin_{A ∈ 𝒜, λ ∈ Λ} ℒ(A_λ, D_train, D_valid)
      </p>
      <p>where ℒ(A_λ, D_train, D_valid) measures the loss of the model generated by algorithm A with
hyperparameters λ trained on data D_train and evaluated on validation data D_valid.</p>
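      <p>For illustration only, the CASH formulation can be sketched in a few lines of Python. The two candidate algorithms, their hyperparameter grids and the toy data below are all invented for the example and are not taken from any framework:</p>
      <preformat>
```python
# Hypothetical sketch of the CASH problem: jointly choose an algorithm and its
# hyperparameters by minimising validation loss. All names and data are toy examples.

# Toy 1-D training/validation data: (feature, label) pairs.
D_train = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1)]
D_valid = [(0.15, 0), (0.25, 0), (0.75, 1), (0.85, 1)]

def threshold_clf(t):
    """Algorithm A1: predict 1 if x > t."""
    return lambda x: int(x > t)

def knn_clf(k):
    """Algorithm A2: 1-D k-nearest-neighbour majority vote over D_train."""
    def predict(x):
        neighbours = sorted(D_train, key=lambda p: abs(p[0] - x))[:k]
        return int(sum(label for _, label in neighbours) >= (k + 1) / 2)
    return predict

# Search space: each algorithm paired with its own hyperparameter domain Lambda.
search_space = {
    threshold_clf: [{"t": t} for t in (0.1, 0.5, 0.9)],
    knn_clf: [{"k": k} for k in (1, 3, 5)],
}

def loss(model, data):
    """0-1 loss of a fitted model on a dataset."""
    return sum(model(x) != y for x, y in data) / len(data)

# CASH: argmin over (algorithm, lambda) pairs of the validation loss.
best = min(
    ((algo, lam) for algo, grid in search_space.items() for lam in grid),
    key=lambda pair: loss(pair[0](**pair[1]), D_valid),
)
best_loss = loss(best[0](**best[1]), D_valid)
```
      </preformat>
      <p>Real AutoML systems replace this exhaustive enumeration with Bayesian optimization or evolutionary search over far larger spaces, but the objective being minimised is the same.</p>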
      <p>In the following, we review some of the state-of-the-art frameworks for AutoML.</p>
      <p>[Figure 1. Overview of an automated machine learning system: data, an optimization metric and constraints (time/cost) are taken as input; the system searches a space of machine learning techniques, with systems-level optimization, and outputs a machine learning model.]</p>
      <sec id="sec-2-1">
        <title>2.1. AutoWeka</title>
        <p>
          Auto-Weka (https://www.cs.ubc.ca/labs/beta/Projects/autoweka/) is considered the earliest open-source automated machine learning framework
with a basic GUI [
          <xref ref-type="bibr" rid="ref9">9</xref>
          ]. It was implemented in Java on top of Weka (https://www.cs.waikato.ac.nz/ml/weka/), a popular machine
learning library that offers an extensive suite of machine learning methods. Auto-Weka applies
Bayesian optimization using Sequential Model-based Algorithm Configuration (SMAC) [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ] as
the default optimizer, but also supports the tree-structured Parzen estimator (TPE) as an optimizer for both
algorithm selection and hyper-parameter optimization. The main advantage of using SMAC
is its robustness: it can quickly discard low-performing parameter configurations
after evaluating them on a small number of dataset folds. SMAC shows better performance
in experiments than TPE [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ]. Auto-Weka tackles both classification and regression
machine learning problems.
        </p>
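        <p>The early-discarding idea can be caricatured as follows; this is not SMAC itself, and the per-fold scores and configuration names are fabricated for illustration:</p>
        <preformat>
```python
# Illustrative sketch (not SMAC): race hyperparameter configurations by
# probing a few cross-validation folds and discarding clearly poor ones early.
# The per-fold scores below are fabricated for the example.
fold_scores = {
    "config_A": [0.62, 0.61, 0.63, 0.62, 0.64],
    "config_B": [0.81, 0.80, 0.82, 0.83, 0.81],
    "config_C": [0.79, 0.82, 0.80, 0.81, 0.84],
}

def race(scores_by_config, probe_folds=2, margin=0.05):
    """Drop configurations whose mean over the first probe_folds trails the
    best probe mean by more than `margin`; fully evaluate the survivors."""
    probe = {c: sum(s[:probe_folds]) / probe_folds
             for c, s in scores_by_config.items()}
    best_probe = max(probe.values())
    survivors = [c for c, m in probe.items() if best_probe - m <= margin]
    full = {c: sum(scores_by_config[c]) / len(scores_by_config[c])
            for c in survivors}
    return max(full, key=full.get), survivors

winner, survivors = race(fold_scores)  # config_A is discarded after two folds
```
        </preformat>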
      </sec>
      <sec id="sec-2-2">
        <title>2.2. Auto-Sklearn</title>
        <p>
          Auto-Sklearn (https://github.com/automl/auto-sklearn) [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ] has been implemented on top of Scikit-Learn (https://scikit-learn.org/), a popular Python machine
learning package. Auto-Sklearn introduced the idea of meta-learning to warm-start algorithm selection
and hyper-parameter tuning, and it uses SMAC as its Bayesian optimization technique.
In addition, ensemble methods are used to improve the performance of the output models. Both
the meta-learning and the ensemble construction improved the performance of vanilla SMAC optimization.
        </p>
      </sec>
      <sec id="sec-2-3-tpot">
        <title>2.3. TPOT</title>
        <p>
          The TPOT framework (https://automl.info/tpot/) represents another type of solution [
          <xref ref-type="bibr" rid="ref12">12</xref>
          ], implemented with
Scikit-Learn as its backend, that automatically finds a machine learning pipeline. It is based on
genetic programming: many different candidate pipelines of feature engineering and
learning algorithms are explored, and the best one among them is returned. TPOT tackles both classification
and regression problems and allows the user to restrict the optimization search space in terms
of feature transformers, machine learning models and their hyper-parameters. TPOT is also
scalable, as it supports distributed computation through Dask (https://dask.org/).
        </p>
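        <p>The genetic-programming search behind TPOT can be caricatured in a few lines; the step names, fitness function and parameters below are invented for illustration and bear no relation to the actual TPOT API:</p>
        <preformat>
```python
# Toy caricature of genetic-programming pipeline search (not the TPOT API).
# A "pipeline" is a list of step names; the fitness function is invented.
import random

STEPS = ["scale", "select_features", "pca", "tree", "knn", "svm"]

def fitness(pipeline):
    # Invented objective: reward ending in a model step and scaling beforehand.
    score = 0.0
    if pipeline[-1] in ("tree", "knn", "svm"):
        score += 1.0
    if "scale" in pipeline[:-1]:
        score += 0.5
    return score - 0.01 * len(pipeline)  # parsimony pressure

def mutate(pipeline, rng):
    p = list(pipeline)
    if rng.random() < 0.5:
        p[rng.randrange(len(p))] = rng.choice(STEPS)            # point mutation
    else:
        p.insert(rng.randrange(len(p) + 1), rng.choice(STEPS))  # insertion
    return p

def evolve(generations=30, pop_size=20, seed=0):
    rng = random.Random(seed)
    # Seed the population with one valid pipeline plus random one-step pipelines.
    pop = [["tree"]] + [[rng.choice(STEPS)] for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                          # truncation selection
        pop = parents + [mutate(rng.choice(parents), rng) for _ in parents]
    return max(pop, key=fitness)

best = evolve()  # a short pipeline ending in a model step
```
        </preformat>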
      </sec>
      <sec id="sec-2-3">
        <title>2.4. Recipe</title>
        <p>
          Recipe [
          <xref ref-type="bibr" rid="ref13">13</xref>
          ] follows the same optimization procedure as TPOT, using genetic programming,
which in turn exploits the advantages of a global search. However, Recipe addresses the
unconstrained search problem of TPOT, in which resources can be spent generating and
evaluating invalid solutions, by adding a grammar that avoids the generation of invalid pipelines
and can speed up the optimization process. Second, it works with a bigger search space of different
model configurations than Auto-Sklearn and TPOT.
        </p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>3. AutoML and fuzzy logic: application, challenges and perspectives</title>
      <p>
        The potential of AutoML to select a proper fuzzy classifier has been recently shown [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. The
performance of most computing schemes (both for classification and prediction) depends on
the values assigned to their hyperparameters. Hyperparameters usually have a varying degree
of complexity and dimension, which makes them difficult to set. Hyperparameter optimization is one
of the tasks that AutoML foresees. There are some examples in the literature of hyperparameter
optimization in the context of fuzzy-based systems. The major hyperparameters affecting the
efficiency of fuzzy-based systems are related to the partitioning of the universe of discourse (e.g.
length, type). For instance, in [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ] an evolutionary hyperparameter optimization for the Weighted
Multivariate Fuzzy Time Series method was proposed. In [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ], particle swarm optimization of
partitions and fuzzy order for fuzzy time series forecasting was discussed. In [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ], the evidence
framework is applied to optimize the hyperparameters of the Fuzzy HyperSphere Support Vector
Machine. An attempt to find the best fuzzy partition to enhance the performance of Fuzzy
Transforms is offered in [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ]. None of these examples is in the context of AutoML, which
could instead provide a better framework to select the most suitable fuzzy-based system.
      </p>
      <p>
        It must be mentioned that existing AutoML systems and tools mainly target the domain of
supervised learning. Unsupervised learning, in particular clustering, would also benefit from
AutoML solutions (e.g. for the evaluation of clustering results). To address this issue, in [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ], a
framework for automated clustering, encompassing algorithm selection and hyperparameter
tuning, was proposed. A similar tool would be valuable to support the best choice of a clustering
method in neuro-fuzzy systems [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ].
      </p>
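      <p>A toy sketch in the spirit of such validity-index-driven selection: run a simple 1-D k-means for several candidate cluster counts and keep the count maximising a Calinski-Harabasz-like index. The data, initialisation and index are invented for illustration:</p>
      <preformat>
```python
# Toy sketch: automated selection of the number of clusters via a
# Calinski-Harabasz-like validity index, on invented 1-D data.
data = [1.0, 1.1, 1.2, 5.0, 5.1, 5.2, 9.0, 9.1, 9.2]

def kmeans_1d(points, k, iters=50):
    """Plain 1-D k-means with a crude deterministic initialisation."""
    centres = points[:: max(1, len(points) // k)][:k]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: abs(p - centres[i]))].append(p)
        centres = [sum(g) / len(g) if g else centres[i]
                   for i, g in enumerate(groups)]
    return centres, groups

def ch_index(points, k):
    """Between/within variance ratio, penalised by k (higher is better)."""
    centres, groups = kmeans_1d(points, k)
    n, overall = len(points), sum(points) / len(points)
    within = sum((p - c) ** 2 for c, g in zip(centres, groups) for p in g)
    between = sum(len(g) * (c - overall) ** 2 for c, g in zip(centres, groups))
    return (between / (k - 1)) / (within / (n - k)) if within else float("inf")

# "AutoML for clustering": pick the number of clusters by the validity index.
best_k = max(range(2, 6), key=lambda k: ch_index(data, k))
```
      </preformat>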
      <p>
        There is in the literature an example of an unsupervised fuzzy rule-based system that does not use
clustering. It is a self-developing (evolving) fuzzy-rule-based classifier system called AutoClass
[
        <xref ref-type="bibr" rid="ref20">20</xref>
        ]. AutoClass learns without a pre-specified number of rules or classes, by using data
clouds. Data clouds are subsets of the given data samples with common properties. Although the
concepts of data clouds and traditional clusters are different, since in the first case there are no
well-defined boundaries, the authors use a measure of the zone of influence of a data cloud. This is
the only user-defined parameter. While this scheme deserves attention, it would be interesting
to compare its performance against that of a fuzzy inference system built with the support
of AutoML.
      </p>
    </sec>
    <sec id="sec-4">
      <title>4. Conclusions</title>
      <p>Choosing the proper method for a given problem formulation and configuring the optimal
parameter setting is a demanding task. AutoML can help in this regard. The performance
of most fuzzy-based systems is affected by the choice of the partition. In some unsupervised
neuro-fuzzy systems, the choice of a proper clustering method is critical. The application of
AutoML to such problems would be beneficial, as suggested by the very few examples in the current
literature.</p>
    </sec>
    <sec id="sec-5">
      <title>Acknowledgments</title>
      <p>Stefania Tomasiello and Henri Liiva acknowledge support from the European Social Fund
through the IT Academy Programme. The work of Radwa El Shawi is funded by the European
Regional Development Funds via the Mobilitas Plus programme (grant MOBTT75).</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Batrouni</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <article-title>A Hybrid Architecture for Tiered Storage with Fuzzy Logic and AutoML</article-title>
          , Y. Luo (Ed.):
          <source>CDVE</source>
          <year>2020</year>
          , LNCS 12341, pp.
          <fpage>67</fpage>
          -
          <lpage>74</lpage>
          ,
          <year>2020</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <surname>Tsiakmaki</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kostopoulos</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kotsiantis</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ragos</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <article-title>Fuzzy-based active learning for predicting student academic performance using autoML: a step-wise approach</article-title>
          ,
          <source>Journal of Computing in Higher Education</source>
          ,
          <year>2021</year>
          , https://doi.org/10.1007/s12528-021-09279-x
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Nayak</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Naik</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dash</surname>
            ,
            <given-names>P.B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pelusi</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <article-title>Optimal fuzzy cluster partitioning by crow search meta-heuristic for biomedical data analysis</article-title>
          ,
          <source>International Journal of Applied Metaheuristic Computing</source>
          <volume>12</volume>
          (
          <issue>2</issue>
          ),
          <fpage>49</fpage>
          -
          <lpage>66</lpage>
          ,
          <year>2021</year>
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>Vafeiadis</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Diamantaras</surname>
            ,
            <given-names>K. I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sarigiannidis</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chatzisavvas</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <article-title>A comparison of machine learning techniques for customer churn prediction</article-title>
          ,
          <source>Simulation Modelling Practice and Theory</source>
          ,
          <volume>55</volume>
          ,
          <fpage>1</fpage>
          -
          <lpage>9</lpage>
          ,
          <year>2015</year>
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <surname>Probst</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Boulesteix</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <article-title>To Tune or Not to Tune the Number of Trees in Random Forest</article-title>
          ,
          <source>Journal of Machine Learning Research</source>
          ,
          <volume>18</volume>
          ,
          <fpage>181</fpage>
          -
          <lpage>191</lpage>
          ,
          <year>2017</year>
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Pedregosa</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          et al.,
          <article-title>Scikit-learn: Machine learning in Python</article-title>
          ,
          <source>Journal of machine learning research</source>
          ,
          <volume>12</volume>
          ,
          <fpage>2825</fpage>
          -
          <lpage>2830</lpage>
          ,
          <year>2011</year>
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <surname>El Shawi</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Maher</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sakr</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <source>Automated Machine Learning: State-of-The-Art and Open Challenges</source>
          ,
          <year>2019</year>
          , http://arxiv.org/abs/1906.02287
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <surname>He</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhao</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chu</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <article-title>AutoML: A Survey of the State-of-the-Art</article-title>
          ,
          <year>2019</year>
          , arXiv preprint arXiv:1908.00709
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Kotthoff</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          et al.,
          <article-title>Auto-WEKA 2.0: Automatic Model Selection and Hyperparameter Optimization in WEKA</article-title>
          ,
          <source>JMLR</source>
          ,
          <volume>18</volume>
          (
          <issue>1</issue>
          ),
          <year>2017</year>
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Hutter</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hoos</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Leyton-Brown</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <article-title>Sequential model-based optimization for general algorithm configuration</article-title>
          ,
          <source>International Conference on Learning and Intelligent Optimization</source>
          ,
          <fpage>507</fpage>
          -
          <lpage>523</lpage>
          ,
          <year>2011</year>
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <surname>Feurer</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Klein</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Eggensperger</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Springenberg</surname>
            ,
            <given-names>J.T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Blum</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Hutter</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <year>2019</year>
          .
          <article-title>Auto-sklearn: efficient and robust automated machine learning</article-title>
          . In
          <source>Automated Machine Learning</source>
          (pp.
          <fpage>113</fpage>
          -
          <lpage>134</lpage>
          ). Springer, Cham.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <surname>Olson</surname>
            ,
            <given-names>R. S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Moore</surname>
            ,
            <given-names>J. H.</given-names>
          </string-name>
          ,
          <article-title>TPOT: A Tree-based Pipeline Optimization Tool for Automating Machine Learning</article-title>
          ,
          <source>Proceedings of the Workshop on Automatic Machine Learning</source>
          ,
          <year>2016</year>
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <surname>de Sá</surname>
            ,
            <given-names>A.G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pinto</surname>
            ,
            <given-names>W.J.G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Oliveira</surname>
            ,
            <given-names>L.O.V.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Pappa</surname>
            ,
            <given-names>G.L.</given-names>
          </string-name>
          ,
          <year>2017</year>
          , April.
          <article-title>RECIPE: a grammar-based framework for automatically evolving classification pipelines</article-title>
          .
          <source>In European Conference on Genetic Programming</source>
          (pp.
          <fpage>246</fpage>
          -
          <lpage>261</lpage>
          ). Springer, Cham.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <surname>Silva</surname>
            ,
            <given-names>P.C.L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>De Oliveira E Lucas</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sadaei</surname>
            ,
            <given-names>H.J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Guimaraes</surname>
            ,
            <given-names>F.G.</given-names>
          </string-name>
          ,
          <article-title>Distributed Evolutionary Hyperparameter Optimization for Fuzzy Time Series</article-title>
          ,
          <source>IEEE Transactions on Network and Service Management</source>
          <volume>17</volume>
          (
          <issue>3</issue>
          ),
          <volume>9034097</volume>
          ,
          <fpage>1309</fpage>
          -
          <lpage>1321</lpage>
          ,
          <year>2020</year>
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>N.</given-names>
            <surname>Kumar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Susan</surname>
          </string-name>
          ,
          <article-title>Particle swarm optimization of partitions and fuzzy order for fuzzy time series forecasting of COVID-19</article-title>
          ,
          <source>Applied Soft Computing</source>
          ,
          <volume>110</volume>
          ,
          <issue>107611</issue>
          ,
          <year>2021</year>
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <surname>Tian</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhimin</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Qian</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wenge</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <article-title>The evidence framework applied to fuzzy hypersphere SVM for UWB SAR landmine detection</article-title>
          ,
          <source>International Conference on Signal Processing Proceedings, ICSP 3</source>
          ,
          <issue>4129197</issue>
          ,
          <year>2007</year>
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <surname>Loia</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tomasiello</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Troiano</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <article-title>Improving approximation properties of fuzzy transform through non-uniform partitions</article-title>
          ,
          <source>Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) 10147 LNAI</source>
          , pp.
          <fpage>63</fpage>
          -
          <lpage>72</lpage>
          ,
          <year>2017</year>
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <surname>Poulakis</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Doulkeridis</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kyriazis</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          , AutoClust:
          <article-title>A framework for automated clustering based on cluster validity indices</article-title>
          ,
          <source>IEEE International Conference on Data Mining, ICDM</source>
          <year>2020</year>
          ,
          <volume>9338346</volume>
          , pp.
          <fpage>1220</fpage>
          -
          <lpage>1225</lpage>
          ,
          <year>2020</year>
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <surname>Pillai</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <article-title>Recent advances in neuro-fuzzy system: A survey</article-title>
          ,
          <source>Knowledge-Based Systems</source>
          , vol.
          <volume>152</volume>
          , pp.
          <fpage>136</fpage>
          -
          <lpage>162</lpage>
          ,
          <year>2018</year>
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <surname>Costa</surname>
            ,
            <given-names>B. S. J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Angelov</surname>
            ,
            <given-names>P. P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Guedes</surname>
            ,
            <given-names>L. A.</given-names>
          </string-name>
          <article-title>Fully unsupervised fault detection and identification based on recursive density estimation and self-evolving cloud-based classifier</article-title>
          .
          <source>Neurocomputing</source>
          ,
          <volume>150</volume>
          ,
          <fpage>289</fpage>
          -
          <lpage>303</lpage>
          ,
          <year>2015</year>
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>