<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Hyperplane Clusterization of Small Data Based on Pseudo-Inverse and Projective Matrices</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Iurii Krak</string-name>
          <email>krak@univ.kiev.ua</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Hrygorii Kudin</string-name>
          <email>gkudin@ukr.net</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Mykola Efremov</string-name>
          <email>nick.yefremov.in@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Alexander Samoylov</string-name>
          <email>SamoylovSasha@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Vladislav Kuznetsov</string-name>
          <email>kuznetsov.wlad@incyb.kiev.ua</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Yedilkhan Amirgaliyev</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Veda Kasianiuk</string-name>
          <email>veda.kasianiuk@gmail.com</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Glushkov Cybernetics Institute</institution>
          ,
          <addr-line>Kyiv, 40, Glushkov ave., 03187</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Institute of Information and Computer Technologies</institution>
          ,
          <addr-line>125, Pushkin str., Almaty, 050010</addr-line>
          ,
          <country>Republic of Kazakhstan</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Taras Shevchenko National University of Kyiv</institution>
          ,
          <addr-line>60, Volodymyrska Street, Kyiv, 01033</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>Based on the developed mathematical methods for solving systems of linear algebraic equations, an approach to solving problems of classification and clustering of information using the characteristic features of objects is proposed. An algorithm of hyperplane clustering with verification of a given efficiency criterion is developed: hyperplanes are constructed in a space derived from the original feature space using the theory of perturbations of pseudo-inverse and projection matrices. A method of piecewise hyperplane cluster synthesis for selecting the most effective characteristic features and an algorithm for constructing piecewise hyperplane clusters are also proposed, which allow an effective solution of the stated problems to be found. The productivity and efficiency of the proposed approach are shown by the example of scaling the characteristic features used for recognizing the letters of the fingerspelling alphabet of sign language.</p>
      </abstract>
      <kwd-group>
        <kwd>clustering</kwd>
        <kwd>classification</kwd>
        <kwd>pseudo-inverse operations</kwd>
        <kwd>SLAE</kwd>
        <kwd>optimization</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>One of the important problems in the classification and clustering of information is minimizing the dimension of the feature space and choosing criteria for optimal solutions in practical use. Such problems are effectively solved by the method of multidimensional scaling of empirical data on the proximity of objects, with the help of which the dimension of the space of essential characteristics of the measured objects is determined and the configuration of points (objects) in this space is constructed. This space is a multidimensional scale, similar to the scales commonly used in various applications in the sense that the values of the specially generated essential characteristics of the measured objects correspond to certain positions on the axes of the new space [1]-[5].</p>
      <p>The purpose of this work is to develop mathematical methods for the synthesis of systems that solve problems of classification and clustering based on information about the characteristic features of objects [6]-[10]. These problems are solved by constructing hyperplanes in a space derived from the original feature space, using the theory of perturbations of pseudo-inverse and projection matrices and solving systems of linear algebraic equations. The paper proposes a method for synthesizing a piecewise hyperplane cluster to isolate the most effective characteristic features, and an algorithm for constructing piecewise hyperplane clusters that allows an effective solution of the problems to be found. The performance and efficiency of the proposed approach are shown on the example of scaling characteristic features for recognizing the letters of the fingerspelling alphabet of sign language [8], [9], [11].</p>
    </sec>
    <sec id="sec-2">
      <title>2. Related works</title>
      <p>The problem of synthesis of a piecewise hyperplane cluster for a training sample of vectors Ω_0 = {x : x(j) ∈ E^m, j = 1,…,n}, where x(1),…,x(n) are vectors from the Euclidean feature space E^m, is to build a cluster so that the training sample points in this space are located quite close, in the sense of a given distance criterion, to some set of hyperplanes that are formed from this sample.</p>
      <p>Note that in this formulation of the clustering problem, the components of the set of hyperplanes are not known in advance. Therefore, for the correct construction of piecewise hyperplane clustering procedures, it is assumed that the vectors x(1),…,x(n) from the feature space E^m can belong to one of several hyperplanes L(A(k), b(k)), where A(k) ∈ E^{s×m}, b(k) ∈ E^s, k = 1, 2, …, for some given dimension s (s &lt; m). Here A(k) and b(k) are the matrix and vector parameters, respectively, of a fixed hyperplane L(A(k), b(k)), k = 1, 2, … .</p>
      <p>The proposed method of piecewise hyperplane cluster synthesis is based on representing hyperplanes by means of the set of solutions (pseudo-solutions) of the systems of algebraic equations</p>
      <p>A(k)x = b(k), (1)</p>
      <p>L(A(k), b(k)) = {x ∈ E^m : x = A^+(k)b(k) + Z(A(k))z, z ∈ E^m}. (2)</p>
      <p>Here A^+ is the pseudo-inverse matrix and Z is a projection matrix.</p>
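      <p>The solution-set representation of a hyperplane above is easy to exercise numerically. A minimal NumPy sketch, using np.linalg.pinv in place of an explicit formula for A^+ (the matrix A and vector b are illustrative, not taken from the paper):</p>

```python
import numpy as np

def hyperplane_point(A, b, z):
    """A point x = A^+ b + Z(A) z of the solution set L(A, b)."""
    A_pinv = np.linalg.pinv(A)            # pseudo-inverse A^+
    Z = np.eye(A.shape[1]) - A_pinv @ A   # Z(A): projector onto the null space of A
    return A_pinv @ b + Z @ z

# Illustrative hyperplane in E^3 given by a single equation (s = 1, m = 3)
A = np.array([[1.0, 2.0, 2.0]])
b = np.array([3.0])

# Every choice of the free vector z yields a point satisfying A x = b
for z in (np.zeros(3), np.array([1.0, -1.0, 0.5])):
    assert np.allclose(A @ hyperplane_point(A, b, z), b)
```

      <p>Varying z sweeps out the whole hyperplane, since Z(A)z ranges over the null space of A.</p>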
      <p>Let us give some mathematical results on the inversion (pseudo-inversion) of matrices and the construction of projection matrices, which are important for solving the problem of synthesizing a piecewise hyperplane cluster.</p>
      <p>Let a matrix A = (a_ij), i = 1,…,m, j = 1,…,n, be given, and write down its representations by columns and by rows, which are important in further studies:</p>
      <p>A = (a(1) … a(n)), a(j) ∈ E^m, j = 1,…,n,</p>
      <p>A = (a_(1)^T … a_(m)^T)^T, a_(i) ∈ E^n, i = 1,…,m,</p>
      <p>where T is the transpose symbol.</p>
      <p>We consider the singular decomposition of an arbitrary matrix A of dimension m × n and rank r, r ≤ min(m, n), in the form A = ∑_{i=1}^{r} λ_i u_i v_i^T, where λ_1² ≥ … ≥ λ_r² &gt; 0 are the non-zero eigenvalues of the matrices AA^T and A^T A; v_i ∈ E^n, i = 1,…,r, is the orthonormal set of eigenvectors of the matrix A^T A corresponding to the non-zero eigenvalues λ_i²: A^T A v_i = λ_i² v_i, v_i^T v_j = δ_ij; u_i ∈ E^m, i = 1,…,r, is the orthonormal set of eigenvectors of the matrix AA^T, which also correspond to the non-zero eigenvalues λ_i²: AA^T u_i = λ_i² u_i, u_i^T u_j = δ_ij. Here δ_ij is the Kronecker symbol.</p>
      <p>Also, using the singular representation of the matrix A ∈ E^{m×n}, the pseudo-inverse matrix A^+ ∈ E^{n×m} can be represented as [13]:</p>
      <p>A^+ = ∑_{j=1}^{r} v_j u_j^T λ_j^{-1}.</p>
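      <p>This SVD-based formula for A^+ can be checked directly against NumPy's built-in pseudo-inverse. A small sketch (the function name pinv_via_svd, the tolerance tol, and the rank-deficient test matrix are illustrative):</p>

```python
import numpy as np

def pinv_via_svd(A, tol=1e-12):
    """A^+ as the sum of rank-one terms v_j u_j^T / lambda_j over nonzero singular values."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    r = np.sum(s > tol)   # numerical rank
    return sum(np.outer(Vt[j], U[:, j]) / s[j] for j in range(r))

# Rank-deficient example: second row is twice the first, so r = 1
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
assert np.allclose(pinv_via_svd(A), np.linalg.pinv(A))
```

      <p>Truncating the sum at the numerical rank r is what makes the construction stable for rank-deficient matrices.</p>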
      <p>Let us consider matrices, important for practical applications, that are defined and calculated using the matrices A and A^+:</p>
      <p>1) the projection matrix P(A) = A^+ A ≡ ∑_{i=1}^{r} v_i v_i^T, which is the orthogonal projector onto the subspace L_{A^T} generated by the row vectors of the matrix A;</p>
      <p>2) the projection matrix Z(A) = I_n − P(A), the orthogonal projector onto the subspace orthogonal to L_{A^T}, where I_n is the unit matrix;</p>
      <p>3) the matrix R(A) = A^+ (A^+)^T ≡ ∑_{j=1}^{r} v_j v_j^T λ_j^{-2}.</p>
      <p>Note the important properties of the projection matrices P and Z:</p>
      <p>P(A) + Z(A) = I_n, P(A) = A^+ A,</p>
      <p>P(A) = ∑_{i=1}^{r} v_i v_i^T, Z(A) = ∑_{i=r+1}^{n} v_i v_i^T.</p>
      <p>Note that the calculation of the pseudo-inverse of an arbitrary matrix can be reduced to the calculation of a corresponding inverse matrix using the following relations:</p>
      <p>A^+ = (A^T A)^+ A^T = A^T (AA^T)^+, A^+ = R(A) A^T.</p>
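      <p>These identities are convenient to verify numerically. A hedged NumPy sketch (the function names P, Z, R mirror the notation above; the random matrix is illustrative):</p>

```python
import numpy as np

def P(A):  # orthogonal projector onto the row space of A
    return np.linalg.pinv(A) @ A

def Z(A):  # projector onto the orthogonal complement of the row space
    return np.eye(A.shape[1]) - P(A)

def R(A):  # R(A) = A^+ (A^+)^T
    Ap = np.linalg.pinv(A)
    return Ap @ Ap.T

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 4))   # s = 2 equations in E^4

# P + Z = I_n, P is idempotent, and A^+ = R(A) A^T
assert np.allclose(P(A) + Z(A), np.eye(4))
assert np.allclose(P(A) @ P(A), P(A))
assert np.allclose(R(A) @ A.T, np.linalg.pinv(A))
```
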
      <p>Using the above relations for the inversion and pseudo-inversion of matrices, we write the formulas necessary for the synthesis of a piecewise hyperplane cluster.</p>
      <p>The distance ρ(x(j), L(A, b)) from the point x(j) to the hyperplane L(A, b) is found from the relation</p>
      <p>ρ²(x(j), L(A, b)) = (b − Ax(j))^T R(A^T)(b − Ax(j)).</p>
      <p>To calculate the sum of the squares of the distances of the set of points x(j), j = 1,…,n, to the hyperplane L(A, b), we use the following formula:</p>
      <p>ρ²({x : x(j), j = 1,…,n}, L(A, b)) = ∑_{j=1}^{n} (b − Ax(j))^T R(A^T)(b − Ax(j)) = tr [ R(A^T) ∑_{j=1}^{n} (b − Ax(j))(b − Ax(j))^T ].</p>
      <p>Here tr(·) is the matrix trace.</p>
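      <p>A quick numerical check of this distance relation, using the fact that R(A^T) = (A^+)^T-based weighting reduces to (AA^T)^+ (the plane and the point below are illustrative):</p>

```python
import numpy as np

def dist2_to_hyperplane(x, A, b):
    """rho^2(x, L(A, b)) = (b - A x)^T R(A^T) (b - A x), with R(A^T) = (A A^T)^+."""
    r = b - A @ x
    return float(r @ np.linalg.pinv(A @ A.T) @ r)

# Illustrative plane x1 + 2 x2 + 2 x3 = 3 in E^3 (s = 1, m = 3)
A = np.array([[1.0, 2.0, 2.0]])
b = np.array([3.0])
x = np.array([1.0, 1.0, 1.0])   # A x = 5; classical distance is |5 - 3| / ||A|| = 2/3
assert np.isclose(dist2_to_hyperplane(x, A, b), (2.0 / 3.0) ** 2)
```

      <p>For a single equation (s = 1), the formula reduces to the familiar point-to-plane distance, which the assertion confirms.</p>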
      <p>Then, for given values A, x(j), j = 1,…,n, the optimal value of the vector of the right-hand side of the system of equations determining the hyperplane is found from the conditions</p>
      <p>b_opt = A x̂ = arg min_{b ∈ E^s} ρ²({x : x(j), j = 1,…,n}, L(A, b)), x̂ = (1/n) ∑_{j=1}^{n} x(j).</p>
      <p>From here, the distance of the set of points x(j), j = 1,…,n, to the hyperplane with the optimal vector b_opt is calculated by the formula</p>
      <p>ρ²({x : x(j), j = 1,…,n}, L(A, b_opt)) = tr ( A^+ A X̃X̃^T ), where X̃ = (x(1) − x̂, …, x(n) − x̂).</p>
      <p>The optimal matrix A ∈ E^{s×m} is defined as a solution of the problem</p>
      <p>A_opt = arg min_{AA^T = I_s, A ∈ E^{s×m}} ρ²({x : x(j), j = 1,…,n}, L(A, b_opt(A))) = (u_{m−s+1} … u_m)^T.</p>
      <p>Wherein tr ( A_opt^+ A_opt X̃X̃^T ) = ∑_{j=m−s+1}^{m} λ_j², (u_1, …, u_m)^T (u_1, …, u_m) = I_m.</p>
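      <p>The optimal pair (A_opt, b_opt) can be sketched via the eigendecomposition of the centered scatter matrix X̃X̃^T: the rows of A_opt are the eigenvectors with the smallest eigenvalues. In the sketch below, fit_hyperplane is a hypothetical helper name and the noisy-plane data are illustrative:</p>

```python
import numpy as np

def fit_hyperplane(X, s):
    """Fit L(A_opt, b_opt): rows of A_opt are the s eigenvectors of X~ X~^T
    with the smallest eigenvalues; b_opt = A x_hat (x_hat is the sample mean).
    X has shape (m, n): n points from E^m stored as columns."""
    x_hat = X.mean(axis=1)
    Xc = X - x_hat[:, None]              # X~ : centered points
    _, U = np.linalg.eigh(Xc @ Xc.T)     # eigenvalues returned in ascending order
    A = U[:, :s].T                       # s smallest eigenvectors become rows of A_opt
    return A, A @ x_hat

rng = np.random.default_rng(1)
# Points near a 2-D plane in E^3 (small noise in the normal direction)
basis = rng.standard_normal((3, 2))
X = basis @ rng.standard_normal((2, 50)) + 0.01 * rng.standard_normal((3, 50))
A, b = fit_hyperplane(X, s=1)
# Residuals A x(j) - b_opt are small for points near the plane
assert not (np.abs(A @ X - b[:, None]) > 0.2).any()
```

      <p>Since np.linalg.eigh returns eigenvalues in ascending order, taking the first s eigenvectors matches the paper's choice of u_{m−s+1}, …, u_m in the descending convention; the constraint AA^T = I_s holds automatically because the eigenvectors are orthonormal.</p>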
      <p>Using the above results on the pseudo-inversion of matrices and the calculation of distances, a piecewise hyperplane clustering method is proposed. The idea of the method is to perform a sequence of steps, at each of which the parameters of the hyperplanes L(A(k), b(k)), k = 1, 2, …, are found. These hyperplanes are constructed within the framework of fulfilling the requirements of the adopted hyperplane clustering efficiency criterion. As the initial step of piecewise hyperplane clustering, it is assumed that all the training set vectors x(1),…,x(n) from the feature space E^m are optimally approximated by the hyperplane L(A_opt(1), b_opt(1)). At these values A_opt(1), b_opt(1), the efficiency criterion of hyperplane clustering is checked. If the efficiency criterion is satisfied, the construction of the piecewise hyperplane cluster is complete: the cluster consists of one hyperplane, and those characteristic features form the optimal set. If the conditions of the efficiency criterion are not met within this (first) hyperplane, the construction of the second hyperplane of the cluster is carried out. For this purpose, the vectors that cause the non-fulfillment of the efficiency criterion are excluded from the training sample, that is, a subset Ω_1 = {x : x(j_1) ∈ E^m, j_1 = 1,…,n_1} is formed. Then the optimal approximation of the subset Ω_1 by the hyperplane L(A_opt(2), b_opt(2)) is repeated, and at the new optimal values A_opt(2), b_opt(2) the fulfillment of the clustering efficiency criterion is checked. When the criterion is met, the construction of the piecewise hyperplane cluster is complete, and the resulting vectors of characteristic features are added to the optimal set of features. When it is not met, it is necessary to exclude from the training sample those vectors that cause the non-fulfillment of the efficiency criterion and repeat the procedure. Note that this finite recurrent process always ensures the construction of an optimal piecewise hyperplane cluster.</p>
    </sec>
    <sec id="sec-2-1">
      <title>3. Algorithm of synthesis of the piecewise hyperplane cluster</title>
      <p>Using the proposed method of synthesizing piecewise hyperplane clusters, the algorithm of such a synthesis can be represented as the following sequence of actions:</p>
      <p>1. Formation of a single-link cluster (the number of a link in a cluster is an index in brackets):</p>
      <p>1) All vectors of the training sample Ω(0) = {x : x(j) ∈ E^m, j = 1,…,n} from the feature space are optimally approximated by a hyperplane L(A_opt(1), b_opt(1)), defined as the set of solutions (pseudo-solutions) of the systems of algebraic equations (1), (2), where A(k) and b(k) are respectively the matrix and vector parameters of a certain hyperplane (the basic formulas for constructing a hyperplane cluster are implemented).</p>
      <p>2) Form the set Ω(1) = {x : x(j_1) ∈ E^m, j_1 = 1,…,n_1} of those points of Ω(0) for which the following condition is satisfied:</p>
      <p>(b_opt(1) − A_opt(1)x(j_1))^T R(A_opt^T(1)) (b_opt(1) − A_opt(1)x(j_1)) &gt; h_min,</p>
      <p>where h_min is the admissible distance of vectors from the components of the cluster of hyperplanes. The linear dependence or independence of the vectors that can be removed from the set makes it possible to simplify the form of the formulas for the distances of these vectors from the corresponding hyperplanes.</p>
      <p>3) The algorithm stops at the stage of the first cluster link when the distance of each of the vectors of the training sample Ω(0) = {x : x(j) ∈ E^m, j = 1,…,n} to the hyperplane L(A_opt(1), b_opt(1)) does not exceed the admissible distance.</p>
      <p>4) If the set Ω(1) = {x : x(j_1) ∈ E^m, j_1 = 1,…,n_1} includes at least two vectors, then go on to form the second link of the cluster.</p>
      <p>2. Formation of the second cluster link. It consists of the following:</p>
      <p>1) From the set Ω(1) = {x : x(j_1) ∈ E^m, j_1 = 1,…,n_1} obtained in the process of building the first link, a hyperplane L(A_opt(2), b_opt(2)), defined as the set of solutions (pseudo-solutions) of the system of algebraic equations</p>
      <p>A_k(2)x = b_k(2), k = 1, 2, …, (3)</p>
      <p>is constructed.</p>
      <p>2) Calculate the optimal A_k_opt(2), b_k_opt(2) for L(A_k(2), b_k(2)), k = 1, 2, … .</p>
      <p>3) Form the sets Ω_0k(2), k = 1, 2, …, by the distance of each vector x(j) ∈ Ω_0(2) to each of the hyperplanes (3):</p>
      <p>ρ²(x(j), L(A_k_opt(2), b_k_opt(2))) = (b_k_opt(2) − A_k_opt(2)x(j))^T R(A_k_opt^T(2)) (b_k_opt(2) − A_k_opt(2)x(j)), j = 1, 2, …,</p>
      <p>i.e., perform actions similar to paragraph 3 of the construction of the first link. The superscript j denotes the number of the iteration at the second link stage.</p>
      <p>The algorithm stops at the stage of the second cluster link when the distances of each of the vectors of the corresponding partition element to the corresponding hyperplane are no longer improved.</p>
      <p>The efficiency criterion of the performed hyperplane clustering is then checked (for example, a required level of compactness of the cluster links). When the criterion is fulfilled, the cluster is complete; if it is not fulfilled, the formation of the third cluster link is carried out. The process then repeats and, since it is finite, a solution to the clustering problem is always found.</p>
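      <p>The whole synthesis loop can be condensed into a short NumPy sketch. Here the hyperplane fit uses the eigendecomposition of the centered scatter matrix, and h_min, the simple stopping rule, and the two-plane data are illustrative simplifications of the paper's efficiency criterion:</p>

```python
import numpy as np

def fit_hyperplane(X, s):
    """Optimal L(A_opt, b_opt) for the points in X (shape (m, n), points as columns)."""
    x_hat = X.mean(axis=1)
    Xc = X - x_hat[:, None]
    _, U = np.linalg.eigh(Xc @ Xc.T)   # eigenvalues in ascending order
    A = U[:, :s].T                     # rows: eigenvectors with smallest eigenvalues
    return A, A @ x_hat

def piecewise_hyperplane_cluster(X, s, h_min):
    """Add cluster links until every point lies within h_min of its link."""
    links, remaining = [], X
    while remaining.shape[1] > s + 1:
        A, b = fit_hyperplane(remaining, s)
        links.append((A, b))
        # Squared distances; with A A^T = I_s the weight matrix R(A^T) is I_s.
        d2 = np.sum((b[:, None] - A @ remaining) ** 2, axis=0)
        far = d2 > h_min ** 2
        if not far.any() or far.all():
            break
        remaining = remaining[:, far]   # refit the next link on the far points only
    return links

rng = np.random.default_rng(2)
# Two noisy parallel planes in E^3
X1 = rng.standard_normal((3, 40)); X1[2] = 0.01 * rng.standard_normal(40)
X2 = rng.standard_normal((3, 40)); X2[2] = 5 + 0.01 * rng.standard_normal(40)
links = piecewise_hyperplane_cluster(np.hstack([X1, X2]), s=1, h_min=0.5)
assert len(links) >= 2   # one hyperplane cannot serve both planes within h_min
```

      <p>Each pass either terminates or strictly shrinks the working sample, so the recurrence is finite, mirroring the argument above.</p>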
    </sec>
    <sec id="sec-3">
      <title>4. Experimental studies</title>
      <p>To test the effectiveness of the proposed method of scaling information, we used characteristic features for recognizing the dactyl letters of the Ukrainian sign language alphabet [6], [7]. 52 characteristic features were taken, divided into 6 groups depending on the method of obtaining them. The construction of piecewise hyperplane clusters was carried out with the groups of features that characterize the geometric-topological parameters of the human hand when showing the letters of the dactyl alphabet and for which an acceptable recognition quality was obtained [6]. Using the example of the classification of nine dactylemes (А, Б, В, Г, Ж, І, Є, И, Й) according to three and five characteristic features, the separation of these dactylemes on the scaling plane was obtained. Three characteristic features were taken: compactness, directionality, and elongation; in the experiments with five features: the ratio of width to height and the values of four angles between vectors drawn from the center of the hand to its most distant points.</p>
      <p>Experimental results using the three characteristic features are given in Fig. 1, where the image of the dactyleme locations on the scaling plane is normalized to the interval [0, 1]. Without loss of generality, dactyleme А was placed at the origin.</p>
      <p>The results of clustering using five characteristic features are shown in Fig. 2, where dactyleme А was also placed at the origin. The results of the experiments showed that using five characteristic features makes it possible to obtain a clearer separability (the distances of the dactylemes from the scaling plane ranged from 0.1580 to 0.3828), while with three features the distances from the scaling plane were significantly smaller (ranging from 0.0306 to 0.1274), that is, three to five times smaller. The exception was dactyleme Б: the separability from the scaling plane was insignificant in both cases, 0.0177 in the first case and 0.0073 in the second.</p>
    </sec>
    <sec id="sec-4">
      <title>5. Conclusions</title>
      <p>The paper proposes a method and an algorithm for multidimensional scaling of the characteristic features of recognition objects using the means of matrix pseudo-inversion, building piecewise hyperplane clusters that provide a solution of the problem [14]-[18]. The proposed method makes it possible to analyze information about the set of characteristic features and to identify those that are essential for solving the recognition problem, which is important given their significant number and for difficultly separable classes [19]-[25]. The effectiveness of the proposed approach is shown by the example of obtaining clusters for dactylemes of sign language in order to obtain the optimal number of characteristic features for effective recognition.</p>
      <p>Subsequent research will focus on the study of different types of characteristic features and their impact on the quality of recognition.</p>
      <p>[15] I. Krak, O. Barmak, E. Manziuk, Using visual analytics to develop human and machine-centric models: A review of approaches and proposed information technology, Computational Intelligence, 2020, pp. 1-26. https://doi.org/10.1111/coin.12289
[16] A.V. Barmak, Y.V. Krak, E.A. Manziuk, V.S. Kasianiuk, Information technology of separating hyperplanes synthesis for linear classifiers, Journal of Automation and Information Sciences, 51(5), 2019, pp. 54-64. doi: 10.1615/JAutomatinfScien.v51.i5.50
[17] D. Oosterlinck, D.F. Benoit, P. Baecke, From one-class to two-class classification by incorporating expert knowledge: Novelty detection in human behaviour, European Journal of Operational Research, 282(3), 2020, pp. 1011-1024. https://doi.org/10.1016/j.ejor.2019.10.015
[18] D. Abati, A. Porrello, S. Calderara, R. Cucchiara, Latent space autoregression for novelty detection, In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2019, pp. 481-490
[19] D. Gong, L. Liu, V. Le, B. Saha, M.R. Mansour, S. Venkatesh, A.v.d. Hengel, Memorizing normality to detect anomaly: Memory-augmented deep autoencoder for unsupervised anomaly detection, In: Proceedings of the IEEE International Conference on Computer Vision, 2019, pp. 1705-1714
[20] C. You, D.P. Robinson, R. Vidal, Provable self-representation based outlier detection in a union of subspaces, In: 2017 IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2017, Honolulu, HI, USA, July 21-26, 2017, pp. 4323-4332
[21] C.-H. Lai, D. Zou, G. Lerman, Robust subspace recovery layer for unsupervised anomaly detection, In: Proc. Int. Conf. Learn. Represent., 2020, pp. 1-28. arXiv:1904.00152
[22] Z. Cheng, E. Zhu, S. Wang, P. Zhang, W. Li, Unsupervised Outlier Detection via Transformation Invariant Autoencoder, IEEE Access, 9, 2021, pp. 43991-44002. doi: 10.1109/ACCESS.2021.3065838
[23] H. Wang, M.J. Bah, M. Hammad, Progress in outlier detection techniques: A survey, IEEE Access, 7, 2019, pp. 107964-108000. doi: 10.1109/ACCESS.2019.2932769
[24] L. Ruff, J.R. Kauffmann, R.A. Vandermeulen, G. Montavon, W. Samek, M. Kloft, K.R. Müller, A unifying review of deep and shallow anomaly detection, Proceedings of the IEEE, 2021. doi: 10.1109/JPROC.2021.3052449
[25] P. Perera, P. Oza, V.M. Patel, One-Class Classification: A Survey, 2021, arXiv preprint. arXiv:2101.03064</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>M.</given-names>
            <surname>Davison</surname>
          </string-name>
          , Multidimensional scaling,
          <source>Moscow: Finansy I Statistyka</source>
          ,
          <year>1988</year>
          , 254 p.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>P.</given-names>
            <surname>Perera</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Oza</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.M.</given-names>
            <surname>Patel</surname>
          </string-name>
          ,
          <article-title>One-Class Classification: A Survey</article-title>
          ,
          <year>2021</year>
          , arXiv preprint.
          <source>arXiv:2101.03064</source>
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>M.Z.</given-names>
            <surname>Zaheer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.H.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Astrid</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Mahmood</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.I.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <article-title>Cleaning label noise with clusters for minimally supervised anomaly detection</article-title>
          ,
          <year>2021</year>
          . arXiv preprint,
          <source>arXiv:2104.14770</source>
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>W.</given-names>
            <surname>Sultani</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Shah</surname>
          </string-name>
          ,
          <article-title>Real-world anomaly detection in surveillance videos</article-title>
          ,
          <source>In Proceedings of the IEEE conference on computer vision and pattern recognition</source>
          ,
          <year>2018</year>
          , pp.
          <fpage>6479</fpage>
          -
          <lpage>6488</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>I.V.</given-names>
            <surname>Krak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.I.</given-names>
            <surname>Kudin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.I.</given-names>
            <surname>Kulias</surname>
          </string-name>
          ,
          <article-title>Multidimensional Scaling by Means of Pseudoinverse Operations</article-title>
          ,
          <source>Cybernetics and Systems Analysis</source>
          ,
          <volume>55</volume>
          (
          <issue>1</issue>
          ),
          <year>2019</year>
          , pp.
          <fpage>22</fpage>
          -
          <lpage>29</lpage>
          . doi: 10.1007/s10559-019-00108-9
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>L.</given-names>
            <surname>Cheng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <article-title>Outlier Detection Ensemble with Embedded Feature Selection</article-title>
          .
          <source>In Proceedings of the AAAI Conference on Artificial Intelligence</source>
          ,
          <volume>34</volume>
          (
          <issue>04</issue>
          ),
          <year>2020</year>
          , pp.
          <fpage>3503</fpage>
          -
          <lpage>3512</lpage>
          . https://doi.org/10.1609/aaai.v34i04.5755
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>S.A.N.</given-names>
            <surname>Alexandropoulos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.B.</given-names>
            <surname>Kotsiantis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.E.</given-names>
            <surname>Piperigou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.N.</given-names>
            <surname>Vrahatis</surname>
          </string-name>
          ,
          <article-title>A new ensemble method for outlier identification</article-title>
          ,
          <source>In 2020 10th International Conference on Cloud Computing, Data Science &amp; Engineering</source>
          , IEEE,
          <year>2020</year>
          , pp.
          <fpage>769</fpage>
          -
          <lpage>774</lpage>
          . doi: 10.1109/Confluence47617.2020.9058219
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>Iu.G.</given-names>
            <surname>Kryvonos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Iu.V.</given-names>
            <surname>Krak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.V.</given-names>
            <surname>Barmak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.V.</given-names>
            <surname>Shkilniuk</surname>
          </string-name>
          ,
          <article-title>Construction and identification of elements of sign communication</article-title>
          ,
          <source>Cybernetics and Systems Analysis</source>
          ,
          <volume>49</volume>
          (
          <issue>2</issue>
          ),
          <year>2013</year>
          , pp.
          <fpage>163</fpage>
          -
          <lpage>172</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>Yu.V.</given-names>
            <surname>Krak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.A.</given-names>
            <surname>Golik</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.S.</given-names>
            <surname>Kasianiuk</surname>
          </string-name>
          ,
          <article-title>Recognition of dactylemes of Ukrainian sign language based on the geometric characteristics of hand contours defects</article-title>
          ,
          <source>Journal of Automation and Information Sciences</source>
          ,
          <volume>48</volume>
          (
          <issue>4</issue>
          ),
          <year>2016</year>
          , pp.
          <fpage>90</fpage>
          -
          <lpage>98</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>J.T.</given-names>
            <surname>O'Brien</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Nelson</surname>
          </string-name>
          ,
          <article-title>Assessing the Risks Posed by the Convergence of Artificial Intelligence and Biotechnology</article-title>
          ,
          <source>Health Security</source>
          ,
          <volume>18</volume>
          (
          <issue>3</issue>
          ),
          <year>2020</year>
          , pp.
          <fpage>219</fpage>
          -
          <lpage>227</lpage>
          . https://doi.org/10.1089/hs.2019.0122
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>I.V.</given-names>
            <surname>Krak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.V.</given-names>
            <surname>Barmak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.O.</given-names>
            <surname>Romanyshyn</surname>
          </string-name>
          ,
          <article-title>The method of generalized grammar structures for text to gestures computer-aided translation</article-title>
          ,
          <source>Cybernetics and Systems Analysis</source>
          ,
          <volume>50</volume>
          (
          <issue>1</issue>
          ),
          <year>2014</year>
          , pp.
          <fpage>116</fpage>
          -
          <lpage>123</lpage>
          . doi: 10.1007/s10559-014-9598-4
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>R.</given-names>
            <surname>Penrose</surname>
          </string-name>
          ,
          <article-title>A generalized inverse for matrices</article-title>
          ,
          <source>Proceedings of the Cambridge Philosophical Society</source>
          ,
          <volume>51</volume>
          ,
          <year>1955</year>
          , pp.
          <fpage>406</fpage>
          -
          <lpage>413</lpage>
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>A.</given-names>
            <surname>Ben-Israel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.N.E.</given-names>
            <surname>Greville</surname>
          </string-name>
          ,
          <source>Generalized Inverses: Theory and Applications</source>
          , (2nd Ed.), Springer-Verlag, New York,
          <year>2003</year>
          , 420 p.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>G.</given-names>
            <surname>Markowsky</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Savenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Lysenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Nicheporuk</surname>
          </string-name>
          ,
          <article-title>The Technique for Metamorphic Viruses' Detection Based on Its Obfuscation Features Analysis</article-title>
          ,
          <source>In ICTERI Workshops, CEUR Workshop Proceedings</source>
          , Vol.
          <volume>2104</volume>
          ,
          <year>2018</year>
          , pp.
          <fpage>680</fpage>
          -
          <lpage>687</lpage>
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>