<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>MULTILAYERED NEURAL-LIKE NETWORK OF DIRECT PROPAGATION WITH THE ADJUSTMENT ACCORDING TO SIMILARITY MEASURES OF VECTORS OF THE LEARNING SAMPLE</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Krasnov A.E.</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Nadezhdin E.N.</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Nikol'skii D.N.</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Shmakova E.G.</string-name>
        </contrib>
      </contrib-group>
      <fpage>209</fpage>
      <lpage>218</lpage>
      <abstract>
        <p>The architecture of a multilayer network consisting of several layers of active elements is considered. The input layer forms the signals propagated to the connectors (synapses) of the first layer. All odd layers of the network consist of connectors (synapses), and all even ones of switches (neurons). The number of connectors and switches in each layer corresponds to the number of reference signals, the training sample vectors. The recurrent adjustment of the network's synaptic connections and neuronal responses is explained both by similarity measures of the training sample vectors and by similarity measures over these similarity measures. An experimental study of a six-layer network showed that a multilayer neural-like network of direct propagation is much easier to train than a recursive network trained by error back propagation. At the same time, the proposed network is resistant to significant interference when distinguishing signals, which is due to its accounting for additional connections between the components of the reference signals. When analyzing signals against a noise background, provided that the ratio "interference amplitude / signal amplitude" is less than the average spread of the reference signals, this advantage can become decisive, since it makes almost error-free signal discrimination possible. Keywords: multilayered network; direct propagation; vectors of the training sample; similarity measures of vectors; similarity measures over similarity measures; discrimination; signal; noise.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-2">
      <title>Introduction</title>
      <p>The structure of a heterogeneous multiply connected network intended for modeling neurodynamic problems, recognition of signals and images, and data processing in phased antenna arrays was proposed in [1]. The implementation of the architecture of such a network on clusters of universal and/or graphics processors is briefly described, and the developed network model is tested on a number of known high-dimensional neurodynamic problems, in [2]. A wide class of neural-like networks was formally described as a structural model of the cybernetic network, where the functional-structural topology of the cybernetic network was also presented for the first time, taking into account the exchange of information and control data streams [3, 4].</p>
      <p>In [5] the structure of a multilayered network is proposed that is a subset of cybernetic networks functioning on the "winner takes all" principle [6]. In this network, direct propagation of signals is used both during training and during application. The high efficiency of the network in distinguishing noisy signals was shown experimentally. In particular, it was revealed that the discrimination error rate (the number of incorrect decisions per 100 noise realizations) for the reference signals on which the network was trained (with an average variation Var = 13%) depends nonlinearly on the ratio N0 = "amplitude of noise / amplitude of signal". The error is practically zero at N0 &lt; Var and reaches 10% at N0 = 18%.</p>
      <p>A parallel software architecture was described, based on the object-oriented approach and on the well-known GoF design patterns [7, 8]. The "Factory Method" pattern is used to extend the class of network simulators, and the "Bridge" pattern allows building different implementations for CPU and GPU platforms. The architecture is intended for the simulation of multidimensional problems (network neurodynamics, compression of multidimensional data, pattern recognition) and, in particular, of the multilayer networks from [5].</p>
      <p>In this paper, architectural solutions for learning a multi-layer network are discussed in more detail.</p>
    </sec>
    <sec id="sec-3">
      <title>Formulation of the problem</title>
      <p>The aim of this work is to consider the principle of training a multilayered network: adjusting its synaptic connections and neural responses according to similarity measures of the training sample vectors and similarity measures over these similarity measures. This consideration also explains the high efficiency of the discrimination of noisy signals.</p>
    </sec>
    <sec id="sec-4">
      <title>Structure of the multilayered neural-like network of direct propagation</title>
      <p>An example of the structure of a multilayered network of direct propagation is shown in Fig. 1 [5, 8].</p>
      <p>As seen in Figure 1, the network consists of several layers of active elements. The input layer forms the input signals propagated to the connectors (synapses) of the first layer. All odd layers consist of connectors (synapses), and all even ones of switches (neurons). The number of synapses and neurons in each layer corresponds to the number of reference signals, the vectors of the training sample. Figure 1 shows an example of an already trained network, with the memory register of each connector of the first layer containing the corresponding reference vector of the training sample. The first reference signal, without interference, is fed to the input layer. Therefore, every even layer of the network unmistakably identifies the input signal, with a similarity measure of 1, as Signal 1.</p>
    </sec>
    <sec id="sec-5">
      <title>Principle of learning of the neural-like network</title>
      <p>The principle of learning of the neural-like network is that the memory register of each synapse of an odd layer, starting with the third one, records the responses of all neuron-switches of the previous layers to the reference signals sequentially fed to the network input during the learning process.</p>
      <p>The response of each neuron of any layer to the input signal – the vector Z at the input of the preceding synaptic connector – is formed as an odd power of the cosine of the angle between the input vector Z and the vector X stored in the connector register: μ(Z, X) = cos^(2n+1)(Z, X). In the example above, n = 20, which provides a strong nonlinearity – the resonance response of the neuron-switch to the input signal.</p>
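      <p>A minimal sketch of this response measure (NumPy; the function name neuron_response is ours, not the paper's):</p>

```python
import numpy as np

def neuron_response(z, x, n=20):
    # Odd power of the cosine of the angle between the input vector z and
    # the reference vector x stored in the preceding connector register.
    cos_angle = np.dot(z, x) / (np.linalg.norm(z) * np.linalg.norm(x))
    return cos_angle ** (2 * n + 1)
```

      <p>With n = 20 the exponent is 41, so even a modest angular mismatch drives the response toward zero; this is the "resonance" behaviour of the neuron-switch described above.</p>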
      <p>Let us consider the iterative process of training the network in more detail. To simplify the drawings, the controllers that configure and operate the connectors are not shown.</p>
    </sec>
    <sec id="sec-6">
      <title>Recurrent training of the neural-like network</title>
      <p>At the first training step, each reference signal of the training sample is written to the memory register of the corresponding synapse-connector of the first network layer, as shown in Figure 2.</p>
      <p>At the second step of network training (Figure 3), all reference signals of the training sample are sequentially fed to the input layer of the network, and for each reference signal (in Figure 3, signal 1) the responses of all switch-neurons of the second layer are written to the connector registers of the third layer. As a result, the similarity measures μ(Sk, Sl) = cos^(2n+1)(Sk, Sl) of all vectors of the reference training sample are written to the registers of the connectors of the third layer (Figure 4).</p>
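      <p>Stacking the reference vectors row-wise, the whole table of first-order measures written to the third-layer registers can be sketched as one matrix (NumPy; names are ours):</p>

```python
import numpy as np

def similarity_matrix(refs, n=20):
    # mu(Sk, Sl) = cos^(2n+1)(Sk, Sl) for every pair of reference vectors;
    # row k is what the k-th connector of the third layer stores.
    refs = np.asarray(refs, dtype=float)
    unit = refs / np.linalg.norm(refs, axis=1, keepdims=True)  # unit-norm rows
    return (unit @ unit.T) ** (2 * n + 1)
```

      <p>The diagonal of this matrix is 1 (each reference is maximally similar to itself), and the off-diagonal entries are sharply suppressed by the odd power.</p>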
      <p>At the third step of network training (Figure 5), all reference signals of the training sample are sequentially fed to the input layer of the network, and for each reference signal (in Figure 5, signal 3) the responses of all switch-neurons of the fourth layer are written to the connector registers of the fifth layer. As a result, the similarity measures over similarity measures μ(μ(Sk, Sl), μ(Sk, Sm)) = cos^(2n+1)(μ(Sk, Sl), μ(Sk, Sm)) of all vectors of the reference training sample are recorded in the connectors of the fifth layer (Figure 5).</p>
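      <p>Under the same assumptions, the fifth-layer registers can be sketched as the same measure applied to the rows of the first-order matrix (our reading of "similarity measures over similarity measures"; names are ours):</p>

```python
import numpy as np

def second_order_matrix(refs, n=20):
    # First-order matrix m1: pairwise odd-power cosine measures of the
    # reference vectors. Second-order matrix: the same measure applied to
    # the rows of m1, i.e. to the second-layer response vectors themselves.
    refs = np.asarray(refs, dtype=float)
    unit = refs / np.linalg.norm(refs, axis=1, keepdims=True)
    m1 = (unit @ unit.T) ** (2 * n + 1)
    u1 = m1 / np.linalg.norm(m1, axis=1, keepdims=True)  # unit-norm m1 rows
    return (u1 @ u1.T) ** (2 * n + 1)
```
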
    </sec>
    <sec id="sec-7">
      <title>Experiments</title>
      <p>As a result of the first experiment, Figure 6 presents an illustrative example of the network distinguishing the third reference signal against a noise background at N0 = "amplitude of noise / amplitude of signal" = 0.25, which corresponds to the ratio Noise/Signal = 6.25%. In this case the second layer of the network gives an incorrect recognition of the input noisy signal.</p>
      <p>However, in the subsequent odd layers, where similarity measures are compared with the reference similarity
measures, responses are formed that correctly identify the input noisy signal.</p>
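      <p>The identification path just described can be sketched end to end (a hypothetical classify helper over a trained network as in Figures 2-5, with a winner-takes-all decision read off at the 2nd, 4th, and 6th layers):</p>

```python
import numpy as np

def classify(x, refs, n=20):
    # 2nd layer: compare x with the stored reference vectors.
    # 4th layer: compare the resulting response vector with the stored
    #            first-order measures. 6th layer: same with second-order
    #            measures. Each decision is the index of the max response.
    def mu(a, b):
        return (np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))) ** (2 * n + 1)

    refs = np.asarray(refs, dtype=float)
    k_total = len(refs)
    m1 = np.array([[mu(refs[k], refs[l]) for l in range(k_total)] for k in range(k_total)])
    m2 = np.array([[mu(m1[k], m1[l]) for l in range(k_total)] for k in range(k_total)])

    r2 = np.array([mu(x, refs[k]) for k in range(k_total)])  # 2nd-layer responses
    r4 = np.array([mu(r2, m1[k]) for k in range(k_total)])   # 4th-layer responses
    r6 = np.array([mu(r4, m2[k]) for k in range(k_total)])   # 6th-layer responses
    return int(np.argmax(r2)), int(np.argmax(r4)), int(np.argmax(r6))
```

      <p>On a clean reference signal all three layers agree; on a noisy input the 4th- and 6th-layer decisions can correct an erroneous 2nd-layer decision, which is the effect described above.</p>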
      <p>For the experiment, as a test example, the 6 components of 6 reference signals Sk = (sk1, ..., sk6)^T (k = 1, 2, ..., 6) given in Table 1 were chosen.</p>
      <p>As is apparent from Table 1, the average variation (Var) of the signals is 13%. Here:</p>
      <p>Var = (1/6) Σ_{k=1..6} Var_k, Var_k = [Σ_{m=1..6} (s_km − ⟨s_k⟩)²]^{1/2} / [Σ_{m=1..6} s_km²]^{1/2}, ⟨s_k⟩ = (1/6) Σ_{m=1..6} s_km.</p>
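      <p>Our reading of this variation statistic (an interpretation of the garbled formula, not confirmed by the text): for each signal, the rms deviation of its components from their mean, divided by the signal's rms norm, averaged over all signals:</p>

```python
import numpy as np

def average_variation(refs):
    # Per-signal variation: rms deviation from the component mean divided
    # by the rms norm of the signal; Var is the mean over all signals.
    refs = np.asarray(refs, dtype=float)
    means = refs.mean(axis=1, keepdims=True)
    per_signal = (np.sqrt(((refs - means) ** 2).sum(axis=1))
                  / np.sqrt((refs ** 2).sum(axis=1)))
    return per_signal.mean()
```
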
      <p>The task consists in studying the dependence of the error of discrimination of the reference signals from Table 1 on the amplitude of an additive noise N. Here random() is a value uniformly distributed in (0, 1), so that the factor (1 − 2·random()) is uniformly distributed in (−1, 1), and N0 is the maximum ratio of the noise amplitude to the amplitude of the useful signal.</p>
      <p>For clarity, the reference signals from Table 1 are shown in Figure 7, and one of the signals (S3), which has the maximum amplitude variation of 18%, is shown in Figure 8 in the presence of noise with N0 = 13%. In this case the observed signal X is expressed as:</p>
      <p>X = S + N, (1)</p>
      <p>In a simulation model experiment the noise was generated as a vector N = (n1, n2, ..., nm)^T of random uniformly distributed values:</p>
      <p>nm = N0 · sm · (1 − 2 · random()). (2)</p>
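      <p>This noise model can be sketched as follows, assuming random() is uniform on (0, 1) so that the factor (1 − 2·random()) spans (−1, 1); the function name is ours:</p>

```python
import numpy as np

def noisy_signal(s, n0, rng):
    # Observed signal X = S + N with component-wise noise
    # n_m = N0 * s_m * (1 - 2*random()); each |n_m| is bounded by
    # N0 * s_m, so N0 is the maximum noise/signal amplitude ratio.
    s = np.asarray(s, dtype=float)
    noise = n0 * s * (1.0 - 2.0 * rng.random(s.shape))
    return s + noise
```
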
      <p>Projections of all reference signals onto the plane (X, Y), carried out using the technology of [9], are shown in Figure 9 for one realization of the noise with N0 = 13%. Figure 9 makes it especially clear that in the presence of noise it is possible to wrongly identify the observed signal Z1 with the reference signal S3, and Z2 with S4.</p>
      <p>The error statistics in the charts (Figure 10) show that, in comparison with the 2nd layer, the 6th layer of the network gives a gain in recognition reliability of 1% to 4%.</p>
    </sec>
    <sec id="sec-8">
      <title>Conclusion</title>
      <p>The research carried out showed that the multilayered neural-like network of direct propagation gives a certain advantage in distinguishing noisy reference signals. This advantage is due to the accounting for additional connections between the components of the reference signals. In the analysis of signals against a noise background with N0 = "amplitude of noise / amplitude of signal" less than the average variation of the reference signals, this advantage can become decisive, as it allows almost error-free discrimination of the signals.</p>
      <p>An interesting result is obtained when an arbitrary signal, considerably different from all reference signals, is fed to the input of the trained network.</p>
      <p>Thus, Figure 11 shows an example of the network's responses to an input signal X^T = (11.07; 17.92; 3.14; 23.51; 12.78; 14.90). The network indicates the similarity of the input signal with the 3rd reference signal S3 = (16.00; 14.00; 3.00; 21.00; 6.00; 16.00), although the degree of this similarity is extremely small (0.25). At the same time, the 4th and 6th layers assign the input signal to the 3rd reference signal.</p>
      <p>Figure 12 shows an example of the network's responses to an input signal X^T = (13.47; 14.91; 0.02; 29.01; 12.05; 5.51).</p>
      <p>In this example the 2nd layer of the network indicates the similarity of the input signal with the 4th reference signal S4 = (20.00; 9.00; 7.00; 22.00; 8.00; 11.00), while the 4th and 6th layers assign the input signal to the 3rd reference signal.</p>
      <p>Thus, the considered multilayered network of direct propagation of signals, with adjustment according to similarity measures of the vectors of the training sample, is very simple to train and has increased reliability of recognition of reference signals in the presence of noise in comparison with a single layer.</p>
    </sec>
    <sec id="sec-9">
      <title>Acknowledgement</title>
      <p>The work was supported by the state, represented by the Ministry of Education and Science of Russia, under lot code 2017-14-579-0002 on the topic: "Development of effective algorithms for detecting network attacks based on identifying deviations in traffic of extremely large volumes arriving at the border routers of the data network, and creating a prototype software complex for detection and prevention of information security threats aimed at denial of service". Agreement No. 14.578.21.0261 on granting a subsidy, dated September 26, 2017; unique identifier of the work (project) RFMEFI57817X0219.</p>
      <p>About the authors:</p>
      <p>Krasnov Andrey Evgenievich, Doctor of Physical and Mathematical Sciences, Professor, Chief Researcher, State Research Institute of Information Technologies and Telecommunications, a.krasnov@informika.ru</p>
      <p>Nadezhdin Evgeniy Nikolaevich, Doctor of Technical Sciences, Professor, Chief Researcher, State Research Institute of Information Technologies and Telecommunications, e.nadezhdin@informika.ru</p>
      <p>Nikol'skii Dmitriy Nikolaevich, Candidate of Physical and Mathematical Sciences, Associate Professor, Leading Researcher, State Research Institute of Information Technologies and Telecommunications, d.nikolsky@informika.ru</p>
      <p>Shmakova Elena Germanovna, Candidate of Technical Sciences, Associate Professor, Head of the Department of Information Systems, Networks and Security, Russian State Social University, rusja_lena@mail.ru</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>Kalachev A.A., Krasnov A.E., Nadezhdin E.N., Nikolskiy D.N., Repin D.S. Geterogennaya mnogosvyaznaya set aktivnyih elementov // Innovatsionnyie, informatsionnyie i kommunikatsionnyie tehnologii: sbornik trudov XIII Mezhdunarodnoy nauchno-prakticheskoy konferentsii / Pod red. S.U. Uvaysova. Moskva: Assotsiatsiya vyipusknikov i sotrudnikov voenno-vozdushnoy inzhenernoy akademii im. prof. Zhukovskogo. 2016, #1, S. 277-280. URL: https://elibrary.ru/item.asp?id=27332412; Kalachev A.A., Krasnov A.E., Nadezhdin E.N., Nikolskiy D.N., Repin D.S. Model geterogennoy seti dlya simulyatsii neyrodinamicheskih zadach // Sovremennyie informatsionnyie tehnologii i IT-obrazovanie. Moskva: Fond sodeystviya razvitiyu internet-media, IT-obrazovaniya, chelovecheskogo potentsiala "Liga internet-media" (Moskva). 2016, Tom 12, #1, S. 80-90. URL: https://elibrary.ru/item.asp?id=27539221</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>Krasnov A.E., Nadezhdin E.N., Nikolskiy D.N., Repin D.S., Kalachev A.A. Kiberneticheskaya set kak strukturnaya model neyropodobnyih sistem // Informatizatsiya obrazovaniya i nauki. 2017, #3 (35), S. 109-122. URL: https://elibrary.ru/item.asp?id=29426094</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>Krasnov A.E., Nadezhdin E.N., Nikolskii D.N., Repin D.S., Kalachev A.A. Nejropodobnaya kiberneticheskaya set' // Informacionnye innovacionnye tekhnologii. Izdatelstvo: Assotsiatsiya vyipusknikov i sotrudnikov VVIA imeni professora N.E. Zhukovskogo sodeystviya sohraneniyu istoricheskogo i nauchnogo naslediya VVIA imeni professora N.E. Zhukovskogo (Moskva). 2017, #1, S. 278-281. URL: https://elibrary.ru/item.asp?id=29386197</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <string-name>
            <surname>Kazakov</surname>
            <given-names>K.V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kalachev</surname>
            <given-names>A.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Krasnov</surname>
            <given-names>A.E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nikolskiy</surname>
            <given-names>D.N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shevelev</surname>
            <given-names>S.A.</given-names>
          </string-name>
          <article-title>Sravnenie effektivnostey razlicheniya signalov na fo ne silnyih pomeh na osnove mnogokriterialnoy i neyrosetevoy tehnologiy. Innovatsionnyie, informatsionnyie i kommunikatsionnyie tehnologii: sbornik trudov XIII Mezhdunarodnoy nauchno-prakticheskoy konferentsii / pod red</article-title>
          . S.U.
          <string-name>
            <surname>Uvaysova</surname>
          </string-name>
          .
          <article-title>- Moskva: Assotsiatsiya vyipusknikov i sotrudnikov voenno-vozdushnoy inzhenernoy akademii im</article-title>
          .
          <source>prof. Zhukovskogo</source>
          .
          <year>2016</year>
          ,. #1,
          <string-name>
            <surname>S.</surname>
          </string-name>
          257-
          <fpage>259</fpage>
          . - URL: https://elibrary.ru/item.asp?id=
          <fpage>27332404</fpage>
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>Petrunin Yu.Yu., Ryazanov M.A., Savelev A.V. Ot iskusstvennogo intellekta k modelirovaniyu mozga. Moskva: MGU im. M.V. Lomonosova, MAKS Press. 2014. 84 s. URL: https://istina.msu.ru/publications/book/7869957/</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>Krasnov A.E., Nikol'skii D.N., Kalachev A.A. Arhitektura parallel'nogo programmnogo obespecheniya nejropodobnoj kiberneticheskoj seti // Informacionnye innovacionnye tekhnologii. Izdatelstvo: Assotsiatsiya vyipusknikov i sotrudnikov VVIA imeni professora N.E. Zhukovskogo sodeystviya sohraneniyu istoricheskogo i nauchnogo naslediya VVIA imeni professora N.E. Zhukovskogo (Moskva). 2017, #1, S. 275-287. URL: https://elibrary.ru/item.asp?id=29386196; Krasnov A.E., Nadezhdin E.N., Nikol'skii D.N. Arhitektura parallel'nogo programmnogo obespecheniya simulyatsii mnogomernyih zadach // Sovremennyie informatsionnyie tehnologii i IT-obrazovanie. 2017, Tom 13, #1, S. 49-57. URL: http://sitito.cs.msu.ru/index.php/SITITO/article/view/202/172</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          10.
          <string-name>
            <surname>Krasnov</surname>
            <given-names>A.E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nikol'</surname>
            skiy
            <given-names>D.N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kalachev</surname>
            <given-names>A.A.</given-names>
          </string-name>
          <article-title>Snizhenie razmernosti spektral'nykh dannykh neyropodobnym algoritmom // Svidetel'stvo o gosudarstvennoy registratsii programmy dlya EVM, Rossiyskaya federatsiya</article-title>
          , №
          <fpage>201</fpage>
          -
          <lpage>761U2R1L9</lpage>
          :
          <fpage>5</fpage>
          ,
          <year>2017</year>
          . http://www1.fips.ru/wps/portal/ofic_pub_ru/#page=document&amp;type=doc&amp;tab=PrEVM&amp;id=
          <fpage>1852C2F0</fpage>
          -10AD
          <string-name>
            <surname>-</surname>
          </string-name>
          461C-
          <fpage>9711</fpage>
          - 44C9C9CBDC33
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>