<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Designing of Neural Networks for Financial Market Forecasting</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Antoshchuk Svitlana</string-name>
          <email>asgonpu@gmail.com</email>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Teslenko Pavlo</string-name>
          <email>p_a_t@ukr.net</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sytnyk Volodymyr</string-name>
          <email>vladas@ua.fm</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sherstiuk Olha</string-name>
          <email>olusha972@gmail.com</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          Ph.D., Associate Professor,
          <institution>Odesa National Polytechnic University</institution>
          ,
          <addr-line>Odesa, Ukraine</addr-line>
        </aff>
        <aff id="aff1">
          <label>1</label>
          Ph.D., Associate Professor,
          <addr-line>Odesa, Ukraine</addr-line>
        </aff>
        <aff id="aff2">
          <label>2</label>
          Ph.D., Senior Lecturer,
          <institution>Odesa National Maritime University</institution>
          ,
          <addr-line>Odesa, Ukraine</addr-line>
        </aff>
        <aff id="aff3">
          <label>3</label>
          Sc.D., Professor,
          <institution>Odesa National Polytechnic University</institution>
          ,
          <addr-line>Odesa, Ukraine</addr-line>
        </aff>
      </contrib-group>
      <abstract>
        <p>The paper discusses methods for forecasting financial markets and presents their advantages and disadvantages. A genetic approach to forming the structure and training of a neural network is proposed, and a method of forming a neural network based on a genetic algorithm is given. The effectiveness of the proposed methodology is demonstrated on the task of forecasting the stock market.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>-</title>
      <p>In this situation, neural networks (NNs) can serve as an adequate apparatus for solving problems of diagnostics
and forecasting; radial basis function structures, characterized by high learning speed and universal
approximating capabilities, should be noted as particularly promising.</p>
      <p>The aim of the work is to develop a methodology for the formation of neural networks for financial market
analysis.</p>
      <p>For forecasting systems based on NNs, the best quality is shown by a heterogeneous network consisting of hidden
layers with a nonlinear activation function of the neural elements and a linear output neuron. The disadvantage of most
nonlinear activation functions is that their output values are limited to the [0, 1] or [−1, 1] segment. This leads to the
need to scale the data if they do not fall within these ranges of values.</p>
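      <p>The scaling step described above can be sketched as a simple min-max mapping (a hypothetical helper, not part of the paper's method):</p>

```python
def minmax_scale(xs, lo=-1.0, hi=1.0):
    """Linearly map a data series into [lo, hi] so it fits the bounded
    output range of a sigmoid or tanh activation function."""
    x_min, x_max = min(xs), max(xs)
    span = x_max - x_min
    return [lo + (hi - lo) * (x - x_min) / span for x in xs]

# Example: three stock prices mapped into [-1, 1]
# minmax_scale([3200, 3400, 3600]) -> [-1.0, 0.0, 1.0]
```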
      <p>
        Various learning algorithms and their modifications are used for network training [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. The
error backpropagation algorithm is of the greatest interest, since it is an effective tool for training multilayer
feed-forward neural networks.
      </p>
      <p>Training by the error backpropagation method reduces to selecting the weight values of the
feed-forward neural network based on the principle of steepest descent. One of the main drawbacks of this
classic algorithm is that it can become trapped in a local minimum of the cost function.</p>
      <p>
        The analysis of multilayer neural networks and their learning algorithms revealed a number of shortcomings
and problems [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]:
1. Uncertainty in the choice of the number of layers and the number of neural elements per layer;
2. Slow convergence of the gradient method with a constant learning step;
3. The difficulty of choosing an appropriate learning rate, since a small learning rate drives the NN into
a local minimum, while a high learning rate can skip over the global minimum and make the learning
process divergent;
      </p>
      <p>4. The impossibility of distinguishing local and global minimum points, since the gradient method does not
discriminate between them;</p>
      <p>5. The effect of random initialization of the NN weights on the search for the minimum of the
root-mean-square error function.</p>
      <p>
        Genetic algorithms can serve as an alternative to the error backpropagation method. Genetic algorithms
solve optimization problems by the method of evolution, i.e. by selecting the fittest candidates from a
variety of solutions. They differ from traditional optimization methods in the following properties [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ]:
1. They process not the parameter values of the problem but their coded form.
2. They search for solutions starting from a population, not a single point.
3. They use only the objective function, not its derivative.
4. They are stochastic.
      </p>
      <p>The purpose of the training is to minimize the cost function E(n) = e_k²(n)/2, where e_k(n) = d_k(n) − y_k(n) is the
error, d_k(n) is the desired output of the neural network, y_k(n) is the actual output of the neural network, and n is the
iteration number.</p>
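      <p>A minimal sketch of this cost function (the function name is an assumption for illustration):</p>

```python
def cost(d_k, y_k):
    """Instantaneous cost E(n) = e_k(n)**2 / 2 for one output neuron,
    where e_k(n) = d_k(n) - y_k(n) is the error between the desired
    and the actual network output."""
    e_k = d_k - y_k
    return e_k ** 2 / 2.0

# Example: desired output 1.0, actual output 0.6 -> error 0.4, cost 0.08
```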
      <p>The parameters of the problem are the weights that determine the point of the search space and, therefore,
represent a possible solution.</p>
      <p>If the weights take real values from the interval [−1, 1], then each chromosome will be a combination of 9 binary
sequences (genotypes) encoding the individual weights. The corresponding phenotypes are represented by the sets of
corresponding real numbers from the interval [−1, 1]. The length of the chromosomes depends on the problem situation.</p>
      <p>If a solution is required with an accuracy of q = 2 significant decimal digits for each weight, then the
interval [a, b] should be divided into (b − a)·10^q identical subintervals. This means applying discretization with
a step of r = 10^(−q). The smallest positive integer m satisfying the inequality (b − a)·10^q ≤ 2^m − 1 determines the
necessary and sufficient length of the binary sequence required to encode a number from the interval [a, b] with
step r. As a result, the length of the binary coding sequence is 8 bits.</p>
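      <p>The bit-length calculation can be checked with a short sketch (the helper name is an assumption):</p>

```python
import math

def code_length(a, b, q):
    """Smallest m with (b - a) * 10**q <= 2**m - 1, i.e. the number of
    bits needed to encode [a, b] with discretization step r = 10**-q."""
    intervals = (b - a) * 10 ** q
    return math.ceil(math.log2(intervals + 1))

# For weights in [-1, 1] with q = 2: 200 subintervals -> 8 bits,
# matching the 8-bit sequences stated in the text.
```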
      <p>When coding real numbers, the value of a gene is an integer that identifies the number of the
subinterval (the Gray code is used). The number at the middle of this subinterval is taken as the phenotype value.</p>
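      <p>The Gray-code decoding and midpoint rule might be sketched as follows (hypothetical helpers, assuming a = −1 and q = 2 as in the text):</p>

```python
def gray_to_int(g):
    """Decode a Gray-coded non-negative integer to its ordinary value
    by repeatedly XOR-ing in right-shifted copies."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def phenotype(gene, a=-1.0, q=2):
    """Take the midpoint of the subinterval numbered by the Gray-coded
    gene, with step r = 10**-q starting from the left endpoint a."""
    r = 10 ** -q
    return a + (gray_to_int(gene) + 0.5) * r

# Gene 0b00000000 -> subinterval 0 of [-1, 1] -> midpoint -0.995
```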
      <p>The initial chromosome population is assigned at random. When calculating the neural network output values,
the logistic activation function with the learning-rate parameter η = 1 can be used.</p>
      <p>The stage of selecting parental chromosomes for creating a new population plays the greatest role in the
successful functioning of the algorithm. The most effective is the tournament method. The essence of the method is
as follows: all individuals of the population are divided into subgroups of 2–3 individuals each, and the parent
is chosen from each subgroup at random, with the fittest winning with a probability less than 1.</p>
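      <p>A possible sketch of the tournament method (the winning probability p_win = 0.9 is an assumed value; the text only states that it is less than 1):</p>

```python
import random

def tournament_select(population, fitness, group_size=2, p_win=0.9):
    """Sample a small subgroup; with probability p_win return its
    fittest member, otherwise a random member of the subgroup."""
    group = random.sample(population, group_size)
    if random.random() < p_win:
        return min(group, key=fitness)   # lower cost E is fitter
    return random.choice(group)
```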
      <p>In the classical genetic algorithm, two main genetic operators are used: the crossover operator and the mutation
operator. The crossover probability is set rather high (usually 0.5 ≤ pc ≤ 1), while the mutation probability is set
very small (most often 0 ≤ pm ≤ 0.1). This means that crossover in the classical algorithm is performed almost
always, while mutation is quite rare.</p>
      <p>The crossover operator acts as follows:</p>
      <p>– two individuals are selected from the population with probability pc and included in
the temporary parent population;
– the crossing point lk is determined (also at random);
– the head of the first parent is concatenated with the tail of the second parent (and vice versa for the second offspring).</p>
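      <p>The crossover steps above can be sketched as follows (a minimal illustration, assuming bit-string chromosomes):</p>

```python
import random

def crossover(parent1, parent2):
    """Single-point crossover: choose a random crossing point lk and
    swap the tails of the two bit-string parents."""
    lk = random.randint(1, len(parent1) - 1)
    return parent1[:lk] + parent2[lk:], parent2[:lk] + parent1[lk:]
```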
      <p>The mutation operator changes the value of a gene in the chromosome to the opposite with probability pm. The
mutation can be realized by drawing a random number from the interval [0, 1] for each gene and
applying the operation to those genes for which the drawn number is less than or equal to pm.</p>
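      <p>A sketch of this mutation operator, assuming bit-string chromosomes:</p>

```python
import random

def mutate(chromosome, pm=0.05):
    """Flip each gene whose drawn random number from [0, 1] is <= pm."""
    return "".join(
        ("1" if g == "0" else "0") if random.random() <= pm else g
        for g in chromosome
    )
```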
      <p>The chromosomes obtained by applying the genetic operators to the chromosomes of the temporary parent
population are included in the new population, which becomes the current population for the next iteration of the
genetic algorithm. At each iteration the fitness function value is calculated for all chromosomes of the population, after
which the stopping condition of the algorithm is checked. As such a condition, either a limit on the maximum number
of epochs is applied, or convergence is detected by comparing the population's fitness values over several
epochs: when this indicator stabilizes, the algorithm stops.</p>
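      <p>Putting the operators together, the iteration loop with both stopping conditions might look like this (a self-contained sketch with assumed parameter values, not the authors' implementation):</p>

```python
import random

def evolve(cost, n_bits=8, pop_size=20, pc=0.8, pm=0.02,
           max_epochs=100, patience=10):
    """Minimal GA loop with both stopping conditions from the text:
    an epoch limit, and convergence when the best cost stabilizes."""
    pop = ["".join(random.choice("01") for _ in range(n_bits))
           for _ in range(pop_size)]
    best_cost, stable = None, 0
    for _ in range(max_epochs):
        pop.sort(key=cost)
        if cost(pop[0]) == best_cost:
            stable += 1
            if stable >= patience:       # fitness stabilized -> stop
                break
        else:
            best_cost, stable = cost(pop[0]), 0
        nxt = pop[:2]                    # elitism: keep the two fittest
        while len(nxt) < pop_size:
            p1, p2 = random.sample(pop[:pop_size // 2], 2)
            if random.random() < pc:     # single-point crossover
                lk = random.randint(1, n_bits - 1)
                p1, p2 = p1[:lk] + p2[lk:], p2[:lk] + p1[lk:]
            for child in (p1, p2):       # bit-flip mutation
                nxt.append("".join(
                    ("1" if g == "0" else "0")
                    if random.random() <= pm else g for g in child))
        pop = nxt[:pop_size]
    return pop[0]
```

Here the cost plays the role of the fitness function: minimizing the number of 1-bits, for instance, should drive the population toward the all-zero chromosome.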
      <p>The genetic algorithm was used in the study of neural network structures to forecast the stock price of the public
joint-stock company (JSC) “Lukoil”.</p>
      <p>The studies were conducted on the 2017 time series. Figs. 1 and 2 show the results of the search for optimal
neural networks. The studies covered MLP-type networks (three- and four-layer). Since the search for the type
and structure of a neural network is a rather time-consuming procedure, the intermediate task of
determining an initial “prototype” was solved first, after which the structure of the NN was further refined.</p>
      <p>The results of the calculations showed that for the JSC “Lukoil” stock price the optimal structures are neural
networks with the following indicators: type MLP, three-layer structure with 3 input neurons, 8 hidden-layer
neurons and 1 output neuron (Fig. 2); or type MLP, three-layer structure with 1 input neuron, 11 hidden-layer neurons and 1
output neuron (Fig. 1).</p>
      <p>Conclusions. The neural network structures obtained were used to forecast the value of LUKOIL stocks
(Fig. 3, 4).</p>
      <p>The forecasting accuracy for LUKOIL stocks was 73%, which is a high result and exceeds the results
obtained by other forecasting methods.</p>
      <p>[Figures 1 and 2 appeared here: LUKOIL stock price charts; vertical axes span approximately 3150–3650 and 3440–3600, horizontal axis 0–12.]</p>
      <p>
        In the future, the forecast results can be used in the management of project-oriented organizations [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ], both in
the management of individual portfolio projects and in the strategic management of the organization.
      </p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Anatoliev</surname>
            <given-names>A.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Teslenko</surname>
            <given-names>P.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chimshir</surname>
            <given-names>V.I.</given-names>
          </string-name>
          (
          <year>2015</year>
          ).
          <article-title>Project-oriented orientation of the management processes of investment companies in the foreign exchange market</article-title>
          .
          <source>Bulletin of the National Technical University "KhPI"</source>
          , No.
          <volume>1</volume>
          (
          <issue>1110</issue>
          ), pp.
          <fpage>80</fpage>
          -
          <lpage>84</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <given-names>Kunwar Singh</given-names>
            <surname>Vaisla</surname>
          </string-name>
          , Ashutosh Kumar Bhatt. (
          <year>2010</year>
          ).
          <article-title>An Analysis of the Performance of Artificial Neural Network Technique for Stock Market Forecasting</article-title>
          .
          <source>International Journal on Computer Science and Engineering</source>
          , Vol.
          <volume>02</volume>
          , No. 06, pp.
          <fpage>2104</fpage>
          -
          <lpage>2109</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Niaki</surname>
            ,
            <given-names>S.T.A.</given-names>
          </string-name>
          &amp;
          <string-name>
            <surname>Hoseinzade</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          (
          <year>2013</year>
          ).
          <article-title>Forecasting S&amp;P 500 index using artificial neural networks and design of experiments</article-title>
          .
          <source>Journal of Industrial Engineering International</source>
          , 9: 1. https://doi.org/10.1186/2251-712X-9-1
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Tikhonov</surname>
            ,
            <given-names>E.E.</given-names>
          </string-name>
          (
          <year>2006</year>
          ).
          <article-title>Forecasting in market conditions</article-title>
          .
          <source>Nevinnomyssk.</source>
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Rutkovskaya</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pilinsky</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          &amp;
          <string-name>
            <surname>Rutkovsky</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          (
          <year>2004</year>
          ).
          <article-title>Neural networks, genetic algorithms and fuzzy systems</article-title>
          . Moscow.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Sytnyk</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Georgalina</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          (
          <year>2018</year>
          ).
          <article-title>Theoretical and practical aspects of the development of modern science: the experience of countries of Europe and prospects for Ukraine</article-title>
          . “Baltija Publishing”, 524 p. DOI: dx.doi.org/10.30525/978-9934-571-30-5.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Tymchenko</surname>
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Antoshchuk</surname>
            <given-names>S.</given-names>
          </string-name>
          (
          <year>2019</year>
          )
          <article-title>Race from Pixels: Evolving Neural Network Controller for Vision-Based Car Driving</article-title>
          .
          <source>ICDSIAI 2018. Advances in Intelligent Systems and Computing</source>
          , vol
          <volume>836</volume>
          . Springer, Cham
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Fernandez-Rodriguez</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Sosvilla-Rivero</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          &amp;
          <string-name>
            <surname>Andrada-Felix</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          (
          <year>2002</year>
          ).
          <article-title>Nearest-Neighbour Predictions in Foreign Exchange Markets</article-title>
          . Fundacion de Estudios de Economia Aplicada,
          <volume>5</volume>
          , 36 p.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Singh</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          (
          <year>2000</year>
          ).
          <article-title>Pattern Modelling in Time-Series Forecasting</article-title>
          .
          <source>Cybernetics and Systems: An International Journal</source>
          ,
          <volume>31</volume>
          (1), pp.
          <fpage>49</fpage>
          -
          <lpage>65</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Teslenko</surname>
            <given-names>P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Polshakov</surname>
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bedrii</surname>
            <given-names>D.</given-names>
          </string-name>
          (
          <year>2016</year>
          )
          <article-title>Strategic management of evolving project-oriented organization</article-title>
          .
          <source>Science and Education a New Dimension</source>
          , Economics,
          <source>IV (2)</source>
          , Issue:
          <issue>94</issue>
          , Budapest, pp.
          <fpage>33</fpage>
          -
          <lpage>35</lpage>
          . Available at: http://www.seanewdim.com/uploads/3/4/5/1/34511564/econ_iv_2__94.pdf
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>