Proceedings of the VIII International Conference "Distributed Computing and Grid-technologies in Science and
             Education" (GRID 2018), Dubna, Moscow region, Russia, September 10 - 14, 2018




USING TENSORFLOW TO SOLVE THE PROBLEMS OF FINANCIAL FORECASTING FOR HIGH-FREQUENCY TRADING
                                A.V. Bogdanov, A.S. Stankus a
     St. Petersburg State University, University Embankment 7–9, St. Petersburg, 199034, Russia

                                       E-mail: a alexey@stankus.ru


The use of neural networks significantly expands the possibilities for analyzing financial data and improves the quality indicators of financial market forecasting. In this article we examine various aspects of working with neural networks and the TensorFlow framework, such as choosing the type of neural network, preparing the data, and analyzing the results. The work was carried out on real data for the financial instrument Si-6.16 (a futures contract on the US dollar rate).

Keywords: Artificial Intelligence, recurrent neural network (RNN), financial market forecasting,
TensorFlow.


                                                          © 2018 Alexander V. Bogdanov, Alexey S. Stankus








1. Neural network selection
         With the increase in the power of computing resources, it became possible to predict the price movement of stock markets using artificial neural networks (ANNs). The most common form of ANN used for stock market prediction is the feed-forward network, trained with the error back-propagation algorithm to update the network weights; such networks are commonly referred to as back-propagation networks. Another form of ANN, which is better suited to price prediction, is the recurrent neural network (RNN) [1] or the time-delay neural network (TDNN) [2]. Examples of RNNs and TDNNs are the Elman, Jordan, and Elman-Jordan networks.
         RNNs were designed to process long sequential data and to capture context that is distributed over time. The model processes one element of the sequence per time step; after the computation, the updated state is passed on to the next time step to support the computation of the next element.
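         To make this state-passing mechanism concrete, a minimal sketch in plain Python/NumPy is given below. The dimensions, weight initialization, toy sequence and helper name rnn_step are illustrative assumptions only, not the configuration used in this work; the sketch simply shows how the same cell is applied at every time step while the hidden state is carried forward.

import numpy as np

input_dim, hidden_dim = 4, 8
rng = np.random.default_rng(0)
W_x = rng.normal(scale=0.1, size=(hidden_dim, input_dim))   # input-to-hidden weights
W_s = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))  # hidden-to-hidden weights
b = np.zeros(hidden_dim)

def rnn_step(x_t, s_prev):
    # One time step: combine the current input with the previous state.
    return np.tanh(W_x @ x_t + W_s @ s_prev + b)

sequence = rng.normal(size=(10, input_dim))   # a toy sequence of 10 elements
state = np.zeros(hidden_dim)
for x_t in sequence:
    state = rnn_step(x_t, state)              # the updated state is passed to the next step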




Figure 1. A recurrent neural network with one hidden unit (left) and its unfolded version in time (right). The unfolded version illustrates what happens over time: s_{t-1}, s_t, and s_{t+1} are the same unit with different states at time steps t-1, t, and t+1
        However, simple networks that linearly combine the current input element with the last output element can easily lose long-term dependencies. To solve this problem, researchers created a special neuron with a much more complex internal structure that remembers long-term context, called the Long Short-Term Memory (LSTM) cell. It learns how long to retain old information, when to use new data, and how to combine the old memory with the new input [3].
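        A hedged sketch of how such an LSTM cell can be instantiated with the Keras API bundled with TensorFlow is shown below; the layer size, window length, and single-feature input are illustrative assumptions rather than the architecture used in this paper.

import tensorflow as tf

window_len, n_features = 30, 1
model = tf.keras.Sequential([
    # LSTM cells keep an internal long-term memory across the input window
    tf.keras.layers.LSTM(32, input_shape=(window_len, n_features)),
    # single output: the predicted next closing price
    tf.keras.layers.Dense(1)
])
model.compile(optimizer="adam", loss="mse")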




                                       Figure 2. Structure of LSTM neuron
        Stock prices form a time series of length N, defined as p_0, p_1, ..., p_{N-1}, in which p_i is the closing price in period i, 0 ≤ i < N.
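        For illustration, such a closing-price series can be cut into fixed-length input windows with the next price as the prediction target. The sketch below assumes a hypothetical window length of 30 and a helper name make_windows; it is only an example of this kind of data preparation, not the exact procedure used in the paper.

import numpy as np

def make_windows(prices, window_len=30):
    # Slice the series p_0, ..., p_{N-1} into overlapping windows and next-step targets.
    x, y = [], []
    for i in range(len(prices) - window_len):
        x.append(prices[i:i + window_len])   # p_i ... p_{i+window_len-1}
        y.append(prices[i + window_len])     # the price to be predicted
    return np.array(x), np.array(y)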