<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
    </journal-meta>
    <article-meta>
      <contrib-group>
        <aff id="aff0">
          <label>0</label>
          <institution>Petro Mohyla Black Sea National University</institution>
          ,
          <addr-line>68 Desantnykiv 10, 54000, Mykolaiv</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2045</year>
      </pub-date>
      <abstract>
        <p>The article studies the solution of the problem of forecasting electricity demand in Ukraine. The sequence of data processing stages in solving the forecasting problem using machine learning methods is presented. It consists of the following stages: data collection, data research and preparation, construction and training of forecasting models, selection of the best model and calculation of forecasts, and evaluation and verification of forecast quality indicators. A general methodology for solving forecasting problems is proposed, and a methodology for solving the forecasting problem on time series is considered. The forecasting process consists of five stages. The first stage includes the collection, analysis and interpretation of data. The next stage includes the procedures of data research and preparation. The third stage, the modeling stage, consists of three parts: preparation of a data set for modeling, selection and training of models, and evaluation of their quality. The fourth stage is the stage of forecasting and calculation of forecast quality indicators. At the fifth stage, procedures for improving the efficiency of the selected forecasting model are performed. The following models were used at the modeling stage: ARIMA, GAM, ANN and BSTS. The models were analyzed and forecasts were built based on each model. For the constructed models with the best quality indicators, the predictive values were calculated. The forecasts were compared with the data of the validation sample. The following indicators were used to select the optimal model: MAPE, MAE, MSE, RMSE. The BSTS model showed the best results.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>Today, energy consumption around the world is growing rapidly: the demand for electricity
keeps increasing due to the ever-growing global population, large-scale development of industry and
technology, rising living standards, large-scale industrialization in developing countries, and the
need to maintain high rates of sustainable development. The electric power sector is the basis of the
economic potential of any country. This industry belongs to the critical infrastructure industries,
which must respond very quickly to external changes, such as natural disasters and military actions,
as well as to changing conditions in the electricity market.</p>
      <p>Forecasting the demand for electricity in Ukraine today is a strategically important issue, since in
the conditions of war and constant attacks on energy infrastructure, it is necessary to promptly
distribute energy resources to meet the needs of various types of consumers.</p>
      <p>Considering these factors, the construction of adequate and accurate electricity demand
forecasting models is necessary and important for accurate planning of investments in electricity
generation and distribution. A common difficulty in developing quality forecasts is determining a
sufficient amount of information to build forecast models. If there is not enough data, then the
forecast will be inaccurate. Similarly, if information is imprecise or redundant, pre-processing the
data and building models for forecasting will be difficult. Therefore, there is a need to increase the
accuracy of predictive models through the use of modern, effective methods and approaches.</p>
      <p>Machine learning methods significantly increase the efficiency of solving problems such as
classification, regression and prediction. However, the application of each method requires taking
into account the features of the data set, the way the data are represented, and the features of the
problem being solved. Therefore, one approach to solving forecasting problems is to systematically
apply several forecasting models and then select the model that gives the best results.</p>
      <p>
        Let us consider some of the existing approaches. The paper [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] presents an algorithm for forecasting electricity
demand, which is associated with the maximum load in the power grid. The authors substantiated
and applied the SVR method. The data set and parameters of the method were configured and
optimized by a hybrid method. This approach allows us to reduce the overall forecast error. The
hybrid method is based on a combination of a neural network model, the ARIMA method, and a
modified SVR method.
      </p>
      <p>
        In [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ], an artificial bee colony algorithm was used to process the initial data. To forecast the
demand for electricity, the algorithm was used in combination with ensemble models. A number of
independent input variables were used to create homogeneous ensemble models. The ensemble
model proposed by the authors provides more accurate predictions.
      </p>
      <p>
        The authors of [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] propose to combine the feedforward neural network model with the
convolutional neural network architecture to forecast the demand for electricity. This approach
turned out to be the most effective. Deep learning methods were also applied in [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. The paper
forecasted the energy demand based on statistical data for previous years. For a more in-depth study
of the data, the cluster analysis method was used. The load was classified by certain periods and
presented as clusters. The forecast of the demand for electricity was assessed using neural network
models and SVM.
      </p>
      <p>
        The article [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] presents an approach to forecasting electricity demand based on a hybrid model.
The model is built using a combination of ARIMA and LSSVM. The forecasting results show that this
approach to building a model allows for abnormal values in the data to be taken into account. In the
work [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ], the authors used a regression analysis model to forecast electricity demand in various
industries.
      </p>
      <p>
        To estimate the peak monthly demand for electricity, the following methods were used in [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]:
ANFIS method, a special data processing method and various neural network models. In combination
with the proposed models, these methods were found to be better suited for determining the peak
demand for electricity. In [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], long-term forecasts of electricity demand in Greece were published, based on the
relationships between time series and several efficiency criteria. The cost estimation
model is investigated using data collected between 1999 and 2013. The impact of electricity
production in European countries during the quarantine is studied in the works [
        <xref ref-type="bibr" rid="ref10 ref9">9,10</xref>
        ].
      </p>
      <p>
        In works [
        <xref ref-type="bibr" rid="ref11 ref12">11,12</xref>
        ] the effectiveness of various approaches and strategies for predicting daily energy
consumption was investigated. The authors of [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ] investigated the approach of forecasting the load
in the electrical network using artificial intelligence methods. The authors of [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ] developed a model
for forecasting electricity demand for residential and commercial buildings based on ensemble
methods. Short-term forecasting was considered.
      </p>
      <p>
        In [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ], the authors use SVR models in combination with WOA, which includes learning based
on elite and chaotic opposition (ECWOA) to improve forecasting results. The results of experiments
show that taking into account information about electricity prices leads to higher forecasting
accuracy. In works [
        <xref ref-type="bibr" rid="ref16 ref17">16,17</xref>
        ] approaches based on machine learning algorithms are considered to
increase the accuracy of short-term forecasts. The following methods were used: SVM, LSTM, SVR
and ensemble structures.
      </p>
      <p>
        In [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ], seasonally adjusted regression was used to obtain forecast values for electricity. In [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ],
the authors demonstrated the advantage of LSTM over SVM in the task of forecasting electricity
demand using specific examples. In [
        <xref ref-type="bibr" rid="ref20 ref21 ref22">20–22</xref>
        ], various neural network architectures were investigated
in combination with heuristic algorithms for forecasting electricity demand in different countries.
      </p>
      <p>
        The authors of [
        <xref ref-type="bibr" rid="ref23">23</xref>
        ] used regularized Lasso Lars and RF models to forecast electricity consumption
in Brazil. An energy forecasting model based on a deep learning approach [
        <xref ref-type="bibr" rid="ref24">24</xref>
        ] combines CNN, LSTM,
and an autoencoder (AE) for time series with different lengths. In the work [
        <xref ref-type="bibr" rid="ref25">25</xref>
        ], modeling based on
neural network models was used to estimate and forecast the demand and consumption of electricity
in transport. Thus, it was shown that combining different machine learning methods and algorithms
makes it possible to solve the problem of forecasting the demand for electricity effectively.
      </p>
      <p>Problem statement: To study different approaches to forecasting the demand for electricity in
Ukraine. To develop a methodology for solving forecasting problems based on machine learning
methods. To compare the effectiveness of different machine learning methods in solving the problem
of forecasting the demand for electricity.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Methodology for solving forecasting problems</title>
      <p>
        The sequence of data processing stages when solving a forecasting problem is shown in Fig. 1. It
consists of the following stages: data collection, research and preparation of data, construction and
training of forecast models, selection of the best models and calculation of forecasts, evaluation and
verification of the quality of forecasts [
        <xref ref-type="bibr" rid="ref26 ref27 ref28">26-28</xref>
        ].
      </p>
      <p>An information model for solving the problem of forecasting using machine learning methods is
described using the following set of elements:
IM = {D, NL, NS, PM, MM, FM, U},
where D is a set of data sets that are processed when solving a machine learning problem;
NL is a set of data nonlinearities that are taken into account when solving a machine learning
problem; NS is a set of non-stationary processes that are taken into account when solving a
machine learning problem; PM is a set of methods of analysis and preliminary processing of data
sets; MM is a set of model building methods and modeling methods for solving the problem of
machine learning; FM is a set of forecasting methods based on probabilistic statistical analysis,
taking into account nonlinearities and non-stationarity of data; U is a set of uncertainties when
solving a machine learning problem.</p>
      <p>The dataset set D = Dtrain ∪ Dtest combines two subsets. The first subset Dtrain
combines training data, the second subset Dtest is a set of test data. The set of uncertainties U
includes uncertainties of statistical type, uncertainty due to lack of observations, uncertainty of
model parameters, uncertainty of model structure, and uncertainties of amplitude and probability type.</p>
      <p>
        Based on the developed information model, a description of the stages of solving forecasting
problems was created, and a methodology for solving the problem of predictive modeling on time
series was developed [
        <xref ref-type="bibr" rid="ref29 ref30">29,30</xref>
        ], which is presented in Figure 2.
      </p>
      <p>The methodology is presented as a sequence of the following stages. The first stage is necessary
for collecting, analyzing and interpreting the initial data. When loading, the data set is analyzed, its
structure and features of individual attributes are determined. As a result of preliminary data
processing, the set is prepared for the next step, exploratory data analysis.</p>
      <p>The second stage involves research procedures and data preparation. Descriptive statistics for each
variable are analyzed, missing and abnormal values are identified, the level of autocorrelation is
determined, nonlinearity and nonstationarity of data and their types are identified,
heteroscedasticity is analyzed, and the process is tested for integration (the presence of unit roots).</p>
      <p>The third stage, the modeling stage, consists of three parts: preparation of the data set for
modeling, selection and training of models, and assessment of their quality. Before starting the
modeling, the prepared dataset is split into training and test samples, and cross-validation sets are
created. When choosing a model, modeling algorithms are tested on the training sample and the best
one is selected according to certain quality criteria.</p>
      <p>The fourth stage is the stage of constructing forecasts and assessing their quality. At the fifth
stage, procedures are performed to improve the efficiency of the selected forecasting model. For
different modeling methods, the following methods can be used to improve quality: complicating the
model structure, changing its specifications, changing the model topology (and/or activation
functions), using additional algorithms, combining forecast values.</p>
      <p>An important feature of the presented method is visualization. With the help of visualization, at
each stage, it is possible to adjust the sequence of actions and return to previous stages. The stages
of the method have features that reflect the subject area of the problem solution.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Forecasting Electricity Demand</title>
      <sec id="sec-3-1">
        <title>3.1. Dataset description and pre-processing</title>
        <p>A data set for forecasting electricity demand in Ukraine is presented on the web resource of the state
operator of the electricity market [31]. The observation time interval covers the period from
01.07.2019 to 04.10.2024. The set contains hourly data on the volumes of electricity, sales of
electricity, demand for it in MWh in the power grids of Ukraine and prices (Fig. 3).</p>
        <p>The variable with hour marks in a day is included in the datetime indexing variable, and the
demand variable is selected from the set as the resulting variable. It is noted that the energy_system
variable takes only three values: Burshtyn peninsula, IPS of Ukraine, IPS of Ukraine (synchronized with
ENTSO-E systems). The values of this variable up to 24 February 2022 were divided into two separate
subsystems: the "United Energy System of Ukraine" and the "Burshtyn Energy Island". Since 24
February 2022, the Ukrainian energy system has been synchronized with the European ENTSO-E
energy system. Thus, up to 24 February 2022, the dataset contains demand data separately for the
two subsystems, and to calculate the all-Ukrainian indicators, the values for these subsystems are
summed up (Fig. 4).</p>
        <p>After checking, 5 missing values were found in the data set. The missing values have a yearly
interval and are recorded in late March or early April. The presence of gaps is due to the transition
of clocks to daylight saving time. Given that the total volume of observations is 69,385, the LOCF
method [32] was used to fill in the missing values.</p>
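        <p>The LOCF filling step can be sketched with pandas; the timestamps and demand values below are synthetic illustrations, not taken from the original dataset:</p>

```python
# Sketch of LOCF (last observation carried forward) gap filling with pandas.
# The index and values are invented for the example.
import pandas as pd

# Hourly index with a gap, mimicking the missing DST-transition hours.
idx = pd.date_range("2021-03-28 00:00", periods=6, freq="h")
demand = pd.Series([210.0, 208.5, None, None, 215.2, 214.0], index=idx)

# ffill() carries the last observed value forward into each gap.
filled = demand.ffill()
print(int(filled.isna().sum()))  # 0
```

        <p>With only 5 of 69,385 hourly observations missing, the bias introduced by carrying the last value forward is negligible.</p>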
        <p>After the time series analysis, the sampling frequency was reduced. Hourly observations of
electricity demand in the set were aggregated into daily averages (Figure 5).</p>
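        <p>The downsampling to daily averages can be sketched with the pandas resample operation (synthetic hourly values stand in for the demand series):</p>

```python
# Aggregating hourly observations into daily averages (synthetic values).
import numpy as np
import pandas as pd

idx = pd.date_range("2022-06-01", periods=48, freq="h")
hourly = pd.Series(np.arange(48, dtype=float), index=idx)

# resample("D").mean() collapses each calendar day into its average.
daily = hourly.resample("D").mean()
print(daily.tolist())  # [11.5, 35.5]
```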
        <p>To build forecast models, we used a portion of the data from 01.06.2022 to 04.10.2024, excluding
the peak sections of the series around 24.02.2024 (Fig. 6). The figure shows that demand can take
negative values - this corresponds to a situation where electricity sales volumes exceed purchase
volumes.</p>
        <p>An important condition for constructing reliable forecast models is understanding the structure
of the time series. Decomposition of the series using the STL [33,34] method allowed us to determine
the basic principles of modeling (Fig. 7). The figure demonstrates the dominant influence of seasonal
components (annual and weekly seasonality), as well as the presence of a nonlinear trend
component. The hypothesized non-stationarity of the time series is confirmed by the Ljung-Box test
for independence, the augmented Dickey-Fuller test, the KPSS test for level stationarity, and the
Phillips-Perron test for a unit root.</p>
        <p>The time series was tested for nonlinearity using a set of statistical tests. The test results
confirmed the visual nonlinearity of the series, since the p-value is less than 0.05. The
heteroscedasticity of the series was confirmed by the McLeod-Li test (p-value is less than 0.05). The
number of ordinary and seasonal differencing operations needed to obtain a stationary time series
is determined from the tests. First differencing is recommended, while seasonal differencing is
optional.</p>
        <p>To test the time series for autocorrelation, the Durbin-Watson and Breusch-Godfrey tests were
used. For both tests, the obtained p-values are much less than 0.05, thus confirming the
autocorrelation of the time series. The graphs of the sample autocorrelation function (ACF) and
partial autocorrelation function (PACF) in Figure 8 correspond to expectations: the autocorrelation
function decreases monotonically with increasing time shift. The figure confirms the presence of
significant correlation and weekly seasonality.</p>
        <p>The results of the preliminary analysis confirm that the process under study belongs to the class
of nonlinear and nonstationary.</p>
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Construction and evaluation of forecast models</title>
        <p>The modeling stage begins with dividing the data set into two parts: training and test samples. The
last 14 observations (two-week range) are kept as test observations, which corresponds to a forecast
horizon of 14 days for short-term forecasting (Figure 9).</p>
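        <p>The holdout split can be sketched in a few lines (a stand-in array is used for the daily demand series):</p>

```python
# Keeping the last 14 daily observations as a test sample, matching the
# two-week forecast horizon.
import numpy as np

HORIZON = 14
series = np.arange(100, dtype=float)  # placeholder for the daily demand series

train, test = series[:-HORIZON], series[-HORIZON:]
print(len(train), len(test))  # 86 14
```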
        <p>At the modeling stage, the following statistical models were used as basic forecast models:
ARIMA, the method of fitting generalized additive models (GAM), feedforward artificial neural
networks (NNAR), and Bayesian structural time series models (BSTS). The choice of these
models is due to their ability to recognize complex patterns in real time series, taking into account
the nonlinearity and non-stationarity of the process under study [35].</p>
        <p>Modeling based on the ARIMA method. ARIMA models are the result of combining three
components: autoregressive (AR), integration (I) and moving average (MA) [33,35,36]. The
Box-Jenkins algorithm [37] helps in choosing the best model based on the autocorrelation and partial
autocorrelation function plots. Identifying the best model, however, has no unambiguous solution,
since one data series can be represented by different parameter sets of the ARIMA model.
Nevertheless, compared to other approaches, this methodology is easy to use and gives good
forecast accuracy. Alternative ARIMA models were selected both automatically and by manual
selection. Automatic selection is based on the methods of complete enumeration, quick enumeration,
enumeration with smoothing of the input data set [35]. Table 1 provides a comparison of ARIMA
models by quality metrics for the studied time series</p>
        <p>Based on the results obtained after automatic selection of the ARIMA model parameters, it can be
concluded that additional smoothing (the third model) did not give the expected results, and the full
and quick enumeration of the model parameters showed different results, but they have a small
difference in the information criteria, so both models are suitable for forecasting. The manual method
of selecting parameters did not allow obtaining the best model according to the quality criteria.</p>
        <p>To select the qualitatively best ARIMA model from alternative options, the assessment was
carried out not only according to the Akaike and Bayes information criteria, but also according to
the values of the determination coefficient and the Durbin-Watson criterion. For the first two models,
the coefficient of determination is 0.8, and the multiple correlation coefficient of these models
exceeds 90% in absolute value. The DW = 2 criterion indicates the absence of autocorrelation in the
residuals.</p>
        <p>Because transformations were used in constructing the models, the residuals are visualized on
the transformed scale. These so-called "innovation residuals" are useful for checking whether the
model has adequately captured the information in the data (Figure 10). The figure shows the residuals
of the best ARIMA model from the alternatives presented in the table. The result of the portmanteau
test for this model is 0.066, which means that the residuals for the model are independent.</p>
        <p>GAM modeling. GAM models are created based on the procedure of fitting additive regression
models [35,38,39]. To estimate the parameters of GAM models, the Bayesian approach is used,
either finding the maximum a posteriori estimate or applying full Bayesian inference. The Stan
library was used for calculations. Based on the preliminary data analysis, the seasonal component of
the time series is formed from two parts: weekly and annual. To solve the forecasting problem, the
monthly components will also be taken into account in GAM models. Alternative models are
presented as follows: an additive model with a monthly seasonal component (GAM 1); a
multiplicative model with an annual seasonal component (GAM 2); a multiplicative model with
annual and weekly seasonal components (GAM 3). Table 2 shows the values of the quality metrics
of the GAM models.</p>
        <p>Model with monthly seasonality component: 0.992, DW = 2.011; model with annual
seasonality component: 0.991, DW = 1.841; model with annual and weekly seasonal components:
0.994, DW = 2.059.</p>
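        <p>The Stan-based fitting itself is not reproduced here; as a rough frequentist sketch of an additive model with weekly and annual seasonal components, the series can be regressed on Fourier terms (synthetic data, least-squares estimation; the helper name is invented):</p>

```python
# Additive seasonal regression on Fourier terms (a simplified, frequentist
# stand-in for the Bayesian GAM fitting described above).
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(730, dtype=float)  # two years of daily observations
# Synthetic series with weekly and annual seasonality plus noise.
y = (10 * np.sin(2 * np.pi * t / 7)
     + 5 * np.sin(2 * np.pi * t / 365.25)
     + rng.normal(size=t.size))

def fourier(t, period, k):
    """First k sine/cosine pairs for the given seasonal period."""
    cols = []
    for i in range(1, k + 1):
        cols.append(np.sin(2 * np.pi * i * t / period))
        cols.append(np.cos(2 * np.pi * i * t / period))
    return np.column_stack(cols)

# Design matrix: intercept + weekly harmonics + annual harmonics.
X = np.column_stack([np.ones_like(t), fourier(t, 7, 3), fourier(t, 365.25, 5)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ coef
r2 = 1 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
print(r2 > 0.9)
```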
        <p>Feedforward neural network models. Neural networks can be used to model various time series
with complex structures without additional knowledge of the process features reflected in the data.
When using neural network models to forecast time series, the following features must be taken into
account: 1) the first differences are fed to the model input, not the original series; 2) the number of
lags that are significant for describing a specific process is determined; 3) long-term trends are not
modeled. The best architecture of the neural network model (3, 15, 1) was selected experimentally. It
is shown in Figure 11.</p>
        <p>The neural network model used was a multilayer feedforward neural network with one
hidden layer, a variable number of neurons and a sigmoid activation function. To prepare time series
for modeling and forecasting, the method of constructing neural network models was used. The
method consists of the following steps.</p>
        <p>Step 1. Feeding the neural network model input with the first, second and (if necessary)
third differences of the series and the resulting target vector.</p>
        <p>Step 2. Splitting the data set into two parts (for training and for testing).</p>
        <p>Step 3. Training the neural network.</p>
        <p>Step 4. Visualizing the structure of the neural network.</p>
        <p>Step 5. Calculating forecasts and evaluating forecast decisions.</p>
        <p>Step 6. Determining the type of distribution of the model residuals.</p>
        <p>Step 7. Determining the optimal parameters of the forecast model.</p>
        <p>Step 8. Returning to the original data (inverting scaling and inverting differencing).</p>
        <p>Step 9. Checking for the absence of autocorrelation in the residuals (ACF, PACF, portmanteau test).</p>
        <p>Step 10. Determining the final predictive solution.</p>
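        <p>The core of these steps can be sketched as follows, on synthetic data. The (15,) hidden layer, logistic activation and max_iter=5000 mirror the NNAR settings used in the paper, but the lag count, scaling (omitted here) and data are illustrative assumptions:</p>

```python
# Sketch of the NNAR-style pipeline: lagged first differences in, one
# sigmoid hidden layer, then inversion of the differencing.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
y = np.cumsum(rng.normal(size=400))  # synthetic demand-like series
dy = np.diff(y)                      # step 1: first differences

P = 3  # number of input lags (the (3, 15, 1) architecture)
X = np.column_stack([dy[i:len(dy) - P + i] for i in range(P)])
target = dy[P:]

# Step 2: hold out the last 14 differences for testing.
X_tr, X_te = X[:-14], X[-14:]
t_tr, t_te = target[:-14], target[-14:]

# Steps 3-5: one hidden layer of 15 neurons, sigmoid activation.
net = MLPRegressor(hidden_layer_sizes=(15,), activation="logistic",
                   max_iter=5000, random_state=0)
net.fit(X_tr, t_tr)
pred_diff = net.predict(X_te)

# Step 8: invert differencing to return to the original scale
# (starting from the last level before the test window).
pred = y[-15] + np.cumsum(pred_diff)
print(pred.shape)  # (14,)
```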
        <p>Figure 12 shows a fragment of the input layer data of the neural network after preprocessing.</p>
        <p>Bayesian Structural Time Series Model. Structural time series models have three main
advantages for modeling and forecasting complex time series [40,41]: the ability to account for
uncertainty in forecasts, which can then help quantify future risks; the open structure of the
model; and the ability to include external information for factors where there is no obvious
relationship in the data.</p>
        <sec id="sec-3-2-1">
          <title>The BSTS model training algorithm consists of the following stages:</title>
          <p>1. Defining the model structure and prior probabilities.</p>
          <p>2. Using the Kalman filter to calculate state parameters based on current data.</p>
          <p>3. Selecting variables in the structural model based on the spike-and-slab method.</p>
          <p>4. Combining the results based on averaging over the Bayesian models to calculate the forecast.</p>
          <p>The flexibility of the algorithm is based on the selection of components for each alternative BSTS
model. This is evident in the first two stages of the algorithm. In the following stages, the model was
trained on the available data using a Bayesian method that updates the parameter estimates over
time. When solving the forecasting problem for the data set, several alternative BSTS models were
compiled based on the results of the preliminary analysis and data processing. Table 3 presents the
list of models.</p>
          <p>The values of the BSTS model quality characteristics are presented in Table 4. The residual.sd
characteristic is the mean of the posterior distribution of the standard deviation of the model
residuals, and the predict.sd characteristic is the standard deviation of the next step errors determined
based on the training data. The R2 characteristic is the determination coefficient. The next
characteristic, relation.gof, is the Harvey statistic.</p>
        </sec>
        <sec id="sec-3-2-2">
          <title>Model components</title>
          <p>M1: local linear trend + weekly seasonality component; M2: local linear trend +
trigonometric seasonality with two Fourier components (sine and cosine); M3: local linear trend +
autoregressive component; M4: local linear trend + monthly seasonality component; M5: robust
local linear trend + autoregressive component; M6: robust local linear trend + annual seasonality
component; M7: robust local linear trend + autoregressive component + weekly seasonality
component; M8: local level component + autoregressive component.</p>
          <p>Figure 13 shows a graph that displays the quality of BSTS models. The training data set is shown
below the graph of the accumulated errors curves. This allows us to better understand where exactly
the model fails to describe the data. In the figure, the curve (M7), which is located below the other
models, confirms the higher quality of this model.</p>
          <p>The adequacy of BSTS models was assessed by how well they described the training data. This
approach carried the risk of selecting an overfitted model as the optimal one. Evaluating models
using the next-step errors partially helps to avoid the error of overfitting models.</p>
        </sec>
      </sec>
      <sec id="sec-3-3">
        <title>3.3. Forecasting and evaluating results</title>
        <p>The models with the best quality indicators were the basis for calculating the predicted values (Table
5). The quality indicators of the models were determined on the test sample. The following metrics
were used to select the optimal model: MAPE, MAE, MSE and RMSE.</p>
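        <p>The four metrics can be computed directly; note that MAPE is undefined when an actual value is zero, which matters here because demand can take zero or negative values. The helper name and sample numbers below are invented for the example:</p>

```python
# MAPE, MAE, MSE and RMSE for a forecast against a validation sample.
import numpy as np

def forecast_metrics(actual, predicted):
    """Standard forecast-quality indicators (hypothetical helper)."""
    actual = np.asarray(actual, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    err = actual - predicted
    mse = np.mean(err ** 2)
    return {
        "MAPE": 100 * np.mean(np.abs(err / actual)),  # undefined if actual == 0
        "MAE": np.mean(np.abs(err)),
        "MSE": mse,
        "RMSE": np.sqrt(mse),
    }

m = forecast_metrics([100, 200, 400], [110, 190, 420])
print(m["MAE"])  # ≈ 13.33
```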
        <p>Figure 14 shows a visualization of the forecast values for the time series on electricity demand in
Ukraine constructed using the BSTS model (M7). The black line represents the last 50 training observations, the blue
line represents the predicted values of the time series. The yellow dots on the graph are the data
from the test sample. The green dotted lines limit the 95% confidence interval of the predicted values.</p>
        <sec id="sec-3-3-1">
          <title>Types of models</title>
          <p>ARIMA(1,1,2)(2,0,0)[7]; GAM (annual and weekly seasonal components); BSTS (robust
local linear trend + autoregressive component + weekly seasonality component); NNAR (n=15,
sigmoid, maxit=5000).</p>
          <p>The presented results demonstrate the effectiveness of using BSTS models to solve forecasting
problems. Further improvement of forecasting results is possible through the use of combined
forecasts [35].</p>
        </sec>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Conclusions</title>
      <p>The paper presents a study of forecasting the demand for electricity in Ukraine using machine
learning methods. Research was conducted on the basis of data from 2019-2024. The sequence of
stages of data processing when solving the problem of forecasting using machine learning methods
is developed and presented. The sequence includes the following steps: data collection, data research
and preparation, building and training forecast models, selecting the best model and calculating
forecasts, evaluating and checking the quality of forecasts. A general methodology for solving
forecasting problems is proposed, together with a methodology for forecasting time series based on
machine learning methods. The solution of the forecasting problem consists of five stages.
At the first stage, data collection, analysis and interpretation are carried out. At the second stage,
research and data preparation procedures are carried out. The third stage, the modeling stage,
consists of three parts: preparation of the data set for modeling, selection and training of models, and
assessment of their quality. The fourth stage is the stage of forecasting and determining the quality
of forecasts. At the fifth stage, procedures for increasing the effectiveness of the selected forecasting
model are performed. At the modeling stage, the following models were used: ARIMA, GAM, ANN
and BSTS. A detailed analysis of the models was carried out and predictions were made based on
each model. Predictive values were calculated for the built models with the best quality indicators.
The forecast was developed for 2 weeks. The forecasts were compared with the data of the validation
sample. The following indicators were used to select the optimal model and evaluate it: MAPE, MAE,
MSE, RMSE. The BSTS model showed the best results. This confirms the effectiveness of the BSTS
model when forecasting on real data.
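The model-selection step described above can be sketched numerically. The following Python fragment (with purely illustrative numbers and only two of the four model names; it does not use the article's actual series) computes the four quality indicators on a validation sample and picks the model with the lowest MAPE:

```python
import math

# Hypothetical validation sample and two competing forecasts
# (illustrative values only, not the article's data).
actual = [520, 515, 530, 525, 540, 535, 528]
forecasts = {
    "ARIMA": [510, 520, 522, 530, 548, 530, 520],
    "BSTS":  [518, 516, 528, 526, 541, 534, 529],
}

def metrics(y, yhat):
    """Return MAPE (%), MAE, MSE and RMSE of a forecast against actuals."""
    errors = [a - f for a, f in zip(y, yhat)]
    mae = sum(abs(e) for e in errors) / len(errors)
    mse = sum(e * e for e in errors) / len(errors)
    mape = 100 * sum(abs(e) / a for e, a in zip(errors, y)) / len(errors)
    return {"MAPE": mape, "MAE": mae, "MSE": mse, "RMSE": math.sqrt(mse)}

scores = {name: metrics(actual, f) for name, f in forecasts.items()}
# Select the model with the lowest MAPE on the validation sample.
best = min(scores, key=lambda name: scores[name]["MAPE"])
```

With these illustrative numbers the BSTS forecast tracks the validation sample more closely, so it is selected; in practice all four indicators are inspected, since they penalize large errors differently (MSE and RMSE weight outliers more heavily than MAE and MAPE).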
</p>
      <p>[31] Energy Map. URL: https://map.ua-energy.org/uk/resources/5a616fba-fbc9-4073-95329161592faca8/
[32] P. Bidyuk, I. Kalinina, A. Gozhyj. An Approach to Identifying and Filling Data Gaps in Machine Learning Procedures. Lecture Notes on Data Engineering and Communications Technologies (Switzerland). 2022. Vol. 77, pp. 164-176.
[33] R. J. Hyndman, G. Athanasopoulos. Forecasting: Principles and Practice. 3rd ed. OTexts, 2021. 442 p.
[34] T. Aggarwal. Master the Power of Seasonal Decomposition of Time Series (STL): Unveiling the Essence of Time. 2023. URL: https://medium.com/@tushar_aggarwal/master-the-power-of-seasonal-decomposition-of-time-series-stl-unveiling-the-essence-of-time-26c19a910314
[35] I. Kalinina, P. Bidyuk, A. Gozhyj, P. Malchenko. Combining Forecasts Based on Time Series Models in Machine Learning Tasks. CEUR-WS. 2023. Vol. 3426, pp. 25-35. URL: CEUR-WS.org/Vol-3426/paper2.pdf
[36] A. Nielsen. Practical Time Series Analysis: Prediction with Statistics and Machine Learning. O'Reilly Media, Inc., 2019. 504 p.
[37] J. Brownlee. A Gentle Introduction to the Box-Jenkins Method for Time Series Forecasting. 2020. URL: https://machinelearningmastery.com/gentle-introduction-box-jenkins-method-time-series-forecasting/
[38] N. Clark. How to interpret and report nonlinear effects from Generalized Additive Models. 2024. URL: https://ecogambler.netlify.app/blog/interpreting-gams/
[39] J. Lai, J. Tang, T. Li, A. Zhang, L. Mao. Evaluating the relative importance of predictors in Generalized Additive Models using the gam.hp R package. Plant Diversity, vol. 46, issue 4, 2024, pp. 542-546. DOI: 10.1016/j.pld.2024.06.002.
[40] I. Kalinina, P. Bidyuk, A. Gozhyj. Construction of Forecast Models based on Bayesian Structural Time Series. International Scientific and Technical Conference on Computer Sciences and Information Technologies, CSIT 2022. 2022, pp. 180-184. DOI: 10.1109/CSIT56902.2022.10000484.
[41] A. M. Almarashi, K. Khan. Bayesian Structural Time Series. Nanoscience and Nanotechnology Letters, 12(1), 2020, pp. 54-61. DOI: 10.1166/nnl.2020.3083.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>M. R.</given-names>
            <surname>Kazemzadeh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Amjadian</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Amraee</surname>
          </string-name>
          .
          <article-title>A hybrid data mining driven algorithm for long term electric peak load and energy demand forecasting</article-title>
          .
          <source>Energy</source>
          <year>2020</year>
          ,
          <volume>204</volume>
          ,
          <fpage>117948</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>J.</given-names>
            <surname>Hao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Sun</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Q.</given-names>
            <surname>Feng</surname>
          </string-name>
          .
          <article-title>A Novel Ensemble Approach for the Forecasting of Energy Demand Based on the Artificial Bee Colony Algorithm</article-title>
          .
          <source>Energies</source>
          <year>2020</year>
          ,
          <volume>13</volume>
          ,
          <fpage>550</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>A. J.</given-names>
            <surname>del Real</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Dorado</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Durán</surname>
          </string-name>
          .
          <article-title>Energy Demand Forecasting Using Deep Learning: Applications for the French Grid</article-title>
          .
          <source>Energies</source>
          <year>2020</year>
          ,
          <volume>13</volume>
          ,
          <fpage>2242</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>J.</given-names>
            <surname>Bedi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Toshniwal</surname>
          </string-name>
          .
          <article-title>Deep learning framework to forecast electricity demand</article-title>
          .
          <source>Appl. Energy</source>
          <year>2019</year>
          ,
          <volume>238</volume>
          , pp.
          <fpage>1312</fpage>
          -
          <lpage>1326</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>F.</given-names>
            <surname>Kaytez</surname>
          </string-name>
          .
          <article-title>A hybrid approach based on autoregressive integrated moving average and leastsquare support vector machine for long-term forecasting of net electricity consumption</article-title>
          .
          <source>Energy</source>
          <year>2020</year>
          ,
          <volume>197</volume>
          ,
          <fpage>117200</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>S.</given-names>
            <surname>Di Leo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Caramuta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Curci</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Cosmi</surname>
          </string-name>
          .
          <article-title>Regression analysis for energy demand projection: An application to TIMES-Basilicata and TIMES-Italy energy models</article-title>
          .
          <source>Energy</source>
          <year>2020</year>
          ,
          <volume>196</volume>
          ,
          <fpage>117058</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>P.</given-names>
            <surname>Ramsami</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. T. A.</given-names>
            <surname>King</surname>
          </string-name>
          .
          <article-title>Neural Network Frameworks for Electricity Forecasting in Mauritius and Rodrigues Islands</article-title>
          .
          <source>In Proceedings of the 2021 IEEE PES/IAS PowerAfrica</source>
          , Nairobi, Kenya,
          <year>2021</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>5</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>D.</given-names>
            <surname>Angelopoulos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Siskos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Psarras</surname>
          </string-name>
          .
          <article-title>Disaggregating time series on multiple criteria for robust forecasting: The case of long-term electricity demand in Greece</article-title>
          .
          <source>Eur. J. Oper. Res</source>
          .
          <year>2019</year>
          ,
          <volume>275</volume>
          , pp.
          <fpage>252</fpage>
          -
          <lpage>265</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>U.</given-names>
            <surname>Şahin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Ballı</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Chen</surname>
          </string-name>
          .
          <article-title>Forecasting seasonal electricity generation in European countries under COVID-19-induced lockdown using fractional grey prediction models and machine learning methods</article-title>
          .
          <source>Appl. Energy</source>
          <year>2021</year>
          ,
          <volume>302</volume>
          ,
          <fpage>117540</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>R.</given-names>
            <surname>Hou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Wu</surname>
          </string-name>
          , G. Ren,
          <string-name>
            <given-names>W.</given-names>
            <surname>Gao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Khayatnezhad</surname>
          </string-name>
          .
          <article-title>Assessing of impact climate parameters on the gap between hydropower supply and electricity demand by RCPs scenarios and optimized ANN by the improved Pathfinder (IPF) algorithm</article-title>
          .
          <source>Energy</source>
          <year>2021</year>
          ,
          <volume>237</volume>
          ,
          <fpage>121621</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>A.</given-names>
            <surname>Baba</surname>
          </string-name>
          .
          <article-title>Advanced AI-based techniques to predict daily energy consumption: A case study</article-title>
          .
          <source>Expert Syst. Appl</source>
          .
          <year>2021</year>
          ,
          <volume>184</volume>
          ,
          <fpage>115508</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>M.</given-names>
            <surname>Pegalajar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.G. B.</given-names>
            <surname>Ruíz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. P.</given-names>
            <surname>Cuéllar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Rueda</surname>
          </string-name>
          .
          <article-title>Analysis and enhanced prediction of the Spanish Electricity Network through Big Data and Machine Learning techniques</article-title>
          .
          <source>Int. J. Approx. Reason</source>
          .
          <year>2021</year>
          ,
          <volume>133</volume>
          , pp.
          <fpage>48</fpage>
          -
          <lpage>59</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>N. M. M.</given-names>
            <surname>Bendaoud</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Farah</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Ben Ahmed</surname>
          </string-name>
          .
          <article-title>Applying load profiles propagation to machine learning based electrical energy forecasting</article-title>
          .
          <source>Electr. Power Syst. Res</source>
          .
          <year>2022</year>
          ,
          <volume>203</volume>
          ,
          <fpage>107635</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>R.</given-names>
            <surname>Porteiro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Hernández-Callejo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Nesmachnow</surname>
          </string-name>
          .
          <article-title>Electricity demand forecasting in industrial and residential facilities using ensemble machine learning</article-title>
          .
          <source>Rev. Fac</source>
          . De Ing.
          <year>2022</year>
          ,
          <volume>102</volume>
          , pp.
          <fpage>9</fpage>
          -
          <lpage>25</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Lu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Wang</surname>
          </string-name>
          .
          <article-title>A load forecasting model based on support vector regression with whale optimization algorithm</article-title>
          .
          <source>Multimed. Tools Appl</source>
          .
          <year>2023</year>
          ,
          <volume>82</volume>
          , pp.
          <fpage>9939</fpage>
          -
          <lpage>9959</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>S.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Kong</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Yue</surname>
          </string-name>
          , C. Liu,
          <string-name>
            <given-names>M. A.</given-names>
            <surname>Khan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Yang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Zhang</surname>
          </string-name>
          .
          <article-title>Short-Term Electrical Load Forecasting Using Hybrid Model of Manta Ray Foraging Optimization and Support Vector Regression</article-title>
          .
          <source>J. Clean. Prod</source>
          .
          <year>2023</year>
          ,
          <volume>388</volume>
          ,
          <fpage>135856</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>J.</given-names>
            <surname>Huang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Algahtani</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Kaewunruen</surname>
          </string-name>
          .
          <article-title>Energy Forecasting in a Public Building: A Benchmarking Analysis on Long Short-Term Memory (LSTM), Support Vector Regression (SVR), and Extreme Gradient Boosting (XGBoost) Networks</article-title>
          .
          <source>Appl. Sci.</source>
          <year>2022</year>
          ,
          <volume>12</volume>
          ,
          <fpage>9788</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <given-names>C. E.</given-names>
            <surname>Velasquez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Zocatelli</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F. B.</given-names>
            <surname>Estanislau</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. F.</given-names>
            <surname>Castro</surname>
          </string-name>
          .
          <article-title>Analysis of time series models for Brazilian electricity demand forecasting</article-title>
          .
          <source>Energy</source>
          <year>2022</year>
          ,
          <volume>247</volume>
          ,
          <fpage>123483</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <given-names>F.</given-names>
            <surname>Pallonetto</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Jin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Mangina</surname>
          </string-name>
          .
          <article-title>Forecast electricity demand in commercial building with machine learning models to enable demand response programs</article-title>
          .
          <source>Energy AI</source>
          <year>2022</year>
          ,
          <volume>7</volume>
          ,
          <fpage>100121</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <given-names>E. C.</given-names>
            <surname>May</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Bassam</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. J.</given-names>
            <surname>Ricalde</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. E.</given-names>
            <surname>Soberanis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Oubram</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O. M.</given-names>
            <surname>Tzuc</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. Y.</given-names>
            <surname>Alanis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Livas-García</surname>
          </string-name>
          .
          <article-title>Global sensitivity analysis for a real-time electricity market forecast by a machine learning approach: A case study of Mexico</article-title>
          .
          <source>Int. J. Electr. Power Energy Syst</source>
          .
          <year>2022</year>
          ,
          <volume>135</volume>
          ,
          <fpage>107505</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <given-names>W. J.</given-names>
            <surname>Niu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z. K.</given-names>
            <surname>Feng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. S.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. J.</given-names>
            <surname>Wu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. Y.</given-names>
            <surname>Wang</surname>
          </string-name>
          .
          <article-title>Short-term electricity load time series prediction by machine learning model via feature selection and parameter optimization using hybrid cooperation search algorithm</article-title>
          .
          <source>Environ. Res. Lett</source>
          .
          <year>2021</year>
          ,
          <volume>16</volume>
          ,
          <fpage>055032</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <given-names>R.</given-names>
            <surname>Luzia</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Rubio</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. E.</given-names>
            <surname>Velasquez</surname>
          </string-name>
          .
          <article-title>Sensitivity analysis for forecasting Brazilian electricity demand using artificial neural networks and hybrid models based on Autoregressive Integrated Moving Average</article-title>
          .
          <source>Energy</source>
          <year>2023</year>
          ,
          <volume>274</volume>
          ,
          <fpage>127365</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          [23]
          <string-name>
            <given-names>P. C.</given-names>
            <surname>Albuquerque</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. O.</given-names>
            <surname>Cajueiro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. D. C.</given-names>
            <surname>Rossi</surname>
          </string-name>
          .
          <article-title>Machine learning models for forecasting power electricity consumption using a high dimensional dataset</article-title>
          .
          <source>Expert. Syst. Appl</source>
          .
          <year>2022</year>
          ,
          <volume>187</volume>
          ,
          <fpage>115917</fpage>
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          [24]
          <string-name>
            <given-names>R.</given-names>
            <surname>Rick</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Berton</surname>
          </string-name>
          .
          <article-title>Energy forecasting model based on CNN-LSTM-AE for many time series with unequal lengths</article-title>
          .
          <source>Eng. Appl. Artif. Intell</source>
          .
          <year>2022</year>
          ,
          <volume>113</volume>
          ,
          <fpage>104998</fpage>
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [25]
          <string-name>
            <given-names>M.</given-names>
            <surname>Maaouane</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Chennaif</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Zouggar</surname>
          </string-name>
          , G. Krajačić, N. Duić,
          <string-name>
            <given-names>H.</given-names>
            <surname>Zahboune</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. K.</given-names>
            <surname>ElMiad</surname>
          </string-name>
          .
          <article-title>Using neural network modelling for estimation and forecasting of transport sector energy demand in developing countries</article-title>
          .
          <source>Energy Convers Manag</source>
          .
          <year>2022</year>
          ,
          <volume>258</volume>
          ,
          <fpage>115556</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [26]
          <string-name>
            <given-names>B.</given-names>
            <surname>Lantz</surname>
          </string-name>
          .
          <article-title>Machine Learning with R. Expert techniques for predictive modeling</article-title>
          ,
          <source>3rd Edition</source>
          , Packt Publishing,
          <year>2019</year>
          . 458 p.
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          [27]
          <string-name>
            <given-names>J. D.</given-names>
            <surname>Kelleher</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B. Mac</given-names>
            <surname>Namee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>D'Arcy</surname>
          </string-name>
          .
          <article-title>Fundamentals of Machine Learning for Predictive Data Analytics</article-title>
          . Algorithms, Worked Examples, and
          <source>Case Studies. Second Edition</source>
          . The MIT Press, Cambridge, Massachusetts, London, England,
          <year>2020</year>
          . 798 p.
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          [28]
          <string-name>
            <given-names>G.</given-names>
            <surname>James</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Witten</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Hastie</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Tibshirani</surname>
          </string-name>
          .
          <article-title>An Introduction to Statistical Learning with Applications in R</article-title>
          . Springer New York Heidelberg Dordrecht London,
          <year>2013</year>
          . 441 p. doi: 10.1007/978-1-4614-7138-7.
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          [29]
          <string-name>
            <given-names>I.</given-names>
            <surname>Kalinina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Gozhyj</surname>
          </string-name>
          .
          <source>Methodology for Solving Forecasting Problems Based on Machine Learning Methods. Lecture Notes on Data Engineering and Communications Technologies (Switzerland)</source>
          .
          <year>2023</year>
          . Vol.
          <volume>149</volume>
          , pp.
          <fpage>105</fpage>
          -
          <lpage>125</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          [30]
          <string-name>
            <given-names>P.</given-names>
            <surname>Bidyuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Gozhyj</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Kalinina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Vysotska</surname>
          </string-name>
          ,
          <article-title>Methods for forecasting nonlinear non-stationary processes in machine learning</article-title>
          .
          <source>In: Data Stream Mining and Processing. DSMP 2020. Communications in Computer and Information Science</source>
          .
          <year>2020</year>
          , Vol.
          <volume>1158</volume>
          , pp.
          <fpage>470</fpage>
          -
          <lpage>485</lpage>
          . Springer, Cham, 2020. doi: 10.1007/978-3-030-61656-4_32.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>