<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>An approach to combining forecasts when solving machine learning problems</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Peter Bidyuk</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Irina Kalinina</string-name>
          <email>irina.kalinina1612@gmail.com</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Aleksandr Gozhyj</string-name>
          <email>alex.gozhyj@gmail.com</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Victor Gozhyi</string-name>
          <email>gozhyi.v@gmail.com</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sergii Shiyan</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”</institution>
          ,
          <addr-line>37, Prospect Beresteiskyi, Solomyanskyi district, Kyiv, 03056</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Petro Mohyla Black Sea National University</institution>
          ,
          <addr-line>St. 68 Desantnykiv 10, Mykolaiv, 54000</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>The article investigates an approach to solving forecasting problems based on a combination of forecast solutions. A structural diagram of a forecasting approach using a combination of forecasts is proposed, and an information system architecture is developed to improve the efficiency of forecasting based on combined forecasts. The task of forecasting electricity demand in Ukraine is considered as an example: a time series reflecting electricity demand in the period from 2019 to 2024 was studied. The scheme uses as basic methods time series forecasting methods based on machine learning, namely: a generalized additive model, an exponential smoothing model, an ARIMA model, and a neural network autoregression model. For each method, several models were built and their accuracy was evaluated on the training and test samples; the optimal model of each type was then selected, yielding four independent models. Several methods of combining forecasts were considered; seven of them were applied to obtain combined forecasts from the forecasts of the individual models. The combination methods demonstrated an improvement in forecast accuracy compared to the best individual models, with the simple averaging method achieving the highest accuracy. The proposed approach is effective in solving machine learning problems.</p>
      </abstract>
      <kwd-group>
        <kwd>combining forecasts</kwd>
        <kwd>forecasting</kwd>
        <kwd>electricity demand in Ukraine</kwd>
        <kwd>machine learning</kwd>
        <kwd>ARIMA</kwd>
        <kwd>GAM</kwd>
        <kwd>ETS</kwd>
        <kwd>NNAR</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>Today, forecasting is a powerful tool for resource planning and demand management. Companies
such as Amazon, Uber, Airbnb, and many others use forecasting to predict future economic
indicators, identify hidden trends in data, and develop strategies for future activities.</p>
      <p>Recently, machine learning methods have been widely applied to forecasting problems. Forecasting
based on machine learning makes it possible to comprehensively account for the features of the
dynamic processes reflected in the time series on which the forecast is built. Time series forecasting
is important in many fields, such as medicine, economics, industry, and energy. The problem is
complicated by the presence of nonlinearity and non-stationarity in real data, as well as by various
types of uncertainty: statistical, structural, and parametric. The problem is addressed through a systematic
approach to modeling and forecasting processes. Important features of the systematic approach are
the comprehensive consideration of features and uncertainties at each stage of solving the problem [
        <xref ref-type="bibr" rid="ref1 ref2">1,
2</xref>
        ]. However, such an approach, although methodologically sound, does not
always provide the required forecast accuracy. Various methods and approaches are used to
increase the accuracy of forecast values [
        <xref ref-type="bibr" rid="ref2 ref3">2, 3</xref>
        ]. One of the effective approaches to improving the
quality of forecasts on time series is the use of methods for combining forecast values.
      </p>
      <p>
        Machine learning methods commonly used in time series forecasting, such as the
generalized additive model, the exponential smoothing model, the neural network autoregression
model, and the classical ARIMA model, make it possible to obtain fairly accurate forecasts that
account for different types of trends, seasonal patterns, external disturbances, etc. The classical
approach is to fit several forecast models on the training sample, check their accuracy on the test
sample, and then select a high-quality model [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. However, different models can give quite
different predictions because each reflects only some features of the real process. Combining
predictions obtained from different models makes it possible to capture more features of the
process and thus improve the accuracy of the resulting forecast.
      </p>
      <p>
        The authors of the article [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] noted the effectiveness of combined models compared to the
approach of selecting the best individual model. The article describes experiments with 3003 data
sets, including data taken annually, quarterly, monthly, daily, and others, to test the hypothesis of
greater efficiency of combining forecasts compared to choosing the best individual model. The
following forecasting methods were considered: various variants of exponential smoothing (simple
exponential smoothing, Holt method, damped trend method), ARIMA models, neural network
autoregression models. Only simple averaging was used to combine forecasts, but all possible
combinations of forecasts were considered. The accuracy of forecast solutions was assessed using
sMAPE.
      </p>
      <p>
        Various forecast combination methods involve obtaining one combined forecast from a set of
several individual forecasts, which, according to many studies, turns out to be more accurate than
the best individual forecasts [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. That paper presents a general overview of scientific work on
forecast combination methods. Although the various proposed combination approaches should, in
theory, significantly increase the accuracy of the combined forecast, empirical results are
ambiguous: they often show that simple averaging is the most effective method, and there is no
clear answer as to when more complex models are preferable to simple approaches. As a rule, when
several forecast models are built for the same time series, based on different methods or on one
method with different parameters, the best of them is chosen. This traditional approach rests on
the assumption that a single best method exists and can be found [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ].
      </p>
      <p>
        There are various approaches to forecast combination that demonstrate good results in
improving forecast accuracy. In the article [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], forecasts of 10 separate models were studied,
including a naive model, a moving average model, several exponential smoothing models and a
linear regression model, combined using the minimum variance method. Forecasts were performed
for many different time series and with different forecast horizons. As a result, it was shown that
the combined forecasts were more accurate than the forecasts of individual models in most cases,
except for large forecast horizons.
      </p>
      <p>
        In the works [
        <xref ref-type="bibr" rid="ref8 ref9">8, 9</xref>
        ] it is proposed to compare the effectiveness of models on test
data that were not used to estimate the model parameters, so that the evaluation reflects how the
forecast model performs on new data. For evaluation, the model errors, i.e., the deviations of the
predicted values from the real ones, are aggregated into forecast error metrics: mean absolute
error (MAE), root mean square error (RMSE), mean absolute percentage error (MAPE), etc. The
model that demonstrates the smallest error values on the test data is considered optimal. This
approach makes it possible to choose a model, but the selected model may not remain the best and
may demonstrate worse results on other data [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. Choosing only one model from a set of successful models can lead to the loss of
valuable information present in alternative models.
      </p>
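      <p>The error metrics above can be sketched as follows (an illustrative Python sketch with hypothetical data; not code from the article):</p>

```python
import numpy as np

def mae(y_true, y_pred):
    # Mean absolute error
    return np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred)))

def rmse(y_true, y_pred):
    # Root mean square error
    return np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

def mape(y_true, y_pred):
    # Mean absolute percentage error (assumes y_true contains no zeros)
    y_true = np.asarray(y_true, dtype=float)
    return 100 * np.mean(np.abs((y_true - np.asarray(y_pred)) / y_true))

actual = [100, 110, 120]      # hypothetical test observations
forecast = [98, 113, 118]     # hypothetical model forecasts
print(mae(actual, forecast), rmse(actual, forecast), mape(actual, forecast))
```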
      <p>
        The effectiveness of combining forecasts from different models, as opposed to choosing the best
individual model, is demonstrated in [
        <xref ref-type="bibr" rid="ref11 ref12">11, 12</xref>
        ]. The paper [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ] compared three separate forecast
models (an artificial neural network model, an ARIMA model, and an exponential smoothing
model) and three approaches to combining forecasts (simple averaging, minimum variance, and
linear regression) on 500 simulated time series of 200 observations each. The MAE, MAPE, RMSE,
and Theil's U coefficient were used to evaluate the forecasts. Among the individual models, the
artificial neural network model turned out to be the most effective, but it can be inferior to the
simple averaging and minimum variance combinations and is significantly outperformed by the
regression-based combination.
      </p>
      <p>
        In [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ], the effectiveness of combined forecasts was likewise demonstrated in comparison with
individual forecasting models: a support vector regression model, an ARIMA model, an exponential
smoothing model with multiplicative seasonality (the Winters method), and naive models. All
methods were applied to forecast the number of tourist trips, with various purposes, to the United
Kingdom. The forecasts were combined using simple averaging, minimum variance, and the
discounted mean square error approach.
      </p>
      <p>
        In [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ], forecasts of various economic indicators for several countries were produced; comparing
simple combinations (averaging or the median of forecasts) with more complex schemes
(discounted mean square error, weights computed from the AIC and BIC criteria, a regression
model with coefficients determined by the least squares method, etc.) did not favor
the latter.
      </p>
      <p>Problem statement. To investigate the features of the forecast combination process. To
develop an approach for solving time series forecasting problems based on combined forecasts. To
experimentally confirm the effectiveness of forecast combination methods for solving machine
learning problems.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Modeling and forecasting</title>
      <sec id="sec-2-1">
        <title>2.1. Stages of solving forecasting problems</title>
        <p>
          To solve time series forecasting problems based on machine learning methods, a generalized
structural diagram of the sequence of stages [
          <xref ref-type="bibr" rid="ref14 ref15">14, 15</xref>
          ] was developed, which is
presented in Figure 1.
        </p>
        <p>The scheme is presented as a sequence of the following stages: data collection, analysis and
preliminary data preparation, modeling (or training models on data), forecasting and
determining the quality of forecasts, improving the efficiency of models. At the first stage, the
input data set is collected and preliminary analyzed. At the same time, procedures for analyzing
the data structure, analyzing individual features, visual analysis of data, and possible recoding
of individual features are carried out. The result of the first stage is a data set ready for further
processing.</p>
        <p>The second stage is designed to eliminate statistical uncertainty in the data. At this stage,
missing values for individual features and observations are identified and processed; anomalous
values and noise are identified; data correlation analysis (autocorrelation level) is performed; types
of nonlinearity and non-stationarity of the data are identified and determined (if possible, actions
are taken to eliminate them); the data set is analyzed for heteroscedasticity and integration.</p>
        <p>The modeling stage (third stage) is designed to consistently eliminate structural and parametric
uncertainties. It is implemented in the following steps: dividing the prepared data set into
samples for training and testing; selecting the structure of the appropriate predictive model;
estimating the model parameters and training the model; and checking the adequacy of the model.</p>
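        <p>For time series, the split in the first step must be chronological rather than random; a minimal sketch (the 80/20 split fraction is an assumption, not taken from the article):</p>

```python
def train_test_split_ts(series, test_fraction=0.2):
    # Chronological split: the test sample is the final segment of the series,
    # so the model is always evaluated on data that come after the training data.
    n_test = int(len(series) * test_fraction)
    return series[:-n_test], series[-n_test:]

data = list(range(100))        # stand-in for a time series
train, test = train_test_split_ts(data)
print(len(train), len(test))   # 80 20
```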
        <p>The fourth stage is the stage of building forecasts based on the selected models and
assessing their quality. A system of quality indicators (metrics) is used for this purpose.</p>
        <p>The fifth stage is designed to improve the efficiency of basic forecasting models. The following
approaches are used for this: changing the model structure, selecting model specifications, refining
the model topology, using additional algorithms (ensemble approaches), and using various methods
of combining forecast values. Thus, the use of combination methods helps to improve the quality of
forecasts.</p>
        <p>A necessary part of the generalized scheme for solving the forecasting problem based on
machine learning methods is visualization. Visualization helps to adjust the sequence of actions at
each stage and quickly identify possible shortcomings. Depending on the specifics of the subject
area and the data set, it is possible to re-examine any of the previous stages.</p>
      </sec>
      <sec id="sec-2-2">
        <title>2.2. An approach to forecasting based on combining forecasts</title>
        <p>
          The approach to modeling and forecasting based on combination methods, developed on
the basis of a systematic approach to the modeling process, includes the following basic
computational procedures: a procedure for analyzing and pre-processing the data set; a procedure
for dividing the data set into separate samples (for training, validation, and testing); a modeling
and forecasting procedure; a procedure for assessing the quality of modeling and forecasting
results; and a procedure for combining forecasts using different methods [
          <xref ref-type="bibr" rid="ref16 ref17">16, 17</xref>
          ]. The result of these
procedures is a forecast of improved accuracy for the desired horizon, which is set by the analyst
or decision maker. The structural diagram of the approach to forecasting based on combining
forecasts is presented in Figure 2.
        </p>
        <p>It is important to emphasize that the quality of the results obtained after each
computational procedure is reflected in the final result. Therefore, statistical tests are added to each
procedure to check for the presence of the corresponding properties. The result of the modeling and
forecasting procedure is several forecast models (each of a different type) that have the best
quality metrics for their model type. Quality assessments are performed both for individual
forecasts and for each type of combined forecast.</p>
        <p>If no increase in forecast accuracy is detected when combining forecasts, it is necessary to
return to the stage of forming the basic models, or to change their number and the method of combination.</p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>3. Experimental part</title>
      <sec id="sec-3-1">
        <title>3.1. Data pre-processing</title>
        <p>
          The dataset for the combined forecasting study reflects electricity demand in Ukraine
from 2019 to 2024 [
          <xref ref-type="bibr" rid="ref18 ref19">18, 19</xref>
          ]. It contains hourly data on electricity purchase and sale volumes, as well
as demand for it in MWh in the Ukrainian electricity grid and the price of electricity starting from
July 1, 2019. The dataset has 69,385 observations, each of which is characterized by 10 variables.
Among them are the date as a string and the hour represented by an integer, which were converted
to the time data format.
        </p>
        <p>Checking the dataset for missing values revealed five gaps. This number of gaps does not
significantly affect data quality; therefore, the LOCF (last observation carried forward) strategy
was used to fill in the missing values in the time series, replacing each missing value with the
last preceding non-missing value.</p>
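        <p>A minimal sketch of the LOCF fill with pandas, on hypothetical demand values (not the article's implementation):</p>

```python
import numpy as np
import pandas as pd

demand = pd.Series([15.2, np.nan, 14.8, np.nan, np.nan, 16.1])
filled = demand.ffill()   # LOCF: each gap takes the last preceding observed value
print(filled.tolist())    # [15.2, 15.2, 14.8, 14.8, 14.8, 16.1]
```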
        <p>The analysis of the time series shows that the load on the Ukrainian power system has changed.
Since June 1, 2022, the system has been loaded evenly, therefore, to build forecast models, we
separate a part of the time series that reflects electricity demand from June 2022 to October 2024
(Fig. 3).</p>
        <p>
          After visualization of the time series, its statistical characteristics were analyzed. The
decomposition of the time series was performed and a noticeable seasonality with different periods
was revealed: annual, weekly, daily. The aggregation of the time series by dates was performed in
order to reduce the volume of analyzed data, as well as from the point of view of the feasibility of
performing the forecast for a certain number of days ahead, rather than hourly [
          <xref ref-type="bibr" rid="ref20 ref21">20, 21</xref>
          ]. Noticeable
autocorrelation was detected, and the required number of differencing operations was
determined. The results of the preliminary analysis confirmed the presence of nonlinearity and
non-stationarity in the process under study.
        </p>
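        <p>The date aggregation step can be sketched with pandas resampling (an illustrative sketch; the column name and values are assumptions):</p>

```python
import numpy as np
import pandas as pd

# Hypothetical hourly demand frame; "demand_mwh" is an assumed column name
idx = pd.date_range("2022-06-01", periods=48, freq="h")
hourly = pd.DataFrame({"demand_mwh": np.full(48, 100.0)}, index=idx)

# Aggregate hourly observations into daily totals
daily = hourly["demand_mwh"].resample("D").sum()
print(daily.tolist())   # [2400.0, 2400.0]
```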
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Selection of basic forecasting models</title>
        <p>Generalized additive model. The first among the basic predictive models to describe the process
under study is the generalized additive model (GAM). Five alternative models were considered to
select the best model parameters. Analysis of the forecast quality metrics on the test sample
showed that the best quality was achieved by model No. 2, with a larger number of trend knots (Table 1).</p>
        <p>Figure 4 shows the forecasting results based on the best GAM model against the test data.</p>
        <p>
          Exponential smoothing model. The exponential smoothing model was chosen as the next
basic forecast model. Taking into account additional components of the time series (trend,
seasonality) in the model structure made it possible to consider four alternative types of
exponential smoothing models. Table 2 summarizes the results of forecast quality assessments for
different ETS models. The best forecast quality assessments were received by model No. 4, the
Holt-Winters model with additive errors.</p>
        <p>Neural network autoregression model. Varying the parameters and
seasonal components of the model, we obtain four alternative neural network
autoregression models. The resulting forecasts for these models are the average of the forecast
values of several models. Increasing the parameter values leads to a significant increase in the
model training time. Table 4 presents the results of the quality assessments of the forecasts
obtained on the test data. The best values for all quality metrics were demonstrated by model No. 4
(NNAR(100,100,k)[7], Max_it=20000).
        </p>
      </sec>
      <sec id="sec-3-3">
        <title>3.3. Combining forecasts</title>
        <p>
          Based on the analysis of the forecast values obtained using the best baseline models (Fig. 8), it is
obvious that no model takes into account all the features of the dynamic process under study.
Therefore, to increase the accuracy of the forecasts, the approach of combining forecast values was
used [
          <xref ref-type="bibr" rid="ref22">22-24</xref>
          ].
        </p>
        <p>Seven different forecast combination methods were selected and implemented in the work: the
simple averaging method; the median method; the minimum variance method; the method based
on a regression model with coefficients fitted by the least squares method; the method based on
a regression model with coefficients fitted by the least absolute deviation method; the inverse
rank method; and the combination of multiple regression models [25-27].</p>
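        <p>A few of these combination rules can be sketched as follows (illustrative implementations; the inverse rank weighting shown, with weights proportional to the reciprocal of each model's error rank, is one common formulation and an assumption here):</p>

```python
import numpy as np

def combine_average(F):
    # F: (n_models, horizon) array of individual model forecasts
    return F.mean(axis=0)

def combine_median(F):
    return np.median(F, axis=0)

def combine_inverse_rank(F, errors):
    # Weight each model by the inverse of its error rank:
    # rank 1 (smallest error) gets the largest weight.
    ranks = np.argsort(np.argsort(errors)) + 1
    w = (1.0 / ranks) / (1.0 / ranks).sum()
    return w @ F

F = np.array([[10.0, 12.0],   # forecasts of model 1
              [11.0, 13.0],   # forecasts of model 2
              [12.0, 14.0]])  # forecasts of model 3
errors = np.array([1.0, 2.0, 3.0])   # e.g. RMSE of each model on the test sample

print(combine_average(F))            # [11. 13.]
print(combine_median(F))             # [11. 13.]
print(combine_inverse_rank(F, errors))
```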
        <p>Table 5 presents a comparison of the indicators for three forecast quality metrics for all forecast
combination methods. The best quality values were shown by models based on methods No. 1 and
No. 6, which are the simple averaging method and the inverse rank method. The best quality model
based on the simple averaging method is presented in Figure 9 against the background of forecasts
obtained using the best basic forecast models.</p>
        <p>The presented approach to improving forecast accuracy based on combining forecast values
from the best basic forecast models demonstrates an increase in forecasting efficiency in machine
learning tasks.</p>
      </sec>
    </sec>
    <sec id="sec-conclusions">
      <title>4. Conclusions</title>
      <p>The article considered an approach to improving the accuracy of time series forecasting by using
forecast combination methods to solve machine learning problems. The problem of forecasting
electricity demand in the Ukrainian power grid was considered as a machine learning problem. A
structural diagram of the forecasting approach using a combination of forecasts was proposed. In
the developed approach, the basic methods of time series forecasting based on machine learning
were used, namely: a generalized additive model, an exponential smoothing model, an ARIMA
model, and a neural network autoregression model. For each method, several alternative models
with different parameters were constructed and their accuracy was evaluated on the training and
test samples; as a result, the optimal model was selected for each model type.</p>
      <p>Seven different methods of combining forecast values were considered. To solve the forecasting
problem, forecast combination methods were used to obtain combined forecasts based on the
forecasts of the best models. Two of the considered combination methods (simple averaging and
the inverse rank method) demonstrated improved forecast accuracy compared to the best baseline
models across all quality metrics. Among the combination methods, the simple averaging method
has the highest accuracy. The proposed approach is effective for obtaining point forecasts on time
series.</p>
    </sec>
    <sec id="sec-4">
      <title>Declaration on Generative AI</title>
      <p>The authors did not use any generative AI tools.</p>
      <p>[23] P. Bidyuk, A. Gozhyj, I. Kalinina, V. Vysotska, Methods for forecasting nonlinear
nonstationary processes in machine learning, in: S. Babichev, D. Peleshko, O. Vynokurova (Eds.),
Data Stream Mining &amp; Processing, vol. 1158 of Communications in Computer and Information
Science, Springer, Cham, 2020, pp. 470–485. doi:10.1007/978-3-030-61656-4_32.</p>
      <p>[24] I. Pikh, V. Senkivskyy, A. Kudriashova, N. Senkivska, Prognostic assessment of COVID-19
vaccination levels, in: Intelligent Systems of Decision Making and Computational Intelligence,
vol. 149, Springer International Publishing, Cham, 2022, pp. 246–265. doi:10.1007/978-3-031-16203-9_15.</p>
      <p>[25] A. Chiche, Structure of a hybrid decision support system for crop growing income forecasting
and recommendations, International Journal of Computing 18(2) (2019) 181–190. doi:10.47839/ijc.18.2.1416.</p>
      <p>[26] G. Lipyanina, V. Maksimovich, A. Sachenko, T. Lendyuk, A. Fomenko, I. Kit, Investment risk
assessment of a virtual IT company based on machine learning, in: S. Babichev, D. Peleshko,
O. Vinokurova (Eds.), Data Stream Analysis and Processing, vol. 1158 of Communications in
Computer and Information Science, Springer, Cham, 2020. doi:10.1007/978-3-030-61656-4_11.</p>
      <p>[27] S. Bhatia, M. Sharma, K.K. Bhatia, P. Das, Target for opinion extraction with sentiment
analysis, International Journal of Computing 17(3) (2018) 136–142. doi:10.47839/ijc.17.3.1033.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>A.</given-names>
            <surname>Nielsen</surname>
          </string-name>
          ,
          <source>Practical Time Series Analysis. Prediction with Statistics and Machine Learning. O'Reilly Media</source>
          , Inc. (
          <year>2019</year>
          ), 504 p.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>V.</given-names>
            <surname>Lakshmanan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Robinson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Munn</surname>
          </string-name>
          , Machine Learning Design Patterns:
          <article-title>Solutions to Common Challenges in Data Preparation, Model Building, and</article-title>
          <string-name>
            <surname>MLOps 1st Edition. O'Reilly Media</surname>
          </string-name>
          , Inc. (
          <year>2020</year>
          ), 448 p.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>B.</given-names>
            <surname>Lantz</surname>
          </string-name>
          ,
          <article-title>Machine Learning with R. Expert techniques for predictive modeling</article-title>
          ,
          <source>3rd Edition</source>
          , Packt Publishin, (
          <year>2019</year>
          ), 458 p.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>R.</given-names>
            <surname>Hyndman</surname>
          </string-name>
          , G. Athanasopoulos,
          <source>Forecasting: Principles and Practice</source>
          , 3rd. ed.,
          <source>OTexts</source>
          , Melbourne, Australia, (
          <year>2021</year>
          ), 442 p.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>M.</given-names>
            <surname>Hibon</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Evgeniou</surname>
          </string-name>
          .
          <article-title>To combine or not to combine: Selecting among forecasts and their combinations</article-title>
          ,
          <source>Int. J. Forecast</source>
          .
          <volume>21</volume>
          (
          <year>2005</year>
          )
          <fpage>15</fpage>
          -
          <lpage>24</lpage>
          . doi:
          <volume>10</volume>
          .1016/j.ijforecast.
          <year>2004</year>
          .
          <volume>05</volume>
          .002.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>X.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Hyndman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Kang</surname>
          </string-name>
          ,
          <article-title>Forecast combinations: An over 50-year review</article-title>
          ,
          <source>Int. J. Forecast</source>
          .
          <volume>39</volume>
          (
          <year>2022</year>
          ). doi:
          <volume>10</volume>
          .1016/j.ijforecast.
          <year>2022</year>
          .
          <volume>11</volume>
          .005.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>R.</given-names>
            <surname>Winkler</surname>
          </string-name>
          ,
          <string-name>
            <surname>S. Makridakis,</surname>
          </string-name>
          <article-title>The combination of forecasts</article-title>
          ,
          <source>J. R. Stat. Soc. Ser. A</source>
          .
          <volume>146</volume>
          (
          <year>1983</year>
          )
          <fpage>150</fpage>
          -
          <lpage>157</lpage>
          . doi:
          <volume>10</volume>
          .2307/2982011.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>R. J.</given-names>
            <surname>Hyndman</surname>
          </string-name>
          , G. Athanasopoulos,
          <source>Forecasting: Principles and Practice</source>
          , 2nd. ed.,
          <string-name>
            <surname>Melbourne</surname>
          </string-name>
          , Australia: OTexts. (
          <year>2018</year>
          ). URL: https://otexts.com/fpp2/arima.html.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>C.</given-names>
            <surname>Bergmeir</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. J.</given-names>
            <surname>Hyndman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Koo</surname>
          </string-name>
          ,
          <article-title>A note on the validity of cross-validation for evaluating autoregressive time series prediction</article-title>
          .
          <source>Computational Statistics &amp; Data Analysis</source>
          , Vol.
          <volume>120</volume>
          (
          <year>2018</year>
          )
          <fpage>70</fpage>
          -
          <lpage>83</lpage>
          . doi: 10.1016/j.csda.2017.11.003.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>A.</given-names>
            <surname>Mancuso</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Werner</surname>
          </string-name>
          ,
          <article-title>A comparative study on combinations of forecasts and their individual forecasts by means of simulated series</article-title>
          ,
          <source>Acta Sci. Technol</source>
          .
          <volume>41</volume>
          (
          <year>2019</year>
          ). doi: 10.4025/actascitechnol.v41i1.41452.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>S.</given-names>
            <surname>Cang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Yu</surname>
          </string-name>
          ,
          <article-title>A combination selection algorithm on forecasting</article-title>
          ,
          <source>Eur. J. Oper. Res</source>
          .
          <volume>234</volume>
          (
          <year>2014</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>D.</given-names>
            <surname>Soule</surname>
          </string-name>
          ,
          <article-title>Forecast combination with multiple models and expert correlations</article-title>
          ,
          <source>VCU Scholars Compass</source>
          ,
          <year>2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>J.</given-names>
            <surname>Stock</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Watson</surname>
          </string-name>
          ,
          <article-title>Combination forecasts of output growth in a seven-country data set</article-title>
          ,
          <source>J. Forecast</source>
          .
          <volume>23</volume>
          (
          <year>2004</year>
          )
          <fpage>405</fpage>
          -
          <lpage>430</lpage>
          . doi: 10.1002/for.928.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>R. de O. V.</given-names>
            <surname>dos Santos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. F.</given-names>
            <surname>Araujo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. M. S.</given-names>
            <surname>Accioly</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F. L. C.</given-names>
            <surname>Oliveira</surname>
          </string-name>
          ,
          <article-title>Horizon-optimized weights for forecast combination with cross-learning</article-title>
          ,
          <source>Pesqui. Oper</source>
          .
          <volume>41</volume>
          (
          <year>2021</year>
          ). doi: 10.1590/0101-7438.2021.041.00245564.
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>T.</given-names>
            <surname>Pike</surname>
          </string-name>
          ,
          <article-title>Combining forecasts: Can machines beat the average?</article-title>
          ,
          <source>Federal Reserve Board</source>
          ,
          <year>2020</year>
          . URL: https://ssrn.com/abstract=3691117.
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>X.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Kang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <article-title>Another look at forecast trimming for combinations: robustness, accuracy and diversity</article-title>
          , arXiv preprint (
          <year>2022</year>
          ). doi: 10.48550/arXiv.2208.00139.
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>M.</given-names>
            <surname>Scholz</surname>
          </string-name>
          ,
          <article-title>Forecast combinations for benchmarks of long-term stock returns using machine learning methods</article-title>
          ,
          <source>Ann. Oper. Res</source>
          . (
          <year>2022</year>
          ). doi: 10.1007/s10479-022-04880-4.
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <source>Energy Map</source>
          . URL: https://map.ua-energy.org/uk/resources/5a616fba-fbc9-4073-9532-9161592faca8/.
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <source>Market Operator</source>
          . URL: https://www.oree.com.ua/.
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <given-names>A.</given-names>
            <surname>Gozhyj</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Nechakhin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Kalinina</surname>
          </string-name>
          ,
          <article-title>Solar power control system based on machine learning methods</article-title>
          ,
          in:
          <source>Proc. 2020 IEEE 15th Int. Conf. Comput. Sci. Inf. Technol. (CSIT)</source>
          ,
          <year>2020</year>
          , pp.
          <fpage>23</fpage>
          -
          <lpage>26</lpage>
          . doi: 10.1109/CSIT49958.2020.9321953.
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <given-names>L.</given-names>
            <surname>Chyrun</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Kravets</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Garasym</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Gozhyj</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Kalinina</surname>
          </string-name>
          ,
          <article-title>Cryptographic information protection algorithm selection optimization for electronic governance IT project management by the analytic hierarchy process based on nonlinear conclusion criteria</article-title>
          ,
          <source>CEUR Workshop Proc</source>
          .
          <volume>2565</volume>
          (
          <year>2020</year>
          ). URL: http://ceur-ws.org/Vol-2565/paper18.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <given-names>P.</given-names>
            <surname>Bidyuk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Gozhyj</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Szymanski</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Kalinina</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Beglytsia</surname>
          </string-name>
          ,
          <article-title>The methods Bayesian analysis of the threshold stochastic volatility model</article-title>
          ,
          <source>in: Proc. 2018 IEEE 2nd Int. Conf. Data Stream Mining Process. (DSMP)</source>
          ,
          <year>2018</year>
          . doi: 10.1109/DSMP.2018.8478474.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>