<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title/>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <article-id pub-id-type="doi">10.1109/SAIC.2018.8516750</article-id>
      <title-group>
        <article-title>Robust segmented regression with heteroscedasticity based on moving triangle</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Valeriyi Kuzmin</string-name>
          <email>valeriyikuzmin@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Maksym Zaliskyi</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Onyedikachi Chioma Okoro</string-name>
          <email>okorokachi7@gmail.com</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>State University “Kyiv Aviation Institute”</institution>
          ,
          <addr-line>Liubomyra Huzara Ave., 1, Kyiv, 03058</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Vamooose Technologies</institution>
          ,
          <addr-line>Calgary</addr-line>
          ,
          <country country="CA">Canada</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2018</year>
      </pub-date>
      <volume>992</volume>
      <fpage>1</fpage>
      <lpage>4</lpage>
      <abstract>
<p>Robust regression reduces the influence of individual values of the sample population during mathematical model building. This paper considers the problem of choosing the best segmented regression model for a specific example of econometric data and analyzes its advantages through comparison with alternative approximations by higher-order polynomials. The model building consists of several steps. At the first step, the abscissas of the switching points were optimized using the sliding setsquare method. Since the data are heteroscedastic, a new procedure of sliding triangles was proposed for its qualitative identification. This method can be considered a generalization of the moving average, which is widely used in the analysis of time series. In general, the sliding triangle procedure makes it possible to construct the heteroscedasticity equation quite reasonably. The heteroscedasticity equation was used to calculate the confidence interval of the data variation relative to the final best regression. Thus, the proposed methodology for constructing mathematical models taking heteroscedasticity into account is robust in the sense of reducing the influence of samples with large values.</p>
      </abstract>
      <kwd-group>
        <kwd>mathematical model building</kwd>
        <kwd>ordinary least squares</kwd>
        <kwd>minimization of absolute deviations</kwd>
        <kwd>minimization of the range of the cumulative residual curve</kwd>
        <kwd>robust regression</kwd>
        <kwd>moving average</kwd>
        <kwd>outlier correction</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        The building of mathematical models for detecting regularities between natural phenomena,
various processes and parameters is widely used today in different industries [
        <xref ref-type="bibr" rid="ref1">1, 2, 3</xref>
        ]. An accurate
mathematical model is the basis for the development of new technologies [4, 5], optimization of technological
processes [6, 7], forecasting of possible events and phenomena [8, 9], support of decision-making
regarding control and corrective actions [10, 11], and others.
      </p>
      <p>When building mathematical models, many factors are taken into account. On the one hand, the
model should not be too simple, since in such cases it may have unsatisfactory accuracy [12, 13]. On the
other hand, significant complication of the model may lead to overfitting and unsatisfactory forecasting results
[14, 15]. Therefore, it is necessary to find a trade-off between the accuracy of the model in the range of
observed values and its forecasting properties.</p>
      <p>One of the main statistical tools for building mathematical models is regression
analysis [16]. Well-known and widely used algorithms are least squares regression, lasso, ridge, least
absolute deviation regression, and others [17, 18]. In these cases, a single approximating function is
usually used.</p>
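      <p>As a minimal illustration of the least squares approach named above (with made-up data, not the paper's Table 1), a straight-line OLS fit and its residual standard deviation can be sketched as:</p>

```python
import numpy as np

# Illustrative data (not taken from the paper)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# OLS for the single approximating function y ≈ b0 + b1*x
A = np.column_stack([np.ones_like(x), x])
b0, b1 = np.linalg.lstsq(A, y, rcond=None)[0]

# Residual standard deviation, averaged over the whole range
residuals = y - (b0 + b1 * x)
sd = np.sqrt(np.sum(residuals**2) / (len(x) - 2))
print(round(b0, 3), round(b1, 3))
```

      <p>The lasso and ridge variants mentioned above differ only in adding a penalty on the coefficients to this least squares objective.</p>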
      <p>When approximating with a single general function over the entire range of data variation, the
standard deviation is determined by averaging over that range. In this case, the assumption is made that the
standard deviation is constant over the whole dataset. Very often in econometric problems, data are described by
non-stationary random processes, so assuming a constant standard deviation is incorrect. Usually,
such processes in econometrics have the property of heteroscedasticity [19, 20, 21].</p>
    </sec>
    <sec id="sec-2">
      <title>2. Literature review and problem statement</title>
      <p>
        Regression models make it possible to establish a correlation between an outcome variable and one or
more explanatory variables [22, 23]. Often, these models are used to solve forecasting problems, detect
trends (including changes in the geometric structure of data), assess the information content of the
influence of explanatory variables on the outcome variable, and in some cases, to solve classification
problems [
        <xref ref-type="bibr" rid="ref2 ref3 ref4">24, 25, 26</xref>
        ]. Regression models are an integral part of machine learning and artificial
intelligence systems [
        <xref ref-type="bibr" rid="ref5 ref6">27, 28</xref>
        ].
      </p>
      <p>
        The widespread use of regression models is explained by their main advantages, among which are
simplicity, interpretability, the possibility of using various approximating functions, and ease of
implementation in software [
        <xref ref-type="bibr" rid="ref7 ref8">29, 30</xref>
        ].
      </p>
      <p>
        A new direction in the field of regression analysis today is robust regression [
        <xref ref-type="bibr" rid="ref10 ref9">31, 32</xref>
        ]. This approach
to building regression models is motivated by the possible presence of highly noisy values, outliers,
and non-stationarities in the analyzed datasets [
        <xref ref-type="bibr" rid="ref11 ref12">33, 34</xref>
        ]. In addition, robust regression makes it possible
to build models under non-Gaussian errors and in the presence of heteroscedasticity.
      </p>
      <p>
        Heteroscedasticity occurs when observations corresponding to different values of the explanatory variables have different standard deviations
[
        <xref ref-type="bibr" rid="ref13 ref14">35, 36</xref>
        ]. Failure to take heteroscedasticity into account may result in unsatisfactory forecasting
due to an erroneous decision about the significance of a particular explanatory variable. The main
methods for building mathematical models under heteroscedasticity are the weighted least
squares method, variable transformation (including taking the logarithm of the outcome variable), and
other alternative methods of regression analysis.
      </p>
      <p>
        In the case of heteroscedasticity, the first task is to establish its presence. Today, the literature
provides a large number of tests, including the Goldfeld-Quandt test [
        <xref ref-type="bibr" rid="ref15">37</xref>
        ], the Breusch-Pagan test [
        <xref ref-type="bibr" rid="ref16">38</xref>
        ], the White
test [
        <xref ref-type="bibr" rid="ref17">39</xref>
        ], and others. In the paper [
        <xref ref-type="bibr" rid="ref18">40</xref>
        ], a numerical measure of heteroscedasticity was also proposed.
The next step after detection is the calculation of weighting coefficients, which adjust
the regression model.
      </p>
      <p>The aim of this paper is to justify the use of a robust segmented model under heteroscedasticity,
as well as to compare it with classical regression models based on the ordinary least squares method. To
achieve this aim, the paper solves three specific objectives: a) justifying the method for optimizing
the switching points of the regression model segments, b) developing the moving triangle method
as a generalization of the moving average method, c) presenting the new methodology on a
specific numerical example.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Materials and methods</title>
      <sec id="sec-3-1">
        <title>3.1. Sliding setsquare method</title>
        <p>Consider the methodology on a specific numerical example. The initial data for the analysis are given in
Table 1.</p>
        <p>
          In the book [
          <xref ref-type="bibr" rid="ref19">41</xref>
          ] two absolutely different processes (one decreasing and one increasing) were
approximated by a single straight line. This approach is too simplified and gives an overestimated approximation
error.
        </p>
        <p>As a result of visual data analysis, an assumption was made about rationally dividing the data into
four segments. Therefore, approximate values were adopted for the abscissas of three switching points.</p>
        <p>To accurately determine the abscissas of the switching points, a heuristic approach was used. In this
case, the dataset was divided into sections, on each of which only one switching point was optimized.
This method can be considered as a sliding setsquare method, which sequentially moves along the
entire range of data.</p>
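        <p>The sliding setsquare step can be sketched as a scan over candidate switching points: for each candidate, a two-segment linear model with the hinge basis (x − x1)+ is fitted by OLS and the residual standard deviation is recorded. The data and helper names below are ours, for illustration only:</p>

```python
import numpy as np

def fit_two_segment(x, y, x1):
    """OLS fit of y ≈ a0 + a1*x + a2*(x − x1)_+ for a fixed switching point x1."""
    hinge = np.maximum(x - x1, 0.0)              # the (x − x1)_+ basis function
    A = np.column_stack([np.ones_like(x), x, hinge])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return coef, np.sqrt(np.mean(resid**2))      # parameters and residual SD

# Synthetic piecewise-linear data with a true break at x = 5
x = np.arange(10, dtype=float)
y = np.where(x < 5, 2.0 - 0.3 * x, 0.5 + 0.4 * (x - 5))

# Scan candidate switching points, keep the one with the smallest SD
candidates = x[2:8]
best_x1 = min(candidates, key=lambda c: fit_two_segment(x, y, c)[1])
print(best_x1)
```

        <p>In the paper the scan is refined further: the SD values over the candidates are themselves approximated by a parabola whose vertex gives the optimized abscissa.</p>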
        <p>Let’s consider the step-by-step procedure for applying the sliding setsquare method.</p>
        <p>Step 1. We approximate 9 points using a two-segment linear regression. To do this, for five
variants of the abscissa of the switching point (located from the third to the seventh point), data
approximations were performed using the ordinary least squares (OLS) method. The equation of the
two-segment regression has the form
y(x) = a0 + a1·x + a2·(x − x1)+, (1)
where (x − x1)+ is the modular (hinge) function and x1 is the abscissa of the switching point for the sliding setsquare. In
this case
(x − x1)+ = ((x − x1) + |x − x1|) / 2. (2)
The results of the calculations of the setsquare parameters are given in Table 2.</p>
        <p>Next, we approximate the data from Table 2 (the dependence of the standard deviation on the abscissa
of the switching point) with a second-order parabola using the OLS method. As a result, we obtain an
equation of the following type:</p>
        <p>σ(x1) = 1.337 − 0.25·x1 + 0.0121·x1².</p>
        <p>The optimum of this parabola is at the point with the abscissa x1 = 10.294.</p>
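        <p>The optimum is the vertex of the fitted parabola, −c1/(2·c2). With the rounded coefficients printed above the vertex lands near 10.33; the paper's value 10.294 corresponds to the unrounded fit:</p>

```python
# Vertex (minimum) of the parabola sd(x1) = c0 + c1*x1 + c2*x1**2,
# using the rounded coefficients printed in the text
c0, c1, c2 = 1.337, -0.25, 0.0121
x1_opt = -c1 / (2.0 * c2)
print(round(x1_opt, 3))
```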
        <p>Step 2. We move the setsquare so that its origin is in the next point after the abscissa of the first
switching point. After that, we approximate the obtained nine points using two-segmented linear
regression. The calculation results are given in Table 3.</p>
        <p>Next, we approximate the data from Table 3 (the dependence of the standard deviation on the abscissa
of the switching point) with a second-order parabola using the OLS method.</p>
        <p>The optimum of this parabola is at the point with the abscissa x2 = 14.754.</p>
        <p>Step 3. We move the setsquare so that its origin is in the next point after the abscissa of the second
switching point. After that, we approximate the obtained nine points using two-segmented linear
regression. The calculation results are given in Table 4.</p>
        <p>Next, we approximate the data from Table 4 (the dependence of the standard deviation on the abscissa
of the switching point) with a second-order parabola using the OLS method. As a result, we obtain an
equation of the following type:</p>
        <p>σ(x3) = −0.196 + 0.0373·x3 − 0.000915·x3².</p>
        <p>The optimum of this parabola is at the point with the abscissa x3 = 20.355.</p>
        <p>Step 4. For the obtained values of the three switching points, we perform data approximation using
a four-segmented regression and the OLS method. As a result, we obtain the equation
y(x) = 1.936 − 0.181·x + 0.287·(x − 10.296)+ + 0.0237·(x − 14.754)+ + 0.0401·(x − 20.355)+. (3)</p>
        <p>A visual representation of the obtained regression is shown in Figure 1.</p>
        <p>Visual analysis (Figure 1) and comparison of the coefficients of the regression model allow making an
assumption about the possibility of combining the second and third segments. Merging the two segments
simplifies the mathematical model. As a result, we obtain a three-segmented regression model
based on OLS of the form:
y(x) = 2.138 − 0.204·x + 0.323·(x − 10.296)+ + 0.0551·(x − 20.355)+. (4)</p>
        <p>A visual representation of the obtained regression is shown in Figure 2.</p>
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Calculation of the heteroscedasticity equation</title>
        <p>Visual analysis of the data does not provide a clear assumption on the heteroscedasticity presence.
Therefore, it is necessary to conduct a more correct statistical analysis, which can be done using a new
approach.</p>
        <p>To assess heteroscedasticity, this paper uses a new approach based on a moving triangle. This method
can be considered as a generalization of the moving average method, which is often used in the analysis
of time series.</p>
        <p>The moving triangle method expands the capabilities of statistical analysis, since it makes it possible
to establish the dependence of the change in the standard deviation on the value of the approximated
variable in a local observation area.</p>
        <p>The heteroscedasticity equation is the dependence of current standard deviations on the corresponding
values of the approximating function.</p>
        <p>The indicator that characterizes the current standard deviation is a segment drawn from the vertex
of the triangle to its opposite side, parallel to the ordinate axis. In this paper, the designated segment
will be called the measure of the triangle.</p>
        <p>Visual analysis of the data and constructed triangles shows that in the case of large values of the
approximating variable, the area and dimensions of the triangle will also be large. Accordingly, the
heteroscedasticity indicator will also be overestimated.</p>
        <p>The principle of constructing triangles is shown in Figure 3. The measure of each triangle characterizes
the current standard deviation σ. In this case, to estimate the current value of the ordinate y, the
ordinates were averaged over the three vertices of the triangle (which is an implementation of the
moving average method). The calculation results are given in Table 5.</p>
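        <p>The averaging over three consecutive vertices can be sketched as a window-of-three pass over the data; the local spread below is only a stand-in for the triangle's measure, whose exact geometric construction follows Figure 3 (synthetic ordinates, our variable names):</p>

```python
import numpy as np

# Synthetic ordinates (not the paper's data)
y = np.array([2.0, 2.4, 1.9, 3.1, 2.8, 3.6, 3.0])

centers, spreads = [], []
for i in range(len(y) - 2):
    window = y[i:i + 3]                          # three consecutive "vertices"
    centers.append(window.mean())                # moving average of the ordinates
    spreads.append(window.max() - window.min())  # local spread estimate

print(round(centers[0], 3), round(spreads[0], 3))
```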
        <p>We approximate the data presented in Table 5 using a linear function and OLS. As a result, we get
an equation of the type:</p>
        <p>σ(y) = 0.0684 + 0.0658·y.</p>
        <p>The equation of heteroscedasticity is shown in Figure 4.</p>
        <p>To calculate the coefficients of heteroscedasticity, we use the formula
w = (M(y) / y(x))², (5)
where M(y) is the expected value of the outcome variable and y(x) is the current value calculated by the best
regression model (4) obtained using OLS. Each point is determined by the abscissa of the
current empirical value, which is substituted into equation (4). Since the weights are calculated
for the OLS method, the obtained fraction is squared.</p>
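        <p>A sketch of how such weights feed a weighted least squares fit (synthetic straight-line data; the sample mean stands in for the expected value in formula (5)):</p>

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.2, 2.1, 2.9, 4.3, 4.8, 6.4])
A = np.column_stack([np.ones_like(x), x])

# Preliminary OLS fit to get the current predicted values
coef_ols, *_ = np.linalg.lstsq(A, y, rcond=None)
yhat = A @ coef_ols

# Weights in the style of (5): large predicted values get small weights
w = (y.mean() / yhat) ** 2

# WLS = OLS applied to rows scaled by sqrt(w)
sw = np.sqrt(w)
coef_wls, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
print(coef_wls.shape)
```

        <p>Scaling the rows by the square roots of the weights is the standard way of turning a weighted problem into an ordinary least squares one.</p>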
        <p>The values of the weight coefficients of heteroscedasticity are given in Table 6.
After that, we can obtain the final regression model taking heteroscedasticity into account:
y(x) = 2.0798 − 0.197·x + 0.3134·(x − 10.296)+ + 0.0663·(x − 20.355)+. (6)</p>
        <p>A visual representation of the obtained regression model (4) and (6) is shown in Figure 5.</p>
        <p>Although visual analysis of the approximation results with and without taking heteroscedasticity into
account does not show a large difference, accounting for heteroscedasticity still
improves the quality of the model.</p>
        <p>For the resulting approximation, a confidence interval was obtained. The calculation of
the boundaries of the variation was based on the heteroscedasticity equation. For each abscissa
value, twice the value of the current standard deviation was used. The result of the construction of the
confidence interval is shown in Figure 6. As can be seen, all points lie inside the confidence range.</p>
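        <p>The band construction can be sketched as follows, using the heteroscedasticity equation obtained above and synthetic observations; a point falls inside the band when its deviation stays within twice the current standard deviation:</p>

```python
import numpy as np

def sigma(yhat):
    # Heteroscedasticity equation from the text: SD grows with the level
    return 0.0684 + 0.0658 * yhat

# Synthetic predictions and noisy observations (not the paper's data)
rng = np.random.default_rng(0)
yhat = np.linspace(1.0, 3.0, 50)
obs = yhat + rng.normal(0.0, 0.5 * sigma(yhat))

# Confidence band: prediction ± 2 * current standard deviation
lower = yhat - 2.0 * sigma(yhat)
upper = yhat + 2.0 * sigma(yhat)
inside = float(np.mean((obs >= lower) & (obs <= upper)))
print(inside)
```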
      </sec>
      <sec id="sec-3-3">
        <title>3.3. Calculation of the polynomial regressions</title>
        <p>Let us consider approximation variants using fourth- and sixth-order polynomials. Using the OLS
method, we obtain the following equations:</p>
        <p>y(x) = 3.667 − 0.808·x + 0.0605·x² − 1.598·10⁻³·x³ + 1.412·10⁻⁵·x⁴. (7)</p>
        <p>A visual representation of the obtained regression models (7) and (8) is shown in Figure 7.</p>
        <p>Although the sixth-order polynomial provides better adequacy within the data variation range, it is
absolutely unsuitable for forecasting purposes. Its unsuitability for forecasting is especially evident in
the area on the left side.</p>
        <p>The paper conducted a comparative analysis of all approximation options according to the standard
deviation (SD) criterion, as well as according to the criterion of the maximum range (MR) of the
cumulative residual curve. The calculation results are given in Table 7.</p>
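        <p>The MR criterion can be sketched in a few lines: accumulate the residuals in observation order and take the range of the running sums (illustrative residuals, our names):</p>

```python
import numpy as np

residuals = np.array([0.2, -0.1, 0.3, -0.4, 0.1, -0.2])

cumulative = np.cumsum(residuals)            # the cumulative residual curve
mr = cumulative.max() - cumulative.min()     # maximum range (MR) criterion
sd = np.sqrt(np.mean(residuals**2))          # SD criterion, for comparison
print(round(float(mr), 3))
```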
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Conclusions</title>
      <p>The main goal of this paper is to construct the best segmented regression model of the studied
econometric data and to substantiate its advantages through comparative analysis with
alternative approximations by higher-order polynomials.</p>
      <p>At the first stage, the value of the abscissas of the switching points was optimized using the sliding
setsquare method. In this case, the use of three segments was substantiated. Since the data are
heteroscedastic, a new procedure of sliding triangles was proposed for its qualitative identification.
This method can be considered as a generalization of the moving average, which is widely used in the
analysis of time series. In general, the sliding triangle procedure makes it possible to quite reasonably
construct the heteroscedasticity equation. The heteroscedasticity equation was used to calculate the
confidence interval (band) of the data variation relative to the final best regression. Thus, the proposed
methodology for constructing mathematical models taking into account heteroscedasticity has the
property of robustness in terms of reducing the influence of samples with large values.</p>
    </sec>
    <sec id="sec-5">
      <title>Declaration on Generative AI</title>
      <p>The author(s) have not employed any Generative AI tools.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[1] D. Montgomery, G. Runger, Applied Statistics and Probability for Engineers, 4th ed., Wiley, New York, USA, 2007.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>[24] G. Snedecor, W. Cochran, Statistical Methods, Iowa State University Press, Iowa, USA, 1989.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[25] I. Ostroumov, V. Ivannikova, N. Kuzmenko, M. Zaliskyi, Impact analysis of russian-ukrainian war on airspace, Journal of Air Transport Management 124 (2025). doi:10.1016/j.jairtraman.2025.102742.</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[26] A. Popov, E. Tserne, V. Volosyuk, S. Zhyla, V. Pavlikov, N. Ruzhentsev, et al., Invariant polarization signatures for recognition of hydrometeors by airborne weather radars, in: O. Gervasi, B. Murgante, D. Taniar, B. O. Apduhan, A. C. Braga, C. Garau, A. Stratigea (Eds.), Computational Science and Its Applications - ICCSA 2023. Lecture Notes in Computer Science, vol. 13956, Springer Nature Switzerland, Cham, 2023, pp. 201-217. doi:10.1007/978-3-031-36805-9_14.</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[27] O. Holubnychyi, M. Zaliskyi, I. Ostroumov, O. Sushchenko, O. Solomentsev, Y. Averyanova, et al., Self-organization technique with a norm transformation based filtering for sustainable infocommunications within CNS/ATM systems, in: I. Ostroumov, M. Zaliskyi (Eds.), Proceedings of the 2nd International Workshop on Advances in Civil Aviation Systems Development. ACASD 2024. Lecture Notes in Networks and Systems, vol. 992, Springer Nature Switzerland, Cham, 2024, pp. 262-278. doi:10.1007/978-3-031-60196-5_20.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>[28] M. Gopal, Applied Machine Learning, McGraw Hill Education, India, 2018.</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>[29] Z. Hu, S. Gnatyuk, T. Okhrimenko, S. Tynymbayev, M. Iavich, High-speed and secure PRNG for cryptographic applications, International Journal of Computer Network and Information Security 12 (2020) 1-10. doi:10.5815/ijcnis.2020.03.01.</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[30] O. Solomentsev, M. Zaliskyi, O. Shcherbyna, O. Kozhokhina, Sequential procedure of changepoint analysis during operational data processing, in: 2020 IEEE Microwave Theory and Techniques in Wireless Communications (MTTW), 2020, pp. 168-171. doi:10.1109/MTTW51045.2020.9245068.</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[31] T. Ryan, Modern Regression Methods, 2nd ed., Wiley, New York, USA, 1997.</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>[32] O. Sushchenko, Y. Bezkorovainyi, O. Solomentsev, M. Zaliskyi, O. Holubnychyi, I. Ostroumov, et al., Algorithm of determining errors of gimballed inertial navigation system, in: O. Gervasi, B. Murgante, C. Garau, D. Taniar, A. M. A. C. Rocha, M. N. Faginas Lago (Eds.), Computational Science and Its Applications - ICCSA 2024 Workshops. ICCSA 2024. Lecture Notes in Computer Science, vol. 14816, Springer Nature Switzerland, Cham, 2024, pp. 206-218. doi:10.1007/978-3-031-65223-3_14.</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [33]
          <string-name>
            <given-names>D.</given-names>
            <surname>Birkes</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Dodge</surname>
          </string-name>
          ,
          <source>Alternative Methods of Regression</source>
          , Wiley, New York, USA,
          <year>1993</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [34]
          <string-name>
            <given-names>Z.</given-names>
            <surname>Poberezhna</surname>
          </string-name>
          ,
          <article-title>Comprehensive assessment of the airlines' competitiveness</article-title>
          ,
          <source>Economic Annals-XXI</source>
          <volume>167</volume>
          (
          <year>2017</year>
          )
          <fpage>32</fpage>
          -
          <lpage>36</lpage>
          . doi:
          <pub-id pub-id-type="doi">10.21003/ea.V167-07</pub-id>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [35]
          <string-name>
            <given-names>R. L.</given-names>
            <surname>Kaufman</surname>
          </string-name>
          ,
          <source>Heteroskedasticity in Regression</source>
          , SAGE Publications,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [36]
          <string-name>
            <given-names>H.</given-names>
            <surname>Barreto</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Howland</surname>
          </string-name>
          ,
          <source>Introductory Econometrics</source>
          , Cambridge University Press,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [37]
          <string-name>
            <given-names>S.</given-names>
            <surname>Goldfeld</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Quandt</surname>
          </string-name>
          ,
          <article-title>Some tests for heteroscedasticity</article-title>
          ,
          <source>Journal of the American Statistical Association</source>
          <volume>60</volume>
          (
          <year>1965</year>
          )
          <fpage>539</fpage>
          -
          <lpage>547</lpage>
          . doi:
          <pub-id pub-id-type="doi">10.1080/01621459.1965.10480811</pub-id>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [38]
          <string-name>
            <given-names>T. S.</given-names>
            <surname>Breusch</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. R.</given-names>
            <surname>Pagan</surname>
          </string-name>
          ,
          <article-title>A simple test for heteroscedasticity and random coefficient variation</article-title>
          ,
          <source>Econometrica</source>
          <volume>47</volume>
          (
          <year>1979</year>
          )
          <fpage>1287</fpage>
          -
          <lpage>1294</lpage>
          . doi:
          <pub-id pub-id-type="doi">10.2307/1911963</pub-id>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [39]
          <string-name>
            <given-names>H.</given-names>
            <surname>White</surname>
          </string-name>
          ,
          <article-title>A heteroskedasticity-consistent covariance matrix estimator and a direct test for heteroskedasticity</article-title>
          ,
          <source>Econometrica</source>
          <volume>48</volume>
          (
          <year>1980</year>
          )
          <fpage>817</fpage>
          -
          <lpage>838</lpage>
          . doi:
          <pub-id pub-id-type="doi">10.2307/1912934</pub-id>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [40]
          <string-name>
            <given-names>M.</given-names>
            <surname>Zaliskyi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Solomentsev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Shcherbyna</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Ostroumov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Sushchenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Averyanova</surname>
          </string-name>
          , et al.,
          <article-title>Heteroskedasticity analysis during operational data processing of radio electronic systems</article-title>
          , in:
          <string-name>
            <given-names>S.</given-names>
            <surname>Shukla</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Unal</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. V.</given-names>
            <surname>Kureethara</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. K.</given-names>
            <surname>Mishra</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. S.</given-names>
            <surname>Han</surname>
          </string-name>
          (Eds.),
          <source>Data Science and Security. Lecture Notes in Networks and Systems</source>
          , vol.
          <volume>290</volume>
          , Springer Singapore, Singapore,
          <year>2021</year>
          , pp.
          <fpage>168</fpage>
          -
          <lpage>175</lpage>
          . doi:
          <pub-id pub-id-type="doi">10.1007/978-981-16-4486-3_18</pub-id>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [41]
          <string-name>
            <given-names>C.</given-names>
            <surname>Leser</surname>
          </string-name>
          ,
          <source>Econometric Techniques and Problems</source>
          , Lubrecht and Cramer, NY, USA,
          <year>1974</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>