<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>remote sensing data</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Volodymyr Hnatushenko</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Vita Kashtan</string-name>
          <email>vitalionkaa@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Dnipro University of Technology</institution>
          ,
          <addr-line>Dmytra Yavornytskoho Ave 19, Dnipro, 49005</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2025</year>
      </pub-date>
      <abstract>
        <p>Multispectral image fusion is a fundamental task in remote sensing, requiring a balance between spatial detail and spectral fidelity. This study proposes and evaluates quantum image fusion schemes based on three- and four-qubit circuits. The three-qubit scheme provides a balanced reconstruction, preserving key structural features with MSE values of 0.22 (infrared) and 0.35 (visible), PSNR values of 16.56 and 4.53, SSIM values of 0.60 and 0.27, and significant improvements in sharpness (AG = 0.91, SF = 0.22; relative increases &gt;100%). The four-qubit scheme enhances spectral fidelity (PSNR: 160.48 for IR, 164.54 for Visible) and visible-band structural similarity (SSIM = 0.55), while maintaining high sharpness (AG = 1.31, SF = 0.29). However, the IR-SSIM decreases to 0.18, and the MSE rises to 0.89. A comparative analysis against classical methods (IHS, PCA, Brovey, Wavelet, Laplacian Pyramid, Curvelet) reveals that quantum schemes more effectively recover fine structural and textural details, albeit with slightly lower spectral consistency. These results highlight the potential of quantum-enhanced fusion for multispectral data and motivate further research on circuit optimization, noise mitigation, and scalable quantum image processing.</p>
      </abstract>
      <kwd-group>
        <kwd>quantum computing</kwd>
        <kwd>data fusion</kwd>
        <kwd>multispectral imaging</kwd>
        <kwd>qubit</kwd>
        <kwd>quantum image processing</kwd>
        <kwd>structural and spectral metrics</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        In contemporary remote sensing applications, integrating heterogeneous information sources into
a unified and more informative representation is essential. Data fusion, particularly pan-sharpening
of multispectral [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ], [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ] and hyperspectral images, combines the high spatial resolution of panchromatic channels with the spectral richness of multispectral data. This improves
classification accuracy, object detection, and quantitative mapping. The problem remains relevant
due to the increasing number of satellite platforms, the need for automated processing of
large-scale datasets, and application requirements in land monitoring, ecology, and urban planning,
which demand both spectral and spatial fidelity.</p>
      <p>Classical pan-sharpening approaches remain effective, but new requirements, such as
hyperspectral data processing, sensor adaptation, and noise robustness, drive research toward
hybrid and deep learning–based solutions. Recent reviews show that advanced methods integrate
physical sensor models, refined quality metrics, and neural architectures to achieve an improved
balance of spatial and spectral accuracy. In parallel, experimental research investigates the
application of quantum computing to image fusion tasks. Quantum approaches promise compact
data encoding through superposition, acceleration of linear operations such as the quantum Fourier
transform, and novel learning paradigms including quantum neural networks. However, current
studies are limited to theoretical models, classical simulations, or small-scale quantum backends.
Hardware limitations, qubit errors, and scalability challenges constrain practical deployment.</p>
      <p>Infrared channels provide thermal information critical for detecting temperature variations,
surface structures, and material properties beyond the visible spectrum. Accurate representation
and fusion of infrared data enhances the reliability of scene interpretation and are particularly
valuable in environmental monitoring, agriculture, defense, fire safety, and medical diagnostics.
High-precision fusion of infrared channels enables the detection of subtle structural and material
changes, supporting timely decision-making and improving analytical outcomes.</p>
      <p>The integration of multispectral and hyperspectral data processing with deep learning and
emerging quantum methods defines a promising but technically demanding research direction.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Related works</title>
      <p>
        Early approaches to fusing multispectral (MS) and panchromatic (PAN) images were primarily
based on component substitution (CS) methods, such as the Intensity–Hue–Saturation (IHS)
transform [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], Principal Component Analysis (PCA), the Brovey transform, and wavelet-based
techniques [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. These algorithms are computationally efficient and easy to implement; however,
they are prone to spectral distortions, which limit their applicability in cases where high spectral
fidelity is required [7]. Results from the IEEE Data Fusion Contest demonstrated that none of the
classical methods can simultaneously achieve both maximum spatial and spectral accuracy [7].
      </p>
      <p>
        Subsequent advances led to the adoption of Multiresolution Analysis (MRA), including wavelet
transforms, contourlets, and Laplacian pyramids. These methods preserve spectral information
more effectively, although they typically provide less sharp spatial details compared to CS-based
techniques. The study in [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] confirmed the superior spectral fidelity of MRA algorithms, with
similar conclusions reported in the systematic review [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ].
      </p>
      <p>More recent approaches formulate pan-sharpening as an optimization and regularization
problem. These include Bayesian methods, matrix factorization, and algorithms that explicitly
incorporate the physical characteristics of sensors. In [8], it was demonstrated that the choice of
loss function is crucial for maintaining a balance between spatial and spectral quality, particularly
under noisy conditions or when fusing data from heterogeneous sensors.</p>
      <p>
        A major research direction today involves the use of deep neural networks. Early examples
include two-stream convolutional architectures (Two-Stream Fusion Networks) [9], which
introduced CNNs into the pan-sharpening process. Further developments, such as the Multi-Scale
Dilated Residual Network (MSDRN) [10], demonstrated that multi-scale residual structures can
more accurately reconstruct spatial details. As highlighted in [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], deep learning methods currently
hold the greatest potential for achieving an optimal balance of spatial and spectral fidelity.
However, they require large training datasets, substantial computational resources, and are
sensitive to sensor misalignment. The problem of objectively assessing the quality of fused results
remains unsolved. Traditional metrics, such as ERGAS, SAM, and Q4, provide formal evaluations
but do not always align with visual perception [11]. Recent approaches leverage perceptual models;
for instance, [17] proposed a feature-based evaluation method that relies on deep feature similarity
and correlates more closely with human perception.
      </p>
      <p>An additional challenge lies in the fusion of hyperspectral images (HSI), which enables the
generation of data with both high spectral and spatial resolution, critical for applications
such as materials science and agriculture. However, the high dimensionality of hyperspectral data
significantly increases computational complexity and sensitivity to noise. In [12], the authors
emphasized the need for computationally efficient approaches to HSI pan-sharpening.</p>
      <p>Recently, several studies have explored the integration of quantum computing into the data
fusion process. For example, [13] demonstrated the possibility of representing images in a quantum
register. In [14], the authors proposed a concept of quantum fusion for remote sensing, while [15]
presented quantum-inspired algorithms for computational imaging. Although current quantum
devices remain limited by qubit counts and stability issues [16], [17], such approaches open
promising prospects for accelerating pan-sharpening and multi-channel data fusion tasks.</p>
      <p>In summary, modern methods of data fusion have significantly enhanced the informativeness of
fused products, while future research is likely to focus on integrating deep learning with quantum
computing approaches. The aim of this study is to develop and systematically investigate a
quantum approach to heterogeneous data fusion, with a focus on multispectral remote sensing
imagery. The proposed framework employs quantum computations based on three- and four-qubit
systems. It is designed to enhance the informativeness of fused images, improve interpretation
accuracy, and evaluate the effectiveness of quantum algorithms by comparing them with classical
fusion techniques using established quality metrics.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Materials and methods</title>
      <p>The approach proposed in this study applies quantum computing to the fusion of spectral bands of
satellite images and encompasses the stages of data preprocessing, quantum encoding, quantum
computation, quantum state measurement, fused band generation, and evaluation of result quality
(Fig. 1). In the initial stage, the input bands are preprocessed: cloud masking, co-registration, and
normalization are applied to images acquired in the spectral ranges of Bands 5 and 7 of WorldView-3.
Cloud masking is implemented by multiplying the original band I by the complementary binary mask.
At the quantum encoding stage, each normalized pixel value p is represented as the amplitude-encoded state
|ψ_p⟩ = √(1 − p²)|0⟩ + p|1⟩.</p>
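      <p>As an illustration, the masking and single-pixel amplitude encoding described above can be sketched with NumPy; the toy band, mask values, and function names below are illustrative assumptions, not the authors' implementation.</p>

```python
import numpy as np

def mask_clouds(band, cloud_mask):
    """Zero out cloudy pixels by multiplying the band by the
    complementary binary mask (mask: 1 = cloud, 0 = clear)."""
    return band * (1 - cloud_mask)

def encode_pixel(p):
    """Amplitude-encode a normalized pixel value p in [0, 1] as
    |psi_p> = sqrt(1 - p^2)|0> + p|1>."""
    return np.array([np.sqrt(1.0 - p ** 2), p])

band = np.array([[0.8, 0.6],
                 [0.4, 1.0]])
clouds = np.array([[0, 0],
                   [0, 1]])            # bottom-right pixel flagged as cloud
clean = mask_clouds(band, clouds)      # cloudy pixel is zeroed
state = encode_pixel(clean[0, 0])      # single-pixel quantum state
assert np.isclose(state @ state, 1.0)  # valid (normalized) qubit state
```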
      <p>In the case of fusing two bands (Band 5 and Band 7), their values are encoded in the form of a
superposition:
|ψ⟩ = α|0⟩ + β|1⟩,  α² + β² = 1,  (4)
where the coefficients α and β correspond to the intensities of the respective bands.</p>
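      <p>A minimal sketch of the normalization behind Eq. (4), assuming the two intensities are simply rescaled to unit norm:</p>

```python
import numpy as np

def encode_band_pair(i5, i7):
    """Encode two co-located band intensities as the superposition
    |psi> = alpha|0> + beta|1> with alpha^2 + beta^2 = 1 (Eq. 4)."""
    v = np.array([i5, i7], dtype=float)
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return np.array([1.0, 0.0])    # empty pixel: default to |0>
    return v / norm

psi = encode_band_pair(0.6, 0.8)       # this pair is already unit-norm
assert np.isclose(psi @ psi, 1.0)      # normalization constraint holds
```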
      <p>At the third stage, quantum computations are performed. Two configurations were investigated:
a 3-qubit and a 4-qubit scheme. In the 3-qubit model (Fig. 2), the computational process is
implemented as a sequence of rotation operators about the y-axis, Ry(θ), which encode the pixel
amplitudes, and Hadamard gates H, which create a uniform superposition of states. This
combination allows modeling the basic interaction of the bands while preserving their symmetric
contribution to the fused result.</p>
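      <p>The 3-qubit stage can be illustrated by a small statevector simulation. The exact gate ordering of Fig. 2 is not reproduced in this excerpt, so the circuit below is an illustrative assumption built only from the named Ry(θ) and Hadamard gates.</p>

```python
import numpy as np

H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)   # Hadamard gate

def ry(theta):
    """Rotation about the y-axis, Ry(theta)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s],
                     [s,  c]])

def three_qubit_state(p5, p7):
    """Two Ry rotations encode the Band 5 / Band 7 pixel amplitudes;
    a Hadamard puts the third qubit into a uniform superposition.
    Ry(2*arcsin(p))|0> gives exactly sqrt(1-p^2)|0> + p|1>."""
    q0 = ry(2 * np.arcsin(p5)) @ np.array([1.0, 0.0])   # Band 5 qubit
    q1 = ry(2 * np.arcsin(p7)) @ np.array([1.0, 0.0])   # Band 7 qubit
    q2 = H @ np.array([1.0, 0.0])                       # superposition qubit
    return np.kron(np.kron(q0, q1), q2)                 # 8-dim statevector

psi = three_qubit_state(0.6, 0.8)
assert np.isclose(psi @ psi, 1.0)       # unitary gates preserve the norm
```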
      <p>In the 4-qubit model, a more complex quantum circuit is implemented, which, in addition to the
rotations Ry(θ), also includes CNOT gates to model nonlinear interactions between the bands. This
allows contextual dependencies and spatial correlations to be taken into account, potentially leading to
improved fusion quality [20, 21].</p>
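      <p>The band-coupling role of the CNOT gate can be sketched in the same statevector style; this is a minimal illustrative two-qubit fragment, not the full 4-qubit circuit of the paper.</p>

```python
import numpy as np

# CNOT in the 2-qubit computational basis (control = first qubit)
CNOT = np.array([[1.0, 0.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0, 0.0],
                 [0.0, 0.0, 0.0, 1.0],
                 [0.0, 0.0, 1.0, 0.0]])

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s],
                     [s,  c]])

def entangle_bands(p5, p7):
    """Ry rotations encode the two band amplitudes; a CNOT then couples
    them so the joint state is no longer a simple product state."""
    q5 = ry(2 * np.arcsin(p5)) @ np.array([1.0, 0.0])
    q7 = ry(2 * np.arcsin(p7)) @ np.array([1.0, 0.0])
    return CNOT @ np.kron(q5, q7)

psi = entangle_bands(0.6, 0.8)
assert np.isclose(psi @ psi, 1.0)   # CNOT is unitary: norm preserved
```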
      <p>At the fourth stage, the quantum state is measured as the expectation value
m = ⟨ψ|Z|ψ⟩,  (7)
where Z is the Pauli-Z operator; the measurement probabilities P(0) and P(1) reflect the relative
contribution of each band to the fused result.</p>
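      <p>A short sketch of the measurement step, showing that for a single real-amplitude qubit the Pauli-Z expectation and the probabilities P(0), P(1) are directly related:</p>

```python
import numpy as np

Z = np.diag([1.0, -1.0])            # Pauli-Z operator

def measure(psi):
    """Return m = ⟨psi|Z|psi⟩ and the probabilities P(0), P(1).
    For a real state, m = P(0) - P(1), so the expectation value
    determines the band weights used in the fusion step."""
    m = psi @ Z @ psi
    p0, p1 = psi[0] ** 2, psi[1] ** 2
    return m, p0, p1

psi = np.array([0.6, 0.8])
m, p0, p1 = measure(psi)
assert np.isclose(m, p0 - p1)       # identity check
assert np.isclose(p0 + p1, 1.0)     # probabilities sum to one
```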
      <p>The fifth stage is responsible for generating the fused band. Each pixel is formed as a linear
combination of the two input bands, with weighting coefficients determined from the measurement
outcomes:</p>
      <p>F(x, y) = w1 · I5(x, y) + w2 · I7(x, y),  (8)
where w1 = P(0) and w2 = P(1).</p>
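      <p>Equation (8) then reduces to a per-pixel weighted sum, as in the following sketch (the sample intensities and weights are illustrative):</p>

```python
import numpy as np

def fuse_bands(i5, i7, p0, p1):
    """Eq. (8): each fused pixel is a linear combination of the two
    input bands, weighted by the measurement probabilities."""
    return p0 * i5 + p1 * i7

i5 = np.array([[0.2, 0.4]])
i7 = np.array([[0.6, 0.8]])
fused = fuse_bands(i5, i7, p0=0.36, p1=0.64)
# first pixel: 0.36 * 0.2 + 0.64 * 0.6 = 0.456
assert np.isclose(fused[0, 0], 0.456)
```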
      <p>In the final stage, the quality of the fused band is evaluated. Standard metrics [18] such as PSNR,
SSIM, and SAM are employed for this purpose.</p>
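      <p>For reference, MSE, PSNR, and SAM can be computed as below. These are standard textbook definitions, so details such as the peak value and the SAM averaging are assumptions and may differ from the exact implementations used in the study.</p>

```python
import numpy as np

def mse(ref, img):
    return float(np.mean((ref - img) ** 2))

def psnr(ref, img, peak=1.0):
    """Peak signal-to-noise ratio in dB for images scaled to [0, peak]."""
    e = mse(ref, img)
    return float("inf") if e == 0 else 10 * np.log10(peak ** 2 / e)

def sam(ref, img, eps=1e-12):
    """Mean spectral angle (radians) between per-pixel spectra;
    inputs have shape (H, W, bands)."""
    dot = np.sum(ref * img, axis=-1)
    denom = np.linalg.norm(ref, axis=-1) * np.linalg.norm(img, axis=-1) + eps
    return float(np.mean(np.arccos(np.clip(dot / denom, -1.0, 1.0))))

a = np.full((2, 2), 0.5)
b = np.full((2, 2), 0.6)
assert np.isclose(psnr(a, b), 20.0)   # MSE = 0.01, 10*log10(1/0.01) = 20 dB
```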
      <p>The proposed sequence of stages provides a comprehensive quantum modeling of the image
fusion process while preserving information about the contribution of each band, and also allows
for quantitative assessment of the resulting fused band using classical metrics.</p>
      <p>The proposed approach enables the formation of an integrated band based on quantum
computations, considering both 3-qubit and 4-qubit configurations. Comparison of the results from
these models allows for assessing the impact of increasing the number of qubits on fusion quality
and the potential advantages of quantum approaches in Earth remote sensing.</p>
    </sec>
    <sec id="sec-4">
      <title>4. Experimental results</title>
      <sec id="sec-4-1">
        <title>1. Input satellite data</title>
        <p>Figure 4 presents satellite images from the WorldView-3 spacecraft showing two spectral bands:
Band 5 (Fig. 4a) and Band 7 (Fig. 4b), as well as the resulting images obtained through quantum
fusion using the 3-qubit scheme (Fig. 4c) and the 4-qubit scheme (Fig. 4d). Visual analysis indicates
that the obtained fusion effectively combines the informative characteristics of both bands. The
fused result preserves important textural details and contrast features inherent to each input band
while simultaneously integrating the unique spectral properties of Bands 5 and 7. It demonstrates
the correct amplitude encoding of pixels using the quantum rotation operators Ry(θ) and Hadamard
gates, providing a balanced and informative representation of the fused image. The absence of
artifacts and distortions in the fused band confirms the high quality of the quantum modeling
performed using the quantum circuit. The results demonstrate the potential of applying quantum
computing for multiband image fusion while preserving structural and spectral information.</p>
      </sec>
      <sec id="sec-4-2">
        <title>2. Metrics</title>
        <p>For a quantitative assessment of the quality of the obtained fused images, a set of
metrics was applied, including classical indicators such as Mean Squared Error (MSE), Peak
Signal-to-Noise Ratio (PSNR), and Structural Similarity Index (SSIM). These metrics enable the evaluation
of the degree to which both spectral and spatial information are preserved by comparing the fused
images with the original infrared (Band 5) and visible (Band 7) bands. For a more detailed analysis
of textural and structural quality, the Average Gradient (AG) and Spatial Frequency (SF) metrics
were also considered; they provide additional insight into the level of detail, contrast, and textural
characteristics, which are crucial for image fusion tasks.</p>
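        <p>Common definitions of AG and SF can be sketched as follows; boundary handling varies across the literature, so these formulas are assumptions rather than the exact ones used for Table 1.</p>

```python
import numpy as np

def average_gradient(img):
    """AG: mean magnitude of local intensity gradients (one common
    definition; published variants differ in boundary handling)."""
    dx = np.diff(img, axis=1)[:-1, :]     # horizontal differences
    dy = np.diff(img, axis=0)[:, :-1]     # vertical differences
    return float(np.mean(np.sqrt((dx ** 2 + dy ** 2) / 2)))

def spatial_frequency(img):
    """SF: sqrt(RF^2 + CF^2), combining row and column frequencies."""
    rf = np.sqrt(np.mean(np.diff(img, axis=1) ** 2))
    cf = np.sqrt(np.mean(np.diff(img, axis=0) ** 2))
    return float(np.sqrt(rf ** 2 + cf ** 2))

ramp = np.tile(np.arange(4.0), (4, 1))    # intensity grows left to right
assert np.isclose(spatial_frequency(ramp), 1.0)
```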
        <p>Table 1 presents the metric values for various fusion methods, including classical approaches
(IHS, PCA, Brovey Transform, Wavelet Transform, Laplacian Pyramid) as well as the proposed
quantum approach using 3- and 4-qubit configurations. Notably, among all methods, the 4-qubit
quantum approach exhibits high values of Average Gradient (1.31) and Spatial Frequency (0.29),
indicating superior image detail and textural quality.</p>
      </sec>
    </sec>
    <sec id="sec-disc">
      <title>5. Discussion</title>
      <sec id="sec-disc-1">
        <title>1. Visual analysis</title>
        <p>The quantum image fusion results obtained using the 4-qubit scheme (Fig. 4d)
demonstrate a significant improvement in the quality of the fused image compared to the 3-qubit
scheme (Fig. 4c). The use of the 4-qubit quantum circuit allows more precise encoding of the input
band amplitudes, enabling a more complete representation of spatial and textural information. It is
achieved by increasing the number of qubits, which expands the dimensionality of the state space
and provides better discretization and interference properties of the quantum representation. As a
result, the fused images exhibit higher sharpness, contrast, and naturalness, with important
structural elements clearly visible and less distorted.</p>
        <p>A comparative analysis of the visual results from the 3- and 4-qubit schemes reveals that,
although the 3-qubit scheme demonstrates basic fusion, it has limitations in conveying textural
detail and spectral balance. The 3-qubit scheme may exhibit certain artifacts and reduced contrast,
which can negatively affect image perception. In contrast, the 4-qubit scheme minimizes these
drawbacks, as evidenced by smoother transitions between regions of different brightness and
improved reproduction of fine details.</p>
        <p>Thus, implementing the 4-qubit quantum scheme for image fusion represents a practical
approach that not only preserves the spectral characteristics of individual bands but also ensures a
high-quality final result. This method opens prospects for further advancement of quantum
algorithms for multiband image processing and their application in computer vision and Earth
remote sensing tasks.</p>
      </sec>
      <sec id="sec-4-3">
        <title>2. Quantitative analysis</title>
        <p>Analysis of the obtained metrics demonstrates that the quantum approach to image fusion has
advantages and limitations. Using 3- and 4-qubit quantum circuits revealed differences in structural
(SSIM, AG, SF) and spectral (MSE, PSNR) metrics. In particular, the 3-qubit scheme balanced
reconstruction quality and structural similarity. The MSE value for fusion with the infrared band
was 0.22, comparable to classical methods, while for the visible band it was 0.35. At the same time,
PSNR values (16.56 for IR and 4.53 for Visible) indicate limited preservation of spectral intensity,
whereas SSIM values (0.60 and 0.27, respectively) show that key structural details are maintained. It
is important to note that sharpness metrics (AG = 0.91; SF = 0.22) exhibit significant relative
increases (136.6% and 105.9%), indicating enhanced informativeness of the final image.</p>
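        <p>As a quick consistency check of these figures (the per-band baseline AG and SF are not reported in this excerpt, so they are recovered here from the stated gains):</p>

```python
# Relative increase = (fused - baseline) / baseline * 100, so the
# implied baselines follow from the reported fused values and gains.
ag_fused, ag_gain_pct = 0.91, 136.6
sf_fused, sf_gain_pct = 0.22, 105.9
ag_base = ag_fused / (1 + ag_gain_pct / 100)
sf_base = sf_fused / (1 + sf_gain_pct / 100)
assert round(ag_base, 3) == 0.385   # implied baseline average gradient
assert round(sf_base, 3) == 0.107   # implied baseline spatial frequency
```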
        <p>The 4-qubit scheme, in turn, demonstrated substantially higher PSNR values (160.48 for IR and
164.54 for Visible), indicating better correspondence to the spectral characteristics of the original
bands. However, this result is accompanied by a higher MSE for IR (0.89) and a lower SSIM for IR
(0.18), which may suggest some loss of local structural details. At the same time, for the visible
band, SSIM (0.55) exceeds the corresponding results of the 3-qubit scheme. Sharpness metrics (AG
= 1.31; SF = 0.29) also remain high, ensuring relative increases in informativeness of more than
100%.</p>
        <p>Comparison with classical fusion methods shows that the quantum-based approach has several
advantages. For example, the IHS method provided a high SSIM value (1.20) for the visible band but
low results for the infrared band (0.05). Spatial informativeness metrics (AG = 0.39; SF = 0.11) were
almost unchanged from the original, indicating no significant improvement in image sharpness.
While PCA and Brovey Transform methods achieved an acceptable level of structural similarity
(SSIM above 0.7 for Visible), they tended to degrade spatial informativeness, as confirmed by
decreased AG and SF values. Multi-level transform-based methods (Wavelet, Laplacian Pyramid,
Curvelet) in some cases ensured higher local similarity (e.g., SSIM = 0.64 for Curvelet with IR) but
lagged behind quantum schemes in maintaining a balance between spectral and structural
characteristics.</p>
        <p>Thus, the results indicate that quantum methods, particularly 3- and 4-qubit schemes, offer
significant potential for band fusion tasks. The 3-qubit approach is more robust against structural
losses and provides substantial sharpness enhancement, while the 4-qubit scheme enables the
achievement of exceptionally high PSNR values, indicating precise spectral reconstruction. Further
research should focus on optimizing quantum circuits to balance the fused image's spectral fidelity
and structural informativeness.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>6. Conclusions</title>
      <p>The proposed quantum approach to multispectral image fusion achieves the research objective of
developing and systematically exploring the capabilities of quantum algorithms for processing
heterogeneous data. The use of quantum computations based on three- and four-qubit systems
enabled the investigation of the effectiveness of different circuit configurations in enhancing the
informativeness of fused images and evaluating their results compared to classical methods.</p>
      <p>Analysis of the obtained metrics showed that the 3-qubit scheme provides a balanced
combination of spectral information reconstruction and preservation of structural similarity.
However, transitioning to the 4-qubit scheme improved textural details and contrast reproduction,
as evidenced by the increase in the Average Gradient metric (1.31 versus 0.91 for three qubits) and
the Spatial Frequency metric (0.29 versus 0.22). At the same time, an increase in MSE and a
decrease in the infrared-band SSIM were observed, indicating the occurrence of noise effects and highlighting the
need for further optimization to maintain a balance between sharpness and spectral fidelity.</p>
      <p>Comparison with classical methods (IHS, PCA, Brovey, Wavelet, Laplacian Pyramid, Curvelet)
showed that traditional algorithms remain more stable in terms of spectral quality metrics. In
contrast, quantum schemes have an advantage in reproducing spatial and textural characteristics. It
opens up prospects for applying quantum algorithms to enhance the informativeness of remote
sensing data in mapping and thematic processing tasks.</p>
      <p>The results confirm the feasibility of further developing quantum image fusion methods,
particularly in optimizing circuits, reducing noise artifacts, and integrating them with modern
approaches to satellite data processing. It creates a foundation for expanding the applications of
quantum technologies in geoinformation systems, remote sensing, and multiband image analysis.</p>
    </sec>
    <sec id="sec-6">
      <title>7. Acknowledgements</title>
      <p>The authors would like to acknowledge that this paper has been written based on the results
achieved within the OptiQ project. This Project has received funding from the European Union’s
Horizon Europe programme under grant agreement No 101080374-OptiQ. Additionally, the
Project is co-financed from the resources of the Polish Ministry of Science and Higher Education
within the framework of the International Co-financed Projects programme. Disclaimer: Funded by the European
Union. Views and opinions expressed are, however, those of the authors only and do not
necessarily reflect those of the European Union or the European Research Executive Agency (REA–
granting authority). Neither the European Union nor the granting authority can be held
responsible for them.</p>
    </sec>
    <sec id="sec-7">
      <title>8. Declaration on Generative AI</title>
      <p>The authors used Grammarly to check the grammar and spelling. After using this tool, the authors
reviewed and edited the content as needed and take full responsibility for the publication’s content.</p>
    </sec>
    <sec id="sec-8">
      <title>9. References</title>
      <p>[6] (continuation) Management in Global Information Networks (CMiGIN 2019), Lviv, Ukraine, 2019, pp. 370–380. URL: https://ceur-ws.org/Vol-2588/paper31.pdf.</p>
      <p>[7] L. Alparone, L. Wald, J. Chanussot, C. Thomas, P. Gamba, L. M. Bruce, Comparison of pansharpening algorithms: Outcome of the 2006 GRS-S data-fusion contest. IEEE Trans. Geosci. Remote Sens. 45(10) (2007) 3012–3021. doi:10.1109/TGRS.2007.904923.</p>
      <p>[8] R. Restaino, Pansharpening techniques: Optimizing the loss function for convolutional neural networks. Remote Sensing 17 (2025) 16. doi:10.3390/rs17010016.</p>
      <p>[9] G. Vivone, Remote sensing image fusion: A review and future directions. Information Fusion 64 (2020) 99–122. doi:10.1016/j.inffus.2019.08.002.</p>
      <p>[10] X. Liu, Q. Liu, Y. Wang, Remote sensing image fusion based on two-stream fusion network. Information Fusion 57 (2020) 1–11. doi:10.1016/j.inffus.2019.07.010.</p>
      <p>[11] W. Wang, Y. Zhu, H. Li, X. Zhang, Pansharpening of multispectral images via multi-scale dilated residual network. Remote Sensing 13 (2021) 1072. doi:10.3390/rs13061200.</p>
      <p>[12] M. Ciotola et al., Hyperspectral pansharpening: Critical review, tools, and future perspectives. IEEE Geoscience and Remote Sensing Magazine 13(1) (2024) 311–338. doi:10.1109/MGRS.2024.3509139.</p>
      <p>[13] A. Dendukuri, A. M. Iliyasu, S. Venegas-Andraca, F. Yan, Image processing in quantum computers. doi:10.48550/arXiv.1812.11042.</p>
      <p>[14] L. Miller, G. Uehara, A. Spanias, Quantum image fusion methods for remote sensing. In: 2024 IEEE Aerospace Conference, Big Sky, MT, USA, 2024, pp. 1–9. doi:10.1109/AERO58975.2024.10521113.</p>
      <p>[15] S. Majji, A. Chalumuri, R. Kune, B. S. Manoj, Quantum processing in fusion of SAR and optical images for deep learning: A data-centric approach. IEEE Access (2022). doi:10.1109/ACCESS.2022.3189474.</p>
      <p>[16] Y. Altmann et al., Quantum-inspired computational imaging. Science 361 (2018) eaat2298. doi:10.1126/science.aat2298.</p>
      <p>[17] V. Kashtan, V. Hnatushenko, D. Babets, K. Cyran, K. Wereszczyński, Hybrid quantum CNN-based information technology for building semantic segmentation in aerial imagery. In: PhD Workshop on Artificial Intelligence in Computer Science at the 9th International Conference on Computational Linguistics and Intelligent Systems (CoLInS-2025), May 15–16, 2025, Kharkiv, Ukraine, pp. 150–162. URL: https://ceur-ws.org/Vol-4015/paper11.pdf.</p>
      <p>[18] Z. Zhang, S. Zhang, X. Meng, L. Chen, F. Shao, Perceptual quality assessment for pansharpened images based on deep feature similarity measure. Remote Sensing 16 (2024) 4621. doi:10.3390/rs16244621.</p>
      <p>[19] S. Deb, W. Pan, Quantum image compression: Fundamentals, algorithms, and advances. Computers 13 (2024) 185. doi:10.3390/computers13080185.</p>
      <p>[20] K. Liu, Y. Zhang, K. Lu, X. Wang, X. Wang, An optimized quantum representation for color digital images. Int. J. Theor. Phys. 57 (2018) 2938–2948. doi:10.1007/s10773-018-3813-4.</p>
      <p>[21] E. Haque, M. Paul, F. Tohidi, A. Ulhaq, An overview of quantum circuit design focusing on compression and representation. Electronics 14 (2024) 72. doi:10.3390/electronics14010072.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>G.</given-names>
            <surname>Vivone</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Alparone</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Chanussot</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. Dalla</given-names>
            <surname>Mura</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Garzelli</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G. A.</given-names>
            <surname>Licciardi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Restaino</surname>
          </string-name>
          and
          <string-name>
            <given-names>L.</given-names>
            <surname>Wald</surname>
          </string-name>
          , “
          <article-title>A critical comparison among pansharpening algorithms</article-title>
          ,”
          <source>IEEE Trans. Geosci. Remote Sens.</source>
          , vol.
          <volume>53</volume>
          , no.
          <issue>5</issue>
          , pp.
          <fpage>2565</fpage>
          -
          <lpage>2586</lpage>
          , May
          <year>2015</year>
          . URL: https://doi.org/10.1109/TGRS.2014.2361734. doi:10.1109/TGRS.2014.2361734.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>V.</given-names>
            <surname>Hnatushenko</surname>
          </string-name>
          , Vik. Hnatushenko,
          <string-name>
            <given-names>A.</given-names>
            <surname>Kavats</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Shevchenko</surname>
          </string-name>
          ,
          <article-title>Pansharpening technology of high resolution multispectral and panchromatic satellite images</article-title>
          ,
          <source>Scientific bulletin of National Mining University</source>
          <volume>4</volume>
          (
          <issue>148</issue>
          ) (
          <year>2015</year>
          )
          <fpage>91</fpage>
          -
          <lpage>98</lpage>
          . URL: https://nvngu.in.ua/index.php/en/component/jdownloads/viewdownload/55/8345.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>F.</given-names>
            <surname>Javan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Samadzadegan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Mehravar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Toosi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Khatami</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Stein</surname>
          </string-name>
          ,
          <article-title>A review of image fusion techniques for pan-sharpening of high-resolution satellite imagery</article-title>
          .
          <source>ISPRS J. Photogramm. Remote Sensing</source>
          <volume>171</volume>
          (
          <year>2021</year>
          )
          <fpage>101</fpage>
          -
          <lpage>117</lpage>
          . doi:10.1016/j.isprsjprs.2020.11.001.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>K.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Zhang</surname>
          </string-name>
          , W. Wan,
          <string-name>
            <given-names>H.</given-names>
            <surname>Yu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Sun</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. Del</given-names>
            <surname>Ser</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Elyan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Hussain</surname>
          </string-name>
          ,
          <article-title>Panchromatic and multispectral image fusion for remote sensing and earth observation: concepts, taxonomy, literature review, evaluation methodologies and challenges ahead</article-title>
          .
          <source>Information Fusion</source>
          <volume>93</volume>
          (
          <year>2023</year>
          )
          <fpage>227</fpage>
          -
          <lpage>242</lpage>
          . doi:10.1016/j.inffus.2022.12.026.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>V.</given-names>
            <surname>Kashtan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Hnatushenko</surname>
          </string-name>
          , “
          <article-title>A Wavelet and HSV Pansharpening Technology of High Resolution Satellite Images”</article-title>
          .
          <source>Intelligent Information Technologies &amp; Systems of Information Security IntelITSIS</source>
          <year>2020</year>
          . Khmelnytskyi, Ukraine, June 10-12,
          <year>2020</year>
          :
          <fpage>67</fpage>
          -
          <lpage>76</lpage>
          . URL: https://ceur-ws.org/Vol-2623/paper7.pdf.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>V.</given-names>
            <surname>Kashtan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Hnatushenko</surname>
          </string-name>
          ,
          <article-title>Computer technology of high-resolution satellite image processing based on packet wavelet transform</article-title>
          . In: International Workshop on Conflict
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>