<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
<journal-title>International Journal of Remote Sensing</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <article-id pub-id-type="doi">10.1109/access.2020.2971502</article-id>
      <title-group>
        <article-title>A Wavelet and HSV Pansharpening Technology of High Resolution Satellite Images</article-title>
      </title-group>
      <contrib-group>
        <aff id="aff0">
          <label>0</label>
          <institution>Dnipro University of Technology</institution>
          ,
          <addr-line>Dnipro, 49005</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Oles Honchar Dnipro National University</institution>
          ,
          <addr-line>Dnipro, 49010</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2017</year>
      </pub-date>
      <volume>14</volume>
      <issue>10</issue>
      <fpage>0000</fpage>
      <lpage>0002</lpage>
      <abstract>
        <p>High resolution satellite images are used to monitor environmental changes, for map-making and military intelligence, and to forecast natural disasters. Nowadays, these images contain spatial dissimilarities due to differences in the radiometric resolution, spectral characteristics and acquisition time of high resolution satellite sensors (multispectral and panchromatic). The use of pansharpened high spatial resolution images significantly increases the possibility of thematic recognition. Pansharpening is a technique used to combine the spatial details of a panchromatic image with the several spectral bands of a lower resolution multispectral image. To date, a large number of fusion methods have been proposed. However, most available methods are not effective for the latest very high resolution imagery, such as WorldView-3 satellite imagery. Most methods for increasing spatial resolution lead to artifacts: objects that are not present in the original scene, but that appear in the resulting image. In this paper, we present a pansharpening technology for high resolution satellite images that integrates bicubic interpolation, the HSV color system and the wavelet transform. The aim of the proposed technology is to obtain a high resolution multispectral satellite image after preliminary geometric correction of the primary multispectral images and an optimal wavelet decomposition into approximation and detail coefficients according to the chosen linear forms of the information value function. The proposed technology was verified on a number of different satellite data sets. The experimental evaluations were carried out on WorldView-3 images. Visual and quantitative analyses show that the presented technology can achieve high spectral and spatial quality and outperforms some existing pansharpening methods.</p>
      </abstract>
      <kwd-group>
        <kwd>Pansharpening</kwd>
        <kwd>Satellite Image</kwd>
        <kwd>High Resolution</kwd>
        <kwd>Panchromatic</kwd>
        <kwd>Multispectral</kwd>
        <kwd>Wavelet Transform</kwd>
        <kwd>Resampling</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        High resolution satellites, such as WorldView-2 and WorldView-3, provide very valuable data about
the Earth, e.g., for urban damage detection, environmental monitoring, weather
forecasting, map-making and military intelligence [
        <xref ref-type="bibr" rid="ref1 ref2 ref3">1-3</xref>
        ]. Satellite images are
characterized by their spatial, spectral, radiometric and temporal resolutions, but for most
practical applications only the spatial and spectral resolutions are considered.
Generally, satellites acquire images in various frequency bands of the visible and
non-visible ranges, called monochrome images. Most modern Earth-observing satellite
systems can acquire multispectral (MUL) and panchromatic
(PAN) images of different spatial resolutions. All things being equal, panchromatic
images have a higher spatial resolution. Depending on its frequency range, each
monochrome image contains different information about the scene. Each monochrome
image is represented as a band [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. A multispectral image from a high spatial resolution sensor
contains four bands (Red, Green, Blue and Near-Infrared) or eight bands (Coastal, Blue,
Green, Yellow, Red, Red Edge, Near-Infrared 1, Near-Infrared 2). The combination of
these bands produces a new color image [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ].
      </p>
    </sec>
    <sec id="sec-2">
      <title>State of the Art</title>
      <p>
        At present, there are various pansharpening methods, which can generally be divided
into Brovey transform, principal component analysis, independent component
analysis, Gram-Schmidt and intensity-hue-saturation (IHS) transform methods [
        <xref ref-type="bibr" rid="ref10 ref11 ref12 ref13 ref6 ref7 ref8 ref9">6-13</xref>
        ]. The typical
component substitution fusion technique is the IHS transform fusion algorithm,
which provides a visually good high resolution multispectral image but introduces spectral
distortion [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ]. Rahmani et al. proposed a modified IHS pansharpening method
[
        <xref ref-type="bibr" rid="ref14">14</xref>
        ], in which an image-adaptive coefficient for IHS is found to obtain a more accurate
spectral resolution. Work [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ] proposed pansharpening with a multiscale normalized
nonlocal means filter. This filter computes each pixel value as the weighted average
of all pixels over a sliding window, where the weight of a pixel depends on its distance
from the center pixel. The decomposed high frequency details are added into each
band of the multispectral image, and the final smooth fused image is obtained. Almost all of
these approaches are fast and easy to implement, but they suffer from spectral distortion,
because the PAN image does not cover exactly the same spectral band as the
MUL image, and they are not efficient for the latest generation of imagery.
      </p>
      <p>
        In the last decade, image fusion techniques based on multiresolution analysis
methods, such as the wavelet transform [
        <xref ref-type="bibr" rid="ref12">12, 16</xref>
        ] and the “à-trous” wavelet transform, have
become significant due to their ability to capture the information present at different
scales. This is accomplished by using wavelet basis functions, owing to their desirable
properties such as multiscale decomposition and space-frequency localization [17]. In [18] a
component substitution fusion method is proposed to reduce color distortion. Work
[19] proposed a method that divides the multispectral and panchromatic images into several
pixel groups by the k-means algorithm. The panchromatic image is then estimated by a
weighted summation of the MUL bands, and the fused image is generated by ratio
enhancement [19]. In [20] the authors suggested a pansharpening algorithm using a guided
filter, which has good properties such as edge preservation and structure transfer. The
underlying idea of the approach in [21] is to consider the spectral difference of each
pixel between the multispectral and panchromatic images, and to adaptively inject
the PAN details into the MUL image. An improved image fusion method was proposed
through the improvement of the fused spectra of mixed pixels [22]. In paper [23] the
authors consider the application of nonlinear image decomposition schemes based on
morphological operators to data fusion. Work [24] proposes a new regularized
model-based pan-sharpening method for images with local dissimilarities. An adjustment
matrix is introduced into the global spatial similarity regulariser to reduce the effect of
contrast inversion [24]. Recent research has shown that deep neural networks
achieve superior performance in image pansharpening [25-29]. However,
neural network methods take more time than traditional fusion methods.
      </p>
      <p>The analysis of existing pansharpening approaches showed that most methods
for increasing spatial resolution lead to artifacts: objects that are not present in the
original scene but appear in the resulting image. In addition, most existing pansharpening
methods introduce spectral distortions.</p>
    </sec>
    <sec id="sec-3">
      <title>Pansharpening Technology</title>
      <p>We propose an efficient pansharpening technology for high resolution satellite images
that integrates bicubic interpolation, the HSV color system and the wavelet transform. The
technology scheme is shown in Fig. 1.</p>
      <p>The main processing steps are:</p>
      <p>1. Uploading the high resolution images received from the WorldView-3 satellite:
panchromatic (PAN) and multispectral (MUL) in the true-color composition (R, G, B) and
NIR in the color composition (NIR, B, R).</p>
      <p>2. Resampling MUL and NIR based on bicubic interpolation [30], applied to the
MUL and NIR images respectively:</p>
      <p>v(x, y) = ∑_{i=0}^{3} ∑_{j=0}^{3} a_{ij} ⋅ P_{ij} ,  (1), (2)</p>
      <p>where a_{ij} are the interpolation coefficients and P_{ij} is the intensity of the image being scaled.</p>
      <p>3. A characteristic feature of many images obtained by real scanning satellite systems is
a significant proportion of dark areas and a relatively small number of areas with
high brightness. That is why one of the first steps of the algorithm is histogram
equalization of the images. The discrete transformation of the brightness scale is:</p>
      <p>z′_i = z_m ∑_{k=0}^{i} p(z_k) ,  (3)</p>
      <p>where z′_i is the value of the converted brightness scale corresponding to the
brightness of the output scale, and p(z_k) is the normalized brightness histogram of the
original image (k = 0…255).</p>
      <p>4. Transforming the multispectral image (MUL) from RGB components into
hue-saturation-value (HSV) components, obtaining the value component V and the
auxiliary components V1 and V2 from the RGB components by the RGB-to-HSV
conversion (4), (5).</p>
      <p>5. The next stage is applying the wavelet transform (Fig. 2), which is divided into the
following steps:</p>
      <p>5.1. Decompose the PAN image into approximation coefficients (LL) and detail
coefficients (LH, HL and HH, containing the vertical, horizontal and
diagonal image features) to the fourth decomposition level of the discrete wavelet transform
(DWT) of the bior 2.2 class:</p>
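The discrete brightness transformation of step 3 can be sketched in a few lines of NumPy. This is an illustrative implementation assuming 8-bit images (z_m = 255); the function name is ours:

```python
import numpy as np

def equalize_histogram(img):
    """Histogram equalization of an 8-bit image:
    z'_i = z_m * sum_{k=0..i} p(z_k), with z_m = 255."""
    hist = np.bincount(img.ravel(), minlength=256)
    p = hist / img.size                          # normalized histogram p(z_k)
    cdf = np.cumsum(p)                           # running sum over k = 0..i
    lut = np.round(255 * cdf).astype(np.uint8)   # conversion scale z'_i
    return lut[img]                              # remap every pixel
```

The lookup table stretches the dark-heavy histogram described in step 3 across the full brightness scale.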
      <p>PAN → P_LL^N + ∑_{i=1}^{N} (P_LH^i + P_HL^i + P_HH^i) ,</p>
      <p>where P_LL^N is the approximation coefficient at level N, and
P_LH^i, P_HL^i, P_HH^i are the horizontal, vertical and diagonal detail coefficients at level i, respectively.</p>
      <p>5.2. Apply the DWT to the luminance (V_m) component of the multispectral image. The
image is subjected to four levels of decomposition, resulting in approximation and detail
coefficient matrices:</p>
      <p>V_m → V_mLL^N + ∑_{i=1}^{N} (V_mLH^i + V_mHL^i + V_mHH^i) ,</p>
      <p>where V_mLL^N is the approximation coefficient at level N, and
V_mLH^i, V_mHL^i, V_mHH^i are the horizontal, vertical and diagonal detail coefficients at level i.</p>
      <p>5.3. Apply the DWT to the NIR image. The matrix NIR_LL holds the
approximation coefficients; the matrices NIR_LH, NIR_HL and NIR_HH keep the detail coefficients:</p>
      <p>NIR → NIR_LL^N + ∑_{i=1}^{N} (NIR_LH^i + NIR_HL^i + NIR_HH^i) ,  (6)</p>
      <p>where NIR_LL^N is the approximation coefficient at level N, and NIR_LH^i, NIR_HL^i, NIR_HH^i are the
horizontal, vertical and diagonal detail coefficients at level i.</p>
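The decompositions in steps 5.1-5.3 can be illustrated with a single decomposition level. The technology uses the bior 2.2 wavelet to level 4, normally computed with a wavelet library; a self-contained single-level Haar sketch is used here instead, so the subband structure is visible without external dependencies (function name and subband ordering are ours):

```python
import numpy as np

def haar_dwt2(img):
    """One level of the 2D Haar DWT: returns the approximation (LL)
    and the detail subbands (LH, HL, HH)."""
    a = img.astype(float)
    # Transform rows: low-pass (pairwise average) and high-pass (difference)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0
    # Transform the columns of each half
    LL = (lo[0::2, :] + lo[1::2, :]) / 2.0   # approximation
    LH = (lo[0::2, :] - lo[1::2, :]) / 2.0   # detail
    HL = (hi[0::2, :] + hi[1::2, :]) / 2.0   # detail
    HH = (hi[0::2, :] - hi[1::2, :]) / 2.0   # detail
    return LL, LH, HL, HH
```

Each subband is half the size of the input in each dimension; applying the transform again to LL yields the next decomposition level, up to the level N = 4 used above.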
      <p>5.4. After decomposition, substitution is performed by placing the V_m band
approximation into the PAN approximation at each level, and the NIR band approximation into the PAN
approximation at each level. For each of the V_m and NIR bands and the PAN image, a single set of fused
coefficients is obtained; thus, for each level, two sets of fused image coefficients are
obtained. The coefficient merging rule is:</p>
      <p>V_mLL^N + ∑_{i=1}^{N} (P_LH^i + P_HL^i + P_HH^i) → V_mw ;  NIR_LL^N + ∑_{i=1}^{N} (P_LH^i + P_HL^i + P_HH^i) → NIR_w .</p>
      <p>5.5. Applying the inverse biorthogonal wavelet transform (IDWT) to the matrices obtained in
the previous step to obtain the new intensity components V_mw and NIR_w. The IDWT
performs the inverse DWT, reconstructing new intensity components in
which the features from the PAN image and those from the initial intensity components have
been integrated. After the inverse wavelet transform, two new fused images are
obtained at each level.</p>
      <p>6. Reverse transformation from HSV to the RGB color space, using the H_m and S_m
components of the multichannel image and the V_mw component resulting from the
wavelet transform.</p>
      <p>7. Taking the new RGB channels from the true-color composite and the new NIR_w
channel from the false-color composite. The result is a four-channel image of high
spatial resolution.</p>
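The merging rule of step 5.4 and the inverse transform of step 5.5 can be sketched in the same single-level setting, with Haar as an illustrative stand-in for the level-4 bior 2.2 transform (function names are ours):

```python
import numpy as np

def haar_dwt2(a):
    """One level of the 2D Haar DWT: (LL, LH, HL, HH) subbands."""
    a = a.astype(float)
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0
    return ((lo[0::2] + lo[1::2]) / 2.0, (lo[0::2] - lo[1::2]) / 2.0,
            (hi[0::2] + hi[1::2]) / 2.0, (hi[0::2] - hi[1::2]) / 2.0)

def haar_idwt2(LL, LH, HL, HH):
    """Exact inverse of haar_dwt2 (average/difference pairs inverted)."""
    rows, cols = LL.shape
    lo = np.zeros((2 * rows, cols))
    hi = np.zeros((2 * rows, cols))
    lo[0::2], lo[1::2] = LL + LH, LL - LH
    hi[0::2], hi[1::2] = HL + HH, HL - HH
    out = np.zeros((2 * rows, 2 * cols))
    out[:, 0::2], out[:, 1::2] = lo + hi, lo - hi
    return out

def fuse(intensity, pan):
    """Merging rule of step 5.4: keep the intensity approximation (LL)
    and inject the PAN detail subbands, then invert (step 5.5)."""
    v_ll, _, _, _ = haar_dwt2(intensity)
    _, p_lh, p_hl, p_hh = haar_dwt2(pan)
    return haar_idwt2(v_ll, p_lh, p_hl, p_hh)
```

The same substitution is applied to the NIR component, yielding the two fused intensity images V_mw and NIR_w described above.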
    </sec>
    <sec id="sec-4">
      <title>Results</title>
      <sec id="sec-4-1">
        <title>Visual Analysis</title>
        <p>The test WorldView-3 data set consists of 8-band multispectral data with 1.24 m
spatial resolution and a panchromatic image with 0.31 m spatial resolution. Fig. 3
shows a 400×400 detail of the whole scene, which contains buildings, grass, trees and
roads. The MUL subscene image in RGB composition (Bands 5-3-2) is shown in Fig.
3a, the NIR color composition (Bands 7-2-5) is shown in Fig. 3b, and the full-resolution
panchromatic image is shown in Fig. 3c. The results obtained from the Brovey transform,
the Gram-Schmidt method and the proposed technology are reported in Figs. 3(d)-3(f),
respectively. A visual comparison of the results makes it possible to assert that the
spatial resolution of the original multispectral data is improved. Buildings and roads
in the resulting image are much sharper than in the original image. In the result obtained
by the proposed technology, the spatial details appear as sharp as those in the
panchromatic image, and the spectral information is faithfully preserved without any
obvious color distortion.</p>
        <p>Fig. 3. Satellite images of WorldView-3: a) multichannel; b) NIR color composition; c)
panchromatic; d) Brovey transform; e) Gram-Schmidt method; f) the proposed technology result.</p>
      </sec>
      <sec id="sec-4-2">
        <title>Quantitative Analysis</title>
        <p>
          As visual analysis is very subjective and depends on the interpreter, a number of
statistical analyses were performed [31-33]. To evaluate the spectral and spatial
quality of the pansharpened images, we used the relative dimensionless global error in synthesis
(ERGAS) [
          <xref ref-type="bibr" rid="ref11">11, 33</xref>
          ]. Table 1 shows the spectral ERGAS values obtained by known pansharpening
methods (HSV, PCA, Gram-Schmidt, wavelet) and by the images synthesized by the developed
technology. It is clear from its definition that low ERGAS index values represent
high image quality. One of the main difficulties is the quantitative assessment of
visual image quality. To assess visual quality, an approach based on the calculation of
information entropy is often used. Image entropy is a statistical feature that reflects
the average information content of an image [16]. Fig. 4 shows a graphical
representation of the entropy values for the original images and the synthesized
multichannel image produced by our technology. The entropy of the synthesized image
far exceeds that of the initial multichannel image, which indicates that the new
technology improves the information content and detail of objects in multichannel images.
        </p>
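The two measures used above can be computed as follows. This is a sketch assuming the standard ERGAS definition, with `ratio` being the high-to-low resolution ratio h/l (e.g. 0.31/1.24 for this data set), and Shannon entropy over the 8-bit histogram; the function names are ours:

```python
import numpy as np

def ergas(reference, fused, ratio):
    """ERGAS = 100*(h/l)*sqrt(mean_k(RMSE_k^2 / mu_k^2)),
    where mu_k is the mean of reference band k (last axis = bands)."""
    ref = reference.astype(float)
    fus = fused.astype(float)
    terms = []
    for k in range(ref.shape[-1]):
        rmse2 = np.mean((ref[..., k] - fus[..., k]) ** 2)
        terms.append(rmse2 / np.mean(ref[..., k]) ** 2)
    return 100.0 * ratio * np.sqrt(np.mean(terms))

def entropy(img):
    """Shannon entropy of an 8-bit image histogram, in bits."""
    p = np.bincount(img.ravel(), minlength=256) / img.size
    p = p[p > 0]                      # drop empty bins (0*log 0 := 0)
    return float(-np.sum(p * np.log2(p)))
```

A perfect fusion gives ERGAS = 0, and a richer brightness distribution gives higher entropy, matching the interpretation of Table 1 and Fig. 4.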
        <p>The correlation coefficient (CORR) is an important indicator reflecting the
difference between the fused image and the original image [33]. This value ranges from -1
to 1; the best correspondence between the fused and original image data is indicated by the
highest correlation values. Table 2 shows the CORR values for the Gram-Schmidt,
wavelet, packet wavelet, HSV and PCA image fusion methods and for the new technology. Table
2 shows the best results for the proposed technology, while the PCA method presents
acceptable results; all other methods have very low correlation values.</p>
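The CORR value between a fused and an original band can be computed directly as the Pearson correlation coefficient of the flattened images (a minimal sketch; the function name is ours):

```python
import numpy as np

def corr(a, b):
    """Pearson correlation coefficient between two images (flattened)."""
    return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])
```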
        <p>SSIM shows the similarity with the original image. The structural similarity
image quality paradigm is based on the assumption that the human visual system is
highly adapted to extracting structural information from a scene [32]. Table 3
shows the SSIM values of the fused images in comparison with the multispectral
image. The values for all methods except the wavelet method and the proposed technology are near zero, which
confirms that there is only a slight similarity with the original image.</p>
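As a simplified illustration of the measure used in Table 3, a single-window (global) version of SSIM can be computed as follows. The standard SSIM of Wang et al. is averaged over local windows; this sketch uses one window covering the whole image, with the usual constants C1 = (0.01·L)² and C2 = (0.03·L)² for dynamic range L = 255:

```python
import numpy as np

def ssim_global(x, y, data_range=255.0):
    """Global (single-window) SSIM between two images."""
    x = x.astype(float)
    y = y.astype(float)
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    # Combined luminance/contrast/structure comparison
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx**2 + my**2 + c1) * (vx + vy + c2))
```

Identical images give SSIM = 1; structurally unrelated images give values near zero, matching the interpretation of Table 3.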
        <p>Quantitative analysis shows that existing methods for increasing spatial resolution
lead to artifacts. The proposed technology increases the spatial resolution of
multispectral aerospace images without color distortion.</p>
        <p>In this paper, we have presented a new pansharpening technology for high resolution
satellite images. It combines the merit of HSV fusion in
smoothly integrating spatial resolution information with the merit of wavelet fusion
in preserving color information. The visual evaluation shows that the color of the
fusion results of the proposed wavelet-HSV pansharpening technology is very close to
the color of the original MUL images for every data set, whereas the colors of the
standalone HSV fusion results and of the wavelet fusion results are substantially distorted.
Visual and quantitative analyses show that the presented technology preserves the
original spectral features, can achieve high spectral and spatial quality, and
outperforms some existing pansharpening methods.</p>
        <p>Our further research will focus on improving the fusion accuracy of multichannel
image fusion.</p>
      </sec>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Hnatushenko</surname>
            ,
            <given-names>V.V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mozgovyi</surname>
            ,
            <given-names>D.K.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vasyliev</surname>
            ,
            <given-names>V.V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kavats</surname>
            ,
            <given-names>O.O.</given-names>
          </string-name>
          :
          <article-title>Satellite Monitoring of Consequences of Illegal Extraction of Amber in Ukraine</article-title>
          . Scientific bulletin of National Mining University. - State Higher Educational Institution “National Mining University”, Dnipropetrovsk, №
          <volume>2</volume>
          (
          <issue>158</issue>
          ).
          <source>pp. 99-105</source>
          , (
          <year>2017</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2. Zhang, Y.:
          <article-title>Problems in the fusion of commercial high-resolution satellite as well as Landsat 7 images and initial solutions</article-title>
          .
          <source>International Archives of Photogrammetry Remote Sensing and Spatial Information Sciences</source>
          ,
          <volume>34</volume>
          (
          <issue>4</issue>
          ), p.
          <fpage>587</fpage>
          -
          <lpage>592</lpage>
          , (
          <year>2012</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3.
          <string-name>
            <surname>Hordiiuk</surname>
            ,
            <given-names>D.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hnatushenko</surname>
            ,
            <given-names>V.V.</given-names>
          </string-name>
          :
          <article-title>Neural network and local laplace filter methods applied to very high resolution remote sensing imagery in urban damage detection</article-title>
          .
          <source>2017 IEEE International Young Scientists Forum on Applied Physics and Engineering (YSF)</source>
          , (
          <year>2017</year>
          ). doi: 10.1109/ysf.2017.8126648.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Gnatushenko</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          :
          <article-title>The use of geometrical methods in multispectral image processing</article-title>
          .
          <source>Journal of Automation and Information Sciences</source>
          , Volume
          <volume>35</volume>
          (
          <issue>12</issue>
          ),
          <fpage>1</fpage>
          -
          <lpage>8</lpage>
          , (
          <year>2003</year>
          ). doi: 10.1615/JAutomatInfScien.v35.i12.10.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Kashtan</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hnatushenko</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shedlovska</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          :
          <article-title>Processing technology of multispectral remote sensing images</article-title>
          .
          <source>International Young Scientists Forum on Applied Physics</source>
          <year>2017</year>
          , p.
          <fpage>355</fpage>
          -
          <lpage>358</lpage>
          .
          Lviv (
          <year>2017</year>
          ). doi: 10.1109/YSF.2017.8126647.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <string-name>
            <surname>Meng</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shen</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>Fu</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          :
          <article-title>Review of the pansharpening methods for remote sensing images based on the idea of meta-analysis: Practical discussion and challenges</article-title>
          .
          <source>Information Fusion</source>
          ,
          <volume>46</volume>
          ,
          <fpage>102</fpage>
          -
          <lpage>113</lpage>
          , (
          <year>2019</year>
          ). doi: 10.1016/j.inffus.2018.05.006.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Vivone</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Alparone</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Chanussot</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Dalla</surname>
            <given-names>Mura</given-names>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            ,
            <surname>Garzelli</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            ,
            <surname>Licciardi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G. A.</given-names>
            ,
            <surname>Restaino</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            , and
            <surname>Wald</surname>
          </string-name>
          , L.:
          <article-title>A critical comparison among pansharpening algorithms</article-title>
          .
          <source>IEEE Trans. Geosci</source>
          . Remote Sens., vol.
          <volume>53</volume>
          , no.
          <issue>5</issue>
          , pp.
          <fpage>2565</fpage>
          -
          <lpage>2586</lpage>
          , (
          <year>2015</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Ghassemian</surname>
          </string-name>
          , H.:
          <article-title>A review of remote sensing image fusion methods</article-title>
          .
          <source>Information Fusion</source>
          ,
          <volume>32</volume>
          ,
          <fpage>75</fpage>
          -
          <lpage>89</lpage>
          , (
          <year>2016</year>
          ). doi: 10.1016/j.inffus.2016.03.003.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Xu</surname>
            ,
            <given-names>Q.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ding</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          :
          <article-title>High-Fidelity Component Substitution Pansharpening by the Fitting of Substitution Data</article-title>
          .
          <source>IEEE Transactions on Geoscience and Remote Sensing</source>
          , vol.
          <volume>52</volume>
          , no.
          <issue>11</issue>
          , pp.
          <fpage>7380</fpage>
          -
          <lpage>7392</lpage>
          , (
          <year>2014</year>
          ). doi: 10.1109/TGRS.2014.2311815.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Sulaiman</surname>
            ,
            <given-names>A.G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Elashmawi</surname>
            ,
            <given-names>W.H.</given-names>
          </string-name>
          , &amp;
          <string-name>
            <surname>El-Tawel</surname>
            ,
            <given-names>G.S.:</given-names>
          </string-name>
          <article-title>A Robust Pan-Sharpening Scheme for Improving Resolution of Satellite Images in the Domain of the Nonsubsampled Shearlet Transform</article-title>
          .
          <source>Sensing and Imaging</source>
          , (
          <year>2019</year>
          ).
          <volume>21</volume>
          (
          <issue>1</issue>
          ).
          doi: 10.1007/s11220-019-0268-5.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          11.
          <string-name>
            <surname>Hnatushenko</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hnatushenko</surname>
          </string-name>
          , Vik.,
          <string-name>
            <surname>Kavats</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shevchenko</surname>
            ,
            <given-names>V.</given-names>
          </string-name>
          :
          <article-title>Pansharpening technology of high resolution multispectral and panchromatic satellite images</article-title>
          .
          <source>Scientific Bulletin of National Mining University, Issue</source>
          <volume>4</volume>
          ,
          <fpage>91</fpage>
          -
          <lpage>98</lpage>
          (
          <year>2015</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          12.
          <string-name>
            <surname>Vivone</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Alparone</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Chanussot</surname>
            ,
            <given-names>J.; Dalla</given-names>
          </string-name>
          <string-name>
            <surname>Mura</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Garzelli</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Licciardi</surname>
            ,
            <given-names>G.A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Restaino</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wald</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          :
          <article-title>A critical comparison among pansharpening algorithms</article-title>
          .
          <source>IEEE Trans. Geosci. Remote Sens</source>
          .
          <volume>53</volume>
          , p.
          <fpage>2565</fpage>
          -
          <lpage>2586</lpage>
          (
          <year>2015</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          13.
          <string-name>
            <surname>Aishwarya</surname>
            ,
            <given-names>N</given-names>
          </string-name>
          , Abirami,
          <string-name>
            <given-names>S.</given-names>
            and
            <surname>Amutha</surname>
          </string-name>
          , R.:
          <article-title>Multifocus image fusion using Discrete Wavelet Transform and Sparse Representation</article-title>
          .
          <source>2016 International Conference on Wireless Communications, Signal Processing and Networking (WiSPNET)</source>
          , Chennai,
          <year>2016</year>
          , pp.
          <fpage>2377</fpage>
          -
          <lpage>2382</lpage>
          , (
          <year>2016</year>
          ). doi: 10.1109/WiSPNET.2016.7566567.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          14.
          <string-name>
            <surname>Rahmani</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Strait</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Merkurjev</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Moeller</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wittman</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          :
          <article-title>An adaptive IHS pansharpening method</article-title>
          .
          <source>IEEE Geosci. Remote Sens. Lett.</source>
          , vol.
          <volume>7</volume>
          , no.
          <issue>4</issue>
          , pp.
          <fpage>746</fpage>
          -
          <lpage>750</lpage>
          (
          <year>2010</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          15.
          <string-name>
            <surname>Yin</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Pansharpening with multiscale normalized nonlocal means filter: a two-step approach</article-title>
          .
          <source>IEEE Transactions on Geoscience and Remote Sensing</source>
          , vol.
          <volume>53</volume>
          , no.
          <issue>10</issue>
          , pp.
          <fpage>5734</fpage>
          -
          <lpage>5745</lpage>
          (
          <year>2015</year>
          ).
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>