<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Real-time multispectral video panorama construction</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>I A Kudinov</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>O V Pavlov</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>I S Kholopov</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>M Yu Khramov</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Ryazan State Radio Engineering University</institution>
          ,
          <addr-line>Gagarina Street 59/1, Ryazan, Russia, 390005</addr-line>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Scientific and Design Center of Video and Computer technologies, Ryazan State Instrumentmaking Enterprise</institution>
          ,
          <addr-line>Seminarskaya Street 32, Ryazan, Russia, 390005</addr-line>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2018</year>
      </pub-date>
      <abstract>
        <p>An algorithm for constructing a video panorama from the data of distributed multispectral cameras is described. It is shown that vision-enhancement operations (a modified Multiscale Retinex algorithm) and multispectral image fusion are implemented for two independently chosen regions of interest with a frame size of 1024×768 pixels at 30 fps using CUDA technology.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>The real-time automatic generation of high-resolution video panoramas from the data of several
cameras with partially overlapping fields of view (FoV) is one of the modern trends in the development
of vision systems. Generally, panorama navigation implies the presence of a user-controlled
region of interest (RoI). This approach is an alternative to mechanical drive-based vision systems, as
it allows several users to operate simultaneously, each independently choosing a personal RoI,
without mechanically moving the camera system. Another advantage of distributed panoramic systems
(DPS) is the integration of the video cameras into the body or fuselage of the carrier object, which positively
affects its aerodynamic properties.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Methods of panorama construction</title>
      <p>
        There are two basic approaches to panorama stitching:
• an approach based on finding the correspondence of homogeneous pixel coordinates on the
frames of cameras with numbers i and j by detecting and matching keypoints using their
descriptors and estimating the homography matrix Hij [
        <xref ref-type="bibr" rid="ref1 ref2">1, 2</xref>
        ];
• an approach based on finding the correspondence of homogeneous pixel coordinates by
preliminarily calibrating the DPS cameras with a test object and an auxiliary wide-angle
camera [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], used when the fields of view of the cameras have only a small angular intersection zone or
do not intersect at all.
      </p>
      <p>The advantage of the first approach is operability even in the absence of a priori information about
the mutual placement of the DPS cameras; the advantage of the second is operability in difficult
observation conditions and in low-contrast observed scenes.</p>
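      <p>
        The first, keypoint-based approach reduces to estimating Hij from matched pixel pairs. A minimal numpy sketch of this estimation by the direct linear transform is given below; the function names and the use of numpy are illustrative assumptions, not the implementation of [1, 2].

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src points to dst points
    by the direct linear transform (at least 4 correspondences)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(rows, dtype=float)
    # H is the null vector of A: the last row of V^T in the SVD.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so that H[2, 2] = 1

def apply_homography(H, pt):
    """Map a pixel (x, y) through H in homogeneous coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return x / w, y / w
```

        For four correspondences related by a pure shift, the estimated H maps any interior point by the same shift, which is a convenient sanity check.
      </p>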
    </sec>
    <sec id="sec-3">
      <title>3. Multispectral images superimposition</title>
      <p>
        It is known [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ] that one of the main approaches to increasing situational awareness in poor visibility
conditions is the simultaneous use of cameras operating in different spectral ranges: visible TV and infrared (IR) –
near IR (NIR), short-wave IR (SWIR), medium-wave IR (MWIR) and long-wave IR (LWIR).
Panorama stitching from frames of different spectral ranges is hindered, for each of the methods considered above,
by the different physical nature of the images formed by TV and IR cameras: TV, NIR and
SWIR cameras see the light reflected by the object in the wavelength ranges 0.38...0.76, 0.7...1.1 and
0.9...1.7 μm respectively, whereas MWIR and LWIR cameras see only the thermal radiation of the
object at wavelengths of 3...5 and 8...12 μm respectively. Therefore, to construct a video
panorama in a DPS with cameras of different spectral ranges, depending on the selected method of
panorama stitching, one must either solve the problem of keypoint matching between TV and IR frames or
produce a universal calibration pattern having high contrast in all operating spectral ranges.
      </p>
      <sec id="sec-3-1">
        <title>3.1. Multispectral images keypoints matching</title>
        <p>
          The analysis of publications [
          <xref ref-type="bibr" rid="ref5 ref6 ref7 ref8 ref9">5-9</xref>
          ] allows four basic approaches to the automatic
superimposition of multispectral images to be distinguished:
• based on the transition to a decorrelated colour space and the use of the SIFT method; this
approach is applicable only for combining visible-range RGB frames with NIR frames [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ];
• based on the mutual morphological correlation of pre-segmented images [
          <xref ref-type="bibr" rid="ref6">6</xref>
          ];
• based on the results of contour analysis [
          <xref ref-type="bibr" rid="ref7">7</xref>
          ];
• based on estimating the homography matrix from a manual search for matches [
          <xref ref-type="bibr" rid="ref8 ref9">8, 9</xref>
          ].
        </p>
        <p>These methods are inapplicable in situations where the image in one of the video
channels (usually the TV channel) does not allow selecting the important scene details needed for match
searching (for example, under near-zero illumination, or in dense smoke or fog).</p>
        <p>The superimposition problem can also be solved by mechanical alignment, which
ensures parallel sighting lines, equal angular field-of-view dimensions for the cameras of each
spectral range, and a mutual placement minimizing parallax; however, this
approach is not applicable to the DPS.</p>
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Multispectral camera calibration with universal pattern</title>
        <p>
          Photogrammetric calibration of the DPS cameras is the most universal approach for combining images in
the far zone, but it requires a test pattern that has high contrast in several
spectral ranges simultaneously. Examples of such patterns with the typical "chessboard" image are considered in [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ]. The calibration results allow us to estimate the matrices of internal and external parameters of the
multispectral DPS cameras.
        </p>
      </sec>
    </sec>
    <sec id="sec-3a">
      <title>4. Algorithm of the video panorama construction according to information from multispectral cameras</title>
      <p>
        Since the images of the DPS cameras exhibit geometric distortions caused by different shooting
angles, it is expedient to minimize them by constructing the panoramic frame on a virtual surface of
uniform curvature: a sphere or a cylinder of unit radius [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. The algorithm implemented by the authors for filling a RoI moving along the spherical video panorama [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] for a DPS with pre-calibrated
multispectral cameras comprises the following steps.
      </p>
      <p>
        1. Initialization: calculation of the quaternions quv0 that specify the initial angular directions to the pixels
of the RoI [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. If the field of view of the RoI must be changed dynamically, the quaternion quv0 is
recalculated in the body of the main operation cycle.
      </p>
      <p>Main operation cycle.
2. Estimation of the current angular position of the DPS reference camera by pitch θ and roll γ
(for example, from the data of the inertial measurement unit) and of the corresponding rotation
matrix:</p>
      <p>R = RθRγ = [cos θ, −sin θ, 0; sin θ, cos θ, 0; 0, 0, 1] · [1, 0, 0; 0, cos γ, −sin γ; 0, sin γ, cos γ], (1)</p>
      <p>where each matrix is written row by row.</p>
      <p>3. Calculation of the rotation quaternion qvis for the given angular position of the line of sight (the center of
the RoI) and of the quaternions quv determining the current angular positions of the RoI pixels (u, v):</p>
      <p>quv = qvisquv0. (2)</p>
      <p>
        4. Filling the RoI for each spectral range with camera pixels by re-projecting points from the
surface of a unit virtual sphere onto the camera matrices (with lens distortion compensation [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]).
5. Application of the blending procedure [
        <xref ref-type="bibr" rid="ref1 ref14">1, 14</xref>
        ] to the RoI of each spectral range.
6. Pixel-level image fusion [
        <xref ref-type="bibr" rid="ref15">15</xref>
        ] according to the selected fusion algorithm.
      </p>
      <p>Since the processing by the above algorithm is homogeneous for every pixel of the RoI,
the computations can be parallelized, for example using the
resources of the GPU.</p>
    </sec>
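    <p>
      Steps 2 and 3 of the algorithm can be sketched in a few lines; this is a minimal numpy illustration of the rotation matrix (1) and of applying a quaternion to a direction, with the symbol γ for roll and the (w, x, y, z) component order taken as assumptions.

```python
import numpy as np

def rotation_pitch_roll(theta, gamma):
    """Rotation matrix R = R_theta R_gamma for pitch theta and roll gamma,
    written as in eq. (1)."""
    ct, st = np.cos(theta), np.sin(theta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    R_theta = np.array([[ct, -st, 0.0], [st, ct, 0.0], [0.0, 0.0, 1.0]])
    R_gamma = np.array([[1.0, 0.0, 0.0], [0.0, cg, -sg], [0.0, sg, cg]])
    return R_theta @ R_gamma

def quat_mul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    ])

def rotate_direction(q, d):
    """Rotate a unit direction d by quaternion q (computes q d q*),
    the per-pixel operation behind the update quv = qvisquv0 of eq. (2)."""
    p = np.array([0.0, d[0], d[1], d[2]])
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mul(quat_mul(q, p), q_conj)[1:]
```

      Rotating every RoI direction by the single quaternion qvis is exactly the homogeneous per-pixel operation that the re-projection step then maps onto the camera matrices.
    </p>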
    <sec id="sec-4">
      <title>5. Description of the DPS layout</title>
      <p>
        The DPS layout develops the authors’ previous work [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ] and, in addition to grayscale
TV cameras, contains a LWIR thermal camera with a field of view of 50°×40° (figure 1). The mutual
angular position of the fields of view of the TV cameras and the thermal camera in the 200°×120°
sector is shown in figure 2. To synchronize frames in time, external sync pulses are applied.
      </p>
      <p>To fill the RoI, the computations are divided into parallel blocks (64 horizontally and 48
vertically) of 256 threads each (16 threads horizontally and 16 vertically) using CUDA and
CUDA C. Since copying data from CPU memory to GPU memory and back is relatively
slow, the number of such transfers is minimized in our implementation of the video panorama.</p>
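      <p>
        The stated partition can be checked with a small arithmetic helper (an illustrative sketch, not the actual CUDA launch code).

```python
def cuda_grid(width, height, tile=16):
    """Number of CUDA blocks covering a width x height RoI with
    tile x tile threads per block, rounding up at the edges."""
    blocks_x = (width + tile - 1) // tile
    blocks_y = (height + tile - 1) // tile
    return blocks_x, blocks_y, tile * tile
```

        For the 1024×768 RoI this yields 64 blocks horizontally, 48 vertically and 256 threads per block, matching the configuration above.
      </p>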
      <p>
        The DPS layout implements:
• control of the angular position of the operator's line of sight: from the data of the head
tracking system or, if absent, from the joystick;
• independent display of two RoIs with a dynamic change of their field of view from 80°×60° (wide
angle) to 10°×7.5° (telephoto);
• blending according to the algorithm of [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ] (figure 3);
• increasing the contrast of the TV image (figure 4) using the modified Multiscale Retinex
algorithm [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ]: to accelerate the estimation of the background brightness,
a box filter is used instead of Gaussian smoothing;
• RoI display mode selection: TV, grayscale thermal image, false-color thermal image
(the Jet [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ] and Cold-to-Warm [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ] color maps are realized), contrasted TV (Multiscale Retinex),
and image fusion mode – grayscale [
        <xref ref-type="bibr" rid="ref19">19</xref>
        ] or false-color [
        <xref ref-type="bibr" rid="ref19 ref20 ref21">19-21</xref>
        ]; the results are shown in figures 5 and 6;
• mapping of the mutual angular position of the RoIs of the first and second operators.
      </p>
      <p>As the DPS layout currently contains a single thermal camera, the fusion mode is realized only at
angular positions of the RoI that contain a part of area 6 (see figure 2). Otherwise, the user in this mode
receives information only from the TV cameras. This is illustrated in figure 5, where in the thermal-camera
and fusion modes the lower rows of the RoI are filled with information from the TV cameras,
because at the current position of the line of sight their angular coordinates do not fall within the field of
view of the LWIR camera and therefore carry no data in the infrared spectral band.</p>
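      <p>
        Where the RoI does cover area 6, pixel-level false-colour fusion can be sketched as below; the channel assignment is one common illustrative scheme and only stands in for the algorithms of [19-21].

```python
import numpy as np

def false_color_fusion(tv, ir):
    """Simple false-colour pixel-level fusion of co-registered frames:
    IR drives the red channel, TV the green, their mean the blue."""
    tv = tv.astype(float)
    ir = ir.astype(float)
    rgb = np.stack([ir, tv, 0.5 * (tv + ir)], axis=-1)
    return np.clip(rgb, 0.0, 255.0).astype(np.uint8)
```

        Outside the LWIR field of view the ir plane is absent, so, as in figure 5, those RoI rows fall back to the TV channel alone.
      </p>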
      <p>On the NVIDIA GeForce GTX 560 Ti GPU (384 cores), with the maximum amount of computation
(blending plus false-color fusion of the contrasted TV and IR channels) and a RoI size of 1024×768 pixels for
each of the two operators, the information update rate is 32 Hz. The computation speed for the
other modes is given in the table, where, for comparison, the processing speed of a video panorama
implementation on an Intel Core i5 CPU is also provided. All values in the table are
rounded to the nearest whole number.</p>
      <p>As can be seen from the table, the use of parallel computations increases the processing speed by
an average of 16 times.</p>
    </sec>
    <sec id="sec-5">
      <title>6. Conclusion</title>
      <p>Using CUDA technology, the algorithm for creating a video panorama from multispectral camera
data implements, for two regions of interest with a resolution of 0.7 Mpx, independent display of
video information and vision-enhancement functions (Multiscale Retinex and
pixel-level fusion) at a rate of no less than 30 fps.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[1] Brown M and Lowe D 2007 Int. J. Comput. Vision 74(1) 59-73</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>[2] Fischler M and Bolles R 1981 Commun. ACM 24(6) 381-395</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[3] Shirokov R I and Alekhnovich V I 2014 Contenant 4 10-23</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[4] Knyaz V A, Vygolov O V, Vizilter Y V, Zheltov S Y and Vishnyakov B V 2016 Proc. SPIE 22 984022</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[5] Brown M and Susstrunk S 2011 Proc. IEEE CVPR (Washington, DC: IEEE Comput. Soc.) 177-184</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>[6] Vizilter Y V, Zheltov S Y, Rubis A Y and Vygolov O V 2016 J. Comput. Syst. Sci. Int. 55 598-608</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>[7] Efimov A I, Novikov A I and Sablina V A 2016 Proc. 5th Mediterr. Conf. Embedded Comput. (Bar) 132-137</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] Bhosle U, Roy S D and Chaudhuri S 2005 Pattern Recognit. Lett. 26(4) 471-482</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[9] Efimov A I and Novikov A I 2016 Computer Optics 40(2) 258-265 DOI: 10.18287/2412-6179-2016-40-2-258-265</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>[10] St-Laurent L, Mikhnevich M, Bubel A and Prévost D 2017 Quant. Infrared Thermography J. 14(2) 193-205</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>[11] Szeliski R 2006 Found. Trends Comput. Graphics Vision 2(1) 1-104</mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>[12] Kudinov I A, Pavlov O V, Kholopov I S and Khramov M Yu CEUR Workshop Proc. 1902 37-42</mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>[13] Brown D C 1971 Photogramm. Eng. 37(8) 855-866</mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>[14] Burt P and Adelson E 1983 ACM Trans. Graphics 2(4) 217-236</mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>[15] Li S, Kang X, Fang L, Hu J and Yin H 2017 Inf. Fusion 33 100-112</mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>[16] Jobson D J, Rahman Z and Woodell G A 1997 IEEE Trans. Image Proc. 6(7) 965-976</mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>[17] MATLAB Jet Array (Access mode: https://www.mathworks.com/help/matlab/ref/jet.html)</mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>[18] Moreland K 2009 Proc. 5th Int. Symp. Adv. Visual Comput. (Las Vegas) II 92-103</mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>[19] Zheng Y 2011 Image Fusion and its Applications (Rijeka: InTech)</mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>[20] Li G, Xu S and Zhao X 2010 Proc. SPIE 7710 77100S</mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>[21] Kholopov I S 2016 Computer Optics 40(2) 266-274 DOI: 10.18287/2412-6179-2016-40-2-266-274</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>