<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <contrib-group>
        <aff id="aff0">
          <label>0</label>
          <institution>Zohaib Khan, Faisal Shafait and Ajmal Mian School of Computer Science and Software Engineering The University of Western Australia</institution>
          ,
          <addr-line>35 Stirling Highway, CRAWLEY, 6009</addr-line>
        </aff>
      </contrib-group>
      <abstract>
        <p>Hyperspectral imaging and analysis refers to the capture and understanding of image content in multiple spectral channels. Satellite and airborne hyperspectral imaging has been a focus of remote sensing research for nearly three decades. More recently, ground-based hyperspectral imaging has attracted immense interest in areas such as medical imaging, art and archaeology, and computer vision. In this paper, we attempt to draw the forensic and image analysis communities closer together around automated forensic document examination. We believe hyperspectral imaging has great potential for solving challenging document image analysis problems, especially in the forensic document examination domain. As a sample application, we present the use of hyperspectral imaging for ink mismatch detection in handwritten notes. Overall, this paper provides an overview of the applications of hyperspectral imaging with a focus on pattern recognition problems. We hope that this work will pave the way for exploring its true potential in document analysis research.</p>
      </abstract>
      <kwd-group>
        <kwd>Multispectral imaging</kwd>
        <kwd>Hyperspectral imaging</kwd>
        <kwd>Hyperspectral document analysis</kwd>
        <kwd>Forensic document examination</kwd>
        <kwd>Ink mismatch detection</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>I. INTRODUCTION</title>
      <p>
        The human eye exhibits trichromatic vision, owing to the
presence of three types of photoreceptors, called cones, that
are sensitive to different wavelength ranges within the visible
portion of the electromagnetic spectrum [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Conventional imaging
sensors and displays (such as cameras, scanners and monitors) are
designed to match the response of trichromatic human
vision so that they deliver the same perception of an image
as of the real scene. This is why an RGB image comprises
three spectral measurements per pixel. Most computer
vision applications do not exploit this spectral information
and directly employ grayscale images for image understanding.
There is evidence, however, that machine vision tasks can take
advantage of image acquisition over a wider range of the electromagnetic
spectrum, capturing more information about a scene than
RGB data alone. Hyperspectral imaging captures spectral
reflectance information for each pixel over a wide spectral range,
and also provides selectivity in the choice of frequency bands.
Satellite-based hyperspectral imaging sensors have long been
used for astronomical and remote sensing applications. Due to
the high cost and complexity of these hyperspectral imaging
sensors, various techniques have been proposed in the literature
that combine conventional imaging systems with a few
off-the-shelf optical devices to perform hyperspectral imaging.
      </p>
      <p>
        Strictly speaking, an RGB image is a three-channel
multispectral image. An image acquired at more than three specific
wavelength bands is referred to as a multispectral image;
multispectral imaging sensors generally acquire more than
three spectral bands. An image with a higher spectral
resolution, i.e., a greater number of bands, is regarded as a hyperspectral
image. There is no clear demarcation in the
number of spectral bands or spectral resolution between multispectral and
hyperspectral images. However, hyperspectral sensors may
acquire a few dozen to several hundred spectral measurements
per scene point. For example, NASA's AVIRIS (Airborne
Visible/Infrared Imaging Spectrometer) has 224 bands
in the 400-2500nm range [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
      </p>
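The figures above imply a nominal spectral sampling interval; a minimal sketch, under the simplifying assumption that the 224 band centers are uniformly spaced over 400-2500nm:

```python
def band_centers(lo_nm, hi_nm, n_bands):
    """Return n_bands band centers evenly spanning [lo_nm, hi_nm]."""
    step = (hi_nm - lo_nm) / (n_bands - 1)
    return [lo_nm + i * step for i in range(n_bands)]

# AVIRIS-like configuration from the text: 224 bands over 400-2500 nm.
centers = band_centers(400.0, 2500.0, 224)
spacing = centers[1] - centers[0]
print(len(centers), round(spacing, 1))  # 224 bands, roughly 9.4 nm apart
```

In reality, AVIRIS combines several spectrometers with slightly overlapping, non-uniform sampling, so this is only a back-of-the-envelope figure.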
      <p>
        Over the past several years, hyperspectral imaging has
found utility in various ground-based applications. Its use
in the restoration of archaeological artifacts
has shown promising results: it is now possible to read
old, illegible historical manuscripts by restoring them with
hyperspectral imaging [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], a task that is very difficult for the
naked eye, whose sensitivity is restricted to the visible
spectral range. Similarly, hyperspectral imaging has been
applied to the task of material discrimination. A material's physical
properties cause it to reflect a specific range
of wavelengths, giving it a spectral signature that can be
used for material identification [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. The greatest advantage of
hyperspectral imaging in such applications is that it is
non-invasive and therefore, unlike invasive techniques, does not
affect the material under analysis.
      </p>
      <p>Despite the success of hyperspectral imaging in solving
various challenging computer vision problems in recent years,
its use in document image analysis research has remained
largely unexplored. In this paper, we intend to draw the
attention of the document analysis and forensics communities
towards this promising technology. We believe that
hyperspectral imaging has huge potential for solving
challenging document image analysis problems, especially in
the forensic document examination domain. First, we present
in Section II a brief survey of the applications of hyperspectral
imaging in the field of pattern recognition. Then, some of
our recent work on forensic document examination using
hyperspectral imaging is discussed in Section III. The paper
concludes with directions for future research
in Section IV.</p>
      <p>II. HYPERSPECTRAL IMAGING AND APPLICATIONS</p>
      <p>A hyperspectral image has three dimensions: two spatial
(Sx and Sy) and one spectral (S) (see Figure 1). The
hyperspectral data can be represented in the form of a Spectral
Cube. Similarly, a hyperspectral video has four dimensions:
two spatial dimensions (Sx and Sy), a spectral dimension (S)
and a temporal dimension (t). A hyperspectral video can
be thought of as a series of Spectral Cubes along the temporal
dimension. Hyperspectral imaging has been applied in various
areas, some of which are listed in Table I. In the following,
we provide a brief survey of the applications of hyperspectral
imaging in pattern recognition. The scope of our survey is
limited to the multispectral and hyperspectral imaging systems
used in ground-based computer vision applications. Therefore,
high-cost and complex sensors for remote sensing, astronomy,
and other geo-spatial applications are excluded from the
discussion.</p>
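The cube-and-video layout described above can be made concrete with a small sketch (synthetic reflectance values, not data from any particular sensor):

```python
import numpy as np

# A Spectral Cube: a 3-D array indexed by two spatial dimensions (Sx, Sy)
# and one spectral dimension (S). Values here are synthetic reflectances.
rng = np.random.default_rng(0)
sx, sy, s = 64, 48, 33           # e.g. 33 bands, 400-720 nm at 10 nm steps
cube = rng.random((sx, sy, s))

# The spectral response of one scene point is a 1-D slice along S.
spectrum = cube[10, 20, :]       # shape: (33,)

# A hyperspectral video is a series of Spectral Cubes along time t.
video = np.stack([cube, cube, cube], axis=0)  # shape: (t, Sx, Sy, S)
print(cube.shape, spectrum.shape, video.shape)
```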
      <sec id="sec-1-1">
        <title>A. Biometrics Applications</title>
        <p>The bulk of biometric recognition research revolves around
monochromatic imaging. Recently, different biometric
modalities have taken advantage of hyperspectral imaging for reliable
and improved recognition. The images can cover visible,
infrared, or a combination of both ranges of the
electromagnetic spectrum (see Figure 2). We briefly discuss the recent
work in palmprint, face, fingerprint, and iris recognition using
hyperspectral imaging.</p>
        <p>
          Palmprints have emerged as a popular choice for human
access control and identification. Interestingly, palmprints have
even more to offer when imaged under different spectral
ranges. The line pattern is captured in the visible range
while the vein pattern becomes apparent in the near infrared
range. Both line and vein information can be captured using a
multispectral imaging system such as those developed by Han
et al. [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ] or Hao et al. [
          <xref ref-type="bibr" rid="ref6">6</xref>
          ]. The underlying principle of a
multispectral palmprint imaging device is to use a monochromatic
camera with illumination sources of different colors. Images
of a palm are sequentially captured under each illumination
within a fraction of a second.
        </p>
        <p>
          The multispectral palmprint recognition system of Han et al. [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ]
captured images under four different illuminations (red, green,
blue and infrared). The first two bands (blue and green)
generally showed only the line structure, the red band showed
both line and vein structures, whereas the infrared band showed
only the vein structure. These images can be fused and
features extracted for subsequent matching and recognition. The
contact-free imaging system of Hao et al. [
          <xref ref-type="bibr" rid="ref6">6</xref>
          ] acquires
multispectral images of a palm under six different illuminations. The
contact-free nature of the system offers more user acceptability
while maintaining reasonable accuracy. Experiments show
that pixel-level fusion of multispectral palmprints yields
recognition performance well above that of traditional
monochromatic systems.
        </p>
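As an illustration of pixel-level fusion of registered band images, here is a minimal sketch using a plain per-pixel weighted average; the cited systems use more elaborate (e.g. wavelet-based) fusion rules, so this only conveys the general idea:

```python
import numpy as np

def fuse_bands(bands, weights=None):
    """Fuse co-registered single-channel images by per-pixel weighted average."""
    stack = np.stack(bands, axis=0).astype(float)
    if weights is None:
        weights = np.full(len(bands), 1.0 / len(bands))
    w = np.asarray(weights, dtype=float).reshape(-1, 1, 1)
    return (w * stack).sum(axis=0)

# Synthetic stand-ins for palm images under four illuminations:
# blue/green (line structure), red (lines and veins), infrared (veins).
rng = np.random.default_rng(1)
blue, green, red, nir = (rng.random((120, 160)) for _ in range(4))
fused = fuse_bands([blue, green, red, nir])
print(fused.shape)  # (120, 160)
```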
        <p>
          Fingerprints are established as one of the most reliable
biometrics and are in common use around the world. Fingerprints
can yield even more robust features when captured under a
multispectral sensor. Rowe et al. [
          <xref ref-type="bibr" rid="ref7">7</xref>
          ] developed a multispectral
imaging sensor for fingerprint imaging. The system comprised
an illumination source with multiple wavelengths (400, 445, 500,
574, 610 and 660nm) and a monochrome CCD of 640x480
resolution. They showed that multispectral sensors are less affected
by the moisture content of the skin, which is a critical limitation
of traditional sensors. Recognition based on
multispectral fingerprints outperformed standard fingerprint
imaging.
        </p>
        <p>
          Face recognition has immense value in human
identification and surveillance. The spectral response of human
skin is a distinctive feature that is largely invariant to pose
and expression [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ] variation. Moreover, multispectral images
of faces are less susceptible to variations in illumination
sources and their directions [
          <xref ref-type="bibr" rid="ref9">9</xref>
          ]. Multispectral face recognition
systems generally use a monochromatic camera coupled with
a Liquid Crystal Tunable Filter (LCTF) in the visible and/or
near-infrared range. A multispectral image is captured by
electronically tuning the filter to the desired wavelengths and
acquiring images in a sequence.
        </p>
      <p>(Figure 2: spectral ranges used in multispectral imaging; the visible band spans approximately 400-700 nm, with the ultraviolet below and the near/far infrared above.)</p>
        <p>
          Iris is another unique biometric used for person
authentication. Boyce et al. [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ] explored multispectral iris imaging
in the visible electromagnetic spectrum and compared it to
the near-infrared imaging used in conventional iris recognition systems. The
use of multispectral information for iris enhancement and
segmentation resulted in improved recognition performance.
        </p>
      </sec>
      <sec id="sec-1-2">
        <title>B. Material Identification</title>
        <p>
          Naturally existing materials show a characteristic spectral
response to incident light. This property of a material can
distinguish it from other materials. The use of multispectral
techniques for imaging works of art such as paintings allows
segmentation and classification of painted parts, based
on the pigments' physical properties and chemical
composition [
          <xref ref-type="bibr" rid="ref3">3</xref>
          ]. Pelagotti et al. [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ] used multispectral imaging
for the analysis of paintings. They collected multispectral images
of a painting in the UV, visible and near-IR bands. Based on spectral
reflectance information, it was possible to differentiate between
color pigments that appear similar to the naked eye.
        </p>
        <p>
          Gregoris et al. [
          <xref ref-type="bibr" rid="ref12">12</xref>
          ] exploited the characteristic reflectance
of ice in the infrared band to detect ice on surfaces
that are difficult to inspect manually. The developed prototype,
called the MD Robotics' Spectral Camera system, could determine
the type, level and location of ice contamination on a
surface, and was able to estimate the thickness of thin
ice layers (&lt;0.5mm) from the measured spectral contrast.
Such a system could be of great utility for aircraft and space shuttle
ice-contamination inspection, and for road condition monitoring in
snow conditions.
        </p>
        <p>
          Multispectral imaging is also of critical importance in magnetic
resonance imaging. Multispectral magnetic resonance imagery
of the brain is in wide use in medical science: various brain
tissue types are distinguishable by virtue of multispectral
imaging, which aids medical diagnosis [
          <xref ref-type="bibr" rid="ref13">13</xref>
          ].
        </p>
        <p>
          Clemmensen et al. [
          <xref ref-type="bibr" rid="ref14">14</xref>
          ] used multispectral imaging to
estimate the moisture content of sand used in concrete, a
very useful technique for the non-destructive in-situ examination
of freshly laid concrete. A total of nine spectral bands were
acquired across the visible and near-infrared ranges. Zawada et
al. [
          <xref ref-type="bibr" rid="ref15">15</xref>
          ] proposed a novel underwater multispectral imaging
system named LUMIS (Low light level Underwater
Multispectral Imaging System) and demonstrated its use in the study of
phytoplankton and bleaching experiments.
        </p>
        <p>
          Spectrometry techniques are also widely used to identify
the fat content in pork meat, because it has proved significantly
cheaper and more efficient than traditional analytical chemistry
methods [
          <xref ref-type="bibr" rid="ref16">16</xref>
          ]. For that purpose, near-infrared spectrometers are
used that measure the spectrum of light transmitted through a
sample of minced pork meat.
        </p>
        <p>
          Last but not least, multispectral imaging also has important
applications in defense and security. For instance, Alouini [
          <xref ref-type="bibr" rid="ref17">17</xref>
          ]
showed that multispectral polarimetric imaging significantly
enhances the performance of target detection and
discrimination.
        </p>
        <p>III. FORENSIC DOCUMENT EXAMINATION USING HYPERSPECTRAL IMAGING</p>
        <p>
          Hyperspectral imaging (HSI) has recently emerged as an
efficient non-destructive tool for detection, enhancement [
          <xref ref-type="bibr" rid="ref18">18</xref>
          ],
comparison and identification of forensic traces [
          <xref ref-type="bibr" rid="ref19">19</xref>
          ]. Such
systems have a huge potential for aiding forensic document
examiners in various tasks. Brauns et al. [
          <xref ref-type="bibr" rid="ref20">20</xref>
          ] developed a
hyperspectral imaging system to detect forgery in potentially
fraudulent documents in a non-destructive manner. A more
sophisticated hyperspectral imaging system was developed at the
National Archives of Netherlands for the analysis of historical
documents in archives and libraries [
          <xref ref-type="bibr" rid="ref21">21</xref>
          ]. The system provided
high spatial and spectral resolution from near-UV through
visible to near IR range. The only limitation of the system
was its extremely slow acquisition time (about 15 minutes)
[
          <xref ref-type="bibr" rid="ref22">22</xref>
          ]. Other commercial hyperspectral imaging systems from
Foster &amp; Freeman [
          <xref ref-type="bibr" rid="ref23">23</xref>
          ] and ChemImage [
          <xref ref-type="bibr" rid="ref24">24</xref>
          ] also allow manual
comparison of writing ink samples. Hammond [25] used visual
comparison in the Lab color mode to differentiate between
black inks. Such manual analysis of inks cannot establish the
presence of different inks with certainty, because of inherent
human error. Here we demonstrate a promising application
of hyperspectral imaging for automated writing-ink mismatch
detection that we have recently proposed [
          <xref ref-type="bibr" rid="ref26">26</xref>
          ]. The work is
based on the assumption that identical inks exhibit similar spectral
responses whereas different inks show dissimilar
spectra. This phenomenon is illustrated in Figure 3. We further assume
that the spectral responses of the inks are independent of the
writing styles of different subjects.
        </p>
        <p>
          Using our hyperspectral imaging setup (see [
          <xref ref-type="bibr" rid="ref26">26</xref>
          ] for
details), a database comprising 70 hyperspectral images of
a handwritten note in 10 different inks by 7 subjects was
collected1. All subjects were instructed to write the same
sentence, once with each ink, on white paper. The pens included
        </p>
      </sec>
    </sec>
    <sec id="sec-2">
      <title>1UWA Writing Ink Hyperspectral Image Database</title>
      <p>http://www.csse.uwa.edu.au/%7Eajmal/databases.html
5 varieties of blue ink pens and 5 varieties of black ink pens. It was
ensured that the pens came from different manufacturers while
the inks still appeared visually similar. Then, we produced
mixed-ink images from single-ink notes by joining
equally sized image portions from two inks written by the
same subject, so that the two inks in question appeared in roughly
equal proportions.</p>
      <p>
        The mixed-ink images were pre-processed
(binarization [
        <xref ref-type="bibr" rid="ref27">27</xref>
        ] followed by spectral response normalization) and then
fed to the k-means clustering algorithm with a fixed value of
k = 2. Finally, based on the output of clustering, segmentation
accuracy was computed as
      </p>
      <p>Accuracy = True Positives / (True Positives + False Positives + False Negatives)</p>
      <p>
The segmentation accuracy is averaged over seven samples for
each ink combination Cij . It is important to note that according
to this evaluation metric, the accuracy of a random guess (in a
two class problem) will be 1/3. This is different from common
classification accuracy metrics where the accuracy of a random
guess is 1/2. This is because our chosen metric additionally
penalizes false negatives which are useful to quantify in a
segmentation problem.</p>
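The 1/3 baseline can be checked with a short sketch (synthetic labels only, not the database results): with two inks in equal proportion, a uniform random guess yields TP, FP and FN in roughly equal counts, so TP/(TP+FP+FN) approaches 1/3.

```python
import random

def segmentation_accuracy(true_labels, pred_labels, positive=1):
    """Accuracy = TP / (TP + FP + FN), the metric defined in the text."""
    pairs = list(zip(true_labels, pred_labels))
    tp = sum(1 for t, p in pairs if t == positive and p == positive)
    fp = sum(1 for t, p in pairs if t != positive and p == positive)
    fn = sum(1 for t, p in pairs if t == positive and p != positive)
    return tp / (tp + fp + fn)

random.seed(0)
n = 100_000
truth = [i % 2 for i in range(n)]                 # two inks, equal proportion
guess = [random.randint(0, 1) for _ in range(n)]  # uniform random guess
print(segmentation_accuracy(truth, guess))        # close to 1/3
```

A perfect segmentation gives 1.0, while a conventional two-class accuracy would give a random guess 1/2; the extra FN term in the denominator is what pulls the baseline down to 1/3.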
      <p>(Figure 4: mean normalized spectra of the five blue inks and the five black inks over the 400-700 nm wavelength range.)</p>
      <p>Figure 4 shows the average normalized spectra of all blue
and black inks, respectively. These were obtained by computing the
average of the spectral responses of each ink over all samples
in the database. It can be observed that the spectra of the inks
are distinguishable in different ranges of the visible spectrum. A
close analysis of the variability of the ink spectra in these ranges
reveals that most of the differences are present in the
high-visible range, followed by the mid-visible and low-visible ranges.</p>
      <p>We now inspect how hyperspectral information can be
beneficial in the discrimination of inks, by comparing the
segmentation accuracy of HSI with that of RGB in Figure 5.</p>
      <p>(Figure 5: segmentation accuracy of HSI vs. RGB for each blue and black ink combination.)</p>
      <p>
        As expected,
HSI significantly improves over RGB for most of the ink
combinations, yielding the most accurate clustering for ink
combinations C12, C14, C25, C35 and C45. In the case of
black inks, ink 1 is highly distinguishable from all the other inks,
resulting in the most accurate clustering for all combinations
C1j. However, it can be seen that for a few combinations
HSI does not show a remarkable improvement; in
some cases it is even less accurate than RGB. These results
encouraged us to look at HSI in more detail in order to
take advantage of the most informative bands. The results of
different feature (band) selection methods for this problem are
detailed in [
        <xref ref-type="bibr" rid="ref26">26</xref>
        ]. Overall, the results showed that use of a few
selected bands further improved discrimination between most
of the ink combinations.
      </p>
      <p>We now present some qualitative results on the segmentation
of blue and black ink combinations. The original images of
a combination of two blue inks (C34) and black inks (C45)
are shown in Figure 6. RGB images are shown here for
better visual appearance. The ground truth images are labeled
in pseudo-colors, where green pixels represent the first ink and
red pixels represent the second ink.</p>
      <p>The clustering based on RGB images fails to group similar
ink pixels into the same clusters. A closer look reveals that all
of the ink pixels are falsely grouped into one cluster, whereas
most of the boundary pixels are grouped into the other cluster.
This implies that typical RGB imaging is not sufficient to
discriminate inks that appear visually similar to each other.
On the other hand, segmentation based on HSI is much more
effective compared to RGB. It can be seen that the majority
of the ink pixels are correctly grouped in HSI in accordance
with the ground truth segmentation. Note that the k-means
clustering algorithm used in this work is rather basic; the
use of more advanced clustering algorithms has the potential
to further improve the accuracy of ink segmentation.</p>
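The clustering step can be sketched as follows; this is a minimal k-means (k = 2) on normalized synthetic spectra standing in for real ink responses, not the exact pipeline of [26]:

```python
import numpy as np

def kmeans(X, k=2, iters=50, seed=0):
    """Basic k-means: assign points to the nearest center, update centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Distance of every spectrum to every cluster center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):            # keep the old center if a cluster empties
                centers[j] = members.mean(axis=0)
    return labels

# Synthetic "ink" spectra: two inks with different spectral shapes plus noise.
rng = np.random.default_rng(42)
bands = 33
ink1 = np.linspace(0.2, 0.6, bands) + rng.normal(0, 0.02, (200, bands))
ink2 = np.linspace(0.6, 0.2, bands) + rng.normal(0, 0.02, (200, bands))
X = np.vstack([ink1, ink2])
X = X / X.sum(axis=1, keepdims=True)    # normalize each spectral response
labels = kmeans(X, k=2)
print(labels.shape)  # (400,)
```

With clearly separated spectral shapes, the two clusters recover the two inks; the hard part in practice is that visually similar inks differ only subtly, which is where the extra hyperspectral bands matter.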
    </sec>
    <sec id="sec-3">
      <title>IV. CONCLUSION AND OUTLOOK</title>
      <p>This paper presented an overview of the different
applications of hyperspectral imaging in pattern recognition. We
also demonstrated a sample application of HSI in document
image analysis, where it was possible to discriminate between
two visually similar inks using hyperspectral images of the
documents. This is the first reported work on using
automatic document image analysis methods in combination with
hyperspectral imaging to address forensically relevant issues
in questioned document examination. In the future, it will be
interesting to see whether spectral imaging can aid in writer
identification. Since it is possible to identify handwriting</p>
      <sec id="sec-3-3">
        <title>(Figure 6 panels: Ground Truth, Result (RGB), Result (HSI))</title>
        <p>
          by the texture [
          <xref ref-type="bibr" rid="ref28">28</xref>
          ] or ink-deposition traces [
          <xref ref-type="bibr" rid="ref29">29</xref>
          ], a promising
research direction would be to investigate whether these fine
variations in ink strokes are reflected in the spectral responses of
the inks. In addition, ink or document aging is a phenomenon
that can be observed in a more effective manner using spectral
imaging. During the aging process, the chemical properties of
ink and paper change due to various environmental factors.
Spectral imaging can potentially capture subtle differences in
inks or paper due to aging. These are just a few application
examples where HSI can potentially provide solutions to some
major practical problems in document analysis. We hope
that this work will open up many exciting possibilities for
tackling forensic document examination problems with a new
perspective.
        </p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>ACKNOWLEDGMENT</title>
      <p>This research was supported by ARC Grant DP110102399.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>P. R.</given-names>
            <surname>Martin</surname>
          </string-name>
          , “
          <article-title>Retinal color vision in primates</article-title>
          ,” in Encyclopedia of Neuroscience. Springer,
          <year>2009</year>
          , pp.
          <fpage>3497</fpage>
          -
          <lpage>3501</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>P.</given-names>
            <surname>Shippert</surname>
          </string-name>
          , “
          <article-title>Introduction to hyperspectral image analysis</article-title>
          ,
          <source>” Online Journal of Space Communication</source>
          , vol.
          <volume>3</volume>
          ,
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>S.</given-names>
            <surname>Baronti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Casini</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Lotti</surname>
          </string-name>
          , and
          <string-name>
            <given-names>S.</given-names>
            <surname>Porcinai</surname>
          </string-name>
          , “
          <article-title>Principal component analysis of visible and near-infrared multispectral images of works of art,” Chemometrics and Intelligent Laboratory Systems</article-title>
          , vol.
          <volume>39</volume>
          , no.
          <issue>1</issue>
          , pp.
          <fpage>103</fpage>
          -
          <lpage>114</lpage>
          ,
          <year>1997</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>B.</given-names>
            <surname>Thai</surname>
          </string-name>
          and G. Healey, “
          <article-title>Invariant subpixel material detection in hyperspectral imagery</article-title>
          ,
          <source>” IEEE Transactions on Geoscience and Remote Sensing</source>
          , vol.
          <volume>40</volume>
          , no.
          <issue>3</issue>
          , pp.
          <fpage>599</fpage>
          -
          <lpage>608</lpage>
          ,
          <year>2002</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>D.</given-names>
            <surname>Han</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Guo</surname>
          </string-name>
          , and
          <string-name>
            <given-names>D.</given-names>
            <surname>Zhang</surname>
          </string-name>
          , “
          <article-title>Multispectral palmprint recognition using wavelet-based image fusion,”</article-title>
          <source>in Proc. International Conference on Signal Processing. IEEE</source>
          ,
          <year>2008</year>
          , pp.
          <fpage>2074</fpage>
          -
          <lpage>2077</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Hao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Sun</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Tan</surname>
          </string-name>
          , and
          <string-name>
            <given-names>C.</given-names>
            <surname>Ren</surname>
          </string-name>
          , “
          <article-title>Multispectral palm image fusion for accurate contact-free palmprint recognition,”</article-title>
          <source>in Proc. International Conference on Image Processing. IEEE</source>
          ,
          <year>2008</year>
          , pp.
          <fpage>281</fpage>
          -
          <lpage>284</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>R. K.</given-names>
            <surname>Rowe</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Nixon</surname>
          </string-name>
          , and
          <string-name>
            <given-names>S.</given-names>
            <surname>Corcoran</surname>
          </string-name>
          , “Multispectral fingerprint biometrics,”
          <source>in Proc. IEEE SMC Information Assurance Workshop</source>
          . IEEE,
          <year>2005</year>
          , pp.
          <fpage>14</fpage>
          -
          <lpage>20</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>Z.</given-names>
            <surname>Pan</surname>
          </string-name>
          , G. Healey,
          <string-name>
            <given-names>M.</given-names>
            <surname>Prasad</surname>
          </string-name>
          , and
          <string-name>
            <given-names>B.</given-names>
            <surname>Tromberg</surname>
          </string-name>
          , “
          <article-title>Face recognition in hyperspectral images</article-title>
          ,
          <source>” IEEE Trans. on Pattern Analysis and Machine Intelligence</source>
          , vol.
          <volume>25</volume>
          , no.
          <issue>12</issue>
          , pp.
          <fpage>1552</fpage>
          -
          <lpage>1560</lpage>
          ,
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>H.</given-names>
            <surname>Chang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Koschan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Abidi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. G.</given-names>
            <surname>Kong</surname>
          </string-name>
          , and C.
          <string-name>
            <surname>-H. Won</surname>
          </string-name>
          , “
          <article-title>Multispectral visible and infrared imaging for face recognition,” in Proc. Computer Vision and Pattern Recognition Workshops</article-title>
          . IEEE,
          <year>2008</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>6</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>C.</given-names>
            <surname>Boyce</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Ross</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Monaco</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Hornak</surname>
          </string-name>
          , and
          <string-name>
            <given-names>X.</given-names>
            <surname>Li</surname>
          </string-name>
          , “
          <article-title>Multispectral iris analysis: A preliminary study,”</article-title>
          <source>in Proc. Computer Vision and Pattern Recognition Workshops (CVPRW)</source>
          . IEEE,
          <year>2006</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>A.</given-names>
            <surname>Pelagotti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Del Mastio</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>De Rosa</surname>
          </string-name>
          ,
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Piva</surname>
          </string-name>
          , “
          <article-title>Multispectral imaging of paintings</article-title>
          ,
          <source>” IEEE Signal Processing Magazine</source>
          , vol.
          <volume>25</volume>
          , no.
          <issue>4</issue>
          , pp.
          <fpage>27</fpage>
          -
          <lpage>36</lpage>
          ,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>D.</given-names>
            <surname>Gregoris</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Yu</surname>
          </string-name>
          , and
          <string-name>
            <given-names>F.</given-names>
            <surname>Teti</surname>
          </string-name>
          , “
          <article-title>Multispectral imaging of ice,”</article-title>
          <source>in Proc. Canadian Conference on Electrical and Computer Engineering</source>
          , vol.
          <volume>4</volume>
          . IEEE,
          <year>2004</year>
          , pp.
          <fpage>2051</fpage>
          -
          <lpage>2056</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>T.</given-names>
            <surname>Taxt</surname>
          </string-name>
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Lundervold</surname>
          </string-name>
          , “
          <article-title>Multispectral analysis of the brain using magnetic resonance imaging</article-title>
          ,
          <source>” IEEE Transactions on Medical Imaging</source>
          , vol.
          <volume>13</volume>
          , no.
          <issue>3</issue>
          , pp.
          <fpage>470</fpage>
          -
          <lpage>481</lpage>
          ,
          <year>1994</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>L. H.</given-names>
            <surname>Clemmensen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. E.</given-names>
            <surname>Hansen</surname>
          </string-name>
          , and
          <string-name>
            <given-names>B. K.</given-names>
            <surname>Ersbøll</surname>
          </string-name>
          , “
          <article-title>A comparison of dimension reduction methods with application to multi-spectral images of sand used in concrete</article-title>
          ,
          <source>” Machine Vision and Applications</source>
          , vol.
          <volume>21</volume>
          , no.
          <issue>6</issue>
          , pp.
          <fpage>959</fpage>
          -
          <lpage>968</lpage>
          ,
          <year>2010</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>D. G.</given-names>
            <surname>Zawada</surname>
          </string-name>
          , “
          <article-title>Image processing of underwater multispectral imagery,”</article-title>
          <source>IEEE Journal of Oceanic Engineering</source>
          , vol.
          <volume>28</volume>
          , no.
          <issue>4</issue>
          , pp.
          <fpage>583</fpage>
          -
          <lpage>594</lpage>
          ,
          <year>2003</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>H. H.</given-names>
            <surname>Thodberg</surname>
          </string-name>
          , “
          <article-title>A review of bayesian neural networks with an application to near infrared spectroscopy</article-title>
          ,
          <source>” IEEE Transactions on Neural Networks</source>
          , vol.
          <volume>7</volume>
          , no.
          <issue>1</issue>
          , pp.
          <fpage>56</fpage>
          -
          <lpage>72</lpage>
          ,
          <year>1996</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>M.</given-names>
            <surname>Alouini</surname>
          </string-name>
          , “
          <article-title>Target detection and discrimination through active multispectral polarimetric imaging,”</article-title>
          <source>in Computational Optical Sensing and Imaging</source>
          . Optical Society of America
          ,
          <year>2005</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>8</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <given-names>S. J.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Deng</surname>
          </string-name>
          , and
          <string-name>
            <given-names>M. S.</given-names>
            <surname>Brown</surname>
          </string-name>
          , “
          <article-title>Visual enhancement of old documents with hyperspectral imaging,”</article-title>
          <source>Pattern Recognition</source>
          , vol.
          <volume>44</volume>
          , no.
          <issue>7</issue>
          , pp.
          <fpage>1461</fpage>
          -
          <lpage>1469</lpage>
          ,
          <year>2011</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          [19]
          <string-name>
            <given-names>G.</given-names>
            <surname>Edelman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Gaston</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>van Leeuwen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Cullen</surname>
          </string-name>
          , and
          <string-name>
            <given-names>M.</given-names>
            <surname>Aalders</surname>
          </string-name>
          , “
          <article-title>Hyperspectral imaging for non-contact analysis of forensic traces,”</article-title>
          <source>Forensic Science International</source>
          , vol.
          <volume>223</volume>
          , pp.
          <fpage>28</fpage>
          -
          <lpage>39</lpage>
          ,
          <year>2012</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          [20]
          <string-name>
            <given-names>E. B.</given-names>
            <surname>Brauns</surname>
          </string-name>
          and
          <string-name>
            <given-names>R. B.</given-names>
            <surname>Dyer</surname>
          </string-name>
          , “
          <article-title>Fourier transform hyperspectral visible imaging and the nondestructive analysis of potentially fraudulent documents,”</article-title>
          <source>Applied Spectroscopy</source>
          , vol.
          <volume>60</volume>
          , no.
          <issue>8</issue>
          , pp.
          <fpage>833</fpage>
          -
          <lpage>840</lpage>
          ,
          <year>2006</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          [21]
          <string-name>
            <given-names>R.</given-names>
            <surname>Padoan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. A.</given-names>
            <surname>Steemers</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Klein</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Aalderink</surname>
          </string-name>
          , and
          <string-name>
            <given-names>G.</given-names>
            <surname>de Bruin</surname>
          </string-name>
          , “
          <article-title>Quantitative hyperspectral imaging of historical documents: technique and applications</article-title>
          ,”
          <source>ART Proceedings</source>
          ,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          [22]
          <string-name>
            <given-names>M. E.</given-names>
            <surname>Klein</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B. J.</given-names>
            <surname>Aalderink</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Padoan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>De Bruin</surname>
          </string-name>
          , and
          <string-name>
            <given-names>T. A.</given-names>
            <surname>Steemers</surname>
          </string-name>
          , “
          <article-title>Quantitative hyperspectral reflectance imaging</article-title>
          ,
          <source>” Sensors</source>
          , vol.
          <volume>8</volume>
          , no.
          <issue>9</issue>
          , pp.
          <fpage>5576</fpage>
          -
          <lpage>5618</lpage>
          ,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>[23] foster + freeman, http://www.fosterfreeman.com/index.php.</mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>[24] ChemImage, http://www.chemimage.com/.</mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          [25]
          <string-name>
            <given-names>D. L.</given-names>
            <surname>Hammond</surname>
          </string-name>
          , “
          <article-title>Validation of lab color mode as a nondestructive method to differentiate black ballpoint pen inks</article-title>
          ,
          <source>” Journal of Forensic Sciences</source>
          , vol.
          <volume>52</volume>
          , no.
          <issue>4</issue>
          , pp.
          <fpage>967</fpage>
          -
          <lpage>973</lpage>
          ,
          <year>2007</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          [26]
          <string-name>
            <given-names>Z.</given-names>
            <surname>Khan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Shafait</surname>
          </string-name>
          ,
          and
          <string-name>
            <given-names>A.</given-names>
            <surname>Mian</surname>
          </string-name>
          , “
          <article-title>Hyperspectral imaging for ink mismatch detection,”</article-title>
          <source>in Proc. International Conference on Document Analysis and Recognition (ICDAR)</source>
          ,
          <year>2013</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          [27]
          <string-name>
            <given-names>F.</given-names>
            <surname>Shafait</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Keysers</surname>
          </string-name>
          , and
          <string-name>
            <given-names>T. M.</given-names>
            <surname>Breuel</surname>
          </string-name>
          , “
          <article-title>Efficient implementation of local adaptive thresholding techniques using integral images,”</article-title>
          <source>in Proc. Document Recognition and Retrieval XV</source>
          , pp.
          <fpage>681510</fpage>
          -
          <lpage>681510-6</lpage>
          ,
          <year>2008</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          [28]
          <string-name>
            <given-names>K.</given-names>
            <surname>Franke</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Bunnemeyer</surname>
          </string-name>
          , and
          <string-name>
            <given-names>T.</given-names>
            <surname>Sy</surname>
          </string-name>
          , “
          <article-title>Ink texture analysis for writer identification</article-title>
          ,”
          <source>in Proc. IEEE Workshop on Frontiers in Handwriting Recognition</source>
          ,
          <year>2002</year>
          , pp.
          <fpage>268</fpage>
          -
          <lpage>273</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          [29]
          <string-name>
            <given-names>K.</given-names>
            <surname>Franke</surname>
          </string-name>
          and
          <string-name>
            <given-names>S.</given-names>
            <surname>Rose</surname>
          </string-name>
          , “
          <article-title>Ink-deposition model: The relation of writing and ink deposition processes,”</article-title>
          <source>in Proc. IEEE Workshop on Frontiers in Handwriting Recognition</source>
          ,
          <year>2004</year>
          , pp.
          <fpage>173</fpage>
          -
          <lpage>178</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>