<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Identification and Classification of Color Textures</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Mattias Mende</string-name>
          <email>mattias.mende@iwu.fraunhofer.de</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Thomas Wiener</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff1">
          <label>1</label>
          <institution>Fraunhofer Institute for Machine Tools and Forming Technology IWU</institution>
          ,
          <addr-line>Chemnitz</addr-line>
          ,
          <country country="DE">Germany</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>This article describes how color textures can be reliably detected and classified in the production process, independently of external parameters such as brightness, object position (translation), angular position (rotation), object distance (scaling) or curved surfaces (rotation + scaling). The methods described here are also suitable for reliably classifying at least 18 color textures even if they differ only slightly from each other optically. The online classification of color textures is a classic task in the wood, furniture and textile industry. For example, unwanted defects or partial soiling on moving webs can be reliably detected regardless of fluctuations in brightness and/or shadows during process operation. Algorithms have been developed for teach-in: RGB-HSI transformation, setting a few segments of e.g. 24x24 pixels on the color textures of each class, applying suitable transformations {HSI}, e.g. a 2D-FFT forming characteristic 2D spectral mountains in these segments, extracting statistical features, and setting up the individual classifiers. Algorithms have likewise been developed for identification and classification in process operation, with extraction of statistical characteristics and methods of robust classification. The implementation of the methods, the triggering of the color cameras, and the processing of the color information, including the output of the results to the process control, are done with the data analysis program Xeidana®.</p>
      </abstract>
      <kwd-group>
        <kwd>Optical Image Processing</kwd>
        <kwd>Identification</kwd>
        <kwd>Classification</kwd>
        <kwd>Color Textures</kwd>
        <kwd>Process Control</kwd>
        <kwd>Color Sphere Model</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>The applications of optical image processing cover almost all areas of daily life as well as production, manufacturing and research. They can be assigned to five essential fields, see Fig.1.1.</p>
      <p>Copyright © 2020 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).</p>
      <p>
        Important fields of application are object recognition and classification, monitoring and control of visible areas, inspection of object surfaces, and recording the (3D) positions of objects or measuring their 3D geometries. In this paper, the automatic recognition and classification of objects using existing textures is presented. Textures are characteristic regular or disordered patterns found on the surfaces of objects, see Fig.1.2. During the actual classification process, all objects of one type are grouped together in a uniform class; new classes are created for objects of a new type. An important task in classification is to separate textures of different classes from each other (selectivity) while at the same time tolerating textures with small deviations within a class (immunity to interference) [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
      </p>
      <p>In particular, the correct result of the classification should not be affected even if typical deviations occur in the production process, e.g.:</p>
      <list list-type="bullet">
        <list-item><p>The objects change their positions in the camera image (translation (T) invariance)</p></list-item>
        <list-item><p>The color textures of individual objects are rotated (rotation (R) invariance)</p></list-item>
        <list-item><p>The sizes of the textures change (scaling (S) invariance)</p></list-item>
        <list-item><p>The ambient brightness changes, and thereby the integral brightness of the image changes</p></list-item>
        <list-item><p>Partial shadows appear on the textures</p></list-item>
        <list-item><p>There is partial soiling on the color textures</p></list-item>
        <list-item><p>The textures lie on spherically curved surfaces and are therefore distorted in the camera image (RST invariance)</p></list-item>
      </list>
    </sec>
    <sec id="sec-2">
      <title>Methods and procedures</title>
      <p>Mostly, color cameras are used for optical image processing. The classification method described here is therefore based on color textures, but it is also well suited for b/w textures. The actual classification process is divided into two steps: (1) the prior teach-in of suitable features of the textures and (2) their recognition and online classification in the process, see Fig.3.1.</p>
      <p>Both for teaching the textures and for their classification in process operation, the textures are divided into individual segments (segmentation). This corresponds to the model found in nature, where the number of image channels is successively reduced from the retina to the downstream bipolar cells in the visual process.</p>
      <p>Segmentation (tiling): During the teach-in process, one or more representative segments (tiles) with e.g. 16x16 or 32x32 pixels edge length are taken from the textures of each relevant object class, see Fig.3.2.</p>
      <p>
        A prior HSI transformation of each RGB color pixel proves advantageous. It forms the basis for obtaining invariant features and, by simple means, ensures additional invariance against changes in brightness, partial shading etc., Fig.3.3.-6. The HSI model used for this purpose is based on the HSI color sphere model according to [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. Compared to other color models, this model offers a spatially vivid representation of hue (H), color saturation (S) and brightness (I) and at the same time a largely consistent and unambiguous mapping of the entire set of points of the RGB color space to the points of the HSI color space.
      </p>
      <p>
        Characteristic features are extracted from the tiles of each class for the classification. The HSI values of the individual segments are first subjected to a suitable transformation {HSI}. From this, m characteristics are then generated within each class. Depending on the number n of segments used, these characteristics are summarized in m-dimensional clouds with n points each. From the point clouds, those characteristics can be extracted for each class which have the best invariance properties against external parameters such as changes in brightness, scaling, rotation or distortion [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], see Table 3.1.
      </p>
      <sec id="sec-2-1">
        <title>Transformations {HSI}</title>
        <p>Legend to Table 3.1.: O invariant, - non-invariant, * after transformation from Cartesian coordinates to polar coordinates.</p>
        <p>The transformations {HSI} listed below offer, among other things, the possibility of extracting suitable and partially invariant characteristics after appropriate preparation of the segments:</p>
        <list list-type="bullet">
          <list-item><p>Histogram analysis</p></list-item>
          <list-item><p>Fast Fourier Transformation (FFT)</p></list-item>
          <list-item><p>Hough Transformation</p></list-item>
          <list-item><p>Gabor Transformation</p></list-item>
          <list-item><p>Local Binary Pattern (LBP) approach</p></list-item>
        </list>
      </sec>
      <sec id="sec-2-2">
        <title>Feature extraction and robust classification</title>
        <p>
          With methods such as the Local Binary Pattern (LBP) approach [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ], characteristics with invariant properties can be extracted without a prior transformation {HSI}.
        </p>
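<p>The basic 8-neighbor LBP of [5] can be sketched as follows; the rotation-invariant and uniform extensions described in [5] are omitted for brevity:</p>

```python
# Sketch of the basic 8-neighbor Local Binary Pattern: each pixel is replaced
# by a byte encoding which neighbors are at least as bright as it, and the
# normalized histogram of these codes serves as the texture feature vector.
import numpy as np

def lbp(gray):
    """Basic 3x3 LBP codes for a 2-D intensity array (borders excluded)."""
    c = gray[1:-1, 1:-1]                       # center pixels
    # 8 neighbors in clockwise order, each a shifted view of the image
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for k, (dy, dx) in enumerate(offsets):
        nb = gray[1 + dy:gray.shape[0] - 1 + dy,
                  1 + dx:gray.shape[1] - 1 + dx]
        code += (nb >= c).astype(np.uint8) * (2 ** k)   # set bit k
    return code

def lbp_histogram(gray):
    """Normalized 256-bin LBP histogram of a grayscale segment."""
    h = np.bincount(lbp(gray).ravel(), minlength=256)
    return h / h.sum()
```

Because the codes depend only on local intensity ordering, the histogram is unaffected by monotonic brightness changes, which is what makes the features interesting here.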
        <p>In order to achieve better immunity to interference (shadows, contamination) and at the same time high separation performance for similar textures belonging to other classes, methods of robust classification are used. Modern classification methods use numerical compensation methods, e.g. the method of least squares, as well as methods of fuzzy classification based on probabilistic models, see Fig.3.7. For their description, for example, the minimum sympathy difference (shown in green) is suitable, which is a measure of the minimum distance between two texture classes in the n-dimensional feature space. The sympathy difference then describes the affiliation to a texture class.</p>
        <p>
          In [
          <xref ref-type="bibr" rid="ref6">6</xref>
          ], the number of texture classes corresponds to the number of m-dimensional point clouds. Their positions, shapes and characteristics are important parameters for each class. As the result of the classification, the class with the highest sympathy is chosen, i.e. the one where the sum of the distance values of the feature vectors is minimal. However, if this sympathy is smaller than the minimum sympathy SMS, or if the difference to the next smaller sympathy is smaller than the minimum sympathy difference SMD, the point cloud is rejected as unclassifiable. In the following Table 3.2., different methods are listed which are in principle suitable for the robust classification of color textures.
        </p>
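<p>The decision rule with the minimum sympathy SMS and the minimum sympathy difference SMD can be sketched as follows; sympathies are assumed to be membership scores, one per taught class, and the threshold values are illustrative:</p>

```python
# Sketch of the rejection rule described above: the winning class needs both
# a minimum sympathy (s_ms) and a minimum lead (s_md) over the runner-up,
# otherwise the segment is rejected as unclassifiable. At least two classes
# are assumed to have been taught.
def classify(sympathies, s_ms=0.5, s_md=0.1):
    """Return the winning class index, or None if the result is rejected."""
    ranked = sorted(range(len(sympathies)), key=lambda k: -sympathies[k])
    best, second = ranked[0], ranked[1]
    if s_ms > sympathies[best]:                       # not sympathetic enough
        return None
    if s_md > sympathies[best] - sympathies[second]:  # winner not clear enough
        return None
    return best
```

Rejecting borderline segments rather than forcing a class is what gives the classification its robustness against shadows and soiling.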
        <p>
          The methods listed in Table 3.2. were implemented in the existing universal data
analysis program Xeidana® [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ].
        </p>
        <p>
          Procedures of Table 3.2.: Support Vector Machine [
          <xref ref-type="bibr" rid="ref7 ref8">7, 8</xref>
          ], Multilayer Perceptron [
          <xref ref-type="bibr" rid="ref9">9</xref>
          ], Multiple linear discriminant analysis [
          <xref ref-type="bibr" rid="ref9">9</xref>
          ], Radial basis function network [
          <xref ref-type="bibr" rid="ref9">9</xref>
          ].
        </p>
        <p>The program Xeidana® (eXtensible Environment for Industrial Data ANAlysis) is used for the feature extraction and the creation of the classifier. Xeidana® is written in C#.</p>
        <p>It comprises an extensible development environment for solving data analysis tasks in the industrial sector, Fig. 3.8.</p>
        <p>Xeidana® is also used for the classification and visualization of large amounts of
data. With the help of the extensive repertoire of algorithms and procedures, colour
textures, for example, can be classified. The software has a modular structure and can
be extended with new functionalities using a variety of different libraries.</p>
        <p>Fig.3.9. shows the user interface with which colour textures are both taught and
classified. The left window shows 18 different colour textures which are to be taught.
The right part of the user interface contains the functionality for selecting the RGB and
HSI color channels, setting segments of variable size, e.g. with 16x16 or 32x32 pixels,
and classifying the textures.</p>
        <p>The segments are set with the mouse on the texture, see the red tile on the upper texture. The extraction of the characteristics can be done with Histogram Analysis, Fast Fourier Transformation (FFT) or Gabor Transformation.</p>
        <p>In Fig.3.10., n-dimensional feature vectors of the individual classes are shown. By pairwise combination of two characteristics in each case, suitable characteristics are compared in the plan view (left picture) and selected by assessing their separation performance (features of the LBP procedure).</p>
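<p>The separation performance used for selecting feature pairs is not named in the paper; the Fisher criterion is one common measure and serves here as a hypothetical sketch:</p>

```python
# Illustrative feature-pair selection: rank individual features of two
# classes by the 1-D Fisher criterion (between-class over within-class
# scatter) and keep the two best. This is an assumed measure, not the
# paper's own.
import numpy as np

def fisher_score(a, b):
    """1-D Fisher criterion for one feature over two classes a and b."""
    return (a.mean() - b.mean()) ** 2 / (a.var() + b.var() + 1e-12)

def best_feature_pair(A, B):
    """Pick the two features with the highest Fisher scores.
    A, B: (n_segments, m_features) arrays for two texture classes."""
    scores = [fisher_score(A[:, j], B[:, j]) for j in range(A.shape[1])]
    order = sorted(range(len(scores)), key=lambda j: -scores[j])
    return order[0], order[1]
```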
        <p>Fig.3.11. shows the module for Robust Classification used for setting the
parameters of the classifier and for automatic parameter optimization using the
Support Vector Machine as an example.</p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>Results &amp; industrial application examples</title>
      <p>In order to create environmental conditions for the classification of color textures
similar to those in process operation, a special test stand with a conveyor belt (width
1m, length 4m) was set up at Fraunhofer IWU. The asynchronous triggering of the
individual cameras, the processing of the image information and the transfer of the
classification results are based on the data analysis program Xeidana®, see Fig.4.1.</p>
      <p>To demonstrate the performance of the Xeidana® program, a demonstrator was created, see Fig. 4.2., which consists of a camera and a turntable with 10 different types of wood. The grains and colors of the 10 types of wood are different, but in some cases they differ only slightly from each other. The aim was to achieve a correct classification over their entire surface despite the spherically curved surfaces and the resulting variations of the color textures in distance and angular position. The result of the classification is shown in Fig.4.3.</p>
      <p>With the methods for classification described here, color textures can be reliably detected and classified in the production process independently of external parameters such as brightness, object positions (translation and rotation), object distances (scaling) or curved surfaces (rotation + scaling).</p>
      <p>The methods described here are also suitable for reliably classifying at least 18
colour textures even if they optically differ only slightly from each other.</p>
      <p>The online classification of color textures is a classic task in the wood, furniture
and textile industry. For example, unwanted defects or partial soiling on moving webs
can be reliably detected regardless of fluctuations in brightness and/or shadows during
process operation.</p>
      <p>The implementation of the methods, the triggering of the color cameras, and the processing of the color information, including the output of the results to the process control, are done with the data analysis program Xeidana®. The following algorithm has been developed for teaching and classifying HSI color textures:</p>
      <sec id="sec-3-1">
        <title>a) Teach-in phase of the objects</title>
        <list list-type="order">
          <list-item><p>RGB-HSI transformation of the color textures</p></list-item>
          <list-item><p>Setting a few segments of e.g. 24x24 pixels on the color textures of each class</p></list-item>
          <list-item><p>Applying suitable transformations {HSI}, e.g. a 2D-FFT forming characteristic 2D spectral mountains in these segments</p></list-item>
          <list-item><p>Extraction of statistical features from the 2D spectral mountains</p></list-item>
          <list-item><p>Setting up the individual classifiers</p></list-item>
        </list>
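<p>Steps 3 and 4 of the teach-in phase can be sketched as follows, assuming the HSI channels of a segment are already available as an array. The 2D-FFT magnitude forms the spectral mountain; mean, standard deviation and peak are illustrative stand-ins for the unspecified statistical features:</p>

```python
# Sketch of teach-in steps 3 and 4: per HSI channel, a 2D-FFT yields the
# "spectral mountain", from which simple statistics are taken as features.
# Zeroing the DC term makes the features independent of the mean level.
import numpy as np

def segment_features(hsi_segment):
    """Feature vector for one (s, s, 3) HSI segment via 2D-FFT magnitudes."""
    feats = []
    for ch in range(3):                      # H, S and I channel
        spec = np.abs(np.fft.fft2(hsi_segment[..., ch]))
        spec[0, 0] = 0.0                     # drop DC for brightness invariance
        mag = np.fft.fftshift(spec)          # center the spectral mountain
        feats += [mag.mean(), mag.std(), mag.max()]
    return np.array(feats)
```

Collecting these vectors for n segments per class yields the m-dimensional point clouds from which the classifiers are set up (step 5).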
      </sec>
      <sec id="sec-3-2">
        <title>b) Identification &amp; classification in process operation</title>
        <list list-type="order">
          <list-item><p>RGB-HSI transformation of all pixel values of the image</p></list-item>
          <list-item><p>Segmentation of previously defined areas of the image (ROI)</p></list-item>
          <list-item><p>Suitable transformations {HSI} of all segments</p></list-item>
          <list-item><p>Extraction of statistical characteristics</p></list-item>
          <list-item><p>Robust classification</p></list-item>
        </list>
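<p>Step 5 can be realized, for example, with a Support Vector Machine as listed in Table 3.2. The following minimal sketch uses scikit-learn (an assumption; the paper implements its classifiers in Xeidana®) and synthetic 9-dimensional feature clouds standing in for the statistical characteristics of two texture classes:</p>

```python
# Illustrative robust-classification step with an SVM trained on synthetic
# per-segment feature vectors of two texture classes. The feature clouds
# and scikit-learn are assumptions; they stand in for Xeidana® internals.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# toy feature clouds: 40 segments per class in a 9-dimensional feature space
class_a = rng.normal(0.0, 0.2, size=(40, 9))
class_b = rng.normal(1.0, 0.2, size=(40, 9))
X = np.vstack([class_a, class_b])
y = np.array([0] * 40 + [1] * 40)

clf = SVC(kernel="rbf", C=1.0).fit(X, y)          # train the classifier
new_segment = rng.normal(1.0, 0.2, size=(1, 9))   # unseen class-1-like segment
predicted = clf.predict(new_segment)
```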
      </sec>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          1.
          <string-name>
            <surname>Beyerer</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Heintz</surname>
          </string-name>
          , R.:
          <article-title>Licht und Farbe</article-title>
          . In:
          <article-title>Fraunhofer Allianz Vision Seminar Inspektion und Charakterisierung von Oberflächen mit Bildverarbeitung</article-title>
          ,
          <source>Erlangen (Dec</source>
          .
          <year>2007</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          2.
          <string-name>
            <surname>Matile</surname>
          </string-name>
          , H.:
          <article-title>Die Farbenlehre Phillip Otto Runges, 2nd edn</article-title>
          .
          <source>München</source>
          (
          <year>1979</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          3. https://www.iwu.fraunhofer.de, last accessed
          <year>2020</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          4.
          <string-name>
            <surname>Mende</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wiener</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          :
          <article-title>Online-Klassifikation von Farbtexturen</article-title>
          .
          <source>In: Fraunhofer Vision Leitfaden</source>
          <volume>16</volume>
          "
          <article-title>Inspektion und Charakterisierung von Oberflächen mit Bildverarbeitung"</article-title>
          , pp.
          <fpage>66</fpage>
          . (
          <year>2016</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          5.
          <string-name>
            <surname>Mäenpää</surname>
          </string-name>
          , Topi.:
          <article-title>The local binary pattern approach to texture analysis - extensions</article-title>
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          6.
          <article-title>and applications</article-title>
          . In: Infotech Oulu and Department of Electrical and Information Engineering, University of Oulu, P.O.Box 4500, FIN-90014 University of Oulu, Oulu, Finland (
          <year>2003</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          7.
          <string-name>
            <surname>Priber</surname>
            ,
            <given-names>U.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Kretzschmar</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          :
          <article-title>Inspection and Supervision by Means of Hierarchical Fuzzy Classifiers</article-title>
          .
          <source>Fuzzy Sets and Systems</source>
          , Vol.
          <volume>85</volume>
          /1, North-Holland (
          <year>1997</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          8.
          <string-name>
            <surname>Vapnik</surname>
          </string-name>
          ,
          <string-name>
            <surname>Chervonenkis</surname>
          </string-name>
          :
          <article-title>Theory of Pattern Recognition</article-title>
          (
          <year>1974</year>
          ) (germ.: Wapnik und Tschervonenkis,
          <source>Theorie der Mustererkennung</source>
          (
          <year>1979</year>
          )).
        </mixed-citation>
      </ref>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          9.
          <string-name>
            <surname>Schölkopf</surname>
          </string-name>
          , Smola:
          <article-title>Learning with Kernels</article-title>
          . MIT Press (
          <year>2001</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          10.
          <string-name>
            <surname>Duda</surname>
            ,
            <given-names>Richard O.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Hart</surname>
            ,
            <given-names>Peter E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Stork</surname>
            ,
            <given-names>David G.</given-names>
          </string-name>
          ,
          <source>Pattern Classification. 2nd edn. Wiley Interscience ISBN 0471056693</source>
          (
          <year>2000</year>
          ).
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>