<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <article-meta>
      <title-group>
        <article-title>Comparative Analysis of 2D and 3D Visual Inspection Techniques for Precision Quality Assessment in Coffee Capsules</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Pau Garrigues Carbó</string-name>
          <email>pgarrigues@iti.es</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Mauro Fabrizioli</string-name>
          <email>m.fabrizioli@videosystems.it</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Javier Pérez Soler</string-name>
          <email>javierperez@iti.es</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Miguel Sanchis Hernández</string-name>
          <email>msanchis@iti.es</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Alessandro Liani</string-name>
          <email>a.liani@videosystems.it</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Juan-Carlos Perez-Cortes</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Instituto Tecnológico de Informática (ITI)</institution>
          ,
          <addr-line>46022 Valencia</addr-line>
          ,
          <country country="ES">Spain</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Universitat Politècnica de València</institution>
          ,
          <addr-line>46022 Valencia</addr-line>
          ,
          <country country="ES">Spain</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Video Systems Srl</institution>
          ,
          <addr-line>33033 Codroipo</addr-line>
          ,
          <country country="IT">Italy</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>This paper explores the integration of Artificial Intelligence (AI) in Non-Destructive Inspection (NDI) systems, focusing on the context of the Horizon Europe project Zero Defects Zero Waste (ZDZW). The study conducts a thorough comparison between 2D and 3D computer vision techniques to evaluate their effectiveness in enhancing the quality analysis of coffee capsules. Through practical insights gained from the ZDZW project, this work aims to illuminate the strengths and trade-offs associated with each technique, providing valuable guidance for the design and implementation of AI-based solutions in NDI systems.</p>
      </abstract>
      <kwd-group>
        <kwd>Quality assessment</kwd>
        <kwd>Non-Destructive Inspection Technology</kwd>
        <kwd>AI quality inspection</kwd>
        <kwd>Computer Vision</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>In the context of Non-Destructive Inspection (NDI) systems, the integration of Artificial
Intelligence (AI) has emerged as a transformative force, offering unprecedented capabilities for
quality analysis [1], [2], [3]. This paper delves into the intricacies of AI application within visual NDI,
with a specific focus on the context of the Horizon Europe project Zero Defects Zero Waste (ZDZW).
In the pursuit of perfection, ZDZW will develop digital non-destructive inspection services (NDIS) as
a set of strategic technologies to improve production efficiency and support zero-defect, sustainable
manufacturing in European industries.</p>
      <p>Within the framework of the ZDZW project, this study undertakes a meticulous examination of
the integration of 2D and 3D computer vision techniques in the inspection of coffee capsules during
the manufacturing process. The central objective is to conduct a comprehensive comparative analysis of
these techniques for enhancing precision quality assessment. Drawing on practical insights garnered
from the ZDZW initiative, this work aims to illuminate the strengths and trade-offs inherent in each
technique.</p>
      <p>The significance of this research lies not only in its contribution to the specific domain of coffee
capsule quality assessment but also in its broader implications for the design and implementation of
AI-based solutions in NDI systems. By providing a nuanced understanding of the comparative merits
of 2D and 3D visual inspection techniques, this work seeks to offer valuable guidance for researchers,
engineers, and practitioners involved in the development of AI-driven solutions for Non-Destructive
Inspection. As industries strive towards the ambitious goal of zero defects and zero waste [4], the
insights derived from this comparative analysis are poised to play a pivotal role in shaping the future
landscape of quality assurance through AI integration in NDI systems.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Non-Destructive Visual Inspection</title>
      <p>The Non-Destructive Visual Inspection landscape encompasses a diverse array of methodologies
and techniques, ranging from conventional human-based visual inspections to cutting-edge
autonomous visual inspection processes featuring assisted AI algorithms designed to augment
detection capabilities. It is important to highlight that these methodologies may extend beyond the
visible spectrum, yet they are still categorized under the umbrella of visual testing techniques.
The discernment of defects by the human eye, with an average size detection threshold of 100μm,
introduces a critical dimension to this inspection domain. However, this ability is contingent upon
factors such as measurement subjectivity, interpretation, ambient conditions, and visual fatigue,
among other external influences [5].</p>
      <p>The addition of automated visual inspection equipment, featuring optimal lighting conditions,
magnification optics, higher resolution cameras, and specialized software tools for dimensional
measuring and defect detection, presents a notable enhancement in visual accuracy, detectability,
reliability, and inspection process speed when compared to traditional human visual inspection
methods.</p>
      <p>Over the past decade, advancements in computer vision solutions coupled with Convolutional
Neural Networks (CNN) and other artificial intelligence (AI) algorithms have significantly elevated
the interpretation of images [6]. This progress enables the extraction of regions of interest and the
detection of previously unnoticed features across a spectrum of processes, ranging from dimensional
control to defect detection. The early identification of defects during the initial stages of the
production process not only improves overall quality but also results in substantial savings in both
raw materials and energy consumption.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Multi-camera 3D inspection</title>
      <p>A 3D multi-camera solution tailored for visual quality inspection [7] operates by concurrently
capturing images from diverse perspectives, enabling the creation of a detailed 3D reconstruction of
the target object. The main advantage of this approach is that it can inspect the object from multiple
views, analysing its 3D structure and surface texture simultaneously, and it can even provide full 360º
inspection if the object is captured in free fall. Once reconstructed, the object undergoes classification
against a selection of possible known references [8], facilitating alignment with the reference model [9].
This alignment serves as the foundation for various analyses.</p>
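      <p>The alignment step against the reference model relies on iterative point matching (ICP) [9]. The following is an illustrative sketch only, not the system's actual implementation: a minimal ICP loop over small point clouds using brute-force nearest-neighbour matching and the SVD-based Kabsch solution for the rigid transform.</p>

```python
import numpy as np

def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def icp_align(src, ref, iters=10):
    """Iteratively match each point to its nearest reference point and re-fit."""
    cur = src.copy()
    for _ in range(iters):
        d = np.linalg.norm(cur[:, None, :] - ref[None, :, :], axis=2)
        matched = ref[d.argmin(axis=1)]   # brute-force nearest neighbours
        R, t = best_fit_transform(cur, matched)
        cur = cur @ R.T + t
    return cur
```

      <p>A production system would use an accelerated nearest-neighbour index (e.g. a k-d tree) rather than the quadratic distance matrix shown here.</p>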
      <p>The volumetric analysis compares the reconstructed object with the reference model, obtaining
the volume difference between them; a global threshold on this difference serves as a criterion for the
object's overall conformity to the reference model in terms of volume. Following this, a fine-grained
analysis of volumetric characteristics within previously defined regions of interest takes place.
Localized quality thresholds are then applied to these pre-defined regions, facilitating an in-depth
evaluation of the object's volumetric fidelity in diverse areas. The integration of both global and local
thresholds enhances the effectiveness of visual quality inspection.</p>
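      <p>To make the combination of global and local thresholds concrete, the following sketch (an assumed organisation of such a check, not the project's actual code) applies both criteria to boolean voxel occupancy grids of the reconstructed object and the reference model:</p>

```python
import numpy as np

def volumetric_check(recon, ref, voxel_volume, global_tol, regions, local_tol):
    """recon, ref: boolean occupancy grids of the same shape.
    regions: dict mapping a region name to a boolean mask over the grid.
    local_tol: dict mapping the same region names to volume tolerances."""
    diff = recon ^ ref  # voxels occupied in exactly one of the two grids
    report = {
        "global_ok": bool(diff.sum() * voxel_volume <= global_tol),
        "regions": {},
    }
    for name, mask in regions.items():
        local_dev = (diff & mask).sum() * voxel_volume
        report["regions"][name] = bool(local_dev <= local_tol[name])
    return report
```

      <p>A part can thus pass the global volume criterion while still being rejected because the deviation concentrates in a critical region of interest.</p>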
      <p>A surface analysis [10] is facilitated through the training of a specialized model that learns image
descriptors for each point within the 3D model, via an automated training process using a reduced
set of pre-validated samples [11]. For each 3D point, the cameras with the best viewing angle are
selected and small image crops of a predefined size are assigned, establishing a spatially referenced
texture model that comprehensively captures the details of the surface at that point. The
subsequent comparison between the textures acquired during the capture process and those learned
by the model yields a metric quantifying the distance between the two sets of surface representations.
This metric serves as a discerning tool for evaluating surface variations, providing accurate
information about the fidelity of the captured surface texture with respect to the learned model.</p>
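      <p>The per-point texture comparison can be reduced to a nearest-descriptor distance. As a hedged illustration (descriptor extraction itself is omitted and the array layout is an assumption), the anomaly score for each 3D point is the distance from its captured descriptor to the closest descriptor learned for that point during training:</p>

```python
import numpy as np

def surface_anomaly_scores(captured, learned):
    """captured: (P, D) array, one descriptor per 3D model point.
    learned:  (P, K, D) array, K reference descriptors per point.
    Returns, per point, the distance to its closest learned descriptor;
    large values flag surface regions deviating from the trained model."""
    d = np.linalg.norm(captured[:, None, :] - learned, axis=2)  # (P, K)
    return d.min(axis=1)
```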
      <p>In addition, Geometric Dimensioning and Tolerancing (GD&amp;T) measurements [12],
derived from the reference model, are systematically applied to the points obtained through the 3D
reconstruction process [13]. This method, widely adopted in contemporary manufacturing, offers a
robust and easy-to-interpret measure of quality and ensures compliance for later phases of the
manufacturing process.</p>
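      <p>A typical GD&amp;T check on reconstructed points is a diameter tolerance on a circular feature. As a simplified sketch (the Kasa algebraic circle fit used here is one common choice; the system's actual GD&amp;T tooling is not specified in this paper), the feature's points can be fitted and compared against the nominal dimension:</p>

```python
import numpy as np

def fit_circle(points):
    """Algebraic least-squares (Kasa) circle fit for an (N, 2) point array."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
    return (cx, cy), np.sqrt(c + cx**2 + cy**2)

def diameter_in_tolerance(points, nominal_d, tol):
    """True if the fitted diameter lies within +/- tol of the nominal value."""
    _, r = fit_circle(points)
    return abs(2 * r - nominal_d) <= tol
```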
      <p>Overall, the multi-camera solution presented here provides an engineering-oriented approach
to quality inspection by integrating volumetric scrutiny, surface texture analysis, and adherence to
GD&amp;T specifications for geometrical measures. This system proves to be a valuable asset in advancing
quality control processes within manufacturing industries. Through a systematic comparison of the
reconstructed objects with reference models, manufacturers obtain real-time insights into potential
irregularities or variations in the manufacturing process.</p>
      <p>The integration of these analyses not only enhances the overall quality control framework but
also contributes significantly to the optimization of manufacturing efficiency and resource utilization
within the plastic coffee capsule production pipeline, preventing defective units from being processed
further and thus reducing scrap and waste material.</p>
    </sec>
    <sec id="sec-4">
      <title>4. 2D real-time inspection based on machine vision edge device</title>
      <p>In many industrial manufacturing applications there is often a demand for vision controls that
inspect 100% of production at rates of 10 parts/s or more. Such controls may also be required to be
installable and to operate within compact, multi-step production lines, in order to perform
intermediate checks between different stages of an existing process or assembly.</p>
      <p>The high-speed acquisition and processing vision edge unit developed by Video Systems, integrated
with space-saving 2D cameras, can cope with both requirements. Regarding inspection rates in
particular, the low-power edge architecture is designed to allow the execution [14] of complex
machine vision algorithms with a processing capacity of up to 20-25 parts/s. The processing is currently
based on a high-performance DSP system with SIMD parallel computation capability, suitable for
computation on matrix data such as images.</p>
      <p>On the software side, algorithms are designed to extract features of interest from the images
(contours, blobs, …) using advanced machine vision techniques; these features are then used to
assess quality conformity, typically based on free-of-defect conditions or within-tolerance
dimensional controls. AI models can also be trained using the web-based suite of AI tools available in
the application framework, including state-of-the-art (SOTA) techniques for supervised object detection
and classification; trained models are finally deployed on the edge unit while guaranteeing the high
processing capacity.</p>
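      <p>A minimal version of such a free-of-defect check, sketched here without any specific machine vision library (a real edge deployment would use optimized primitives), thresholds the image and rejects the part if any connected dark blob exceeds a maximum defect area:</p>

```python
import numpy as np
from collections import deque

def blob_areas(binary):
    """Pixel areas of 4-connected foreground blobs in a boolean image."""
    seen = np.zeros_like(binary, dtype=bool)
    areas = []
    h, w = binary.shape
    for i in range(h):
        for j in range(w):
            if binary[i, j] and not seen[i, j]:
                queue, area = deque([(i, j)]), 0
                seen[i, j] = True
                while queue:  # breadth-first flood fill of one blob
                    y, x = queue.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                areas.append(area)
    return areas

def free_of_defects(image, thresh, max_defect_area):
    """Reject the part if any dark blob exceeds max_defect_area pixels."""
    return all(a <= max_defect_area for a in blob_areas(image < thresh))
```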
      <p>Returning to the inspection of coffee capsules during the manufacturing process, we report the
case of a specific patented capsule body design, where a real-time visual control is installed within
the assembly line used for final coffee capsule production, to monitor the punching of the silicone
disc at the center of the capsule bottom by means of a steel needle. The position of the punched hole
is required to lie within a defined distance of the center of the silicone septum, to ensure correct
coffee flow from the hole during the dispensing phase.</p>
      <p>The acquisition and processing speed copes with the line's maximum production rate of 800
parts/minute, and the final result of the captured-image analysis is the correct identification
of both the septum circumference and the punched hole, so as to estimate the hole's distance from the
septum center based on a previous dimensional calibration on reference samples.</p>
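      <p>Once both centers are located, the conformity decision reduces to a calibrated distance check. The values below (millimetres-per-pixel scale and tolerance) are illustrative assumptions, not the line's actual calibration:</p>

```python
import math

MM_PER_PX = 0.05      # assumed scale from dimensional calibration on reference samples
MAX_OFFSET_MM = 0.4   # assumed tolerance on the hole-to-septum-center distance

def hole_offset(septum_center_px, hole_center_px):
    """Distance in mm between the punched hole and the septum center."""
    dx = hole_center_px[0] - septum_center_px[0]
    dy = hole_center_px[1] - septum_center_px[1]
    return math.hypot(dx, dy) * MM_PER_PX

def hole_position_ok(septum_center_px, hole_center_px):
    return hole_offset(septum_center_px, hole_center_px) <= MAX_OFFSET_MM
```

      <p>Note that at 800 parts/minute (about 13.3 parts/s) the line rate sits comfortably below the edge unit's stated 20-25 parts/s processing capacity.</p>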
      <p>The absence of the hole (typically due to a broken needle) or an out-of-tolerance measurement of
its position can be used to alert the operator to defective conditions in production. Moreover, a
continuous drift of the hole-center distance over time can trigger preventive actions such as early
needle replacement, improving both the operator's work and the system's productivity.</p>
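      <p>A simple way to turn the stream of offset measurements into a preventive-maintenance signal, sketched here under the assumption of a fixed rolling window and warning level, is to raise an alert once the recent mean offset approaches the hard tolerance:</p>

```python
import numpy as np

def drift_alert(offsets_mm, window=50, warn_mm=0.3):
    """True once the rolling mean of the last `window` hole-center offsets
    exceeds warn_mm, signalling needle wear before the hard limit is hit."""
    if len(offsets_mm) < window:
        return False
    return bool(np.mean(offsets_mm[-window:]) > warn_mm)
```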
    </sec>
    <sec id="sec-5">
      <title>5. Conclusions</title>
      <p>In this study, we conducted a comprehensive analysis of 2D and 3D visual inspection techniques
within the Horizon Europe project Zero Defects Zero Waste (ZDZW), focusing on coffee capsule
quality assessment. Our findings highlight distinct advantages in both approaches.</p>
      <p>The multi-camera 3D inspection system offered a holistic quality assessment by integrating
volumetric scrutiny, surface texture analysis, and adherence to Geometric Dimensioning and
Tolerancing (GD&amp;T) specifications. This system enabled real-time defect detection, empowering
manufacturers to adjust processes promptly and optimize efficiency while minimizing waste.</p>
      <p>Conversely, the 2D real-time inspection based on machine vision edge devices demonstrated rapid
acquisition and processing capabilities, particularly suitable for high-speed inspections in
manufacturing lines.</p>
      <p>As we explore the complexities of 2D and 3D visual inspection techniques for coffee capsule
quality assessment, it becomes clear that the integration of these methods is essential. Combining 2D
and 3D approaches will not only improve the efficiency of quality assessment but also lead to a more
complete understanding of product quality. Looking ahead, prioritizing the development of
interoperable frameworks and standards will drive the advancement of non-destructive inspection
systems, aligning with the overall goal of achieving zero defects and zero waste in manufacturing.</p>
    </sec>
    <sec id="sec-6">
      <title>Acknowledgements</title>
      <p>The ZDZW project has received funding from the European Union’s Horizon Europe program
under grant agreement No 101057404. Views and opinions expressed are, however, those of the
author(s) only and do not necessarily reflect those of the European Union. Neither the European
Union nor the granting authority can be held responsible for them.</p>
      <p>We gratefully acknowledge GLN Plast S.A. and Illycaffè S.p.A. for providing the experimental
objects.</p>
    </sec>
    <sec id="sec-7">
      <title>Declaration on Generative AI</title>
      <p>The author(s) have not employed any Generative AI tools.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          <string-name><given-names>M.</given-names> <surname>Colledani</surname></string-name> et al., “<article-title>Design and management of manufacturing systems for production quality</article-title>,” <source>CIRP Ann Manuf Technol</source>, vol. <volume>63</volume>, no. <issue>2</issue>, <year>2014</year>, doi: 10.1016/j.cirp.2014.05.002.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          <string-name><given-names>H.</given-names> <surname>Ding</surname></string-name>, <string-name><given-names>R. X.</given-names> <surname>Gao</surname></string-name>, <string-name><given-names>A. J.</given-names> <surname>Isaksson</surname></string-name>, <string-name><given-names>R. G.</given-names> <surname>Landers</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Parisini</surname></string-name>, and <string-name><given-names>Y.</given-names> <surname>Yuan</surname></string-name>, “<article-title>State of AI-Based Monitoring in Smart Manufacturing and Introduction to Focused Section</article-title>,” <source>IEEE/ASME Transactions on Mechatronics</source>, vol. <volume>25</volume>, no. <issue>5</issue>, <year>2020</year>, doi: 10.1109/TMECH.2020.3022983.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name><given-names>N.</given-names> <surname>Ida</surname></string-name> and <string-name><given-names>N.</given-names> <surname>Meyendorf</surname></string-name>, <source>Handbook of Advanced Nondestructive Evaluation</source>, vol. <volume>10</volume>. Springer International Publishing, Cham, Switzerland, <year>2019</year>.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <string-name><given-names>D.</given-names> <surname>Powell</surname></string-name>, <string-name><given-names>M. C.</given-names> <surname>Magnanini</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Colledani</surname></string-name>, and <string-name><given-names>O.</given-names> <surname>Myklebust</surname></string-name>, “<article-title>Advancing zero defect manufacturing: A state-of-the-art perspective and future research directions</article-title>,” <source>Computers in Industry</source>, vol. <volume>136</volume>, <year>2022</year>, doi: 10.1016/j.compind.2021.103596.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          <string-name><given-names>B. M.</given-names> <surname>Sharratt</surname></string-name>, “<article-title>Non-Destructive Techniques and Technologies for Qualification of Additive Manufactured Parts and Processes: A Literature Review</article-title>,” <source>Department of National Defence of Canada</source>, vol. <volume>55</volume>, Mar. <year>2015</year>.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          <string-name><given-names>S.</given-names> <surname>Mao</surname></string-name>, <string-name><given-names>B.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Tang</surname></string-name>, and <string-name><given-names>F.</given-names> <surname>Qian</surname></string-name>, “<article-title>Opportunities and Challenges of Artificial Intelligence for Green Manufacturing in the Process Industry</article-title>,” <source>Engineering</source>, vol. <volume>5</volume>, no. <issue>6</issue>, <year>2019</year>, doi: 10.1016/j.eng.2019.08.013.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          <string-name><given-names>J. C.</given-names> <surname>Perez-Cortes</surname></string-name>, <string-name><given-names>A. J.</given-names> <surname>Perez</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Saez-Barona</surname></string-name>, <string-name><given-names>J. L.</given-names> <surname>Guardiola</surname></string-name>, and <string-name><given-names>I.</given-names> <surname>Salvador</surname></string-name>, “<article-title>A system for in-line 3D inspection without hidden surfaces</article-title>,” <source>Sensors (Switzerland)</source>, vol. <volume>18</volume>, no. <issue>9</issue>, Sep. <year>2018</year>, doi: 10.3390/S18092993.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          <string-name><given-names>J.</given-names> <surname>Arlandis</surname></string-name>, <string-name><given-names>J. C.</given-names> <surname>Perez-Cortes</surname></string-name>, and <string-name><given-names>J.</given-names> <surname>Cano</surname></string-name>, “<article-title>Rejection strategies and confidence measures for a k-NN classifier in an OCR task</article-title>,” <source>Proceedings - International Conference on Pattern Recognition</source>, vol. <volume>16</volume>, no. <issue>1</issue>, pp. <fpage>576</fpage>-<lpage>579</lpage>, <year>2002</year>, doi: 10.1109/ICPR.2002.1044806.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <string-name><given-names>Z.</given-names> <surname>Zhang</surname></string-name>, “<article-title>Iterative point matching for registration of free-form curves and surfaces</article-title>,” <source>Int J Comput Vis</source>, vol. <volume>13</volume>, no. <issue>2</issue>, pp. <fpage>119</fpage>-<lpage>152</lpage>, Oct. <year>1994</year>, doi: 10.1007/BF01427149.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>