<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Automated approaches to determining chicken eggs' parameters based on their digital images</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Denys Baran</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Bohdan Rusyn</string-name>
          <email>b.rusyn.prof@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Karpenko Physico-Mechanical Institute of NAS of Ukraine</institution>
          ,
          <addr-line>79601, Lviv</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Kazimierz Pulaski University of Radom</institution>
          ,
          <addr-line>26-600 Radom</addr-line>
          ,
          <country country="PL">Poland</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Ternopil Ivan Puluj National Technical University</institution>
          ,
          <addr-line>46001, Ternopil</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>The accurate and efficient calculation of chicken egg parameters is crucial for quality control, grading, and overall productivity in the poultry industry. Traditional manual methods are time-consuming, labour-intensive, and prone to subjective errors. This paper presents an approach to automate the process of determining egg parameters using digital images and advanced image recognition algorithms. The proposed methodology addresses the critical need for rapid and objective assessment of egg characteristics, such as size, shape, and surface defects. By leveraging state-of-the-art computer vision techniques, the system achieves high accuracy and robustness in extracting relevant features from egg images. Experimental results demonstrate the effectiveness of the proposed methods in accurately determining egg parameters, offering a significant improvement over manual inspection and contributing to enhanced efficiency and quality assurance in egg production. Experiments conducted on a group of chicken eggs showed that the error in determining geometric dimensions does not exceed 3%, and the error in determining volume does not exceed 6%. The research results demonstrate the potential of the proposed approach for developing effective automated egg quality control systems, which will contribute to the optimization of production processes and the reduction of losses.</p>
      </abstract>
      <kwd-group>
        <kwd>image recognition</kwd>
        <kwd>geometric parameters</kwd>
        <kwd>lighting conditions</kwd>
        <kwd>recognized object</kwd>
        <kwd>automated egg quality control systems</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>Optical-digital control methods and ovoscopy are used in the quality control of chicken eggs. There
are also methods that find defects on the egg surface based on the texture analysis and neural
networks. The informative value of photo images obtained using the above methods depends to a
large extent on the lighting and other conditions under which they were acquired [1].
Chicken eggs are investigated in the egg tester (ovoscope). Egg testers can be of different designs.
The design that allows recording chicken eggs in diffused light and shining them through in
directed light appears most convenient [2,3]. Such a combination of lighting types proves to be the
best for detecting surface defects. At the same time, a comprehensive vision device needs to be
developed, in which software and hardware would be complementary [4]. The software would then
recognise and classify defects in chicken eggs with higher accuracy, and optical control methods
would be improved. When evaluating the quality of chicken eggs by their appearance, image analysis is
used. There are also methods that collect eggs robotically [5]. All these methods trigger the
development of optical-digital systems that assess the size and surface damage of chicken eggs [6].
However, most systems existing to date are highly specialised. When processing images obtained
by such systems, complex metrological parameters and image-acquisition conditions need to be
taken into account. These factors limit the operation of such systems and require information
technology specialists to participate in the image processing.</p>
      <p>An easy-to-use installation for photographing chicken eggs under controlled conditions needs
to be developed. It should include an algorithm that calculates their parameters based on the
resulting images and ensures the accuracy and reproducibility of results. This research aims at
developing an automated method for determining the geometric dimensions and volume of chicken
eggs by processing their images.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Research technique</title>
      <p>To conduct this research, an installation was developed that features a box (Fig. 1, a) consisting
of an upper part 1 and a lower part 2 (Fig. 1, b). During the experiment, chicken eggs were placed on
stand 6 in the middle of the chamber.</p>
      <p>Side walls of the upper and lower parts of the chamber (Fig. 1, b) are painted black from inside.
Opening 3 with a diameter of 90 mm, through which the camera takes photos of the specimens, is
located on the cover of the upper part. The opening is also painted black from inside. On the inside
of the top cover, LED modules are located around the opening; they are covered by a
light-diffusing screen 4 made from off-white acrylic plastic. At the bottom of the lower part of chamber
2, there is the same number of LED modules as on the upper part.</p>
      <p>These modules are also covered with a light-scattering screen 5, above which there is a stand 6
with holes for placing test eggs. The stand is also painted black. The LEDs on the upper part of the
chamber illuminate the eggs for assessing their surface quality (Fig. 2, a).</p>
      <p>LED modules at the bottom of the chamber shine the eggs through to study the quality of their
internal contents (Fig. 2, b). Figure 3 shows the structure of the chamber. The luminous flux
emitted by the LEDs located both on the upper and lower parts of the chamber can be adjusted by
changing the voltage from two separate power sources: Constant voltage source 1 and Constant
voltage source 2 (Fig. 4), the output voltage of which can vary in the range of 5-12 V. Owing to this,
the surface illumination of the specimens studied in the chamber can also vary in the range of
100-600 Lx. Illuminance is measured using the BL1 photosensor of the LUXMETER device. In all cases,
chicken eggs were photographed with camera 8, which had the following settings: ISO 400,
aperture f/7.1, shutter speed 1/8 s.</p>
      <p>In the course of research, the following parameters were measured:
- constant voltage of the LEDs located on the upper and lower parts of the chamber,
- illumination of the specimen surface.</p>
      <p>Chicken eggs were photographed at illumination E of 760 Lx. The geometric dimensions of the
studied objects were calculated automatically using the developed algorithm and, for comparison,
measured with a mechanical-digital calliper with an accuracy of 0.01 mm.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Research results and their discussion</title>
      <p>A robust image processing algorithm can identify eggs in varying lighting conditions, orientations,
and background complexities. The algorithm used in this work is based on classical image
processing techniques and consists of several major stages: image filtering, image segmentation
using thresholding, morphological operations (erosion and dilation), object detection, and
calculation of the detected objects' dimensions (height, width, volume). Laboratory-obtained RGB
images of eggs are shown in Fig. 5.</p>
      <p>The initial step in the egg detection process involves image filtering, which aims to reduce noise
and smooth the image while preserving the essential features like edges and contours of the eggs.
As the input image iRGB is in colour, it is first converted to a grayscale image igr. This reduces the
computational complexity by simplifying the image to a single intensity channel. The formula used to
convert an RGB image to grayscale is based on luminance perception, i.e. how the human eye
perceives brightness from red, green, and blue light. The standard weighted average formula is [7]:</p>
      <p>Gray = 0.2989·R + 0.5870·G + 0.1140·B (1)
where R, G, B are the red, green, and blue channel intensities (in the range from 0 to 255).</p>
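As a sketch, the weighted-average conversion above can be applied to a whole image in one NumPy expression (the function name `rgb_to_gray` and the two-pixel test image are illustrative, not from the paper's code):

```python
import numpy as np

def rgb_to_gray(irgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB image (values 0-255) to grayscale using
    the luminance weights from formula (1)."""
    weights = np.array([0.2989, 0.5870, 0.1140])
    # Dot product over the last (channel) axis applies the weighted average
    # to every pixel at once.
    return irgb[..., :3] @ weights

# A hypothetical 1 x 2 RGB image: a pure red pixel and a pure white pixel.
irgb = np.array([[[255, 0, 0], [255, 255, 255]]], dtype=float)
igr = rgb_to_gray(irgb)
print(igr.round(2))  # red maps to ~76.22, white to ~254.97
```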
      <p>
        A Gaussian filter is applied to the grayscale image to remove high-frequency noise. This filter
blurs the image, which helps in reducing small artifacts that might be misclassified as part of an
egg:
G(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²)) (2)
where x, y are the distances from the central pixel; σ is the standard deviation of the Gaussian
distribution, which controls the blur intensity.
      </p>
      <p>The Gaussian filter uses a kernel to average pixel intensities with their neighbors, giving higher
weight to central pixels. Practically, the Gaussian filter was implemented by first computing its
5 × 5 pixel kernel and then convolving it with the initial image.</p>
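A minimal sketch of computing such a 5 × 5 kernel from formula (2) before the convolution step (σ = 1.0 here is an assumed value; the paper does not state which σ was used):

```python
import numpy as np

def gaussian_kernel(size: int = 5, sigma: float = 1.0) -> np.ndarray:
    """Sample G(x, y) = exp(-(x^2 + y^2) / (2 sigma^2)) / (2 pi sigma^2)
    on a size x size grid centred at the origin, then normalise so the
    weights sum to 1 (the usual discrete approximation)."""
    ax = np.arange(size) - size // 2          # offsets from the centre pixel
    xx, yy = np.meshgrid(ax, ax)
    g = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2)) / (2.0 * np.pi * sigma**2)
    return g / g.sum()

k = gaussian_kernel(5, sigma=1.0)
print(k.shape)            # (5, 5)
print(round(k.sum(), 6))  # 1.0 after normalisation
print(k[2, 2] > k[0, 0])  # centre weight dominates the corners: True
```

In practice the kernel is then convolved with the image, e.g. via `scipy.ndimage.convolve` or directly with `skimage.filters.gaussian`.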
      <p>The next step of image processing is contrast enhancement, which aims to improve the visibility of
the eggs' contours and internal structures. It is especially useful when images suffer from low
contrast due to poor lighting, overexposure, or underexposure. Enhancing contrast makes the
subsequent segmentation more accurate and robust. To adjust the image contrast,
Contrast-Limited Adaptive Histogram Equalization (CLAHE) is used. Instead of equalising the global
histogram, it applies the technique locally, in small parts of the image. First, we divided the image
into a grid of small tiles of 12 × 12 pixels, then performed histogram equalization on each tile.
CLAHE enhances local contrast while preserving object details and textural features such as egg contours.</p>
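In scikit-image, which the authors use later for measurement, CLAHE is provided by `exposure.equalize_adapthist`. The sketch below runs it on a synthetic low-contrast image and passes `kernel_size=12` to mirror the 12 × 12 pixel tiles (the image itself is illustrative):

```python
import numpy as np
from skimage import exposure

# Hypothetical low-contrast grayscale image: values squeezed into a
# narrow band around mid-grey.
rng = np.random.default_rng(0)
igr = 0.45 + 0.1 * rng.random((96, 96))

# equalize_adapthist implements CLAHE; kernel_size sets the local tile size,
# clip_limit caps how strongly any one tile's histogram is stretched.
iclahe = exposure.equalize_adapthist(igr, kernel_size=12, clip_limit=0.01)

print(iclahe.shape)  # same shape as the input, values rescaled to [0, 1]
```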
      <p>
After the preparation stage, the image is segmented. Segmentation is an essential step where the goal is
to distinguish the eggs (foreground) from the background. It plays a crucial role in extracting
structures of interest from the background. Laboratory-obtained images of eggs are characterized
by the following feature: the brightness of eggs is significantly higher than that of the dark
background on which they are located (Fig. 5). Therefore, simple thresholding by brightness level
was used for the initial segmentation of eggs:
ibw(x, y) = { 0, if igr(x, y) &lt; T; 255, if igr(x, y) ≥ T } (3)
where x, y are the pixel coordinates; T is the experimentally determined threshold limit,
T ∈ [0, 255].
      </p>
      <p>In the binary image ibw, white regions are candidate egg regions, and black regions represent the
background.</p>
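The thresholding rule of formula (3) amounts to a single vectorised comparison; the sketch below uses a toy patch and an illustrative threshold value:

```python
import numpy as np

def threshold(igr: np.ndarray, T: int) -> np.ndarray:
    """Binarise a grayscale image: 255 where igr is at least T, else 0."""
    return np.where(igr >= T, 255, 0).astype(np.uint8)

# Toy 2 x 3 grayscale patch; T = 100 is an assumed, experimentally chosen value.
igr = np.array([[10, 120, 200],
                [90, 130,  30]])
ibw = threshold(igr, T=100)
print(ibw)  # [[  0 255 255]
            #  [  0 255   0]]
```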
      <p>After segmentation, the binary image may still contain noise, holes, or fragmented egg shapes.
To address this, morphological operations are applied to clean and consolidate the detected regions.
Specifically, a sequence of erosion and dilation transformations is used to clean the image and
consolidate the detected egg regions. Erosion helps to remove small noise and separates closely
positioned objects, while dilation restores the original size and shape of the relevant regions,
effectively reinforcing the continuity of egg contours.</p>
      <p>
        Consider ibw ( x , y ) as the input image, where ( x , y ) are the pixel coordinates and
ibw ( x , y )=1 for foreground pixels and ibw ( x , y )=0 for background pixels. Let s ( u , v ) be the
structuring element, where ( u , v ) are the coordinates within the structuring element, and
s ( u , v )=1 for the elements belonging to the structuring element's shape and s ( u , v )=0
otherwise. Then erosion of the image ibw ( x , y ) by the structuring element s ( u , v ) is defined as
follows [8]:
ie(x, y) = (ibw ⊖ s)(x, y) = min_(u,v)∈S {ibw(x + u, y + v)} (4)
where S is the set of coordinates (u, v) for which s(u, v) = 1.
      </p>
      <p>
The structuring element matrix for erosion is shown in Figure 6a. The output pixel is foreground (1)
only if all the pixels in the input image corresponding to the '1's in the structuring element (when
centred at (x, y)) are also foreground (1). Otherwise, the output pixel is background (0).
      </p>
      <p>
The dilation of the input image ibw(x, y) by the structuring element s(u, v) is defined as [8]:
id(x, y) = (ibw ⊕ s)(x, y) = max_(u,v)∈S {ibw(x − u, y − v)} (5)
      </p>
      <p>
In this case the output pixel (x, y) is foreground (1) if at least one of the pixels in the input
image corresponding to the '1's in the structuring element (when centred at (x, y)) is also
foreground (1). Otherwise, the output pixel is background (0). Dilation uses the same structuring
element as erosion (Fig. 6a).
      </p>
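For illustration, erosion and dilation can be implemented directly from the min/max definitions in formulas (4) and (5). This is a naive zero-padded sketch, not the paper's implementation; production code would use `skimage.morphology.erosion`/`dilation`:

```python
import numpy as np

def erode(ibw: np.ndarray, s: np.ndarray) -> np.ndarray:
    """Binary erosion: minimum of the neighbourhood selected by s."""
    kh, kw = s.shape
    pad = np.pad(ibw, ((kh // 2,), (kw // 2,)), constant_values=0)
    out = np.zeros_like(ibw)
    for x in range(ibw.shape[0]):
        for y in range(ibw.shape[1]):
            window = pad[x:x + kh, y:y + kw]
            out[x, y] = window[s == 1].min()
    return out

def dilate(ibw: np.ndarray, s: np.ndarray) -> np.ndarray:
    """Binary dilation: maximum of the neighbourhood; for the symmetric
    all-ones element the reflection in formula (5) changes nothing."""
    kh, kw = s.shape
    pad = np.pad(ibw, ((kh // 2,), (kw // 2,)), constant_values=0)
    out = np.zeros_like(ibw)
    for x in range(ibw.shape[0]):
        for y in range(ibw.shape[1]):
            window = pad[x:x + kh, y:y + kw]
            out[x, y] = window[s == 1].max()
    return out

s = np.ones((3, 3), dtype=int)   # the 3 x 3 all-ones element (Fig. 6a)
ibw = np.zeros((5, 5), dtype=int)
ibw[1:4, 1:4] = 1                # a 3 x 3 foreground square
print(erode(ibw, s))             # only the centre pixel survives erosion
print(dilate(erode(ibw, s), s))  # dilation restores the 3 x 3 square
```

Applying erosion followed by dilation (an opening) removes isolated noise while restoring the size of the surviving regions, which is exactly the clean-up described above.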
      <p>Figure 5b shows an example of the image obtained in this way. A bounding box and the central
axis are also presented for each recognized egg.</p>
      <p>To process images and calculate the dimensions of the objects found, we used the scikit-image
0.25.2 library for Python 3.12 [9]. For each object recognised, the bounding box, the area and the
angle of the major axis of the equivalent ellipse (that has the same second moments as the region)
were calculated.</p>
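With scikit-image these per-object properties come from `measure.label` followed by `measure.regionprops`; the elliptical blob below is a synthetic stand-in for a segmented egg:

```python
import numpy as np
from skimage import measure

# Hypothetical binary image with one elliptical "egg" blob
# (semi-axes 12 px horizontally, 18 px vertically).
yy, xx = np.mgrid[0:60, 0:60]
ibw = (((xx - 30) / 12.0) ** 2 + ((yy - 30) / 18.0) ** 2 <= 1).astype(np.uint8)

labels = measure.label(ibw)           # connected-component labelling
regions = measure.regionprops(labels)
r = regions[0]
print(r.bbox)         # bounding box: (min_row, min_col, max_row, max_col)
print(r.area)         # pixel count of the region
print(r.orientation)  # angle of the major axis of the equivalent ellipse
```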
      <p>Next, each individually found object (egg) was rotated so that the main axis was vertical
(Fig. 6b), and the height, width, and volume of the egg were calculated. To calculate the egg’s
volume, the second theorem of Pappus was used. It states that the volume V of a solid of
revolution generated by the revolution of a lamina about an external axis is equal to the product of
the area a of the lamina and the distance d traveled by the lamina’s geometric centroid:</p>
      <p>V = a·d = 2πr·a (6)
where r is the distance from the major axis to the geometric centroid of half of the egg (Fig. 6b);
the structuring element used above is the 3 × 3 all-ones matrix s(u, v).</p>
      <p>Figure 6: The structuring element matrix for morphological erosion and dilation (a). Scheme for
calculating the egg volume using formula (6) (b). The vector r points to the geometric centroid of the
egg's half; y is the egg's central axis.</p>
      <p>In the final binary image, where white pixels represent the object (egg) and black pixels
represent the background, the area of the object can be computed by counting the number of white
pixels:</p>
      <p>
a = Σ_{x=1..w} Σ_{y=1..h} i(x, y) (7)
where i(x, y) is the rotated (vertically aligned) image of the recognized egg; w and h are the width
and height of the image i(x, y).
      </p>
      <p>The physical attributes of eggs are essential for various applications, ranging from optimizing
incubation practices to predicting quality and ensuring consumer satisfaction. The most important
geometrical parameters of eggs are height, width, and volume [10]. In the field of egg
morphometrics, length and width are consistently the most frequently measured parameters for
studying egg shape variation [11, 12]. These two dimensions are used to calculate various indices,
such as elongation and asymmetry, which help quantify subtle differences in egg shape. The
universality of length and width measurements allows for standardized comparisons of eggs.</p>
      <p>Within the poultry industry, egg volume is strongly correlated with economically important
traits. It is a valuable predictor of chick weight, with studies showing that larger egg weight
(closely related to volume) often results in heavier hatchlings. Egg volume also influences
hatchability, with eggs within a specific volume range generally exhibiting higher hatching
success. Furthermore, egg volume can be indicative of shell quality characteristics, although the
relationship can be complex, with larger volumes sometimes associated with thinner shells.
Additionally, volume provides insights into the egg's interior parameters, such as the proportions
of albumen and yolk, which are important for nutritional assessment [12].</p>
      <p>Thus, this study concentrated on computing geometric parameters of eggs, specifically their
dimensions and volume. Table 1 summarises egg measurements and compares the results obtained
by the proposed image recognition method with manual measurements, which makes it possible to
estimate the effectiveness of the developed algorithm.</p>
      <table-wrap id="tbl1">
        <label>Table 1</label>
        <caption>
          <p>Comparison of automated and mechanical measurement data for chicken eggs (H – height; W – width; V – volume; subscript 1 denotes mechanical measurements)</p>
        </caption>
        <table>
          <thead>
            <tr><th colspan="3">Automated measurement data</th><th colspan="3">Mechanical measurement data</th><th colspan="3">Relative error, %</th></tr>
            <tr><th>H, mm</th><th>W, mm</th><th>V, cm³</th><th>H1, mm</th><th>W1, mm</th><th>V1, cm³</th><th>δH</th><th>δW</th><th>δV</th></tr>
          </thead>
          <tbody>
            <tr><td>57.29</td><td>43.83</td><td>56.22</td><td>56.91</td><td>43.98</td><td>54.60</td><td>0.67</td><td>0.34</td><td>2.97</td></tr>
            <tr><td>57.25</td><td>42.50</td><td>53.56</td><td>57.93</td><td>43.54</td><td>54.47</td><td>1.17</td><td>2.39</td><td>1.67</td></tr>
            <tr><td>58.46</td><td>40.93</td><td>48.58</td><td>59.73</td><td>40.73</td><td>49.14</td><td>2.13</td><td>0.49</td><td>1.14</td></tr>
            <tr><td>59.99</td><td>44.22</td><td>60.62</td><td>59.19</td><td>44.14</td><td>57.20</td><td>1.35</td><td>0.18</td><td>5.98</td></tr>
          </tbody>
        </table>
      </table-wrap>
      <p>In general, the data summarised in Table 1 testify to the effectiveness of the developed
automated algorithm for determining the geometric parameters of eggs. The obtained relative
errors for height and width are quite low, which indicates that the method's accuracy is acceptable.
The relative error in determining the volume was somewhat larger. This may be due to the more
complex algorithm used to calculate the volume from a two-dimensional image. Given this, the
algorithm needs to be improved, or additional factors need to be taken into account.</p>
      <p>Certain aspects may cause measurement errors:
- lighting differences in the process of analysis of chicken eggs' geometry [13];
- possible chamber vibrations that may occur during the robotic collection of eggs.</p>
      <p>These aspects will be explored in further publications. The results obtained are also related to
the studies cited in [14,15].</p>
    </sec>
    <sec id="sec-4">
      <title>Conclusions</title>
      <p>An automated method for determining the geometric dimensions (height, width) and volume of
chicken eggs based on processing their digital optical images has been developed and tested. A specialised
installation for obtaining photo images under controlled lighting conditions and an algorithm for
calculating egg parameters are proposed.</p>
      <p>The research results showed a high consistency between the geometric parameters of eggs
determined using the developed automated method and the results of direct mechanical
measurements. The relative error in determining the height and width of eggs did not exceed
3.0 %, and the error in determining the volume did not exceed 6.0 %. This indicates the
effectiveness and acceptable accuracy of the proposed approach for practical application in
quality control systems.</p>
      <p>The results show that the developed installation and image processing algorithm have the
potential to create easy-to-use and reproducible systems for automated determination of egg
parameters. This can help increase the productivity of sorting, calibration and quality control
processes in the poultry industry.</p>
    </sec>
    <sec id="sec-5">
      <title>Declaration on Generative AI</title>
      <p>The authors have not employed any Generative AI tools.</p>
      <p>[9] S. van der Walt, J.L. Schönberger, J. Nunez-Iglesias, F. Boulogne, J.D. Warner, N. Yager, E. Gouillart, T. Yu, scikit-image: Image processing in Python. PeerJ 2:e453 (2014). doi:10.7717/peerj.453.</p>
      <p>[10] DSTU 5028:2008. Hen's eggs for human consumption. Specifications. Kyiv, State Consumer Standard of Ukraine, 2009, 17 p. (in Ukrainian).</p>
      <p>[11] L. Severa, Š. Nedomová, J. Buchar, J. Cupera, Novel approaches in mathematical description of hen egg geometry. International Journal of Food Properties, 16(7) (2013) 1472-1482. doi:10.1080/10942912.2011.595028.</p>
      <p>[12] D.R. Jones, G.E. Ward, P. Regmi, D.M. Karcher, Impact of egg handling and conditions during extended storage on egg quality. Poultry Science, 97(2) (2018) 716-723. doi:10.3382/ps/pex351.</p>
      <p>[13] P. Maruschak, I. Konovalenko, Y. Osadtsa, V. Medvid, O. Shovkun, D. Baran, H. Kozbur, R. Mykhailyshyn, Surface illumination as a factor influencing the efficacy of defect recognition on a rolled metal surface using a deep neural network. Applied Sciences, 14(6) (2024) 2591. doi:10.3390/app14062591.</p>
      <p>[14] A. Buketov, P. Maruschak, O. Sapronov, D. Zinchenko, V. Yatsyuk, S. Panin, Enhancing performance characteristics of equipment of sea and river transport by using epoxy composites. Transport, 31 (2016) 333-342.</p>
      <p>[15] I. Konovalenko, P. Maruschak, V. Brevus, O. Prentkovskis, Recognition of scratches and abrasions on metal surfaces using a classifier based on a convolutional neural network. Metals, 11 (2021) 549.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[1] R. Karoui, B. Kemps, F. Bamelis, B. De Ketelaere, E. Decuypere, J. De Baerdemaeker, Methods to evaluate egg freshness in research and industry: A review. Eur Food Res Technol, 222 (2006) 727-732. doi:10.1007/s00217-005-0145-4.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>[2] S. Harnsoongnoen, N. Jaroensuk, The grades and freshness assessment of eggs based on density detection using machine vision and weighing sensor. Sci Rep, 11, 16640 (2021). doi:10.1038/s41598-021-96140-x.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[3] C.W. Cheng, S.Y. Jung, C.C. Lai, S.-Y. Tsai, C.-C. Jeng, Transmission spectral analysis models for the assessment of white-shell eggs and brown-shell eggs freshness. J Supercomput, 76 (2020) 1680-1694. doi:10.1007/s11227-019-03008-z.</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[4] L. Qi, M. Zhao, Z. Li, D. Shen, J. Lu, Non-destructive testing technology for raw eggs freshness: a review. SN Appl. Sci., 2 (2020) 1113. doi:10.1007/s42452-020-2906-x.</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[5] C.-L. Chang, B.-X. Xie, C.-H. Wang, Visual guidance and egg collection scheme for a smart poultry robot for free-range farms. Sensors, 20 (2020) 6624. doi:10.3390/s20226624.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>[6] Y. Liu, S. Jin, A. Alimu, L. Jiang, H. Jin, Nondestructive detection of egg freshness based on a decision-level fusion method using hyperspectral imaging technology. Food Measure, 18 (2024) 4334-4345. doi:10.1007/s11694-024-02497-8.</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>[7] ITU-R, Parameter values for the HDTV standards for production and international programme exchange. Recommendation ITU-R BT.709-6, International Telecommunication Union, Geneva, Switzerland, June 2015.</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] R.C. Gonzalez, R.E. Woods, Digital Image Processing, 4th ed. Pearson Education, 2017.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>