<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Contactless Classification of Strawberry Using Hyperspectral Imaging</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Binu Melit Devassy</string-name>
          <email>binu.m.devassy@ntnu.no</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sony George</string-name>
          <email>sony.george@ntnu.no</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Department of Computer Science, Norwegian University of Science and Technology</institution>
          ,
          <addr-line>Gjøvik 2802</addr-line>
          ,
          <country country="NO">Norway</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>Rapid non-contact estimation of fruit quality parameters is an essential factor for an efficient food processing pipeline. We propose a novel workflow for the contactless classification of strawberries based on their sugar content, using Hyperspectral Imaging (HSI) and a One-Dimensional Convolutional Neural Network (1D-CNN). Sugar content is an important quality aspect of strawberries; hence, classification based on sugar content gives more yield to fruit producers. We used a Visible and Near Infrared (VNIR) hyperspectral camera to acquire HSI data of 50 ripe strawberries and applied the proposed method to classify them. To verify the advantage of the proposed method, the results from the 1D-CNN are compared against other standard classification methods, namely Spectral Angle Mapper (SAM) and Spectral Information Divergence (SID). The results show that the 1D-CNN outperformed the other methods, achieving 96.6% classification accuracy.</p>
      </abstract>
      <kwd-group>
        <kwd>Hyperspectral Imaging</kwd>
        <kwd>Strawberry classification</kwd>
        <kwd>Fruits classification using CNN</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>Non-invasive measurement of different food attributes is of great interest. Sugar
content is one of the characteristics that enhances the customer experience and
influences the market value of the fruit [1]. Sugar content is also one of the key factors
used for grading fruits [2]; measuring this attribute with imaging technologies eases
the sorting process and offers several advantages, such as contactless and
non-destructive measurement, the possibility of automation, and high accuracy within a
certain limit. Hyperspectral imaging (HSI) is one of the most suitable imaging
modalities and has proven useful for this purpose [3]. Many extensive studies have
reported on the usefulness of HSI for non-invasive measurement of several types of
fruits, such as apples, oranges, and kiwifruits [4][5][6].
Hyperspectral imaging systems capture spatial and spectral information
simultaneously, which enables HSI technology to make a more reliable classification than
traditional three-channel imaging methods. In addition, HSI has already proven to be
an effective non-invasive method for studying material properties, in contrast to
chemical analyses, which are generally invasive in nature. HSI finds applications
in many different fields, such as remote sensing [7], cultural heritage [8][9], and
forensics [10]. Food science is an important field where hyperspectral imaging is used
extensively to monitor quality, diseases, and stages of development in a variety of
food products.
The strawberry is a favorite fruit for many people, and this interest is reflected in
its increasing production every year [11]. Motivated by the lack of HSI-based
predictive sugar analysis for strawberries, we have studied and evaluated strawberry
classification based on sugar content using HSI. Customer interest in strawberries is
mainly driven by the taste of the fruit, and most of the time by its sweetness.
Convolutional Neural Networks (CNNs) have been used in many computer vision and
image classification applications; CNNs are designed to learn features from the
training data, which can then be used to classify test samples. The most common CNNs
are two-dimensional (2D-CNN); however, in this work we propose a one-dimensional
CNN (1D-CNN), because spectral signals are one-dimensional (1D) signals with
varying amplitude. 1D-CNNs provide reliable solutions in many 1D signal-processing
applications, such as electrocardiographic (ECG) signals [12], audio [13], and
other 1D signals [14][15]. There have also been a few attempts to use 1D-CNNs for
spectral classification; one such experiment used a 1D-CNN to classify ink spectra [16].
To verify the effectiveness of the proposed method, we compare it against two
well-known classification methods in the HSI domain: Spectral Angle Mapper (SAM)
and Spectral Information Divergence (SID).</p>
      <p>The remainder of this paper is organized as follows. Section 2 presents the details of
the fruit samples used, the hyperspectral acquisition of the fruits, and the processing of
the HSI data. Section 3 presents and discusses the results from this study, and Section 4
draws the conclusions.</p>
    </sec>
    <sec id="sec-2">
      <title>Materials and Methods</title>
      <sec id="sec-2-1">
        <title>Fruit Samples</title>
        <p>Strawberries were purchased from local markets in Norway, and fifty of them were
selected as candidates by avoiding fruits with any defects, bruises, or infections.
All the strawberries belonged to the same class and were produced in the same
environment. The fruits were kept in ideal storage conditions and taken to the lab
environment, at a controlled room temperature, an hour before the hyperspectral image
acquisition. Before measurement, the strawberries were washed to remove any
contamination, and water drops were wiped from the fruits with a dry, clean cloth.</p>
      </sec>
      <sec id="sec-2-2">
        <title>Hyperspectral Acquisition</title>
        <p>
          The hyperspectral acquisition system used for this experiment is shown in Fig. 1; it is
the same setup as used in a previous experiment [17], except for the samples. A
          <xref ref-type="bibr" rid="ref7">HySpex
VNIR1800</xref>
          [18] push-broom hyperspectral camera was used for the hyperspectral image
acquisition of the fruit samples. The camera was placed at right angles to a moving
translation stage on which the fruit samples were placed, and two halogen light sources
illuminated the scene in a 45°:0° geometry with respect to the camera to minimize
shadowing. The camera has a spectral sampling of 3.18 nm along its spectral range,
which divides the supported spectral range (400 nm to 1000 nm) into 186 bands. The
image acquisition resulted in a hyperspectral data cube with spatial (X and Y) and
spectral (Z) directions. Here, the size of the X-axis was 1800 pixels, the size of the
Y-axis depended on the size and number of strawberries in a single scan, and the size of
the Z-axis was 186. A reference target with known reflectance values (Contrast
Multi-Step Target [19]) was present in the scene and was used to convert radiance to
reflectance while processing the data.
The sugar content of each strawberry was measured immediately after the spectral
measurement using a refractometer (PAL-1, Atago Co., Ltd., Japan). This method of
sugar content estimation requires the fruits to be squeezed for their juice, from which
the refractometer measures degrees Brix (°Bx). The degree Brix represents “the
percentage of water-soluble solids in fruit juice and can be affected by many factors
including variety, growth region, growth year, and maturity level of the fruit” [20]. In
this case, the degree Brix represents the sugar content of the strawberry juice; one
degree Brix can be defined as one gram of sucrose in 100 grams of fruit juice. Fig. 2
shows the distribution of sugar levels measured from the samples used.
Fig. 3 shows the proposed 1D-CNN architecture. The input spectra pass through a
series of ‘n’ hidden layers, where each hidden layer consists of a 1D-convolution
layer with ReLU (Rectified Linear Unit) activation; the value of ‘n’ was finalized
during parameter tuning. A dropout layer with a rate of 0.5 follows the convolution
layers, followed by a max-pooling layer with a pool size of 2. A flatten layer then
flattens the max-pooled output, followed by two dense layers; the last dense layer uses
‘softmax’ activation to generate the classification result. The network used
categorical cross-entropy as the loss function, a proven choice for learning
multi-class classification problems, and the Adam optimizer [21].
Using Equation 2, we can define a normalized vector as in Equation 3:

p_j = x_j / Σ_{i=1}^{N} x_i   (2)

p = { p_j }_{j=1}^{N}   (3)
        </p>
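        <p>The hidden-layer blocks described above (1D convolution with ReLU activation, max pooling, and a dense softmax head; dropout is omitted for brevity) can be sketched in plain NumPy. This is a minimal illustration with hypothetical filter counts, kernel sizes, and random weights, not the tuned Keras model; the final tuned parameters appear in Table 1.</p>

```python
import numpy as np

def conv1d_relu(x, kernels):
    """Valid 1D convolution of spectrum x with each kernel row, then ReLU."""
    k = kernels.shape[1]
    out = np.array([[np.dot(x[i:i + k], w) for i in range(len(x) - k + 1)]
                    for w in kernels])
    return np.maximum(out, 0.0)  # ReLU activation

def max_pool1d(feat, pool=2):
    """Non-overlapping max pooling with pool size 2 along each feature map."""
    n = feat.shape[1] // pool * pool
    return feat[:, :n].reshape(feat.shape[0], -1, pool).max(axis=2)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
spectrum = rng.random(186)                         # one 186-band reflectance spectrum
kernels = rng.standard_normal((8, 5))              # 8 filters, kernel size 5 (hypothetical)
feat = max_pool1d(conv1d_relu(spectrum, kernels))  # feature maps of shape (8, 91)
weights = rng.standard_normal((2, feat.size))      # dense softmax head, 2 classes
probs = softmax(weights @ feat.ravel())            # class probabilities, sum to 1
```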
        <p>(2)
p = { }j=1
(3)
Finally, SID can be defined as Equation 4, where x and y are the normalized vectors
generated from reference (r) and target (t)
Where

( ,  ) =  ( ∥  ) +  ( ∥  )
 ( ∥  ) = ∑ =1   log</p>
        <p>( ∥  ) = ∑ =1</p>
        <p>(4)
(5)
(6)
2.7</p>
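        <p>The SID equations translate directly into code. A minimal sketch, assuming two reflectance spectra stored as 1D NumPy arrays; the small epsilon guarding against a logarithm of zero is our addition.</p>

```python
import numpy as np

def sid(r, t, eps=1e-12):
    """Spectral Information Divergence between reference r and target t."""
    p = r / r.sum()                                    # normalize to probability vectors
    q = t / t.sum()
    d_pq = np.sum(p * np.log((p + eps) / (q + eps)))   # relative entropy D(p || q)
    d_qp = np.sum(q * np.log((q + eps) / (p + eps)))   # relative entropy D(q || p)
    return d_pq + d_qp                                 # symmetric sum

a = np.array([1.0, 2.0, 3.0])
b = np.array([3.0, 2.0, 1.0])
sid(a, a)   # identical spectra give zero divergence
sid(a, b)   # differing spectra give a positive value
```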
      </sec>
      <sec id="sec-2-3">
        <title>Data Processing</title>
        <p>The major steps in the data processing pipeline are preprocessing, calculation of
normalized reflectance, and sample segmentation. The camera software performs the
preprocessing of the data, which includes dark current reduction, sensor corrections,
and radiometric calibration. After preprocessing, the HSI data were converted to
normalized reflectance by utilizing the known reflectance of the reference target
present in the scene. Then a region of interest (ROI) was selected manually for each
strawberry, segmenting the fruit while avoiding saturated areas of the data. The
saturated areas are parts of the fruit that possess abnormal spectra due to the
glossiness of the strawberry. For faster processing of the data, we fixed the ROI size
at 5x5 pixels.</p>
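        <p>These steps can be sketched on a synthetic cube. The array sizes, target location, and target reflectance value below are hypothetical; the real pipeline uses the camera software's radiometrically calibrated output and the measured reflectance of the multi-step target.</p>

```python
import numpy as np

rng = np.random.default_rng(1)
cube = rng.random((100, 80, 186)) * 4000   # small synthetic radiance cube (X, Y, bands)
known_reflectance = 0.99                    # assumed reflectance of one target step

# Per-band mean radiance over the pixels covering the reference target.
white_mean = cube[0:10, 0:10, :].mean(axis=(0, 1))

# Normalized reflectance: scale each band by the reference target's radiance.
reflectance = cube * (known_reflectance / white_mean)

# A 5x5-pixel ROI on one berry, averaged into a single 186-band spectrum.
roi = reflectance[50:55, 40:45, :]
mean_spectrum = roi.mean(axis=(0, 1))
```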
      </sec>
      <sec id="sec-2-4">
        <title>CNN Implementation and Parameter Tuning</title>
        <p>The proposed CNN architecture was implemented in Python using Keras [24], and a
Python framework known as SHERPA [25] was used for parameter tuning. The number
of filters, the kernel size of the convolution layers, the training batch size, the
learning rate, the number of hidden layers, and the number of epochs of the proposed
CNN model were tuned using SHERPA. All these parameters were initialized with
random Gaussian distributions and optimized using Bayesian optimization for
hyperparameter tuning [26].</p>
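        <p>Independently of SHERPA's own API, the tuning loop can be sketched as a simple random search over the same parameter space. This is a generic stand-in for the Bayesian optimization actually used, with a dummy objective in place of training and validating the 1D-CNN; the candidate values are illustrative.</p>

```python
import random

space = {
    "n_filters":     [8, 16, 32, 64],
    "kernel_size":   [3, 5, 7, 9],
    "batch_size":    [16, 32, 64],
    "n_layers":      [1, 2, 3, 4],
    "learning_rate": [1e-4, 1e-3, 1e-2],
}

def validation_accuracy(cfg):
    """Dummy objective standing in for training the 1D-CNN and scoring it."""
    return 1.0 - abs(cfg["n_layers"] - 3) * 0.1 - abs(cfg["kernel_size"] - 5) * 0.01

random.seed(0)
best_cfg, best_acc = None, -1.0
for _ in range(50):                                   # 50 random trials
    cfg = {k: random.choice(v) for k, v in space.items()}
    acc = validation_accuracy(cfg)
    if acc > best_acc:
        best_cfg, best_acc = cfg, acc
```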
      </sec>
      <sec id="sec-2-5">
        <title>Training and Evaluation</title>
        <p>The strawberries were divided into two groups based on a threshold sugar value.
Berries with sugar values greater than the threshold were considered high in sugar
content, and berries with lower sugar values than the threshold were considered low in
sugar content. The sugar values of the strawberries varied between 6 °Bx and 12 °Bx;
hence, we used multiple threshold values from 7 °Bx to 10 °Bx in 0.5 °Bx increments.
The use of varying thresholds produced imbalanced data sets, so random oversampling
was used to compensate for the imbalance. The oversampled data were divided randomly
into training and test data, with the training data containing 80% of the total data and
the remainder used for the evaluation of the 1D-CNN. To evaluate SAM and SID, the
reference spectra were generated from the training data set by calculating the mean
spectra [27]. The K-fold technique, with shuffling enabled and a split count of five,
was used to calculate the cross-validation results.</p>
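        <p>The balancing and splitting steps can be sketched as follows. The spectra, sugar values, and the 8.5 °Bx threshold below are synthetic stand-ins; random oversampling simply re-draws minority-class samples with replacement until the two classes match.</p>

```python
import numpy as np

rng = np.random.default_rng(2)
spectra = rng.random((50, 186))              # one mean spectrum per berry
sugar = rng.uniform(6.0, 12.0, size=50)      # measured °Bx values (synthetic)
labels = (sugar > 8.5).astype(int)           # one of the tested thresholds

# Random oversampling: re-draw from the minority class until classes are equal.
minority = int(labels.sum() < len(labels) / 2)
idx_min = np.flatnonzero(labels == minority)
idx_maj = np.flatnonzero(labels != minority)
extra = rng.choice(idx_min, size=len(idx_maj) - len(idx_min), replace=True)
idx_all = np.concatenate([idx_maj, idx_min, extra])

# 80/20 shuffled train/test split of the balanced index set.
rng.shuffle(idx_all)
n_train = int(0.8 * len(idx_all))
train_idx, test_idx = idx_all[:n_train], idx_all[n_train:]
```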
        <p>Accuracy was used as the parameter for comparing the classification capability of
the proposed method against SAM and SID. Accuracy is defined as the ratio between
truly predicted outcomes (true positives + true negatives) and the sum of all
predictions. HSI data of 50 strawberries were acquired and processed using the setup
and processing methods described in Sections 2.2 and 2.3. The average spectra of all
strawberries, obtained from their ROIs, are presented in Fig. 4; we can observe that
they appear nearly identical in the visible region and differ in the near-infrared
(NIR) region. From the average spectra it is difficult to classify them visually;
hence, we require a reliable method for achieving this.</p>
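        <p>This definition of accuracy amounts to the fraction of correct predictions; as a one-line sketch:</p>

```python
import numpy as np

def accuracy(y_true, y_pred):
    """(TP + TN) / all predictions, i.e. the fraction of correct predictions."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float((y_true == y_pred).mean())

accuracy([1, 0, 1, 1, 0], [1, 0, 0, 1, 0])   # 4 of 5 correct -> 0.8
```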
        <p>The proposed 1D-CNN was implemented and tuned for hyperparameters; the final
architecture with the fine-tuned values is shown in Fig. 5. The number of hidden layers
required was determined to be three, and the input and output data sizes for each block,
based on the final parameters, are given in the diagram. The details of the parameters
and their final tuned values are displayed in Table 1.
The fine-tuned 1D-CNN was trained and tested, and the results were compared against
SID and SAM. Fig. 6 shows the variation in accuracy and loss against epochs; it can be
observed that both flatten within a few epochs. Table 2 summarizes the accuracy
results obtained after cross-validation for each threshold value. From these results,
it is clear that the proposed method outperformed the traditional methods: the 1D-CNN
achieved a high average accuracy score of 0.96, compared to 0.58 for SID and 0.6 for
SAM. It would not be fair to compare this result against previous studies on sugar and
strawberry spectra, because those studies focused mainly on predicting sugar values
rather than on classification [28][29].
To evaluate the effectiveness of the method, we varied the threshold sugar value and
executed cross-validation for each threshold. The average accuracy obtained from
cross-validation is plotted in Fig. 7, which shows that the threshold sugar value has a
negligible effect on the classification accuracy of the 1D-CNN method compared to
SID and SAM.
The performance of SAM and SID was low because of the nearly identical reference
spectra: both methods need a reference spectrum to compare against the test spectra,
and the reference spectra were generated from the training spectra by averaging.
Sample reference spectra are presented in Fig. 8, where we can observe that the
reference spectra for low and high sugar values are nearly identical. Hence, methods
such as SAM and SID, which rely on the geometry of the spectrum, fail to predict
correctly in most cases. However, the 1D-CNN, which can extract features from every
sample spectrum in the training set, can learn effectively and make precise
predictions.</p>
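        <p>This geometric argument can be illustrated with SAM itself: the spectral angle between two nearly identical class means is close to zero, leaving little margin for discrimination. A small sketch with synthetic spectra (the spectral shapes below are illustrative, not measured data):</p>

```python
import numpy as np

def sam(r, t):
    """Spectral Angle Mapper: angle (radians) between spectra r and t."""
    cos = np.dot(r, t) / (np.linalg.norm(r) * np.linalg.norm(t))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

band = np.linspace(0.0, 1.0, 186)
ref_low = 0.2 + 0.6 * band              # mean spectrum, low-sugar class (synthetic)
ref_high = 0.2 + 0.6 * band + 0.01      # nearly identical high-sugar class mean

angle_between_refs = sam(ref_low, ref_high)   # tiny angle: classes barely separable
```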
      </sec>
    </sec>
    <sec id="sec-3">
      <title>Conclusion</title>
      <p>Hyperspectral acquisition of the strawberries was performed, creating an HSI
database of 50 strawberries. The proposed 1D-CNN method was implemented and tested
on the strawberry HSI data set. In addition, the classification accuracy of the
proposed method was validated against SID and SAM, and the proposed method produced
higher accuracy in classifying strawberries based on sugar levels. In future research,
we aim to extend these results to a wider variety of cultivars and a larger sample
count in order to enable an industrial application.</p>
      <p>Choi, K., Lee, K., Kim, G.: Nondestructive quality evaluation technology for
fruits and vegetables using near-infrared spectroscopy. Int. Semin. Enhancing
Export Compet. Asian Fruits. (2006).</p>
      <p>Manolakis, D., Marden, D., Shaw, G. a: Hyperspectral Image Processing for
Automatic Target Detection Applications. Lincoln Lab. J. (2003).</p>
      <p>Pu, Y.Y., Feng, Y.Z., Sun, D.W.: Recent progress of hyperspectral imaging on
quality and safety inspection of fruits and vegetables: A review. Compr. Rev.
Food Sci. Food Saf. (2015). https://doi.org/10.1111/1541-4337.12123.
Zhu, H., Chu, B., Fan, Y., Tao, X., Yin, W., He, Y.: Hyperspectral Imaging for
Predicting the Internal Quality of Kiwifruits Based on Variable Selection
Algorithms and Chemometric Models. Sci. Rep. (2017).
https://doi.org/10.1038/s41598-017-08509-6.</p>
      <p>Li, J., Huang, W., Tian, X., Wang, C., Fan, S., Zhao, C.: Fast detection and
visualization of early decay in citrus using Vis-NIR hyperspectral imaging.
Comput. Electron. Agric. (2016).
https://doi.org/10.1016/j.compag.2016.07.016.</p>
      <p>Adão, T., Hruška, J., Pádua, L., Bessa, J., Peres, E., Morais, R., Sousa, J.J.:
Hyperspectral imaging: A review on UAV-based sensors, data processing and
applications for agriculture and forestry. Remote Sens. (2017).
https://doi.org/10.3390/rs9111110.</p>
      <p>Deborah, H., George, S., Hardeberg, J.Y.: Spectral-divergence based pigment
discrimination and mapping: A case study on The Scream (1893) by Edvard
Munch. J. Am. Inst. Conserv. (2019).
https://doi.org/10.1080/01971360.2018.1560756.</p>
      <p>Melit Devassy, B., George, S., Hardeberg, J.Y.: Comparison of Ink
Classification Capabilities of Classic Hyperspectral Similarity Features.
Presented at the (2019). https://doi.org/10.1109/icdarw.2019.70137.</p>
      <p>Melit Devassy, B., George, S.: Dimensionality reduction and visualisation of
hyperspectral ink data using t-SNE. Forensic Sci. Int. (2020).
https://doi.org/10.1016/j.forsciint.2020.110194.</p>
      <p>Food and Agriculture Organization of the United Nations,
http://www.fao.org/home/en/, last accessed 2020/06/10.</p>
      <p>Donida Labati, R., Muñoz, E., Piuri, V., Sassi, R., Scotti, F.: Deep-ECG:
Convolutional Neural Networks for ECG biometric recognition, (2018).
https://doi.org/10.1016/j.patrec.2018.03.028.</p>
      <p>Hershey, S., Chaudhuri, S., Ellis, D.P.W., Gemmeke, J.F., Jansen, A., Moore,
R.C., Plakal, M., Platt, D., Saurous, R.A., Seybold, B., Slaney, M., Weiss, R.J.,
Wilson, K.: CNN architectures for large-scale audio classification. In: ICASSP,
IEEE International Conference on Acoustics, Speech and Signal Processing
Proceedings. pp. 131–135 (2017).
https://doi.org/10.1109/ICASSP.2017.7952132.</p>
      <p>Abdeljaber, O., Avci, O., Kiranyaz, S., Gabbouj, M., Inman, D.J.: Real-time
vibration-based structural damage detection using one-dimensional
convolutional neural networks. J. Sound Vib. 388, 154–170 (2017).
https://doi.org/10.1016/j.jsv.2016.10.043.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>Spectrochim. Acta Part A Mol. Biomol. Spectrosc. (<year>2008</year>).</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>https://doi.org/10.1016/j.saa.2008.03.005.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name>
            <surname>Manganaro</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>De Gyvez</surname>
            ,
            <given-names>J.P.</given-names>
          </string-name>
          :
          <article-title>One-dimensional discrete-time CNN with multiplexed template-hardware</article-title>
          .
          <source>IEEE Trans. Circuits Syst. I Fundam. Theory Appl</source>
          .
          <volume>47</volume>
          ,
          <fpage>764</fpage>
          -
          <lpage>769</lpage>
          (
          <year>2000</year>
          ). https://doi.org/10.1109/81.847884.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <string-name>
            <surname>Devassy</surname>
            ,
            <given-names>B.M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>George</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          :
          <article-title>Ink Classification Using Convolutional Neural Network</article-title>
          . Nis. J.
          <volume>12</volume>
          , (
          <year>2019</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          <string-name>
            <surname>Melit Devassy</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>George</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Nussbaum</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          :
          <article-title>Unsupervised Clustering of Hyperspectral Paper Data Using t-SNE</article-title>
          .
          <source>J. Imaging. 6</source>
          ,
          <issue>29</issue>
          (
          <year>2020</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>https://doi.org/10.3390/jimaging6050029.</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          <source>HySpex VNIR1800</source>
          , https://www.hyspex.no, last accessed 2020/06/03.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>Contrast Multi-Step Target, https://www.labspherestore.com/, last accessed 2020/06/03.</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <string-name>
            <surname>Türkmen</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ekşi</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          :
          <article-title>Brix degree and sorbitol/xylitol level of authentic pomegranate (Punica granatum) juice</article-title>
          .
          <source>Food Chem</source>
          .
          <volume>127</volume>
          ,
          <fpage>1404</fpage>
          -
          <lpage>1407</lpage>
          (
          <year>2011</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>https://doi.org/10.1016/j.foodchem.2010.12.118.</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          <string-name>
            <surname>Kingma</surname>
            ,
            <given-names>D.P.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ba</surname>
            ,
            <given-names>J.L.</given-names>
          </string-name>
          :
          <article-title>Adam: A method for stochastic optimization</article-title>
          .
          <source>ICLR Int. Conf. Learn. Represent</source>
          . (
          <year>2015</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          Summ. Third Annu.
          <source>JPL Airborne Geosci. Work. JPL Publ. 92-14</source>
          , Vol.
          <volume>1</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          <source>In: International Geoscience and Remote Sensing Symposium (IGARSS)</source>
          (
          <year>1999</year>
          ). https://doi.org/10.1109/igarss.1999.773549.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>Keras, https://keras.io/, last accessed 2020/06/05.</mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>SHERPA, https://parameter-sherpa.readthedocs.io/en/latest/, last accessed 2020/06/05.</mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          <string-name>
            <surname>Snoek</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Larochelle</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Adams</surname>
            ,
            <given-names>R.P.</given-names>
          </string-name>
          :
          <article-title>Practical Bayesian optimization of machine learning algorithms</article-title>
          . arXiv.
          <fpage>1</fpage>
          -
          <lpage>12</lpage>
          (
          <year>2012</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          <string-name>
            <surname>Kruse</surname>
            ,
            <given-names>F.A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Lefkoff</surname>
            ,
            <given-names>A.B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Boardman</surname>
            ,
            <given-names>J.W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Heidebrecht</surname>
            ,
            <given-names>K.B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shapiro</surname>
            ,
            <given-names>A.T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Barloon</surname>
            ,
            <given-names>P.J.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Goetz</surname>
            ,
            <given-names>A.F.H.</given-names>
          </string-name>
          :
          <article-title>The spectral image processing system (SIPS)- interactive visualization and analysis of imaging spectrometer data</article-title>
          .
          <source>Remote Sens. Environ</source>
          . (
          <year>1993</year>
          ). https://doi.org/10.1016/0034-4257(93)90013-N.
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          Presented at the (
          <year>2013</year>
          ). https://doi.org/10.13031/2013.19105.
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          <string-name>
            <surname>ElMasry</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Wang</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>ElSayed</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Ngadi</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          :
          <article-title>Hyperspectral imaging for nondestructive determination of some quality attributes for strawberry</article-title>
          .
          <source>J. Food Eng</source>
          . (
          <year>2007</year>
          ). https://doi.org/10.1016/j.jfoodeng.2006.10.016.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>