<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">Lightweight segmentation of UAV images for early detection of maize leaf diseases</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Arnaud</forename><forename type="middle">S R M</forename><surname>Ahouandjinou</surname></persName>
							<email>arnaud.ahouandjinou@imsp-uac.org</email>
							<affiliation key="aff1">
								<orgName type="department">Institute of Training and Research in Computer Science</orgName>
								<orgName type="institution">University of Abomey-Calavi</orgName>
								<address>
									<settlement>Abomey-Calavi</settlement>
									<country key="BJ">Benin</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Ambroise</forename><forename type="middle">D K</forename><surname>Houedjissin</surname></persName>
							<email>abhouedjissin@gmail.com</email>
							<affiliation key="aff0">
								<orgName type="department">Doctoral School of Engineering Science</orgName>
								<orgName type="institution">University of Abomey-Calavi</orgName>
								<address>
									<settlement>Abomey-Calavi</settlement>
									<country key="BJ">Benin</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Manhougbé</forename><forename type="middle">Probus A F</forename><surname>Kiki</surname></persName>
							<affiliation key="aff0">
								<orgName type="department">Doctoral School of Engineering Science</orgName>
								<orgName type="institution">University of Abomey-Calavi</orgName>
								<address>
									<settlement>Abomey-Calavi</settlement>
									<country key="BJ">Benin</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Xavier</forename><surname>Amétépé</surname></persName>
							<affiliation key="aff1">
								<orgName type="department">Institute of Training and Research in Computer Science</orgName>
								<orgName type="institution">University of Abomey-Calavi</orgName>
								<address>
									<settlement>Abomey-Calavi</settlement>
									<country key="BJ">Benin</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Kokou</forename><forename type="middle">M</forename><surname>Assogba</surname></persName>
							<email>mkokouassogba@gmail.com</email>
							<affiliation key="aff0">
								<orgName type="department">Doctoral School of Engineering Science</orgName>
								<orgName type="institution">University of Abomey-Calavi</orgName>
								<address>
									<settlement>Abomey-Calavi</settlement>
									<country key="BJ">Benin</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">Lightweight segmentation of UAV images for early detection of maize leaf diseases</title>
					</analytic>
					<monogr>
						<idno type="ISSN">1613-0073</idno>
					</monogr>
					<idno type="MD5">71B6DD14A9B88A3538022252264EB914</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2025-04-23T17:39+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>UAV</term>
					<term>RGB images</term>
					<term>motion blur</term>
					<term>maize disease detection</term>
					<term>segmentation</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>Unmanned Aerial Vehicles (UAVs) equiped with RGB cameras have emerged as effective tools for monitoring agricultural crops. However, motion blur in UAV images can affect the accuracy of subsequent image analysis tasks, such as disease detection in plant leaves. This study proposes a real-time image segmentation approach for analyzing UAV-captured maize leaf images. The algorithm evaluates image blur using the Laplacian variance, applies an adaptive Wiener filter for deblurring, segments maize leaves from the background using color transformations, and identifies diseased regions through Canny edge and contour detection. Experimental results demonstrate the lightweight and effectiveness of proposed approach with less than 1s runtime, improving image quality and allowing accurate disease detection of maize crops for real-time purpose.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.">Introduction</head><p>Plant disease detection is a key application of UAVs and has been extensively researched <ref type="bibr" target="#b0">[1]</ref>. One of the advantages of using UAVs is its ability to detect diseases early and prevent their spread, thereby reducing crop losses <ref type="bibr" target="#b1">[2]</ref>. Decision-support systems that incorporate UAVs can lead to better decisionmaking, increased production, improved product quality, and labor savings <ref type="bibr" target="#b2">[3]</ref>. UAVs are utilized across various crop types and for detecting multiple diseases. Some diseases present visible symptoms, while others require temperature measurements for detection <ref type="bibr" target="#b3">[4]</ref>. Early detection of pests and crop diseases provides farmers and other stakeholders with enough time to prevent potential epidemics and minimize yield losses. However, motion blur in UAV images generally caused by the camera movement during image capture, the combined effects of atmospheric turbulence, the shaking of the UAV platforms, high altitude or operation errors can affect the accuracy of subsequent image analysis tasks, such as disease detection in crop plant leaves <ref type="bibr" target="#b4">[5]</ref>. This represents a common issue in UAV imagery and various methods have been proposed to address motion blur.</p><p>On the other hand, recent advancements in deep learning (DL) have produced various methods for detecting and classifying plant diseases using images of infected plants <ref type="bibr" target="#b5">[6]</ref>. However, they require huge datasets for advanced approaches such as CNN to produce good results and large image datasets result in increased accuracy rates <ref type="bibr" target="#b6">[7]</ref>.</p><p>This study presents a real-time algorithm for the segmentation of UAV images, specifically targeting the detection of maize plant leaf diseases. The proposed method leverages motion blur detection, adaptive Wiener filtering, color conversion combined with morphological operations, Canny edge detection, Otsu color thresholding and contour area detection so that to isolate infected regions. The results demonstrate the algorithm's efficacy in identifying unhealthy plant areas in less than 1 second runtime, thereby providing a robust tool for precision agriculture in real-time. The rest of the paper is organized as follows: in section 2, we present the related works, section 3 outlines the proposed approach, the experimentation is introduced in section 4 and results and discussion are presented in section 5. Finally, section 6 provides a conclusion.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">Related works</head><p>Successful disease estimation has been demonstrated in many UAV-based imagery applications such as <ref type="bibr" target="#b7">[8]</ref>, <ref type="bibr" target="#b8">[9]</ref>, <ref type="bibr" target="#b9">[10]</ref>. These studies often used either the mean value of the vegetation index or the count of pixels below a certain threshold within a plot to estimate the disease score.</p><p>Table <ref type="table" target="#tab_0">1</ref> gives a synthetic comparative analysis of classical image classification techniques in plant leaves healthy and unhealthy area detection. Compares the distribution of colors in an image using a threshold.</p><p>-Simple to implement -Fast computation -Limited discriminative power -Sensitive to changes in lighting conditions <ref type="bibr" target="#b0">[1]</ref> Texture Analysis Analyzing textural patterns in an image to characterize healthy and unhealthy areas.</p><p>-Captures subtle differences in texture -Robust against changes in lighting and color -Parameter tuning required -May be computationally intensive <ref type="bibr" target="#b1">[2]</ref> Machine Learning Models Utilizing machine learning algorithms (e.g., SVM, Random Forest, CNN) to learn features and classify healthy and unhealthy areas.</p><p>-High accuracy and robustness -Can automatically learn complex patterns -Requires large amounts of labeled data -Training and inference can be computationally costly <ref type="bibr" target="#b2">[3]</ref> According to Table <ref type="table" target="#tab_0">1</ref> the choice of classification algorithm depends on various factors, including the desired level of accuracy, computational resources, and the availability of labeled data. Color thresholding is simple and efficient but may lack the discriminative power of more complex methods like texture analysis and machine learning models. Texture analysis can capture subtle differences in texture but may require more computational resources. Machine learning models offer high accuracy but come with higher complexity and resource requirements, especially during training.  </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.">Proposed Approach</head></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.1.">UAV Image Acquisition from field</head><p>First, UAVs equipped with high-resolution RGB cameras capture images of maize fields. The UAV images were collected between 4:45 p.m. and 6:00 p.m., on July 14, 2024, whereas it was sunny and windless. The DJI Mini 3 Pro, a quadcopter UAV system, was used to collect aerial images of maize leaves from a field. This system carried an automated RGB sensor (Quad Bayer CMOS camera) which was developed for agricultural applications.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.1.1.">Image Preprocessing</head><p>The second step in the algorithm involves preprocessing the UAV-captured images to enhance their quality and reduce noise. This is crucial for improving the accuracy of subsequent image analysis steps.</p><p>1. Motion Blur Detection and Assessing: the Laplacian variance <ref type="bibr" target="#b13">[14]</ref> is used to detect motion blur. When detected, an adaptive Wiener filter is applied to deblur the images. To assess the degree of motion blur in the UAV images, we calculate the Laplacian variance of the grayscale image:</p><formula xml:id="formula_0">𝐿𝑎𝑝𝑙𝑎𝑐𝑖𝑎𝑛(𝐼) = ∑ + ,<label>(1)</label></formula><formula xml:id="formula_1">𝑉𝑎𝑟𝑖𝑎𝑛𝑐𝑒 = ∑ (𝐿𝑎𝑝𝑙𝑎𝑐𝑖𝑎𝑛(𝑖, 𝑗) − 𝜇) , (<label>2</label></formula><formula xml:id="formula_2">)</formula><p>where I is the grayscale image, μ is the mean of the Laplacian image, and N is the total number of pixels.</p><p>The variance provides a measure of image sharpness, with lower values indicating higher blur levels.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">Image Deblurring by Adaptive Wiener Filter: if the image is found to be blurred, an adaptive</head><p>Wiener filter is applied to reduce noise and enhance details. The Wiener filter operates as follows:</p><p> Normalize grayscale image:</p><formula xml:id="formula_3">𝐼 = .<label>(3)</label></formula><p> Create averaging kernel:</p><formula xml:id="formula_4">𝐾 = 1 ×<label>(4)</label></formula><p> Compute local mean and variance:</p><formula xml:id="formula_5">𝜇 = 𝑐𝑜𝑛𝑣𝑜𝑙𝑣𝑒2𝑑(𝐼 , 𝐾, '𝑠𝑎𝑚𝑒')<label>(5)</label></formula><p> Compute overall variance:</p><formula xml:id="formula_6">𝜎 = 𝑐𝑜𝑛𝑣𝑜𝑙𝑣𝑒2𝑑(𝐼 , 𝐾, '𝑠𝑎𝑚𝑒') − 𝜇<label>(6)</label></formula><p> Apply Wiener filter</p><formula xml:id="formula_7">𝐼 = (𝐼 − 𝜇 ) ⋅ ( ) + 𝜇<label>(7)</label></formula><p> Denormalize:</p><formula xml:id="formula_8">𝐼 = 𝐼 ⋅ 255 (8)</formula></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.1.2.">Image Segmentation</head><p>The third step in the algorithm relates to segmentation which plays a crucial role in identifying regions of interest within UAV images, such as healthy and infected areas of maize plant leaves, for further analysis.</p><p>1. HSV Color Segmentation: the deblurred image is then converted to the HSV (Hue, Saturation, Value) color space to facilitate segmentation based on color characteristics <ref type="bibr" target="#b14">[15]</ref>. The green color range corresponding to healthy maize leaves is defined in the HSV space, and a binary mask is created to isolate these regions:</p><formula xml:id="formula_9">𝑀𝑎𝑠𝑘 = 1 𝑖𝑓 𝑟𝑎𝑛𝑔𝑒1 ≤ 𝐻𝑆𝑉(𝑥, 𝑦) ≤ 𝑟𝑎𝑛𝑔𝑒2 0 𝑜𝑡ℎ𝑒𝑟𝑤𝑖𝑠𝑒<label>(9)</label></formula><p>where 𝑟𝑎𝑛𝑔𝑒1 and 𝑟𝑎𝑛𝑔𝑒2 define the HSV range for green color. Morphological operations, including opening and closing, are applied to the binary mask to remove noise and smooth the segmented regions.</p><p>𝑀𝑜𝑟𝑝ℎ𝑜𝑙𝑜𝑔𝑦(𝐼) = (𝐼 ⊙ 𝐾) ⊕ 𝐾 (10) where ⊙ denotes morphological opening and ⊕ denotes closing, and 𝐾 is the structuring element.</p><p>2. Division into Patch: the preprocessed image is divided into a specified number of equal-sized patches for localized analysis of disease symptoms. The number of patches here is 10.</p><p>Calculate patch dimensions:</p><formula xml:id="formula_10">ℎ = (11) ; 𝑤 = (<label>12</label></formula><formula xml:id="formula_11">)</formula><p>where 𝐻 is the height and 𝑊 is the width of the patch 3. Cany Edge Detection: each patch undergoes further analysis to detect and classify objects of interest. The edges of potential diseased regions are detected using the Canny edge detection algorithm <ref type="bibr" target="#b15">[16]</ref>, which identifies boundaries based on gradients in the image:</p><formula xml:id="formula_12">𝐸𝑑𝑔𝑒𝑠(𝑥, 𝑦) = 1 𝑖𝑓 𝐺(𝑥, 𝑦) &gt; 𝑡ℎ𝑟𝑒𝑠ℎ𝑜𝑙𝑑 0 𝑜𝑡ℎ𝑒𝑟𝑤𝑖𝑠𝑒<label>(13)</label></formula><p>where G(x,y) is the gradient magnitude at pixel (x,y).</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.1.3.">Image Classification</head><p>The detected edges are used to identify contours, which are then classified as healthy or unhealthy based on their area. A threshold is applied to differentiate between large healthy regions and smaller unhealthy spots. Contours are then extracted and classified based on their area, with larger areas typically indicating healthy regions and smaller areas potentially indicating diseased regions. Following equations explain the classification process:</p><formula xml:id="formula_13">𝐴𝑟𝑒𝑎 = ∑ ∑ 𝐼 (<label>14</label></formula><formula xml:id="formula_14">)</formula><p>where 𝐼 is the pixel value within a contour.</p><formula xml:id="formula_15">𝐶𝑙𝑎𝑠𝑠𝑖𝑓𝑖𝑐𝑎𝑡𝑖𝑜𝑛 = 𝐻𝑒𝑎𝑙𝑡ℎ𝑦 𝑖𝑓 𝐴𝑟𝑒𝑎 &gt; 𝐴 𝑈𝑛ℎ𝑒𝑎𝑙𝑡ℎ𝑦 𝑜𝑡ℎ𝑒𝑟𝑤𝑖𝑠𝑒 (<label>15</label></formula><formula xml:id="formula_16">)</formula><p>where 𝐴 is fixed here to 500.</p><p>Below is the detailed algorithm: </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.">Experimentation</head><p>The characteristics of the UAV employed for the flight mission and the computer used for testing resulting aerial images are shown in Table <ref type="table" target="#tab_1">2</ref> below.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.1.">Characteristics of UAV used for data acquisition</head><p>The mini-sized, mega-capable DJI Mini 3 Pro is just as powerful as it is portable. Weighing less than 249 g and with upgraded safety features, it is not only regulation-friendly but also the safest in its series <ref type="bibr">[17]</ref>. With a 1/1.3-inch sensor and top-tier features, it redefines what it means to fly Mini.</p><p>Table <ref type="table" target="#tab_1">2</ref> shows the specifications of the small UAV used for acquiring the images of the maize plants leaves used for constructing the dataset <ref type="bibr">[18]</ref>. </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.2.">Experimentation site localization</head><p>The images were accessed on 14 July 2024 and acquired on board the UAV, in the village of Dodji-Sèhè inside Sekou in the town of Allada, Benin Republic. Following Figure <ref type="figure" target="#fig_3">2</ref> and Figure <ref type="figure">3</ref> give additional details on the site of study. </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.3.">Dataset</head><p>The dataset consists of 266 images. When the images are captured in the state of stabilization of the device they are usually clear. On the other hand, the images captured during flight time are subject to motion blur.</p><p>The selected images were in the JPG file format and 4032 x 3024 pixels (see Figure <ref type="figure" target="#fig_4">4</ref>). </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5.">Results and discussion</head></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5.1.">Preprocessing steps</head><p>Below we present a sample of image from the dataset with preprocessing steps as follows:  <ref type="table" target="#tab_2">3</ref> indicates the performances characteristics of the deblurred image. </p><formula xml:id="formula_17">(a) (b) (c)</formula></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5.2.">HSV color segmentation</head><p>We perform here plant leaves segmentation using color transformations based on HSV color space combined with morphological operations. Figure <ref type="figure" target="#fig_7">7</ref> shows the result of extracted maize leaves from background.  We perform here plant leaves segmentation using color transformations based on HSV color space combined with morphological operations.</p><p>Color transformations provides a reliable means to segment maize leaves from the background, a critical step for accurate disease detection.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5.3.">Division into patches and Canny Edge Detection</head><p>The preprocessed image is divided into a specified number of equal-sized patches for localized analysis of disease symptoms. The number of patches here is 10. Each patch undergoes Canny edge algorithm to detect objects of interest.</p><p>Figure <ref type="figure" target="#fig_8">8</ref> shows the resulting image after this combined process.  Table <ref type="table">4</ref> below presents the runtime of the program for a previously clear image. Table <ref type="table">4</ref> Performance Analysis of classification results of tested UAV maize leaves images.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5.4.">Health classification</head><p>Performance criteria Value Runtime of the program 0.863659143447876 seconds According to Table <ref type="table">4</ref>, the classification results are otained in 0,86 second less than 1 second runtime, which ensures low computation performance of the proposed approach, crucial condition for real-time application and decision-making.</p><p>The proposed approach addresses several key challenges in UAV-based crop monitoring. By evaluating and correcting image blur, we ensure that the subsequent segmentation and analysis steps are based on high-quality data. The use of adaptive Wiener filtering is particularly beneficial for realtime applications due to its efficiency and effectiveness in varying noise conditions. Color transformations provide a reliable means to segment maize leaves from the background, a critical step for accurate disease detection. The division of the image into patches allows for detailed localized analysis, making it possible to detect early signs of disease that might be missed in a full-image analysis. Canny edge detection and color thresholding leverages both structural and color information, enhancing speed and robustness of disease detection in 0,86 second.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.">Conclusion</head><p>This study presents a comprehensive real-time image processing algorithm for analyzing UAVcaptured images of maize leaves. First, we leverage adaptive Wiener filtering to address image motion blur, then use color space transformation to segment maize leaves and finally patch-based analysis to detect diseased regions through a combination of edge detection and color analysis. The proposed method offers a promising solution for automated crop health monitoring, enabling timely interventions and improving agricultural productivity, forming a fast and effective tool for precision agriculture with less than 1 second. Future work will focus on optimizing the algorithm for different crop types and integrating it into a real-time UAV-based monitoring system.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Figure 1</head><label>1</label><figDesc>Figure1describes the proposed approach.</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Figure 1 :</head><label>1</label><figDesc>Figure 1: Scheme of the proposed method. The proposed algorithm involves several key steps and each step is based on specific mathematical operations and image processing techniques to ensure accurate segmentation and detection.</figDesc><graphic coords="3,127.08,62.40,346.44,153.36" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Algorithm 1 :</head><label>1</label><figDesc>Lightweight Segmentation for maize leave disease detection Input: x,y, I(x, y), ctg(), clv(), avf(), rhsv(), sgr(), mo(), dpp(), ced(), fdc() --image of leaves Begin 1: Initialize Image Processing Output: Result --Segmented and classified plant regions 2: I(x, y) ← ctg(I(x, y))) --Convert the UAV image to a grayscale image 3: σ² ← clv(I(x, y)) --Calculate the Laplacian variance to assess image sharpness 4: if σ² &lt; Threshold then 5: I(x, y) ← avf(I(x, y)) --Apply the adaptive Wiener filter if Variance &lt; threshold 6: end if 7: IHSV ← rhsv(I(x, y)) --Resize the image and convert it to the HSV color space 8: ILeaves ← sgr(IHSV) --Segment green plant regions using predefined HSV ranges 9: Imo ←mo(IMorph--Apply morphological operations to enhance segmentation 10: Patch [ ] ← dpp(Imo) --Divide the processed image into smaller patches 11: for each patch in Patch do 12: Edge [ ] ← ced(patch) --Canny edge detection of each patch 13: Cntrs [ ] ← fdc(Edge [ ]) --Classify detected edges based on health status 14: end For</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_3"><head>Figure 2 :</head><label>2</label><figDesc>Figure 2 : Acquisition site localization, Figure 3 : Dodji-Sèhè village, Sékou District. Benin Republic Map.</figDesc><graphic coords="6,132.60,371.04,68.16,86.52" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_4"><head>Figure 4 :</head><label>4</label><figDesc>Example of UAV images of maize leaves from the dataset. Image from (a) to (d) appear to be sharp whereas those from (e) to (h) are motion-blurred.</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_5"><head>Figure 5 :</head><label>5</label><figDesc>Figure 5: (a) Original image, (b) Adaptive Wiener deblurred image of (a), (c) histogram of b Table3indicates the performances characteristics of the deblurred image. Table 3 Performances of the Adaptive Wiener Filter image deblurring step Image Quality Index Value Image Entropy 6.508706806116296 MSE 704.7281638886699 PSNR 19.65058732797596 SSIM 0.853058838351285 This deblurred image is characterized by an Entropy of 6,50; a Minimun Square Error of 704,72; a Peak Signal to Noise Ratio of 19,65 and a Structural Similarity Index Measure of 0,85, which indicates generation of an image of better quality.</figDesc><graphic coords="7,372.84,125.28,106.08,85.20" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_6"><head>Figure 6 :</head><label>6</label><figDesc>Figure 6 : Original image of maize leaves.Figure 7 : Maize Leaves Extracted from background.We perform here plant leaves segmentation using color transformations based on HSV color space combined with morphological operations.Color transformations provides a reliable means to segment maize leaves from the background, a critical step for accurate disease detection.</figDesc><graphic coords="7,363.84,486.48,120.48,108.48" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_7"><head>Figure 7 :</head><label>7</label><figDesc>Figure 6 : Original image of maize leaves.Figure 7 : Maize Leaves Extracted from background.We perform here plant leaves segmentation using color transformations based on HSV color space combined with morphological operations.Color transformations provides a reliable means to segment maize leaves from the background, a critical step for accurate disease detection.</figDesc><graphic coords="7,104.52,486.96,123.84,107.64" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_8"><head>Figure 8 :</head><label>8</label><figDesc>Division into 10 patches and Canny Edge Detection.</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_9"><head>Figure 9 Figure 9 :</head><label>99</label><figDesc>Figure 9 below shows the health classification results generated for the 10 patches of the sample image used above.</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_0"><head>Table 1</head><label>1</label><figDesc></figDesc><table><row><cell cols="3">Comparison of classical image classification techniques</cell><cell></cell><cell></cell></row><row><cell>Classification</cell><cell></cell><cell></cell><cell></cell><cell></cell></row><row><cell>Techniques</cell><cell>Description</cell><cell>Advantages</cell><cell>Limits</cell><cell>References</cell></row><row><cell>Color</cell><cell></cell><cell></cell><cell></cell><cell></cell></row><row><cell>thresholding</cell><cell></cell><cell></cell><cell></cell><cell></cell></row></table></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_1"><head>Table 2</head><label>2</label><figDesc>Characteristics of the DJI Mini 3 Pro used for experimentation</figDesc><table><row><cell>Characteristics</cell><cell>Specifications</cell></row><row><cell>Model</cell><cell>DJI Mini 3 Pro</cell></row><row><cell>Weight</cell><cell>Under 250g</cell></row><row><cell>Camera</cell><cell>1/1.3" (0.77") 48MP f1.7 Quad Bayer CMOS Sensor</cell></row><row><cell>Frame per second (fps)</cell><cell>4K 60 fps with HDR; 1080p 120fps</cell></row><row><cell>Camera Orientation</cell><cell>Horizontal and Fully Vertical camera orientations</cell></row><row><cell>Sensors</cell><cell>Obstacle Avoidance Sensors</cell></row><row><cell>Flight Mode</cell><cell>Intelligent</cell></row><row><cell>Flight times</cell><cell>Up to 47 minutes (with the optional larger batter) or 25-</cell></row><row><cell></cell><cell>30 minutes with the standard supplied batteries.</cell></row><row><cell>Controller</cell><cell>New Smart with 1080p 30fps</cell></row><row><cell>Maximum flight speed</cell><cell>16 m/s</cell></row><row><cell>Maximum flight height</cell><cell>4000m</cell></row><row><cell cols="2">Maximum horizontal range 8KM</cell></row></table></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_2"><head>Table 3</head><label>3</label><figDesc>Performances of the Adaptive Wiener Filter image deblurring step This deblurred image is characterized by an Entropy of 6,50; a Minimun Square Error of 704,72; a Peak Signal to Noise Ratio of 19,65 and a Structural Similarity Index Measure of 0,85, which indicates generation of an image of better quality.</figDesc><table><row><cell>Image Quality Index</cell><cell>Value</cell></row><row><cell>Image Entropy</cell><cell>6.508706806116296</cell></row><row><cell>MSE</cell><cell>704.7281638886699</cell></row><row><cell>PSNR</cell><cell>19.65058732797596</cell></row><row><cell>SSIM</cell><cell>0.853058838351285</cell></row></table></figure>
		</body>
		<back>

			<div type="acknowledgement">
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Acknowledgments</head><p>This work was not supported by any funding. The first author would like to acknowledge individuals and groups that assisted in the research and the preparation of this work and is thankful to the Benin Ministry of Higher Education and Scientific Research.</p><p>This Word template was created by Tiago Prince Sales (University of Twente, NL) in collaboration with Manfred Jeusfeld (University of Skövde, SE). It is derived from the template designed by Aleksandr Ometov (Tampere University of Applied Sciences, FI). The template is made available under a Creative Commons License Attribution-ShareAlike 4.0 International (CC BY-SA 4.0).</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Declaration on Generative AI</head><p>The author(s) have not employed any Generative AI tools.</p></div>
			</div>

			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">The rise of the drones in agriculture</title>
		<author>
			<persName><forename type="first">F</forename><surname>Veroustraete</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">EC agriculture</title>
		<imprint>
			<biblScope unit="volume">2</biblScope>
			<biblScope unit="issue">2</biblScope>
			<biblScope unit="page" from="325" to="327" />
			<date type="published" when="2015">2015</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">The use of UAVs in monitoring yellow sigatoka in banana</title>
		<author>
			<persName><forename type="first">V</forename><forename type="middle">B C</forename><surname>Calou</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Santos Teixeira</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><forename type="middle">C J</forename><surname>Moreira</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><forename type="middle">S</forename><surname>Lima</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">B</forename><surname>De Oliveira</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">R R</forename><surname>Oliveira</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">biosystems engineering</title>
		<imprint>
			<biblScope unit="volume">193</biblScope>
			<biblScope unit="page" from="115" to="125" />
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Quantifying the effect of Jacobiasca lybica pest on vineyards with UAVs by combining geometric and computer vision techniques</title>
		<author>
			<persName><forename type="first">A</forename><surname>Del-Campo-Sanchez</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">PLoS One</title>
		<imprint>
			<biblScope unit="volume">14</biblScope>
			<biblScope unit="issue">4</biblScope>
			<biblScope unit="page">e0215521</biblScope>
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Mapping skips in sugarcane fields using object-based analysis of unmanned aerial vehicle (UAV) images</title>
		<author>
			<persName><forename type="first">C</forename><forename type="middle">H W</forename><surname>De Souza</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><forename type="middle">A C</forename><surname>Lamparelli</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">V</forename><surname>Rocha</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><forename type="middle">S G</forename><surname>Magalhães</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Computers and Electronics in Agriculture</title>
		<imprint>
			<biblScope unit="volume">143</biblScope>
			<biblScope unit="page" from="49" to="56" />
			<date type="published" when="2017">2017</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">Video deblurring via motion compensation and adaptive information fusion</title>
		<author>
			<persName><forename type="first">Z</forename><surname>Zhan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">X</forename><surname>Yang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Y</forename><surname>Li</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Pang</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Neurocomputing</title>
		<imprint>
			<biblScope unit="volume">341</biblScope>
			<biblScope unit="page" from="88" to="98" />
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">Plant leaf detection and disease recognition using deep learning</title>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">V</forename><surname>Militante</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><forename type="middle">D</forename><surname>Gerardo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><forename type="middle">V</forename><surname>Dionisio</surname></persName>
		</author>
		<ptr target="https://ieeexplore.ieee.org/abstract/document/8942686/" />
	</analytic>
	<monogr>
		<title level="m">2019 IEEE Eurasia conference on IOT, communication and engineering (ECICE)</title>
				<imprint>
			<publisher>IEEE</publisher>
			<date type="published" when="2019-08-16">2019. Aug. 16, 2024</date>
			<biblScope unit="page" from="579" to="582" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">A further step to perfect accuracy by training CNN with larger data</title>
		<author>
			<persName><forename type="first">S</forename><surname>Uchida</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Ide</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><forename type="middle">K</forename><surname>Iwana</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Zhu</surname></persName>
		</author>
		<ptr target="https://ieeexplore.ieee.org/abstract/document/7814098/" />
	</analytic>
	<monogr>
		<title level="m">2016 15th International Conference on Frontiers in Handwriting Recognition (ICFHR)</title>
				<imprint>
			<publisher>IEEE</publisher>
			<date type="published" when="2016-08-16">2016. Aug. 16, 2024</date>
			<biblScope unit="page" from="405" to="410" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">Peanut leaf wilting estimation from RGB color indices and logistic models</title>
		<author>
			<persName><forename type="first">S</forename><surname>Sarkar</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">F</forename><surname>Ramsey</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A.-B</forename><surname>Cazenave</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Balota</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Frontiers in plant science</title>
		<imprint>
			<biblScope unit="volume">12</biblScope>
			<biblScope unit="page">658621</biblScope>
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">Wheat yellow rust detection using UAV-based hyperspectral technology</title>
		<author>
			<persName><forename type="first">A</forename><surname>Guo</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Remote Sensing</title>
		<imprint>
			<biblScope unit="volume">13</biblScope>
			<biblScope unit="issue">1</biblScope>
			<biblScope unit="page">123</biblScope>
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">High throughput phenotyping of tomato spot wilt disease in peanuts using unmanned aerial systems and multispectral imaging</title>
		<author>
			<persName><forename type="first">A</forename><surname>Patrick</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Pelham</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Culbreath</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><forename type="middle">C</forename><surname>Holbrook</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><forename type="middle">J</forename><surname>De Godoy</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Li</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">IEEE Instrumentation &amp; Measurement Magazine</title>
		<imprint>
			<biblScope unit="volume">20</biblScope>
			<biblScope unit="issue">3</biblScope>
			<biblScope unit="page" from="4" to="12" />
			<date type="published" when="2017">2017</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<analytic>
		<title level="a" type="main">Color indexing</title>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">J</forename><surname>Swain</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><forename type="middle">H</forename><surname>Ballard</surname></persName>
		</author>
		<idno type="DOI">10.1007/BF00130487</idno>
	</analytic>
	<monogr>
		<title level="j">Int J Comput Vision</title>
		<imprint>
			<biblScope unit="volume">7</biblScope>
			<biblScope unit="issue">1</biblScope>
			<biblScope unit="page" from="11" to="32" />
			<date type="published" when="1991-11">Nov. 1991</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b11">
	<analytic>
		<title level="a" type="main">Textural features for image classification</title>
		<author>
			<persName><forename type="first">R</forename><forename type="middle">M</forename><surname>Haralick</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Shanmugam</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><forename type="middle">H</forename><surname>Dinstein</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">IEEE Transactions on systems, man, and cybernetics</title>
		<imprint>
			<biblScope unit="issue">6</biblScope>
			<biblScope unit="page" from="610" to="621" />
			<date type="published" when="1973">1973</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b12">
	<analytic>
		<title level="a" type="main">ImageNet classification with deep convolutional neural networks</title>
		<author>
			<persName><forename type="first">A</forename><surname>Krizhevsky</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Sutskever</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><forename type="middle">E</forename><surname>Hinton</surname></persName>
		</author>
		<idno type="DOI">10.1145/3065386</idno>
	</analytic>
	<monogr>
		<title level="j">Commun. ACM</title>
		<imprint>
			<biblScope unit="volume">60</biblScope>
			<biblScope unit="issue">6</biblScope>
			<biblScope unit="page" from="84" to="90" />
			<date type="published" when="2017-05">May 2017</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b13">
	<analytic>
		<title level="a" type="main">Blur image detection using Laplacian operator and Open-CV</title>
		<author>
			<persName><forename type="first">R</forename><surname>Bansal</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Raj</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Choudhury</surname></persName>
		</author>
		<ptr target="https://ieeexplore.ieee.org/abstract/document/7894491/" />
	</analytic>
	<monogr>
		<title level="m">2016 International Conference System Modeling &amp; Advancement in Research Trends (SMART), IEEE</title>
				<imprint>
			<date type="published" when="2016-08-17">2016. Aug. 17, 2024</date>
			<biblScope unit="page" from="63" to="67" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b14">
	<analytic>
		<title level="a" type="main">Analysis of various image preprocessing techniques for denoising of flower images</title>
		<author>
			<persName><forename type="first">I</forename><surname>Patel</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Patel</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Patel</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">International Journal of Computer Sciences and Engineering</title>
		<imprint>
			<biblScope unit="volume">6</biblScope>
			<biblScope unit="issue">5</biblScope>
			<biblScope unit="page" from="1111" to="1117" />
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b15">
	<analytic>
		<title level="a" type="main">A computational approach to edge detection</title>
		<author>
			<persName><forename type="first">J</forename><surname>Canny</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">IEEE Transactions on pattern analysis and machine intelligence</title>
		<imprint>
			<biblScope unit="volume">6</biblScope>
			<biblScope unit="page" from="679" to="698" />
			<date type="published" when="1986">1986</date>
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
