<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
<title level="a" type="main">Potato Leaf Disease Detection using CNN - A Lightweight Approach</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Abhisek</forename><surname>Saha</surname></persName>
							<email>saha.abhisek@gmail.com</email>
							<affiliation key="aff0">
								<orgName type="institution">Netaji Subhash Engineering College</orgName>
								<address>
									<addrLine>Techno City, Garia, Ranabhutia</addrLine>
									<postCode>700152</postCode>
									<settlement>Kolkata</settlement>
									<region>West Bengal</region>
									<country key="IN">India</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Syed</forename><forename type="middle">Mohammed</forename><surname>Musharraf</surname></persName>
							<email>syedmdmusharraf@gmail.com</email>
							<affiliation key="aff0">
								<orgName type="institution">Netaji Subhash Engineering College</orgName>
								<address>
									<addrLine>Techno City, Garia, Ranabhutia</addrLine>
									<postCode>700152</postCode>
									<settlement>Kolkata</settlement>
									<region>West Bengal</region>
									<country key="IN">India</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Anubhav</forename><surname>Dey</surname></persName>
							<email>anubhavd56@gmail.com</email>
							<affiliation key="aff0">
								<orgName type="institution">Netaji Subhash Engineering College</orgName>
								<address>
									<addrLine>Techno City, Garia, Ranabhutia</addrLine>
									<postCode>700152</postCode>
									<settlement>Kolkata</settlement>
									<region>West Bengal</region>
									<country key="IN">India</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Hiranmoy</forename><surname>Roy</surname></persName>
							<email>hiranmoy.roy@rcciit.org.in</email>
							<affiliation key="aff1">
								<orgName type="department">Department of Information Technology</orgName>
								<orgName type="institution">RCC Institute of Information Technology</orgName>
								<address>
									<addrLine>Canal South Road</addrLine>
									<postCode>700015</postCode>
									<settlement>Kolkata</settlement>
									<country key="IN">India</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Debotosh</forename><surname>Bhattacharjee</surname></persName>
							<email>debotoshb@hotmail.com</email>
							<affiliation key="aff2">
								<orgName type="department">Department of Computer Science &amp; Engineering</orgName>
								<orgName type="institution">Jadavpur University</orgName>
								<address>
									<postCode>700032</postCode>
									<settlement>Kolkata</settlement>
									<country key="IN">India</country>
								</address>
							</affiliation>
						</author>
<title level="a" type="main">Potato Leaf Disease Detection using CNN - A Lightweight Approach</title>
					</analytic>
					<monogr>
						<idno type="ISSN">1613-0073</idno>
					</monogr>
					<idno type="MD5">A2FFF922A858C2B5F4E23FFFABC757ED</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2025-04-23T19:10+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Potato Leaf Disease</term>
					<term>CNN</term>
					<term>Image Enhancement</term>
					<term>Image classification</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>Detection of potato leaf diseases at an early stage is of great significance to the agricultural industry. The conventional tactics of disease identification are either unreliable, very complex, or costly, making them unsuitable as viable techniques. However, with the boom in the field of artificial intelligence, many procedures have emerged in recent years to help solve this problem. Data being the fuel for such procedures, it is very important to source reliable and accurate data for training AI-based models. The task of disease detection for potato leaves is quite challenging, as the symptoms show considerable variation depending on the species, climate, and environmental factors. Popular pretrained models such as VGG16, Inception V3, and ResNet50 are commonly used to classify plant diseases. In our research, we have built a custom Convolutional Neural Network classification model that is more robust and lightweight than the existing approaches. The model is built with a very simple approach and is trained using two standard, publicly available datasets, namely "PlantVillage" and PLD. The suggested model has shown promising and consistent output, with accuracies of 99.3% and 99.23% on the two datasets respectively. To achieve this accuracy, we have used the image enhancement algorithm CLAHE at the preprocessing stage, after data acquisition.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.">Introduction</head><p>Since the dawn of human civilization, agriculture has played a crucial role in transforming people from roving hunter-gatherers into settled citizens <ref type="bibr" target="#b0">[1]</ref>. It has facilitated the growth of large human populations by providing a reliable and stable source of nutrition. The history of agriculture is a long continuum of groundbreaking innovations, evolving rapidly through industrial revolutions and advancements in modern science, particularly in the 20th and 21st centuries. Unlike a single, definitive moment, the origin of agriculture in human civilization unfolded over centuries and cannot be precisely dated. Researchers concur that early Homo sapiens began transitioning from a nomadic lifestyle to settling down, domesticating animals, and cultivating cereal seeds during the early Neolithic period, known as the Neolithic Revolution <ref type="bibr" target="#b1">[2]</ref>. This shift likely took place as glaciers retreated northward and the climate warmed, approximately 10,000 years ago, though some estimates place it at 12,000 or even 15,000 years ago. This further led to the development of complex societies as humans learnt the ways of trade, driving the growth of economies and exchange. From a contemporary standpoint, agriculture is a very dynamic industry that is essential to meeting the basic need of food for the entire population. In a developing nation like India, agriculture is an important sector of the economy, as it contributes about 15% of total GDP and provides employment to about 60% of the population <ref type="bibr" target="#b2">[3]</ref>. However, despite being a formidable industry, the sector suffers havoc due to crop diseases. Plant diseases can have a major effect on the leaves, fruits, and other parts of crops, degrading crop quality and yield <ref type="bibr" target="#b3">[4]</ref>. 
This, in turn, contributes to food scarcity and insecurity on a global scale. It is estimated that crop diseases cause an annual loss of around 16% in global crop yields, making them a major factor behind famines and rising production costs. According to predictions from the Food and Agriculture Organization (FAO), there will be 9.5 billion people on Earth within the next thirty years <ref type="bibr" target="#b4">[5]</ref>, meaning that a 75% increase in food production is necessary to provide a consistent supply of food. Diseases and disorders are among the factors that impact plants and their products. As opposed to disorders, which are mostly caused by factors such as rainfall, temperature, moisture, and nutritional deficiencies, illnesses are caused by biotic agents such as fungi, bacteria, and algae <ref type="bibr" target="#b5">[6]</ref>. For the well-being of the crops, early and precise identification of diseases is necessary so that the correct cure can be applied in time. Various methods are available for diagnosing plant diseases, one of the simplest being visual inspection. Conventional diagnostic techniques frequently depend on the farmer's knowledge, which can be erratic and untrustworthy. Due to the extensive, time-consuming process involved and the restricted availability of experts in remote places <ref type="bibr" target="#b0">[1]</ref>, this strategy is frequently impracticable. To improve accuracy, researchers have introduced spectrometers to distinguish between healthy and infected plant leaves <ref type="bibr" target="#b6">[7]</ref>. Another technique involves extracting leaf DNA using PCR <ref type="bibr" target="#b7">[8]</ref>. These methods are complex, costly, and time-consuming, requiring specialized skills, controlled experimental conditions, and extensive use of crop safety equipment. The role of artificial intelligence is immensely significant in this respect. 
If the training of deep learning based models is done on labeled samples, we can perform automated, efficient, and accurate leaf disease detection <ref type="bibr" target="#b8">[9]</ref>. We have chosen potato as the crop of our concern, and our model serves to classify both early and late blight disease, as well as healthy leaf images. Potato (Solanum tuberosum) <ref type="bibr" target="#b9">[10]</ref> is a temperate crop grown under subtropical conditions in India. Soils that are loose, muddy, and sandy and rich in organic matter are ideal for growing potato crops; alkaline and saline soils are not suited <ref type="bibr" target="#b10">[11]</ref>. In addition to being a great source of fiber and helping prevent heart disease, potatoes are vital for overall health. Their high antioxidant content aids in the defense against conditions such as high cholesterol and irregular blood sugar levels <ref type="bibr" target="#b11">[12]</ref>. The nation has been growing potatoes for more than three centuries, and potato is currently one of the most widely grown vegetable crops in the country. Potatoes are a cost-effective food that contributes inexpensive energy to the human diet. For the purpose of training our model, we have made use of the publicly available datasets "PlantVillage" and PLD. Both datasets contain potato leaf images for early blight, late blight, and healthy leaves. Early indications of the disease early blight can be seen initially towards the base of the plant, with roughly circular brown spots on the leaves and stems. This fungal infection can be deadly for tubers, leaves, and stems, causing problems such as reduced tuber size, low produce, and poor crop yield. Another fungal disease that affects potato crops is late blight, which first manifests as patches on the stems. These spread very quickly, creating big, dark brown and black areas that frequently look oily <ref type="bibr" target="#b0">[1]</ref>. 
However, a marked variation in these symptoms can be observed based on the region, the climate, the species, etc., making it a difficult task to build a generalized classifier. Kamal et al. <ref type="bibr" target="#b12">[13]</ref> developed two models based on the MobileNet architecture and applied them to the PlantVillage dataset. They achieved accuracies of 97.65% and 98.34% with Reduced MobileNet and Modified MobileNet respectively, for the classification of 55 classes in the PlantVillage dataset. Liang et al. <ref type="bibr" target="#b19">[20]</ref> applied ResNet50 to the PlantVillage dataset and attained an accuracy of 98% for potato leaves. Two different types of deep architectures, the first based on residual learning and the second based on an attention mechanism, have been developed by Karthik and his co-authors <ref type="bibr" target="#b13">[14]</ref>. The model based on the attention mechanism attained an accuracy of 98% on the public PlantVillage dataset for the detection of diseased tomato leaves. Khamparia et al. <ref type="bibr" target="#b14">[15]</ref> designed a hybrid architecture for potato leaf disease detection using a Deep Neural Network and autoencoders and attained an accuracy of 97.50% on the PlantVillage dataset. Islam et al. <ref type="bibr" target="#b15">[16]</ref> presented an approach that integrates image processing and machine learning techniques to classify potato leaf diseases. They used the renowned machine learning algorithm SVM in their recommended solution and attained an accuracy of 95% over 300 images. For the identification and classification of cassava illness, Sambasivam and his team <ref type="bibr" target="#b16">[17]</ref> developed a deep neural network trained on a very small dataset with significant class imbalance. 
A possible solution to the disease identification accuracy problem with the least amount of time investment is the Kuan filtered Hough transformation based reweighted linear program boost classification (KFHT-RLPBC) technique, which was introduced by Nagarjan and his co-authors <ref type="bibr" target="#b17">[18]</ref>. Using the PlantVillage dataset, they achieved an accuracy of 92%. Geetharamani et al. <ref type="bibr" target="#b18">[19]</ref> recommended a deep Convolutional Neural Network that can effectively identify plant leaf diseases. After achieving an accuracy rate of 96.46% on the PlantVillage dataset, they intended to carry out a more thorough analysis of the training procedure without utilizing the tagged photos. Table <ref type="table" target="#tab_0">1</ref> presents a synopsis of related works. Most of the current methods for crop leaf disease detection have used popular transfer learning models, but these models often have a large number of parameters, which leads to the problem of computational complexity. Other approaches have implemented custom Convolutional Neural Networks (CNNs) with reduced parameter counts; however, they generally fall short of achieving a significant level of accuracy. Thus, there is a need for a more efficient, lightweight model for crop leaf disease detection. In order to detect diseases in potato leaves, we present a novel lightweight Convolutional Neural Network (CNN) in this research that can recognize both simple and abstract patterns. The model architecture, illustrated in Figure <ref type="figure" target="#fig_1">1</ref>, comprises convolutional layers, max-pooling layers for edge feature extraction, batch normalization to normalize the input neuron values, dropout layers to reduce overfitting, and fully connected layers for classification. 
We have also used the Contrast Limited Adaptive Histogram Equalization (CLAHE) technique for image preprocessing. The major contributions are as follows.</p><p>• The model introduces a lightweight CNN architecture that is both less complex in terms of number of parameters and highly accurate for potato leaf disease detection. • Through the use of convolution and max-pooling layers, it is able to capture both intricate patterns and minute details. • The addition of CLAHE raises the picture quality, which strengthens the model's capacity to accurately detect damaged potato leaves. • Data augmentation techniques are used to make the training and testing datasets larger and more balanced. The classifier's capacity for generalization increases with the use of various data augmentation strategies.</p><p>The study is organized as follows: the proposed CNN architecture is thoroughly explained in Section 2.</p><p>Section 3 presents the experiments and comparison findings, while Section 4 concludes the work. </p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">Proposed Methodology</head><p>This section comprises three subsections: Acquisition of Data, Data Preprocessing, and Classification.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.1.">Acquisition of Data</head><p>The images of potato leaf diseases were sourced from two publicly available datasets: the PLD dataset <ref type="bibr" target="#b4">[5]</ref> and the PlantVillage dataset <ref type="bibr" target="#b20">[21]</ref>. Both datasets include images of the two blight diseases, early and late, as well as healthy leaves. The PlantVillage dataset provided a total of 2,152 images: 1,000 each for the two blight diseases and 152 for healthy leaves (refer to Table 4). Owing to the limited quantity and imbalance of images, an additional 3,251 photos from Pakistan's Central Punjab were included from the PLD dataset. This dataset contains 816 healthy images, 1,303 early blight images, and 1,132 late blight images after redundancy removal (refer to Table 5). All photos are saved in uncompressed JPG format and have RGB color profiles.</p></div>
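For readers reproducing this setup, the per-class counts reported above can be verified with a short directory scan. This is a minimal sketch; the class folder names below are hypothetical and may not match the actual layout of the downloaded datasets.

```python
from pathlib import Path

# Hypothetical class-folder names; the real PlantVillage/PLD layouts may differ.
CLASSES = ["Early_Blight", "Late_Blight", "Healthy"]

def count_images(root):
    """Count .jpg images per class folder under `root`, to check the
    reported class balance (PlantVillage: 1000/1000/152, PLD: 1303/1132/816)."""
    root = Path(root)
    return {c: len(list((root / c).glob("*.jpg"))) for c in CLASSES}
```

On the full PlantVillage potato subset, such a scan should report counts matching the table referenced above; a mismatch would indicate an incomplete download or a different folder naming scheme.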
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.2.">Data Preprocessing</head><p>As the preprocessing stage, we employed CLAHE <ref type="bibr" target="#b21">[22]</ref>.</p></div>
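The CLAHE step can be sketched as follows. This is a simplified, library-free version: per-tile clipped histogram equalization without the bilinear interpolation between tile mappings that full CLAHE implementations (e.g. OpenCV's createCLAHE) perform, so tile borders may show seams. The tile size and clip limit are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def clahe_simplified(img, tile=(8, 8), clip_limit=2.0, nbins=256):
    """Simplified CLAHE for a single-channel uint8 image:
    per-tile histogram equalization with a clip limit that caps how
    much any intensity bin can contribute, limiting noise amplification."""
    h, w = img.shape
    th, tw = h // tile[0], w // tile[1]
    out = np.empty_like(img)
    for i in range(tile[0]):
        for j in range(tile[1]):
            ys = slice(i * th, (i + 1) * th)
            xs = slice(j * tw, (j + 1) * tw)
            patch = img[ys, xs]
            hist, _ = np.histogram(patch, bins=nbins, range=(0, 255))
            # Clip the histogram and redistribute the excess uniformly.
            limit = max(1, int(clip_limit * patch.size / nbins))
            excess = np.sum(np.maximum(hist - limit, 0))
            hist = np.minimum(hist, limit) + excess // nbins
            # Map intensities through the normalized cumulative histogram.
            cdf = np.cumsum(hist).astype(np.float64)
            cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1) * 255
            out[ys, xs] = cdf[patch].astype(img.dtype)
    return out
```

For color leaf images, a common practice is to apply this only to the luminance channel (e.g. the L channel of LAB) so that hue is preserved.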
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.3.">Classification</head><p>Image processing, recognition, and classification are the main applications of CNNs, a kind of deep learning method. The architecture of the human brain served as the model for the CNN's design: the human brain has neurons, and in neural networks it is likewise the neurons that form the backbone of the entire system. A CNN consists of several layers, namely convolutional, max-pooling, dropout, and fully connected layers.</p><p>The convolution operation for a 2D convolutional layer can be represented as:</p><formula xml:id="formula_0">y_{p,q,r} = \sum_{i=1}^{M} \sum_{j=1}^{N} \sum_{k=1}^{C} x_{p+i,q+j,k} \cdot w_{i,j,k,r} + b_r</formula><p>• y_{p,q,r}: Output feature map at position (p, q) in channel r.</p><p>• x_{p+i,q+j,k}: Input feature map at position (p + i, q + j) in channel k.</p><p>• w_{i,j,k,r}: Convolution filter weight of size (M × N) for input channel k and output channel r.</p><p>• b_r: Bias term for channel r.</p><p>For max-pooling, the operation can be written as:</p><formula xml:id="formula_1">y_{p,q,r} = \max_{i,j} (x_{p+i,q+j,r})</formula><p>• y_{p,q,r}: Output of max-pooling at position (p, q) in channel r.</p><p>• x_{p+i,q+j,r}: Input feature map over a pooling window defined by (i, j).</p><p>The output of a fully connected layer can be expressed as:</p><formula xml:id="formula_2">y_j = \sum_{i=1}^{N} w_{j,i} x_i + b_j</formula><p>• y_j: Output of the j-th neuron in the layer.</p><p>• x_i: Input from the i-th neuron in the previous layer.</p><p>• w_{j,i}: Weight connecting the i-th neuron in the previous layer to the j-th neuron.</p><p>• b_j: Bias term for the j-th neuron.</p><p>The softmax function, used to convert the final layer's logits into probabilities, is given by:</p><formula xml:id="formula_3">\sigma(z)_i = \frac{e^{z_i}}{\sum_{j=1}^{N} e^{z_j}}</formula><p>• \sigma(z)_i: Softmax output for class i.</p><p>• z_i: Logit (raw output) for class i.</p><p>• N: Total number of classes.</p><p>The function of 
these layers is to detect features such as edges and complex patterns. To extract features and different kinds of edges, various types of filters are used as per the requirement. For potato disease detection, we used three well-known transfer learning models: VGG16, InceptionV3, and ResNet-50. VGG16 <ref type="bibr" target="#b22">[23]</ref> is a deep CNN architecture introduced for image classification in 2014. Its architecture is based on an input size of 224x224-pixel RGB images. It consists of 16 layers in total, comprising 13 convolutional and 3 fully connected layers. The ReLU activation function is employed in all layers. VGG16 is trained on the ImageNet dataset and is capable of classifying images into 1,000 categories and detecting objects from 200 classes. The convolutional layers use 3x3 filters, with an increasing number of filters to detect the complex hierarchical patterns in the images. Max-pooling layers of size 2x2 with a stride of 2 have been used to extract features by selecting the maximum-valued pixel within each small region. After the feature extraction layers, there are 2 fully connected layers of 4,096 neurons each, and finally a fully connected layer of 1,000 neurons for classification. The 48-layer deep pretrained CNN InceptionV3 <ref type="bibr" target="#b23">[24]</ref> was trained on the ImageNet dataset and can categorize images into 1,000 different categories. The network is based on an input size of 299x299-pixel RGB images. The architecture of InceptionV3 consists of Inception modules, where each module combines 1x1, 3x3, and 5x5 convolutions. InceptionV3 has fewer parameters because of factorized convolutions: a 5x5 convolution filter can be replaced by two 3x3 filters. In this context, a 5x5 filter requires 25 parameters, but two 3x3 filters require only 18. This reduces the number 
of parameters by 28% without losing the ability to capture the patterns a 5x5 filter captures. Because of this lightweight architecture, it is computationally efficient. ResNet-50 <ref type="bibr" target="#b24">[25]</ref> is a popular deep CNN architecture belonging to the ResNet (Residual Networks) family, which was developed to tackle the problem of vanishing gradients in deep networks by introducing residual blocks. It was first introduced in 2016. Convolutional layers, batch normalization, ReLU activations, and skip (residual) connections make up this 50-layer deep model: 48 convolutional layers, one max-pooling layer, and one average-pooling layer. There are many residual blocks in each of the four main stages of the model. Each residual block in ResNet-50 has a shortcut connection that skips one or more layers, enabling the gradient to pass back through the network without vanishing. The blocks consist of three convolutional layers with 1x1, 3x3, and 1x1 convolutions. The final fully connected layer in ResNet-50 typically has 1,000 output units for the 1,000 classes in the ImageNet dataset, on which the model was originally trained. However, ResNet-50 can be modified to handle any number of classes by adjusting the number of output units in the final layer. This modification is common in transfer learning, where the network is adapted to datasets with fewer or more classes. For our experiment on potato disease detection, we proposed a custom CNN comprising 20 layers (Table 3). The architecture of the proposed CNN model is depicted in the following steps.  • The output of the last dense layer, which consists of three neurons representing the three classes, is then sent through the Softmax activation function to classify the image as either Healthy, Early Blight, or Late Blight. </p></div>
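A lightweight CNN of this kind can be sketched in Keras, which was used for the implementation. The filter counts, dropout rates, and input size below are illustrative assumptions, not the exact 20-layer configuration of Table 3; the optimizer, learning rate, and loss match those reported in Section 3.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(input_shape=(256, 256, 3), n_classes=3):
    """Illustrative lightweight CNN in the spirit of the proposed design:
    stacked Conv2D + MaxPooling2D blocks with batch normalization,
    dropout for regularization, and a 3-way softmax head."""
    m = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.BatchNormalization(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.BatchNormalization(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Dropout(0.25),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),
        # Three output neurons: Healthy, Early Blight, Late Blight
        layers.Dense(n_classes, activation="softmax"),
    ])
    m.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),
              loss="categorical_crossentropy", metrics=["accuracy"])
    return m
```

With only a few hundred thousand parameters, a network of this shape stays far below the tens of millions of parameters in VGG16 or ResNet-50, which is the computational argument made above.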
<div xmlns="http://www.tei-c.org/ns/1.0"><p>Softmax is a generalization of the sigmoid function for multiclass classification, which generates probabilistic outputs between 0 and 1. ReLU activation adds non-linearity to the model in each convolution and max-pooling layer by setting only the negative values to zero while leaving the positive ones unaltered. Details about the hyperparameters are given in Table 9. </p></div>
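The ReLU and softmax behaviour described above can be demonstrated numerically; the logit values here are arbitrary illustrations.

```python
import numpy as np

def relu(x):
    # ReLU: zero out negatives, keep positives unchanged
    return np.maximum(x, 0.0)

def softmax(z):
    # Subtract the max logit before exponentiating, for numerical stability
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Three logits, one per class (Healthy, Early Blight, Late Blight)
p = softmax(np.array([2.0, 1.0, 0.1]))
```

The outputs of softmax are all in (0, 1) and sum to 1, so the largest logit yields the predicted class, exactly as in the final dense layer of the model.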
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.">Experimental Result</head><p>All the experiments have been performed using an Intel Core i5-1135G7 CPU, an NVIDIA Tesla T4 GPU with 16 GB VRAM, 32 GB of RAM, and a Windows operating system. The deep learning implementation was carried out using the TensorFlow 2.16.1 framework and Python 3.12.2 to accelerate neural network operations. The proposed methodology has been implemented using two public datasets, namely PLD and PlantVillage. Keras, a neural network framework written in Python, has been used to implement the model. In total, 2,152 images from the PlantVillage dataset and 3,251 images from the PLD dataset have been used. To artificially enlarge the dataset and improve the flexibility of the model, data augmentation techniques have been employed <ref type="bibr" target="#b25">[26]</ref>. For both datasets, we have used the same techniques. We used the ImageDataGenerator class in Keras for augmentation (Table <ref type="table" target="#tab_2">2</ref>). We applied parameters to rotate the images by up to 30 degrees, to shift them horizontally and vertically, and to zoom in or out randomly within a range of 15%. The learning rate was set to 0.001, the number of epochs to 200, and the batch size to 32 (Table <ref type="table" target="#tab_7">9</ref>). Categorical cross-entropy is the loss function, and the Adam optimizer is employed.</p><p>For plant disease detection, we have used the publicly available PlantVillage dataset <ref type="bibr" target="#b26">[27]</ref> and the PLD dataset. The PlantVillage dataset is a widely available and comprehensive benchmark dataset for crop leaf disease classification. It includes 54,306 samples across 14 plant species, covering a total of 38 classes. 
Among them, 26 classes are from diseased plants, while the remaining 12 belong to healthy plants. We selected samples of three kinds of potato leaves, late blight, early blight, and healthy, from the PlantVillage dataset because our study focuses on potato leaf disease prediction. In all, 2,152 images of potato leaves were used in our experiment; 1,000 of these images showed early blight, 1,000 showed late blight, and the remaining 152 showed healthy leaves (Table <ref type="table" target="#tab_4">4</ref>). Potato crops are susceptible to a fungal ailment called early blight. The PlantVillage dataset does not contain an adequate number of images and exhibits an uneven class distribution, so the PLD dataset, created in Pakistan's Central Punjab region, has also been used. From that dataset, we have rejected some images of potato leaves due to redundancy. A total of 3,251 images of potato leaves were used, of which 1,303 are from the early blight section, 816 from the healthy section, and 1,132 from the late blight section (Table <ref type="table">5</ref>). The model's performance is calculated using standard validation metrics. To evaluate how well the suggested model discriminates, the model's accuracy, recall, precision, and F1-score were calculated. The confusion matrix is a tabular method of displaying a prediction model's performance. An entry in a confusion matrix indicates the number of predictions in which the model correctly or incorrectly identified the classes. A classifier's True Positives (TP) are the predictions in which it correctly identifies the positive class as positive. Conversely, True Negatives (TN) are the predictions in which the classifier correctly identifies the negative class as negative. 
False Positives (FP) are the predictions in which the classifier incorrectly predicts the negative class as positive. False Negatives (FN) are the predictions in which the classifier incorrectly predicts the positive class as negative. Accuracy gives the model's overall correctness, i.e., the percentage of all samples that the classifier successfully classified. Equation (<ref type="formula" target="#formula_4">1</ref>) can be used to determine accuracy.</p><formula xml:id="formula_4">Accuracy = \frac{TP + TN}{TP + TN + FP + FN}<label>(1)</label></formula><p>Precision indicates the percentage of positive class predictions that are actually positive. Equation (<ref type="formula" target="#formula_5">2</ref>) can be used to determine precision.</p><formula xml:id="formula_5">Precision = \frac{TP}{TP + FP}<label>(2)</label></formula><p>Recall indicates the percentage of positive samples that the classifier accurately predicted to be positive. Other names for it include probability of detection, sensitivity, and true positive rate (TPR). Equation (<ref type="formula" target="#formula_6">3</ref>) can be used to determine recall.</p><formula xml:id="formula_6">Recall = \frac{TP}{TP + FN}<label>(3)</label></formula><p>The F1-score, a combination of the two measures, is frequently used by machine learning practitioners to balance precision and recall. It merges recall and precision into one metric; mathematically, it is the harmonic mean of recall and precision. It can be computed using Equation (<ref type="formula" target="#formula_7">4</ref>). The FPR shows the percentage of actual negatives that are mistakenly categorized as positives, whereas the TPR, or sensitivity, shows the percentage of actual positives that are accurately detected.</p><formula xml:id="formula_7">F1\text{-}Score = \frac{2 \cdot Precision \cdot Recall}{Precision + Recall}<label>(4)</label></formula><p>The trade-off between sensitivity and specificity is illustrated by the ROC curve. 
Better performance is indicated by a model whose curve is closer to the plot's upper-left corner. This performance is frequently measured using the Area Under the ROC Curve (AUC), where a value of 0.5 indicates no discriminatory capacity and a value of 1 indicates flawless classification. The confusion matrices for both the PlantVillage and PLD datasets are given in Figure <ref type="figure" target="#fig_3">3</ref>. As shown in Table <ref type="table" target="#tab_6">8</ref>, the implementation process requires far less hardware because of the presence of fewer parameters, unlike the deep CNN architectures (Table 6). The model also outperformed the other state-of-the-art models in terms of accuracy (Table <ref type="table" target="#tab_6">8</ref>). The performance of the proposed model compared to the other transfer learning models in terms of precision, recall, and F1-score is given in Table <ref type="table" target="#tab_5">7</ref>. The ROC curves for the two datasets, PlantVillage and PLD, are given in Figure <ref type="figure" target="#fig_7">7</ref>. Thus, the model offers an efficient way to solve the problem of potato leaf disease detection, with better accuracy than the pretrained models and less computational overhead.</p></div>
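Equations (1)-(4) can be computed directly from a multiclass confusion matrix; the sketch below follows those definitions, computing per-class precision, recall, and F1. The matrix values in the usage example are illustrative, not our experimental results.

```python
import numpy as np

def metrics_from_confusion(cm):
    """Per-class precision/recall/F1 and overall accuracy from a
    multiclass confusion matrix (rows = actual class, cols = predicted)."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)                 # correctly predicted, per class
    fp = cm.sum(axis=0) - tp         # predicted as this class but wrong
    fn = cm.sum(axis=1) - tp         # this class but predicted as another
    precision = tp / np.maximum(tp + fp, 1e-12)
    recall = tp / np.maximum(tp + fn, 1e-12)
    f1 = 2 * precision * recall / np.maximum(precision + recall, 1e-12)
    accuracy = tp.sum() / cm.sum()
    return accuracy, precision, recall, f1

# Illustrative 3-class matrix (Healthy, Early Blight, Late Blight)
acc, prec, rec, f1 = metrics_from_confusion(
    [[50, 0, 0],
     [0, 45, 5],
     [0, 5, 45]])
```

Averaging the per-class values (macro averaging) gives the single precision/recall/F1 numbers typically reported alongside accuracy.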
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.">Conclusion and Future Scope</head><p>Agriculture is one of the most important sectors of our country's economy, as the majority of the population relies heavily on it. Identifying illnesses that damage economically valuable crops early on is crucial to preventing farmers from suffering financial losses related to these crops. Potato is one of the most important staple crops, and our experiment is based on the detection of potato leaf diseases <ref type="bibr" target="#b25">[26]</ref>. For that purpose, our Convolutional Neural Network categorizes potato leaves into three groups: early blight, late blight, and healthy. Due to the presence of fewer layers and parameters than the other transfer learning and CNN models, it is highly efficient and resourceful in computationally constrained environments. It achieved more than 99% accuracy on both the PlantVillage and PLD datasets. The vision of our research is to improve the model's adaptability and resilience so that it can identify diseases in a variety of crops besides potatoes. A mobile and web application will be developed for the benefit of the farming community and as a contribution to this sector, along with an attempt to further minimize the parameters to make the model more efficient.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head></head><label></label><figDesc>Figure 1 and Figure 2 represent the model architecture and the detailed layerwise flow diagram, respectively.</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Figure 1 :</head><label>1</label><figDesc>Figure 1: Paradigm of proposed model</figDesc><graphic coords="6,82.99,65.61,429.30,308.70" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Figure 2 :</head><label>2</label><figDesc>Figure 2: Flow diagram of proposed model</figDesc><graphic coords="7,78.84,342.03,437.60,246.80" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_3"><head>Figure 3 :</head><label>3</label><figDesc>Figure 3: Confusion matrix generated by proposed methodology</figDesc><graphic coords="11,72.00,65.60,454.80,225.36" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_4"><head>Figure 4 :</head><label>4</label><figDesc>Figure 4: Training validation accuracy of proposed methodology</figDesc><graphic coords="11,72.00,330.38,452.40,206.16" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_5"><head>Figure 5 :</head><label>5</label><figDesc>Figure 5: Feature extraction output of different convolution layers</figDesc><graphic coords="11,87.55,575.96,420.19,112.39" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_6"><head>Figure 6 :</head><label>6</label><figDesc>Figure 6: Training validation loss of proposed methodology</figDesc><graphic coords="12,85.58,65.61,424.13,197.10" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_7"><head>Figure 7 :</head><label>7</label><figDesc>Figure 7: Receiver Operating Characteristic curve of proposed methodology</figDesc><graphic coords="12,84.12,302.12,427.05,195.98" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_0"><head>Table 1</head><label>1</label><figDesc>Summary of disease prediction of potato leaves</figDesc><table><row><cell>Author</cell><cell>Algorithm</cell><cell>Dataset</cell><cell>Plant</cell><cell>Accuracy</cell></row><row><cell>[13]</cell><cell>Modified MobileNet</cell><cell>PlantVillage</cell><cell>Potato</cell><cell>98.34%</cell></row><row><cell>[20]</cell><cell>ResNet50</cell><cell>PlantVillage</cell><cell>Potato</cell><cell>98%</cell></row><row><cell>[14]</cell><cell>Attention Based Residual Network</cell><cell>PlantVillage</cell><cell>Tomato</cell><cell>98%</cell></row><row><cell>[17]</cell><cell>CNN</cell><cell>Cassava Challenge</cell><cell>Cassava</cell><cell>93%</cell></row><row><cell>[18]</cell><cell>Reweighted Linear Boost Program Classification</cell><cell>PlantVillage</cell><cell>Multiple</cell><cell>92%</cell></row><row><cell>[15]</cell><cell>CNN and Auto Encoders</cell><cell>PlantVillage</cell><cell>Potato, Maize, Tomato</cell><cell>97.5%</cell></row><row><cell>[19]</cell><cell>Deep CNN</cell><cell>PlantVillage</cell><cell>Potato</cell><cell>96.46%</cell></row><row><cell>[16]</cell><cell>Segment and Multi SVM</cell><cell>PlantVillage</cell><cell>Potato</cell><cell>95%</cell></row></table></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_1"><head></head><label></label><figDesc>CLAHE is an image processing technique designed to enhance image contrast. CLAHE operates on discrete areas of the image, known as tiles, as opposed to the full image at once, in contrast to conventional histogram equalization. Within each tile, CLAHE adjusts the contrast adaptively based on the local histogram, allowing it to enhance detail without overly amplifying noise. After processing each tile, neighboring tiles are merged smoothly to prevent visible boundaries. CLAHE is designed to prevent excessive contrast amplification by limiting it. The contrast amplification around each pixel is determined by a slope function transformation. To control this amplification, CLAHE clips the histogram at a predefined threshold before computing the cumulative distribution function, effectively constraining the enhancement to avoid noise exaggeration.</figDesc><table /><note>By limiting contrast adjustments in uniform areas, CLAHE minimizes noise amplification, producing a balanced, enhanced image.</note></figure>
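The clip-and-redistribute step described above can be sketched in NumPy for a single tile. This is an illustrative simplification, not the authors' code: full CLAHE additionally splits the image into tiles and bilinearly blends the per-tile mappings.

```python
import numpy as np

def clipped_equalize(tile, clip_limit=0.01, n_bins=256):
    """Histogram equalization of one uint8 grayscale tile with
    CLAHE-style clipping of the histogram before the CDF is built."""
    hist, _ = np.histogram(tile, bins=n_bins, range=(0, n_bins))
    limit = max(1, int(clip_limit * tile.size))        # clipping threshold
    excess = int(np.sum(np.maximum(hist - limit, 0)))  # mass above the clip
    hist = np.minimum(hist, limit) + excess // n_bins  # redistribute excess
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / max(cdf.max() - cdf.min(), 1.0)  # to [0, 1]
    return (cdf[tile] * (n_bins - 1)).astype(np.uint8)
```

In practice the full tiled algorithm is available off the shelf, e.g. OpenCV's `cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))`.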
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_2"><head>Table 2</head><label>2</label><figDesc></figDesc><table><row><cell>Augmentation parameters</cell><cell></cell><cell></cell></row><row><cell cols="2">Sl No. Operation used</cell><cell>Range</cell></row><row><cell>1.</cell><cell>Rotation</cell><cell>30 Degree</cell></row><row><cell>2.</cell><cell>Zoom</cell><cell>0.15</cell></row><row><cell>3.</cell><cell>Width shift</cell><cell>0.2</cell></row><row><cell>4.</cell><cell>Height shift</cell><cell>0.2</cell></row><row><cell>5.</cell><cell>Shear</cell><cell>0.15</cell></row><row><cell>6.</cell><cell></cell><cell></cell></row></table></figure>
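The ranges in Table 2 map directly onto a standard augmentation pipeline. A hedged sketch using Keras' `ImageDataGenerator` (the paper does not state which library was used, and the blank sixth row of the table is left out rather than guessed):

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Augmentation ranges taken from Table 2.
datagen = ImageDataGenerator(
    rotation_range=30,       # rotation up to 30 degrees
    zoom_range=0.15,
    width_shift_range=0.2,
    height_shift_range=0.2,
    shear_range=0.15,
)
# datagen.flow_from_directory(...) would then yield augmented batches.
```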
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_3"><head>Table 3</head><label>3</label><figDesc>An overview of the suggested neural network model</figDesc><table><row><cell>Layer</cell><cell>Output Shape</cell><cell>Param #</cell></row><row><cell cols="2">Batch_normalization (None, 224, 224, 3)</cell><cell>12</cell></row><row><cell>Convolution2D_1</cell><cell cols="2">(None, 223, 223, 32) 896</cell></row><row><cell>MaxPooling2D_1</cell><cell cols="2">(None, 111, 111, 32) 0</cell></row><row><cell>Convolution2D_2</cell><cell cols="2">(None, 109, 109, 64) 18,496</cell></row><row><cell>Convolution2D_3</cell><cell cols="2">(None, 107, 107, 64) 36,928</cell></row><row><cell>MaxPooling2D_2</cell><cell>(None, 53, 53, 64)</cell><cell>0</cell></row><row><cell>Convolution2D_4</cell><cell>(None, 51, 51, 128)</cell><cell>73,856</cell></row><row><cell>Convolution2D_5</cell><cell>(None, 49, 49, 128)</cell><cell>147,584</cell></row><row><cell>MaxPooling2D_3</cell><cell>(None, 24, 24, 128)</cell><cell>0</cell></row><row><cell>Convolution2D_6</cell><cell>(None, 22, 22, 256)</cell><cell>295,168</cell></row><row><cell>Convolution2D_7</cell><cell>(None, 20, 20, 256)</cell><cell>590,080</cell></row><row><cell>MaxPooling2D_4</cell><cell>(None, 10, 10, 256)</cell><cell>0</cell></row><row><cell>Convolution2D_8</cell><cell>(None, 8, 8, 512)</cell><cell>1,180,160</cell></row><row><cell>MaxPooling2D_5</cell><cell>(None, 4, 4, 512)</cell><cell>0</cell></row><row><cell>Flatten</cell><cell>(None, 8192)</cell><cell>0</cell></row><row><cell>Dropout</cell><cell>(None, 8192)</cell><cell>0</cell></row><row><cell>Dense_1</cell><cell>(None, 1024)</cell><cell>8,389,632</cell></row><row><cell>Dense_2</cell><cell>(None, 256)</cell><cell>262,400</cell></row><row><cell>Dense_3</cell><cell>(None, 64)</cell><cell>16,448</cell></row><row><cell>Dense_4</cell><cell>(None, 3)</cell><cell>195</cell></row></table></figure>
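Table 3's layer list can be reconstructed in Keras. The sketch below assumes TensorFlow/Keras, 3×3 unpadded ("valid") convolutions and ReLU activations; the 223×223 output listed for Convolution2D_1 appears to be a typo for 222×222, since its parameter count (896) and the following 111×111 pooling output both match a 3×3 kernel. The reconstruction's per-layer parameter counts and its total (11,011,855) agree with Tables 3 and 6.

```python
from tensorflow.keras import layers, models

def build_model(n_classes=3):
    """Approximate reconstruction of the Table 3 network (an assumption,
    not the authors' released code)."""
    return models.Sequential([
        layers.Input((224, 224, 3)),
        layers.BatchNormalization(),                 # 12 params
        layers.Conv2D(32, 3, activation="relu"),     # 896 params
        layers.MaxPooling2D(2),
        layers.Conv2D(64, 3, activation="relu"),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(2),
        layers.Conv2D(128, 3, activation="relu"),
        layers.Conv2D(128, 3, activation="relu"),
        layers.MaxPooling2D(2),
        layers.Conv2D(256, 3, activation="relu"),
        layers.Conv2D(256, 3, activation="relu"),
        layers.MaxPooling2D(2),
        layers.Conv2D(512, 3, activation="relu"),
        layers.MaxPooling2D(2),
        layers.Flatten(),                            # 4*4*512 = 8192
        layers.Dropout(0.5),
        layers.Dense(1024, activation="relu"),
        layers.Dense(256, activation="relu"),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])
```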
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_4"><head>Table 4</head><label>4</label><figDesc></figDesc><table><row><cell>PlantVillage Dataset</cell><cell></cell><cell></cell><cell></cell></row><row><cell>Sl No.</cell><cell>Class</cell><cell>Sample size</cell><cell></cell></row><row><cell>1.</cell><cell>Healthy</cell><cell>152</cell><cell></cell></row><row><cell>2.</cell><cell>Late Blight</cell><cell>1000</cell><cell></cell></row><row><cell>3.</cell><cell>Early Blight</cell><cell>1000</cell><cell></cell></row><row><cell>Table 5</cell><cell></cell><cell></cell><cell></cell></row><row><cell>PLD Dataset</cell><cell></cell><cell></cell><cell></cell></row><row><cell>Sl No.</cell><cell>Class</cell><cell>Sample size</cell><cell></cell></row><row><cell>1.</cell><cell>Healthy</cell><cell>816</cell><cell></cell></row><row><cell>2.</cell><cell>Late Blight</cell><cell>1132</cell><cell></cell></row><row><cell>3.</cell><cell>Early Blight</cell><cell>1303</cell><cell></cell></row><row><cell>Table 6</cell><cell></cell><cell></cell><cell></cell></row><row><cell cols="3">Comparative study of parameters with various Transfer Learning Model</cell><cell></cell></row><row><cell></cell><cell cols="3">Number of Parameters</cell></row><row><cell cols="3">Transfer Learning Model Trainable Non-Trainable</cell><cell>Total</cell></row><row><cell>VGG16</cell><cell>75,267</cell><cell>14,714,688</cell><cell>14,789,955</cell></row><row><cell>INCEPTIONV3</cell><cell>153,603</cell><cell>21,802,784</cell><cell>21,956,387</cell></row><row><cell>RESNET50</cell><cell>301,059</cell><cell>23,587,712</cell><cell>23,888,771</cell></row><row><cell>Proposed Model</cell><cell>11,011,849</cell><cell>6</cell><cell>11,011,855</cell></row></table></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_5"><head>Table 7</head><label>7</label><figDesc>Comparative study of accuracy measures of the suggested model with well-known pretrained models</figDesc><table><row><cell>Model</cell><cell>Type</cell><cell cols="3">Precision Recall F1-score</cell></row><row><cell></cell><cell>Micro Average</cell><cell>0.97</cell><cell>0.97</cell><cell>0.97</cell></row><row><cell>VGG16</cell><cell>Macro Average</cell><cell>0.98</cell><cell>0.90</cell><cell>0.93</cell></row><row><cell></cell><cell>Weighted Average</cell><cell>0.97</cell><cell>0.97</cell><cell>0.97</cell></row><row><cell></cell><cell>Micro Average</cell><cell>0.98</cell><cell>0.98</cell><cell>0.98</cell></row><row><cell>ResNet50</cell><cell>Macro Average</cell><cell>0.98</cell><cell>0.97</cell><cell>0.97</cell></row><row><cell></cell><cell>Weighted Average</cell><cell>0.97</cell><cell>0.97</cell><cell>0.98</cell></row><row><cell></cell><cell>Micro Average</cell><cell>0.77</cell><cell>0.77</cell><cell>0.77</cell></row><row><cell>InceptionV3</cell><cell>Macro Average</cell><cell>0.80</cell><cell>0.60</cell><cell>0.61</cell></row><row><cell></cell><cell>Weighted Average</cell><cell>0.79</cell><cell>0.77</cell><cell>0.75</cell></row><row><cell></cell><cell>Micro Average</cell><cell>0.99</cell><cell>0.99</cell><cell>0.99</cell></row><row><cell>Proposed Model (PlantVillage)</cell><cell>Macro Average</cell><cell>0.99</cell><cell>0.99</cell><cell>0.99</cell></row><row><cell></cell><cell>Weighted Average</cell><cell>0.99</cell><cell>0.99</cell><cell>0.99</cell></row><row><cell></cell><cell>Micro Average</cell><cell>0.99</cell><cell>0.99</cell><cell>0.99</cell></row><row><cell>Proposed Model (PLD dataset)</cell><cell>Macro Average</cell><cell>0.99</cell><cell>0.99</cell><cell>0.99</cell></row><row><cell></cell><cell>Weighted Average</cell><cell>0.99</cell><cell>0.99</cell><cell>0.99</cell></row></table></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_6"><head>Table 8</head><label>8</label><figDesc>Comparative study of accuracy measures with state-of-the-art models</figDesc><table><row><cell>Reference</cell><cell>Technique</cell><cell>Crop</cell><cell cols="2">No. of Diseases Accuracy</cell></row><row><cell>Divyansh [28]</cell><cell cols="2">SVM, KNN and Neural Net Potato</cell><cell>2</cell><cell>97.8%</cell></row><row><cell>Zhang [29]</cell><cell>Faster RCNN</cell><cell>Tomato</cell><cell>4</cell><cell>97.1%</cell></row><row><cell>Barman [30]</cell><cell>SBCNN</cell><cell>Potato</cell><cell>2</cell><cell>96.75%</cell></row><row><cell>Rozaqi [31]</cell><cell>CNN</cell><cell>Potato</cell><cell>2</cell><cell>92%</cell></row><row><cell>Proposed Model (PlantVillage dataset)</cell><cell>CNN</cell><cell>Potato</cell><cell>3</cell><cell>99.3%</cell></row><row><cell>Proposed Model (PLD dataset)</cell><cell>CNN</cell><cell>Potato</cell><cell>3</cell><cell>99.23%</cell></row></table></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_7"><head>Table 9</head><label>9</label><figDesc>Hyperparameters</figDesc><table><row><cell>Hyper-parameter</cell><cell>Description</cell></row><row><cell>Convolution Layers</cell><cell>8</cell></row><row><cell>Max-pooling Layers</cell><cell>5</cell></row><row><cell>Dropout</cell><cell>0.5</cell></row><row><cell>Activation function</cell><cell>ReLU</cell></row><row><cell>Number of epochs</cell><cell>200</cell></row><row><cell>Batch size</cell><cell>32</cell></row><row><cell>Learning rate</cell><cell>0.001</cell></row></table><note>A binary classifier's diagnostic performance can be assessed visually using a ROC curve. Across a range of threshold values, it compares the True Positive Rate (TPR) against the False Positive Rate (FPR).</note></figure>
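The training settings of Table 9 translate into a short Keras training sketch. `model`, `train_data` and `val_data` are placeholders for the Table 3 network and the augmented datasets, and the optimizer itself is not listed in the table, so Adam is an assumed choice here:

```python
from tensorflow.keras.optimizers import Adam

# Hyperparameters from Table 9; Adam is an illustrative assumption,
# since the table specifies only the learning rate.
model.compile(optimizer=Adam(learning_rate=0.001),
              loss="categorical_crossentropy",  # 3-class softmax output
              metrics=["accuracy"])
model.fit(train_data, validation_data=val_data,
          epochs=200, batch_size=32)
```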
		</body>
		<back>

			<div type="acknowledgement">
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Acknowledgments</head><p>Thanks to the developers of the ACM consolidated LaTeX styles https://github.com/borisveytsman/acmart and to the developers of the updated Elsevier LaTeX templates https://www.ctan.org/tex-archive/macros/latex/contrib/els-cas-templates.</p></div>
			</div>

			<div type="annex">
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Declaration on Generative AI</head><p>The author(s) have not employed any Generative AI tools.</p></div>			</div>
			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">PLDPNet: End-to-end hybrid deep learning framework for potato leaf disease prediction</title>
		<author>
			<persName><forename type="first">F</forename><surname>Arshad</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Mateen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Hayat</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Wardah</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Z</forename><surname>Al-Huda</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Y</forename><surname>Gu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">A</forename><surname>Al-Antari</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.aej.2023.07.076</idno>
	</analytic>
	<monogr>
		<title level="j">Alexandria Engineering Journal</title>
		<imprint>
			<biblScope unit="volume">78</biblScope>
			<date type="published" when="2023">2023</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<monogr>
		<author>
			<persName><forename type="first">N</forename><surname>Przulj</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Velimirovic</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Petrović</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Ilić</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Mirosavljević</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Trkulja</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Z</forename><surname>Jovovic</surname></persName>
		</author>
		<title level="m">From the Stone Hoe to Circular Agriculture</title>
				<imprint>
			<date type="published" when="2024">2024</date>
			<biblScope unit="page" from="119" to="168" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">An analysis of India&apos;s agricultural sector: Challenges and opportunities</title>
		<author>
			<persName><forename type="first">P</forename><surname>Dalwadi</surname></persName>
		</author>
		<idno type="DOI">10.36713/epra13069</idno>
	</analytic>
	<monogr>
		<title level="j">EPRA International Journal of Multidisciplinary Research</title>
		<imprint>
			<biblScope unit="page" from="293" to="296" />
			<date type="published" when="2023">2023</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">An automated detection and classification of citrus plant diseases using image processing techniques: A review</title>
		<author>
			<persName><forename type="first">Z</forename><surname>Iqbal</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Khan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Sharif</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Shah</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.compag.2018.07.032</idno>
	</analytic>
	<monogr>
		<title level="j">Computers and Electronics in Agriculture</title>
		<imprint>
			<biblScope unit="volume">153</biblScope>
			<biblScope unit="page" from="12" to="32" />
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">Multi-level deep learning model for potato leaf disease recognition</title>
		<author>
			<persName><forename type="first">D.-J</forename><surname>Rashid</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Khan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Ghulam</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Almotiri</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Al Ghamdi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Masood</surname></persName>
		</author>
		<idno type="DOI">10.3390/electronics10172064</idno>
	</analytic>
	<monogr>
		<title level="j">Electronics</title>
		<imprint>
			<biblScope unit="volume">10</biblScope>
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<monogr>
		<title level="m" type="main">AgriScanNet-18: A Robust Multilayer CNN for Identification of Potato Plant Diseases</title>
		<author>
			<persName><forename type="first">S</forename><surname>Manzoor</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Manzoor</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Islam</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Boudjadar</surname></persName>
		</author>
		<idno type="DOI">10.1007/978-3-031-47724-9_20</idno>
		<imprint>
			<date type="published" when="2024">2024</date>
			<biblScope unit="page" from="291" to="308" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">Automatic diagnosis of plant disease</title>
		<author>
			<persName><forename type="first">Y</forename><surname>Sasaki</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Okamoto</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Imou</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Torii</surname></persName>
		</author>
		<idno type="DOI">10.11357/jsam1937.61.2_119</idno>
	</analytic>
	<monogr>
		<title level="j">Journal of the Japanese Society of Agricultural Machinery</title>
		<imprint>
			<biblScope unit="volume">61</biblScope>
			<biblScope unit="page" from="119" to="126" />
			<date type="published" when="2010">2010</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">A quantitative real time pcr based method for the detection of phytophthora infestans causing late blight of potato, in infested soil</title>
		<author>
			<persName><forename type="first">T</forename><surname>Hussain</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><forename type="middle">P</forename><surname>Singh</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Anwar</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.sjbs.2013.09.012</idno>
		<ptr target="https://doi.org/10.1016/j.sjbs.2013.09.012" />
	</analytic>
	<monogr>
		<title level="j">Saudi Journal of Biological Sciences</title>
		<imprint>
			<biblScope unit="volume">21</biblScope>
			<biblScope unit="page" from="380" to="386" />
			<date type="published" when="2014">2014</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">Enhanced field-based detection of potato blight in complex backgrounds using deep learning</title>
		<author>
			<persName><forename type="first">J</forename><surname>Johnson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Sharma</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Srinivasan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Masakapalli</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Sharma</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Sharma</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Dua</surname></persName>
		</author>
		<idno type="DOI">10.34133/2021/9835724</idno>
	</analytic>
	<monogr>
		<title level="j">Plant Phenomics</title>
		<imprint>
			<biblScope unit="page" from="1" to="13" />
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">A multi layer perceptron neural network trained by invasive weed optimization for potato color image segmentation</title>
		<author>
			<persName><forename type="first">P</forename><surname>Moallem</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Razmjooy</surname></persName>
		</author>
		<idno type="DOI">10.3923/tasr.2012.445.455</idno>
	</analytic>
	<monogr>
		<title level="j">Trends in Applied Sciences Research</title>
		<imprint>
			<biblScope unit="volume">6</biblScope>
			<biblScope unit="page" from="445" to="455" />
			<date type="published" when="2012">2012</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<monogr>
		<title level="m" type="main">Artificial Intelligence in Potato Leaf Disease Classification: A Deep Learning Approach</title>
		<author>
			<persName><forename type="first">N</forename><forename type="middle">E</forename><surname>Khalifa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Taha</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><forename type="middle">Abou</forename><surname>El-Magd</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">E</forename><surname>Hassanien</surname></persName>
		</author>
		<idno type="DOI">10.1007/978-3-030-59338-4_4</idno>
		<imprint>
			<date type="published" when="2021">2021</date>
			<biblScope unit="page" from="63" to="79" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b11">
	<analytic>
		<title level="a" type="main">Performance of deep learning vs machine learning in plant leaf disease detection</title>
		<author>
			<persName><forename type="first">S</forename><surname>Radha</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Chatterjee</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Jhanjhi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Brohi</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.micpro.2020.103615</idno>
	</analytic>
	<monogr>
		<title level="j">Microprocessors and Microsystems</title>
		<imprint>
			<biblScope unit="volume">80</biblScope>
			<biblScope unit="page">103615</biblScope>
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b12">
	<analytic>
		<title level="a" type="main">Depthwise separable convolution architectures for plant disease classification</title>
		<author>
			<persName><forename type="first">K</forename><surname>Kc</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Z</forename><surname>Yin</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Wu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Z</forename><surname>Wu</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.compag.2019.104948</idno>
	</analytic>
	<monogr>
		<title level="j">Computers and Electronics in Agriculture</title>
		<imprint>
			<biblScope unit="volume">165</biblScope>
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b13">
	<analytic>
		<title level="a" type="main">Attention embedded residual cnn for disease detection in tomato leaves</title>
		<author>
			<persName><forename type="first">R</forename><surname>Karthik</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Hariharan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Anand</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Mathikshara</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Johnson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Menaka</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Applied Soft Computing</title>
		<imprint>
			<biblScope unit="volume">86</biblScope>
			<biblScope unit="page">105933</biblScope>
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b14">
	<analytic>
		<title level="a" type="main">Seasonal crops disease prediction and classification using deep convolutional encoder network</title>
		<author>
			<persName><forename type="first">A</forename><surname>Khamparia</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Saini</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Gupta</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Khanna</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Tiwari</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Albuquerque</surname></persName>
		</author>
		<idno type="DOI">10.1007/s00034-019-01041-0</idno>
	</analytic>
	<monogr>
		<title level="j">Circuits, Systems, and Signal Processing</title>
		<imprint>
			<biblScope unit="volume">39</biblScope>
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b15">
	<monogr>
		<title level="m" type="main">Detection of potato diseases using image segmentation and multiclass support vector machine</title>
		<author>
			<persName><forename type="first">M</forename><surname>Islam</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Dinh</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Wahid</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Bhowmik</surname></persName>
		</author>
		<idno type="DOI">10.1109/CCECE.2017.7946594</idno>
		<imprint>
			<date type="published" when="2017">2017</date>
			<biblScope unit="page" from="1" to="4" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b16">
	<analytic>
		<title level="a" type="main">A predictive machine learning application in agriculture: Cassava disease detection and classification with imbalanced dataset using convolutional neural networks</title>
		<idno type="DOI">10.1016/j.eij.2020.02.007</idno>
		<ptr target="https://doi.org/10.1016/j.eij.2020.02.007" />
	</analytic>
	<monogr>
		<title level="j">Egyptian Informatics Journal</title>
		<imprint>
			<biblScope unit="volume">22</biblScope>
			<biblScope unit="page" from="27" to="34" />
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b17">
	<analytic>
		<title level="a" type="main">Kuan noise filter with hough transformation based reweighted linear program boost classification for plant leaf disease detection</title>
		<author>
			<persName><forename type="first">N</forename><surname>Deepa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Nagarajan</surname></persName>
		</author>
		<idno type="DOI">10.1007/s12652-020-02149-x</idno>
	</analytic>
	<monogr>
		<title level="j">Journal of Ambient Intelligence and Humanized Computing</title>
		<imprint>
			<biblScope unit="volume">12</biblScope>
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b18">
	<analytic>
		<title level="a" type="main">Identification of plant leaf diseases using a nine-layer deep convolutional neural network</title>
		<author>
			<persName><forename type="first">G</forename><surname>Geetharamani</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Arun Pandian</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.compeleceng.2019.04.011</idno>
		<ptr target="https://doi.org/10.1016/j.compeleceng.2019.04.011" />
	</analytic>
	<monogr>
		<title level="j">Computers Electrical Engineering</title>
		<imprint>
			<biblScope unit="volume">76</biblScope>
			<biblScope unit="page" from="323" to="338" />
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b19">
	<analytic>
		<title level="a" type="main">Pd2se-net: Computer-assisted plant disease diagnosis and severity estimation network</title>
		<author>
			<persName><forename type="first">Q</forename><surname>Liang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Xiang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Y</forename><surname>Hu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Coppola</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Zhang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">W</forename><surname>Sun</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.compag.2019.01.034</idno>
	</analytic>
	<monogr>
		<title level="j">Computers and Electronics in Agriculture</title>
		<imprint>
			<biblScope unit="volume">157</biblScope>
			<biblScope unit="page" from="518" to="529" />
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b20">
	<monogr>
		<author>
			<persName><forename type="first">D</forename><surname>Hughes</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Salathe</surname></persName>
		</author>
		<title level="m">An open access repository of images on plant health to enable the development of mobile disease diagnostics through machine learning and crowdsourcing</title>
				<imprint>
			<date type="published" when="2015">2015</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b21">
	<monogr>
		<title level="m" type="main">A review: Contrast-limited adaptive histogram equalization (clahe) methods to help the application of face recognition</title>
		<author>
			<persName><forename type="first">P</forename><surname>Musa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Rafi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Lamsani</surname></persName>
		</author>
		<idno type="DOI">10.1109/IAC.2018.8780492</idno>
		<imprint>
			<date type="published" when="2018">2018</date>
			<biblScope unit="page" from="1" to="6" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b22">
	<monogr>
		<title level="m" type="main">Very deep convolutional networks for large-scale image recognition</title>
		<author>
			<persName><forename type="first">K</forename><surname>Simonyan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Zisserman</surname></persName>
		</author>
		<idno type="arXiv">arXiv:1409.1556</idno>
		<imprint>
			<date type="published" when="2014">2014</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b23">
	<analytic>
		<title level="a" type="main">Rethinking the inception architecture for computer vision</title>
		<author>
			<persName><forename type="first">C</forename><surname>Szegedy</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Vanhoucke</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Ioffe</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Shlens</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Z</forename><surname>Wojna</surname></persName>
		</author>
		<idno type="DOI">10.1109/CVPR.2016.308</idno>
	</analytic>
	<monogr>
		<title level="m">IEEE Conference on Computer Vision and Pattern Recognition (CVPR)</title>
		<imprint>
			<date type="published" when="2016">2016</date>
			<biblScope unit="page" from="2818" to="2826" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b24">
	<monogr>
		<title level="m" type="main">Deep residual learning for image recognition</title>
		<author>
			<persName><forename type="first">K</forename><surname>He</surname></persName>
		</author>
		<author>
			<persName><forename type="first">X</forename><surname>Zhang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Ren</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Sun</surname></persName>
		</author>
		<idno type="DOI">10.1109/CVPR.2016.90</idno>
		<imprint>
			<date type="published" when="2016">2016</date>
			<biblScope unit="page" from="770" to="778" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b25">
	<monogr>
		<title level="m" type="main">Potato blight: Deep learning model for binary and multi-classification</title>
		<author>
			<persName><forename type="first">V</forename><surname>Kukreja</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Baliyan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Salonki</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename></persName>
		</author>
		<idno type="DOI">10.1109/SPIN52536.2021.9566079</idno>
		<imprint>
			<date type="published" when="2021">2021</date>
			<biblScope unit="page" from="967" to="972" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b26">
	<monogr>
		<author>
			<persName><forename type="first">D</forename><surname>Hughes</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Salathe</surname></persName>
		</author>
		<title level="m">An open access repository of images on plant health to enable the development of mobile disease diagnostics through machine learning and crowdsourcing</title>
		<imprint>
			<date type="published" when="2015">2015</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b27">
	<monogr>
		<title level="m" type="main">Potato leaf diseases detection using deep learning</title>
		<author>
			<persName><forename type="first">D</forename><surname>Tiwari</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Ashish</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Gangwar</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Sharma</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Patel</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Bhardwaj</surname></persName>
		</author>
		<idno type="DOI">10.1109/ICICCS48265.2020.9121067</idno>
		<imprint>
			<date type="published" when="2020">2020</date>
			<biblScope unit="page" from="461" to="466" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b28">
	<analytic>
		<title level="a" type="main">Deep learning-based object detection improvement for tomato disease</title>
		<author>
			<persName><forename type="first">Y</forename><surname>Zhang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Song</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Zhang</surname></persName>
		</author>
		<idno type="DOI">10.1109/ACCESS.2020.2982456</idno>
	</analytic>
	<monogr>
		<title level="j">IEEE Access</title>
		<imprint>
			<biblScope unit="volume">PP</biblScope>
			<biblScope unit="page" from="1" to="1" />
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b29">
	<monogr>
		<title level="m" type="main">Comparative assessment of deep learning to detect the leaf diseases of potato based on data augmentation</title>
		<author>
			<persName><forename type="first">U</forename><surname>Barman</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Sahu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><forename type="middle">G</forename><surname>Barman</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Das</surname></persName>
		</author>
		<idno type="DOI">10.1109/ComPE49325.2020.9200015</idno>
		<imprint>
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b30">
	<analytic>
		<title level="a" type="main">Identification of disease in potato leaves using convolutional neural network (CNN) algorithm</title>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">J</forename><surname>Rozaqi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Sunyoto</surname></persName>
		</author>
		<idno type="DOI">10.1109/ICOIACT50329.2020.9332037</idno>
	</analytic>
	<monogr>
		<title level="m">2020 3rd International Conference on Information and Communications Technology (ICOIACT)</title>
		<imprint>
			<date type="published" when="2020">2020</date>
			<biblScope unit="page" from="72" to="76" />
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
