<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">Web-based melanoma detection system using convolutional neural networks and advanced image processing⋆</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Sebastian</forename><surname>Górecki</surname></persName>
							<email>sebastian.gorecki@dokt.p.lodz.pl</email>
							<affiliation key="aff0">
								<orgName type="institution">Maria Sklodowska-Curie Warsaw Higher School</orgName>
								<address>
									<addrLine>Al. Solidarności 12</addrLine>
									<postCode>03-411</postCode>
									<settlement>Warszawa</settlement>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Wiktoria</forename><surname>Duszczyk</surname></persName>
							<email>wiktoria.duszczyk@adres.pl</email>
							<affiliation key="aff0">
								<orgName type="institution">Maria Sklodowska-Curie Warsaw Higher School</orgName>
								<address>
									<addrLine>Al. Solidarności 12</addrLine>
									<postCode>03-411</postCode>
									<settlement>Warszawa</settlement>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Zuzanna</forename><surname>Huda</surname></persName>
							<affiliation key="aff0">
								<orgName type="institution">Maria Sklodowska-Curie Warsaw Higher School</orgName>
								<address>
									<addrLine>Al. Solidarności 12</addrLine>
									<postCode>03-411</postCode>
									<settlement>Warszawa</settlement>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Andrzej</forename><surname>Faryna</surname></persName>
							<email>andrzejfaryna@gmail.com</email>
							<affiliation key="aff1">
								<orgName type="institution">Centrum Diagnostyczne</orgName>
								<address>
									<addrLine>Marii Skłodowskiej-Curie sp. z o.o., Jasionka 954</addrLine>
									<postCode>36-002</postCode>
									<settlement>Jasionka</settlement>
									<country key="PL">Polska</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Aleksandra</forename><surname>Tatka</surname></persName>
							<email>olatatka@gmail.com</email>
							<affiliation key="aff1">
								<orgName type="institution">Centrum Diagnostyczne</orgName>
								<address>
									<addrLine>Marii Skłodowskiej-Curie sp. z o.o., Jasionka 954</addrLine>
									<postCode>36-002</postCode>
									<settlement>Jasionka</settlement>
									<country key="PL">Polska</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">Web-based melanoma detection system using convolutional neural networks and advanced image processing⋆</title>
					</analytic>
					<monogr>
						<idno type="ISSN">1613-0073</idno>
					</monogr>
					<idno type="MD5">E6FE4A80CD833C7A4B9F91F0213696FA</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2025-04-23T17:10+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Melanoma</term>
					<term>Autonomous Diagnostic Systems</term>
					<term>Convolutional Neural Networks</term>
					<term>AI in Dermatology</term>
					<term>ORCID: 0000-0001-5700-4000 (S. Górecki), 0009-0005-3322-1904 (W. Duszczyk), 0009-0009-2481-8281 (Z. Huda), 0009-0004-7388-2603 (A. Faryna), 0000-0002-8268-894X (A. Tatka)</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>Melanoma, an aggressive malignancy of melanocytes, is one of the deadliest skin cancers, marked by high mortality rates, particularly when diagnosed late. This study presents the development of an autonomous diagnostic system designed to detect melanoma using embedded AI technologies, specifically convolutional neural networks and advanced image processing methods. The system aims to enhance diagnostic accuracy, shorten waiting times for diagnosis, and provide a non-invasive, accessible solution for early melanoma detection. We leverage deep learning models trained on a diverse dataset of dermoscopic images, combined with innovative pre-processing and segmentation techniques, to achieve high-performance melanoma classification. The results demonstrate the potential of this web-based system to serve as an effective decision support tool for clinicians, ultimately improving patient outcomes through early intervention.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.">Introduction</head><p>Skin melanoma is one of the most common malignancies, with its incidence increasing significantly over the past few decades, making it a serious public health concern. In Western populations, one in 50 individuals is expected to develop melanoma. While most cases occur in the elderly, melanoma is also the third most common cancer in adolescents and young adults aged 15 to 39 years <ref type="bibr" target="#b0">[1]</ref>.</p><p>The development of melanoma is multifactorial, involving an interplay between genetic susceptibility and environmental factors. Primary melanomas can present with a wide range of pigmentation, from heavily pigmented to amelanotic. Early diagnosis of melanoma is a crucial factor in improving patient survival rates.</p><p>The traditional approach to melanoma diagnosis begins with a visual examination, followed by biopsy and histopathological evaluation. The major challenge in melanoma detection lies in accurately identifying early-stage melanomas while minimizing the need for biopsies of benign lesions.</p><p>Recent advancements in noninvasive diagnostic techniques have enhanced the accuracy of melanoma detection, particularly in managing melanocytic lesions with uncertain diagnoses. Additionally, the promising potential of artificial intelligence offers a transformative opportunity in the field of melanoma detection.</p><p>Skin cancer is the most common malignancy affecting humans <ref type="bibr" target="#b1">[2]</ref>. Early recognition is the most effective intervention to improve melanoma prognosis <ref type="bibr" target="#b2">[3]</ref>. 
Both melanoma and non-melanoma skin cancers are showing a gradually increasing incidence worldwide, especially in the Caucasian population, posing a growing health problem due to the associated morbidity, mortality, and economic burden of monitoring and treatment.</p><p>Melanoma, a malignant skin cancer originating from melanocytes, poses a significant public health threat due to its rapid progression and high metastatic potential. In Poland, melanoma accounts for approximately 2% of all cancer deaths, and its incidence has been increasing by 5-6% annually. The implementation of AI-powered autonomous diagnostic systems can play a crucial role in addressing the challenges of early melanoma detection and improving patient outcomes.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">Related works</head><p>This section describes current diagnostic methods for detecting skin cancer. While some of these techniques were developed decades ago, they have undergone continuous transformation and improvement. We will focus on methods such as dermoscopy, digital photography, and multispectral imaging, which provide better visualization of skin lesions. However, these approaches often require expert interpretation by experienced dermatologists and can be time-consuming. Advances in artificial intelligence, particularly the rapid development and deployment of convolutional neural networks, have enabled the automatic classification of skin images, greatly improving the accuracy and efficiency of cancer detection. Figure <ref type="figure" target="#fig_0">1</ref> illustrates the evolution of melanoma detection technologies over the years, reflecting key advancements that have shaped the field of skin cancer diagnostics. Early methods, such as dermoscopy, provided the foundational tools for visual inspection, whereas digital and multispectral imaging later emerged, enhancing lesion visualization and pattern recognition. More recent advancements in artificial intelligence (AI), specifically convolutional neural networks (CNNs), have accelerated the capacity for automatic lesion classification, improving diagnostic accuracy and efficiency. These innovations highlight the ongoing transition toward AI-driven methods that offer substantial improvements over traditional diagnostic approaches.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.1.">ABCDE Principle</head><p>Measuring characteristics from an image, such as color, texture, or shape, helps distinguish benign lesions from malignant melanoma. In this work the ABCD rule is used, which covers four criteria: Asymmetry, Border, Color, and Diameter. In 2004 it was expanded with E (Evolving) to capture rapidly changing moles that may indicate melanoma. The rule is commonly applied by physicians, healthcare workers, and even patients to screen skin lesions for features suggestive of melanoma. It was developed as a simple, straightforward template that laypersons and physicians could follow when a lesion shows features that may represent melanoma. The ABCD criteria <ref type="bibr" target="#b4">[5]</ref> are especially valuable for physicians and healthcare workers without extensive experience in screening and diagnostic examination of skin lesions. Moreover, the introduction of the ABCDE rule improved patient education about melanoma and self-examination of the skin. However, the ABCDE rule does not cover all melanoma features, and some melanomas do not present the characteristics captured by the ABCDE criteria. Studies by Abbasi NR et al. and Thomas L et al. reported a sensitivity of 57-100% and a specificity of 37-100% for differentiating benign from malignant lesions using the ABCDE criteria. Another limitation is the insufficient knowledge of laypeople, who may not understand how to apply the ABCDE rule to a lesion; cognitive training is therefore necessary to educate patients and improve their accuracy in detecting skin cancer.</p></div>
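The four ABCD criteria above are often combined into Stolz's Total Dermoscopy Score (TDS), which weights each criterion and compares the sum against published cut-offs. A minimal Python sketch; the function names are our own, and in practice the four feature scores come from a clinician's (or an algorithm's) assessment of the lesion, not from this toy code:

```python
def total_dermoscopy_score(asymmetry, border, colors, structures):
    """Stolz Total Dermoscopy Score.
    asymmetry: 0-2 axes of asymmetry; border: 0-8 abruptly cut-off
    segments; colors: 1-6 colors present; structures: 1-5 differential
    structures present."""
    return 1.3 * asymmetry + 0.1 * border + 0.5 * colors + 0.5 * structures

def interpret_tds(tds):
    # published cut-offs: benign below 4.75, suspicious 4.75-5.45,
    # highly suspicious for melanoma above 5.45
    if tds > 5.45:
        return "highly suspicious for melanoma"
    if tds >= 4.75:
        return "suspicious lesion, follow-up advised"
    return "likely benign"

score = total_dermoscopy_score(asymmetry=2, border=4, colors=5, structures=4)
print(round(score, 2), interpret_tds(score))  # 7.5 highly suspicious for melanoma
```

The weights and thresholds are the standard published ones; an automated system would estimate the four inputs from segmented lesion images.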
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.2.">Dermoscopy</head><p>The advancement in skin cancer diagnosis has been facilitated by the dermoscope, a simple, handheld optical device. The dermoscope provides 10x magnification and uses illumination that minimizes surface light reflection. This technique allows a more precise visualization of the structures and pigmentation beneath the stratum corneum, which are typically not visible to the naked eye <ref type="bibr" target="#b5">[6]</ref>. Dermoscopy is a noninvasive technique that enables the clinician to perform direct microscopic examination of diagnostic features in pigmented skin lesions that are not seen by the naked eye. The device is positioned directly on or near the skin while the built-in light source is activated, allowing the user to examine the targeted lesion through the magnifying lens <ref type="bibr" target="#b6">[7]</ref>. A meta-analysis conducted for the National Institute for Health Research (NIHR) Cochrane Systematic Reviews Programme found that dermoscopy is more effective than visual inspection alone both in accurately diagnosing melanoma and in ruling out conditions that are not melanoma <ref type="bibr" target="#b7">[8]</ref>.</p><p>Melanoma can be difficult to distinguish from melanocytic nevi, not only clinically but also dermoscopically, especially in early-stage lesions where specific malignant features may be absent <ref type="bibr" target="#b8">[9]</ref>. In addition, several dermatoscopic structures and patterns are used in the detection of melanoma, and many helpful algorithms exist, including the ABCD rule, the Menzies method, the 7-point checklist, the 3-point checklist, chaos and clues, and CASH (color, architecture, symmetry, and homogeneity) <ref type="bibr" target="#b9">[10]</ref>. The results of a systematic review and meta-analysis highlight the diagnostic relevance of dermoscopic features linked to melanoma detection, such as shiny white structures and the blue-white veil. They further emphasize the importance of recognizing the overall pattern and may indicate a hierarchy in the significance of these features and patterns <ref type="bibr" target="#b10">[11]</ref>.</p><p>Based on the principle that benign lesions tend to remain stable while melanoma typically changes over time, digital follow-up of melanocytic lesions has been proposed as a strategy to identify melanomas that may lack clear dermoscopic features at the initial assessment. Total body photography (TBP) involves capturing high-resolution, full-body clinical images as an adjunct to total body skin examinations (TBSE) during follow-up visits. In patients with extensive or atypical nevi, for whom identifying malignant lesions can be challenging, this approach helps to detect new or changing lesions and provides reassurance to both the patient and the physician by showing the stability of lesions over time <ref type="bibr" target="#b11">[12]</ref>. Dermatoscopic examination, although a very useful tool in diagnosing skin lesions, has certain limitations, such as the need for a dermatoscope, the subjectivity of assessment depending on knowledge and experience, and limited capability in detecting deeper lesions.</p></div>
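The digital follow-up strategy described above amounts to quantifying change between a baseline and a follow-up image of the same lesion. A minimal numpy sketch, assuming the two grayscale images are already registered and equally sized; the function names and the 0.05 threshold are illustrative, not clinically validated values:

```python
import numpy as np

def lesion_change_score(baseline, follow_up):
    """Mean absolute pixel difference of two registered, same-size
    grayscale lesion images, scaled to [0, 1]."""
    a = baseline.astype(float) / 255.0
    b = follow_up.astype(float) / 255.0
    return float(np.mean(np.abs(a - b)))

def flag_for_review(baseline, follow_up, threshold=0.05):
    # the threshold is an illustrative value, not a clinical cut-off
    return lesion_change_score(baseline, follow_up) > threshold

rng = np.random.default_rng(0)
baseline = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
print(flag_for_review(baseline, baseline))  # False: identical images
```

A real system would first align and color-normalize the image pair; more robust change measures (e.g., structural similarity) could replace the mean absolute difference.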
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.3.">Ultrasonography (USG)</head><p>Diagnosis of skin melanoma using ultrasound is based on several main aspects. This method is used to assess the advancement and progression of the disease, and it is particularly important in cases of suspected metastases to the lymph nodes or other organs. Ultrasonography is a diagnostic technique using high-frequency sound waves. The ultrasound machine records sound waves reflected from the examined structures, depending on the density of the examined area and the structures located there. The reflected waves are converted into a computer image in real time, which allows the dynamics of processes in the body to be studied <ref type="bibr" target="#b12">[13]</ref>. Although this is a common diagnostic test, its role in skin melanoma is limited. It complements the medical diagnosis of skin melanoma and is important for specific purposes:</p><p>• Examination of the regional lymph nodes when diagnosing possible metastases, an important aspect in advanced stages of melanoma. Ultrasound of the lymph nodes allows assessment of their size, shape, and echogenicity (in the case of metastases they will be enlarged and have a pathological shape).
• Estimation of the Breslow depth, the depth of infiltration of the lesion into the skin. This parameter is difficult to determine clinically, and the ultrasound examination helps in its assessment. The analysis of tumor thickness is essential in predicting outcomes and guiding treatment for invasive melanomas. This metric has been integrated into the staging frameworks developed by the American Joint Committee on Cancer, helping to define pathological stages, set appropriate excision margins, and indicate when a sentinel lymph node biopsy should be performed <ref type="bibr" target="#b14">[15]</ref>.
• Assessment of metastases to internal organs, an important examination in the later stages of the disease, including analysis of possible metastases in organs such as the liver.
• Monitoring the body's response to tailored treatment, i.e., re-examination of the lymph nodes or tumors once appropriate treatment has been implemented.</p><p>The use of ultrasound in the diagnosis of skin melanoma is important in its advanced stages. Studies suggest that it is necessary to observe, control, and monitor changes in the lymph nodes and metastases to other organs. The ability to capture real-time images and to measure the morphological and physiological characteristics of the skin, along with the safety of using non-ionizing media and the absence of contraindications, are additional benefits of skin sonography. However, ultrasound also has significant limitations. It does not allow for the quick diagnosis of early-stage skin melanoma and cannot identify the characteristic features of superficial skin lesions. It is of limited use for the early diagnosis of neoplasms and serves as a complementary examination, especially in the later stages of skin cancer. In addition, the described diagnostics are subject to limitations due to the possible lack of experience of the operator performing the test <ref type="bibr" target="#b13">[14]</ref><ref type="bibr" target="#b14">[15]</ref>.</p></div>
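Because Breslow depth feeds directly into staging, an ultrasound-assisted thickness estimate can be mapped onto the American Joint Committee on Cancer T categories. A simplified Python sketch of the AJCC 8th-edition thickness cut-offs; note that the real T1a/T1b split additionally depends on a 0.8 mm cut-off, which this toy function ignores:

```python
def ajcc_t_category(breslow_mm, ulcerated=False):
    """Simplified AJCC 8th-edition T category from Breslow thickness (mm).
    Ulceration subdivides each category into a/b; the real T1a/T1b split
    additionally uses a 0.8 mm cut-off, which this sketch ignores."""
    if breslow_mm <= 1.0:
        t = "T1"
    elif breslow_mm <= 2.0:
        t = "T2"
    elif breslow_mm <= 4.0:
        t = "T3"
    else:
        t = "T4"
    return t + ("b" if ulcerated else "a")

print(ajcc_t_category(0.8))                  # T1a
print(ajcc_t_category(3.5, ulcerated=True))  # T3b
```

This illustrates why even a rough sonographic thickness estimate is clinically useful: crossing the 1 mm or 4 mm boundary changes the recommended excision margin and the case for sentinel lymph node biopsy.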
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.4.">AI-Driven Innovations in Early Melanoma Detection</head><p>The application of artificial intelligence (AI), particularly convolutional neural networks (CNNs), in medical image analysis has revolutionized the field of dermatology. CNNs have demonstrated extraordinary performance in the classification of skin lesions, consistently surpassing traditional image analysis techniques in terms of accuracy and reliability. These deep learning models have an unparalleled capacity for extracting critical features from dermoscopic images, enabling the differentiation between benign and malignant lesions with exceptional precision. One study has reported an area under the receiver operating characteristic curve (AUC) of 0.94 for a CNN model in detecting melanoma, highlighting its potential as a decision support tool in clinical settings <ref type="bibr" target="#b15">[16]</ref>.</p><p>AI models, particularly CNNs, excel in feature extraction due to their multi-layered architecture, which enables automatic identification of hierarchical patterns within images. These models are capable of discerning minute variations in skin lesion characteristics, such as texture, shape, and color, which are often difficult for the human eye to detect. Such capabilities significantly enhance diagnostic accuracy, especially in the early stages of melanoma, where early intervention can drastically improve patient outcomes.</p><p>In addition to standard CNN architectures, transfer learning has emerged as a powerful technique to mitigate the challenges associated with limited medical datasets. Transfer learning allows for the utilization of pre-trained models, often trained on large, general-purpose datasets such as ImageNet, and fine-tuning them on domain-specific data, such as dermoscopic images of skin lesions. Studies leveraging transfer learning have reported significant improvements in classification accuracy, even when training data is limited. 
This approach not only reduces the computational burden but also accelerates the development of highly accurate diagnostic models, making them accessible for clinical use in under-resourced settings.</p><p>Despite the considerable advancements in AI-driven skin lesion detection, several limitations persist that impede the widespread clinical adoption of these models. One of the foremost challenges is the reliance on large, annotated datasets for training. Although datasets such as ISIC and HAM10000 have been instrumental in advancing research, the scarcity of diverse and representative datasets remains a significant hurdle. For instance, most publicly available datasets are skewed towards specific skin types, ethnicities, and geographic regions, which can lead to biased model predictions when applied to underrepresented populations <ref type="bibr" target="#b16">[17]</ref>. Addressing this issue requires not only the expansion of existing datasets but also the inclusion of more diverse skin phototypes, lesion types, and patient demographics.</p><p>Moreover, the inherent variability in lesion presentation, including differences in lesion size, shape, and color across different stages of melanoma, poses additional challenges. CNN models, while highly effective at identifying well-represented lesion types, may struggle to generalize across varied presentations of atypical or rare skin lesions. This variability can result in misclassifications, particularly in cases where early-stage melanomas exhibit subtle features that are difficult to distinguish from benign lesions. Furthermore, certain dermoscopic features that are vital for melanoma diagnosis, such as the presence of regression structures or blue-white veils, may not be adequately captured in the training data, further complicating the classification process.</p><p>Another significant concern is the black-box nature of CNNs. 
While these models provide high accuracy, their decision-making process is often opaque, making it difficult for clinicians to interpret the rationale behind a given prediction. This lack of transparency has led to calls for more interpretable AI models that can offer insights into how specific features influence diagnostic outcomes. Techniques such as Grad-CAM (Gradient-weighted Class Activation Mapping) and SHAP (SHapley Additive exPlanations) have been proposed to enhance model interpretability by visualizing the regions of the image that most contributed to the model's decision. However, further research is needed to make these techniques more clinically useful.</p></div>
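The Grad-CAM technique mentioned above is conceptually simple: each feature map of the last convolutional layer is weighted by the spatially averaged gradient of the target-class score, the weighted maps are summed, and a ReLU keeps only positively contributing regions. A numpy sketch on synthetic arrays; in practice the feature maps and gradients would come from a trained CNN framework:

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    """Grad-CAM heatmap from (K, H, W) feature maps of a CNN's last
    conv layer and the gradients of the target-class score w.r.t. them."""
    weights = gradients.mean(axis=(1, 2))              # alpha_k: pooled gradients
    cam = np.tensordot(weights, feature_maps, axes=1)  # sum_k alpha_k * A_k
    cam = np.maximum(cam, 0.0)                         # ReLU: keep positive evidence
    if cam.max() > 0:
        cam = cam / cam.max()                          # scale to [0, 1] for overlay
    return cam

# synthetic stand-ins for activations and gradients from a real network
rng = np.random.default_rng(1)
feature_maps = rng.random((8, 7, 7))
gradients = rng.standard_normal((8, 7, 7))
heatmap = grad_cam(feature_maps, gradients)
print(heatmap.shape)  # (7, 7)
```

The low-resolution heatmap is then upsampled to the input image size and overlaid on the lesion photograph, letting a clinician see which regions drove the melanoma prediction.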
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.5.">Histopathological Diagnosis of Melanoma</head><p>Histopathological examination remains the gold standard in melanoma diagnosis, confirming malignancy through microscopic evaluation of biopsied tissue. Pathologists employ specialized staining techniques such as HMB-45, S-100, and MART-1 to distinguish malignant melanocytes from benign lesions. The tissue, once biopsied, is fixed, embedded in paraffin, sectioned, and stained for microscopic evaluation. Key histopathological features like irregular lesion borders, nuclear atypia, and mitotic activity are assessed to determine malignancy and stage. Upon histopathological confirmation, treatment is guided by crucial factors such as Breslow thickness (the depth of tumor invasion) and Clark level (the extent of penetration into the skin layers). For instance, a Breslow thickness exceeding 1 mm typically necessitates wide excision of the lesion. When the thickness exceeds 4 mm, more aggressive treatment may be warranted, including adjuvant therapies like PD-1 or BRAF inhibitors. Additional features such as ulceration or lymphatic invasion signal a poorer prognosis and can prompt more intensive treatment plans. Histopathology plays a pivotal role in determining eligibility for immunotherapies. Molecular and histological characteristics identified during evaluation assist clinicians in tailoring personalized treatment plans, optimizing patient outcomes <ref type="bibr" target="#b17">[18]</ref>. Despite its diagnostic accuracy, histopathological diagnosis has certain limitations. The process can be time-consuming, with more complex cases requiring several days to weeks for conclusive results, potentially delaying treatment initiation. Costs can also be a significant barrier, especially when advanced analyses like BRAF mutation testing are involved. 
These expenses cover not only the biopsy and immunohistochemical staining but also specialist consultations, which may be limited in certain medical centers. The availability of such diagnostics varies globally, with some regions facing delays and higher costs due to limited access to specialized laboratories. Furthermore, histopathological analysis, while reliable, is not infallible, with potential for misinterpretation in ambiguous cases, leading to false-negative or false-positive results <ref type="bibr" target="#b18">[19]</ref>.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.">Project Objective</head><p>The goal of this project is to develop an advanced, autonomous diagnostic system for the early detection of melanoma using convolutional neural networks (CNNs) and sophisticated image processing techniques. This system is designed to reduce diagnostic waiting times by approximately 35 minutes and improve detection accuracy by 20% compared to traditional diagnostic methods. The key objectives are:</p><p>• Development of an autonomous diagnostic system: a system capable of analyzing dermoscopic images to diagnose melanoma autonomously.
• Efficiency and accuracy: the system will enhance diagnostic precision while reducing the time needed for professional evaluation.
• Accessibility: it will be accessible to a wide range of users, enabling early detection through intuitive, easy-to-use mobile platforms.</p><p>The scope of work for this project includes:</p><p>1. Development of the Diagnostic Algorithm:
o The core of the system will be based on convolutional neural networks specifically trained to detect melanoma in dermoscopic images.
o The CNN models will be developed and optimized to achieve high accuracy in recognizing and classifying skin lesions as benign or malignant.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">Mobile and Web Platform Integration:</head><p>o The diagnostic system will be integrated into an accessible mobile and web-based platform. o This platform will allow users to easily upload dermoscopic images for analysis. o The system's user interface will be designed for simplicity and ease of use, ensuring that individuals without medical expertise can effectively navigate the application.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.">System Validation and Testing:</head><p>o The system will undergo validation tests focused on measuring the accuracy, specificity, and sensitivity with which it correctly identifies melanoma.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.1.">Expected Outcomes</head><p>The proposed project aims to develop an autonomous diagnostic system for the early detection of melanoma using convolutional neural networks and advanced image processing techniques. This system is designed to analyse dermoscopic images and automatically classify skin lesions as malignant or benign, with a target accuracy over 60%, significantly improving upon conventional diagnostic methods. Furthermore, the project seeks to raise public awareness about the importance of early detection and diagnosis of skin cancer. The system will incorporate educational features to inform users about the risks associated with neglecting suspicious skin lesions, as early recognition of melanoma is crucial for improving treatment outcomes and patient prognosis.</p><p>The application will be designed to be user-friendly and accessible, allowing individuals with varying levels of technical expertise to utilize the system. By integrating the diagnostic system into a mobile and web-based platform, the project aims to make it widely accessible, as users with a smartphone equipped with a camera and internet access can easily upload and analyse their skin lesion images.</p><p>To address potential variability in image quality, the system will employ advanced image processing algorithms to effectively analyze even suboptimal images. In cases where the image quality does not meet the required standards for accurate diagnosis, the system will provide feedback to the user, prompting them to retake the image to ensure the highest possible diagnostic accuracy.</p></div>
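The image-quality gate described above can be approximated with a classic blur detector: the variance of the image's Laplacian response, which is low for defocused images. A pure-numpy sketch; the function names are our own, and the threshold of 50 is a placeholder that would have to be tuned on real dermoscopic data:

```python
import numpy as np

def laplacian_variance(gray):
    """Variance of a 3x3 Laplacian response; low values suggest blur.
    Pure-numpy stand-in for cv2.Laplacian(img, cv2.CV_64F).var()."""
    g = gray.astype(float)
    lap = (-4.0 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return float(lap.var())

def quality_feedback(gray, blur_threshold=50.0):
    # the threshold is illustrative and would need tuning on real data
    if laplacian_variance(gray) >= blur_threshold:
        return "image accepted"
    return "image too blurry, please retake"

rng = np.random.default_rng(2)
sharp = rng.integers(0, 256, (64, 64)).astype(float)  # high-frequency content
flat = np.full((64, 64), 128.0)                       # no detail at all
print(quality_feedback(sharp), "|", quality_feedback(flat))
```

A production system would combine this with exposure and framing checks, so the user is told exactly why an upload was rejected.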
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.2.">Key points of the project</head><p>• The accuracy and broad diagnostic base of the proposed method will significantly speed up the process of mole recognition. Accelerating diagnosis is crucial for implementing possible treatment which, owing to the low stage of the cancer, will be less burdensome to the body.
• Nowadays more and more people have access to smartphones, so once the proposed application becomes known, many owners of the necessary equipment will check their moles. It will also be a form of promoting the examination of skin lesions.
• A mobile application installed on a smartphone will reduce the burden on potential users. To use it, they do not have to leave home, and they can check moles without time limits. Thanks to this, when a patient speculates about the diagnosis, the proposed method can have a positive impact on the patient-user's psyche.
• The project does not exclude any user: regardless of the quality of the camera, the application can analyse the photo.
• Reduction in the number of biopsies. Even low-risk routine surgical procedures are associated with morbidity, mounting health care costs, and patient anxiety.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.3.">Added value</head><p>• A mobile application may decrease the time doctors spend identifying and diagnosing melanoma by examining skin lesions and providing a preliminary diagnosis. Moreover, it may help identify whether a mole is benign, malignant, or requires further investigation. This optimizes the doctor's workflow and supports their decisions.
• Remote diagnosis via a smartphone application provides health care services to people who live in less urbanized or underserved areas. It reduces costs related to travelling to clinic appointments and days off work, as well as the costs of diagnostic tools. A further advantage is decreased exposure to the infectious diseases patients might encounter in the clinic.
• Introducing a novel non-invasive diagnostic method brings several benefits to melanoma diagnosis that can improve patients' standard of living and the effectiveness of diagnosis. It opens a perspective for skin cancer diagnostics to revolutionize current technologies and optimize the recognition process of skin moles.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.4.">Limitations</head><p>• The dataset applied for the recognition of skin lesions is limited in the diversity of lesions and does not include all the data needed to accurately diagnose every case. An expanded dataset is required to provide reliable patient outcomes.
• The application analyses visual features of skin lesions such as color, border, shape, and other characteristics visible to the eye. It does not consider molecular aspects such as cell morphology, tumor invasion into the tissue, or structural disorganization of the skin tissue. Therefore, it does not replace histological techniques, which are more specific in determining melanoma stages.
• Misdiagnosis of a skin lesion due to incorrect data recording or technical issues may affect decision-making for laypersons and health care workers and lead to serious consequences in the recognition of a skin mole.
• The lack of universal standards for mobile medical applications and concerns about the security of patients' medical records remain challenging and require appropriate licensing and regulation.
• Insufficient education of laypeople and a lack of understanding of the tool's limitations pose risks of misinterpreted results and non-comprehensive investigation if dermatologist appointments are replaced by the application.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.">Methodology</head><p>The development of the autonomous melanoma diagnostic system will follow a structured approach, broken down into several key stages. Each stage focuses on different aspects of the system's design, development, and deployment to ensure the final solution is robust, accurate, and efficient.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Figure 3: Project Methodology</head><p>The main stages are as follows:</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.1.">Hardware Preparation</head><p>The first step involves selecting and preparing the hardware infrastructure necessary for training and deploying convolutional neural networks (CNNs). High-performance computing resources, including powerful GPUs or TPUs, will be utilized to handle the computationally intensive processes of deep learning. The hardware will also need to support cloud-based services for scalability and real-time diagnostic capabilities. Additionally, mobile devices such as smartphones will be tested to ensure compatibility and smooth operation of the application.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.2.">Data Preparation</head><p>For this project, we utilized the publicly available HAM10000 dataset <ref type="bibr" target="#b19">[20]</ref> published by Philipp Tschandl in 2018 on the Harvard Dataverse platform. The dataset consists of 10,015 high-quality dermatoscopic images covering a range of diagnostic categories, including actinic keratoses, basal cell carcinoma, melanocytic nevi, melanoma, and vascular lesions. More than 50% of the lesions were confirmed through histopathological examination, which is considered the gold standard for skin cancer diagnosis, while the remaining cases were validated via follow-up examinations, expert consensus, or in-vivo confocal microscopy. The preprocessing of the dataset involved several key steps to ensure the quality and consistency of the images. First, color normalization was applied to account for variations in lighting conditions and imaging devices, ensuring a uniform appearance across the dataset. Next, noise reduction techniques were employed to remove any artifacts or unwanted disturbances, thereby enhancing the overall image quality and enabling more accurate feature extraction. Additionally, advanced fuzzy logic-based methods were used to further refine the segmentation of the skin lesions, improving the accuracy of the subsequent feature extraction process.</p><p>To address the common issue of class imbalance in the dataset, we applied data augmentation techniques. This included transformations such as rotations, translations, and flips, which were used to artificially expand the size of the underrepresented classes. This approach helps to prevent overfitting and enhances the generalizability of the model, enabling it to perform well when encountering new, unseen data during the deployment phase. 
</p><p>A comprehensive machine learning solution was developed to classify skin lesions, starting with model creation and culminating in the deployment of a live web application. The primary objective was to enable users to upload an image of a skin lesion and receive an instant diagnosis. The system classifies skin lesions into seven categories, including melanocytic nevi, melanoma, basal cell carcinoma, and more.</p></div>
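The augmentation step described above (rotations and flips used to expand the underrepresented classes) can be sketched with plain NumPy; this is a minimal illustration of the idea, not the exact pipeline used in the study, which may also apply shifts and zooms:

```python
import numpy as np

def augment(image: np.ndarray) -> list:
    """Generate rotated and flipped variants of an H x W x C image.

    Minimal sketch of the rotation/flip augmentation idea; a production
    pipeline would typically add random shifts, zooms, and shears as well.
    """
    variants = []
    for k in (1, 2, 3):                  # 90, 180, and 270 degree rotations
        variants.append(np.rot90(image, k))
    variants.append(np.fliplr(image))    # horizontal flip
    variants.append(np.flipud(image))    # vertical flip
    return variants

# Expanding a minority class: each image yields 5 extra samples.
lesion = np.zeros((224, 224, 3), dtype=np.uint8)
augmented = augment(lesion)
```

Because every transform preserves the label, the minority classes grow without any new annotation effort.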
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.3.">Model Building</head><p>The model employed for this task is a fine-tuned MobileNet Convolutional Neural Network (CNN), chosen for its compact size and efficient performance, making it ideal for web and mobile deployment. MobileNet's lightweight architecture enables real-time inference with lower computational requirements while still maintaining strong performance in image classification tasks.</p><p>Input Layer  The model accepts input images with a shape of (224, 224, 3), corresponding to RGB images that have been resized to 224x224 pixels. This ensures that all input data is uniformly processed by the network.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Convolutional Layers and Depthwise Separable Convolutions</head><p> Depthwise Separable Convolutions: The MobileNet architecture relies heavily on depthwise separable convolutions, which separate spatial filtering from channel-wise operations. This approach dramatically reduces the number of parameters and computation required compared to traditional convolutions, while still capturing relevant image features effectively.</p><p> Pointwise Convolution: Following each depthwise convolution, a pointwise convolution (1x1) is applied, enabling the model to learn interactions between features across different channels.  Batch Normalization and ReLU Activation: Each convolutional layer is followed by batch normalization and ReLU activation to stabilize the training process and introduce non-linearity, allowing the model to learn more complex patterns.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.4.">Model Training &amp; Tuning</head><p>The model was trained using the Keras framework, employing a fine-tuned MobileNet architecture for the classification of skin lesions. The training process was designed to maximize the model's ability to generalize to new, unseen data while avoiding overfitting. Several techniques were utilized to optimize the training process, including early stopping, learning rate reduction, and model checkpointing.</p><p>To address the challenge of class imbalance in the HAM10000 dataset, where certain skin lesion types were overrepresented, a combination of data augmentation techniques such as rotations, flips, and zooms was applied to artificially expand the dataset. Additionally, class weights were introduced to make the model more sensitive to melanoma (class weight = 3.0), helping to mitigate bias toward more common lesions like benign nevi (class weight = 1.0).</p><p>Leveraging the pre-trained MobileNet model, we froze the weights of all layers except the last 23 layers, allowing only these layers to be trainable. This approach enabled the model to retain the general image features learned from the ImageNet dataset while fine-tuning the later layers to specialize in melanoma detection based on the dermoscopic images from the HAM10000 dataset.</p><p>Hyperparameters such as the learning rate (initially set at 0.01), batch size, and the number of epochs were optimized through trial and error. Callbacks like ModelCheckpoint and ReduceLROnPlateau were employed to save the best model based on the top-3 accuracy metric, and to reduce the learning rate if validation performance plateaued.</p><p>The model was compiled using the Adam optimizer with a learning rate of 0.01 and categorical cross-entropy as the loss function. 
In addition to the standard categorical accuracy, we introduced custom metrics, top-2 accuracy and top-3 accuracy, to better capture the model's ability to rank the correct diagnosis within its top few predictions.</p><p>During training, the model demonstrated consistent improvements, achieving strong results after 30 epochs. By the end of training, the final model reached a validation accuracy of 89.5%, with a top-2 accuracy of 91.3% and a top-3 accuracy of 96.3% on the validation set. This indicated that the model was able to rank the correct diagnosis within its top three predictions 96.3% of the time, making it highly effective in identifying the correct lesion type, even in challenging cases.</p><p>After the best model was selected based on the top-3 accuracy metric, we evaluated its performance on the validation data: final validation loss 0.59, categorical accuracy 80.8%, top-2 accuracy 91.2%, and top-3 accuracy 96.2%. These metrics demonstrate that the model generalizes well and is highly capable of correctly classifying both common and rare types of skin lesions, providing a reliable tool for melanoma detection.</p></div>
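The top-k metrics above count a prediction as correct whenever the true class appears among the model's k highest-probability outputs. A small pure-Python sketch of that metric, together with the class-weight mapping described in the text (only the melanoma and nevus weights are given in the paper; the other classes are omitted here):

```python
def top_k_accuracy(probs, labels, k=3):
    """Fraction of samples whose true label is among the k most probable classes."""
    hits = 0
    for p, y in zip(probs, labels):
        top_k = sorted(range(len(p)), key=lambda i: p[i], reverse=True)[:k]
        hits += y in top_k
    return hits / len(labels)

# Class weights as described in the text: melanoma is upweighted so the
# loss penalizes a missed melanoma more heavily than a missed benign nevus.
class_weights = {"mel": 3.0, "nv": 1.0}

# Toy example: two samples over 3 classes; the second sample's true class
# is only ranked second, so it counts as correct at k=2 but not k=1.
probs = [[0.1, 0.7, 0.2], [0.5, 0.4, 0.1]]
labels = [1, 1]
acc1 = top_k_accuracy(probs, labels, k=1)  # 0.5
acc2 = top_k_accuracy(probs, labels, k=2)  # 1.0
```

Top-k accuracy is a natural fit here because the application surfaces the top-3 probabilities to the user rather than a single hard label.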
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.5.">Model Deployment</head><p>To make the melanoma detection system accessible to a broad audience, a web-based application was developed using a lightweight framework, leveraging modern web technologies to deliver a seamless and user-friendly experience. The primary goal was to allow users, regardless of their technical expertise, to upload an image of a skin lesion and receive an immediate, real-time diagnosis. The system provides the predicted class (such as melanoma or benign lesion) along with the associated probability, offering users an easy-to-understand result.</p><p>One of the unique aspects of this project is the conversion of the trained model, initially developed in Keras, to TensorFlow.js, allowing the model to run directly within a web browser. This approach ensures that the entire process, from image submission to the generation of predictions, takes place locally on the user's device. As a result, no data is uploaded to external servers, which preserves the user's privacy and makes the solution particularly well-suited for medical applications, where data security and privacy are of paramount importance. The conversion of the model from Keras to TensorFlow.js involved the following steps:</p><p>1. Recreating the Model in Native Keras: The trained Keras model was finalized and saved in a format compatible with conversion. 2. Conversion to TensorFlow.js: The model was converted to TensorFlow.js using the command-line conversion tool. This allowed the model to be served directly in the browser without additional server infrastructure. The model was then embedded within a basic HTML interface, which provides an intuitive and simple user experience. The application allows users to:</p><p>1. Upload an Image: Users can select an image of a skin lesion from their device and upload it directly through the browser.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">Real-Time Processing:</head><p>The model processes the image in real-time, leveraging the TensorFlow.js framework to run the neural network in the browser.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.">Display of Results:</head><p>The predicted lesion class, along with the top-3 probabilities, is displayed on the page, giving users insights into the potential diagnosis. The use of TensorFlow.js brings several key benefits:</p><p> Local Processing: Since the model runs entirely in the user's browser, the need for a backend server is eliminated. This not only preserves privacy but also reduces infrastructure costs and improves scalability.  Fast Inference: The conversion to TensorFlow.js ensures that model inference is fast, enabling real-time feedback to users without noticeable delays.  Offline Capability: As the model is stored in the browser, it can potentially function even in low-connectivity or offline environments, making it accessible to a wider audience.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.6.">Model Management</head><p>The development of this AI-powered skin lesion diagnosis system represents a significant advancement in automated medical diagnostics. To maintain its effectiveness, continuous monitoring and regular updates are essential. The model will be periodically monitored to ensure it remains accurate as new data is collected. Retraining the model with fresh, labelled data will prevent model drift and ensure high performance in detecting diverse skin lesions. Feedback from dermatologists will be integrated to improve the system's diagnostic accuracy, particularly in handling ambiguous cases. This collaboration will ensure the model complements medical expertise and maintains clinical relevance. Strict data management protocols will ensure compliance with privacy regulations like GDPR and HIPAA. All patient data used for retraining will be anonymized, ensuring secure, privacy-focused model updates. The web-based deployment using TensorFlow.js allows for easy scaling, making the system accessible in underserved areas. As the model evolves, it can be adapted for mobile use, further expanding its reach. By focusing on these aspects, the system can continue to improve early detection of melanoma and provide lasting benefits to global healthcare.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5.">Results</head><p>This section outlines the performance of the AI-powered skin lesion classification model, along with a real-world test case comparing the model's prediction to the histopathological findings of an actual patient.</p><p>The model's performance was evaluated using standard classification metrics: precision, recall, and F1-score. These metrics provide a comprehensive view of how well the model performs in classifying melanoma and other skin lesions.</p><p>Here is a summary of the model's performance:</p><p> Precision: Measures the proportion of correctly predicted instances among all instances predicted as positive by the model. For example, for the "melanoma" class, the precision indicates how likely the model's melanoma prediction is to be correct.  Recall: Measures the proportion of correctly predicted positive instances out of all actual positive instances. In other words, for the "melanoma" class, recall shows how well the model can detect melanoma cases when they actually exist.  F1-Score: The harmonic mean of precision and recall, providing a single score that balances both concerns, especially in cases where there is class imbalance. This indicates that when the model predicts a lesion to be melanoma, it is correct only 27% of the time. The model detects melanoma in 48% of the cases where it is present. The overall performance for melanoma classification, balancing both precision and recall, is moderate with an F1 score of 0.34.</p><p>These metrics highlight that while the model has room for improvement, it can detect melanoma in some cases. Given the nature of skin lesion classification and the importance of early detection, further tuning and adjustments may be required to improve the performance of the model, particularly in real-world clinical applications. 
In addition to the evaluation metrics, the model was tested on a real case of a suspicious skin lesion for which a physician had recommended removal. The histopathological report confirmed the lesion as a compound melanocytic nevus, a benign condition (ICD-10: D22).</p></div>
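The melanoma F1-score reported above follows directly from the definitions. Computing the harmonic mean from the rounded precision and recall gives 0.3456; the reported 0.34 presumably comes from the unrounded per-class counts:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

precision_mel = 0.27   # correct melanoma calls / all melanoma calls
recall_mel = 0.48      # detected melanomas / all true melanomas

# 0.3456 from the two-decimal inputs; the paper reports 0.34, which would
# follow from the unrounded precision and recall values.
f1_mel = f1_score(precision_mel, recall_mel)
```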
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Model Prediction:</head><p> Melanocytic nevi: (97.8%)  Melanoma: (0.5%) The model strongly predicted the lesion to be a benign melanocytic nevus, with a very low probability of melanoma, which aligned with the final histopathological diagnosis. This real-world validation supports the potential utility of the model in clinical scenarios, although more extensive validation is needed to ensure broader applicability.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.">Discussion</head><p>The proposed method of diagnosing skin melanoma may also have negative effects and certain limitations. There are risks where a mole scanning app will have downsides. Firstly, due to the wide availability of diagnosis via mobile applications, many potential patients may misinterpret the results and not further diagnostics of a specific nevus by a specialist.</p><p>Moreover, due to the wide availability of the tool and the limited diagnostic knowledge of users, it is possible to practice self-treatment or create false security, which may ultimately prove disastrous and lead to the progression of the neoplasm. Therefore, it is extremely important to educate potential users before introducing this diagnostic tool, emphasizing the disadvantages of the method and considering that it is not a fully professional diagnosis, which is only possible in the case of contact with a specialist.</p><p>The limitation of the method, which depends on the type of user, is the possible lack of effectiveness of the application due to the quality of the photo provided to the application by the patient. An image of insufficient quality sent for analysis may make the diagnosis unreliable, which may adversely affect further treatment.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.1.">Clinical Impact and Applications</head><p>The developed autonomous diagnostic system offers a rapid, non-invasive, and highly accurate solution with broad potential applications across various healthcare environments. Its ability to function autonomously without the need for specialised personnel makes it particularly valuable in settings where access to expert dermatological care is limited, such as rural clinics or underserved regions. This can significantly improve the accessibility of early detection and diagnosis, especially for conditions like melanoma, which are critical for reducing morbidity and mortality. Moreover, the system's compatibility with teledermatology platforms further enhances its utility by enabling clinicians to provide timely assessments and treatment recommendations remotely, without the need for in-person visits. This can streamline the diagnostic process, improve patient convenience, and make quality dermatological care more accessible to a wider population, regardless of their geographic location. Previous studies have shown that similar AI-based systems can aid in the early detection of skin cancer, reducing the burden on healthcare systems and improving patient outcomes. The integration of this technology into clinical practice has the potential to revolutionise the field of dermatology, making high-quality care more readily available to underserved communities.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.2.">Limitations and Future Directions</head><p>Despite its promising performance, the system faces several limitations that must be addressed in future iterations. One significant challenge is the need to further refine the algorithm to account for the heterogeneity in skin phototypes, ethnic backgrounds, and the wide range of morphological presentations of skin lesions encountered in real-world clinical practice. Additionally, while the model has demonstrated high diagnostic accuracy in controlled clinical settings, further large-scale trials across diverse patient populations are necessary to assess its generalizability and robustness Future research will focus on incorporating advanced imaging modalities, such as multispectral imaging, to improve diagnostic accuracy and expand the system's capabilities to detect a broader range of dermatological conditions These advancements, combined with continuous updates to the model's training data, will be key to ensuring the system remains a reliable, accurate, and effective tool in clinical dermatology. The development of an autonomous diagnostic system using convolutional neural networks represents a significant advancement in the field of automated melanoma detection by reducing diagnostic times and improving the accuracy of early melanoma detection, this system has the potential to revolutionize dermatological care, making high-quality skin cancer screening more accessible and reliable for patients. Ongoing research and technological advancements will be crucial in optimizing the system for broader clinical adoption and integration into routine clinical practice.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.3.">Comparison with Existing Systems</head><p>While the proposed system demonstrates significant potential in improving the accuracy and accessibility of melanoma detection, it is essential to compare its performance and approach with existing AI-driven diagnostic systems. Recent studies <ref type="bibr" target="#b19">[20,</ref><ref type="bibr" target="#b20">21]</ref>, have explored the capabilities of convolutional neural networks (CNNs) in skin cancer classification. These studies employed CNN-based architectures, similar to our system, for identifying melanoma and other skin lesions, demonstrating accuracy levels comparable to those of trained dermatologists.</p><p>In Mahbod et al.'s work <ref type="bibr" target="#b20">[21]</ref>, a deep learning model was developed to classify skin lesions, focusing on optimizing sensitivity and specificity for melanoma detection. Similarly, implemented CNN architectures <ref type="bibr" target="#b21">[22]</ref>, examining their performance on large datasets to enhance classification accuracy. However, these models were primarily evaluated in controlled environments with limited real-world applicability due to factors such as image quality and lack of integration with web-based platforms.</p><p>Our system advances these efforts by providing an accessible, web-based platform that allows real-time image analysis without the need for backend servers. This approach not only preserves user privacy but also facilitates a faster diagnostic process by performing all computations locally on the user's device. 
While systems <ref type="bibr" target="#b22">[23,</ref><ref type="bibr" target="#b23">24]</ref> focus on mobile and desktop applications, our system is tailored for direct use in the web browser, making it highly accessible and requiring minimal technical expertise from the end user.</p><p>Moreover, studies <ref type="bibr" target="#b23">[24]</ref> highlight the ongoing challenge of dataset diversity. Existing models often lack generalizability across different skin phototypes and ethnic backgrounds, which can impact diagnostic accuracy in diverse populations. Our system addresses this limitation by employing data augmentation techniques and transfer learning, though further dataset expansion remains a priority for future improvement <ref type="bibr" target="#b23">[24]</ref>.</p><p>Despite its strengths, our system shares limitations with these existing models, such as the need for standardized image quality and the risk of overreliance on an AI-based preliminary diagnosis. As these studies suggest, user education is crucial to avoid misinterpretation of results, a challenge that all AI-based diagnostic systems face <ref type="bibr" target="#b22">[23]</ref>.</p><p>While our system aligns with the capabilities and challenges observed in other CNN-based melanoma detection models, its web-based, real-time diagnostic feature distinguishes it from prior approaches. Ongoing research is essential to further refine this technology and fully assess its clinical utility and impact, especially through large-scale trials across diverse populations.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="7.">Conclusion</head><p>The development of our web-based melanoma detection system using convolutional neural networks represents a promising advancement in dermatological diagnostics. By utilizing dermoscopic images and employing sophisticated image processing, this system has demonstrated a high potential to aid in the early detection of melanoma. With mobile accessibility and rapid diagnostic capability, the system can expand access to initial screenings, especially in regions lacking specialist dermatological resources.</p><p>Dermoscopy, while effective in improving melanoma detection accuracy, has limitations in terms of equipment needs and subjectivity in assessment, particularly for non-expert users and deeper lesions. Our system addresses some of these limitations by integrating automated classification, which not only provides real-time results but also enables greater consistency in evaluation. However, it remains complementary to traditional histopathology, which continues to be crucial for confirming malignancy. Histopathological methods, while precise, are limited by cost, access, and time constraints, underlining the utility of an adjunctive AI tool that can assist in preliminary assessments.</p><p>Although the ABCDE criteria offer a valuable foundation for skin lesion evaluation, they can be challenging for laypeople to apply accurately. Our system's intuitive interface aims to bridge this gap, allowing users to engage with melanoma detection independently while also supporting clinicians in triaging cases that warrant further investigation. 
Future developments will focus on expanding our dataset to improve model accuracy across diverse populations, refining interpretability to enhance user trust, and integrating the system with telemedicine platforms to foster broader clinical adoption.</p><p>Overall, our autonomous diagnostic system is a step toward democratizing skin cancer screening, with the potential to improve outcomes by facilitating early detection and reducing diagnostic wait times. Ongoing research, dataset expansion, and clinical trials will be essential for optimizing and validating the system's real-world effectiveness. Through continued innovation, this technology holds significant promise for enhancing the accessibility and accuracy of melanoma screening on a global scale, ultimately improving patient care and outcomes.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Figure 1 :</head><label>1</label><figDesc>Figure 1: Melanoma detection technology development timeline<ref type="bibr" target="#b3">[4]</ref> </figDesc><graphic coords="2,76.56,367.92,453.96,223.56" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Figure 2 :</head><label>2</label><figDesc>Figure2: ABCDE characteristics of melanoma<ref type="bibr" target="#b4">[5]</ref> </figDesc><graphic coords="3,117.60,193.80,382.92,68.76" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Figure 4 :</head><label>4</label><figDesc>Figure 4: Skin nevi from the base<ref type="bibr" target="#b19">[20]</ref>.The preprocessing of the dataset involved several key steps to ensure the quality and consistency of the images. First, color normalization was applied to account for variations in lighting conditions and imaging devices, ensuring a uniform appearance across the dataset. Next, noise reduction techniques were employed to remove any artifacts or unwanted disturbances, thereby enhancing the overall image quality and enabling more accurate feature extraction. Additionally, advanced fuzzy logic-based methods were used to further refine the segmentation of the skin lesions, improving the accuracy of the subsequent feature extraction process.To address the common issue of class imbalance in the dataset, we applied data augmentation techniques. This included transformations such as rotations, translations, and flips, which were used to artificially expand the size of the underrepresented classes. This approach</figDesc><graphic coords="9,151.68,463.08,297.24,150.48" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_3"><head>Figure 5 :</head><label>5</label><figDesc>Figure 5: Real skin nevus diagnosed for surgical excision</figDesc><graphic coords="13,250.08,477.36,99.96,101.76" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_0"><head>Table 1</head><label>1</label><figDesc>Division of the dataset according to classification</figDesc><table><row><cell>Lesion Type</cell><cell cols="2">Train set Validation Set</cell></row><row><cell>Melanocytic Nevi (nv)</cell><cell>5954</cell><cell>751</cell></row><row><cell>Melanoma (mel)</cell><cell>1074</cell><cell>39</cell></row><row><cell>Benign Keratosis-like Lesions (bkl)</cell><cell>1024</cell><cell>75</cell></row><row><cell>Basal Cell Carcinoma (bcc)</cell><cell>484</cell><cell>30</cell></row><row><cell>Actinic Keratoses and Intraepithelial Carcinoma /</cell><cell>301</cell><cell>26</cell></row><row><cell>Bowen's Disease (akiec)</cell><cell></cell><cell></cell></row><row><cell>Vascular Lesions (vasc)</cell><cell>131</cell><cell>11</cell></row><row><cell>Dermatofibroma (df)</cell><cell>109</cell><cell>6</cell></row></table></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_1"><head>Table 1</head><label>1</label><figDesc></figDesc><table><row><cell>Classification Report</cell><cell></cell><cell></cell></row><row><cell>Precision (mel)</cell><cell>Recall (mel)</cell><cell>F1-score (mel)</cell></row><row><cell>0.27</cell><cell>0.48</cell><cell>0.34</cell></row></table></figure>
		</body>
		<back>

			<div type="acknowledgement">
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="8.">Declaration on Generative AI</head><p>In the preparation of this work, the author employed tools such as ChatGPT to assist with grammar and spelling verification, as well as paraphrasing and rephrasing. The content was subsequently reviewed and refined by the author, who assumes full responsibility for the accuracy and integrity of the final publication.</p></div>
			</div>

			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">Melanoma in adolescents and young adults (ages 15-39 years): United States, 1999-2006</title>
		<author>
			<persName><forename type="first">H</forename><forename type="middle">K</forename><surname>Weir</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><forename type="middle">D</forename><surname>Marrett</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Cokkinides</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Barnholtz-Sloan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Patel</surname></persName>
		</author>
		<author>
			<persName><forename type="first">E</forename><surname>Tai</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Jemal</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Li</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Kim</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><forename type="middle">U</forename><surname>Ekwueme</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.jaad.2011.04.038</idno>
		<idno type="PMID">22018066</idno>
		<idno type="PMCID">PMC3254089</idno>
	</analytic>
	<monogr>
		<title level="j">J Am Acad Dermatol</title>
		<imprint>
			<biblScope unit="volume">65</biblScope>
			<biblScope unit="issue">5</biblScope>
			<biblScope unit="page" from="S38" to="49" />
			<date type="published" when="2011-11">2011 Nov</date>
		</imprint>
	</monogr>
	<note>Suppl</note>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">Skin cancer and new treatment perspectives: a review</title>
		<author>
			<persName><forename type="first">Mcf</forename><surname>Simões</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Jjs</forename><surname>Sousa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Aacc</forename><surname>Pais</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.canlet.2014.11.001</idno>
		<idno type="PMID">25444899</idno>
	</analytic>
	<monogr>
		<title level="j">Cancer Lett</title>
		<imprint>
			<biblScope unit="volume">357</biblScope>
			<biblScope unit="issue">1</biblScope>
			<biblScope unit="page" from="8" to="42" />
			<date type="published" when="2014-11-11">2015 Feb 1. 2014 Nov 11</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Recognising Skin Cancer in Primary Care</title>
		<author>
			<persName><forename type="first">O</forename><forename type="middle">T</forename><surname>Jones</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Cki</forename><surname>Ranmuthu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><forename type="middle">N</forename><surname>Hall</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Funston</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><forename type="middle">M</forename><surname>Walter</surname></persName>
		</author>
		<idno type="PMID">31734824</idno>
		<idno type="PMCID">PMC6969010</idno>
	</analytic>
	<monogr>
		<title level="j">Adv Ther</title>
		<imprint>
			<biblScope unit="volume">37</biblScope>
			<biblScope unit="issue">1</biblScope>
			<biblScope unit="page" from="603" to="616" />
			<date type="published" when="2020-01">2020 Jan (Epub 2019 Nov 16)</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Biosensors for melanoma skin cancer diagnostics</title>
		<author>
			<persName><forename type="first">Eleni</forename><surname>Chatzilakou</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Yubing</forename><surname>Hu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Nan</forename><surname>Jiang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Ali</forename><forename type="middle">K</forename><surname>Yetisen</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.bios.2024.116045</idno>
		<ptr target="https://doi.org/10.1016/j.bios.2024.116045" />
	</analytic>
	<monogr>
		<title level="j">Biosensors and Bioelectronics</title>
		<idno type="ISSN">0956-5663</idno>
		<imprint>
			<biblScope unit="volume">250</biblScope>
			<biblScope unit="page">116045</biblScope>
			<date type="published" when="2024">2024</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">Classification of malignant melanoma and benign skin lesions: implementation of automatic ABCD rule</title>
		<author>
			<persName><forename type="first">Reda</forename><surname>Kasmi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Karim</forename><surname>Mokrani</surname></persName>
		</author>
		<idno type="DOI">10.1049/iet-ipr.2015.0385</idno>
		<ptr target="https://doi.org/10.1049/iet-ipr.2015.0385" />
	</analytic>
	<monogr>
		<title level="j">IET Image Processing</title>
		<imprint>
			<date type="published" when="2016-06">June 2016</date>
			<biblScope unit="volume">10</biblScope>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">Principles of dermoscopy and dermoscopic equipment</title>
		<author>
			<persName><forename type="first">Steven</forename><forename type="middle">Q</forename><surname>Wang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Ashfaq</forename><forename type="middle">A</forename><surname>Marghoob</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Alon</forename><surname>Scope</surname></persName>
		</author>
		<idno type="DOI">10.3109/9781841847627</idno>
		<ptr target="https://doi.org/10.3109/9781841847627" />
	</analytic>
	<monogr>
		<title level="m">An Atlas of Dermoscopy</title>
		<imprint>
			<date type="published" when="2013">2013</date>
			<publisher>CRC Press</publisher>
		</imprint>
	</monogr>
	<note>2nd Edition</note>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">Dermoscopy for melanoma detection and triage in primary care: a systematic review</title>
		<author>
			<persName><forename type="first">O</forename><forename type="middle">T</forename><surname>Jones</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><forename type="middle">C</forename><surname>Jurascheck</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">A</forename><surname>Van Melle</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Hickman</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><forename type="middle">P</forename><surname>Burrows</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><forename type="middle">N</forename><surname>Hall</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Emery</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><forename type="middle">M</forename><surname>Walter</surname></persName>
		</author>
		<idno type="DOI">10.1136/bmjopen-2018-027529</idno>
		<idno type="PMID">31434767</idno>
		<idno type="PMCID">PMC6707687</idno>
	</analytic>
	<monogr>
		<title level="j">BMJ Open</title>
		<imprint>
			<biblScope unit="volume">9</biblScope>
			<biblScope unit="issue">8</biblScope>
			<biblScope unit="page">e027529</biblScope>
			<date type="published" when="2019-08-20">2019 Aug 20</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">Dermoscopy, with and without visual inspection, for diagnosing melanoma in adults</title>
		<author>
			<orgName type="collaboration">Cochrane Skin Cancer Diagnostic Test Accuracy Group</orgName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Dinnes</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">J</forename><surname>Deeks</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Chuchu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Ferrante di Ruffano</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><forename type="middle">N</forename><surname>Matin</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><forename type="middle">R</forename><surname>Thomson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><forename type="middle">Y</forename><surname>Wong</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><forename type="middle">B</forename><surname>Aldridge</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Abbott</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Fawzy</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">E</forename><surname>Bayliss</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">J</forename><surname>Grainge</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Y</forename><surname>Takwoingi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Davenport</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Godfrey</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><forename type="middle">M</forename><surname>Walter</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><forename type="middle">C</forename><surname>Williams</surname></persName>
		</author>
		<idno type="DOI">10.1002/14651858.CD011902.pub2</idno>
		<idno type="PMID">30521682</idno>
		<idno type="PMCID">PMC6517096</idno>
	</analytic>
	<monogr>
		<title level="j">Cochrane Database Syst Rev</title>
		<imprint>
			<biblScope unit="volume">12</biblScope>
			<biblScope unit="issue">12</biblScope>
			<biblScope unit="page">CD011902</biblScope>
			<date type="published" when="2018-12-04">2018 Dec 4</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">Melanomas that failed dermoscopic detection: a combined clinicodermoscopic approach for not missing melanoma</title>
		<author>
			<persName><forename type="first">S</forename><surname>Puig</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Argenziano</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Zalaudek</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Ferrara</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Palou</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Massi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Hofmann-Wellenhof</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><forename type="middle">P</forename><surname>Soyer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Malvehy</surname></persName>
		</author>
		<idno type="DOI">10.1111/j.1524-4725.2007.33264</idno>
		<idno type="PMID">17903162</idno>
	</analytic>
	<monogr>
		<title level="j">Dermatol Surg</title>
		<imprint>
			<biblScope unit="volume">33</biblScope>
			<biblScope unit="issue">10</biblScope>
			<biblScope unit="page" from="1262" to="1273" />
			<date type="published" when="2007-10">2007 Oct</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">Validity and Reliability of Dermoscopic Criteria Used to Differentiate Nevi From Melanoma: A Web-Based International Dermoscopy Society Study</title>
		<author>
			<persName><forename type="first">C</forename><surname>Carrera</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">A</forename><surname>Marchetti</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">W</forename><surname>Dusza</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Argenziano</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><forename type="middle">P</forename><surname>Braun</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">C</forename><surname>Halpern</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Jaimes</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><forename type="middle">J</forename><surname>Kittler</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Malvehy</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">W</forename><surname>Menzies</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Pellacani</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Puig</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><forename type="middle">S</forename><surname>Rabinovitz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Scope</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><forename type="middle">P</forename><surname>Soyer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">W</forename><surname>Stolz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Hofmann-Wellenhof</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Zalaudek</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">A</forename><surname>Marghoob</surname></persName>
		</author>
		<idno type="DOI">10.1001/jamadermatol.2016.0624</idno>
		<idno type="PMID">27074267</idno>
		<idno type="PMCID">PMC5451089</idno>
	</analytic>
	<monogr>
		<title level="j">JAMA Dermatol</title>
		<imprint>
			<biblScope unit="volume">152</biblScope>
			<biblScope unit="issue">7</biblScope>
			<biblScope unit="page" from="798" to="806" />
			<date type="published" when="2016-07-01">2016 Jul 1</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<analytic>
		<title level="a" type="main">Assessment of Diagnostic Accuracy of Dermoscopic Structures and Patterns Used in Melanoma Detection: A Systematic Review and Meta-analysis</title>
		<author>
			<persName><forename type="first">N</forename><forename type="middle">M</forename><surname>Williams</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><forename type="middle">D</forename><surname>Rojas</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><forename type="middle">M</forename><surname>Reynolds</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Kwon</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Shum-Tien</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Jaimes</surname></persName>
		</author>
		<idno type="DOI">10.1001/jamadermatol.2021.2845</idno>
		<idno type="PMID">34347005</idno>
		<idno type="PMCID">PMC8339993</idno>
	</analytic>
	<monogr>
		<title level="j">JAMA Dermatol</title>
		<imprint>
			<biblScope unit="volume">157</biblScope>
			<biblScope unit="issue">9</biblScope>
			<biblScope unit="page" from="1078" to="1088" />
			<date type="published" when="2021-09-01">2021 Sep 1</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b11">
	<analytic>
		<title level="a" type="main">Meta-analysis of digital dermoscopy follow-up of melanocytic skin lesions: a study on behalf of the International Dermoscopy Society</title>
		<author>
			<persName><forename type="first">G</forename><surname>Salerni</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Terán</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Puig</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Malvehy</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Zalaudek</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Argenziano</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><surname>Kittler</surname></persName>
		</author>
		<idno type="DOI">10.1111/jdv.12032</idno>
		<idno type="PMID">23181611</idno>
	</analytic>
	<monogr>
		<title level="j">J Eur Acad Dermatol Venereol</title>
		<imprint>
			<biblScope unit="volume">27</biblScope>
			<biblScope unit="issue">7</biblScope>
			<biblScope unit="page" from="805" to="814" />
			<date type="published" when="2013-07">2013 Jul (Epub 2012 Nov 26)</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b12">
	<analytic>
		<title level="a" type="main">Non-invasive diagnostic techniques in pigmentary skin disorders and skin cancer</title>
		<author>
			<persName><forename type="first">Yashdeep</forename><surname>Singh Pathania</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Zoe</forename><surname>Apalla</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Gabriel</forename><surname>Salerni</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Anant</forename><surname>Patil</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Stephan</forename><surname>Grabbe</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Mohamad</forename><surname>Goldust</surname></persName>
		</author>
		<idno type="DOI">10.1111/jocd.14547</idno>
		<ptr target="https://doi.org/10.1111/jocd.14547" />
	</analytic>
	<monogr>
		<title level="j">Journal of Cosmetic Dermatology</title>
		<imprint>
			<date type="published" when="2022-02">February 2022</date>
			<biblScope unit="volume">21</biblScope>
			<biblScope unit="page" from="444" to="450" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b13">
	<analytic>
		<title level="a" type="main">Ultrasound in Skin Cancer: Why, How, and When to Use It?</title>
		<author>
			<persName><forename type="first">Ximena</forename><surname>Wortsman</surname></persName>
		</author>
		<idno type="DOI">10.3390/cancers16193301</idno>
		<ptr target="https://doi.org/10.3390/cancers16193301" />
	</analytic>
	<monogr>
		<title level="j">Cancers</title>
		<imprint>
			<biblScope unit="volume">16</biblScope>
			<biblScope unit="issue">19</biblScope>
			<biblScope unit="page">3301</biblScope>
			<date type="published" when="2024">2024</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b14">
	<analytic>
		<title level="a" type="main">Depth of invasion analysis to predict acral melanoma outcomes</title>
		<author>
			<persName><forename type="first">Marcel</forename><surname>Arakaki Asato</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Francisco</forename><surname>Alves Moraes Neto</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Marcelo</forename><surname>Padovani De Toledo Moraes</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Juliana</forename><surname>Polizel Ocanha-Xavier</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Luiz</forename><forename type="middle">Carlos</forename><surname>Takita</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Mariângela</forename><forename type="middle">Esther</forename><surname>Alencar Marques</surname></persName>
		</author>
		<author>
			<persName><forename type="first">José</forename><forename type="middle">Cândido</forename><surname>Caldeira Xavier-Júnior</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.anndiagpath.2024.152305</idno>
		<ptr target="https://doi.org/10.1016/j.anndiagpath.2024.152305" />
	</analytic>
	<monogr>
		<title level="j">Annals of Diagnostic Pathology</title>
		<idno type="ISSN">1092-9134</idno>
		<imprint>
			<biblScope unit="volume">71</biblScope>
			<biblScope unit="page">152305</biblScope>
			<date type="published" when="2024">2024</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b15">
	<analytic>
		<title level="a" type="main">Skin lesion synthesis with generative adversarial networks</title>
		<author>
			<persName><forename type="first">Alceu</forename><surname>Bissoto</surname></persName>
		</author>
		<idno type="DOI">10.48550/arXiv.1902.03253</idno>
		<ptr target="https://doi.org/10.48550/arXiv.1902.03253" />
	</analytic>
	<monogr>
		<title level="m">OR 2.0 Context-Aware Operating Theaters, Computer Assisted Robotic Endoscopy, Clinical Image-Based Procedures, and Skin Image Analysis: First International Workshop, OR 2.0 2018, 5th International Workshop, CARE 2018, 7th International Workshop, CLIP 2018, Third International Workshop, ISIC 2018, Held in Conjunction with MICCAI 2018</title>
				<meeting><address><addrLine>Granada, Spain</addrLine></address></meeting>
		<imprint>
			<publisher>Springer International Publishing</publisher>
			<date type="published" when="2018-09">September 16 and 20, 2018</date>
		</imprint>
	</monogr>
	<note>Proceedings 5</note>
</biblStruct>

<biblStruct xml:id="b16">
	<analytic>
		<title level="a" type="main">Conjunctival Melanoma and Role of Immunohistochemical Markers protein S100, HMB-45 and Melan A in Tumor Staging: Case Report and Literature Review</title>
		<author>
			<persName><forename type="first">Carlos</forename><surname>Luiz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Araújo</forename><surname>De</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Sandra</forename><surname>Souz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Branco</forename><surname>Lúcia</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Diogo</forename><surname>Mendes Coutinho</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Santos</forename><surname>Batista Dos</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Hugo</forename><surname>Medeiros</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Figueiredo</forename><surname>Oliveira De</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Rebeca</forename><surname>Cavalcanti</surname></persName>
		</author>
		<author>
			<persName><surname>Lima De</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Júlia</forename><surname>Miranda</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Ana</forename><surname>De Souza Araújo</surname></persName>
		</author>
		<author>
			<persName><surname>Gabriela Leite De Moura</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Carvalhêdo</forename><surname>Vinícius</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Raíssa</forename><surname>Cunha</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Habka</forename><surname>Cariello</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Paulo</forename><surname>Victor</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Rabelo</forename><surname>Barbosa</surname></persName>
		</author>
		<idno type="DOI">10.24966/SCTI-7284/100011</idno>
	</analytic>
	<monogr>
		<title level="j">Surgery: Current Trends &amp; Innovations</title>
		<imprint>
			<date type="published" when="2019-02">Feb 2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b17">
	<analytic>
		<title level="a" type="main">Marginal and Joint Distributions of S100, HMB-45, and Melan-A Across a Large Series of Cutaneous Melanomas</title>
		<author>
			<persName><forename type="first">Hollis</forename><surname>Viray</surname></persName>
		</author>
		<author>
			<persName><forename type="first">William</forename><forename type="middle">R</forename><surname>Bradley</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Kurt</forename><forename type="middle">A</forename><surname>Schalper</surname></persName>
		</author>
		<author>
			<persName><forename type="first">David</forename><forename type="middle">L</forename><surname>Rimm</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Bonnie</forename><forename type="middle">E</forename><surname>Gould Rothberg</surname></persName>
		</author>
		<idno type="DOI">10.5858/arpa.2012-0284-OA</idno>
		<ptr target="https://doi.org/10.5858/arpa.2012-0284-OA" />
	</analytic>
	<monogr>
		<title level="j">Arch Pathol Lab Med</title>
		<imprint>
			<biblScope unit="volume">137</biblScope>
			<biblScope unit="issue">8</biblScope>
			<biblScope unit="page" from="1063" to="1073" />
			<date type="published" when="2013">2013</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b18">
	<analytic>
		<title level="a" type="main">The HAM10000 dataset, a large collection of multi-source dermatoscopic images of common pigmented skin lesions</title>
		<author>
			<persName><forename type="first">Philipp</forename><surname>Tschandl</surname></persName>
		</author>
		<idno type="DOI">10.7910/DVN/DBW86T</idno>
		<ptr target="https://doi.org/10.7910/DVN/DBW86T" />
	</analytic>
	<monogr>
		<title level="j">Harvard Dataverse</title>
		<imprint>
			<biblScope unit="volume">4</biblScope>
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b19">
	<analytic>
		<title level="a" type="main">Melanoma Skin Cancer Classification based on CNN Deep Learning Algorithms</title>
		<author>
			<persName><forename type="first">Safa</forename><surname>Waheed</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Saadi</forename><surname>Saadi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Mohd</forename><forename type="middle">Shafry Mohd</forename><surname>Rahim</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Norhaida</forename><surname>Suaib</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Fallah</forename><surname>Najjar</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Myasar</forename><surname>Mundher</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Ali</forename><surname>Salim</surname></persName>
		</author>
		<idno type="DOI">10.11113/mjfas.v19n3.2900</idno>
	</analytic>
	<monogr>
		<title level="j">Malaysian Journal of Fundamental and Applied Sciences</title>
		<imprint>
			<date type="published" when="2023">2023</date>
			<biblScope unit="volume">19</biblScope>
			<biblScope unit="page" from="299" to="305" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b20">
	<analytic>
		<title level="a" type="main">Skin cancer diagnosis based on optimized convolutional neural network</title>
		<author>
			<persName><forename type="first">Ni</forename><surname>Zhang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Yi-Xin</forename><surname>Cai</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Yong-Yong</forename><surname>Wang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Yi-Tao</forename><surname>Tian</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Xiao-Li</forename><surname>Wang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Benjamin</forename><surname>Badami</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.artmed.2019.101756</idno>
		<ptr target="https://doi.org/10.1016/j.artmed.2019.101756" />
	</analytic>
	<monogr>
		<title level="j">Artificial Intelligence in Medicine</title>
		<idno type="ISSN">0933-3657</idno>
		<imprint>
			<biblScope unit="volume">102</biblScope>
			<biblScope unit="page">101756</biblScope>
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b21">
	<analytic>
		<title level="a" type="main">Developing an efficient method for melanoma detection using CNN techniques</title>
		<author>
			<persName><forename type="first">D</forename><surname>Moturi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><forename type="middle">K</forename><surname>Surapaneni</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><forename type="middle">S G</forename><surname>Avanigadda</surname></persName>
		</author>
		<idno type="DOI">10.1186/s43046-024-00210-w</idno>
		<ptr target="https://doi.org/10.1186/s43046-024-00210-w" />
	</analytic>
	<monogr>
		<title level="j">J Egypt Natl Canc Inst</title>
		<imprint>
			<biblScope unit="volume">36</biblScope>
			<biblScope unit="page">6</biblScope>
			<date type="published" when="2024">2024</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b22">
	<analytic>
		<title level="a" type="main">Enhanced skin cancer diagnosis using optimized CNN architecture and checkpoints for automated dermatological lesion classification</title>
		<author>
			<persName><forename type="first">M</forename><forename type="middle">M</forename><surname>Musthafa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><forename type="middle">R</forename><surname>Mahesh</surname></persName>
		</author>
		<idno type="DOI">10.1186/s12880-024-01356-8</idno>
		<ptr target="https://doi.org/10.1186/s12880-024-01356-8" />
	</analytic>
	<monogr>
		<title level="j">BMC Med Imaging</title>
		<imprint>
			<biblScope unit="volume">24</biblScope>
			<biblScope unit="page">201</biblScope>
			<date type="published" when="2024">2024</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b23">
	<analytic>
		<title level="a" type="main">Melanoma Skin Cancer Classification based on CNN Deep Learning Algorithms</title>
		<author>
			<persName><forename type="first">Safa</forename><surname>Waheed</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Saadi</forename><surname>Saadi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Mohd</forename><forename type="middle">Shafry Mohd</forename><surname>Rahim</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Norhaida</forename><surname>Suaib</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Fallah</forename><surname>Najjar</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Myasar</forename><surname>Mundher</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Ali</forename><surname>Salim</surname></persName>
		</author>
		<idno type="DOI">10.11113/mjfas.v19n3.2900</idno>
	</analytic>
	<monogr>
		<title level="j">Malaysian Journal of Fundamental and Applied Sciences</title>
		<imprint>
			<date type="published" when="2023">2023</date>
			<biblScope unit="volume">19</biblScope>
			<biblScope unit="page" from="299" to="305" />
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
