<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">Trash detection on the floor with IntelRealSense</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author role="corresp">
							<persName><forename type="first">Lesia</forename><surname>Mochurad</surname></persName>
							<email>lesia.i.mochurad@lpnu.ua</email>
							<affiliation key="aff0">
								<orgName type="institution">Lviv Polytechnic National University</orgName>
								<address>
									<addrLine>12 Bandera street</addrLine>
									<postCode>79013</postCode>
									<settlement>Lviv</settlement>
									<country key="UA">Ukraine</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Bohdan</forename><surname>Herasym</surname></persName>
							<affiliation key="aff0">
								<orgName type="institution">Lviv Polytechnic National University</orgName>
								<address>
									<addrLine>12 Bandera street</addrLine>
									<postCode>79013</postCode>
									<settlement>Lviv</settlement>
									<country key="UA">Ukraine</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">Trash detection on the floor with IntelRealSense</title>
					</analytic>
					<monogr>
						<idno type="ISSN">1613-0073</idno>
					</monogr>
					<idno type="MD5">81F4CFCAC8D426B4E41F2DCFCDE57C13</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2025-04-23T16:26+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Trash detection</term>
					<term>computer vision</term>
					<term>trash pick up</term>
					<term>depth camera</term>
					<term>IntelRealSense</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>In our rapidly advancing world, the convenience of modern living comes hand in hand with a growing concern: waste management. As populations surge and urban centers expand, the accumulation of trash has become an ever-looming challenge. The importance of efficient trash detection and pickup systems cannot be overstated in this context. This paper introduces a novel approach for the automatic detection of trash and debris on indoor floors through the utilization of Intel RealSense technology. The system leverages depth-sensing and computer vision capabilities to identify and classify various types of litter, enabling efficient and autonomous cleaning processes in residential and commercial environments. By combining the power of Intel RealSense with advanced machine learning algorithms, this solution demonstrates promising results in improving cleanliness and hygiene, reducing manual labor, and enhancing overall maintenance efficiency.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.">Introduction</head><p>Proper waste management is not just about cleanliness; it is about safeguarding the environment, public health, and the overall well-being of communities. Trash detection and pickup play a pivotal role in preserving the natural beauty of our surroundings, ensuring the cleanliness of our streets, and mitigating the adverse effects of pollution. In this age of environmental awareness, these processes are integral to combating climate change, protecting wildlife, and maintaining the delicate balance of our ecosystems <ref type="bibr" target="#b0">[1,</ref><ref type="bibr" target="#b1">2]</ref>.</p><p>This vital task extends beyond mere aesthetics. Timely and effective trash detection and pickup are fundamental in preventing the spread of diseases, enhancing the quality of air and water, and creating spaces where people can thrive. Moreover, in the era of smart technology, the integration of innovative solutions like sensors, data analytics, and artificial intelligence has revolutionized waste management, making it not only more efficient but also more environmentally friendly <ref type="bibr" target="#b2">[3]</ref>.</p><p>This introduction sets the stage for a deeper exploration into the significance of trash detection and pickup, shedding light on the multifaceted benefits that these processes offer to our communities, environment, and, ultimately, our collective future.</p><p>This paper proposes a method for detecting and picking up trash, designed for autonomous cleaning robots operating indoors. However, this innovative approach extends far beyond its initial scope. 
Its adaptability and potential for wider application suggest a transformative impact on various scenarios, promising a more efficient and comprehensive solution to waste management challenges <ref type="bibr" target="#b3">[4]</ref><ref type="bibr" target="#b4">[5]</ref><ref type="bibr" target="#b5">[6]</ref>.</p><p>Based on the literature search conducted on the topic of the study, eleven articles directly related to the research theme have been selected. They are summarized below in Table <ref type="table" target="#tab_0">1</ref>.</p><p>Objective: The primary objective of this research is to comprehensively investigate and develop strategies aimed at streamlining the cleaning process and elevating the operational efficiency of cleaning robots. This will be achieved through the implementation of advanced techniques such as garbage recognition and the precise identification of pickup points.</p><p>Research Object: The focal point of this study centers on cleaning robots as a critical component of modern cleaning practices. The research aims to delve into their capabilities, performance, and potential in terms of organizing and maintaining cleanliness within indoor spaces.</p><p>Research Subject: The core subject of this investigation pertains to the innovative approach of integrating garbage recognition technology and precise pickup point determination. These innovative methods are expected to serve as a catalyst for improving the overall effectiveness and productivity of cleaning robots, thus addressing the ever-growing demand for efficient and automated cleaning solutions in various indoor environments.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">Methods and materials</head><p>The proposed method consists of several parts; a diagram is provided in Figure <ref type="figure">1</ref>.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Figure 1: Proposed algorithm diagram</head><p>At the inception of the workflow, the system initiates the data collection process, capturing valuable sensor data from IntelRealSense devices. This data serves as a comprehensive representation of the environment, allowing the system to discern its layout and the presence of various objects, with a particular focus on identifying trash items.</p><p>The subsequent phase of the operation involves intricate algorithms designed to not only detect the presence of trash but also classify it into specific categories and pinpoint its precise location on the floor. This multifaceted approach is pivotal for enabling the system to efficiently and accurately recognize various types of waste materials, an essential aspect of any cleaning and waste management process.</p><p>Following this, the data is passed on to a specialized component within the system, tasked with the conversion of the pixel-based position representation into SI units. This conversion serves to standardize the location information, making it universally comprehensible and facilitating seamless integration with other devices and software.</p><p>In the final segment of this complex process, the system utilizes advanced techniques to approximate the shape and attributes of the detected trash items. This analysis is crucial in determining the most suitable points for the robotic gripper to engage with the waste materials. By understanding the intricacies of the trash's structure, the system can make well-informed decisions about the optimal pickup points, ensuring a secure and efficient removal process. This step is instrumental in enhancing the system's overall efficiency and effectiveness in carrying out cleaning and waste management tasks.</p></div>
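The paper does not spell out the pixel-to-SI conversion. A standard approach with a depth camera is pinhole-model deprojection (librealsense exposes this as `rs2_deproject_pixel_to_point`); a minimal sketch, where the intrinsics `fx, fy, cx, cy` in the example call are placeholders rather than values from the paper's camera:

```python
def deproject_pixel(u, v, depth_m, fx, fy, cx, cy):
    """Convert pixel coordinates (u, v) plus a depth reading in meters into
    a 3D point (X, Y, Z) in the camera frame, using the pinhole model
    without lens distortion."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return x, y, depth_m

# Illustrative call with placeholder intrinsics:
x, y, z = deproject_pixel(400, 300, 1.2, fx=615.0, fy=615.0, cx=320.0, cy=240.0)
```

The returned coordinates are in meters, so downstream components can reason about gripper geometry directly.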
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.1">Detection and classification</head><p>The methodology employed in our proposed system leverages the formidable capabilities of YOLOv5 <ref type="bibr" target="#b17">[18]</ref>, a state-of-the-art object detection model, which has been trained on the TACO dataset <ref type="bibr" target="#b18">[19]</ref>. This robust training process equips the model with a deep understanding of a wide array of object classes, allowing it to proficiently recognize and classify various items within its field of view.</p><p>However, when it comes to the crucial task of selecting the appropriate objects for pickup, the system needs to discern whether an object is squishable, solid, or unpickable, as these characteristics significantly impact the grasping and handling strategy. To accommodate this decision-making process, all the classes from the TACO dataset are propagated into a smaller set of overarching categories, which primarily include squishable, solid, and unpickable.</p><p>This strategic classification process is instrumental in ensuring that the robotic system can make informed decisions when it comes to the retrieval of objects, taking into account their physical properties and suitability for pickup. By categorizing objects into these key classes, the system can fine-tune its actions, optimizing its efficiency and effectiveness in handling a diverse range of items in real-world scenarios, making it an invaluable asset in the realm of automation and robotics (see Figure <ref type="figure" target="#fig_0">2</ref>). </p></div>
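The propagation of TACO classes into the overarching categories can be implemented as a simple lookup. The paper does not list the exact label-to-category assignment, so the mapping below is illustrative only:

```python
# Hypothetical grouping of fine-grained TACO-style labels into the three
# pickup categories used by the system; the actual assignment is not
# given in the paper.
CATEGORY_MAP = {
    "Plastic bag & wrapper": "squishable",
    "Paper": "squishable",
    "Plastic bottle": "solid",
    "Metal can": "solid",
    "Broken glass": "unpickable",
}

def to_pickup_category(label: str) -> str:
    # Unknown labels default to "unpickable" so the gripper errs on caution.
    return CATEGORY_MAP.get(label, "unpickable")
```

Defaulting unknown classes to "unpickable" is our assumption; a deployed system might instead flag them for review.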
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.2">Trash region Selection</head><p>After receiving bounding boxes from our recognition model, we extract and focus on the specific regions of interest within the depth image. This process streamlines subsequent analysis, improves computational efficiency, and ensures our algorithms work effectively by concentrating on the most relevant areas. This targeted approach reduces noise, optimizes resources, and enhances the overall performance of our computer vision system.</p></div>
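Extracting the regions of interest from the depth image amounts to cropping each detector box out of the depth array; a minimal sketch (function name and box convention are ours):

```python
import numpy as np

def crop_depth_regions(depth, boxes):
    """Extract a depth-image patch for each detection.

    depth: H x W array of per-pixel depth values.
    boxes: iterable of (x1, y1, x2, y2) bounding boxes in pixel coordinates,
           as produced by the recognition model.
    """
    h, w = depth.shape
    regions = []
    for x1, y1, x2, y2 in boxes:
        # Clamp to the frame so partially out-of-view boxes remain valid.
        x1, y1 = max(0, int(x1)), max(0, int(y1))
        x2, y2 = min(w, int(x2)), min(h, int(y2))
        regions.append(depth[y1:y2, x1:x2])
    return regions
```

All later stages then operate only on these small patches, which is where the noise reduction and computational savings described above come from.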
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.3">Trash modeling</head><p>Trash modeling proceeds in two stages. In the first stage, we split all points from the selected areas into horizontal regions whose heights equal the height of the gripper clamps.</p><p>Then, for each region, we project all points onto the bottom plane of the region and find the minimal bounding box.</p><p>Rotating calipers is a geometric algorithm that can be used to find the minimum bounding box (also known as the minimum-area rectangle) of a set of points in a 2D plane. It works by rotating two perpendicular calipers around the point set. The steps are as follows:</p><p>Find the convex hull of the given set of points. The convex hull is the smallest convex polygon that contains all the input points; we use the Graham scan algorithm.</p><p>Initialize two perpendicular calipers (long and short), with the long caliper aligned with an edge of the convex hull.</p><p>Calculate the area of the rectangle formed by the two calipers. This rectangle is a candidate minimal bounding box.</p><p>Rotate the calipers until the long caliper aligns with the next edge of the convex hull, keeping the two calipers perpendicular throughout. Because the minimum-area enclosing rectangle always has one side collinear with a hull edge, only these edge-aligned positions need to be checked.</p><p>For each position, calculate the area of the rectangle formed by the two calipers; if it is smaller than the current minimum, update the minimum.</p><p>Continue until the calipers have visited every edge of the hull, which corresponds to a half rotation of 180 degrees.</p><p>The position with the smallest rectangle yields the minimal bounding box; its corner coordinates follow from the endpoints and orientation of the two calipers.</p><p>Once the convex hull is computed, the rotating calipers algorithm finds the minimum-area bounding box in O(n) time, where n is the number of points on the convex hull. This method is particularly useful when you need to find the tightest bounding box (see Figure <ref type="figure">3</ref>) for a set of points in 2D space.</p></div>
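The edge-aligned search described above can be sketched compactly. Here Andrew's monotone chain is used in place of Graham scan for brevity (both produce the same hull), and only the minimal area is returned; all names are ours:

```python
import math

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) < 3:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def min_area_rect(points):
    """Minimum-area enclosing rectangle: for each hull edge, project all hull
    points onto that edge and its normal; the optimal rectangle shares a side
    with some hull edge, so the smallest such rectangle is the answer."""
    hull = convex_hull(points)
    best = None
    for i in range(len(hull)):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % len(hull)]
        ex, ey = x2 - x1, y2 - y1
        norm = math.hypot(ex, ey)
        ux, uy = ex / norm, ey / norm                  # unit vector along the edge
        us = [px * ux + py * uy for px, py in hull]    # projections onto the edge
        vs = [-px * uy + py * ux for px, py in hull]   # projections onto the normal
        area = (max(us) - min(us)) * (max(vs) - min(vs))
        if best is None or area < best:
            best = area
    return best
```

For a diamond with vertices (0,0), (2,0), (1,1), (1,-1) this returns 2.0, whereas an axis-aligned box would have area 4, illustrating why the edge-aligned search is worthwhile.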
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Figure 3: Bounding box example</head><p>Then we extend the bounding box back to the selected region height.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.4">Point Selection Algorithm</head><p>After modeling, we obtain stacks of cuboids, which we traverse from top to bottom, checking each cuboid for selection.</p><p>If a cuboid's sides are close enough together to be clamped by the gripper, we proceed to the next cuboid in the stack.</p><p>We check two scenarios: if the cuboid's lower part is bounded by the parallel sides of the cuboid above it, we mark it as pickable.</p><p>If no cuboid satisfies this condition, the last cuboid becomes our pickable option. This algorithm is described below: </p></div>
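One loose reading of the selection rules above can be sketched as follows; the data layout (a list of footprints ordered top to bottom) and the stopping behaviour are our assumptions, not the paper's specification:

```python
def select_pickup_cuboid(cuboids, gripper_max_opening):
    """cuboids: list of (width, depth) footprints of the stacked cuboids,
    ordered from top to bottom. Returns the index of the cuboid chosen as
    the grasp target, or None if nothing is clampable."""
    candidate = None
    for i, (w, d) in enumerate(cuboids):
        # The shorter side of the minimal bounding box is what the gripper
        # clamps, so it must fit inside the gripper opening.
        if min(w, d) <= gripper_max_opening:
            candidate = i          # keep descending: a lower grasp is preferred
        else:
            # A wider cuboid would collide with the gripper jaws, so stop at
            # the last clampable level found above it.
            break
    return candidate
```

Under this reading, the deepest clampable cuboid before the stack widens beyond the gripper opening is selected.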
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.">Experiments</head><p>Since the algorithm consists of two parts, experiments were initially conducted with the recognition algorithm. When it achieved satisfactory accuracy, experiments were then performed with the second algorithm. The recognition algorithm was first investigated using 10% of the data from the TACO dataset. Additionally, a small amount of data collected by the article's authors, amounting to approximately 1% of the TACO dataset, was used for testing. This approach was chosen because the TACO dataset is not very large.</p><p>The model performance mainly depends on correct classification, not on the precision of the bounding box, because the bounding boxes are good enough to exclude noise and unwanted objects from the frame.</p><p>This sequential approach to experimenting and validating the two parts of the algorithm allowed for a more structured and efficient development process, ensuring that the recognition component was working well before moving on to the second part. It helped in identifying potential issues and fine-tuning the algorithm's performance (see Figure <ref type="figure" target="#fig_1">4</ref>). The second part of the discussion focuses on an algorithm designed to streamline the process of selecting a pickup point, which has been rigorously compared with a more straightforward, naive pickup approach. In the naive pickup method, the strategy is simple: reach for the object's center point and attempt to grasp it. However, specific criteria determine whether a pickup is considered successful. If the gripper fails to raise the object and hold it securely for at least three seconds, the attempt is considered unsuccessful. 
This time threshold of three seconds serves as a crucial benchmark for evaluating the efficacy of the naive pickup strategy.</p><p>It's important to note that both of these methods have demonstrated commendable performance when it comes to picking up squishable objects. The testing primarily focused on items such as pieces of paper and plastic bags, where both the algorithmic approach and the naive method exhibited a high degree of reliability and efficiency (see Figure <ref type="figure" target="#fig_2">5</ref>).</p><p>However, the stark contrast between these two pickup methods becomes evident when dealing with solid objects. In such cases, the limitations of the naive pickup method become pronounced. One of the significant shortcomings of the naive approach is its failure to consider the orientation of the object, which results in frequent failures. Solid objects often exhibit irregular shapes and varying orientations, making it a challenging task for the naive method to consistently achieve successful pickups. This limitation underscores the importance of the algorithmic approach, which takes into account various factors, including the object's orientation, in order to increase the success rate of pickups, particularly when dealing with solid objects.</p><p>In summary, while both pickup methods excel at handling squishable objects, the algorithmic approach shines when it comes to solid objects due to its ability to adapt to varying orientations. This contrast highlights the significance of adopting more sophisticated strategies when dealing with complex, non-uniform objects in the realm of pickup and manipulation. The automatic detection of waste on the floor using Intel RealSense technology developed in this paper has important environmental, economic, and social implications:</p><p>1. 
Environmental importance:</p><p>• Recycling efficiency (automatic waste detection can promote efficient collection);</p><p>• Pollution prevention (quick identification).</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">Economic importance:</head><p>• Resource optimization (efficient waste detection);</p><p>• Cost savings (by automating waste detection and collection).</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.">Social importance:</head><p>• Improved hygiene (automatic waste detection);</p><p>• Awareness raising (introduction of waste detection technology can increase public awareness of waste management issues).</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.">Conclusions</head><p>In conclusion, effective trash pickup and detection systems play a pivotal role in addressing the ever-growing global issue of waste management. As our world continues to grapple with the challenges of population growth, urbanization, and environmental degradation, it is imperative that we embrace innovative technologies and strategies to manage our waste more efficiently and sustainably. Trash pickup systems, ranging from traditional municipal services to advanced automated solutions, help keep our communities clean and hygienic, ensuring a healthier and more pleasant living environment. Moreover, by reducing litter and waste in public spaces, they contribute to the preservation of our natural habitats, protecting wildlife and ecosystems from the harmful effects of pollution.</p><p>The integration of detection technologies, such as sensors and machine learning algorithms, into waste management processes offers a promising avenue for improving efficiency and reducing environmental impact. These systems not only enable early detection of overflowing bins, but they also optimize collection routes, reducing fuel consumption and greenhouse gas emissions. This not only benefits the environment but also lowers operational costs for waste management services.</p><p>Furthermore, the use of data analytics and real-time monitoring allows for a more data-driven and responsive approach to waste management. This means that resources can be allocated more efficiently, and waste collection can be tailored to the specific needs of each community. 
Such data-driven systems can even encourage more responsible waste disposal practices among individuals, fostering a culture of sustainability and environmental awareness.</p><p>In summary, the convergence of traditional trash pickup services with cutting-edge detection technologies represents a significant step forward in our ongoing efforts to create cleaner, more sustainable, and environmentally conscious communities. The collective action of governments, businesses, and individuals, in tandem with the advancement of these technologies, will undoubtedly contribute to a brighter and more sustainable future. By harnessing the power of innovation and data-driven decision-making, we can take meaningful strides towards addressing the challenges of waste management in the 21st century and leaving a cleaner, healthier planet for generations to come.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Figure 2 :</head><label>2</label><figDesc>Figure 2: Training process</figDesc><graphic coords="4,152.75,536.64,290.23,182.80" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Figure 4 :</head><label>4</label><figDesc>Figure 4: Model examples</figDesc><graphic coords="7,144.47,72.85,136.00,181.07" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Figure 5 :</head><label>5</label><figDesc>Figure 5: Training comparison</figDesc><graphic coords="8,146.55,72.00,303.40,230.40" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_0"><head>Table 1 Review of existing work</head><label>1</label><figDesc></figDesc><table><row><cell>№</cell><cell>Title</cell><cell>Instrumentation</cell><cell>Task</cell><cell>Year</cell><cell>Reference</cell></row><row><cell>1</cell><cell>A vision-based robotic grasping system using deep learning for garbage sorting</cell><cell>RPN and the VGG-16</cell><cell>Detection with pursuit to detect trash</cell><cell>2017</cell><cell>[7]</cell></row><row><cell>2</cell><cell>A Review of Trash Collecting and Cleaning Robots</cell><cell>Scopus, Google Scholar</cell><cell>Overview of modern solutions for trash detection and cleanup using autonomous robots</cell><cell>2021</cell><cell>[8]</cell></row><row><cell>3</cell><cell>Optimal selective floor cleaning using deep learning algorithms and reconfigurable robot hTetro</cell><cell>CCTV, SORT, SSD</cell><cell>Detection of places best suited for automated trash cleanup</cell><cell>2022</cell><cell>[9]</cell></row><row><cell>4</cell><cell>Design and Implementation of an Artificial Intelligence of Things-Based Autonomous Mobile Robot System for Cleaning Garbage</cell><cell>ROS</cell><cell>Automation and optimization of trash pickup from trash cans</cell><cell>2023</cell><cell>[10]</cell></row><row><cell>5</cell><cell>OATCR: Outdoor Autonomous Trash-Collecting Robot Design Using YOLOv4-Tiny</cell><cell>YOLOv4, Mask-RCNN</cell><cell>Development of an autonomous robot that collects trash</cell><cell>2021</cell><cell>[11]</cell></row><row><cell>6</cell><cell>Autonomous Trash Collector Based on Object Detection Using Deep Neural Network</cell><cell>ROS, RNN, YOLOv4</cell><cell>Development of a small autonomous robot that collects trash</cell><cell>2019</cell><cell>[12]</cell></row><row><cell>7</cell><cell>Deep Learning Based Robot for Automatically Picking Up Garbage on the Grass</cell><cell>ROS</cell><cell>Development and optimization of a trash detection and navigation algorithm for an autonomous robot</cell><cell>2022</cell><cell>[13]</cell></row><row><cell>8</cell><cell>Cascaded Machine-Learning Technique for Debris Classification in Floor-Cleaning Robot Application</cell><cell>CNN, SVM, SSD</cell><cell>Classification for trash detection and checking the ability to pick up trash from the floor</cell><cell>2018</cell><cell>[14]</cell></row><row><cell>9</cell><cell>Deep Learning Based Pavement Inspection Using Self-Reconfigurable Robot</cell><cell>DCNN, MMS</cell><cell>Detection of defects and trash on roads</cell><cell>2021</cell><cell>[15]</cell></row></table></figure>
		</body>
		<back>

			<div type="acknowledgement">
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Acknowledgements</head><p>The authors would like to thank the Armed Forces of Ukraine for providing security to perform this work. This work has become possible only because of the resilience and courage of the Ukrainian Army.</p></div>
			</div>

			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">The Concept of Ecological Balance and Environmental conservation: An Islamic perspective</title>
		<author>
			<persName><roleName>Dr.</roleName><forename type="first">Rafique</forename><surname>Anjum</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Bilal</forename><forename type="middle">Ahmed</forename><surname>Wani</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Suraj Punj Journal For Multidisciplinary Research</title>
		<imprint>
			<biblScope unit="volume">8</biblScope>
			<biblScope unit="issue">12</biblScope>
			<biblScope unit="page" from="45" to="58" />
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">Model of functioning of the centralized wireless information ecosystem focused on multimedia streaming</title>
		<author>
			<persName><forename type="first">V</forename><surname>Kovtun</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Izonin</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Gregus</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.eij.2022.06.009</idno>
	</analytic>
	<monogr>
		<title level="j">Egyptian Informatics Journal</title>
		<imprint>
			<biblScope unit="volume">23</biblScope>
			<biblScope unit="issue">4</biblScope>
			<biblScope unit="page" from="89" to="96" />
			<date type="published" when="2022-12">Dec. 2022</date>
			<publisher>Elsevier BV</publisher>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Predictive Model of Lyme Disease Epidemic Process Using Machine Learning Approach</title>
		<author>
			<persName><forename type="first">D</forename><surname>Chumachenko</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Piletskiy</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Sukhorukova</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Chumachenko</surname></persName>
		</author>
		<idno type="DOI">10.3390/app12094282</idno>
	</analytic>
	<monogr>
		<title level="j">Applied Sciences</title>
		<imprint>
			<biblScope unit="volume">12</biblScope>
			<biblScope unit="issue">9</biblScope>
			<date type="published" when="2022">2022</date>
		</imprint>
	</monogr>
	<note>art</note>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Combining OCR Methods to Improve Handwritten Text Recognition with Low System Technical Requirements</title>
		<author>
			<persName><forename type="first">V</forename><surname>Semkovych</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Shymanskyi</surname></persName>
		</author>
		<idno type="DOI">10.1007/978-3-031-24475-9_56</idno>
	</analytic>
	<monogr>
		<title level="m">Advances in Intelligent Systems, Computer Science and Digital Economics IV</title>
		<title level="s">Lecture Notes on Data Engineering</title>
		<editor>
			<persName><forename type="first">Z</forename><surname>Hu</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">Y</forename><surname>Wang</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">M</forename><surname>He</surname></persName>
		</editor>
		<meeting><address><addrLine>CSDEIS</addrLine></address></meeting>
		<imprint>
			<publisher>Springer</publisher>
			<date type="published" when="2022">2022. 2023</date>
			<biblScope unit="volume">158</biblScope>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">Development of programmable home security using GSM system for early prevention</title>
		<author>
			<persName><forename type="first">Jamil</forename><forename type="middle">Abedalrahim Jamil</forename><surname>Alsayaydeh</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Azwan</forename><surname>Aziz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">I A</forename><surname>Rahman</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Syed</forename><surname>Najib Syed</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Maslan</forename><surname>Salim</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Zikri</forename><surname>Zainon</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Muhammad</forename><surname>Abadi Baharudin</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Inam</forename><surname>Abbasi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Adam</forename><forename type="middle">Wong Yoon</forename><surname>Khang</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">ARPN Journal of Engineering and Applied Sciences</title>
		<imprint>
			<biblScope unit="volume">16</biblScope>
			<biblScope unit="issue">1</biblScope>
			<biblScope unit="page" from="88" to="97" />
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">Development of Security System Using Motion Sensor Powered by RF Energy Harvesting</title>
		<author>
			<persName><forename type="first">W</forename><forename type="middle">A</forename><surname>Indra</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">IEEE Student Conference on Research and Development</title>
				<imprint>
			<date type="published" when="2020">2020</date>
			<biblScope unit="page" from="254" to="258" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">A vision-based robotic grasping system using deep learning for garbage sorting</title>
		<author>
			<persName><forename type="first">C</forename><surname>Zhihong</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Z</forename><surname>Hebin</surname></persName>
		</author>
		<author>
			<persName><forename type="first">W</forename><surname>Yanbo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Binyan</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Yu</surname></persName>
		</author>
		<idno type="DOI">10.23919/ChiCC.2017.8029147</idno>
	</analytic>
	<monogr>
		<title level="m">36th Chinese Control Conference (CCC)</title>
				<meeting><address><addrLine>Dalian, China</addrLine></address></meeting>
		<imprint>
			<date type="published" when="2017">2017</date>
			<biblScope unit="page" from="11223" to="11226" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">A Review of Trash Collecting and Cleaning Robots</title>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">S</forename><surname>Chandra</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Kulshreshtha</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Randhawa</surname></persName>
		</author>
		<idno type="DOI">10.1109/ICRITO51393.2021.9596551</idno>
	</analytic>
	<monogr>
		<title level="m">9th International Conference on Reliability, Infocom Technologies and Optimization (Trends and Future Directions) (ICRITO)</title>
				<meeting><address><addrLine>Noida, India</addrLine></address></meeting>
		<imprint>
			<date type="published" when="2021">2021</date>
			<biblScope unit="page" from="1" to="5" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">Optimal selective floor cleaning using deep learning algorithms and reconfigurable robot hTetro</title>
		<author>
			<persName><forename type="first">B</forename><surname>Ramalingam</surname></persName>
		</author>
		<idno type="DOI">10.1038/s41598-022-19249-7</idno>
	</analytic>
	<monogr>
		<title level="j">Sci Rep</title>
		<imprint>
			<biblScope unit="volume">12</biblScope>
			<biblScope unit="page">15938</biblScope>
			<date type="published" when="2022">2022</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">Design and Implementation of an Artificial Intelligence of Things-Based Autonomous Mobile Robot System for Cleaning Garbage</title>
		<author>
			<persName><forename type="first">L.-B</forename><surname>Chen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">X.-R</forename><surname>Huang</surname></persName>
		</author>
		<idno type="DOI">10.1109/JSEN.2023.3254902</idno>
	</analytic>
	<monogr>
		<title level="j">IEEE Sensors Journal</title>
		<imprint>
			<biblScope unit="volume">23</biblScope>
			<biblScope unit="issue">8</biblScope>
			<biblScope unit="page" from="8909" to="8922" />
			<date type="published" when="2023">2023</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<analytic>
		<title level="a" type="main">Outdoor Autonomous Trash-Collecting Robot Design Using YOLOv4-Tiny</title>
		<author>
			<persName><forename type="first">Medhasvi</forename><surname>Kulshreshtha</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">S</forename><surname>Chandra</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Electronics</title>
		<idno type="ISSN">2079-9292</idno>
		<imprint>
			<date type="published" when="2021">2021</date>
			<biblScope unit="volume">10</biblScope>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b11">
	<analytic>
		<title level="a" type="main">Autonomous Trash Collector Based on Object Detection Using Deep Neural Network</title>
		<author>
			<persName><forename type="first">S</forename><surname>Hossain</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>Debnath</surname></persName>
		</author>
		<idno type="DOI">10.1109/TENCON.2019.8929270</idno>
	</analytic>
	<monogr>
		<title level="m">TENCON 2019 -2019 IEEE Region 10 Conference (TENCON)</title>
				<meeting><address><addrLine>Kochi, India</addrLine></address></meeting>
		<imprint>
			<date type="published" when="2019">2019</date>
			<biblScope unit="page" from="1406" to="1410" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b12">
	<analytic>
		<title level="a" type="main">Deep Learning Based Robot for Automatically Picking Up Garbage on the Grass</title>
		<author>
			<persName><forename type="first">J</forename><surname>Bai</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Lian</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Z</forename><surname>Liu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><surname>Wang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Liu</surname></persName>
		</author>
		<idno type="DOI">10.1109/TCE.2018.2859629</idno>
	</analytic>
	<monogr>
		<title level="j">IEEE Transactions on Consumer Electronics</title>
		<imprint>
			<biblScope unit="volume">64</biblScope>
			<biblScope unit="issue">3</biblScope>
			<biblScope unit="page" from="382" to="389" />
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b13">
	<analytic>
		<title level="a" type="main">Cascaded Machine-Learning Technique for Debris Classification in Floor-Cleaning Robot Application</title>
		<author>
			<persName><forename type="first">B</forename><surname>Ramalingam</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">K</forename><surname>Lakshmanan</surname></persName>
		</author>
		<idno type="DOI">10.3390/app8122649</idno>
	</analytic>
	<monogr>
		<title level="j">Appl. Sci</title>
		<imprint>
			<biblScope unit="volume">8</biblScope>
			<biblScope unit="issue">12</biblScope>
			<biblScope unit="page">2649</biblScope>
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b14">
	<analytic>
		<title level="a" type="main">Deep Learning Based Pavement Inspection Using Self-Reconfigurable Robot</title>
		<author>
			<persName><forename type="first">B</forename><surname>Ramalingam</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">A</forename><surname>Hayat</surname></persName>
		</author>
		<idno type="DOI">10.3390/s21082595</idno>
	</analytic>
	<monogr>
		<title level="j">Sensors</title>
		<imprint>
			<biblScope unit="volume">21</biblScope>
			<biblScope unit="page">2595</biblScope>
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b15">
	<analytic>
		<title level="a" type="main">A cascade ensemble-learning model for the deployment at the edge: case on missing IoT data recovery in environmental monitoring systems</title>
		<author>
			<persName><forename type="first">I</forename><surname>Izonin</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Tkachenko</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Krak</surname></persName>
		</author>
		<author>
			<persName><forename type="first">O</forename><surname>Berezsky</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Shevchuk</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">K</forename><surname>Shandilya</surname></persName>
		</author>
		<idno type="DOI">10.3389/fenvs.2023.1295526</idno>
	</analytic>
	<monogr>
		<title level="j">Front. Environ. Sci</title>
		<imprint>
			<biblScope unit="volume">11</biblScope>
			<biblScope unit="page">1295526</biblScope>
			<date type="published" when="2023">2023</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b16">
	<analytic>
		<title level="a" type="main">An Obstacle-Finding Approach for Autonomous Mobile Robots Using 2D LiDAR Data</title>
		<author>
			<persName><forename type="first">L</forename><surname>Mochurad</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Y</forename><surname>Hladun</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Tkachenko</surname></persName>
		</author>
		<idno type="DOI">10.3390/bdcc7010043</idno>
	</analytic>
	<monogr>
		<title level="j">Big Data and Cognitive Computing</title>
		<imprint>
			<biblScope unit="volume">7</biblScope>
			<biblScope unit="issue">1</biblScope>
			<biblScope unit="page">43</biblScope>
			<date type="published" when="2023">2023</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b17">
	<analytic>
		<title level="a" type="main">Faster and Lightweight: An Improved YOLOv5 Object Detector for Remote Sensing Images</title>
		<author>
			<persName><forename type="first">J</forename><surname>Zhang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Z</forename><surname>Chen</surname></persName>
		</author>
		<idno type="DOI">10.3390/rs15204974</idno>
	</analytic>
	<monogr>
		<title level="j">Remote Sensing</title>
		<imprint>
			<biblScope unit="volume">15</biblScope>
			<biblScope unit="issue">20</biblScope>
			<biblScope unit="page">4974</biblScope>
			<date type="published" when="2023">2023</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b18">
	<monogr>
		<ptr target="http://tacodataset.org/explore" />
		<title level="m">TACO dataset</title>
				<imprint>
			<date type="published" when="2023-11-03">3 November 2023</date>
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
