<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>ProfIT AI</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Trash detection on the floor with Intel RealSense</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Lesia Mochurad</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Bohdan Herasym</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Lviv Polytechnic National University</institution>
          ,
          <addr-line>12 Bandera street, Lviv, 79013</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2023</year>
      </pub-date>
      <volume>3</volume>
      <fpage>20</fpage>
      <lpage>22</lpage>
      <abstract>
        <p>In our rapidly advancing world, the convenience of modern living comes hand in hand with a growing concern: waste management. As populations surge and urban centers expand, the accumulation of trash has become an ever-looming challenge. The importance of efficient trash detection and pickup systems cannot be overstated in this context. This paper introduces a novel approach for the automatic detection of trash and debris on indoor floors through the utilization of Intel RealSense technology. The system leverages depth-sensing and computer vision capabilities to identify and classify various types of litter, enabling efficient and autonomous cleaning processes in residential and commercial environments. By combining the power of Intel RealSense with advanced machine learning algorithms, this solution demonstrates promising results in improving cleanliness and hygiene, reducing manual labor, and enhancing overall maintenance efficiency.</p>
      </abstract>
      <kwd-group>
        <kwd>Trash detection</kwd>
        <kwd>computer vision</kwd>
        <kwd>trash pick up</kwd>
        <kwd>depth camera</kwd>
        <kwd>Intel RealSense</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Proper waste management is not just about cleanliness; it is about safeguarding the
environment, public health, and the overall well-being of communities. Trash detection and
pickup play a pivotal role in preserving the natural beauty of our surroundings, ensuring the
cleanliness of our streets, and mitigating the adverse effects of pollution. In this age of
environmental awareness, these processes are integral to combating climate change, protecting
wildlife, and maintaining the delicate balance of our ecosystems [
        <xref ref-type="bibr" rid="ref1 ref2">1, 2</xref>
        ].
      </p>
      <p>
        This vital task extends beyond mere aesthetics. Timely and effective trash detection and
pickup are fundamental in preventing the spread of diseases, enhancing the quality of air and
water, and creating spaces where people can thrive. Moreover, in the era of smart technology,
the integration of innovative solutions like sensors, data analytics, and artificial intelligence has
revolutionized waste management, making it not only more efficient but also more
environmentally friendly [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ].
      </p>
      <p>This introduction sets the stage for a deeper exploration into the significance of trash
detection and pickup, shedding light on the multifaceted benefits that these processes offer to
our communities, environment, and, ultimately, our collective future.</p>
      <p>
        This paper proposes a method for detecting and picking up trash, designed for autonomous
cleaning robots operating indoors. However, this innovative approach extends far beyond its
initial scope. Its adaptability and potential for wider application suggest a transformative
impact on various scenarios, promising a more efficient and comprehensive solution to waste
management challenges [
        <xref ref-type="bibr" rid="ref4 ref5 ref6">4-6</xref>
        ].
      </p>
      <p>Based on the literature search conducted on the topic of the study, eleven articles directly
related to the research theme have been selected. They are summarized below in Table 1.</p>
      <p>Objective: The primary objective of this research is to comprehensively investigate and
develop strategies aimed at streamlining the cleaning process and elevating the operational
efficiency of cleaning robots. This will be achieved through the implementation of advanced
techniques such as garbage recognition and the precise identification of pickup points.</p>
      <table-wrap id="table-1">
        <label>Table 1</label>
        <caption>
          <p>Summary of related works</p>
        </caption>
        <table>
          <thead>
            <tr><th>Title</th><th>Year</th><th>Methods</th></tr>
          </thead>
          <tbody>
            <tr><td>Cascaded Machine-Learning Technique for Debris Classification in Floor-Cleaning Robot Application</td><td/><td>CNN, SVM, SSD</td></tr>
            <tr><td>Deep Learning Based Pavement Inspection Using Self-Reconfigurable Robot</td><td/><td>DCNN, MMS</td></tr>
            <tr><td>Overview of modern solutions for trash detection and cleanup using autonomous robots</td><td>2021</td><td/></tr>
            <tr><td>Detection of places best suited for trash clean-up automatization</td><td>2022</td><td/></tr>
            <tr><td>Automatization and optimization of trash pick-up from trash can</td><td>2023</td><td/></tr>
            <tr><td>Development of autonomous robot that collects trash</td><td>2021</td><td>Mask-</td></tr>
            <tr><td>Development of tiny autonomous robot that collects trash</td><td>2019</td><td/></tr>
            <tr><td>Development and optimization algorithm for trash detection and navigation for autonomous robot</td><td>2022</td><td/></tr>
            <tr><td>Classification for trash detection and check for ability to pick up trash from floor</td><td>2018</td><td/></tr>
            <tr><td>Detection of defects and trash on road</td><td>2021</td><td/></tr>
          </tbody>
        </table>
      </table-wrap>
      <p>Research Object: The focal point of this study centers on cleaning robots as a critical
component of modern cleaning practices. The research aims to delve into their capabilities,
performance, and potential in terms of organizing and maintaining cleanliness within indoor
spaces.</p>
      <p>Research Subject: The core subject of this investigation pertains to the innovative approach
of integrating garbage recognition technology and precise pickup point determination. These
innovative methods are expected to serve as a catalyst for improving the overall effectiveness
and productivity of cleaning robots, thus addressing the ever-growing demand for efficient and
automated cleaning solutions in various indoor environments.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Methods and materials</title>
      <p>The proposed method consists of several parts; a diagram is provided in Figure 1.</p>
      <p>At the inception of the workflow, the system initiates the data collection process, capturing
valuable sensor data from Intel RealSense devices. This data serves as a comprehensive
representation of the environment, allowing the system to discern its layout and the presence of
various objects, with a particular focus on identifying trash items.</p>
      <p>The subsequent phase of the operation involves intricate algorithms designed to not only
detect the presence of trash but also classify it into specific categories and pinpoint its precise
location on the floor. This multifaceted approach is pivotal for enabling the system to efficiently
and accurately recognize various types of waste materials, an essential aspect of any cleaning
and waste management process.</p>
      <p>Following this, the data is passed on to a specialized component within the system, tasked
with the conversion of the pixel-based position representation into SI units. This conversion
serves to standardize the location information, making it universally comprehensible and
facilitating seamless integration with other devices and software.</p>
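      <p>The pixel-to-SI conversion can be sketched with a standard pinhole-camera deprojection; this is the same math librealsense exposes as rs2_deproject_pixel_to_point, but the intrinsics fx, fy, cx, cy below are placeholder values, not parameters taken from the paper:</p>

```python
def deproject_pixel_to_point(u, v, depth_m, fx, fy, cx, cy):
    """Convert a pixel (u, v) with a depth reading in metres to a 3D point
    (x, y, z) in metres, assuming an undistorted pinhole camera model with
    focal lengths (fx, fy) and principal point (cx, cy) in pixels."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel at the principal point maps straight onto the optical axis:
# deproject_pixel_to_point(320, 240, 1.0, 600.0, 600.0, 320.0, 240.0)
# -> (0.0, 0.0, 1.0)
```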
      <p>In the final segment of this complex process, the system utilizes advanced techniques to
approximate the shape and attributes of the detected trash items. This analysis is crucial in
determining the most suitable points for the robotic gripper to engage with the waste materials.
By understanding the intricacies of the trash's structure, the system can make well-informed
decisions about the optimal pickup points, ensuring a secure and efficient removal process. This
step is instrumental in enhancing the system's overall efficiency and effectiveness in carrying
out cleaning and waste management tasks.</p>
      <sec id="sec-2-1">
        <title>2.1 Detection and classification</title>
        <p>The methodology employed in our proposed system leverages the formidable capabilities of
YOLOv5 [18], a state-of-the-art object detection model, which has been trained on the TACO
dataset [19]. This robust training process equips the model with a deep understanding of a wide
array of object classes, allowing it to proficiently recognize and classify various items within its
field of view.</p>
        <p>However, when it comes to the crucial task of selecting the appropriate objects for pickup,
the system needs to discern whether an object is squishable, solid, or unpickable, as these
characteristics significantly impact the grasping and handling strategy. To accommodate this
decision-making process, all the classes from the TACO dataset are propagated into a smaller set
of overarching categories, which primarily include squishable, solid, and unpickable.</p>
        <p>This strategic classification process is instrumental in ensuring that the robotic system can
make informed decisions when it comes to the retrieval of objects, taking into account their
physical properties and suitability for pickup. By categorizing objects into these key classes, the
system can fine-tune its actions, optimizing its efficiency and effectiveness in handling a diverse
range of items in real-world scenarios, making it an invaluable asset in the realm of automation
and robotics (see Figure 2).</p>
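        <p>A minimal sketch of this class propagation follows; the handful of TACO-style class names and their assignments are illustrative, since the paper does not list the full mapping:</p>

```python
# Hypothetical excerpt of the TACO-class -> pickup-category mapping;
# the real TACO taxonomy contains many more classes.
CATEGORY_MAP = {
    "Plastic film": "squishable",
    "Paper": "squishable",
    "Plastic bottle": "solid",
    "Metal can": "solid",
    "Broken glass": "unpickable",
}

def pickup_category(taco_class: str) -> str:
    # Default unknown classes to "unpickable" so the gripper never
    # attempts to grasp an object it cannot reason about.
    return CATEGORY_MAP.get(taco_class, "unpickable")
```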
      </sec>
      <sec id="sec-2-2">
        <title>2.2 Trash region selection</title>
        <p>After receiving bounding boxes from our recognition model, we extract and focus on the
specific regions of interest within the depth image. This process streamlines subsequent
analysis, improves computational efficiency, and ensures our algorithms work effectively by
concentrating on the most relevant areas. This targeted approach reduces noise, optimizes
resources, and enhances the overall performance of our computer vision system.</p>
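        <p>A minimal sketch of the region extraction, assuming the depth image is a row-major array and the detector returns a bounding box in pixel coordinates:</p>

```python
def crop_depth_roi(depth, bbox):
    """Extract a region of interest from a depth image (list of rows)
    given a bounding box (x1, y1, x2, y2) in pixel coordinates."""
    x1, y1, x2, y2 = bbox
    # Clamp to image bounds so a detector box that overshoots the frame
    # cannot index outside the image.
    h, w = len(depth), len(depth[0])
    x1, y1 = max(0, x1), max(0, y1)
    x2, y2 = min(w, x2), min(h, y2)
    return [row[x1:x2] for row in depth[y1:y2]]
```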
      </sec>
      <sec id="sec-2-3">
        <title>2.3 Trash modeling</title>
        <p>Trash modeling proceeds in two stages. In the first stage, we split all points from the selected
areas into horizontal regions whose heights equal the height of the gripper clamps.</p>
        <p>In the second stage, for each region we project all points onto the bottom of the region and
find the minimal bounding box.</p>
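        <p>The first-stage split can be sketched as follows, assuming points are (x, y, z) tuples in metres with z measured upward from the floor, and an assumed clamp height:</p>

```python
def split_into_height_regions(points, clamp_height):
    """Bin 3D points (x, y, z) into horizontal slabs of thickness
    clamp_height, indexed upward from the floor (z = 0)."""
    regions = {}
    for x, y, z in points:
        idx = int(z // clamp_height)
        regions.setdefault(idx, []).append((x, y, z))
    return regions
```

<p>Each slab's points are then projected to (x, y) before the bounding-box step.</p>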
        <p>Rotating calipers is a geometric algorithm that can be used to find the minimum bounding
box (also known as the minimum area rectangle) of a set of points in a 2D plane. This algorithm
is efficient and works by rotating two perpendicular calipers to envelope the points. Here are
the steps to find the minimal bounding box using the rotating calipers algorithm:</p>
        <p>Find the convex hull of the given set of points. The convex hull is the smallest convex polygon
that contains all the input points. We use the Graham scan algorithm.</p>
        <p>Initialize two calipers (long and short) aligned with two edges of the convex hull. The long
caliper should be aligned with the longest edge of the convex hull, and the short caliper should
be perpendicular to the long caliper.</p>
        <p>Calculate the area of the rectangle formed by the two calipers. This rectangle represents a
potential minimal bounding box.</p>
        <p>Rotate the short caliper counterclockwise by a small angle (e.g., one degree or a smaller
increment) while keeping the long caliper aligned with the longest edge of the convex hull.</p>
        <p>For each rotation of the short caliper, calculate the area of the rectangle formed by the two
calipers and check if it's smaller than the previous minimum area found. If it is, update the
minimum area.</p>
        <p>Continue rotating the short caliper until it has swept through 180 degrees, keeping the long
caliper aligned with the convex hull's longest edge throughout.</p>
        <p>After completing a full rotation, you will have the minimal bounding box with the minimum
area.</p>
        <p>To find the actual coordinates of the minimal bounding box, you can use the endpoints of the
long and short calipers and the orientation of the calipers.</p>
        <p>The rotating calipers algorithm guarantees finding the minimum area bounding box
efficiently with a time complexity of O(n), where n is the number of points in the convex hull.
This method is particularly useful when you need to find the tightest bounding box (see Figure
3) for a set of points in 2D space.</p>
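        <p>The steps above can be condensed into the following sketch. For brevity it uses the common exact variant that tests one orientation per convex-hull edge (sufficient because the minimum-area rectangle always has a side collinear with a hull edge) instead of rotating in fixed one-degree increments, and Andrew's monotone chain instead of Graham scan:</p>

```python
import math

def convex_hull(points):
    # Andrew's monotone chain: returns hull vertices in
    # counter-clockwise order, collinear points removed.
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def min_area_rect(points):
    """Return (area, orientation) of the minimum-area bounding
    rectangle of a 2D point set."""
    hull = convex_hull(points)
    best_area, best_theta = float("inf"), 0.0
    n = len(hull)
    for i in range(n):
        (x1, y1), (x2, y2) = hull[i], hull[(i + 1) % n]
        theta = math.atan2(y2 - y1, x2 - x1)
        # Rotate the hull so this edge is axis-aligned, then take
        # the axis-aligned bounding-box area in that frame.
        c, s = math.cos(-theta), math.sin(-theta)
        xs = [c * x - s * y for x, y in hull]
        ys = [s * x + c * y for x, y in hull]
        area = (max(xs) - min(xs)) * (max(ys) - min(ys))
        if area < best_area:
            best_area, best_theta = area, theta
    return best_area, best_theta
```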
        <p>Then we extend the bounding box back to the selected region height.</p>
      </sec>
      <sec id="sec-2-4">
        <title>2.4 Point Selection Algorithm</title>
        <p>After modeling, we obtain stacks of cuboids, which we traverse from top to bottom, checking
each cuboid for selection.</p>
        <p>If the sides are close enough to be clamped by the gripper, we proceed to the next cuboid.</p>
        <p>We check two scenarios: if the cuboid's lower part is bounded by the parallel sides of the
cuboid above, we mark it as pickable.</p>
        <p>If not, the last cuboid becomes our pickable option. The algorithm is described below:</p>
        <preformat>
cuboidStack
solution = cuboidStack.top
CheckForBestCuboid(cuboidStack, none)

function CheckForBestCuboid(cuboidStack, restrictedArea):
    top = cuboidStack.pop()
    if ValidCuboid(top, restrictedArea):
        solution = top
        restrictedArea += RestrictedArea(top)
        CheckForBestCuboid(cuboidStack, restrictedArea)
    else:
        end

function ValidCuboid(cuboid, restrictedArea):
    if cuboid.sides in restrictedArea:
        return false
    else:
        return true

function RestrictedArea(cuboid):
    width = cuboid.width in restrictedArea
    length = cuboid.length in restrictedArea
    if width &gt; GripperClampSize:
        restrictedArea += cuboid.AreaOutOfWidthSides
    if length &gt; GripperClampSize:
        restrictedArea += cuboid.AreaOutOfLengthSides
        </preformat>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>3. Numerical experiments</title>
      <p>Since the algorithm consists of two parts, experiments were initially conducted with the
recognition algorithm. When it achieved satisfactory accuracy, experiments were then
performed with the second algorithm. Therefore, the recognition algorithm was first
investigated using 10% of the data from the TACO dataset. Additionally, a small amount of data
collected by the article's authors, which accounts for approximately 1% of the TACO dataset,
was used for testing. This approach was chosen because the TACO dataset is not very large.</p>
      <p>The model performance mainly depends on correct classification rather than the precision of
the bounding box, because the bounding box is good enough to exclude noise and unwanted
objects from the frame.</p>
      <p>This sequential approach to experimenting and validating the two parts of the algorithm
allowed for a more structured and efficient development process, ensuring that the recognition
component was working well before moving on to the second part. It helped in identifying
potential issues and fine-tuning the algorithm's performance (see Figure 4).</p>
      <p>The second part of the discussion focuses on an algorithm designed to streamline the process
of selecting a pickup point, which has been rigorously compared with a more straightforward,
naive pickup approach. In the case of the naive pickup method, the primary strategy involves
reaching for the object's center point and attempting to grasp it.</p>
      <p>Now, let's delve into the specifics of these two pickup methods. In the case of the naive
pickup, the objective is clear: reach for the center of the object and grab it. However, there are
specific criteria that determine whether the pickup is considered successful. If the gripper fails
to raise the object and hold it securely for at least three seconds, the attempt is considered
unsuccessful. This time threshold of three seconds serves as a crucial benchmark for evaluating
the efficacy of the naive pickup strategy.</p>
      <p>It's important to note that both of these methods have demonstrated commendable
performance when it comes to picking up squishable objects. The testing primarily focused on
items such as pieces of paper and plastic bags, where both the algorithmic approach and the
naive method exhibited a high degree of reliability and efficiency (see Figure 5).</p>
      <p>However, the stark contrast between these two pickup methods becomes evident when
dealing with solid objects. In such cases, the limitations of the naive pickup method become
pronounced. One of the significant shortcomings of the naive approach is its failure to consider
the orientation of the object, which results in frequent failures. Solid objects often exhibit
irregular shapes and varying orientations, making it a challenging task for the naive method to
consistently achieve successful pickups. This limitation underscores the importance of the
algorithmic approach, which takes into account various factors, including the object's
orientation, in order to increase the success rate of pickups, particularly when dealing with solid
objects.</p>
      <p>In summary, while both pickup methods excel at handling squishable objects, the algorithmic
approach shines when it comes to solid objects due to its ability to adapt to varying
orientations. This contrast highlights the significance of adopting more sophisticated strategies
when dealing with complex, non-uniform objects in the realm of pickup and manipulation.</p>
      <p>The automatic detection of waste on the floor using Intel RealSense technology developed in
this paper has important environmental, economic, and social implications:
1. Environmental importance:
• Recycling efficiency (automatic waste detection can promote efficient collection);
• Pollution prevention (quick identification).
2. Economic importance:
• Resource optimization (efficient waste detection);
• Cost savings (by automating waste detection and collection).
3. Social importance:
• Improved hygiene (automatic waste detection);
• Awareness raising (introduction of waste detection technology can increase public
awareness of waste management issues).</p>
    </sec>
    <sec id="sec-4">
      <title>4. Conclusions</title>
      <p>In conclusion, effective trash pickup and detection systems play a pivotal role in addressing the
ever-growing global issue of waste management. As our world continues to grapple with the
challenges of population growth, urbanization, and environmental degradation, it is imperative
that we embrace innovative technologies and strategies to manage our waste more efficiently
and sustainably.</p>
      <p>Trash pickup systems, ranging from traditional municipal services to advanced automated
solutions, help keep our communities clean and hygienic, ensuring a healthier and more
pleasant living environment. Moreover, by reducing litter and waste in public spaces, they
contribute to the preservation of our natural habitats, protecting wildlife and ecosystems from
the harmful effects of pollution.</p>
      <p>The integration of detection technologies, such as sensors and machine learning algorithms,
into waste management processes offers a promising avenue for improving efficiency and
reducing environmental impact. These systems not only enable early detection of overflowing
bins, but they also optimize collection routes, reducing fuel consumption and greenhouse gas
emissions. This not only benefits the environment but also lowers operational costs for waste
management services.</p>
      <p>Furthermore, the use of data analytics and real-time monitoring allows for a more
data-driven and responsive approach to waste management. This means that resources can be
allocated more efficiently, and waste collection can be tailored to the specific needs of each
community. Such data-driven systems can even encourage more responsible waste disposal
practices among individuals, fostering a culture of sustainability and environmental awareness.</p>
      <p>In summary, the convergence of traditional trash pickup services with cutting-edge detection
technologies represents a significant step forward in our ongoing efforts to create cleaner, more
sustainable, and environmentally conscious communities. The collective action of governments,
businesses, and individuals, in tandem with the advancement of these technologies, will
undoubtedly contribute to a brighter and more sustainable future. By harnessing the power of
innovation and data-driven decision-making, we can take meaningful strides towards
addressing the challenges of waste management in the 21st century and leaving a cleaner,
healthier planet for generations to come.</p>
    </sec>
    <sec id="sec-5">
      <title>Acknowledgements</title>
      <p>The authors would like to thank the Armed Forces of Ukraine for providing security to perform
this work. This work has become possible only because of the resilience and courage of the
Ukrainian Army.</p>
      <p>[11] M. Kulshreshtha, S. S. Chandra, et al., Outdoor Autonomous Trash-Collecting Robot Design
Using YOLOv4-Tiny. Electronics, (2021), 10(18):2079.</p>
      <p>[12] S. Hossain, B. Debnath, et al., Autonomous Trash Collector Based on Object Detection Using
Deep Neural Network, TENCON 2019 - 2019 IEEE Region 10 Conference (TENCON), Kochi,
India, (2019), pp. 1406-1410. doi:10.1109/TENCON.2019.8929270.</p>
      <p>[13] J. Bai, S. Lian, Z. Liu, K. Wang and D. Liu, Deep Learning Based Robot for Automatically
Picking Up Garbage on the Grass, in IEEE Transactions on Consumer Electronics, (2018), vol.
64, no. 3, pp. 382-389. doi:10.1109/TCE.2018.2859629.</p>
      <p>[14] B. Ramalingam, A. K. Lakshmanan, et al., Cascaded Machine-Learning Technique for Debris
Classification in Floor-Cleaning Robot Application. Appl. Sci., (2018), 8, 2649.
doi:10.3390/app8122649.</p>
      <p>[15] B. Ramalingam, A. A. Hayat, et al., Deep Learning Based Pavement Inspection Using
Self-Reconfigurable Robot. Sensors, (2021), 21, 2595. doi:10.3390/s21082595.</p>
      <p>[16] I. Izonin, R. Tkachenko, I. Krak, O. Berezsky, I. Shevchuk, S.K., A cascade ensemble-learning
model for the deployment at the edge: case on missing IoT data recovery in environmental
monitoring systems. Front. Environ. Sci., (2023), 11:1295526. doi:10.3389/fenvs.2023.1295526.</p>
      <p>[17] L. Mochurad, Y. Hladun, R. Tkachenko, An Obstacle-Finding Approach for Autonomous
Mobile Robots Using 2D LiDAR Data. Big Data and Cognitive Computing, (2023), 7(1):43.
doi:10.3390/bdcc7010043.</p>
      <p>[18] J. Zhang, Z. Chen, et al., Faster and Lightweight: An Improved YOLOv5 Object Detector for
Remote Sensing Images. Remote Sensing, (2023), 15(20):4974. doi:10.3390/rs15204974.</p>
      <p>[19] TACO dataset. Available online: http://tacodataset.org/explore (accessed on 3 November 2023).</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>Dr. Rafique</given-names>
            <surname>Anjum</surname>
          </string-name>
          ,
          <article-title>Bilal Ahmed Wani, The Concept of Ecological Balance and Environmental conservation: An Islamic perspective</article-title>
          .
          <source>Suraj Punj Journal For Multidisciplinary Research</source>
          , (
          <year>2020</year>
          ), vol.
          <volume>8</volume>
          , no.
          <issue>12</issue>
          , pp.
          <fpage>45</fpage>
          -
          <lpage>58</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>V.</given-names>
            <surname>Kovtun</surname>
          </string-name>
          ,
          <string-name>
            <surname>I. Izonin</surname>
          </string-name>
          , and
          <string-name>
            <given-names>M.</given-names>
            <surname>Gregus</surname>
          </string-name>
          ,
          <article-title>Model of functioning of the centralized wireless information ecosystem focused on multimedia streaming</article-title>
          ,
          <source>Egyptian Informatics Journal</source>
          , vol.
          <volume>23</volume>
          , no.
          <issue>4</issue>
          .
          Elsevier BV
          , pp.
          <fpage>89</fpage>
          -
          <lpage>96</lpage>
          , Dec. (
          <year>2022</year>
          ). doi:10.1016/j.eij.2022.06.009.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>D.</given-names>
            <surname>Chumachenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Piletskiy</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Sukhorukova</surname>
          </string-name>
          , T. Chumachenko,
          <article-title>Predictive Model of Lyme Disease Epidemic Process Using Machine Learning Approach</article-title>
          ,
          <source>Applied Sciences (Switzerland)</source>
          ,
          <volume>12</volume>
          (
          <issue>9</issue>
          ), art. no.
          <issue>4282</issue>
          ,(
          <year>2022</year>
          ), doi: 10.3390/app12094282.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>V.</given-names>
            <surname>Semkovych</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Shymanskyi</surname>
          </string-name>
          ,
          <article-title>Combining OCR Methods to Improve Handwritten Text Recognition with Low System Technical Requirements</article-title>
          . In: Hu,
          <string-name>
            <given-names>Z.</given-names>
            ,
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            ,
            <surname>He</surname>
          </string-name>
          , M. (eds) Advances
          <source>in Intelligent Systems, Computer Science and Digital Economics IV. CSDEIS 2022. Lecture Notes on Data Engineering and Communications Technologies</source>
          , (
          <year>2023</year>
          ), vol
          <volume>158</volume>
          . Springer, Cham. doi:10.1007/978-3-031-24475-9_56.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>Jamil</given-names>
            <surname>Abedalrahim Jamil Alsayaydeh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Azwan</given-names>
            <surname>Aziz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. I. A.</given-names>
            <surname>Rahman</surname>
          </string-name>
          , Syed Najib Syed Salim, Maslan Zainon, Zikri Abadi Baharudin,
          <article-title>Muhammad Inam Abbasi and Adam Wong Yoon Khang, Development of programmable home security using GSM system for early prevention</article-title>
          ,
          <source>ARPN Journal of Engineering and Applied Sciences</source>
          , (
          <year>2021</year>
          ),
          <volume>16</volume>
          (
          <issue>1</issue>
          ):
          <fpage>88</fpage>
          -
          <lpage>97</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>W.A.</given-names>
            <surname>Indra</surname>
          </string-name>
          et. al.,
          <article-title>Development of Security System Using Motion Sensor Powered by RF Energy Harvesting</article-title>
          .
          <source>2020 IEEE Student Conference on Research and Development</source>
          , (
          <year>2020</year>
          ),
          SCOReD 2020, 9250984, pp.
          <fpage>254</fpage>
          -
          <lpage>258</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>C.</given-names>
            <surname>Zhihong</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Hebin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Yanbo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Binyan</surname>
          </string-name>
          and
          <string-name>
            <given-names>L.</given-names>
            <surname>Yu</surname>
          </string-name>
          ,
          <article-title>A vision-based robotic grasping system using deep learning for garbage sorting, 2017 36th Chinese Control Conference (CCC), Dalian</article-title>
          , China, (
          <year>2017</year>
          ), pp.
          <fpage>11223</fpage>
          -
          <lpage>11226</lpage>
          , doi:10.23919/ChiCC.2017.8029147.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>S. S.</given-names>
            <surname>Chandra</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Kulshreshtha</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Randhawa</surname>
          </string-name>
          ,
          <article-title>A Review of Trash Collecting and Cleaning Robots</article-title>
          ,
          <year>2021</year>
          9th International Conference on Reliability,
          <article-title>Infocom Technologies and Optimization (Trends and Future Directions) (ICRITO), Noida</article-title>
          , India, (
          <year>2021</year>
          ), pp.
          <fpage>1</fpage>
          -
          <lpage>5</lpage>
          , doi:10.1109/ICRITO51393.2021.9596551.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>B.</given-names>
            <surname>Ramalingam</surname>
          </string-name>
          et al.,
          <article-title>Optimal selective floor cleaning using deep learning algorithms</article-title>
          and reconfigurable robot hTetro, (
          <year>2022</year>
          ),
          <source>Sci Rep 12</source>
          ,
          <fpage>15938</fpage>
          . doi.org/10.1038/s41598-022-19249-7.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>L.-B.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.-R.</given-names>
            <surname>Huang</surname>
          </string-name>
          , et. al.,
          <article-title>Design and Implementation of an Artificial Intelligence of Things-Based Autonomous Mobile Robot System for Cleaning Garbage</article-title>
          ,
          <source>in IEEE Sensors Journal</source>
          , (
          <year>2023</year>
          ), vol.
          <volume>23</volume>
          , no.
          <issue>8</issue>
          , pp.
          <fpage>8909</fpage>
          -
          <lpage>8922</lpage>
          . doi:10.1109/JSEN.2023.3254902.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>