<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>FAR: A Firefighter Assistant Robot</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Marc Schrön</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Felix Heisel</string-name>
          <email>felix.heisel@informatik.tu-freiberg.de</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Bastian Pfleging</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>TU Bergakademie Freiberg</institution>
          ,
          <addr-line>09599 Freiberg</addr-line>
          ,
          <country country="DE">Germany</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2025</year>
      </pub-date>
      <abstract>
        <p>Incident commanders of fire brigades still need to invest significant manual and mental effort to assess dynamic fire scenes or accidents, coordinate personnel, make critical decisions, and manage risks under extreme conditions. We propose Far, a concept for an interactive firefighting assistant robot that leverages advanced sensing and real-time data analysis to support incident commanders. Besides the overall concept, we outline key use cases such as hazard detection, personnel tracking, automated decision support, and adaptive team coordination, with the goal of identifying challenges in effective human-robot interaction. With this concept paper, we want to spark a discussion on firefighter-centric robotic assistance, emphasizing usability, trust, and seamless integration to improve emergency response.</p>
      </abstract>
      <kwd-group>
        <kwd>Firefighting</kwd>
        <kwd>human-robot interaction</kwd>
        <kwd>robot assistant</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
    </sec>
    <sec id="sec-2">
      <title>2. Related Work</title>
      <p>This section gives an overview of related work in the field of assistants and robots for fire departments and in dangerous contexts. Sensors and related tasks, such as search and rescue, are also discussed.</p>
      <p>CEUR Workshop Proceedings (ISSN 1613-0073)</p>
      <sec id="sec-3-1">
        <title>2.1. Robots in the Field of Fire Fighting</title>
        <p>There are numerous areas where robots are already being used to support firefighters. Bogue [<xref ref-type="bibr" rid="ref1">1</xref>] provides an overview of robots currently available on the market: mainly extinguishing robots for various applications, but also special robots equipped with chainsaws.</p>
        <p>
          Dhiman et al. [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ] have developed a robot that can independently locate (small) fires and fight them with a fire extinguisher. Restrictions due to the use of a small fire extinguisher, which can also only be used once, have been circumvented in other papers [
          <xref ref-type="bibr" rid="ref3 ref4 ref5 ref6">3, 4, 5, 6</xref>
          ], in each case with the compromise that mobility is reduced. The extinguishing robot from Hassanein et al. [
          <xref ref-type="bibr" rid="ref7">7</xref>
          ], for example, is larger but also less flexible. Abu [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ] describes a robot that extends water hoses from a source (water pump) to targets. More recently, researchers and industry have started to look into drones to support firefighters, both with regard to exploration and to extinguishing fires. Alon et al. [
          <xref ref-type="bibr" rid="ref9">9</xref>
          ] assessed, from an HCI perspective, how semi-automated drones can be integrated into fire extinguishing practices.
        </p>
        <p>
          In the field of human search and rescue (SAR), there is a large number of different robots for a wide range of requirements [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ], for example for detecting buried people [
          <xref ref-type="bibr" rid="ref11">11</xref>
          ].
        </p>
        <p>
          In addition to these forms of robots, there are also exploration robots that are primarily based on swarm intelligence [
          <xref ref-type="bibr" rid="ref12 ref13">12, 13</xref>
          ]. Up to now, many projects have looked into using robots or drones for very specialized tasks. In contrast, we aim to create a platform that employs all the different tools and devices to gather an overview of the overall scene and assists the firefighters in coming up with (and executing) the right strategies to resolve a particular incident.
        </p>
      </sec>
      <sec id="sec-3-2">
        <title>2.2. Artificial Intelligence and Interactive Systems for Fire &amp; Rescue Scenarios</title>
        <p>
          Artificial intelligence can be used in many areas of fire protection. Hodges et al. [
          <xref ref-type="bibr" rid="ref14">14</xref>
          ] describe six of them: Actions, Behavior, Decisions, Forecasts, Planning, and Reports. One of the challenges they describe is the lack of sufficient training data.
        </p>
        <p>
          Abdelrahman [
          <xref ref-type="bibr" rid="ref15">15</xref>
          ] describes a procedure for combining data from thermal imaging cameras and radar sensors within extended reality. This work stands out because most AI systems are concerned with prediction, but not with real-time support during the firefighting operation [
          <xref ref-type="bibr" rid="ref16 ref17 ref18">16, 17, 18</xref>
          ].
        </p>
      </sec>
      <sec id="sec-3-3">
        <title>2.3. Other Methods and Tools for Assistance</title>
        <p>In addition to robots, there are many other IT systems designed to support the work of firefighters.</p>
        <p>Scholz et al. [19], for example, have investigated communication during firefighting operations and developed concepts for how it can be improved. One broad field is flying firefighting drones, which currently operate mainly under human control. Boonyard et al. [20] have looked at the user needs for these systems. Another area is the supervision of firefighters. The focus here is on ensuring physical integrity and not on evaluating work results. Parker et al. [21] have developed portable devices to measure this in rural areas.</p>
      </sec>
      <sec id="sec-3-4">
        <title>2.4. Our Own Research Group</title>
        <p>In our own research group, we develop and maintain apps that support firefighters during incidents. The apps comprise different features such as displaying a crowd-sourced map of water collection points (hydrants, lakes, ...), looking up car rescue data sheets based on a car’s license plate, hazmat data sheets, and monitoring firefighters wearing air breathing masks. The apps are actively deployed for all fire brigades in the German states of Saxony, Thuringia, and soon Saxony-Anhalt.</p>
      </sec>
      <sec id="sec-3-5">
        <title>2.5. Summary</title>
        <p>There is a wide range of different robots and systems that could theoretically support firefighters in their daily work. These include, in particular, extinguishing robots for a wide variety of operations. So far, many of these devices are rather single-purpose. Far instead aims at fusing the data of various systems with the goal of assisting the incident commander by providing a better overview of the scene and by improving the decision-making process.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>3. Concept</title>
      <p>While the previously mentioned advances in robotics and drone technology have so far mostly happened in research, such technology is expected to arrive in actual fire departments in the near future. If we assume that fire departments will continue to equip themselves with state-of-the-art technology in the future, we can expect that multiple departments will bring a variety of automated systems from different manufacturers to an operation. This raises the question of how to extract the greatest possible benefit from these costly systems. How can a symbiosis of heterogeneous systems unfold its potential in a hectic emergency situation?</p>
      <p>A key challenge is ensuring that automation supports, rather than overwhelms, human firefighters. We assume that human expertise remains essential, as task performance (human vs. automation/AI vs. collaboration) may differ from task to task [22]. For instance, automated systems may excel in tasks like thermal imaging and hazardous material detection, while human judgment is crucial for complex decision-making.</p>
      <p>We propose a hybrid firefighting team in which firefighters and machines collaborate efficiently. In the course of this collaboration, it must be ensured that humans always remain in control and are not overwhelmed by the operation of the systems. To achieve this, we envision a superordinate system called firefighting assistant robot (Far) that aggregates and processes digital information from any source, such as drone video feeds, sensor data, or radio communication. The goal is to create extended situation awareness and assist the incident commander to maximize operational efficiency and safety. AI-driven analyses will interpret available data in real time, enabling intelligent decision support without burdening firefighters with system management.</p>
      <p>While Far is currently imagined as a humanoid robot located near the commander, we also acknowledge the potential of alternative implementations, such as wearable systems (e.g., AR glasses) or a smart virtual agent, which may prove more practical and acceptable. Future work will compare these formats in terms of usability and effectiveness.</p>
      <sec id="sec-4-1">
        <title>3.1. Key Features</title>
        <p>Far is envisioned to support the areas in which an incident commander commonly has to perform:
• Real-Time Situation Detection: What is the current situation about, and which are the most urgent tasks? Supported by a multitude of sensor devices, robots, and drones, Far is able to create a real-time digital twin of the incident site.
• Adaptive Risk Estimation: Where is the highest risk potential? What could go wrong?
• Tactical Operation Strategies: Which actions should be performed to improve the current situation?
• Assistance in Performing the Tasks: Support the incident commander or firemen in carrying out certain tasks in the physical world.</p>
      </sec>
      <sec id="sec-4-2">
        <title>3.2. Sense-Plan-Act Architecture</title>
        <p>Common practice is to follow the sense-plan-act paradigm: sensors (sense) provide the data, a planning module (plan) determines the next action, and an actuation module (act) carries out the physical action. The following gives an overview of how each stage of the sense-plan-act paradigm can be equipped with useful components in the field.</p>
        <sec id="sec-4-2-1">
          <title>3.2.1. Sensing</title>
          <p>Possible sensors include microphones for monitoring the commander’s environment and direct conversations, digital radio receivers for monitoring remote communication, and various camera systems, including those with optical zoom, 360° capture, and thermal imaging capabilities. LiDAR (Light Detection and Ranging) sensors precisely capture distances and spatial structures. Hyperspectral cameras capture light in many narrow wavelength bands, enabling detailed material analysis. UV cameras operating in the ultraviolet spectrum offer additional possibilities for surface analysis.</p>
          <p>These sensors can be permanently mounted in the fire truck or be available as mobile variants on robots or drones. Multiple mobile sensors can be placed at strategically advantageous locations, similar to how multiple drones can simultaneously provide video streams. Additionally, all firefighters could be equipped with portable cameras on their uniforms or helmets.</p>
        </sec>
        <sec id="sec-4-2-2">
          <title>3.2.2. Planning</title>
          <p>All incoming information from the sensing devices is bundled to create a digital twin of the incident scene. A multimodal sensor fusion AI could take over the analysis and evaluation of this digital twin. Based on models trained, for example, by observing similar operations, the operational situation can be evaluated and a plan or forecast of the next actions can be determined.</p>
        </sec>
        <sec id="sec-4-2-3">
          <title>3.2.3. Acting</title>
          <p>The execution of tasks within the physical domain remains the purview of firefighters. However, there is a growing trend toward automation, with firefighting robots and drones becoming increasingly prevalent. These robots and drones could facilitate tasks such as material transportation and reconnaissance of hazardous environments that are currently inaccessible to humans. For instance, a firefighting drone could assist by executing the initial attack on small, emerging fires in high-rise areas. In addition, the majority of actuators are equipped with sensors that dynamically extend the sensing part.</p>
        </sec>
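        <p>The sense-plan-act loop above, including the dynamic registration of new sensing devices as further units arrive, can be sketched in code. The following Python sketch is a minimal illustration under our own assumptions: all class and method names are hypothetical, and the fusion and planning steps are mere placeholders for the envisioned multimodal sensor fusion AI.</p>

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# Hypothetical sketch of Far's sense-plan-act loop. Sensor readings are
# fused into a digital twin (here: a plain dict); a planning step then
# turns the twin into recommendations for the incident commander.

@dataclass
class DigitalTwin:
    state: Dict[str, object] = field(default_factory=dict)

    def update(self, source: str, reading: object) -> None:
        # Placeholder for the envisioned multimodal sensor fusion.
        self.state[source] = reading

@dataclass
class Far:
    sensors: Dict[str, Callable[[], object]] = field(default_factory=dict)
    twin: DigitalTwin = field(default_factory=DigitalTwin)

    def register_sensor(self, name: str, read: Callable[[], object]) -> None:
        # Arriving units dynamically extend the sensing capabilities.
        self.sensors[name] = read

    def sense(self) -> None:
        for name, read in self.sensors.items():
            self.twin.update(name, read())

    def plan(self) -> List[str]:
        # Placeholder for the AI-based evaluation of the digital twin.
        actions: List[str] = []
        if self.twin.state.get("thermal_cam", 0) > 300:
            actions.append("warn: hot spot detected")
        return actions

far = Far()
far.register_sensor("thermal_cam", lambda: 420)  # simulated reading (kelvin)
far.sense()
print(far.plan())
```

        <p>In a real deployment, the update and plan placeholders would be backed by trained models; the loop structure itself, however, stays the same.</p>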
      </sec>
      <sec id="sec-4-3">
        <title>3.3. Hybrid Team: Firemen + Equipment</title>
        <p>At the scene of an incident, units from various fire departments may come together. Each unit is equipped differently – some specialize in specific types of accidents, while others are generally equipped for firefighting and technical assistance. We assume that these units are equipped with state-of-the-art technology, which means that a wide range of initially undefined technologies converge at the scene. Depending on the type of incident and location, different units are deployed – and they may arrive at different times: units may arrive one after another, and additional units may be requested. The systems brought to the scene therefore do not follow a uniform standard but instead have a dynamic character.</p>
        <p>With the Sense-Plan-Act architecture, it is possible to ensure that Far operates effectively even under these conditions. The technologies of the arriving units dynamically expand the sensing capabilities.</p>
        <sec id="sec-4-3-1">
          <title>3.3.1. Conflict Behavior</title>
          <p>The firefighter acts based on his abilities, knowledge, experience, values, and courage, while the system
operates according to its learned AI-based model. This inevitably leads to conflicts. Even if the system’s
assessment results in better situational awareness than that of the human, there is still a risk of conflict.
The human may not understand why the system arrives at certain conclusions. Such conflicts lead to
reduced trust, negative emotions, a sense of helplessness, and a lack of acceptance. In the following, we
present a concept for mitigating conflicts by explaining our user interface.</p>
        </sec>
      </sec>
      <sec id="sec-4-4">
        <title>3.4. User Interface and Exemplary Scenario</title>
        <p>To put the ideas into practice, we consider the following example scenario: a highway accident involving a truck that carries hazardous materials. The incident commander of the fire department is the first to arrive at the scene, accompanied by Far. The situation is hectic and confusing, which is why the incident commander decides to be supported by the Far system as much as possible upon arrival.</p>
        <p>The user interface of the system accompanies the user in the same order in which the core functions of the system are defined. Since these build on each other, wrong situation monitoring would also lead to wrong subsequent assessments such as risk estimation.</p>
        <sec id="sec-4-4-1">
          <title>3.4.1. Real Time Situation Detection</title>
          <p>Suggestions of recognized sub-incidents are displayed on a screen in a priority-sorted list. This list can change dynamically during the operation. Around the monitor are a few, but physically well-perceivable, hardware buttons, which can be easily and clearly operated even with firefighter gloves. Available buttons include a joystick for navigating the list and marking an entry, a confirmation button for accepting the case, a rejection button for discarding the case, and a restart button.</p>
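          <p>The button-based interaction just described can be illustrated with a small model. This is a hypothetical sketch, not Far’s actual interface code; the names, the example priorities, and the (priority, text) representation are our own assumptions.</p>

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical model of the priority-sorted suggestion list and the four
# hardware controls (joystick, confirm, reject, restart).

@dataclass
class SuggestionList:
    suggestions: List[Tuple[float, str]] = field(default_factory=list)
    cursor: int = 0  # entry currently marked via the joystick

    def add(self, priority: float, text: str) -> None:
        self.suggestions.append((priority, text))
        self.suggestions.sort(reverse=True)  # list reorders dynamically
        self.cursor = min(self.cursor, len(self.suggestions) - 1)

    def navigate(self, step: int) -> None:  # joystick up/down
        self.cursor = max(0, min(self.cursor + step, len(self.suggestions) - 1))

    def confirm(self) -> str:  # accept the marked case
        return self.suggestions[self.cursor][1]

    def reject(self) -> None:  # discard the marked case
        del self.suggestions[self.cursor]
        self.cursor = max(0, self.cursor - 1)

    def restart(self) -> None:  # discard everything captured so far
        self.suggestions.clear()
        self.cursor = 0

ui = SuggestionList()
ui.add(0.4, "vehicle fire")
ui.add(0.9, "hazardous material leak")
print(ui.confirm())  # the highest-priority entry is marked first
```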
          <p>Instead of using a screen with hardware buttons, a more compact variant in the sense of a smart virtual agent with gesture control or similar concepts would also be conceivable. We recommend further work to discuss these options and compare them with the hardware variant in firefighting operations.</p>
          <p>Finally, the simple but particularly important function of the restart button should be outlined: When
pressing the restart button, all previously captured input data from the sensors is discarded. This
means that conversations or radio messages before pressing the button are not incorporated into the
AI model of multi-modal sensor fusion. This function is particularly important in exercises where
various operational scenarios are played out, or when the system fails. The interface concept has
been modeled on recommendation systems in aviation that alert pilots in high-pressure situations and
provide suggestions for action.</p>
        </sec>
        <sec id="sec-4-4-2">
          <title>3.4.2. Adaptive Risk Estimation</title>
          <p>Similar to situation monitoring, there is a list of hazards sorted according to their potential probability of occurrence and severity of impact. Hazards that the commander does not consider relevant in the current operational situation can be manually deselected and thus temporarily deactivated. A hazard can be selected to obtain background or real-time information on it. In the case of context-related hazards, there is an explanation as to why the model classifies them as a hazard. An example would be: “Hazard: further accidents due to moving traffic, as there have been 2 rear-end collisions in comparable accidents on this section of the highway in the last 3 months.” In the case of sensor-based risks, visualization of the sensors detecting the risk is conceivable. An example of this would be: “Danger: possible leakage of hazardous substances at this location of the vehicle.” The visualization shows a video stream based on the evaluation of the zoom camera and hyperspectral camera at the expected point of danger.</p>
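          <p>One possible realization of such a ranking, assuming hazards are scored by probability of occurrence times severity of impact, is sketched below; all names and values are illustrative assumptions rather than part of the Far design.</p>

```python
from dataclasses import dataclass
from typing import List

# Illustrative hazard ranking: hazards are ordered by expected impact
# (probability x severity); the commander can temporarily deactivate
# entries considered irrelevant. All names and values are hypothetical.

@dataclass
class Hazard:
    name: str
    probability: float  # estimated probability of occurrence (0..1)
    severity: float     # estimated severity of impact (0..10)
    active: bool = True

    @property
    def score(self) -> float:
        return self.probability * self.severity

def ranked(hazards: List[Hazard]) -> List[Hazard]:
    # Deactivated hazards are hidden from the commander's list.
    return sorted((h for h in hazards if h.active),
                  key=lambda h: h.score, reverse=True)

hazards = [
    Hazard("further accidents due to moving traffic", 0.6, 7.0),
    Hazard("leakage of hazardous substances", 0.3, 9.0),
    Hazard("vegetation fire on the embankment", 0.2, 3.0),
]
hazards[2].active = False  # commander deselects an irrelevant entry
print([h.name for h in ranked(hazards)])
```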
        </sec>
        <sec id="sec-4-4-3">
          <title>3.4.3. Tactical Operation Strategies</title>
          <p>Based on the previously identified situations and risks, targeted solution strategies can be provided to the user, supporting efficient management of the current situation.</p>
          <p>When using this feature, it is assumed that the user seeks inspiration and wishes to explore the approach proposed by Far for the current situation. In this context, the user is expected to allocate more time for reflection and adjustment compared to other core functions. This additional time is necessary because an AI-proposed operational strategy may initially diverge from the conventional perspective of the incident commander and thus necessitate a discussion. For example, while the incident commander might favor the strategy “contain leaking hazardous material with a mobile collection basin”, the AI might suggest an alternative such as “sealing the damaged area of the tank.” In such a case, the commander could object: “We do not have the materials to seal this area.” The system might then respond: “The RLF 2000 vehicle is stocked with wooden wedges for sealing. My analysis shows that a wedge with a 4 cm diameter would be best suited.”</p>
          <p>This example clearly demonstrates the added value of a voice-based interaction.</p>
          <p>It is essential to evaluate how acceptable and effective it is for the system to mimic human conflict behavior and act as an interactive partner — through the use of persuasion techniques, assertiveness, or negotiation skills to establish a cooperative dialogue with the user.</p>
          <p>This approach also raises important questions regarding social-psychological influencing factors and
the distribution of roles within the hybrid team, particularly concerning who takes the lead in each
situation. What remains crucial is that the system does not make autonomous decisions but serves as an
advisory assistant — comparable to an experienced firefighter who, after decades of service, contributes
his extensive knowledge in an advisory capacity and is open to discussion, but not in an authoritarian
manner.</p>
        </sec>
        <sec id="sec-4-4-4">
          <title>3.4.4. Assistance in Performing the Tasks</title>
          <p>From the solution strategy, individual subtasks might be derivable that can be transferred to autonomous robots, such as blocking off a highway in the described sample scenario. A separate task management view would be useful for such tasks.</p>
        </sec>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>4. Current Status and Discussion</title>
      <p>As we are at the very beginning of conceptualizing and developing our Far system, we are still in the process of reviewing related work and projects, assessing user needs, and developing the overall goals of our envisioned system. Having collected experience with our existing apps for firefighters, we have established a network with local fire academies and the responsible authorities that command and equip fire brigades. As next steps, we aim to create a representative selection of sample scenarios in which our Far system is expected to support incident commanders the most. In collaboration with the different fire academies, we then want to dive deep into selected incident scenarios with the goal of understanding the particular needs in these scenarios, identifying the potential for improvements using Far, and designing and evaluating a first Far prototype.</p>
    </sec>
    <sec id="sec-6">
      <title>5. Conclusion and Future Work</title>
      <p>Our proposed concept aims to enable human-AI collaboration between firefighters/incident commanders and an assisting robot at the incident site. By integrating and fusing the data of a diverse set of sensors and automated systems, we expect to create a digital twin of the incident scene, which then serves as a basis for AI-based tactical recommendations to the commander to improve the overall performance and response at the incident scene.</p>
    </sec>
    <sec id="sec-7">
      <title>6. Declaration on Generative AI</title>
      <p>During the preparation of this work, the authors used ChatGPT and DeepL for text translations, grammar and spelling checks, drafting content (initial versions of individual paragraphs), paraphrasing, and rewording. After using these tools/services, the authors reviewed and edited the content as needed and take full responsibility for the publication’s content.</p>
      <p>[19] M. Scholz, D. Gordon, L. Ramirez, S. Sigg, T. Dyrks, M. Beigl, A concept for support of firefighter frontline communication, Future Internet 5 (2013) 113–127. URL: https://www.mdpi.com/1999-5903/5/2/113. doi:10.3390/fi5020113.</p>
      <p>[20] C. Boonyard, C. Jouffrais, J. R. Cauchard, A. Brock, Firefighting with Drone Assistance: User Needs and Design Considerations for Thailand, in: Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems, ACM, Yokohama, Japan, 2025. URL: https://enac.hal.science/hal-04963852. doi:10.1145/3706598.3714172.</p>
      <p>[21] R. Parker, A. Vitalis, R. Walker, D. Riley, H. G. Pearce, Measuring wildland fire fighter performance with wearable technology, Applied Ergonomics 59 (2017) 34–44. URL: https://www.sciencedirect.com/science/article/pii/S0003687016301715. doi:10.1016/j.apergo.2016.08.018.</p>
      <p>[22] M. Vaccaro, A. Almaatouq, T. Malone, When combinations of humans and AI are useful: A systematic review and meta-analysis, Nature Human Behaviour 8 (2024) 2293–2303. doi:10.1038/s41562-024-02024-1.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>R.</given-names>
            <surname>Bogue</surname>
          </string-name>
          ,
          <article-title>The role of robots in firefighting</article-title>
          , Industrial Robot: the
          <source>international journal of robotics research and application 48</source>
          (
          <year>2021</year>
          )
          <fpage>174</fpage>
          -
          <lpage>178</lpage>
          . URL: https://doi.org/10.1108/IR-10-2020-0222. doi:10.1108/IR-10-2020-0222.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>A.</given-names>
            <surname>Dhiman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Shah</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Adhikari</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Kumbhar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. S.</given-names>
            <surname>Dhanjal</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Mehendale</surname>
          </string-name>
          ,
          <article-title>Firefighting robot with deep learning and machine vision</article-title>
          ,
          <source>Neural Computing and Applications</source>
          <volume>34</volume>
          (
          <year>2022</year>
          )
          <fpage>2831</fpage>
          -
          <lpage>2839</lpage>
          . URL: https://doi.org/10.1007/s00521-021-06537-y. doi:10.1007/s00521-021-06537-y.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>M. M.</given-names>
            <surname>Hussien</surname>
          </string-name>
          ,
          <article-title>Interfacing CMOS camera with ARM microcontroller for small robotic platform</article-title>
          (
          <year>2013</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>P.-H.</given-names>
            <surname>Chang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-H.</given-names>
            <surname>Kang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G. R.</given-names>
            <surname>Cho</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. H.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Jin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. W.</given-names>
            <surname>Jeong</surname>
          </string-name>
          , D. K. Han,
          <string-name>
            <given-names>J. H.</given-names>
            <surname>Jung</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.-J.</given-names>
            <surname>Lee</surname>
          </string-name>
          , et al.,
          <article-title>Control architecture design for a fire searching robot using task oriented design methodology</article-title>
          , in: 2006 SICE-ICASE International Joint Conference, IEEE,
          <year>2006</year>
          , pp.
          <fpage>3126</fpage>
          -
          <lpage>3131</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>P.</given-names>
            <surname>Cattin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Dave</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Grünenfelder</surname>
          </string-name>
          , G. Szekely,
          <string-name>
            <given-names>M.</given-names>
            <surname>Turina</surname>
          </string-name>
          , G. Zünd,
          <article-title>Trajectory of coronary motion and its significance in robotic motion cancellation</article-title>
          ,
          <source>European Journal of Cardio-thoracic Surgery</source>
          <volume>25</volume>
          (
          <year>2004</year>
          )
          <fpage>786</fpage>
          -
          <lpage>790</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>H. J.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. S.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. S.</given-names>
            <surname>Jeong</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. C.</given-names>
            <surname>Shim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. S.</given-names>
            <surname>Cho</surname>
          </string-name>
          ,
          <article-title>Optimal positive end-expiratory pressure during robot-assisted laparoscopic radical prostatectomy</article-title>
          ,
          <source>Korean journal of anesthesiology 65</source>
          (
          <year>2013</year>
          )
          <fpage>244</fpage>
          -
          <lpage>250</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>A.</given-names>
            <surname>Hassanein</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Elhawary</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Jaber</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>El-Abd</surname>
          </string-name>
          ,
          <article-title>An autonomous firefighting robot</article-title>
          ,
          <source>in: 2015 International Conference on Advanced Robotics (ICAR)</source>
          ,
          <year>2015</year>
          , pp.
          <fpage>530</fpage>
          -
          <lpage>535</lpage>
          .
          doi:10.1109/ICAR.2015.7251507.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>U. b. S.</given-names>
            <surname>Abu</surname>
          </string-name>
          ,
          <article-title>Autonomy of Firefighter Robots for Extinguishing Fire in Petrochemical Complexes</article-title>
          ,
          <source>Ph.D. thesis</source>
          , Tohoku University,
          <year>2017</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>O.</given-names>
            <surname>Alon</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Rabinovich</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Fyodorov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. R.</given-names>
            <surname>Cauchard</surname>
          </string-name>
          ,
          <article-title>Drones in firefighting: A user-centered design perspective</article-title>
          ,
          <source>in: Proceedings of the 23rd International Conference on Mobile Human-Computer Interaction</source>
          , MobileHCI '21,
          Association for Computing Machinery, New York, NY, USA,
          <year>2021</year>
          . URL: https://doi.org/10.1145/3447526.3472030. doi:10.1145/3447526.3472030.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>J.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Ma</surname>
          </string-name>
          ,
          <article-title>Current research, key performances and future development of search and rescue robots</article-title>
          ,
          <source>Frontiers of Mechanical Engineering in China</source>
          <volume>2</volume>
          (
          <year>2007</year>
          )
          <fpage>404</fpage>
          -
          <lpage>416</lpage>
          . URL: https://doi.org/10.1007/s11465-007-0070-2. doi:10.1007/s11465-007-0070-2.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>A.</given-names>
            <surname>Davids</surname>
          </string-name>
          ,
          <article-title>Urban search and rescue robots: from tragedy to technology</article-title>
          ,
          <source>IEEE Intelligent Systems</source>
          <volume>17</volume>
          (
          <year>2002</year>
          )
          <fpage>81</fpage>
          -
          <lpage>83</lpage>
          . doi:10.1109/MIS.2002.999224.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>A.</given-names>
            <surname>Marjovi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Marques</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Penders</surname>
          </string-name>
          ,
          <article-title>Guardians robot swarm exploration and firefighter assistance</article-title>
          ,
          <source>in: Workshop on NRS in IEEE/RSJ international conference on Intelligent Robots and Systems (IROS)</source>
          ,
          <year>2009</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>A. M.</given-names>
            <surname>Naghsh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Gancet</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Tanoto</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Roast</surname>
          </string-name>
          ,
          <article-title>Analysis and design of human-robot swarm interaction in firefighting</article-title>
          ,
          <source>in: RO-MAN 2008-The 17th IEEE International Symposium on Robot and Human Interactive Communication</source>
          , IEEE,
          <year>2008</year>
          , pp.
          <fpage>255</fpage>
          -
          <lpage>260</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>J. L.</given-names>
            <surname>Hodges</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B. Y.</given-names>
            <surname>Lattimer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V. L.</given-names>
            <surname>Champlin</surname>
          </string-name>
          ,
          <source>The Role of Artificial Intelligence in Firefighting</source>
          , Springer International Publishing, Cham,
          <year>2022</year>
          , pp.
          <fpage>177</fpage>
          -
          <lpage>203</lpage>
          . URL: https://doi.org/10.1007/978-3-030-98685-8_8. doi:10.1007/978-3-030-98685-8_8.
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Abdelrahman</surname>
          </string-name>
          ,
          <article-title>Thermal Imaging for Amplifying Human Perception</article-title>
          ,
          <source>Ph.D. thesis</source>
          , Universität Stuttgart,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>D. J.</given-names>
            <surname>Garrity</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. A.</given-names>
            <surname>Yusuf</surname>
          </string-name>
          ,
          <article-title>A predictive decision-aid device to warn firefighters of catastrophic temperature increases using an AI-based time-series algorithm</article-title>
          ,
          <source>Safety Science</source>
          <volume>138</volume>
          (
          <year>2021</year>
          )
          <fpage>105237</fpage>
          . URL: https://www.sciencedirect.com/science/article/pii/S0925753521000825. doi:10.1016/j.ssci.2021.105237.
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>H.</given-names>
            <surname>Engelbrecht</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. W.</given-names>
            <surname>Lindeman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Hoermann</surname>
          </string-name>
          ,
          <article-title>A SWOT analysis of the field of virtual reality for firefighter training</article-title>
          ,
          <source>Frontiers in Robotics and AI</source>
          <volume>6</volume>
          (
          <year>2019</year>
          )
          <fpage>101</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [18]
          <string-name>
            <given-names>E. A.</given-names>
            <surname>Yfantis</surname>
          </string-name>
          ,
          <article-title>A UAV with autonomy, pattern recognition for forest fire prevention, and AI for providing advice to firefighters fighting forest fires</article-title>
          ,
          <source>in: 2019 IEEE 9th Annual Computing and Communication Workshop and Conference (CCWC)</source>
          ,
          <year>2019</year>
          , pp.
          <fpage>0409</fpage>
          -
          <lpage>0413</lpage>
          .
          doi:10.1109/CCWC.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>