<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Run-time Monitoring for Robot-Assisted Surgery</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Kristina Gogoladze</string-name>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Natasha Alechina</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Zhaoyang Jacopo Hu</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Haozheng Xu</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Romy van Jaarsveld</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jelle P. Ruurda</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Imperial College London</institution>
          ,
          <addr-line>London</addr-line>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Open Universiteit</institution>
          ,
          <addr-line>Heerlen</addr-line>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>University Medical Centre Utrecht</institution>
          ,
          <addr-line>Utrecht</addr-line>
        </aff>
        <aff id="aff3">
          <label>3</label>
          <institution>Utrecht University</institution>
          ,
          <addr-line>Utrecht</addr-line>
        </aff>
      </contrib-group>
      <abstract>
<p>We present a proposal to support surgeons during robot-assisted surgery. Our approach is based on run-time monitoring of properties that should hold during the surgery, issuing a warning to the surgeon if they are violated. We outline the general approach, involving properties that require both visual input from the cameras and kinematic information obtained from the robot's manipulators. Then we describe preliminary work on monitoring using purely kinematic information.</p>
      </abstract>
      <kwd-group>
<kwd>Runtime verification</kwd>
        <kwd>Robot-assisted surgery</kwd>
        <kwd>Linear temporal logic</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        Robot-assisted surgery involves the surgeon remotely operating a surgical robot that performs operations
on the patient. The surgeon has a magnified view of the area of surgery. The robot can have multiple
arms and use various tools with great precision. Robot-assisted surgery is increasingly being adopted
as the surgical procedure of choice for a wide range of operations [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Robot-assisted surgery has
several advantages over traditional open surgery, including improved dexterity, tremor filtering,
three-dimensional magnified vision, and smaller incisions. The smaller incisions lead to faster recovery
and better patient outcomes. However, robot-assisted surgery is complex and requires considerable
training to become proficient. For example, Robot-Assisted Minimally Invasive Esophagectomy (RAMIE)
requires over 100 training cases [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Even trained surgeons may still make mistakes or come close to
making a mistake, such as touching or damaging anatomical structures not involved in the operation. The
reasons for this include obscured vision, disorientation due to the highly magnified view, misrecognition
of anatomy, lack of tactile feedback, and fatigue. The challenge of improving surgical performance is
very important [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
      </p>
      <p>This paper is the first step in a project to improve patient outcomes by supporting surgeons during
robot-assisted surgery. We aim to develop a run-time verification system that warns a surgeon if the
robot manipulators are getting too close to, e.g., the aorta, or that the pressure exerted is too high, or
the movement too fast or uneven. In order to do this, we need to be able to specify properties of interest
(such as, ‘the tool should not touch the aorta’) in a precise formal language.</p>
      <p>
        Run-time verification has been successfully used in a wide range of applications, including software
systems, avionics, and autonomous space robotics [
        <xref ref-type="bibr" rid="ref3 ref4 ref5">3, 4, 5</xref>
        ]. There is also work on surgery monitoring
using Siamese neural networks and transformers [
        <xref ref-type="bibr" rid="ref6 ref7">6, 7</xref>
        ]. However, run-time verification of robot-assisted
surgery that involves precise formal guarantees and can be used for any monitorable property has not
previously been attempted, although there is work on verification applied to robotic surgery, notably
[
        <xref ref-type="bibr" rid="ref8 ref9">8, 9</xref>
        ].
      </p>
      <p>
        Run-time monitoring of robot-assisted surgery involves significant challenges, including: the
complexity of the behaviour (of the robot manipulated by a human); the complexity of the environment
(soft tissues in a human body, which move with every breath and heartbeat, and change during surgical
dissection); and the need to devise a specification language that is sufficiently expressive to allow
the statement of medically relevant properties while at the same time admitting efficient verification.
We focus on the case study of Robot-Assisted Minimally Invasive Esophagectomy (RAMIE) [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ], but the
approach will be applicable to other surgical procedures.
      </p>
    </sec>
    <sec id="sec-2">
      <title>2. Properties of interest</title>
      <p>One of the objectives of our research is to identify and describe relevant and specific properties that
need to be monitored for. We describe several types of properties within the RAMIE procedure that are
important to monitor in order to obtain a correct execution of the procedure while minimizing
postoperative complications or injuries. The current list of potential properties of interest has been
composed in consultation with the medical experts participating in the project. In future work, we plan
to evaluate and possibly expand the list through a survey of surgeons training to perform the RAMIE
procedure.</p>
<p>The first kind of property to be checked is avoidance of critical anatomical structures where possible.
During the procedure, many critical anatomical structures are freed from the surrounding tissue that
also serves as their protective layer. Lymph nodes around these structures are removed, both to extract
cancer cells and to obtain a clear view of the diseased parts that need to be removed. Critical anatomical
structures such as the aorta, azygos vein, trachea, and recurrent laryngeal nerve are easily damaged,
which can lead to serious (postoperative) complications and injuries, some even fatal. Therefore, it is
necessary to keep away from critical anatomical structures whenever possible during a particular phase
of the procedure.</p>
<p>Another kind of property to be monitored concerns the motions the surgeon makes during the
procedure. Hasty or hesitant movements can lead to scraping or touching critical anatomical structures.
When a surgeon exhibits such behaviour, alerting them could prompt them to reflect on their execution
and, if necessary, slow down and be more precise. We could monitor for too slow or too fast movements,
and in general for patterns of movement: either compare patterns across different people, or learn the
movement pattern of a given surgeon and compare the current surgery’s movements against it, for
example to determine whether the surgeon is suddenly tired or tense.</p>
      <p>During the surgery, the surgeon should not stop for more than a few seconds. If the surgeon stops, it
may be a sign that something is going wrong. Therefore, ‘stopping for more than X seconds’ is also a
property to be checked.</p>
<p>A very important property associated with the tools in the robot’s hands is whether they are
in the camera view. If the instruments cannot be seen, the patient’s safety cannot be guaranteed. It is
important to clearly and fully see the working field (the area the tools are working in) during surgery.
Sometimes the surgeon may forget to centre the camera view on the working field; when working in the
outer rim of the view, only a smaller area around the working field is visible, which could lead to damaging
surrounding structures. Therefore, properties such as ‘working field in outer rim of video view’ and
‘instrument completely out of view’ are very important to monitor for.</p>
      <p>When performing robot-assisted surgeries, the operating surgeon does not feel resistance if the
instrument touches anything. If the instruments touch each other, there is a possibility that they are
actually pressing against each other with great force, in which case they can slip and cause irreparable
harm to the patient. This is why it is important to always check that instruments are not touching each
other.</p>
      <p>Another possible property to check for is whether suturing is being done in the correct direction;
namely, at one end of the anastomosis (e.g., the esophagus) the needle goes from outermost to innermost
layer; on the other end (e.g., gastric tube) the needle goes from the innermost to the outermost layer.</p>
      <p>Finally, correctly completing surgical phases before going on to the next one is crucial for the correct
execution of the RAMIE procedure. Monitoring the full completion of surgical phases can be helpful to
surgeons as a memory aid during the procedure.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Expressing Properties in Linear Temporal Logic</title>
      <p>
        In run-time verification, properties to monitor are often expressed in Linear Temporal Logic (LTL) [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ].
This formal language is intended for describing constraints on runs of the system (sequences of states
of the system). It can say that some statement φ holds in the Next state, that φ holds Globally (in every
state) and that some statement φ holds until ψ becomes true (φ Until ψ). A system execution is formally
a simple linear trace: a sequence of events that captures the behaviour of the monitored system we are
interested in analysing.
      </p>
      <p>
        We consider a variant of LTL interpreted over finite traces [
        <xref ref-type="bibr" rid="ref11 ref12">11, 12</xref>
], called LTLf. LTLf is formally
defined as follows. Let P be a set of atomic propositions; then the syntax of a formula in LTLf is
φ ::= p | ¬φ | φ ∧ φ | X φ | φ U φ
where p ∈ P. We use the standard abbreviations such as φ ∨ ψ ::= ¬(¬φ ∧ ¬ψ) and φ → ψ ::= ¬φ ∨ ψ,
as well as ♢ φ ::= ⊤ U φ (eventually) and □ φ ::= ¬♢ ¬φ (globally).
      </p>
<p>A state is a set of atomic propositions from P (intuitively, a state is the set of atomic propositions
that are true there). A finite trace τ is a finite sequence of states. We denote by |τ| the length of τ, by
τ[i] the i-th state of τ (counting from 0), and by τ_i the suffix of τ beginning at position i.</p>
<p>The notion of an LTLf formula φ being true at a trace τ at position i, denoted as τ, i |= φ, is
defined inductively as follows:
τ, i |= p if p ∈ τ[i];
τ, i |= ¬φ if τ, i ̸|= φ;
τ, i |= φ ∧ ψ if τ, i |= φ and τ, i |= ψ;
τ, i |= X φ if i + 1 &lt; |τ| and τ, i + 1 |= φ;
τ, i |= φ U ψ if for some j such that i ≤ j &lt; |τ|, we have that τ, j |= ψ, and for every k, i ≤ k &lt; j,
it holds that τ, k |= φ.</p>
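<p>The semantic clauses above translate directly into a recursive evaluator. The following sketch is illustrative only; the tuple encoding of formulas ("true", "atom", "not", "and", "next", "until") is our own choice, not part of the formalism:</p>

```python
# A state is a set of atomic propositions; a trace is a list of states.
# Formulas are nested tuples: ("true",), ("atom", p), ("not", f),
# ("and", f, g), ("next", f), ("until", f, g).

def holds(trace, i, f):
    """Evaluate formula f at position i of a finite trace."""
    op = f[0]
    if op == "true":
        return True
    if op == "atom":
        return f[1] in trace[i]
    if op == "not":
        return not holds(trace, i, f[1])
    if op == "and":
        return holds(trace, i, f[1]) and holds(trace, i, f[2])
    if op == "next":
        # X f requires a successor state on the finite trace
        return len(trace) > i + 1 and holds(trace, i + 1, f[1])
    if op == "until":
        # f U g: g holds at some position j at or after i,
        # and f holds at every position from i strictly before j
        for j in range(i, len(trace)):
            if holds(trace, j, f[2]):
                return True
            if not holds(trace, j, f[1]):
                return False
        return False
    raise ValueError("unknown operator: " + str(op))

def eventually(f):
    return ("until", ("true",), f)          # eventually f is (true Until f)

def globally(f):
    return ("not", eventually(("not", f)))  # globally f is not-eventually-not f
```

For example, holds(trace, 0, globally(("atom", "p"))) checks that p holds in every state of the trace.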
      <p>Below we give some examples of how some properties introduced in the previous section can be
expressed in LTLf.</p>
      <p>
An example of a ‘safe distance to the anatomy’ property is ‘the distance to the aorta should always be
at least d’. This can be expressed in LTLf as □ (distance &gt; d). Note that distance &gt; d is an atomic
proposition. The ability to monitor this property relies crucially on being able to identify anatomical
structures such as the aorta during surgery, and there have been considerable advances in research in
this field, see for example [
        <xref ref-type="bibr" rid="ref13 ref14">13, 14</xref>
        ].
      </p>
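<p>Operationally, a monitor for such a □-property performs a per-state check on the stream of states coming from the robot. A minimal sketch (the field name distance and the threshold d_min are our illustrative choices):</p>

```python
def monitor_distance(states, d_min):
    """Check the property 'globally (distance greater than d_min)' over a
    finite trace; return the index of the first violating state, or None."""
    for i, s in enumerate(states):
        if not (s["distance"] > d_min):
            return i   # violation: warn the surgeon at this state
    return None
```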
<p>A similar property is monitoring for too high a speed of the tool tip: □ (speedToolTip &lt; v).</p>
<p>A related problem is how much ‘slack’ to leave in the safe interval of the variables’ values. To come
back to the example of the distance to the aorta, signalling the violation when the distance is 0 is correct
but not helpful. Setting the threshold to a larger value, e.g. 1 cm, is also not helpful, because it may
be perfectly safe and necessary to position the tool tip at a shorter distance to the aorta. In general, a
good formulation would require taking into consideration the phase of the surgery, the speed and
direction of movement of the tool tip, as well as the distance.</p>
      <p>An additional challenge is in formally defining properties which are to do with human behaviour
rather than the values of physical characteristics of the robotic tools (speed, position, etc.). For example,
properties such as the surgeon’s movements being too hesitant, or movements indicating that the
surgeon is tired. These types of warnings are considered by surgeons to be very useful, and their timing
is less critical than the ‘distance to aorta’ type of properties. However, as far as we know, there is no
work on formal run-time monitoring for this kind of property. For example, one way of specifying
formally the property that ‘the surgeon’s movement is hesitant’ could be stating that the speed of the
tool tip should not go above some threshold v, then below it, and then above it again:</p>
<p>□ ¬((speedToolTip &gt; v) ∧ ♢ ((speedToolTip &lt; v) ∧ ♢ (speedToolTip &gt; v)))</p>
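<p>A monitor for this property can simply track how much of the above/below/above pattern has been observed so far. The sketch below (with the threshold v as a parameter) reports a witness for the negated property, i.e. a hesitant movement:</p>

```python
def is_hesitant(speeds, v):
    """Detect the pattern: speed above v, later below v, later above v again."""
    phase = 0  # 0: waiting for 'above'; 1: waiting for 'below'; 2: waiting for 'above' again
    for s in speeds:
        if phase == 0 and s > v:
            phase = 1
        elif phase == 1 and v > s:
            phase = 2
        elif phase == 2 and s > v:
            return True
    return False
```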
<p>Similarly, properties suggesting that the surgeon is slowing down or is making more aggressive
movements can be specified. The specification of such properties may be specific to a particular surgeon:
it is well known that each surgeon is different, and what is too aggressive for one surgeon may be the
usual manner of operating for another very experienced and highly successful surgeon. This problem is
related to the problem of how to time a warning so that a violation of the property can be prevented;
however, in this case we must decide when the movement has become too slow or too aggressive for
this particular surgeon.</p>
    </sec>
    <sec id="sec-4">
      <title>4. Experiments with kinematic information</title>
      <p>The most successful device used for RAMIE is the da Vinci Surgical System (Intuitive Surgical Inc.,
Sunnyvale, CA). In the mid-2010s, the first generation of the da Vinci robot was taken out of service,
since servicing and supplying it was no longer practical. However, these robotic platforms were
still functional and could be used for research. The development of the da Vinci Research Kit (dVRK)
was started at Johns Hopkins University, and within a few years an active community had gathered, with
more than 30 setups worldwide. The dVRK consists of open-source, custom-built hardware controllers
and software elements that make it possible to program the attached da Vinci arms.</p>
      <p>Considering that robot manufacturing companies do not provide access to robot sensors, our main
methods of obtaining data from tools will be visual and kinematic information. Because we do not have
reliable real-time image segmentation yet, in this paper we will focus on kinematic information from
the Patient Side Manipulators (PSMs) of a dVRK and the properties that can be expressed using only
such information. Using kinematic data we can monitor all properties that do not involve identifying
anatomical structures from visual information. This includes, for example, the following properties:
hesitant movements, too slow or too fast movements, patterns of movements, conflicts of instruments,
instruments in the camera view, and suturing in the wrong direction.</p>
      <p>Let us consider an experiment in runtime monitoring during suturing. The data we get from the
sensors contains a time stamp, positions of the left and right PSMs of the dVRK and of the sigma.7 hand
interfaces which we adopted to control the dVRK, rotation matrices of the PSMs, orientations of the
sigma.7s, translational and rotational velocities of the PSMs and the sigma.7s, and gripper angles of the
PSMs and the sigma.7s.</p>
<p>Since the Cartesian coordinates of each PSM are given with respect to its own base, we place the
PSM origins in a global coordinate frame, using the fact that the end-effectors are at their closest
during the hand-off of the needle to approximately align their locations and distance.</p>
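<p>Once the base offsets have been estimated in this way, mapping a PSM-local point into the shared frame is a simple translation. A sketch (the calibrated offsets are assumptions of this example):</p>

```python
def to_global(point, base_offset):
    """Translate a PSM-local (x, y, z) point into the global frame by
    adding the calibrated offset of that PSM's base."""
    return tuple(p + o for p, o in zip(point, base_offset))
```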
      <p>
Monitoring was performed on two robotic suturing throws (1 throw = 1 stitch). The setup was
similar to [
        <xref ref-type="bibr" rid="ref15">15</xref>
], which comprised the dVRK PSMs integrated with two sigma.7 hand interfaces (Force
Dimension, Nyon, Switzerland) to allow the user to control the robotic platform.
      </p>
<p>We augment the raw data that we get from the sensors: in every state, in addition to storing the x, y
and z coordinates of the tool tip, we also store the coordinate values from the previous state (except in
the very first state, where we can set them to be the same), which we denote by x−, y− and z−. We also
store the velocity for the current state, as well as other parameters we might need for expressing the
properties we are monitoring. In addition to adding the necessary parameters to the states, we remove
all unused variables received from the sensors. Thus, we get the sequence of states we need.</p>
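<p>The augmentation step can be sketched as follows (the dictionary field names are our own; the raw records are assumed to carry the coordinates and velocity):</p>

```python
def augment(raw_states):
    """Keep only the fields used by the monitored properties and add the
    previous-state coordinates x_prev, y_prev, z_prev to every state.
    In the very first state the previous values equal the current ones."""
    out, prev = [], None
    for s in raw_states:
        kept = {"x": s["x"], "y": s["y"], "z": s["z"], "v": s["v"]}
        src = prev if prev is not None else kept
        kept["x_prev"], kept["y_prev"], kept["z_prev"] = src["x"], src["y"], src["z"]
        out.append(kept)
        prev = kept
    return out
```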
      <p>Having this augmented data, the setup is as follows. The sequence of states is generated as above,
with a state for the data received at each time stamp. If we want to monitor, for example, for the correct
direction of suturing, knowing all the coordinates, positions, and speed in the current time point and in
the previous ones, we can tell whether the direction is correct. Let us look at an easy example of simple
continuous suturing. The direction of the stitches here is along the wound. Let y be an axis parallel to
the wound (if the wound is a straight incision), so that the correct direction of suturing is along y. We
also have the y coordinates of the tool tip during suturing. Specifying the property (the correct direction
of suturing), we want to make sure that the difference between the coordinates at subsequent moments
is positive. Assume we are performing suturing and that we have a way of determining when
knot tying happens; then we can express the property in LTLf as follows:</p>
      <p>
(ToolTip_y &gt; ToolTip_y−) U KnotTied,
where ToolTip_y is the y coordinate of the tool tip and ToolTip_y− is the y coordinate of the tool tip at
the previous state. Once the property is described in LTLf, we can use one of the existing software
tools that will monitor this property in real time, for example the one discussed in [
        <xref ref-type="bibr" rid="ref16">16, 17</xref>
        ]1.
      </p>
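<p>On the augmented trace, this until-property can be checked in a single pass: the monitor succeeds as soon as the knot is tied, and fails if the direction is violated before that. A sketch (the boolean field knot_tied is our encoding of the KnotTied proposition):</p>

```python
def check_suture_direction(states):
    """Check '(y greater than y_prev) Until knot_tied' on a finite trace."""
    for s in states:
        if s["knot_tied"]:
            return True              # the until-formula is satisfied
        if not (s["y"] > s["y_prev"]):
            return False             # wrong direction before the knot is tied
    return False                     # the knot was never tied on this finite trace
```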
<p>To check that the surgeon is not stopping for more than n steps, we have to verify that at least one
coordinate is changing every n steps:
□ ¬(ToolTip_y = ToolTip_y− ∧ X (ToolTip_y = ToolTip_y− ∧ . . . ∧ X (ToolTip_y = ToolTip_y−) . . .)),
with n − 1 nested X operators (where, in the full property, each equality is conjoined with the
corresponding equalities for the other coordinates).</p>
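<p>Instead of unfolding the nested X operators, a monitor can equivalently slide a counter over the trace and report a violation once n consecutive states show no change in any coordinate. A sketch over the augmented states:</p>

```python
def stopped_too_long(states, n):
    """Return True iff there are n consecutive states in which none of
    the x, y, z coordinates of the tool tip changes."""
    run = 0
    for s in states:
        unchanged = (s["x"] == s["x_prev"] and s["y"] == s["y_prev"]
                     and s["z"] == s["z_prev"])
        run = run + 1 if unchanged else 0
        if run >= n:
            return True
    return False
```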
      <p>We can also verify that the tools are not outside of the camera view. If for simplicity we assume that
the z axis is perpendicular to the camera view, then we only have to worry about the other two coordinates.
Assume the bottom left corner has coordinates (0, 0) and the top right corner (X_max, Y_max); then the
property of interest looks like this:</p>
      <p>□ (ToolTip_x &gt; 0 ∧ ToolTip_x &lt; X_max ∧ ToolTip_y &gt; 0 ∧ ToolTip_y &lt; Y_max).</p>
      <p>When the  axis is not orthogonal to the camera view, we have to calculate the projections for the
given camera angle.</p>
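<p>The in-view property is again a per-state bounds check; when the camera is not axis-aligned, the same check is applied to the projected coordinates. A sketch (x_max and y_max denote the dimensions of the camera view; the names are ours):</p>

```python
def in_camera_view(state, x_max, y_max):
    """Check that the tool tip lies strictly inside the camera view,
    assuming the z axis is perpendicular to the view."""
    return (state["x"] > 0 and x_max > state["x"]
            and state["y"] > 0 and y_max > state["y"])
```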
    </sec>
    <sec id="sec-5">
      <title>5. Future Work</title>
<p>Addressing the problems of setting the safe interval for the variable values, and of different styles of
different surgeons, is one of the challenges facing our project. We plan to address this challenge by
learning the correct parameters (or the correct automaton corresponding to the property) by using
techniques for learning automata (deterministic finite automata in this case) from data (in our case,
video records of surgeries), see, for example, [18]. We expect that these techniques will enable us to
learn, e.g., an individual surgeon’s ‘normal’ movement and speed characteristics. Once we have the
automaton representing the surgeon’s normal movements, we will attempt to monitor for unusual
movement patterns.</p>
<p>We intend to experiment with different formalisms for stating the properties to be verified and for
synthesising monitors from them. It is accepted in the run-time verification community that expressing
constraints on the history (past) rather than future-directed constraints is often more intuitive and more
efficient to monitor for [19, 20]. We will investigate the past version of LTL and study its computational
properties and suitability for expressing properties related to robot-assisted surgery.</p>
<p>Evaluation of the prototype monitoring system will be carried out by showing the output of the
system to observer surgeons who are not carrying out the surgery but see the same view from
the camera as the operating surgeon, and asking them to rate the usefulness of the warnings.</p>
      <p>Acknowledgements. We gratefully acknowledge funding by NWO project Run-time verification for
robot-assisted surgery, OCENW.M.21.377.</p>
      <p>1. Note that this property is not monitorable in LTL, but is monitorable in LTLf.</p>
      <p>[17] T. Yamaguchi, B. Hoxha, D. Ničković, RTAMT – Runtime Robustness Monitors with Application
to CPS and Robotics, International Journal on Software Tools for Technology Transfer 26 (2024)
79–99. doi:10.1007/s10009-023-00720-3.</p>
      <p>[18] E. Abadi, R. I. Brafman, Learning and solving regular decision processes, in: C. Bessiere (Ed.),
Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence,
IJCAI-20, International Joint Conferences on Artificial Intelligence Organization, 2020, pp. 1948–1954.
doi:10.24963/ijcai.2020/270, main track.</p>
      <p>[19] O. Lichtenstein, A. Pnueli, L. D. Zuck, The glory of the past, in: R. Parikh (Ed.), Logics of Programs,
volume 193 of Lecture Notes in Computer Science, Springer, 1985, pp. 196–218.</p>
      <p>[20] L. Geatti, N. Gigante, A. Montanari, G. Venturato, Past Matters: Supporting LTL+Past in the BLACK
Satisfiability Checker, in: C. Combi, J. Eder, M. Reynolds (Eds.), 28th International Symposium on
Temporal Representation and Reasoning, TIME 2021, September 27-29, 2021, Klagenfurt, Austria,
volume 206 of LIPIcs, Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2021, pp. 8:1–8:17. URL:
https://www.dagstuhl.de/dagpub/978-3-95977-206-8.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>J. P.</given-names>
            <surname>Ruurda</surname>
          </string-name>
          , P. C. van der Sluis, S. van der Horst, R. van Hilllegersberg,
          <article-title>Robot-assisted minimally invasive esophagectomy for esophageal cancer: A systematic review</article-title>
          ,
          <source>J Surg Oncol</source>
          .
          <volume>112</volume>
          (
          <year>2015</year>
          )
          <fpage>257</fpage>
          -
          <lpage>265</lpage>
          . doi:10.1002/jso.23922.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>J. W.</given-names>
            <surname>Suliburk</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Q. M.</given-names>
            <surname>Buck</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C. J.</given-names>
            <surname>Pirko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. N.</given-names>
            <surname>Massarweh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. R.</given-names>
            <surname>Barshes</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Singh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T. K.</given-names>
            <surname>Rosengart</surname>
          </string-name>
          ,
          <article-title>Analysis of human performance deficiencies associated with surgical adverse events</article-title>
          ,
          <source>JAMA Network Open</source>
          <volume>2</volume>
          (
          <year>2019</year>
          ). doi:10.1001/jamanetworkopen.2019.8067.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>E.</given-names>
            <surname>Bartocci</surname>
          </string-name>
          , Y. Falcone (Eds.),
          <source>Lectures on Runtime Verification - Introductory and Advanced Topics</source>
          , volume
          <volume>10457</volume>
          of Lecture Notes in Computer Science, Springer,
          <year>2018</year>
          . URL: https://doi.org/10.1007/978-3-319-75632-5. doi:10.1007/978-3-319-75632-5.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>A.</given-names>
            <surname>Ferrando</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. C.</given-names>
            <surname>Cardoso</surname>
          </string-name>
          , M. Fisher,
          <string-name>
            <given-names>D.</given-names>
            <surname>Ancona</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Franceschini</surname>
          </string-name>
          , V. Mascardi,
          <article-title>ROSMonitoring: A runtime verification framework for ROS</article-title>
          , in: A.
          <string-name>
            <surname>Mohammad</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          <string-name>
            <surname>Dong</surname>
          </string-name>
          , M. Russo (Eds.),
          <source>Towards Autonomous Robotic Systems - 21st Annual Conference, TAROS</source>
          <year>2020</year>
          , volume
          <volume>12228</volume>
          of Lecture Notes in Computer Science, Springer,
          <year>2020</year>
          , pp.
          <fpage>387</fpage>
          -
          <lpage>399</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>B.</given-names>
            <surname>Hertz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Luppen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. Y.</given-names>
            <surname>Rozier</surname>
          </string-name>
          ,
          <article-title>Integrating runtime verification into a sounding rocket control system</article-title>
          , in: A.
          <string-name>
            <surname>Dutle</surname>
          </string-name>
          ,
          <string-name>
            <surname>M. M. Moscato</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          <string-name>
            <surname>Titolo</surname>
            ,
            <given-names>C. A.</given-names>
          </string-name>
          <string-name>
            <surname>Muñoz</surname>
          </string-name>
          , I. Perez (Eds.),
          <source>NASA Formal Methods - 13th International Symposium, NFM</source>
          <year>2021</year>
          ,
          <string-name>
            <given-names>Virtual</given-names>
            <surname>Event</surname>
          </string-name>
          , May
          <volume>24</volume>
          -28,
          <year>2021</year>
          , Proceedings, volume
          <volume>12673</volume>
          of Lecture Notes in Computer Science, Springer,
          <year>2021</year>
          , pp.
          <fpage>151</fpage>
          -
          <lpage>159</lpage>
          . doi:10.1007/978-3-030-76384-8.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>Z.</given-names>
            <surname>Shao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Xu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Stoyanov</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E. B.</given-names>
            <surname>Mazomenos</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Jin</surname>
          </string-name>
          ,
          <article-title>Think step by step: Chain-of-gesture prompting for error detection in robotic surgical videos</article-title>
          ,
          <year>2024</year>
          . URL: https://arxiv.org/abs/2406.19217. arXiv:2406.19217.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>Z.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Hutchinson</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Alemzadeh</surname>
          </string-name>
          ,
          <article-title>Runtime detection of executional errors in robot-assisted surgery</article-title>
          ,
          <source>in: 2022 International Conference on Robotics and Automation (ICRA)</source>
          ,
          <year>2022</year>
          , pp.
          <fpage>3850</fpage>
          -
          <lpage>3856</lpage>
          . doi:10.1109/ICRA46639.2022.9812034.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>D.</given-names>
            <surname>Bresolin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Geretti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Muradore</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Fiorini</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Villa</surname>
          </string-name>
          ,
          <article-title>Formal verification of robotic surgery tasks by reachability analysis</article-title>
          ,
          <source>Microprocess. Microsystems</source>
          <volume>39</volume>
          (
          <year>2015</year>
          )
          <fpage>836</fpage>
          -
          <lpage>842</lpage>
          . URL: https://doi.org/10.1016/j.micpro.2015.10.006. doi:10.1016/j.micpro.2015.10.006.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>D.</given-names>
            <surname>Bresolin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Geretti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Muradore</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Fiorini</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Villa</surname>
          </string-name>
          ,
          <article-title>Formal verification applied to robotic surgery</article-title>
          ,
          <source>Lecture Notes in Control and Information Sciences</source>
          <volume>456</volume>
          (
          <year>2015</year>
          )
          <fpage>347</fpage>
          -
          <lpage>355</lpage>
          . doi:10.1007/978-3-319-10407-2_40.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>A.</given-names>
            <surname>Pnueli</surname>
          </string-name>
          ,
          <article-title>The temporal logic of programs</article-title>
          ,
          <source>in: Proceedings of the 18th Annual Symposium on Foundations of Computer Science (SFCS-77)</source>
          , IEEE,
          <year>1977</year>
          , pp.
          <fpage>46</fpage>
          -
          <lpage>57</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>G.</given-names>
            <surname>De Giacomo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. Y.</given-names>
            <surname>Vardi</surname>
          </string-name>
          ,
          <article-title>Linear temporal logic and linear dynamic logic on finite traces</article-title>
          , in: F. Rossi (Ed.),
          <source>IJCAI 2013, Proceedings of the 23rd International Joint Conference on Artificial Intelligence, Beijing, China, August 3-9, 2013</source>
          , IJCAI/AAAI,
          <year>2013</year>
          , pp.
          <fpage>854</fpage>
          -
          <lpage>860</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>S.</given-names>
            <surname>Cerrito</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. C.</given-names>
            <surname>Mayer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Praud</surname>
          </string-name>
          ,
          <article-title>First order linear temporal logic over finite time structures</article-title>
          , in:
          <string-name>
            <given-names>H.</given-names>
            <surname>Ganzinger</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>McAllester</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Voronkov</surname>
          </string-name>
          (Eds.),
          <source>Logic for Programming and Automated Reasoning</source>
          , Springer Berlin Heidelberg, Berlin, Heidelberg,
          <year>1999</year>
          , pp.
          <fpage>62</fpage>
          -
          <lpage>76</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>A. S.</given-names>
            <surname>Lundervold</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Lundervold</surname>
          </string-name>
          ,
          <article-title>An overview of deep learning in medical imaging focusing on MRI</article-title>
          ,
          <source>Zeitschrift für Medizinische Physik</source>
          <volume>29</volume>
          (
          <year>2019</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>R.</given-names>
            <surname>den Boer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>de Jongh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Huijbers</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Jaspers</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Pluim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>van Hillegersberg</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>van Eijnatten</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Ruurda</surname>
          </string-name>
          ,
          <article-title>Computer-aided anatomy recognition in intrathoracic and -abdominal surgery: a systematic review</article-title>
          ,
          <source>Surgical Endoscopy</source>
          <volume>36</volume>
          (
          <year>2022</year>
          )
          <fpage>8737</fpage>
          -
          <lpage>8752</lpage>
          . doi:10.1007/s00464-022-09421-5.
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <given-names>A.</given-names>
            <surname>Saracino</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Deguet</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Staderini</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. N.</given-names>
            <surname>Boushaki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Cianchi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Menciassi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Sinibaldi</surname>
          </string-name>
          ,
          <article-title>Haptic feedback in the da Vinci Research Kit (dvrk): A user study based on grasping, palpation, and incision tasks</article-title>
          ,
          <source>The International Journal of Medical Robotics and Computer Assisted Surgery</source>
          <volume>15</volume>
          (
          <year>2019</year>
          )
          e1999.
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>L.</given-names>
            <surname>Geatti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Montanari</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Saccomanno</surname>
          </string-name>
          ,
          <article-title>Towards machine learning enhanced LTL monitoring</article-title>
          ,
          <source>in: Proceedings of the 5th Workshop on Artificial Intelligence and Formal Verification, Logic, Automata, and Synthesis, hosted by the 22nd International Conference of the Italian Association for Artificial Intelligence (AIxIA 2023)</source>
          ,
          <year>2023</year>
          , pp.
          <fpage>13</fpage>
          -
          <lpage>19</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>