<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Toward the Design of a Tele-assistance User Interface for Autonomous Vehicles</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Felix Tener</string-name>
          <email>felix.tener@mail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Joel Lanir</string-name>
          <email>ylanir@is.haifa.ac.il</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>University of Haifa</institution>
          ,
          <addr-line>Abba Khoushy Ave 199, Haifa, 3498838</addr-line>
          ,
          <country country="IL">Israel</country>
        </aff>
      </contrib-group>
      <abstract>
<p>Autonomous vehicles (AVs) are rapidly evolving as a novel and disruptive mode of transportation. However, both in industry and academia it is believed that AVs will not be able to resolve every traffic situation autonomously, and therefore remote human intervention will be required. Existing teleoperation methods, however, are extremely challenging, and it is therefore evident that novel remote operation paradigms should evolve. One such paradigm is tele-assistance, which posits that remote operators (ROs) should provide high-level guidance to AVs and delegate low-level controls to automation. Our work explores how to design such a tele-assistance interface. Through interviews with 14 experts in AV teleoperation, we first identify the road scenarios in which AVs will need remote human assistance. Based on these scenarios, we then devise a set of discrete high-level commands through which a remote operator can resolve most road scenarios without manually controlling the AV. Finally, we create a prototype of such an interface.</p>
      </abstract>
      <kwd-group>
<kwd>Human-centered computing</kwd>
        <kwd>human-computer interaction</kwd>
        <kwd>interaction design</kwd>
        <kwd>automobile</kwd>
        <kwd>empirical study</kwd>
        <kwd>interview</kwd>
        <kwd>qualitative methods</kwd>
        <kwd>user-interface design</kwd>
        <kwd>tele-driving</kwd>
<kwd>tele-assistance</kwd>
        <kwd>tele-operation</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
<p>Recent technological advancements, especially in the fields of artificial intelligence and machine learning, are enabling the rapid evolution of autonomous vehicles (AVs) as a novel mode of transportation [1]. An autonomous vehicle (AV) is envisioned to drive itself without any human input, using various sensors to perceive the environment and identify paths and obstacles. However, similarly to other autonomous systems, AVs will most likely also require human monitoring and intervention. Situations such as road construction, a malfunctioning traffic light, or a busy junction might prevent an AV from moving autonomously [2] and can cause disengagements (situations in which the vehicle returns to manual control or the driver feels the need to take back the wheel from the AV decision system) [3]. Therefore, it is widely believed today, both in industry and in academia, that at least in the near and foreseeable future AVs will not be able to resolve all ambiguous traffic situations on their own and remote human assistance will be required [4][5][6][7][8].</p>
<p>A promising approach to resolving these situations and providing an actionable solution for AV disengagements is teleoperation - the operation of a machine from a distance. While teleoperation systems for AVs are already in use and are being developed by various automotive companies [7][9], manually driving a vehicle remotely is an extremely challenging task [10]. For example, since the remote operator (RO) is physically disconnected from the operated AV, she cannot feel the forces applied to the teleoperated vehicle or hear its surrounding sounds. Another example is latency, caused by the large amount of information that must be transmitted from the AV to the RO over the network [11][12].</p>
<p>Currently, there are two major teleoperation paradigms: tele-driving and tele-assistance (Figure 1). In tele-driving, the remote operator continuously operates the AV using a steering wheel and pedals, while in tele-assistance the cooperation between the human (RO) and the machine (AV) happens at the guidance level [13].</p>
<p>Tele-assistance has many advantages. First, remote assistance has the potential to significantly shorten a teleoperation session, because a simple command such as “wait” or “progress slowly” is much quicker to issue than manually driving the car. Second, tele-assistance might improve safety: according to the U.S. Department of Transportation, 94 percent of crashes in the U.S. were caused by human error [14]. Delegating the low-level maneuvers to the AV might therefore significantly improve its safety. Third, guiding AVs using generic commands (instead of using steering and pedals) may allow ROs to control heterogeneous vehicles (with different sizes, widths, etc.) and fleets (private cars, shuttles, trucks, etc.) without the need to develop new mental models when transitioning from one teleoperated vehicle to another [10]. Finally, properly designed tele-assistance user interfaces (UIs) have the potential to reduce the RO’s cognitive load compared to tele-driving interfaces, which have been shown to require a very high level of attention [10].</p>
<p>Bogdoll et al. [4] performed a comprehensive analysis of recent teleoperation methods and created a taxonomy for remote human input systems of AVs. In their review, they highlight remote high-level assistance as a viable solution and one that is already being developed by several companies. In academia, several works have examined path generation as a high-level input method in which the operator “draws” the desired 2D path for the remote vehicle to follow [15][16][17]. Others have started to explore high-level interface commands that can be delegated to the AV [18][19]. However, no research has systematically examined what such a high-level command language should look like, in which cases it should be used, and what its components should consist of. Our work aims to fill this gap and build upon it by designing, implementing, and evaluating a tele-assistance UI.</p>
<p>To create such an interface, we conducted a qualitative study with 14 experts in AV teleoperation with the aim of unveiling and categorizing the various disengagement scenarios. Following the study, and using the insights gained from it, we designed and implemented an initial prototype of a tele-assistance UI.</p>
    </sec>
    <sec id="sec-2">
      <title>2. High-level Concepts</title>
    </sec>
    <sec id="sec-3">
      <title>2.1. Tele-assistance vs. tele-driving</title>
<p>Since teleoperation of AVs is an emerging area of research, there is currently no uniform teleoperation terminology across industry and academia [4]. However, it is possible to divide teleoperation into two major paradigms: tele-driving and tele-assistance (Figure 1). In tele-driving, the remote operator uses a steering wheel and pedals (or other controls such as a joystick) to continuously drive the AV, while in tele-assistance the lower-level maneuvers are delegated to the AV through high-level instructions issued by the RO [13].</p>
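      <p>To make this distinction concrete, the sketch below contrasts the two paradigms at the interface level. It is a minimal illustration under our own assumptions: the message types, field names, and example values are hypothetical placeholders and do not correspond to any particular teleoperation system.</p>
      <preformat>
# Minimal sketch (hypothetical message types): tele-driving streams
# continuous control values, while tele-assistance sends a single discrete,
# high-level instruction that the AV executes on its own.
from dataclasses import dataclass, field

@dataclass
class DrivingControl:              # tele-driving: sent many times per second
    steering_angle_deg: float
    throttle: float                # 0..1
    brake: float                   # 0..1

@dataclass
class AssistanceCommand:           # tele-assistance: sent once per decision
    name: str                      # e.g. "WAIT" or "PROGRESS_SLOWLY"
    params: dict = field(default_factory=dict)

# Tele-driving: a continuous stream of low-level controls.
control_stream = [DrivingControl(2.5, 0.1, 0.0) for _ in range(100)]

# Tele-assistance: one high-level command; the AV plans the maneuver itself.
command = AssistanceCommand("PROGRESS_SLOWLY", {"max_speed_kmh": 10})
      </preformat>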
      <p>As listed earlier, we believe that
teleassistance has many advantages over tele-driving,
especially when envisioning a large-scale
deployment of AVs on public roads. In such a
scenario, several teleoperation centers, with
multiple teleoperation stations each, will be
deployed in a geographic region to support all the
edge-case scenarios that AVs will fail to resolve
autonomously. Every RO in such a center will
have to deal with many disengagements in a single
work shift and therefore an efficient and intuitive
teleoperation user interface (UI) is essential. Our
research aims at investigating how to best design
such a teleassistance interface.</p>
    </sec>
    <sec id="sec-4">
<title>2.2. Disengagements</title>
<p>The Society of Automotive Engineers (SAE, https://www.sae.org/) defined six levels of driving automation: in Level-0 there is no automation at all, while Level-5 vehicles are fully automated. The levels in between have partial automation capabilities. In vehicles with a safety driver (Level-1 to Level-3), a disengagement is a situation in which the AV returns to manual control or the driver feels the need to take back the wheel from the AV decision system. However, when discussing Level-4 and Level-5 automation, we refer to a situation in which the AV hands control over to an RO, who might be located miles away from the scene. Several academic works [21]–[23] investigated the reasons for such disengagements using quantitative methods applied to California’s DMV reports. Dixit et al. [23] strive to provide fundamental insights into trust and reaction times in disengagements, Favaro et al. [22] aim to improve the testing and deployment regulations for AVs on public roads, and Lv et al. [21] try to improve automation technologies. However, none of these studies addressed remote disengagements (i.e., disengagements without a person in the vehicle). Unlike previous studies, we take a User-Centered Design (UCD) [24] approach and use qualitative analysis to look at disengagements with AVs and the ways to address them through tele-assistance.</p>
    </sec>
    <sec id="sec-5">
      <title>3. Current research</title>
    </sec>
    <sec id="sec-6">
<title>3.1. Unveiling disengagement use cases</title>
<p>With the purpose of automating the driver’s actions and delegating low-level driving controls to the AV itself, we aim to define a discrete, finite, and generic command language that can be used by tele-assistants in cases of disengagement, when the vehicle’s decision system needs human support. The first step in doing so is to investigate in which use cases AVs will fail to deal with the remote situation autonomously. To do so, we conducted in-depth semi-structured interviews with 14 experts from leading automotive companies, innovation centers of well-known automotive corporations, cutting-edge start-ups in the AV teleoperation field, and academia, with an average of 20.3 years of experience in these fields. We used Thematic Analysis [25] to analyze and categorize the data and came up with eight main categories in which remote human intervention would be required. Each category included between 3 and 6 specific sub-categories. Table 1 presents these results.</p>
      <p>Based on the above interviews and findings, we conducted several brainstorming sessions within our design team and came up with a list of possible high-level commands that might help ROs resolve the above scenarios by delegating low-level controls to the AV. Table 2 summarizes our suggestions.</p>
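      <p>The sketch below illustrates the general shape such a discrete command language could take in software. The command names and the packaging function are hypothetical placeholders for illustration only; the actual command set is the one summarized in Table 2.</p>
      <preformat>
# A minimal sketch of how a discrete, finite command language could be
# represented. The command names below are illustrative placeholders only;
# the actual set of commands is the one summarized in Table 2.
from enum import Enum, auto

class HighLevelCommand(Enum):
    WAIT = auto()                 # hold position until released
    PROGRESS_SLOWLY = auto()      # creep forward at reduced speed
    FOLLOW_PLOTTED_PATH = auto()  # follow a path drawn by the RO
    IGNORE_OBJECT = auto()        # treat a flagged object as non-blocking
    PULL_OVER = auto()            # stop at the nearest safe spot

def issue(command: HighLevelCommand, av_id: str) -> dict:
    """Package a command for transmission; the AV itself performs the
    low-level maneuvers needed to carry it out."""
    return {"av_id": av_id, "command": command.name}

# Example: a single button click in the UI translates into one message.
message = issue(HighLevelCommand.PROGRESS_SLOWLY, av_id="shuttle-07")
      </preformat>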
<p>Defining the above commands was a necessary step in the Research Through Design [28] process we follow. In addition, we performed an in-depth investigation of two additional aspects of the future interface: (1) the perspective of the video feed(s) necessary to increase the RO’s situation awareness (SA), and (2) the necessary UI interactions and affordances. We reviewed 24 interfaces of teleoperation companies and performed a competitive analysis of 10 UIs from that list. In addition, we reviewed academic works that focus on various interaction paradigms [29]–[33]. Following this analysis, we defined the desired perspective of the video feed visible to the RO to be 5 meters behind and 5 meters above the teleoperated AV (Figure 2). Additionally, we defined the following interaction paradigms to be part of the designed UI: (1) discrete high-level commands issued via button clicks, (2) path plotting, (3) adding data to unrecognized objects in the remote scene, and (4) selecting AI-suggested options.</p>
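      <p>As a rough illustration of how these four paradigms could surface in the UI layer, the sketch below maps each one to a hypothetical handler that packages the RO’s input for the AV. All function names and payload fields are our own assumptions, not part of the prototype’s implementation.</p>
      <preformat>
# A minimal sketch of the four interaction paradigms listed above. All names
# and structures are hypothetical; they only illustrate the kinds of payloads
# each paradigm produces.
from typing import List, Tuple

def on_command_button(command_name: str) -> dict:
    # (1) Discrete high-level command issued via a button click.
    return {"type": "command", "name": command_name}

def on_path_plotted(waypoints: List[Tuple[float, float]]) -> dict:
    # (2) Path plotting: the RO draws a 2D path for the AV to follow.
    return {"type": "path", "waypoints": waypoints}

def on_object_annotated(object_id: str, label: str) -> dict:
    # (3) Adding data to an unrecognized object in the remote scene.
    return {"type": "annotation", "object_id": object_id, "label": label}

def on_suggestion_selected(option_id: int) -> dict:
    # (4) Selecting one of the AI-suggested resolution options.
    return {"type": "suggestion", "option_id": option_id}
      </preformat>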
<p>After formulating the above, we designed a high-fidelity interactive tele-assistance prototype (Figure 2), which incorporates all the above insights into one coherent solution. In particular, we used screenshots from Cognata’s (https://www.cognata.com/) simulation platform to imitate the AV’s environment.</p>
    </sec>
    <sec id="sec-7">
      <title>4. Future work</title>
<p>After completing the tele-assistance UI design, we plan to perform usability testing and an evaluation of the interface with expert teleoperators in order to assess (1) the general screen taxonomy, (2) navigation flows, (3) the necessity and location of various UI elements, (4) interaction paradigms, (5) affordances, and (6) the importance of the video feed perspective to the RO’s SA.</p>
<p>Next, we plan to implement the above UI with the help of the aforementioned simulation platform and to measure the RO’s cognitive load, situation awareness, task performance, and overall system usability, comparing it to a tele-driving interface. We believe that such quantitative measurements, along with qualitative insights, will help us understand whether the tele-assistance paradigm can substitute for tele-driving in the majority of disengagement scenarios.</p>
    </sec>
    <sec id="sec-8">
      <title>5. Acknowledgements</title>
<p>This work was supported by the Israeli Innovation Authority, the IDIT PhD fellowship, the Israeli Smart Transportation Research Center, and the Israeli Ministry of Aliyah and Integration. We also wish to thank DriveU and Cognata for their collaboration and, specifically, Eli Shapira for his help throughout this work.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <title>6. References</title>
      <ref id="ref1">
        <mixed-citation>vol. 77, pp. 167-181, 2015, doi: 10.1016/j.tra.2015.04.003.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>V. V. Dixit, S. Chand, and D. J. Nair, “Autonomous vehicles: Disengagements, accidents and reaction times,” PLoS One, vol. 11, no. 12, pp. 1-14, 2016, doi: 10.1371/journal.pone.0168054.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name>
            <given-names>Mario</given-names>
            <surname>Herger</surname>
          </string-name>
          , “
          <source>2021 Disengagement Report from California</source>
          ,”
          <year>2022</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          https://thelastdriverlicenseholder.com/
          <year>2022</year>
          /02/09/2021- disengagement
          <string-name>
            <surname>-</surname>
          </string-name>
          report
          <string-name>
            <surname>-</surname>
            from-california/ D. Bogdoll,
            <given-names>S.</given-names>
          </string-name>
          <string-name>
            <surname>Orf</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          <string-name>
            <surname>Töttel</surname>
            , and
            <given-names>J. M.</given-names>
          </string-name>
          <string-name>
            <surname>Zöllner</surname>
          </string-name>
          , “
          <article-title>Taxonomy and Survey on Remote Human Input Systems for Driving Automation Systems</article-title>
          ,”
          <year>2021</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>N. J. Cooke, “Human Factors of Remotely Operated Vehicles,” Hum Factors, pp. 166-169, 2006.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>N. Goodall, “Non-technological challenges for the remote operation of automated vehicles,” Transp Res Part A Policy Pract, vol. 142, no. March, pp. 14-26, 2020, doi: 10.1016/j.tra.2020.09.024.</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>C. Mutzenich, S. Durant, S. Helman, and P. Dalton, “Updating our understanding of situation awareness in relation to remote operators of autonomous vehicles,” Cogn Res Princ Implic, vol. 6, no. 1, pp. 1-17, 2021.</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          <string-name>
            <surname>SAE</surname>
          </string-name>
          , “
          <article-title>Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles</article-title>
          ,”
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <string-name>
            <surname>GreyB</surname>
          </string-name>
          , “
          <article-title>Top 30 Self Driving Technology</article-title>
          and Car Companies,”
          <year>2021</year>
          . https://www.greyb.com/autonomousvehicle-companies/#
          <string-name>
            <given-names>F.</given-names>
            <surname>Tener</surname>
          </string-name>
          and
          <string-name>
            <given-names>J.</given-names>
            <surname>Lanir</surname>
          </string-name>
          , “
          <article-title>Driving from a Distance: Challenges and Guidelines for Autonomous Vehicle Teleoperation Interfaces</article-title>
          ,” pp.
          <fpage>1</fpage>
          -
          <lpage>13</lpage>
          ,
          <year>2022</year>
          , doi: 10.1145/3491102.3501827.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>J. M. Georg and F. Diermeyer, “An adaptable and immersive real time interface for resolving system limitations of automated vehicles with teleoperation,” Conf Proc IEEE Int Conf Syst Man Cybern, vol. 2019-October, pp. 2659-2664, 2019, doi: 10.1109/SMC.2019.8914306.</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>T. Zhang, “Toward Automated Vehicle Teleoperation: Vision, Opportunities, and Challenges,” IEEE Internet Things J, vol. 7, no. 12, pp. 11347-11354, 2020, doi: 10.1109/JIOT.2020.3028766.</mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          <string-name>
            <surname>Bruder</surname>
          </string-name>
          , “
          <article-title>Towards cooperative guidance and control of highly automated vehicles: H-</article-title>
          <string-name>
            <surname>Mode and</surname>
          </string-name>
          Conduct-byWire,” Ergonomics, vol.
          <volume>57</volume>
          , no.
          <issue>3</issue>
          . Taylor &amp; Francis, pp.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          343-
          <fpage>360</fpage>
          ,
          <year>2014</year>
          . doi:
          <volume>10</volume>
          .1080/00140139.
          <year>2013</year>
          .
          <volume>869355</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          <string-name>
            <given-names>R.</given-names>
            <surname>Hussain</surname>
          </string-name>
          and
          <string-name>
            <given-names>S.</given-names>
            <surname>Zeadally</surname>
          </string-name>
          , “Autonomous Cars: Research Results, Issues, and Future Challenges,
          <source>” IEEE Communications Surveys and Tutorials</source>
          , vol.
          <volume>21</volume>
          , no.
          <issue>2</issue>
          , pp.
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          1275-
          <fpage>1313</fpage>
          ,
          <year>2019</year>
          , doi: 10.1109/COMST.
          <year>2018</year>
          .
          <volume>2869360</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>M. Fennel, A. Zea, and U. D. Hanebeck, “Haptic-guided path generation for remote car-like vehicles,” IEEE Robot Autom Lett, vol. 6, no. 2, pp. 4088-4095, 2021, doi: 10.1109/LRA.2021.3067846.</mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          <string-name>
            <given-names>J. S.</given-names>
            <surname>Kay</surname>
          </string-name>
          , “
          <article-title>STRIPE: remote driving using limited image data,”</article-title>
          <source>Conference on Human Factors in Computing Systems - Proceedings</source>
          , vol.
          <volume>2</volume>
          , pp.
          <fpage>107</fpage>
          -
          <lpage>108</lpage>
          ,
          <year>1995</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>D. Schitz, G. Graf, D. Rieth, and H. Aschemann, “Corridor-Based Shared Autonomy for Teleoperated Driving,” IFAC-PapersOnLine, vol. 53, no. 2, pp. 15368-15373, 2020, doi: 10.1016/j.ifacol.2020.12.2351.</mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          <string-name>
            <surname>Bruder</surname>
          </string-name>
          , “
          <article-title>Towards cooperative guidance and control of highly automated vehicles: H-</article-title>
          <string-name>
            <surname>Mode and</surname>
          </string-name>
          Conduct-byWire,” Ergonomics, vol.
          <volume>57</volume>
          , no.
          <issue>3</issue>
          . Taylor &amp; Francis, pp.
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          343-
          <fpage>360</fpage>
          ,
          <year>2014</year>
          . doi:
          <volume>10</volume>
          .1080/00140139.
          <year>2013</year>
          .
          <volume>869355</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          <string-name>
            <given-names>C.</given-names>
            <surname>Kettwich</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Schrank</surname>
          </string-name>
          , and
          <string-name>
            <given-names>M.</given-names>
            <surname>Oehl</surname>
          </string-name>
          , “
          <article-title>Teleoperation of highly automated vehicles in public transport: User-centered design of a human-machine interface for remote operation and its expert usability evaluation,” Multimodal Technologies and Interaction</article-title>
          , vol.
          <volume>5</volume>
          , no.
          <issue>5</issue>
          ,
          <year>2021</year>
          , doi: 10.3390/MTI5050026.
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          <string-name>
            <given-names>J.</given-names>
            <surname>Zimmerman</surname>
          </string-name>
          and
          <string-name>
            <given-names>J.</given-names>
            <surname>Forlizzi</surname>
          </string-name>
          , “Research through Design:
          <article-title>Method for Interaction Design Research in HCI,”</article-title>
          <source>Chi</source>
          <year>2011</year>
          , pp.
          <fpage>167</fpage>
          -
          <lpage>189</lpage>
          ,
          <year>2011</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          <string-name>
            <given-names>C.</given-names>
            <surname>Lv</surname>
          </string-name>
          et al.,
          <article-title>“Analysis of autopilot disengagements occurring during autonomous vehicle testing</article-title>
          ,
          <source>” IEEE/CAA Journal of Automatica Sinica</source>
          , vol.
          <volume>5</volume>
          , no.
          <issue>1</issue>
          , pp.
          <fpage>58</fpage>
          -
          <lpage>68</lpage>
          , Jan.
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          <year>2018</year>
          , doi: 10.1109/JAS.
          <year>2017</year>
          .
          <volume>7510745</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          <string-name>
            <given-names>F.</given-names>
            <surname>Favarò</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Eurich</surname>
          </string-name>
          , and
          <string-name>
            <given-names>N.</given-names>
            <surname>Nader</surname>
          </string-name>
          , “Autonomous vehicles' disengagements: Trends, triggers, and regulatory limitations,”
          <source>Accid Anal Prev</source>
          , vol.
          <volume>110</volume>
          , pp.
          <fpage>136</fpage>
          -
          <lpage>148</lpage>
          , Jan.
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          <year>2018</year>
          , doi: 10.1016/j.aap.
          <year>2017</year>
          .
          <volume>11</volume>
          .001.
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>V. V. Dixit, S. Chand, and D. J. Nair, “Autonomous vehicles: Disengagements, accidents and reaction times,” PLoS One, vol. 11, no. 12, Dec. 2016, doi: 10.1371/journal.pone.0168054.</mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          <string-name>
            <given-names>V.</given-names>
            <surname>Braun</surname>
          </string-name>
          and
          <string-name>
            <given-names>V.</given-names>
            <surname>Clarke</surname>
          </string-name>
          , “
          <article-title>Using thematic analysis in psychology</article-title>
          ,
          <source>” Qual Res Psychol</source>
          ,
          <year>2006</year>
          , doi: 10.1191/1478088706qp063oa.
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>Liu, “Identifying the Operational Design Domain for an Automated Driving System through Assessed Risk,” in IEEE Intelligent Vehicles Symposium, Proceedings, 2020, pp. 1317-1322, doi: 10.1109/IV47402.2020.9304552.</mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          1083-
          <fpage>1092</fpage>
          ,
          <year>2009</year>
          , doi: 10.1145/1518701.1518866.
        </mixed-citation>
      </ref>
      <ref id="ref31">
        <mixed-citation>
          <string-name>
            <given-names>J.</given-names>
            <surname>Zimmerman</surname>
          </string-name>
          and
          <string-name>
            <given-names>J.</given-names>
            <surname>Forlizzi</surname>
          </string-name>
          , “Research through Design:
          <article-title>Method for Interaction Design Research in HCI,”</article-title>
          <source>Chi</source>
          <year>2011</year>
          , pp.
          <fpage>167</fpage>
          -
          <lpage>189</lpage>
          ,
          <year>2011</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref32">
        <mixed-citation>
          <string-name>
            <given-names>C.</given-names>
            <surname>Kettwich</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Schrank</surname>
          </string-name>
          , and
          <string-name>
            <given-names>M.</given-names>
            <surname>Oehl</surname>
          </string-name>
          , “
          <article-title>Teleoperation of highly automated vehicles in public transport: User-centered design of a human-machine interface for remote operation and its expert usability evaluation,” Multimodal Technologies and Interaction</article-title>
          , vol.
          <volume>5</volume>
          , no.
          <issue>5</issue>
          ,
          <year>2021</year>
          , doi: 10.3390/MTI5050026.
        </mixed-citation>
      </ref>
      <ref id="ref33">
        <mixed-citation>
          <string-name>
            <surname>Bruder</surname>
          </string-name>
          , “
          <article-title>Towards cooperative guidance and control of highly automated vehicles: H-</article-title>
          <string-name>
            <surname>Mode and</surname>
          </string-name>
          Conduct-byWire,” Ergonomics, vol.
          <volume>57</volume>
          , no.
          <issue>3</issue>
          . Taylor &amp; Francis, pp.
        </mixed-citation>
      </ref>
      <ref id="ref34">
        <mixed-citation>
          343-
          <fpage>360</fpage>
          ,
          <year>2014</year>
          . doi:
          <volume>10</volume>
          .1080/00140139.
          <year>2013</year>
          .
          <volume>869355</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref35">
        <mixed-citation>D. Schitz, G. Graf, D. Rieth, and H. Aschemann, “Corridor-Based Shared Autonomy for Teleoperated Driving,” IFAC-PapersOnLine, vol. 53, no. 2, pp. 15368-15373, 2020, doi: 10.1016/j.ifacol.2020.12.2351.</mixed-citation>
      </ref>
      <ref id="ref36">
        <mixed-citation>M. Fennel, A. Zea, and U. D. Hanebeck, “Haptic-guided path generation for remote car-like vehicles,” IEEE Robot Autom Lett, vol. 6, no. 2, pp. 4088-4095, 2021, doi: 10.1109/LRA.2021.3067846.</mixed-citation>
      </ref>
      <ref id="ref37">
        <mixed-citation>J. Feiler, S. Hoffmann, and F. Diermeyer, “Concept of a Control Center for an Automated Vehicle Fleet,” 2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC), 2020, doi: 10.1109/ITSC45102.2020.9294411.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>