<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Towards a Real-Time Emergency Response Model for Connected and Autonomous Vehicles</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Yen-Hung Liu</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Otavio de P. Albuquerque</string-name>
          <email>otavioalbuquerque@usp.br</email>
          <xref ref-type="aff" rid="aff2">2</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Patrick C. K. Hung</string-name>
          <email>patrick.Hung@ontariotechu.ca</email>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Hossam A. Gabbar</string-name>
          <email>hossam.gaber@uoit.ca</email>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Marcelo Fantinato</string-name>
          <email>m.fantinato@usp.br</email>
          <xref ref-type="aff" rid="aff2">2</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Farkhund Iqbal</string-name>
          <email>farkhund.iqbal@zu.ac.ae</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>College of Technological Innovation, Zayed University</institution>
          ,
          <addr-line>Abu Dhabi</addr-line>
          ,
          <country country="AE">United Arab Emirates</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Faculty of Business and IT, Ontario Tech University</institution>
          ,
          <addr-line>Oshawa</addr-line>
          ,
          <country country="CA">Canada</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>School of Arts, Sciences and Humanities, University of São Paulo</institution>
          ,
          <addr-line>São Paulo</addr-line>
          ,
          <country country="BR">Brazil</country>
        </aff>
        <aff id="aff3">
          <label>3</label>
<institution/>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2022</year>
      </pub-date>
      <abstract>
        <p>Recently, technological advancements in the automobile and transportation sector have gained significant interest from governments, industry leaders, and citizens. Together with Autonomous Vehicles (AV) and Connected Vehicles (CV), Connected-Autonomous Vehicles (CAV) have revolutionized these sectors. Emergency Vehicles (EVs), such as ambulances, fire trucks, and patrol cars, are essential to our daily traffic life. Each EV has a different purpose, but all have their urgency and importance, and any time lost may cost a life. Thus, whenever other vehicle drivers encounter an EV on the road, they must yield to the EV. Therefore, a CAV system that can detect EVs will significantly improve these issues. According to the Society of Automotive Engineers International (SAE), most of today's autonomous vehicles are below Level 5, and car manufacturers assume the driver will take back control. Still, most autonomous vehicles mainly rely on their vision sensor instead of their sound sensor. Thus, if the system only notifies the driver once the EVs are already close, it may be dangerous for the driver, pedestrians, and passengers in the vehicle. This paper proposes a conceptual framework and discusses a related methodology to support such a real-time emergency response model for CAV.</p>
      </abstract>
      <kwd-group>
        <kwd>connected-autonomous vehicle</kwd>
        <kwd>emergency vehicle</kwd>
        <kwd>emergency response</kwd>
        <kwd>real-time response</kwd>
        <kwd>driving assistance technology</kwd>
        <kwd>the Doppler Effect</kwd>
        <kwd>exceptional handling</kwd>
        <kwd>control strategy</kwd>
        <kwd>machine learning</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>Recently, technological advancements in the automobile and transportation sector have gained significant interest from governments, industry leaders, and citizens. Autonomous Vehicle (AV) technology enables vehicles to be controlled by precise, fast-responding computers instead of error-prone and slowly responding human beings; Connected Vehicles (CV) allow infrastructure units and vehicles to share high-resolution information through wireless connectivity that can communicate to support interaction with their internal and external environments in real time, e.g., for traffic systems and between individual vehicles. Connected-Autonomous Vehicles (CAV), which integrate the best of both AVs and CVs, have revolutionized these sectors [<xref ref-type="bibr" rid="ref1 ref2">1, 2</xref>].</p>
      <p>According to the report of the Society of Automotive Engineers International (SAE) [3], the level of autonomy of vehicles ranges from Level 0 to 5. Level 0 concerns entirely human-operated vehicles, while Level 5 vehicles are fully autonomous. In the industry, most CAV classifies as Level 4 of autonomy, and car manufacturers assume the driver will take back control of the driving task. Such cases can be dangerous for the driver.</p>
      <p>CAV can excel in numerous advantages for smart city citizens by offering them better and more effective transportation services, such as dramatically reducing car crashes and driver fatigue [<xref ref-type="bibr" rid="ref4 ref5">4, 5</xref>]. CAV offers benefits as private and service (e.g., Uber) cars, buses for public transport (which includes school buses), and trucks (e.g., garbage collectors and agricultural trucks). CAV benefits are achieved by collecting relevant information from the CAV's context, such as geolocation, date and time, and other individual attributes like age, address, gender, and income. Therefore, CAV can infer an individual's interests, traits, beliefs, and intentions.</p>
      <p>Many of today's automated vehicles lose track of the lane position when the lane markings are absent. For example, erroneous lane marking recognition contributed to a fatal crash of a Tesla car in California in 2018 [<xref ref-type="bibr" rid="ref6">6</xref>]. CAV can address such issues since it can be connected through external interfaces, like Wi-Fi, Bluetooth, the Global Positioning System (GPS), and the Tire Pressure Monitoring System (TPMS). Moreover, internally, CAV works with a Controller Area Network (CAN), connecting different Electronic Control Units (ECU), such as the engine itself. On the one hand, these connections are necessary to provide basic (e.g., driving) and advanced (e.g., autonomous driving and entertainment) features to involved persons, such as drivers, passengers, and pedestrians.</p>
      <p>The application of CAV in real-time response for Emergency Vehicles (EVs) has resulted in great improvements in the efficiency of the process. EVs, such as ambulances, fire trucks, and police cars, are essential to our daily traffic life. Each EV has a different purpose, but all have their urgency and importance for emergency response and saving a life. Thus, whenever a vehicle driver encounters an EV on the road, the driver must yield to the EV in a safe condition.</p>
      <p>Usually, the EV has vision and audio devices to remind drivers and pedestrians of its existence, but these devices may sometimes be less effective in a noisy surrounding environment [<xref ref-type="bibr" rid="ref7">7</xref>]. In 2019, 170 people were killed in crashes involving emergency vehicles in the United States, most of them non-emergency vehicle occupants [<xref ref-type="bibr" rid="ref8">8</xref>]. To address this issue, Advanced Driver Assistance Systems (ADAS), such as adaptive cruise control, blind-spot object detection, and lane departure warning, have been designed to improve driving safety and support CAV. However, to the best of our knowledge, there is still not much research work on ADAS for real-time emergency response for EVs, especially in CAV.</p>
      <p>This paper aims to study a real-time emergency response model for CAV (SAE Level of autonomy 3), specifically for EVs, using a hybrid approach embedding both vision and sound sensors, aiming to detect and localize the EVs by a vision and siren detection system. We will discuss and analyze the increase in accuracy of distance and identification measures to ensure sustainable and safe operation during normal and emergency conditions, in consideration of the current related codes and standards environment.</p>
      <p>The paper will survey and evaluate CAV based on a detailed analysis and enhancement of technical aspects and operation, safety, and reliable performance for the emergency environment under pressure. The proposed model of CAV will have a routine for Key Performance Indicators (KPIs) based on functional operational and safety requirements, resiliency measures, risk analysis, and Safety Integrity Level (SIL) allocation. Verification and validation will be evaluated in the related Electronic Control Units (ECUs) with relevant standards or codes such as the National Electrical Code/National Fire Protection Association (NFPA) 70, the Canadian Electrical Code, the International Electrotechnical Commission (IEC) standard, and the Underwriters Laboratories (UL) standard, as well as other codes/standards/regulations such as the working documents provided by the International Organization for Standardization (ISO): ISO 26262 "Road vehicles - Functional safety" [9] and ISO 21434 "Road Vehicles - Cyber Security Engineering" [10].</p>
      <p>The remainder of the paper is organized as follows: Section 2 reviews the related scientific works present in the literature in this work's area. Section 3 presents a proposed conceptual cooperative framework and methodology for real-time emergency response for Connected-Autonomous Vehicles. Section 4 concludes the paper, presenting a brief discussion and the found limitations.</p>
      <p>THECOG 2022: Transforms in behavioral and affective computing.</p>
    </sec>
    <sec id="sec-1-lr">
      <title>2. Literature Review</title>
      <p>Safety risks may bring implications for CAV passengers, other vehicles, pedestrians, and road infrastructure; it is thus important to understand human aspects and perceptions towards CAV, such as trust [11, 12], driving style [13], and the physical safety of pedestrians and city infrastructure [14]. These numerous safety issues related to security risks may influence the consumer's trust in purchasing CAV solutions [11].</p>
      <p>There are many causes of EV accidents. Among personal factors, EV drivers usually drive under high pressure because of time pressure, long shift hours, and "code 3 running" thinking. In code 3 running, the driver can exceed the speed limit and does not have to follow the traffic signs in order to save the most time [15]. Among environmental factors, drivers usually drive in unfamiliar environments or even disaster areas, and intersections are the most frequent places for EVs to be involved in car accidents [15]. As for physical features, fire trucks and ambulances have a larger volume and are more likely to cause danger when they are driven together with other vehicles. Summing up the above factors, we can understand the threat of encountering EVs on the road. Their task nature is more dangerous than general road driving, especially since most vehicle conflicts occur at intersections. Therefore, if the traffic signal light can respond to the situation on the road and save time for the ambulance, it can also reduce the risk in the intersections. According to the task of EVs, there is usually time urgency, so for the surrounding vehicles, the best way to respond is to slow down and stop as soon as possible so that emergency vehicles do not have to be distracted by other vehicles. However, even Intelligent Transport Systems (ITS), which include the protocols of communication between CAVs and the intelligent traffic infrastructure, could be compromised by cyber-attacks, becoming susceptible to safety risks [16].</p>
      <p>Advancements in the CAV industry also open opportunities for creating a new profile of drivers. Among the most promising approaches, CAV is the first alternative for independent visually impaired drivers [17]. According to the World Health Organization (WHO), more than 1 billion people live with some visual impairment in the world [18]. WHO shows that 36 million of them are blind, and the majority of those people are over 50 years old. Indeed, population aging is a worldwide phenomenon that is expected to bring economic consequences [19]. Building CAV that the elderly population can use can help the industry to overcome the economic challenge. However, to this end, accessibility of CAV becomes imperative for the sector [17]. Besides the elderly and people with visual impairments, advances in CAV also open opportunities to enhance children's transportation. For example, autonomous school buses may benefit their independent transportation, or, even, parents may use private CAV to drive their kids to school.</p>
      <p>Driving essentials scenarios include, but are not limited to, studies on Computer Vision (CV) to enhance CAV capabilities of (partial or complete) self-driving, Artificial Intelligence (AI), Cloud computing, and machine learning, among other computational domains concerned with enhancing essential driving functionalities [20, 21, 22, 23]. Essential functionalities cover GPS services (to allow autonomous driving) and a range of sensing technologies suitable for driver and passenger identification, including vehicle and road infrastructure detection and parking assistance. This scenario involves Computer Vision techniques and route generation, which come from Computer Science and Automation Engineering backgrounds. Another technology, the blockchain, has already been embedded in CAV, considering its efficient performance mechanisms for decentralized distributed storage and security management of big data [24, 25].</p>
      <p>The main stakeholders in this real-time emergency response scenario are the automobile sector, the mobile application market, and average drivers and passengers. An interface for this scenario can assist drivers in selecting and monitoring their driving routes, including contextual stops for either safety or personal matters. The emergency response model component may collect the CAV's contextual information, like the vehicle's fuel or other engine status. Then, it uses such information to determine if the GPS route must add a stop at the gas station, for example. In the context of smart technology, the interface may have access to fridge or food storage information to add a stop at the supermarket or grocery store so that the human can purchase supplies. These contextual GPS scenarios can offer more effective itineraries for the drivers. They can include everything from picking up colleagues to sharing a ride for work, syncing the driver's agenda, or adapting routes to traffic information, among other individual behaviors.</p>
      <p>In previous research, the methods for detecting sirens from EVs can be divided into two different approaches. The first approach is to identify whether the data contains siren sound based on the siren's characteristics, such as its high-frequency and low-frequency components or its cyclical nature [26, 27]. However, this method does not perform well in noisy environments, especially in urban areas. The second approach is to extract the siren signal from background noise [28]. For example, Fazenda et al. employed the least mean squared algorithm to create a noise canceller to extract the target signals [29]. Nishimura et al. proposed a method for embedding the vehicle location data into the siren sound [30]. This research tends to extract the siren signals from background noise, including many elements of real traffic life, by considering the Doppler Effect. The Doppler Effect means that the perceived sound frequency varies according to the relative speed, so the siren frequency in real life may differ from the spectrum we observe. Regarding the siren datasets we collected from the Web, Schröder et al. showed that some audio software, such as Adobe Audition 1.0, could help to mimic the Doppler Effect in the datasets [31].</p>
      <p>Our hybrid approach aims to localize the siren by a time delay estimation method and a sound intensity probe method in the Path Planning function. For example, Fazenda et al. showed that the accuracy of the time delay estimation method is better when the distance between the emergency vehicle and the driver is long; at a short distance, however, the sound intensity probe method can achieve a higher accuracy [29].</p>
      <p>Different projects have made great efforts to advance the CAV areas. Big technology and automotive companies and universities have been working together to advance the projects, designing vehicles and developing different algorithms and drive systems split into different levels of autonomy. Uber, in partnership with Carnegie Mellon University; Lyft, together with General Motors; and Didi, with Japanese automotive companies, have been building self-driving cars and ride-sharing services with Level 4 autonomy and planning to build Level 5. Uber, Lyft, and Didi are companies that provide mobility services and have a great power of traffic data collection, which is the key to developing and improving their automation systems and models [32].</p>
    </sec>
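<p>As a small illustration of the Doppler Effect discussed in the review above, the stationary-observer formula can be sketched as follows; the 343 m/s speed of sound and the 700 Hz siren tone are assumed illustrative values, not figures from the cited works.</p>

```python
def doppler_shift(f_source_hz, v_source_ms, v_sound_ms=343.0):
    """Observed frequency of a siren for a stationary observer.

    Positive v_source_ms means the emergency vehicle is approaching;
    negative means it is receding. All values are illustrative.
    """
    return f_source_hz * v_sound_ms / (v_sound_ms - v_source_ms)

# A 700 Hz siren tone on an ambulance approaching at 25 m/s (90 km/h):
approaching = doppler_shift(700.0, 25.0)   # shifted above 700 Hz
receding = doppler_shift(700.0, -25.0)     # shifted below 700 Hz
print(round(approaching, 1), round(receding, 1))
```

<p>This is why a detector matched only to the nominal siren spectrum can miss a fast-approaching EV: the observed band is shifted up on approach and down once the EV passes.</p>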
    <sec id="sec-2">
      <p>AutoX, in 2018, built an advanced full-stack self-driving AI platform in partnership with Alibaba Group, Chery Automobile, NVIDIA, and other companies to build SAE Level 2 and 3 assistive-driving vehicles, including RoboTaxis and RoboTrucks. The companies' idea for the next decade is to develop a full (Level 5) autonomous car, beginning in 2021 with the first Fully Driverless RoboTaxi Service to the public in China [32, 33].</p>
      <p>Google Waymo, in partnership with automotive companies such as Fiat-Chrysler, Audi, Toyota, and Jaguar, has been working on self-driving vehicles of autonomy Level 4, operating the Waymo Driver, a commercial autonomous ride-hailing service, in San Francisco, California. Recently, in 2020, Waymo started the operation of Waymo Via, transporting commercial goods using autonomous vans and trucks [<xref ref-type="bibr" rid="ref9">34</xref>]. The Apollo project, Baidu's open-source self-driving platform, was created to test and improve the CAVs' motion planning and vehicle control algorithms, aiming at driving safety and riding experiences. AVs such as the Lincoln MKZ Sedan and the Ford Transit Van were used to train their dynamic models with real-world road data collected from Apollo autonomous vehicles driving on urban roads [<xref ref-type="bibr" rid="ref10">35</xref>]. The Apollo platform was at its 6.0 version at the end of 2020.</p>
      <p>The union of efforts of the big technology companies and automakers to conceive powerful Autonomous Driving Systems (ADS), and consequently build fully self-driving cars, has provided great technological advancements aimed at reaching common goals such as enhancing safety, decongesting roadways, saving time for users, reducing greenhouse gas emissions, and ensuring mobility for all people, including the disabled and the elderly [<xref ref-type="bibr" rid="ref11">36</xref>].</p>
    </sec>
    <sec id="sec-3">
      <title>3. A Conceptual Framework and Methodology</title>
      <p>CAVs operate under various scenarios and operating conditions of residential and industrial facilities, transportation electrification, and grid-connected integration. Therefore, a comprehensive evaluation of the safety risks during the operation of CAV should be provided by identifying hazards and estimating risks in different operating conditions and modes. A codes and standards roadmap should be performed based on fault propagation modeling and analysis; simulation and evaluation will be presented and analyzed by independent protection layers for all possible normal and abnormal operating conditions. Besides, evaluation and validation of technical, economic, safety, reliability, and availability aspects, together with risk factors, life cycle costing, and environmental assessment, will be discussed.</p>
      <p>The research will consider the technical requirements of CAV for safety and performance evaluation. It will consolidate the currently available codes and standards of CAV and propose new evaluation criteria for real-time emergency responses, to support CAV standards. CAV has two aspects: hardware and software. Hardware governs sensors, such as Vehicle-to-Vehicle (V2V), Vehicle-to-Grid (V2G), Vehicle-to-Infrastructure (V2I), and Vehicle-to-Everything (V2X) technology, and actuators. Software deals with the processes of perception, planning, and control. The V2X technology components V2V and V2I allow the vehicle to communicate by receiving information and talking to other systems in the environment. These environmental communication systems can be other vehicles or smart city light-changing signals. CAV should be tested according to their ability to transition with city speed restrictions during an emergency.</p>
      <p>During autonomous driving, one of the most dangerous maneuvers is lane changing. Even with ADAS, lane change is still very complex and potentially dangerous. ADAS should be tested on features like Adaptive Cruise Control (ACC), Autonomous Emergency Braking (AEB), and Lane Keep Assistant (LKA). Planning operations should be tested under different scenarios to test the vehicle's ability to adapt to road circumstances. According to recent literature, the Path Planning component of CAV comprises three functions: Mission Planning, Behavioral Planning, and Motion Planning. A typical task of each function is outlined as follows: (1) the Mission Planner makes high-level decisions, such as determining pickup and destination locations and road selections, to achieve the target mission; (2) the Behavioral Planner makes dynamic ad-hoc decisions, such as lane change, intersection crossing, and overtaking; (3) the Motion Planner handles collision avoidance, obstacle avoidance, alarm generation, etc. The proposed model will be incorporated into Path Planning.</p>
      <p>To enhance situational awareness in CAV, our proposed model will also incorporate research work on computer vision tools, such as geolocalized photos and videos of the situations [<xref ref-type="bibr" rid="ref12">37</xref>]. Furthermore, the model is expected to be implemented by a machine learning approach based on Support Vector Machines and Neural Networks, using the datasets we collected for this research and responding to real-time input data [<xref ref-type="bibr" rid="ref13">38</xref>].</p>
      <p>This research will consider the operation of CAVs facing some technical challenges, such as instability in some operating conditions due to the dynamic response of emergency traffic scenarios on a real-time basis. Most CAV mainly rely on their vision sensor instead of their sound sensor. One of the related research areas is the hearing impaired [<xref ref-type="bibr" rid="ref14 ref15">39, 40</xref>]. In recent years, this research area has gradually begun to move into CAV [36, 41]. Therefore, we believe that detecting approaching EVs using both a vision sensor and a siren detection system is essential in the future.</p>
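<p>The three Path Planning functions outlined above (Mission, Behavioral, and Motion Planning) can be sketched as a minimal class hierarchy; all class and method names below are our illustrative assumptions, not an existing CAV API.</p>

```python
from dataclasses import dataclass

@dataclass
class MissionPlanner:
    """High-level decisions: pickup/destination locations and road selection."""
    def plan_route(self, pickup, destination):
        return [pickup, destination]  # placeholder road list

@dataclass
class BehavioralPlanner:
    """Dynamic ad-hoc decisions: lane change, intersection crossing, overtaking."""
    def decide(self, ev_detected):
        return "yield_lane_change" if ev_detected else "keep_lane"

@dataclass
class MotionPlanner:
    """Collision/obstacle avoidance and alarm generation."""
    def step(self, decision):
        return {"keep_lane": "cruise",
                "yield_lane_change": "steer_right_and_slow"}[decision]

def path_planning(ev_detected, pickup="A", destination="B"):
    # The proposed EV-response model would hook into the behavioral layer.
    route = MissionPlanner().plan_route(pickup, destination)
    decision = BehavioralPlanner().decide(ev_detected)
    return route, MotionPlanner().step(decision)

print(path_planning(ev_detected=True))
```

<p>The sketch shows where the proposed model plugs in: the siren/vision detector feeds the behavioral layer, which turns a detection into a yielding maneuver executed by the motion layer.</p>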
      <sec id="sec-3-1">
      <title>3.1. EVs Identification Process</title>
        <p>As shown in Figure 1, our proposed EVs identification process starts by detecting the approaching vehicle based on visual and sound sensors. There are usually two ways to detect EVs: one is to see their unique appearance, and the other is to hear their siren sound. In siren sound detection, we first identify which type of emergency vehicle the sound belongs to and extract the specific siren sound from the background noise. After extracting the siren sound, we use the time delay estimation method and the sound intensity probe method to localize the direction of the sound. According to previous research [29], the time delay estimation method performs better at long distances than the sound intensity probe method. Thus, we can decide the direction of the siren from the sound detection.</p>
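<p>The time delay estimation idea above can be sketched with a brute-force cross-correlation over two microphone signals; the toy pulse, sample values, and function names are illustrative assumptions, not the implementation evaluated in [29].</p>

```python
def estimate_delay(sig_left, sig_right, max_lag):
    """Lag (in samples) of sig_right relative to sig_left that maximizes
    their cross-correlation; a sketch of time delay estimation."""
    best_lag, best_score = 0, float("-inf")
    n = len(sig_left)
    for lag in range(-max_lag, max_lag + 1):
        score = sum(sig_left[i] * sig_right[i + lag]
                    for i in range(n) if 0 <= i + lag < n)
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Toy siren-like pulse arriving 3 samples later at the right microphone:
pulse = [0.0] * 20
for i, v in zip(range(8, 12), (0.4, 1.0, 1.0, 0.4)):
    pulse[i] = v
left = pulse
right = [0.0] * 3 + pulse[:-3]   # delayed copy of the pulse
lag = estimate_delay(left, right, max_lag=5)
print(lag)  # positive lag: the sound reached the left microphone first
```

<p>In a real system, the lag would be converted to an angle of arrival using the microphone spacing and the speed of sound; at short range, comparing sound intensity between the microphones can complement this estimate, as noted above.</p>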
        <p>In visual detection, we first detect the type of emergency vehicle and use the captured image and the distance measured by Light Detection and Ranging (LIDAR) to localize the direction and position of the EVs. We combine the above information and use the GPS to help predict the possible path of the EVs and provide the driver with information so that appropriate responses can be made as soon as possible. If our route is interleaved with the route of the EVs, we will respond; if there is no conflict, we will continue to observe but do not require a response. Basically, we aim to perform both sound and visual detection to give complete information to the driver. If we cannot detect the precise position, our system can still predict the probable position to give the most appropriate response.</p>
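<p>The decision rule just described (respond on route conflict, otherwise observe, falling back gracefully when one sensor gives no fix) can be sketched as follows; the function name, route representation, and example values are assumptions for illustration.</p>

```python
def ev_response(sound_direction, vision_position, our_route, ev_route):
    """Return an action given (possibly partial) EV detections.

    sound_direction / vision_position may be None when a sensor fails to
    give a precise fix; detection holds if either sensor reports the EV.
    """
    detected = sound_direction is not None or vision_position is not None
    if not detected:
        return "no_action"
    if set(our_route) & set(ev_route):   # routes interleave: conflict
        return "respond"                 # slow down and yield
    return "observe"                     # keep monitoring, no response needed

# EV heard front-left and seen at (12.0 m, 3.5 m); routes share node "n2":
print(ev_response("front-left", (12.0, 3.5), ["n1", "n2"], ["n2", "n9"]))
```
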
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Algorithm</title>
        <p>We now introduce our Emergency Vehicle response algorithm; Algorithm 1 gives a brief conceptual sketch. When approaching EVs are detected, we first evaluate our speed. If our speed is higher than a certain threshold, we gradually decrease it to maintain safety. Then, once the position of the EVs has been confirmed, the system starts to perform the lane change to yield to the EVs. The lane change method will first detect vehicles in the vicinity using LIDAR, which can detect vehicles within a radius of 100 meters.</p>
        <p>def yieldToEVs():
    # use LIDAR to detect surrounding vehicles
    Matrix = detectSurroundVehicle()
    while Emergency Vehicle is detected do
        if speed is above certain speed then
            slowDown()  # slow down to a certain speed
        if EVs' location is confirmed then
            changeLane(Matrix)  # change lane according to the vehicle matrix
            vehicleStop()
Algorithm 1: Algorithm of Emergency Vehicle Response</p>
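<p>Algorithm 1 can also be written as runnable Python with stubbed sensor inputs; the 100 m LIDAR radius follows the text, while the speed threshold, the stub vehicle matrix, and the function signature are assumptions for illustration.</p>

```python
SAFE_SPEED = 40.0  # km/h; assumed yield-speed threshold

def yield_to_evs(speed, ev_detected, ev_location_confirmed, vehicle_matrix):
    """Return the ordered list of maneuvers the CAV would perform.

    vehicle_matrix stands in for the LIDAR scan (100 m radius); a real
    implementation would use it to pick a safe target lane.
    """
    actions = []
    while ev_detected:
        if speed > SAFE_SPEED:
            actions.append("slow_down")    # gradually reduce speed
            speed = SAFE_SPEED
        if ev_location_confirmed:
            actions.append("change_lane")  # yield using the vehicle matrix
            actions.append("stop")         # stop until the EV has passed
            ev_detected = False            # EV assumed to have passed
        else:
            break                          # keep monitoring the EV
    return actions

# Stub occupancy grid standing in for the LIDAR vehicle matrix:
matrix = [[0, 1, 0], [0, 0, 0], [1, 0, 0]]
print(yield_to_evs(80.0, True, True, matrix))
```

<p>Note the fix relative to the raw pseudocode: the lane change happens only after the EV's location is confirmed, and the routine does not recursively re-enter itself.</p>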
      <sec id="sec-3-2-2">
        <p>Based on the results scanned by LIDAR, we can form a vehicle matrix according to our lane and then plan how to yield to EVs based on the vehicle matrix. In Canada, the Ministry of Transportation [<xref ref-type="bibr" rid="ref17">42</xref>], in its traditional rules, stipulates that when EVs are encountered, vehicles pull as close as possible to the right edge of the road. However, depending on the real situation, it is not always the best option to pull to the right; we should make the most space according to the traffic conditions. When we yield the position to EVs, the best way is to stop and let the EVs drive by safely. After all, any vehicle movement can cause distractions to the EV driver.</p>
      </sec>
      <sec id="sec-3-2-3">
        <title>3.3. Conceptual Cooperative CAV System</title>
        <p>By using the sharing property of the CAV, we aim to create a comprehensive safety standard and system. Figure 2 shows our conceptual cooperative CAV system; all the CAVs can perform collaborative sensing and computation. In addition to the communication between vehicle and vehicle (V2V), the communication between the vehicle and the traffic lights (V2I) can also significantly save time for the EVs to reach the destination. Usually, the biggest problem encountered by EVs is the traffic jam on the road or the danger encountered when executing "code 3 running" on the way to an emergency. Our EV response ADAS can collaborate with other ADAS, such as Adaptive Cruise Control, Autonomous Emergency Braking, and Lane Change Assistant, to give corresponding responses automatically.</p>
        <p>As for the interaction between humans and vehicles, in the process of automated response, humans need to supervise and be able to intervene. For example, in Tesla's autopilot system, people need to put their hands on the steering wheel to ensure safety when performing lane changes. Accidents caused by automated failures in the response to EVs can seriously affect lives. At present, a few fatal vehicle accidents occur with autonomous vehicles. Therefore, before we can fully guarantee the driving safety of autonomous vehicles, appropriate human supervision and intervention are necessary.</p>
        <p>ADAS can outperform humans in some automated operations, thus promoting traffic safety and resulting in the development of autonomous vehicles. To achieve the driving safety of autonomous cars, some standardized methodologies were created. Projects such as the Waymo Driver and AutoX developed the ADAS of their self-driving cars using ISO 26262 "Road Vehicles - Functional Safety," which presents guidelines applied to safety-related systems that include one or more electrical and/or electronic systems in automobiles. Due to the necessity of dynamic and automatic driving awareness, with lane and vehicle detection as well as collision avoidance, CAV systems have continuously improved their automobile intelligence and connectivity capabilities, increasing the focus on cybersecurity. Standardized methods of cybersecurity, such as ISO 21434 "Road Vehicles - Cyber Security Engineering," have been adopted, which specifies requirements for the whole life cycle of automotive products for engineering-related cybersecurity risk management of road vehicle electrical and electronic systems and their components and interfaces [<xref ref-type="bibr" rid="ref18">43</xref>].</p>
        <p>Our system development will comply with ISO 26262 and ISO 21434. The software development life cycle in ISO 26262, from design to implementation and validation, will be followed. In addition to software development, appropriate information management, including process records, will be used to maintain complete traceability in dynamic and static data and to properly manage risks, and our ADAS will be reviewed according to standard security requirements. From the cybersecurity perspective, the work will follow ISO 21434, while security will be considered in the development and deployment process, embracing the Security by Design approach and considering requirements from the adoption of secure wireless connection protocols to the usage of encryption in information transmission. For every information transmission, an appropriate incident response mechanism will be initiated, which will include methods for determining actions of progress or remediation and vulnerability analyses that consider the potential damage.</p>
      </sec>
      <sec id="sec-4">
        <title>4. Conclusion</title>
        <p>The operation of CAVs faces some technical challenges, such as instability in some operating conditions due to the dynamic response of traffic scenarios, such as emergency vehicle response. The SAE classifies vehicles from Level 0 to 5 based on their automation capabilities. In today's CAV, most are at Level 3, and car manufacturers assume the driver will take back control during the emergency in real time. However, drivers should stay aware of automation limitations, and the manufacturers should make a warning system that can give warnings far ahead of time. Nowadays, most CAV mainly rely on their vision sensor instead of their sound sensor. Our hybrid approach aims to detect and localize the EVs by a vision and siren detection system. We believe that detecting the approaching EVs using both vision and sound sensors is essential in CAV, increasing the accuracy of distance and identification measures. The research direction covers codes and standards for all control and communication functions in related ECUs through the operating process of CAV, which should be tested for accuracy and strength to support the requirements of the design, development, operation, and evaluation of the model. The requirements can fit together with standards (e.g., the working document ISO 21434 "Road Vehicles - Cyber Security Engineering") and functional safety requirements (e.g., ISO 26262 "Road vehicles - Functional safety") within the North American regulatory structure and utility requirements.</p>
        <p>Acknowledgments. This paper is supported by Research Grant Fund R20090, Zayed University, United Arab Emirates.</p>
        <p>
[9] Road vehicles - Functional safety, Technical Report and transportation engineering (English edition) 6
ISO Standard No.26262:2018, International Organi- (2019) 109–131.</p>
          <p>zation for Standardization, 2018. [22] H. Gao, B. Cheng, J. Wang, K. Li, J. Zhao, D. Li,
Ob[10] Road vehicles Cybersecurity system evaluation ject classification using cnn-based fusion of vision
method, Technical Report ISO Standard No. and lidar in autonomous vehicle environment, IEEE
21434:2021, International Organization for Stan- Transactions on Industrial Informatics 14 (2018)
dardization, 2021. 4224–4231.
[11] H. Abraham, C. Lee, S. Brady, C. Fitzgerald, [23] M. R. T. Hossai, M. A. Shahjalal, N. F. Nuri, Design
B. Mehler, B. Reimer, J. F. Coughlin, Autonomous of an iot based autonomous vehicle with the aid
vehicles, trust, and driving alternatives: A survey of of computer vision, in: 2017 International
Conferconsumer preferences, Massachusetts Inst. Technol, ence on Electrical, Computer and Communication
AgeLab, Cambridge 1 (2016) 2018–12. Engineering (ECCE), IEEE, 2017, pp. 752–756.
[12] K. Lazanyi, G. Maraczi, Dispositional trust—do we [24] G. Drakopoulos, E. Kafeza, H. Al Katheeri, Proof
trust autonomous cars?, in: 2017 IEEE 15th In- systems in blockchains: A survey, in: 2019 4th
ternational Symposium on Intelligent Systems and South-East Europe Design Automation, Computer
Informatics (SISY), IEEE, 2017, pp. 000135–000140. Engineering, Computer Networks and Social Media
[13] C. Basu, Q. Yang, D. Hungerman, M. Singhal, A. D. Conference (SEEDA-CECNSM), IEEE, 2019, pp. 1–6.</p>
          <p>Dragan, Do you want your autonomous car to drive [25] T. Jiang, H. Fang, H. Wang, Blockchain-based
interlike you?, in: Proceedings of the 2017 ACM/IEEE net of vehicles: Distributed network architecture
International Conference on Human-Robot Interac- and performance analysis, IEEE Internet of Things
tion, 2017, pp. 417–425. Journal 6 (2018) 4640–4649.
[14] A. Millard-Ball, Pedestrians, autonomous vehicles, [26] S. Kiran, M. Supriya, Siren detection and driver
and cities, Journal of planning education and re- assistance using modified minimum mean square
search 38 (2018) 6–12. error method, in: 2017 International Conference
[15] H. Hsiao, J. Chang, P. Simeonov, Preventing emer- On Smart Technologies For Smart Nation
(Smartgency vehicle crashes: status and challenges of TechCon), IEEE, 2017, pp. 127–131.
human factors issues, Human factors 60 (2018) [27] J.-J. Liaw, W.-S. Wang, H.-C. Chu, M.-S. Huang,
C.1048–1072. P. Lu, Recognition of the ambulance siren sound
[16] B. B. Gupta, A. Gaurav, E. C. Marín, W. Alhalabi, in taiwan by the longest common subsequence, in:
Novel graph-based machine learning technique to 2013 IEEE International Conference on Systems,
secure smart vehicles in intelligent transportation Man, and Cybernetics, IEEE, 2013, pp. 3825–3828.
systems, IEEE Transactions on Intelligent Trans- [28] L. Marchegiani, P. Newman, Listening for sirens:
portation Systems (2022). Locating and classifying acoustic alarms in city
[17] M. Bonani, R. Oliveira, F. Correia, A. Rodrigues, scenes, IEEE Transactions on Intelligent
TransT. Guerreiro, A. Paiva, What my eyes can’t see, a portation Systems (2022).
robot can show me: Exploring the collaboration [29] B. Fazenda, H. Atmoko, F. Gu, L. Guan, A. Ball,
between blind people and robots, in: Proceedings Acoustic based safety emergency vehicle detection
of the 20th International ACM SIGACCESS Con- for intelligent transport systems, in: 2009
ICCASference on Computers and Accessibility, 2018, pp. SICE, IEEE, 2009, pp. 4250–4255.</p>
          <p>15–27. [30] A. Nishimura, Encoding data by frequency
modu[18] World Health Organization, Blindness lation of a high-low siren emitted by an emergency
and vision impairment, 2021. URL: https: vehicle, in: 2014 Tenth International Conference
//www.who.int/news-room/fact-sheets/detail/ on Intelligent Information Hiding and Multimedia
blindness-and-visualimpairment. Signal Processing, IEEE, 2014, pp. 255–259.
[19] G. Carbonaro, E. Leanza, P. McCann, F. Medda, De- [31] J. Schröder, S. Goetze, V. Grützmacher, J. Anemüller,
mographic decline, population aging, and modern Automatic acoustic siren detection in trafic noise
ifnancial approaches to urban policy, International by part-based models, in: 2013 IEEE International
Regional Science Review 41 (2018) 210–232. Conference on Acoustics, Speech and Signal
Pro[20] A. Dominguez-Sanchez, M. Cazorla, S. Orts- cessing, IEEE, 2013, pp. 493–497.</p>
          <p>Escolano, Pedestrian movement direction recog- [32] C. Badue, R. Guidolini, R. V. Carneiro, P. Azevedo,
nition using convolutional neural networks, IEEE V. B. Cardoso, A. Forechi, L. Jesus, R. Berriel, T. M.
transactions on intelligent transportation systems Paixao, F. Mutz, et al., Self-driving cars: A
sur18 (2017) 3540–3548. vey, Expert Systems with Applications 165 (2021)
[21] D. Elliott, W. Keen, L. Miao, Recent advances in 113816.</p>
          <p>connected and automated vehicles, journal of trafic [33] AutoX, The autox safety factor technical
re</p>
        </sec>
      </sec>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>J.</given-names>
            <surname>Ma</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Shladover</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. A.</given-names>
            <surname>Rakha</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.-Y.</given-names>
            <surname>Lu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Jagannathan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. J.</given-names>
            <surname>Dailey</surname>
          </string-name>
          ,
          <article-title>Freeway speed harmonization</article-title>
          ,
          <source>IEEE Transactions on Intelligent Vehicles</source>
          <volume>1</volume>
          (
          <year>2016</year>
          )
          <fpage>78</fpage>
          -
          <lpage>89</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>A.</given-names>
            <surname>Ghiasi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Ma</surname>
          </string-name>
          ,
          <article-title>A mixed trafic speed harmonization model with connected autonomous vehicles</article-title>
          ,
          <source>Transportation Research Part C: Emerging Technologies</source>
          <volume>104</volume>
          (
          <year>2019</year>
          )
          <fpage>210</fpage>
          -
          <lpage>233</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>S. O.-R. A. V. S.</given-names>
            <surname>Committee</surname>
          </string-name>
          , et al.,
          <article-title>Taxonomy and definitions for terms related to on-road motor vehicle automated driving systems</article-title>
          ,
          <source>SAE Standard J</source>
          <volume>3016</volume>
          (
          <year>2014</year>
          )
          <fpage>1</fpage>
          -
          <lpage>16</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>P.</given-names>
            <surname>Bansal</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. M.</given-names>
            <surname>Kockelman</surname>
          </string-name>
          ,
          <article-title>Forecasting americans' long-term adoption of connected and autonomous vehicle technologies</article-title>
          ,
          <source>Transportation Research Part A: Policy and Practice</source>
          <volume>95</volume>
          (
          <year>2017</year>
          )
          <fpage>49</fpage>
          -
          <lpage>63</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>M.</given-names>
            <surname>Johns</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Mok</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Sirkin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Gowda</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Smith</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Talamonti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Ju</surname>
          </string-name>
          ,
          <article-title>Exploring shared control in automated driving</article-title>
          ,
          <source>in: 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI)</source>
          , IEEE,
          <year>2016</year>
          , pp.
          <fpage>91</fpage>
          -
          <lpage>98</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>D.</given-names>
            <surname>Shepardson</surname>
          </string-name>
          ,
          <article-title>U.S. agency to determine cause of 2018 fatal Tesla Autopilot crash</article-title>
          ,
          <year>2020</year>
          . URL: https://www.reuters.com/article/us-tesla-crash-california/u-s-agencytodetermine-cause-of-2018-fatal-tesla-/autopilot-crash-idUSKBN1ZD24B.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>L. J. Wang</given-names>
            <surname>Wensheng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Sibin</surname>
          </string-name>
          ,
          <article-title>Identifying the sound of ambulance whistle by frequency feature</article-title>
          ,
          <source>International Journal of Advanced Information Technologies</source>
          <volume>6</volume>
          (
          <year>2012</year>
          )
          <fpage>39</fpage>
          -
          <lpage>45</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          National Safety Council, Emergency vehicles,
          <year>2020</year>
          . URL: https://injuryfacts.nsc.org/motor-vehicle/road-users/emergency-vehicles.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [34]
          <string-name>
            <given-names>M. A.</given-names>
            <surname>Cusumano</surname>
          </string-name>
          ,
          <article-title>Self-driving vehicle technology: progress and promises</article-title>
          ,
          <source>Communications of the ACM</source>
          <volume>63</volume>
          (
          <year>2020</year>
          )
          <fpage>20</fpage>
          -
          <lpage>22</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [35]
          <string-name>
            <given-names>J.</given-names>
            <surname>Xu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Q.</given-names>
            <surname>Luo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Xu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Xiao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Yu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Hu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Miao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <article-title>An automated learning-based procedure for large-scale vehicle dynamics modeling on baidu apollo platform</article-title>
          ,
          <source>in: 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)</source>
          , IEEE,
          <year>2019</year>
          , pp.
          <fpage>5049</fpage>
          -
          <lpage>5056</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [36]
          <string-name>
            <given-names>F.</given-names>
            <surname>Youssef</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Houda</surname>
          </string-name>
          ,
          <article-title>Comparative study of end-toend deep learning methods for self-driving car</article-title>
          ,
          <source>Int. J. Intell. Syst. Appl</source>
          <volume>12</volume>
          (
          <year>2020</year>
          )
          <fpage>15</fpage>
          -
          <lpage>27</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [37]
          <string-name>
            <given-names>L.</given-names>
            <surname>Lopez-Fuentes</surname>
          </string-name>
          , J. van de Weijer, M. González-Hidalgo,
          <string-name>
            <given-names>H.</given-names>
            <surname>Skinnemoen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. D.</given-names>
            <surname>Bagdanov</surname>
          </string-name>
          ,
          <article-title>Review on computer vision techniques in emergency situations</article-title>
          ,
          <source>Multimedia Tools and Applications</source>
          <volume>77</volume>
          (
          <year>2018</year>
          )
          <fpage>17069</fpage>
          -
          <lpage>17107</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [38]
          <string-name>
            <given-names>A.</given-names>
            <surname>Benterki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Boukhnifer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Judalet</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Choubeila</surname>
          </string-name>
          ,
          <article-title>Prediction of surrounding vehicles lane change intention using machine learning</article-title>
          ,
          <source>in: 2019 10th IEEE International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications (IDAACS)</source>
          , volume
          <volume>2</volume>
          , IEEE,
          <year>2019</year>
          , pp.
          <fpage>839</fpage>
          -
          <lpage>843</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [39]
          <string-name>
            <given-names>F.</given-names>
            <surname>Beritelli</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Casale</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Russo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Serrano</surname>
          </string-name>
          ,
          <article-title>An automatic emergency signal recognition system for the hearing impaired</article-title>
          ,
          <source>in: 2006 IEEE 12th Digital Signal Processing Workshop &amp; 4th IEEE Signal Processing Education Workshop</source>
          , IEEE,
          <year>2006</year>
          , pp.
          <fpage>179</fpage>
          -
          <lpage>182</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [40]
          <string-name>
            <given-names>F.</given-names>
            <surname>Meucci</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Pierucci</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Del Re</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Lastrucci</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Desii</surname>
          </string-name>
          ,
          <article-title>A real-time siren detector to improve safety of guide in trafic environment</article-title>
          ,
          <source>in: 2008 16th European Signal Processing Conference</source>
          , IEEE,
          <year>2008</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>5</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [41]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Ebizuka</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Kato</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Itami</surname>
          </string-name>
          ,
          <article-title>Detecting approach of emergency vehicles using siren sound processing</article-title>
          ,
          <source>in: 2019 IEEE Intelligent Transportation Systems Conference (ITSC)</source>
          , IEEE,
          <year>2019</year>
          , pp.
          <fpage>4431</fpage>
          -
          <lpage>4436</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [42] Canada Ministry of Transportation, Road safety: Emergency vehicles,
          <year>2020</year>
          . URL: http://www.mto.gov.on.ca/english/safety/ emergency-vehicles.shtml.
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          [43]
          <string-name>
            <given-names>H.</given-names>
            <surname>Shan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>He</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Fang</surname>
          </string-name>
          ,
          <article-title>Road vehicles cybersecurity system evaluation method</article-title>
          ,
          <source>in: Journal of Physics: Conference Series</source>
          , volume
          <volume>1607</volume>
          ,
          IOP Publishing
          ,
          <year>2020</year>
          , p.
          <fpage>012054</fpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>