Towards a Real-Time Emergency Response Model for
Connected and Autonomous Vehicles
Yen-Hung Liu1 , Otavio de P. Albuquerque2,∗ , Patrick C. K. Hung1 , Hossam A. Gabbar1 ,
Marcelo Fantinato2 and Farkhund Iqbal3
1 Faculty of Business and IT, Ontario Tech University, Oshawa, Canada
2 School of Arts, Sciences and Humanities, University of São Paulo, São Paulo, Brazil
3 College of Technological Innovation, Zayed University, Abu Dhabi, United Arab Emirates

THECOG 2022: Transforms in behavioral and affective computing, October 2022, Atlanta, Georgia, USA
∗ Corresponding author.
yenhung.liu@ontairotechu.net (Y. Liu); otavioalbuquerque@usp.br (O. d. P. Albuquerque); patrick.Hung@ontariotechu.ca (P. C. K. Hung); hossam.gaber@uoit.ca (H. A. Gabbar); m.fantinato@usp.br (M. Fantinato); farkhund.iqbal@zu.ac.ae (F. Iqbal)
© 2022 Copyright for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).


Abstract

Recent technological advancements in the automobile and transportation sectors have gained significant interest from governments, industry leaders, and citizens. Building on Autonomous Vehicles (AV) and Connected Vehicles (CV), Connected-Autonomous Vehicles (CAV) have revolutionized these sectors. Emergency Vehicles (EVs), such as ambulances, fire trucks, and patrol cars, are essential to daily traffic life. Each type of EV serves a different purpose, but all are urgent and important, and any delay may cost lives. Thus, whenever other drivers encounter an EV on the road, they must yield to it. A CAV system that can detect EVs would therefore significantly improve this situation. According to the Society of Automotive Engineers International (SAE), most of today's autonomous vehicles operate below Level 5, and car manufacturers assume the driver will take back control. Still, most autonomous vehicles rely mainly on their vision sensors rather than their sound sensors. Consequently, by the time the system notifies the driver that an EV is already close, the situation may be dangerous for the driver, pedestrians, and passengers. This paper proposes a conceptual framework and discusses a related methodology to support such a real-time emergency response model for CAV.

Keywords

connected-autonomous vehicle, emergency vehicle, emergency response, real-time response, driving assistance technology, the Doppler Effect, exceptional handling, control strategy, machine learning



1. Introduction

Recently, technological advancements in the automobile and transportation sectors have gained significant interest from governments, industry leaders, and citizens. Autonomous Vehicle (AV) technology enables vehicles to be controlled by precise, fast-responding computers instead of error-prone and slowly responding human beings. Connected Vehicles (CV) allow infrastructure units and vehicles to share high-resolution information through wireless connectivity, supporting real-time interaction with their internal and external environments, e.g., with traffic systems and between individual vehicles. Connected-Autonomous Vehicles (CAV), which integrate the best of both AVs and CVs, have revolutionized these sectors [1, 2].

According to the report of the Society of Automotive Engineers International (SAE) [3], the level of autonomy of vehicles ranges from Level 0 to Level 5. Level 0 concerns entirely human-operated vehicles, while Level 5 vehicles are fully automated. For example, at Level 1 of automation, the vehicle may assist the driver with tasks such as steering or acceleration. A Shared Autonomous Vehicle (SAV) should be considered at least Level 2 of vehicle automation: it allows the driver to remain fully engaged with the driving task while control is gradually transferred from human to machine. Level 2 automation features include adaptive cruise control and automatic emergency braking. In the industry, most CAVs are classified as Level 4 of automation, while automotive companies have carefully explored Level 3, where, under some circumstances, the machine and the human share control of the driving task. Such cases can be dangerous for the driver and passengers because of the time that elapses between the control handover and the human decision. Level 4 autonomous capability means cars can self-drive in most conditions without human intervention; however, many design challenges remain open, including technical, ethical, and regulatory matters. A completely automated vehicle (Level 5) can perform all driving functions under all conditions; in this situation, humans are just passengers.

CAV can provide numerous advantages to smart city citizens by offering better and more effective transportation services, such as dramatically reducing car crashes and driver fatigue [4, 5].
CAV offers benefits for private and public transportation. It includes vehicles such as private and service (e.g., Uber) cars, buses for public transport (including school buses), and trucks (e.g., garbage collectors and agricultural trucks). CAV benefits are achieved by collecting relevant information from the CAV's context, such as geolocation, date and time, and other individual attributes like age, address, gender, and income. Therefore, CAV can infer an individual's interests, traits, beliefs, and intentions.

Many of today's automated vehicles lose track of the lane position when lane markings are absent. For example, erroneous lane marking recognition contributed to a fatal crash of a Tesla car in California in 2018 [6]. CAV can address such issues since it can be connected through external interfaces such as Wi-Fi, Bluetooth, the Global Positioning System (GPS), and the Tire Pressure Monitoring System (TPMS). Internally, CAV works with a Controller Area Network (CAN) connecting different Electronic Control Units (ECU), such as the engine controller. These connections are necessary to provide basic (e.g., driving) and advanced (e.g., autonomous driving and entertainment) features to the people involved, such as drivers, passengers, and pedestrians.

The application of CAV in real-time response for Emergency Vehicles (EVs) has greatly improved the efficiency of the process. EVs, such as ambulances, fire trucks, and police cars, are essential to our daily traffic life. Each EV has a different purpose, but all are urgent and important for emergency response and saving lives. Thus, whenever a vehicle driver encounters an EV on the road, the driver must yield to the EV under safe conditions.

Usually, the EV has visual and audio devices to announce its presence to drivers and pedestrians, but these devices may be less effective in a noisy surrounding environment [7]. In 2019, 170 people were killed in crashes involving emergency vehicles in the United States, most of them non-emergency vehicle occupants [8]. To address this issue, Advanced Driver Assistance Systems (ADAS) such as adaptive cruise control, blind-spot object detection, and lane departure warning have been designed to improve driving safety and support CAV. However, to the best of our knowledge, there is still little research on ADAS for real-time emergency response to EVs, especially in CAV.

This paper studies a real-time emergency response model for CAV (SAE Level 3 of autonomy), specifically for EVs, using a hybrid approach embedding both vision and sound sensors to detect and localize EVs through a vision and siren detection system. We discuss and analyze how to increase the accuracy of distance and identification measures to ensure sustainable and safe operation under normal and emergency conditions, taking into account the current environment of related codes and standards.

The paper will survey and evaluate CAV based on a detailed analysis and enhancement of technical aspects and of operation, safety, and reliable performance for the emergency environment under pressure. The proposed CAV model will have a routine for Key Performance Indicators (KPIs) based on functional operational and safety requirements, resiliency measures, risk analysis, and Safety Integrity Level (SIL) allocation. Verification and validation will be evaluated in the related Electronic Control Units (ECUs) against relevant standards or codes such as the National Electrical Code/National Fire Protection Association (NFPA) 70, the Canadian Electrical Code, International Electrotechnical Commission (IEC) standards, Underwriters Laboratories (UL) standards, and other codes/standards/regulations such as the working documents provided by the International Organization for Standardization (ISO): ISO 26262 "Road vehicles - Functional safety" [9] and ISO 21434 "Road Vehicles – Cyber Security Engineering" [10].

The remainder of the paper is organized as follows: Section 2 reviews the related scientific works in this area. Section 3 presents the proposed conceptual cooperative framework and methodology for real-time emergency response for Connected-Autonomous Vehicles. Section 4 concludes the paper with a brief discussion and the identified limitations.


2. Literature Review

Safety risks may have implications for CAV passengers, other vehicles, pedestrians, and road infrastructure, so it is important to understand human aspects and perceptions towards CAV, such as trust [11, 12], driving style [13], and the physical safety of pedestrians and city infrastructure [14]. These numerous safety issues related to security risks may influence the consumer's trust in purchasing CAV solutions [11].

There are many causes of EV accidents. Among personal factors, EV drivers usually drive under high pressure because of time pressure, long shift hours, and 'code 3 running' thinking; in code 3 running, the driver may exceed the speed limit and does not have to follow traffic signs in order to save the most time [15]. Among environmental factors, drivers often drive in unfamiliar environments or even disaster areas, and intersections are the most frequent places for EVs to be involved in car accidents [15]. As for physical features, fire trucks and ambulances have a larger volume and are more likely to cause danger when driven alongside other vehicles. Summing up these factors, we can understand the threat of encountering EVs on the road: their task is inherently more dangerous than general road driving, especially since most vehicle conflicts occur at intersections. Therefore, if the traffic signal lights can respond to the situation on the road and save time for the ambulance, they can also reduce the risk at intersections. Given the time urgency of EV tasks, the best way for surrounding vehicles to respond is to slow down and stop as soon as possible so that emergency vehicles are not distracted by other vehicles. However, even Intelligent Transport Systems (ITS), which include the communication protocols between CAVs and intelligent traffic infrastructure, could be compromised by cyber-attacks and thus become susceptible to safety risks [16].
Advancements in the CAV industry also open opportunities for creating a new profile of drivers. Among the most promising approaches, CAV is the first alternative for independent driving by visually impaired people [17]. According to the World Health Organization (WHO), more than 1 billion people worldwide live with some visual impairment [18]. WHO reports that 36 million of them are blind, and the majority are over 50 years old. Indeed, population aging is a worldwide phenomenon that is expected to bring economic consequences [19]. Building CAV that the elderly population can use can help the industry overcome this economic challenge; to this end, however, accessibility of CAV becomes imperative for the sector [17]. Besides the elderly and people with visual impairments, advances in CAV also open opportunities to enhance children's transportation. For example, autonomous school buses may support their independent transportation, or parents may use private CAV to drive their kids to school.

Driving-essentials scenarios include, but are not limited to, studies on Computer Vision (CV) to enhance CAV capabilities of (partial or complete) self-driving, Artificial Intelligence (AI), cloud computing, and machine learning, among other computational domains concerned with enhancing essential driving functionalities [20, 21, 22, 23]. Essential functionalities cover GPS services (to allow autonomous driving) and a range of sensing technologies suitable for driver and passenger identification, including vehicle and road infrastructure detection and parking assistance. This scenario involves Computer Vision techniques and route generation, which come from Computer Science and Automation Engineering backgrounds. Other technologies, such as blockchain, have already been embedded in CAV, given their efficient mechanisms for decentralized distributed storage and security management of big data [24, 25].

The main stakeholders in this real-time emergency response scenario are the automobile sector, the mobile application market, and average drivers and passengers. An interface for this scenario can assist drivers in selecting and monitoring their driving routes, including contextual stops for either safety or personal matters. The emergency response model component may collect the CAV's contextual information, such as the vehicle's fuel level or other engine status, and use it to determine whether the GPS route must add a stop at the gas station, for example. In the context of smart technology, the interface may have access to fridge or food storage information to add a stop at the supermarket or grocery store so that the human can purchase supplies. These contextual GPS scenarios can offer more effective itineraries for drivers, covering everything from picking up colleagues and sharing a ride to work to syncing the driver's agenda or adapting routes to traffic information, among other individual behaviors.

In previous research, the methods for detecting sirens from EVs can be divided into two different approaches. The first is to identify whether the audio data contains a siren sound based on the siren's characteristics, such as its high and low frequencies or its cyclical nature [26, 27]. However, this method does not perform well in noisy environments, especially in urban areas. The second approach is to extract the siren signal from background noise [28]. For example, Fazenda et al. employed the least mean squares algorithm to create a noise canceller that extracts the target signals [29]. Nishimura et al. proposed a method for embedding the vehicle location into the siren sound [30]. This research aims to extract siren signals from background noise, which pervades real traffic life, while considering the Doppler Effect. The Doppler Effect means that the perceived sound frequency varies with the relative speed of the source, so the siren frequency observed in real traffic may differ from the spectrum of the reference recordings. Regarding the siren datasets we collected from the Web, Schröder et al. showed that some audio software, such as Adobe Audition 1.0, can help mimic the Doppler Effect in the datasets [31].

Our hybrid approach aims to localize the siren by a time delay estimation method and a sound intensity probe method in the Path Planning function. For example, Fazenda et al. showed that the time delay estimation method is more accurate when the distance between the emergency vehicle and the driver is long, whereas at short distances the sound intensity probe method can also achieve high accuracy [29].
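The cited works do not include an implementation, but the two acoustic cues and the Doppler relation can be illustrated with a minimal sketch. The following Python fragment is only illustrative and rests on simplifying assumptions (two microphones at a known spacing, a reasonably clean siren band, free-field propagation); the function names, the microphone spacing, and the thresholds are hypothetical and are not part of the cited systems or of the proposed model.

import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, approximately, in air at 20 °C

def doppler_observed_frequency(f_source, v_source, v_observer=0.0):
    # Classic Doppler relation: frequency heard when the siren approaches at
    # v_source (m/s) and the observer moves toward it at v_observer (m/s).
    return f_source * (SPEED_OF_SOUND + v_observer) / (SPEED_OF_SOUND - v_source)

def tdoa_bearing(left_mic, right_mic, fs, mic_spacing=0.5):
    # Time delay estimation: the lag of the cross-correlation peak between two
    # microphone channels gives the inter-channel delay, from which a bearing
    # relative to the microphone axis can be derived (better at long range).
    corr = np.correlate(left_mic, right_mic, mode="full")
    lag_samples = np.argmax(corr) - (len(right_mic) - 1)
    delay = lag_samples / fs
    ratio = np.clip(delay * SPEED_OF_SOUND / mic_spacing, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))

def intensity_side(left_mic, right_mic):
    # Sound intensity probe idea in its simplest form: at short range the
    # louder channel indicates which side the siren is on.
    return "left" if np.mean(left_mic ** 2) > np.mean(right_mic ** 2) else "right"

A detector built along these lines would switch between the two cues based on an estimated range, mirroring the long-distance/short-distance trade-off reported by Fazenda et al. [29].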
Different projects have made great efforts to advance the CAV area. Big technology companies, automotive companies, and universities have been working together to advance these projects, designing vehicles and developing different algorithms and driving systems across the levels of autonomy.

Uber, in partnership with Carnegie Mellon University; Lyft, together with General Motors; and Didi, with Japanese automotive companies, have been building self-driving cars and ride-sharing services at Level 4 of autonomy and planning to reach Level 5. Uber, Lyft, and Didi are companies that provide mobility services and have great power of traffic data collection, which is key to developing and improving their automation systems and models [32]. AutoX, in 2018, built an advanced full-stack self-driving AI platform in partnership with Alibaba Group, Chery automotive, NVIDIA, and other companies to build SAE Level 2 and 3 assistive-driving vehicles, including RoboTaxis and RoboTrucks. The companies' goal for the next decade is to develop a full (Level 5) autonomous car, beginning in 2021 with the first fully driverless RoboTaxi service to the public in China [32, 33].

Google Waymo, in partnership with automotive companies such as Fiat-Chrysler, Audi, Toyota, and Jaguar, has been working on self-driving vehicles at autonomy Level 4, operating the Waymo Driver, a commercial autonomous ride-hailing service, in San Francisco, California. In 2020, Waymo started the operation of Waymo Via, transporting commercial goods using autonomous vans and trucks [34]. The Apollo project, Baidu's open-source self-driving platform, was created to test and improve CAVs' motion planning and vehicle control algorithms with a view to driving safety and riding experience. AVs such as the Lincoln MKZ Sedan and the Ford Transit Van were used to train their dynamic models on real-world road data collected from Apollo autonomous vehicles driving on urban roads [35]. The Apollo platform was at its 6.0 version at the end of 2020. The joint efforts of the big technology companies and automakers to conceive powerful Autonomous Driving Systems (ADS), and consequently build fully self-driving cars, have provided great technological advancements aimed at common goals such as enhancing safety, decongesting roadways, saving time for users, reducing greenhouse gas emissions, and ensuring mobility for all people, including the disabled and the elderly [36].

To enhance situational awareness in CAV, our proposed model will also incorporate research on computer vision tools, such as geolocalized photos and videos of the situations [37]. Furthermore, the model is expected to be implemented through a machine learning approach based on Support Vector Machines and Neural Networks, using the datasets we collected for this research and responding to real-time input data [38].

This research will consider the operation of CAVs facing some technical challenges, such as instability in some operating conditions due to the dynamic response to emergency traffic scenarios on a real-time basis. Most CAVs rely mainly on their vision sensors instead of their sound sensors. One of the related research areas is assistance for the hearing impaired [39, 40]; in recent years, this research area has gradually begun to move into CAV [36, 41]. Therefore, we believe that detecting approaching EVs using both a vision sensor and a siren detection system will be essential in the future.


3. A Conceptual Framework and Methodology

CAVs operate under various scenarios and operating conditions, including residential areas, industrial facilities, transportation electrification, and grid-connected integration. Therefore, a comprehensive evaluation of the safety risks during CAV operation should be provided by identifying hazards and estimating risks in different operating conditions and modes. A codes and standards roadmap should be developed based on fault propagation modeling and analysis, and simulation and evaluation will be presented and analyzed through independent protection layers for all possible normal and abnormal operating conditions. Besides, the evaluation and validation of technical, economic, safety, reliability, and availability aspects, together with risk factors, life cycle costing, and environmental assessment, will be discussed.

The research will consider the technical requirements of CAV for safety and performance evaluation. It will consolidate the currently available codes and standards for CAV and propose new evaluation criteria for real-time emergency responses to support CAV standards. CAV has two aspects: hardware and software. Hardware covers sensors, actuators, and technologies such as Vehicle-to-Vehicle (V2V), Vehicle-to-Grid (V2G), Vehicle-to-Infrastructure (V2I), and Vehicle-to-Everything (V2X) communication. Software deals with the processes of perception, planning, and control. The V2X technology components V2V and V2I allow the vehicle to communicate by receiving information from and talking to other systems in the environment. These communicating systems can be other vehicles or smart city light-changing signals. CAVs should be tested according to their ability to adapt to city speed restrictions during an emergency.

During autonomous driving, one of the most dangerous maneuvers is lane changing. Even with ADAS, lane change is still very complex and potentially dangerous. ADAS should be tested on features like Adaptive Cruise Control (ACC), Autonomous Emergency Braking (AEB), and Lane Keep Assistant (LKA). Planning operations should be tested under different scenarios to assess the vehicle's ability to adapt to road circumstances. According to recent literature, the Path Planning component of CAV comprises three functions: Mission Planning, Behavioral Planning, and Motion Planning. A typical task of each function is outlined as follows: (1) the Mission Planner makes high-level decisions, such as determining pickup and destination locations and road selections, to achieve the target mission; (2) the Behavioral Planner makes dynamic ad-hoc decisions such as lane changes, intersection crossing, and overtaking; (3) the Motion Planner handles collision avoidance, obstacle avoidance, alarm generation, etc. The proposed model will be incorporated into Path Planning.
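To illustrate where the proposed emergency response model would sit within this decomposition, the sketch below defines a minimal data structure for a fused EV alert and a Behavioral Planner stub that reacts only when the EV's predicted path conflicts with the vehicle's own route. The class, field, and action names are assumptions made for illustration only; they are not part of any existing CAV platform or of a finished design.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class EVAlert:
    # Fused output of the siren and vision detectors (illustrative structure).
    ev_type: str                  # e.g., "ambulance", "fire_truck", "police"
    bearing_deg: float            # estimated direction of the approaching EV
    distance_m: Optional[float]   # None when only the siren has been heard
    predicted_path: List[str]     # road segments the EV is expected to use

class BehavioralPlanner:
    # Dynamic, ad-hoc decisions (lane change, intersection crossing, overtaking);
    # the real-time emergency response model would hook in at this layer, between
    # the Mission Planner above it and the Motion Planner below it.
    def decide(self, ev_alert: Optional[EVAlert], own_route: List[str]) -> str:
        if ev_alert is None:
            return "follow_mission_plan"
        if set(ev_alert.predicted_path) & set(own_route):
            return "slow_down_and_yield"   # routes conflict: respond
        return "monitor_ev"                # no conflict: keep observing

This mirrors the decision rule described in Section 3.1: respond only when the routes are interleaved, otherwise keep observing.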
3.1. EVs Identification Process

As shown in Figure 1, our proposed EV identification process starts by detecting the approaching vehicle based on visual and sound sensors. There are usually two ways to detect EVs: one is to see their unique appearance, and the other is to hear their siren sound. In siren sound detection, we first identify which type of emergency vehicle the sound belongs to and extract the specific siren sound from the background noise. After extracting the siren sound, we use the time delay estimation method and the sound intensity probe method to localize the direction of the sound. According to previous research [29], the time delay estimation method performs better at long distances than the sound intensity probe method; thus, we can determine the direction of the detected siren sound.

Figure 1: EVs Identification Process.

In visual detection, we first detect the type of emergency vehicle and use the captured image and the distance measured by Light Detection and Ranging (LIDAR) to localize the direction and position of the EV. We then combine this information with GPS data to help predict the possible path of the EV and provide the driver with this information so that appropriate responses can be made as soon as possible. If our route is interleaved with the route of the EV, we respond; if there is no conflict, we continue to observe but do not require a response. Ideally, we want both sound and visual detection to give complete information to the driver. If we cannot detect the precise position, our system can still predict the probable position to give the most appropriate response.

3.2. Algorithm

We now introduce our emergency vehicle response algorithm; Algorithm 1 gives a brief conceptual overview. When we detect an approaching EV, we first evaluate our speed. If our speed is higher than a certain speed, we gradually decrease it to maintain safety. Then, when the position of the EV has been confirmed, the system starts to perform the lane change to yield to the EV. The lane change method first detects vehicles in the vicinity using LIDAR, which can detect vehicles within a radius of 100 meters.

  def yieldToEVs():
      Matrix = detectSurroundVehicles()   # use LIDAR to detect vehicles within about 100 m
      changeLane(Matrix)                  # change lane according to the vehicle matrix

  while an emergency vehicle is detected do
      if speed is above a certain speed then
          slowDown()                      # gradually slow down to a safe speed
      end
      if the EV's location is confirmed then
          yieldToEVs()
          vehicleStop()                   # stop until the EV has passed
      end
  end

Algorithm 1: Emergency Vehicle Response Algorithm

Based on the results scanned by LIDAR, we can form a vehicle matrix according to our lane and then plan how to yield to the EV based on that matrix. In Canada, the Ministry of Transportation [42], in its traditional rules, stipulates that when EVs are encountered, drivers should pull as close as possible to the right edge of the road. However, in practice, pulling to the right is not always the best option; we should instead create the most space possible according to the traffic conditions. When we yield our position to the EV, the best behavior is to stop and let the EV drive by safely; after all, any vehicle movement can distract the EV driver.
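Algorithm 1 is deliberately conceptual. The Python sketch below is one possible simplified realization of the same loop: it bins LIDAR returns into a coarse lane/segment occupancy grid, slows the vehicle down when it is above a threshold speed, and changes lane and stops once the EV's position is confirmed and the target lane is clear. The speed threshold, grid resolution, and function names are illustrative assumptions, not values prescribed by the paper.

import numpy as np

SAFE_SPEED_KMH = 40.0   # illustrative threshold; the paper only says "a certain speed"
LIDAR_RANGE_M = 100.0   # detection radius mentioned in Section 3.2
LANE_WIDTH_M = 3.5
CELL_LENGTH_M = 10.0

def occupancy_matrix(lidar_points):
    # Bin LIDAR returns (x = metres ahead, y = metres to the left) into a grid:
    # rows are 10 m segments ahead, columns are left / own / right lane.
    grid = np.zeros((int(LIDAR_RANGE_M // CELL_LENGTH_M), 3), dtype=int)
    for x, y in lidar_points:
        if not (0.0 <= x < LIDAR_RANGE_M and abs(y) < 1.5 * LANE_WIDTH_M):
            continue
        if abs(y) <= LANE_WIDTH_M / 2:
            col = 1        # own lane
        elif y > 0:
            col = 0        # left lane
        else:
            col = 2        # right lane
        grid[int(x // CELL_LENGTH_M), col] += 1
    return grid

def respond_to_ev(speed_kmh, ev_location_confirmed, lidar_points, target_lane):
    # One control-loop step of Algorithm 1: slow down first, then yield and stop
    # once the EV position is confirmed and the chosen lane is free ahead.
    actions = []
    if speed_kmh > SAFE_SPEED_KMH:
        actions.append("slow_down")
    if ev_location_confirmed:
        grid = occupancy_matrix(lidar_points)
        if grid[:, target_lane].sum() == 0:
            actions.append("change_lane")
        actions.append("stop_and_yield")
    return actions

# Example: an EV confirmed nearby, one vehicle 30 m ahead in our own lane.
print(respond_to_ev(60.0, True, [(30.0, 0.0)], target_lane=2))
# -> ['slow_down', 'change_lane', 'stop_and_yield']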
3.3. Conceptual Cooperative CAV System

By using the sharing property of CAV, we aim to create a comprehensive safety standard and system. Figure 2 shows our conceptual cooperative CAV system, in which all CAVs can perform collaborative sensing and computation. In addition to the communication between vehicles (V2V), the communication between the vehicle and the traffic lights (V2I) can also significantly reduce the time for EVs to reach their destination. Usually, the biggest problems encountered by EVs are traffic jams on the road and the dangers encountered when executing 'code 3 running' on the way to an emergency. Our EV response ADAS can collaborate with other ADAS, such as Adaptive Cruise Control, Autonomous Emergency Braking, and Lane Change Assistant, to give the corresponding responses automatically. As for the interaction between humans and vehicles, during the automated response humans need to supervise and be able to intervene. For example, in Tesla's Autopilot system, people need to keep their hands on the steering wheel to ensure safety when performing lane changes. Accidents caused by automation failures in the response to EVs can seriously affect lives, and a few fatal accidents involving autonomous vehicles have already occurred. Therefore, until we can fully guarantee the driving safety of autonomous vehicles, appropriate human supervision and intervention are necessary.

Figure 2: Conceptual Cooperative CAV System.

ADAS can outperform humans in some automated operations, thus promoting traffic safety and driving the development of autonomous vehicles. To achieve the driving safety of autonomous cars, some standardized methodologies were created. Projects such as the Waymo Driver and AutoX developed the ADAS of their self-driving cars using ISO 26262 "Road Vehicles - Functional Safety," which presents guidelines applied to safety-related systems that include one or more electrical and/or electronic systems in automobiles. Due to the need for dynamic and automatic driving awareness, with lane and vehicle detection as well as collision avoidance, CAV systems have continuously improved their automotive intelligence and connectivity capabilities, increasing the focus on cybersecurity. Standardized cybersecurity methods, such as ISO 21434 "Road Vehicles - Cyber Security Engineering," have been adopted; this standard specifies requirements for cybersecurity risk management across the whole engineering life cycle of road vehicle electrical and electronic systems and their components and interfaces [43].

Our system development will comply with ISO 26262 and ISO 21434. The software development life cycle in ISO 26262, from design to implementation and validation, will be followed. In addition to software development, appropriate information management, including process records, will be used to maintain complete traceability of dynamic and static data and to properly manage risks, and our ADAS will be reviewed against standard security requirements. From the cybersecurity perspective, the work will follow ISO 21434, with security considered throughout the development and deployment process, embracing the Security by Design approach and covering requirements from the adoption of secure wireless connection protocols to the use of encryption in information transmission. For every information transmission, an appropriate incident response mechanism will be initiated, including methods for determining progress or remediation actions and a vulnerability analysis that considers the potential damage.


4. Conclusion

The operation of CAVs faces some technical challenges, such as instability in some operating conditions due to the dynamic response to traffic scenarios such as emergency vehicle response. The SAE classifies vehicles from Level 0 to 5 based on their automation capabilities. Most of today's CAVs are at Level 3, and car manufacturers assume the driver will take back control during an emergency in real time. However, drivers should stay aware of automation limitations, and manufacturers should provide a warning system that can give warnings far ahead of time. Nowadays, most CAVs rely mainly on their vision sensors instead of their sound sensors. Our hybrid approach aims to detect and localize EVs through a combined vision and siren detection system. We believe that detecting approaching EVs using both vision and sound sensors is essential in CAV and increases the accuracy of distance and identification measures. The research direction covers codes and standards for all control and communication functions in the related ECUs throughout the operating process of CAV, which should be tested for accuracy and robustness to support the requirements of the design, development, operation, and evaluation of the model. The requirements can fit together with standards (e.g., the working document ISO 21434 "Road Vehicles – Cyber Security Engineering") and functional safety requirements (e.g., ISO 26262 "Road vehicles - Functional safety") within the North American regulatory structure and utility requirements.


Acknowledgments

This paper is supported by Research Grant Fund R20090, Zayed University, United Arab Emirates.



References

[1] J. Ma, X. Li, S. Shladover, H. A. Rakha, X.-Y. Lu, R. Jagannathan, D. J. Dailey, Freeway speed harmonization, IEEE Transactions on Intelligent Vehicles 1 (2016) 78–89.
[2] A. Ghiasi, X. Li, J. Ma, A mixed traffic speed harmonization model with connected autonomous vehicles, Transportation Research Part C: Emerging Technologies 104 (2019) 210–233.
[3] S. O.-R. A. V. S. Committee, et al., Taxonomy and definitions for terms related to on-road motor vehicle automated driving systems, SAE Standard J 3016 (2014) 1–16.
[4] P. Bansal, K. M. Kockelman, Forecasting Americans' long-term adoption of connected and autonomous vehicle technologies, Transportation Research Part A: Policy and Practice 95 (2017) 49–63.
[5] M. Johns, B. Mok, D. Sirkin, N. Gowda, C. Smith, W. Talamonti, W. Ju, Exploring shared control in automated driving, in: 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), IEEE, 2016, pp. 91–98.
[6] D. Shepardson, U.S. agency to determine cause of 2018 fatal Tesla Autopilot crash, 2020. URL: https://www.reuters.com/article/us-tesla-crash-california/u-s-agencytodetermine-cause-of-2018-fatal-tesla-/autopilot-crash-idUSKBN1ZD24B.
[7] L. J. Wang Wensheng, L. Sibin, Identifying the sound of ambulance whistle by frequency feature, International Journal of Advanced Information Technologies 6 (2012) 39–45.
[8] National Safety Council, Emergency vehicles, 2020. URL: https://injuryfacts.nsc.org/motor-vehicle/road-users/emergency-vehicles.
[9] Road vehicles - Functional safety, Technical Report ISO Standard No. 26262:2018, International Organization for Standardization, 2018.
[10] Road vehicles - Cybersecurity system evaluation method, Technical Report ISO Standard No. 21434:2021, International Organization for Standardization, 2021.
[11] H. Abraham, C. Lee, S. Brady, C. Fitzgerald, B. Mehler, B. Reimer, J. F. Coughlin, Autonomous vehicles, trust, and driving alternatives: A survey of consumer preferences, Massachusetts Inst. Technol, AgeLab, Cambridge 1 (2016) 2018–12.
[12] K. Lazanyi, G. Maraczi, Dispositional trust—do we trust autonomous cars?, in: 2017 IEEE 15th International Symposium on Intelligent Systems and Informatics (SISY), IEEE, 2017, pp. 000135–000140.
[13] C. Basu, Q. Yang, D. Hungerman, M. Singhal, A. D. Dragan, Do you want your autonomous car to drive like you?, in: Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, 2017, pp. 417–425.
[14] A. Millard-Ball, Pedestrians, autonomous vehicles, and cities, Journal of Planning Education and Research 38 (2018) 6–12.
[15] H. Hsiao, J. Chang, P. Simeonov, Preventing emergency vehicle crashes: status and challenges of human factors issues, Human Factors 60 (2018) 1048–1072.
[16] B. B. Gupta, A. Gaurav, E. C. Marín, W. Alhalabi, Novel graph-based machine learning technique to secure smart vehicles in intelligent transportation systems, IEEE Transactions on Intelligent Transportation Systems (2022).
[17] M. Bonani, R. Oliveira, F. Correia, A. Rodrigues, T. Guerreiro, A. Paiva, What my eyes can't see, a robot can show me: Exploring the collaboration between blind people and robots, in: Proceedings of the 20th International ACM SIGACCESS Conference on Computers and Accessibility, 2018, pp. 15–27.
[18] World Health Organization, Blindness and vision impairment, 2021. URL: https://www.who.int/news-room/fact-sheets/detail/blindness-and-visualimpairment.
[19] G. Carbonaro, E. Leanza, P. McCann, F. Medda, Demographic decline, population aging, and modern financial approaches to urban policy, International Regional Science Review 41 (2018) 210–232.
[20] A. Dominguez-Sanchez, M. Cazorla, S. Orts-Escolano, Pedestrian movement direction recognition using convolutional neural networks, IEEE Transactions on Intelligent Transportation Systems 18 (2017) 3540–3548.
[21] D. Elliott, W. Keen, L. Miao, Recent advances in connected and automated vehicles, Journal of Traffic and Transportation Engineering (English edition) 6 (2019) 109–131.
[22] H. Gao, B. Cheng, J. Wang, K. Li, J. Zhao, D. Li, Object classification using CNN-based fusion of vision and LIDAR in autonomous vehicle environment, IEEE Transactions on Industrial Informatics 14 (2018) 4224–4231.
[23] M. R. T. Hossai, M. A. Shahjalal, N. F. Nuri, Design of an IoT based autonomous vehicle with the aid of computer vision, in: 2017 International Conference on Electrical, Computer and Communication Engineering (ECCE), IEEE, 2017, pp. 752–756.
[24] G. Drakopoulos, E. Kafeza, H. Al Katheeri, Proof systems in blockchains: A survey, in: 2019 4th South-East Europe Design Automation, Computer Engineering, Computer Networks and Social Media Conference (SEEDA-CECNSM), IEEE, 2019, pp. 1–6.
[25] T. Jiang, H. Fang, H. Wang, Blockchain-based internet of vehicles: Distributed network architecture and performance analysis, IEEE Internet of Things Journal 6 (2018) 4640–4649.
[26] S. Kiran, M. Supriya, Siren detection and driver assistance using modified minimum mean square error method, in: 2017 International Conference On Smart Technologies For Smart Nation (SmartTechCon), IEEE, 2017, pp. 127–131.
[27] J.-J. Liaw, W.-S. Wang, H.-C. Chu, M.-S. Huang, C.-P. Lu, Recognition of the ambulance siren sound in Taiwan by the longest common subsequence, in: 2013 IEEE International Conference on Systems, Man, and Cybernetics, IEEE, 2013, pp. 3825–3828.
[28] L. Marchegiani, P. Newman, Listening for sirens: Locating and classifying acoustic alarms in city scenes, IEEE Transactions on Intelligent Transportation Systems (2022).
[29] B. Fazenda, H. Atmoko, F. Gu, L. Guan, A. Ball, Acoustic based safety emergency vehicle detection for intelligent transport systems, in: 2009 ICCAS-SICE, IEEE, 2009, pp. 4250–4255.
[30] A. Nishimura, Encoding data by frequency modulation of a high-low siren emitted by an emergency vehicle, in: 2014 Tenth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, IEEE, 2014, pp. 255–259.
[31] J. Schröder, S. Goetze, V. Grützmacher, J. Anemüller, Automatic acoustic siren detection in traffic noise by part-based models, in: 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, IEEE, 2013, pp. 493–497.
[32] C. Badue, R. Guidolini, R. V. Carneiro, P. Azevedo, V. B. Cardoso, A. Forechi, L. Jesus, R. Berriel, T. M. Paixao, F. Mutz, et al., Self-driving cars: A survey, Expert Systems with Applications 165 (2021) 113816.
[33] AutoX, The AutoX safety factor technical report, 2020. URL: https://medium.com/autox/the-autox-safety-factor-c76d80e6768f.
[34] M. A. Cusumano, Self-driving vehicle technology: progress and promises, Communications of the ACM 63 (2020) 20–22.
[35] J. Xu, Q. Luo, K. Xu, X. Xiao, S. Yu, J. Hu, J. Miao, J. Wang, An automated learning-based procedure for large-scale vehicle dynamics modeling on Baidu Apollo platform, in: 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, 2019, pp. 5049–5056.
[36] F. Youssef, B. Houda, Comparative study of end-to-end deep learning methods for self-driving car, Int. J. Intell. Syst. Appl 12 (2020) 15–27.
[37] L. Lopez-Fuentes, J. van de Weijer, M. González-Hidalgo, H. Skinnemoen, A. D. Bagdanov, Review on computer vision techniques in emergency situations, Multimedia Tools and Applications 77 (2018) 17069–17107.
[38] A. Benterki, M. Boukhnifer, V. Judalet, M. Choubeila, Prediction of surrounding vehicles lane change intention using machine learning, in: 2019 10th IEEE International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications (IDAACS), volume 2, IEEE, 2019, pp. 839–843.
[39] F. Beritelli, S. Casale, A. Russo, S. Serrano, An automatic emergency signal recognition system for the hearing impaired, in: 2006 IEEE 12th Digital Signal Processing Workshop & 4th IEEE Signal Processing Education Workshop, IEEE, 2006, pp. 179–182.
[40] F. Meucci, L. Pierucci, E. Del Re, L. Lastrucci, P. Desii, A real-time siren detector to improve safety of guide in traffic environment, in: 2008 16th European Signal Processing Conference, IEEE, 2008, pp. 1–5.
[41] Y. Ebizuka, S. Kato, M. Itami, Detecting approach of emergency vehicles using siren sound processing, in: 2019 IEEE Intelligent Transportation Systems Conference (ITSC), IEEE, 2019, pp. 4431–4436.
[42] Canada Ministry of Transportation, Road safety: Emergency vehicles, 2020. URL: http://www.mto.gov.on.ca/english/safety/emergency-vehicles.shtml.
[43] H. Shan, K. He, B. Wang, X. Fang, Road vehicles cybersecurity system evaluation method, in: Journal of Physics: Conference Series, volume 1607, IOP Publishing, 2020, p. 012054.