=Paper=
{{Paper
|id=Vol-2622/paper9
|storemode=property
|title=Towards an Autonomous Radiation Early Warning System
|pdfUrl=https://ceur-ws.org/Vol-2622/paper9.pdf
|volume=Vol-2622
|authors=Mohammed Al-Saleh,Béatrice Finance,Rafiqul Haque,Yehia Taher,Ali Jaber
|dblpUrl=https://dblp.org/rec/conf/bdcsintell/Al-SalehFHTJ19
}}
==Towards an Autonomous Radiation Early Warning System==
Mohammed Al Saleh, Lebanese Atomic Energy Commission (LAEC), National Council for Scientific Research (CNRS); m.alsaleh@laec-cnrs.gov.lb
Béatrice Finance and Yehia Taher, David Laboratory, University of Versailles (UVSQ); fname.lname@uvsq.fr
Rafiqul Haque, Intelligencia R&D, Paris, France; rafiqul.haque@intelligencia.fr
Abstract—Radiation is a serious hazard that requires continuous monitoring, and many existing systems are designed to perform this task. The Radiation Early Warning System (REWS) is one of these systems; it monitors the gamma radiation level in the environment. However, such a system requires high manual intervention, depends entirely on expert analysis, and has shortcomings that can at times be risky. In this paper we introduce our approach, called RIMI (Refining Incoming Monitored Incidents), which aims to improve this system so that it becomes autonomous. We also introduce a new method to turn it into a predictive and proactive system that learns from past incidents.

Index Terms—Radiation, Early Warning System, Data Analytics, Anomaly Detection

Copyright © 2019 for this paper by its authors. Use permitted under Creative
Commons License Attribution 4.0 International (CC BY 4.0).

I. INTRODUCTION

Radiation is one of the most critical hazards that must be monitored, due to its catastrophic and persistent consequences for the environment, humans, and non-living things alike. Radioactive incidents and disasters such as Chernobyl [1], Fukushima [2], and the most recent one at a Russian nuclear missile test site [3] have raised serious concern. These events have given rise to the need for continuous monitoring of the radiation level in the environment. Since radiation can be transported by the wind, it is important to monitor radioactivity across widespread geographical locations to prevent any unwanted exposure. Continuous monitoring would greatly help in taking proactive measures by raising an alert upon the occurrence of an incident. Therefore, many countries around the world have developed techniques for monitoring the radiation level in the environment to detect any abnormal release or discharge. Lebanon is one of these countries: it developed a national environmental radiation monitoring program to establish a radiation baseline and determine the trend of the radiation level in the country. Air monitoring was one of the scopes of this program. [Reference Public Exposure Article]

There exist different approaches to monitor and analyze the impact of high radiation levels. Among them is the Radiation Early Warning System (REWS), a widely used network system that has existed in Lebanon since 2013. The REWS is composed of many radiation detection sensors (also called probes) disseminated over a specific region that monitor the gamma radiation level. This system reacts as quickly as possible to anomalies by raising an alert. Typically, the alerts are determined by predefined threshold values that are essentially chosen based on observations (i.e. experience). It is worth noting that there are different threshold values at different locations, since the threshold value depends strictly on the normal reading of the radiation level (known as the background level), which in turn is not fixed, due to many factors such as altitude. Once an alert is raised, it needs to be checked by an expert. Indeed, the expert needs to analyze the potential causes of the incident, as some alerts refer to an authentic threat of a high radiation level, while others denote a rise of the radiation level that has no hazardous impact on the environment or living beings. In order to do so, the expert consults additional information such as the weather broadcast and the quality factors (also called quality bits) of the probe. For instance, the alert is false when the quality bits of the probe indicate that there is a defect in the probe, meaning that we cannot trust the collected gamma dose rate value. The alert is innocent when external factors have occurred, such as rain, wind, or lightning. These external factors are the most difficult to analyse, but they represent more than 90% of the alarms. Finally, the alert is real when an emergency action needs to be taken by the authority immediately.

Existing REWS solutions have various shortcomings. The most critical one is the manual intervention of the expert, which is heavily time-consuming, labor-intensive, and risk-prone. Indeed, when an alarm is raised, a considerable amount of time and effort is consumed by the expert to analyze the parameters stemming from external data sets, such as weather data sets, in order to classify the alert as false, innocent, or real. As there is no automated data collector, the experts must carry out data searching and data fetching operations manually. Moreover, most of the time the expert cannot classify the alert immediately, as he/she needs to wait for further readings of the gamma dose rate to see whether it will return to normal. This can take hours, due to parameters such as rain. Therefore, it is not possible to make a faster or real-time inference using the current approach.

Today, we are witnessing an explosion of machine learning techniques and complex algorithms intended to help experts or
non-experts to learn more about their data. Machine learning techniques might help build predictive models in order to obtain a real-time proactive system. However, in order to apply these techniques, some preliminary analyses should be done to better characterize the problem that needs to be solved. The main objective of this research is to analyse REWS and see whether the expert can be removed from the picture and replaced by an autonomous REWS. There are many challenges to address before reaching this goal. The work described in this paper is a first attempt to do so; to our knowledge, no autonomous REWS exists in the literature.

More concretely, our objective is to develop an end-to-end solution that can be integrated with running REWS systems without any disruption and without replacing them completely. Indeed, before replacing the expert, the system should prove its accuracy in predicting the right answer. Thus, supervised learning should take place at the beginning, until the system reaches its full potential and works on its own. In this paper, we present our RIMI framework (Refining Incoming Monitored Incidents), which highlights the different steps that need to take place before reaching an autonomous REWS solution. In this framework, we plan to develop a list of components ranging from data acquisition and normalization, to building a predictive model on a real data set produced by a running REWS, to using that model to predict the right classification of an alarm on real-time data.

The remainder of this paper is organized as follows. Section 2 highlights the nature of the problems that need to be solved in REWS. In Section 3, we describe our RIMI framework and detail each of its components. Finally, we conclude our work in Section 4. Notice that there is no related work section as such, since no similar work exists in the literature; rather, once a problem has been refined, we give some hints of the approaches that have been proposed in the literature to solve that particular problem.

II. PROBLEM DESCRIPTION

In this section, we illustrate some of the scenarios that an expert will encounter during his/her work. These scenarios reflect the variety of the situations that occur most of the time.

Figure 1. Low Dose Tube Broken Effect on Gamma Dose Rate
Figure 2. Wind Effect on Gamma Dose Rate
Figure 3. Lightning Effect on Gamma Dose Rate

The scenario described in Figure 1 illustrates how an internal factor can affect the gamma dose rate. For instance, when the low dose tube of the probe is broken, it affects the quality of the gamma dose rate value: its interpretation is no longer reliable. This type of scenario will produce a false alarm.

In the scenarios described in Figure 2 and Figure 3 respectively, we see how wind and lightning directly and immediately impact the gamma dose rate. In these scenarios, we observe many peaks that do not last very long. We call them hard parabolas. These types of scenarios will produce a false alarm.

In contrast, rain impacts the gamma dose rate in a completely different manner. Rain can cause the soil to emit radioactive gases into the air, resulting in true but innocent gamma radiation readings. When hitting the soil, rain can increase the gamma dose rate, which will return to normal values after a specific time. Sometimes, even if it continues to rain after the peak, the rain will not affect the gamma dose rate anymore: the effect is seen when the soil is dry, not when the soil is already humid. This behavior is described in Figure 4 and corresponds to what is called a soft parabola. It is classified as an innocent alarm.

Fortunately, real alarms are very rare; but, as one can imagine, in such a case the peak will not decrease after a short period of time but will continue to increase. Many other scenarios can also be found in practice. For instance, some factors like earthquakes
or a truck with a load of radioactive materials passing near a probe can cause the gamma dose rate level to increase immediately. Moreover, multiple factors can be combined, such as rain and wind, making the recognition of the cause harder.

Figure 4. Rain Effect on Gamma Dose Rate

Many data sources should be combined. Some are collected in a continuous manner by the REWS and stored in a historical database, but many other data sources must be queried on demand when an investigation is launched by an expert. Combining all these heterogeneous data sources on the fly is also a difficult problem in itself.

Another dimension of the problem concerns the variability of the threshold values, which evolve over time and also depend on the location of the probe itself. As said earlier, these predefined threshold values are essentially chosen based on observations or experience at the beginning, but they evolve slightly over time on a monthly basis, making the comparison of the time series over multiple months not an easy task.

All these examples illustrate the difficulty and the heterogeneity of analysing the gamma dose rate shape and understanding its causes in order to properly classify the alarm in an automatic way. For all these reasons, we believe that this research problem is interesting to tackle, as it will require many different techniques or approaches. This is the reason why we define the RIMI framework to offer an end-to-end solution towards an autonomous REWS.

III. THE RIMI FRAMEWORK

In this section, we provide a detailed description of our framework, entitled RIMI (Refining Incoming Monitored Incidents). The framework consists of three main components: (1) the data collector and enrichment, (2) the building of the predictive model, and (3) the online detection and prediction. Figure 5 illustrates each of the main components in more detail. As an incident is caused by a high gamma dose rate level, which can be harmful for humans and the environment, this framework aims to replace a human-driven verification system that refines the incoming incidents and alerts, by detecting their causes automatically with a high level of accuracy.

A. Data Collector and Data Enrichment

As seen in the previous section, the data is heterogeneous (i.e. time series, quality bits, events) and comes either from the online REWS monitoring system or from external data sources that must be queried on demand by experts in the case of a triggered alarm. The data collected by the REWS system is stored in a historical database. The data acquisition is done on a regular basis through secure channels between the radiation detection sensors (i.e. probes) and the server. In normal mode, the probe sends a message containing the gamma dose rate average every hour; but when an alarm is triggered because the gamma dose rate is over the threshold, the system switches to an alarm mode, and the probe then sends a message every minute. The system returns automatically to its normal mode when the gamma dose rate returns to normal. In addition to the gamma dose rate, the probes send other sensor data (also known as Quality Bits), as they are equipped with internal sensors that can detect the defectiveness of any of the system components. These sensor data are stored in the historical REWS database for later analysis. As we have seen earlier in one of the scenarios, a defect of the low dose tube can directly cause a false high gamma radiation reading.

For the time being, three years of historical data have been collected by our REWS, which is used in Lebanon and controlled by the Lebanese Atomic Energy Commission. These data are precious and can be used to train our predictive model, but they need to be enriched with external information in order to automatically find the causes of an alarm. The data sources that can be queried on demand are numerous: a weather database, a radiation transportation database, etc. In order to query them, we need to know the approximate timestamp of the alarm, so as to better understand the past context or situation in which the alarm was triggered. In the historical data set, the data are sometimes annotated with the alarm timestamp (i.e. when the alarm was triggered), but most of the time they are not. So it is our responsibility to infer the alarm detection on past sensor data.

Because of the heterogeneity of the data sources, the data needs some validation and normalization before being integrated into our framework. The RIMI framework has to deal with multivariate time series that need to be integrated in a proper way. The main time series, related to the gamma dose rate, needs to be harmonized with respect to: (1) the normal and fast modes, and (2) communication problems between the probe and the server, which result in missing information. Moreover, the enrichment of the time series should also be done in a proper way, especially when it stems from external data sources. For instance, timestamps should be aligned or normalized to avoid inconsistencies in the analysis.
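As a concrete illustration of this harmonization step, the sketch below aligns readings from both acquisition modes onto a single one-minute grid. The function name and the dictionary-based data layout are our own illustrative choices, not part of the running REWS.

```python
from datetime import datetime, timedelta

def harmonize(readings, start, end, step=timedelta(minutes=1)):
    """Align gamma dose rate readings onto a single one-minute grid.

    `readings` maps timestamps to dose rate values and may mix the
    hourly normal mode with the per-minute alarm mode.  Readings are
    snapped down to the start of their minute; grid points with no
    reading stay None, so communication gaps remain visible downstream.
    """
    grid = {}
    t = start
    while t <= end:
        grid[t] = None
        t += step
    for ts, value in readings.items():
        snapped = ts.replace(second=0, microsecond=0)
        if start <= snapped <= end:
            grid[snapped] = value
    return grid
```

On such a grid, missing minutes can then be flagged or interpolated before the enrichment with external sources takes place.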
Figure 5. A Framework for Real-time Radiation Pollution Detection

B. Building the Predictive Model

The predictive model is built upon the historical databases produced by the running REWS system, which continuously collects the sensor data produced by the different probes. First, there is a need to identify all the incidents. Second, we should research the causes of each incident; this requires the data enrichment, to better understand the past situation or context that occurred during the incident. Finally, all this mass of information should be organized and classified in order to build our predictive model, which will be used at run-time.

1) Incident Extractor: Incident extraction consists in analysing the gamma dose rate time series data in order to identify a fragment (i.e. shape). A fragment corresponds in fact to a triggered alarm (i.e. incident). As said earlier, the threshold and the background values are not fixed but evolve over time. At the beginning of the system, a value is given, but it is refined over time to better suit the default gamma dose rate of the location at which the probe is installed. This value, called the background, can differ from one location to the next. At the end of each month, the average of the background values is calculated to find the background mean. This mean is used for finding the background interval, which is the range of the safe gamma dose rate values in the environment. It is important to mention here that this mean is calculated after removing the threshold values from the month's data set. To find the threshold value, we noticed that experts in different countries rely on different methods. Some consider values that are equal to or greater than 1.5 times the background mean as thresholds; others consider values that are equal to or greater than 2 times the background mean. We decided to rely on the 1.5 method, knowing that this value can be changed to suit experts' expectations in different countries. We explored several methods to find the most suitable one to determine the lower and upper bounds of the background interval. Our study revealed that the standard deviation [4] is promising for finding the background level interval. We chose the standard deviation because of the nature of the distribution of the data: according to our observations, radiation level data are uniformly distributed, and to the best of our understanding the standard deviation is a suitable technique for finding intervals when data are uniformly distributed in a two-dimensional graph. The background level interval is calculated by adding and subtracting the standard deviation to and from the mean of the background values of the current month. This computation produces a catalog of parameters with the corresponding mean, threshold, and background interval values for each month. Thus, the Incident Extractor component relies on the catalog, which defines the appropriate background interval values for each month. We assume that this catalog is fully computed on historical data before the extraction starts.

Figure 6. Fragment Extractor

In Figure 6, we define the background interval, which corresponds to the acceptable background values. It is represented by a lower bound B1 and an upper bound B2. We also show the threshold value, which is 1.5 times the background mean.

A fragment is defined by a beginning timestamp x and an end timestamp y. In other words, once a threshold value is found, the Incident Extractor searches for the nearest points x and y that represent the preceding and succeeding values of the threshold and extracts the current fragment from x to y. Note that these two values must lie within the background level interval. They identify, respectively, the time when the gamma dose rate starts to increase in an abnormal way and the time when it returns to a normal state. Moreover, it is worth noting that we designed a locking mechanism that does not allow the Incident Extractor to start a new extraction operation unless the previous one is completed. We used a locking mechanism because a graph may contain more than one fragment exceeding the threshold value. Thus, the Incident Extractor extracts these fragments sequentially, and the endpoint of the preceding fragment may become the starting point of the succeeding one.
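To make the computation concrete, here is a minimal sketch of one monthly catalog entry (background mean, the interval of mean plus or minus one standard deviation, and the 1.5x threshold) together with the sequential fragment extraction described above. Function names and the list-of-readings layout are illustrative assumptions, not the paper's actual implementation.

```python
import statistics

def monthly_catalog(month_values, k=1.5):
    """One catalog entry per month: background mean, the [B1, B2]
    interval (mean +/- one standard deviation), and the alarm
    threshold (k times the mean, with k = 1.5 as chosen in the paper).
    `month_values` is assumed to already exclude above-threshold
    readings, as the text requires."""
    mean = statistics.mean(month_values)
    sd = statistics.pstdev(month_values)
    return {"mean": mean, "B1": mean - sd, "B2": mean + sd,
            "threshold": k * mean}

def extract_fragments(series, entry):
    """Scan a month of readings; whenever a value reaches the threshold,
    walk outwards to the nearest points x and y lying inside [B1, B2]
    and emit the fragment (x, y).  Extraction is sequential (the
    'locking' behaviour): a new extraction only starts after the
    previous one ends, so the endpoint of one fragment may become
    the start of the next."""
    def inside(v):
        return entry["B1"] <= v <= entry["B2"]
    fragments, i, n = [], 0, len(series)
    while i < n:
        if series[i] >= entry["threshold"]:
            x = i
            while x > 0 and not inside(series[x]):
                x -= 1
            y = i
            while y < n - 1 and not inside(series[y]):
                y += 1
            fragments.append((x, y))
            i = y + 1  # lock released only once this fragment is closed
        else:
            i += 1
    return fragments
```

For a flat background of 10 with one excursion to 16, the single fragment spans from the last in-interval point before the peak to the first in-interval point after it; two back-to-back excursions share the in-between point as endpoint and start, matching the locking behaviour described in the text.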
Shapelet extraction has drawn significant research attention in recent years. Many algorithms have been proposed in the literature, such as Piecewise Aggregate Approximation (PAA) [5] and Multivariate Shapelet Detection (MSD) [6]. These approaches search for shapelets that are similar to a referent shapelet. They are not suitable for our problem, as we do not have any referent shapelet and, due to the evolution of the background level, the duration of our fragments can vary. Moreover, some approaches go beyond that and discuss extracting shapelets based on predefined key points [7]. These methods aim at detecting key points in the time series and then extracting the shapelets referring to these key points. Such approaches need to be investigated further in order to check their compatibility with our evolving background interval.

2) Identifying the Incident Causes: Once the fragments are extracted, we enrich them by querying external data sources to better understand the context that occurred in the past during a specific period of time, as defined in Figure 6 by the beginning timestamp x and the end timestamp y. Our goal is to annotate the fragments with their potential causes. Two kinds of incidents can be distinguished, based mainly on the duration of a fragment: those with a hard parabola and those with a soft parabola, as illustrated in the different scenarios detailed in Section II. At this stage, the cause behind each incident is unknown.

For incidents with a hard parabola, the cause will not be risky, as such an incident shows different peaks in a short period of time. In such a case, the cause could be the quality bits readings indicating an error in one of the system components, as shown in Figure 1. Other causes of hard parabola incidents are wind and earthquakes, which have a shaking effect on the probe, leading to a false increase in the gamma radiation level (Figure 2). Lightning can also cause the probe to report a false increase in the gamma dose rate level (Figure 3).

On the other hand, the causes behind incidents with a soft parabola are few. Rain is one of the most frequent causes that can lead to an increase of the gamma radiation level over a long period of time. This increase produces time series data with a soft parabola, knowing that the radiation level will return to normal values after a specific period of time. Other causes leading to a soft parabola in the gamma radiation level could be real threats. Although rain has an effect on the radiation level, its effect does not start directly upon occurrence. For example, we can notice in Figure 4 that several rain events occurred before the gamma radiation level started to increase. We can also notice how rain events continued after the gamma radiation level returned to normal values, without affecting it again. This tells us that the effect of rain on the gamma radiation level is not simultaneous.

In the literature, there exist causal models, such as the most well-known one, the Granger Causality Model [8], that can be helpful for finding the cause between time series, especially for hard parabolas. Indeed, many external factors, which come as time series data, have an immediate effect on the gamma dose rate. In other words, they are directly correlated, and this is evidence that the external factors are the cause, and not the other way round. If the external factors are not the cause of the hard parabola, we need to investigate the Quality Bits to look for a problem in the components of the probe; moreover, we need to inform technicians that the probe needs repair. However, for soft parabolas, the Granger Causality Model will certainly not work, as the correlations are neither direct nor obvious. In particular, we need to enrich the data much more broadly, as the cause may have happened before the beginning timestamp of the fragment. How much before, we do not know, as it can depend on the nature of the soil. Nor do we know whether the quantity of heavy rain has an impact on the increase and decrease of the gamma dose rate. At the moment, we do not know which techniques can be used; maybe a specific causal model needs to be defined. Finally, if multiple causes are combined, such as rain and wind, the causal model that can be used for hard parabolas will not work anymore. It is quite certain that we will encounter situations that we have not yet identified.

Once the possible causes of an incident have been identified, we can check the accuracy of our model by consulting the historical database. Indeed, some incidents have been annotated in the past by experts, and we can use this knowledge to tune our model. Moreover, at the beginning of this stage there will also be the possibility of involving an expert to evaluate the accuracy of our findings. Most probably we will find more causes per incident, because most of the time the expert focuses on the main cause and does not look for all causes, as that is time-consuming. Finding the causes of an incident amounts to finding a complex situation that occurred in the past. We plan to look at situation awareness models to better capture the complex nature of the causes (i.e. the situation). For instance, rain that occurred in the first week of October at a specific location may have more or less impact than rain occurring in the summer.
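A full Granger test fits autoregressive models; as a much lighter sketch of the "direct and immediate correlation" criterion used above for hard parabolas, the fragment below scans a few lags and reports the one at which a candidate external factor correlates most strongly with the gamma dose rate. A strong peak at lag 0 or 1 suggests an immediate (hard parabola) effect, whereas rain-like causes would only show up at longer, soil-dependent delays. Function names and the lag range are illustrative, not part of the paper's method.

```python
def pearson(a, b):
    """Plain Pearson correlation between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb) if sa and sb else 0.0

def best_lag(factor, dose, max_lag=3):
    """Correlate a candidate external factor (e.g. wind speed) with the
    gamma dose rate at lags 0..max_lag and return the lag with the
    strongest absolute correlation, plus all the scores."""
    scores = {}
    for lag in range(max_lag + 1):
        a = factor[:len(factor) - lag] if lag else factor
        scores[lag] = pearson(a, dose[lag:])
    return max(scores, key=lambda l: abs(scores[l])), scores
```

This is only a screening heuristic: unlike Granger causality, it cannot distinguish a causal link from a shared upstream cause, which is why the combined-cause situations discussed above would still need a richer model.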
What is clear is that each fragment will be annotated with the description of the situation (i.e. the causes or events happening together).

3) Classification: After identifying the causes and calculating the situation for each fragment, we need to classify the fragments in order to build and train our predictive model. We will start by comparing the different extracted incidents in order to set the main classes; then we will compare the situations inside each class to form the subclasses. The classification process will start forming the main classes by following a model-based approach. Through this approach, the incidents extracted from the gamma dose rate time series will be compared based on the graphical shapes representing the incidents. Several incident shapes will be gathered and classified as soft or hard parabola shapes.

Existing shapelet classifiers proposed in the literature use different techniques. In [9], the authors proposed a model for classifying shapelets using the minimum Euclidean distance between the input and the expected templates. The Mahalanobis distance, introduced in [10], takes into consideration the correlation of the data and is scale-invariant; its authors believe that this approach gives more accurate results than the Euclidean distance method. We also found some works that focus on simplifying the distance calculation. For instance, the DTW (Dynamic Time Warping) algorithm [11] focuses on reducing the computing complexity and improves efficiency. In addition, many previous approaches dealing with time series classification will be investigated to perform this task. Some potential mathematical models could be used in developing our classifier, including the Chebyshev distance [12], the Manhattan distance [13], and the Minkowski distance [14]. We will investigate all potential algorithms and models to find the technique that best fits our data.

After forming the main classes, the incidents will be compared based on their annotated situations. Subclasses will be formed based on the incident's annotated cause or combined causes. This refines the classes into subclasses based on the values obtained. We will end up with a wide range of classes that will be used in the online detection of incidents.

C. Online Detection and Prediction

Once the classes are defined, the Online Detection and Prediction phase can take place. Its job starts when an incident alarm is triggered. As said earlier, an alarm can be categorized into three types: false, innocent, or true. The aim of this phase is to check, as quickly as possible, whether the current incident, with its specific annotated information, corresponds to a predefined class. The main job of the online predictive model is to calculate the context by checking what the current situation is once an incident is captured. Based on the discovered situation, the model will start searching for similar incidents in the related classes.

1) Incident Pattern Matching Engine: The incident pattern matching engine is the analytical engine deployed to recognize new incidents by performing a matching operation over the annotated patterns produced at the incident classification step. We designed this analytical engine based on the Kappa architectural style, which means that the incident pattern matching operations are performed in real-time. At the same time, a pre-designed algorithm runs in the background to calculate the parameters that will be used for next month's evaluation. On incident detection, the framework will search for the current situation, since it will always be connected to the external and internal factor databases, and will run the Incident Matcher phase, which looks for the nearest point before the threshold belonging to the background level and starts matching the beginning of the current shapelet with those represented in the predefined classes.

Figure 7. Fragment Matcher

For example, Figure 7 shows the analysis process for incoming data. As we can notice, the readings start within the background level interval, which indicates a normal situation with accepted values. Once the readings exceed the upper bound of the background interval (B2) and reach the threshold value, the alarm is triggered, an incident case is detected, and the matching process starts. This provides different possibilities for the continuation of the current shapelet, referring to the shape obtained so far after comparing it to the previously classified incidents. The process is repeated until the possibilities become so limited that the cause can be detected. Thus, the framework will be able to detect the cause behind the incident as soon as possible and to alert the experts if special procedures must be taken. To perform this task, we can reuse the techniques defined in the Incident Extractor module.

2) Accuracy and Verification: The objective of our research is to propose a fully automated framework. However, we strongly believe that at the initial stage the solution needs an expert opinion to validate the results produced by the system. This validation is important due to the sensitivity of the use cases that will be implemented using this solution. It will help in increasing the accuracy rate of the proposed framework. Moreover, in the case of exceptional use cases that were not known, the involvement of the experts will help to enhance the solution by training the classifier over the data and making it capable of recognizing incident patterns that were unknown before.
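As an illustrative sketch of this prefix matching, the fragment below ranks class templates by DTW distance (one of the candidate measures listed in the classification step) against the growing prefix of the incoming shapelet. The two class templates are made-up examples, not the paper's real catalogued classes.

```python
def dtw(a, b):
    """Dynamic Time Warping distance between two numeric sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# Made-up templates standing in for the catalogued incident classes.
CLASSES = {
    "hard_parabola": [0.1, 0.9, 0.1, 0.8, 0.1, 0.7],  # short repeated peaks
    "soft_parabola": [0.1, 0.3, 0.5, 0.6, 0.5, 0.3],  # slow rise and decay
}

def match_incident(prefix, classes=CLASSES):
    """Rank the classes by DTW distance between the incoming shapelet
    prefix and each template truncated to the prefix length; as more
    readings arrive, the ranking narrows down to the likely cause."""
    return sorted(classes,
                  key=lambda c: dtw(prefix, classes[c][:len(prefix)]))
```

Called after every new alarm-mode reading, the top-ranked class stabilises as the prefix grows, which is exactly the narrowing of possibilities described for the Incident Matcher phase.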
IV. CONCLUSION

This paper presented an end-to-end framework for the (pre-)processing, processing, and analysis of radiation level data. The objective of developing this framework is to eliminate the manual intervention in radiation level monitoring systems. In this paper, we explained the key components of the framework, including the data pre-processor, incident extractor, data enhancer, and incident classifier. We provided a detailed description of an analytical engine which matches the fragment patterns in real-time and helps the experts make faster decisions regarding the verification of an alarm.

Several works are lined up for the future. In the near future, we plan to develop techniques for classifying the incidents. We will train, test, and optimize the classifier to guarantee the accuracy of the classification. We will also develop a real-time analytical engine using advanced tools for performing classification in real-time.

REFERENCES

[1] Agnieszka Jelewska and Michał Krawczak. "The Spectrality of Nuclear Catastrophe: The Case of Chernobyl". In: Electronic Visualisation and the Arts (2018). DOI: 10.14236/ewic/evac18.30.
[2] International Atomic Energy Agency. The Fukushima Daiichi Accident. International Atomic Energy Agency, 2015. ISBN: 9789201070159. URL: https://inis.iaea.org/search/search.aspx?orig_q=RN:46110858.
[3] Andrew E. Kramer. "Russia Orders Evacuation of Village Near Site of Nuclear Explosion". In: The New York Times (2019), pp. 13–14.
[4] Barde MP and Barde PJ. "What to use to express the variability of data: Standard deviation or standard error of mean?" In: Perspectives in Clinical Research 3 (2012), pp. 113–116. DOI: 10.4103/2229-3485.100662.
[5] Huiqing Wang et al. "Shapelet classification algorithm based on efficient subsequence matching". In: Data Science Journal 17 (2018), pp. 1–12. DOI: 10.5334/dsj-2018-006.
[6] Mohamed F. Ghalwash and Zoran Obradovic. "Early classification of multivariate temporal observations by extraction of interpretable shapelets". In: BMC Bioinformatics 13.1 (2012). DOI: 10.1186/1471-2105-13-195.
[7] Guiling Li, Wenhe Yan, and Zongda Wu. "Discovering shapelets with key points in time series classification". In: Expert Systems with Applications 132 (2019), pp. 76–86. DOI: 10.1016/j.eswa.2019.04.062.
[8] Andrew Arnold, Yan Liu, and Naoki Abe. "Temporal causal modeling with graphical Granger methods". In: Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (2007), pp. 66–75. DOI: 10.1145/1281192.1281203.
[9] Ali Julazadeh, Mahdi Marsousi, and Javad Alirezaie. "Classification based on sparse representation and Euclidian distance". In: 2012 IEEE Visual Communications and Image Processing (VCIP 2012). ISBN: 9781467344050. DOI: 10.1109/VCIP.2012.6410815.
[10] A. Govardhan and M. Arathi. "Accurate Time Series Classification Using Shapelets". In: International Journal of Data Mining & Knowledge Management Process (IJDKP) 2.3 (2011), pp. 1–23. arXiv: 1412.6564v1.
[11] Wang Huiqing et al. "Shapelet Classification Algorithm Based on Efficient Subsequence Matching". In: Electronic Visualisation and the Arts (2018).
[12] S. Demirci, I. Erer, and O. Ersoy. "Weighted Chebyshev Distance Algorithms for Hyperspectral Target Detection and Classification Applications", pp. 1–17.
[13] Oriehi Edisemi Destiny Anyaiwe et al. "Weighted Manhattan Distance Classifier: SELDI data for Alzheimer's disease diagnosis". In: 2017 IEEE Congress on Evolutionary Computation (CEC 2017), pp. 257–262. DOI: 10.1109/CEC.2017.7969321.
[14] Qingyang Zhang. "A class of association measures for categorical variables based on weighted Minkowski distance". In: Entropy 21.10 (2019), pp. 1–12. DOI: 10.3390/e21100990.