<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Hybrid digital twin-driven anomaly detection in IoT telemetry using LSTM autoencoder</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Emil Faure</string-name>
          <email>e.faure@chdtu.edu.ua</email>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Inna Rozlomii</string-name>
          <email>inna-roz@ukr.net</email>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Serhii Naumenko</string-name>
          <email>naumenko.serhii1122@vu.cdu.edu.ua</email>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff3">3</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Bohdan Khmelnytsky National University of Cherkasy</institution>
          ,
          <addr-line>Shevchenko Blvd., 81, Cherkasy, 18031</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Cherkasy State Technological University</institution>
          ,
          <addr-line>Shevchenko Blvd., 460, Cherkasy, 18006</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>State Scientific and Research Institute of Cybersecurity Technologies and Information Protection</institution>
          ,
          <addr-line>M. Zaliznyaka Str., 3 (6), Kyiv, 03142</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff3">
          <label>3</label>
          <institution>WDA'26: International Workshop on Data Analytics</institution>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2026</year>
      </pub-date>
      <abstract>
        <p>The rapid development of Internet of Things ecosystems leads to a substantial growth of sensory data streams and increases the need for intelligent monitoring methods capable of identifying abnormal behaviour in complex environments. Digital Twin technology provides a virtual reflection of physical systems and enables continuous comparison between expected and observed system behaviour. However, traditional anomaly detection methods often perform inconsistently in the presence of noise, missing data or slow system degradation. This paper proposes a hybrid anomaly detection approach that integrates Digital Twin simulation with an AI-based model using LSTM Autoencoder reconstruction. The Digital Twin layer forms a behavioural baseline, while the neural model evaluates deviations to detect contextual, collective and latent anomalies. Experimental evaluation shows that the hybrid DT+AI architecture demonstrates higher stability and detection capability compared to classical machine learning models, particularly in scenarios with signal distortions and incomplete telemetry. Visual inspection through time-series plots, reconstruction loss curves and multivariate heatmaps confirms interpretability and applicability of the approach in industrial IoT monitoring and predictive maintenance tasks. The results indicate that combining Digital Twin models with artificial intelligence strengthens anomaly detection performance and provides a promising foundation for real-time cyber-physical system analytics.</p>
      </abstract>
      <kwd-group>
        <kwd>Digital Twin</kwd>
        <kwd>IoT anomaly detection</kwd>
        <kwd>LSTM autoencoder</kwd>
        <kwd>hybrid DT+AI architecture</kwd>
        <kwd>telemetry time-series</kwd>
        <kwd>predictive monitoring</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        The growing integration of Internet of Things (IoT) technologies into industry, healthcare, smart cities,
and cyber-physical environments leads to an exponential increase in heterogeneous sensory data and
reinforces the demand for reliable analysis mechanisms [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Digital Twins have emerged as a key paradigm
for real-time system representation, enabling virtual replication of physical processes, prediction of
system behavior, and early identification of abnormal states. A Digital Twin continuously maintains
synchronization with its physical counterpart, receiving telemetry streams, reflecting condition changes
and generating analytical insight about potential faults [
        <xref ref-type="bibr" rid="ref2 ref3">2, 3</xref>
        ]. However, maintaining the reliability of
such models under dynamic conditions requires robust data-driven intelligence capable of detecting
anomalies, deviations, and unexpected behavior that may indicate system failure, sensor malfunction,
cyber intrusions or deteriorating operational parameters [
        <xref ref-type="bibr" rid="ref4 ref5">4, 5</xref>
        ].
      </p>
      <p>
        Traditional rule-based and statistical methods of anomaly detection often demonstrate limited
efficiency when dealing with noisy time series, missing values, nonlinear patterns and evolving operating
environments [
        <xref ref-type="bibr" rid="ref6 ref7 ref8">6, 7, 8</xref>
        ]. Similar challenges in reliability and security in distributed computing
environments were discussed in recent works [
        <xref ref-type="bibr" rid="ref10 ref11 ref9">9, 10, 11</xref>
        ]. In contrast, artificial intelligence brings new
capabilities for learning hidden dependencies and identifying latent patterns within telemetry sequences.
      </p>
      <p>AI-driven anomaly detection supports deeper contextual understanding of Digital Twin dynamics and
improves sensitivity to rare or previously unseen anomalies.</p>
      <p>
        Despite noticeable progress, current research still lacks sufficient integration between Digital Twin
behavioral models and intelligent data analytics approaches. Existing solutions rarely provide adaptive
anomaly thresholding that considers reconstruction errors and contextual features [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]. Additional
challenges arise in processing continuous real-time data streams under computation-constrained IoT
environments [13].
      </p>
      <p>To clarify the conceptual structure of the proposed approach and illustrate the interaction between
data acquisition, preprocessing, Digital Twin simulation and AI-based anomaly detection mechanisms,
the general data flow is presented in Figure 1. The diagram demonstrates how raw telemetry originating
from IoT sensors undergoes preprocessing, state-space digital twin simulation, anomaly evaluation
using LSTM-based reconstruction, and subsequent decision making with alert generation. The pipeline
also incorporates a cloud/edge computation layer, secure data transfer, model training and inference
paths, as well as a feedback loop for updating system behavior in real time, forming a unified analytical
cycle suitable for industrial IoT environments.</p>
      <p>These aspects highlight the relevance of designing an anomaly detection approach that incorporates
Digital Twin-based state modeling together with intelligent data analysis techniques. Such a hybrid
perspective provides a foundation for more accurate identification of abnormal events, reduction of
false alarms, and strengthening system resilience against unpredictable disturbances. In this study, an
AI-oriented method for anomaly detection in Digital Twin data is proposed, focusing on the analysis of
time-series telemetry originating from sensor-driven IoT systems. The research addresses the problem
of efficient detection of abnormal deviations in Digital Twin data streams, evaluates the performance
of machine learning models in noisy environments, and demonstrates the advantages of combining
Digital Twin dynamics with artificial intelligence techniques for improving anomaly detection quality.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Related works</title>
      <p>The integration of Digital Twin (DT) technology with artificial intelligence methods for anomaly
detection in IoT systems has received significant attention in recent years. Digital Twins, as virtual
replicas of physical systems, enable real-time monitoring, simulation, and predictive analysis. Their
growing use in industrial IoT (IIoT), smart manufacturing, energy systems, and healthcare applications
highlights the need for robust mechanisms to detect deviations from expected behavior based on
telemetry data [14].</p>
      <p>Traditional anomaly detection techniques, including threshold-based monitoring, statistical process
control, and rule-based systems, remain insufficient in dynamic, noisy and complex environments
[15]. The need for stable execution environments and secure workload orchestration in distributed
infrastructures is discussed in studies on containerized scheduling and application security enhancement
[16, 17]. These methods often fail to detect contextual or collective anomalies, especially in multivariate
time series generated by IoT sensor networks. In contrast, machine learning (ML) and deep learning
(DL) techniques offer flexible frameworks for learning latent representations and identifying deviations
in data without requiring handcrafted rules [18].</p>
      <p>A wide range of ML-based methods has been proposed for anomaly detection, including
One-Class Support Vector Machines (OC-SVM), k-Nearest Neighbors (kNN), Isolation Forests (iForest),
and statistical clustering [19]. However, these approaches often struggle with temporal dependencies
and sequence modeling. To address this, recent studies have employed recurrent neural networks
(RNNs), particularly Long Short-Term Memory (LSTM) networks, to learn temporal patterns in IoT
telemetry [20, 21]. Autoencoders, including variational and denoising versions, are widely used for
reconstruction-based anomaly detection, where deviations are inferred from reconstruction loss.</p>
      <p>Recent research has explored hybrid models that combine physics-based simulation of Digital Twins
with AI-based anomaly detection [22, 23]. For example, several authors propose the integration of
LSTM-based autoencoders with digital twin models to predict equipment degradation or detect faults
in real-time [24]. Others focus on GAN-based anomaly detectors, which can generate realistic samples
and highlight abnormal patterns as outliers in latent space [25]. Still, these approaches often rely on
large labeled datasets, which are rare in industrial contexts, and lack robustness under conditions of
noise, missing data, or abrupt changes in system behavior.</p>
      <p>There is also growing interest in edge AI solutions, which aim to implement anomaly detection directly
on embedded IoT devices or edge gateways [26]. Lightweight deep learning models, quantized networks
and pruning techniques are employed to reduce computational complexity. However, integrating
such methods with real-time DT data streams remains challenging due to latency constraints, energy
limitations, and variability in sensor quality.</p>
      <p>In terms of Digital Twin modeling itself, a considerable number of works focus on system-level
simulation, control loop integration, and predictive maintenance [27, 28]. Nevertheless, fewer studies
have investigated how DT models can serve not only as mirrors of physical systems but also as
contextual data filters that enhance anomaly detection by embedding physical knowledge [29]. This creates
an opportunity for synergistic approaches where digital twins not only provide structural simulation
but also reinforce data analytics with behavior-based baselines.</p>
      <p>Given the current state of the field, it becomes evident that a unified method combining digital twin
behavior modeling with adaptive AI-based anomaly detection – particularly for noisy, incomplete,
or streaming data – remains an open research challenge. The present work aims to address this
gap by designing and evaluating a hybrid framework that leverages both the predictive capability of
digital twins and the pattern recognition power of deep neural networks, focusing on IoT telemetry in
resource-constrained settings.</p>
    </sec>
    <sec id="sec-3">
      <title>3. Research methodology</title>
      <sec id="sec-3-1">
        <title>3.1. Architecture of the approach</title>
        <p>The proposed anomaly detection approach is based on a synergistic integration of a Digital Twin
simulation model and intelligent AI-based analytics for processing telemetry data from IoT sensors.
The overall architecture is designed to enable real-time detection of abnormal events, with emphasis
on handling noisy, incomplete, and dynamic multivariate data streams. The core idea is to enhance
anomaly detection accuracy by incorporating domain knowledge through a Digital Twin layer, which
serves both as a behavioral reference and as a context-aware filter before data enters the AI detection
pipeline.</p>
        <p>The system architecture consists of five functional layers arranged sequentially:
1. IoT Sensor Layer, which collects real-time telemetry from physical devices.
2. Data Preprocessing Layer, responsible for noise filtering, normalization, and handling of missing
values.
3. Digital Twin Simulation Layer, which reproduces expected system behavior and forms deviation vectors.
4. AI Detection Layer, which evaluates anomaly scores using reconstruction-based deep models.
5. Decision and Visualization Layer, responsible for alert generation, reporting, and monitoring.</p>
        <p>The data flow begins with sensor measurements that may include parameters such as temperature,
vibration, voltage, humidity, or CPU load. These raw signals are first preprocessed using filtering
techniques and rescaled into consistent formats. Subsequently, the data is passed to the Digital Twin
model, which simulates expected behavior under normal conditions. Any deviation between observed
and simulated outputs is quantified and passed along as enriched input for the anomaly detection module.
This hybrid approach allows the AI model to work not only on raw values, but also on contextualized
discrepancies, thus improving sensitivity to complex anomaly types such as contextual and collective
anomalies.</p>
        <p>To handle temporal dependencies, the anomaly detection module is implemented as an LSTM-based
Autoencoder that reconstructs recent windows of telemetry data. Abnormal behavior is inferred from
high reconstruction errors, which signal deviations from learned normal patterns. The resulting anomaly
scores are compared against adaptive thresholds – calculated dynamically based on moving averages
and confidence intervals – so that both abrupt spikes and gradual drifts can be identified.</p>
        <p>The system is designed to operate either in a centralized architecture (e.g., cloud-based deployment)
or in a decentralized edge computing environment, where Digital Twin logic and AI detection modules
are executed on lightweight embedded platforms. This makes the approach applicable for low-power
IoT devices that operate in resource-constrained conditions.</p>
        <p>The modular nature of the architecture also allows flexible substitution of individual components:
different preprocessing pipelines, simulation models, or AI detectors can be integrated depending on
the application domain. The layered structure ensures scalability and supports future extensions, such
as integrating cryptographic protection of telemetry data or adding federated learning mechanisms.</p>
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Dataset formation</title>
        <p>The construction of a suitable dataset is a critical component for evaluating the proposed anomaly
detection method in Digital Twin-driven IoT systems. Since the accuracy and generalizability of AI
models strongly depend on the quality, variability, and representativeness of the data used during
training and testing, a carefully designed approach to dataset formation is adopted.</p>
        <p>The initial source of data consists of multivariate telemetry streams collected from simulated or
real IoT sensors. These may include temperature, humidity, vibration, voltage, current, CPU load, and
network delay metrics, depending on the target application domain.</p>
        <p>To formalize the structure of anomalous patterns used for dataset generation, a classification of
anomaly types relevant to Digital Twin telemetry was compiled. Table 1 summarizes the main
categories considered in this study, including point, contextual, and collective anomalies, along with their
behavioural characteristics and examples of manifestation in IoT signals. This categorization guided the
synthetic anomaly injection process and ensured that the dataset reflects realistic scenarios observable
in operational cyber-physical systems.</p>
        <p>In this work, the dataset incorporates both normal system behavior patterns and a broad range of
injected anomalies in order to reflect realistic operating conditions.</p>
        <p>To ensure statistical richness and model robustness, the dataset is constructed as a combination of:
1. Clean normal data sequences, reflecting standard operation under nominal conditions as modeled
by the Digital Twin core;
2. Synthetic anomalies, generated through controlled manipulation of time-series data using multiple
strategies: point anomalies – sudden spikes or drops in sensor values; contextual anomalies –
values that appear abnormal within a specific context (e.g., temperature spikes only when load is
low); collective anomalies – prolonged deviations or trends inconsistent with the Digital Twin
prediction (e.g., slow drifts or periodic faults);
3. Real-world anomalies, when available, extracted from open datasets (e.g., NASA bearing data,</p>
        <p>SMAP/MTD from NASA’s telemetry anomalies, or proprietary industrial logs if accessible).</p>
        <p>Missing values and sensor noise are also introduced into the dataset to simulate practical challenges.
Data gaps are inserted randomly with varying durations, and Gaussian noise is added with different
amplitudes to reflect interference or hardware imprecision.</p>
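        <p>The injection strategies above can be sketched in Python. The snippet below is an illustrative stand-in, not the generation code used in the study: the base signal, anomaly magnitudes, and gap lengths are arbitrary assumptions chosen to mimic point anomalies, collective drifts, Gaussian noise, and random data gaps.</p>

```python
import numpy as np

def inject_anomalies(signal, rng):
    """Inject a point spike, a collective drift, Gaussian noise and a data gap."""
    x = signal.astype(float).copy()
    labels = np.zeros(len(x), dtype=int)   # 1 marks an injected anomaly
    # Point anomaly: a sudden spike at a random position
    i = rng.integers(10, len(x) - 10)
    x[i] += 5.0 * x.std()
    labels[i] = 1
    # Collective anomaly: a slow additive drift over a contiguous segment
    j = rng.integers(10, len(x) - 60)
    x[j:j + 50] += np.linspace(0.0, 3.0 * x.std(), 50)
    labels[j:j + 50] = 1
    # Gaussian sensor noise over the whole series
    x += rng.normal(0.0, 0.1 * x.std(), size=len(x))
    # Random gap of missing values with a varying duration
    g = rng.integers(10, len(x) - 20)
    x[g:g + rng.integers(3, 15)] = np.nan
    return x, labels

clean = np.sin(np.linspace(0, 20 * np.pi, 1000))   # stand-in for a normal channel
noisy, labels = inject_anomalies(clean, np.random.default_rng(42))
```

        <p>Contextual anomalies (values abnormal only in a given operating context) would additionally require conditioning the injection on a second channel, which is omitted here for brevity.</p>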
        <p>The dataset is divided into three subsets:
1. Training set (70%) – contains only normal sequences used to train the autoencoder without
exposure to anomalies;
2. Validation set (15%) – includes both normal and anomalous samples used for model tuning and
threshold selection;
3. Test set (15%) – used for final performance evaluation, containing a mix of known and unseen
anomaly types.</p>
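        <p>Under the stated 70/15/15 proportions, the split can be sketched as follows. The array shapes and the half-and-half allocation of anomalous windows between validation and test are illustrative assumptions; only the training set being anomaly-free follows the text.</p>

```python
import numpy as np

def split_dataset(normal_windows, anomalous_windows, rng):
    """70/15/15 split: train on normal data only; val and test mix both kinds."""
    n = len(normal_windows)
    idx = rng.permutation(n)
    n_train, n_val = int(0.70 * n), int(0.15 * n)
    train = normal_windows[idx[:n_train]]                      # normal sequences only
    val_norm = normal_windows[idx[n_train:n_train + n_val]]
    test_norm = normal_windows[idx[n_train + n_val:]]
    half = len(anomalous_windows) // 2                          # assumed 50/50 allocation
    val = np.concatenate([val_norm, anomalous_windows[:half]])
    test = np.concatenate([test_norm, anomalous_windows[half:]])
    return train, val, test

rng = np.random.default_rng(0)
normal = rng.normal(size=(1000, 30, 4))       # 1000 windows, 30 steps, 4 features
anom = rng.normal(size=(100, 30, 4)) + 3.0    # synthetic anomalous windows
train, val, test = split_dataset(normal, anom, rng)
print(train.shape, val.shape, test.shape)   # → (700, 30, 4) (200, 30, 4) (200, 30, 4)
```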
        <p>Each time-series window is encoded using a sliding window technique, with the window size and
stride empirically selected based on the temporal granularity of the monitored system. Statistical
normalization (e.g., z-score) is applied per feature to ensure consistency and improve training stability.</p>
        <p>By simulating both standard and abnormal operational conditions of the Digital Twin system, the
resulting dataset supports the training of neural architectures that are capable of learning latent
structures of normality and distinguishing subtle deviations. Furthermore, the design of the dataset
allows controlled benchmarking of the proposed method against alternative detection models under
varying levels of noise, data loss, and temporal context.</p>
      </sec>
      <sec id="sec-3-3">
        <title>3.3. Data preprocessing</title>
        <p>Preprocessing of sensor data is a critical step in preparing input for both the Digital Twin simulation layer
and the AI-based anomaly detection module. The quality of data directly influences the performance of
learning algorithms, particularly in the presence of real-world challenges such as noise, missing values,
non-stationarity, and scale variability across different sensor channels. The preprocessing pipeline
employed in this study is designed to ensure that telemetry data is clean, consistent, and temporally
aligned to support robust learning and anomaly inference.</p>
        <p>The raw data streams obtained from IoT sensors are multivariate time series, each characterized
by specific sampling frequencies, value ranges, and physical units. To enable effective integration,
the first step involves resampling all time-series signals to a common time base using interpolation
techniques. In cases where sensor signals are asynchronous or event-based, interpolation with fixed
intervals ensures temporal alignment without introducing artificial bias.</p>
        <p>Next, missing values are addressed through a hybrid strategy that combines linear interpolation,
forward/backward filling, and in some cases polynomial interpolation for smoother recovery. The
approach dynamically selects the imputation method depending on the length and position of the data
gap. For example, short-term missing sequences (&lt;5 time steps) are filled using linear interpolation,
while longer gaps at the beginning or end of the window invoke forward/backward filling to avoid
distortion.</p>
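        <p>One possible pandas rendering of this hybrid imputation, assuming the 5-step cutoff mentioned above; the run-length bookkeeping is one way to route short gaps to linear interpolation and long or edge gaps to forward/backward filling, not necessarily the authors' implementation.</p>

```python
import numpy as np
import pandas as pd

def fill_gaps(series, short_gap=5):
    """Linear interpolation for gaps shorter than `short_gap` steps;
    forward/backward filling for longer gaps and gaps at the window edges."""
    s = series.copy()
    is_na = s.isna()
    # label each contiguous run of values and measure the length of NaN runs
    run_id = (is_na != is_na.shift()).cumsum()
    run_len = is_na.groupby(run_id).transform("size")
    short = is_na & (short_gap > run_len)
    interp = s.interpolate(method="linear", limit_area="inside")
    s[short] = interp[short]
    # remaining long gaps and edge gaps: forward then backward fill
    return s.ffill().bfill()

s = pd.Series([1.0, np.nan, 3.0, 4.0, np.nan, np.nan, np.nan, np.nan, np.nan, 10.0, 11.0])
out = fill_gaps(s)
print(out.tolist())   # → [1.0, 2.0, 3.0, 4.0, 4.0, 4.0, 4.0, 4.0, 4.0, 10.0, 11.0]
```

        <p>The single-step gap is interpolated, while the five-step gap falls back to forward filling, matching the policy described in the text.</p>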
        <p>Noise reduction is performed through low-pass filtering using Savitzky–Golay or moving average
filters, depending on the application context. These filters smooth high-frequency fluctuations while
preserving local trends and inflection points, which are essential for contextual anomaly detection.
To ensure consistency across features with different units and scales, z-score normalization is applied
individually to each variable (1).</p>
        <p>x_norm = (x − μ) / σ, (1)</p>
        <p>where μ is the mean and σ is the standard deviation of the corresponding time-series window. This
standardization centers the data around zero and allows the anomaly detection model to treat all features
with equal importance during training.</p>
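        <p>The smoothing and normalization steps can be combined per channel as below, assuming scipy is available; the filter window length and polynomial order are illustrative choices, not values fixed by the study.</p>

```python
import numpy as np
from scipy.signal import savgol_filter

def preprocess_channel(x, window=11, polyorder=3):
    """Savitzky-Golay low-pass smoothing followed by z-score normalization (Eq. 1)."""
    smooth = savgol_filter(x, window_length=window, polyorder=polyorder)
    mu, sigma = smooth.mean(), smooth.std()
    return (smooth - mu) / sigma

# Noisy sine wave as a stand-in for one telemetry channel
x = np.sin(np.linspace(0, 4 * np.pi, 200)) + np.random.default_rng(0).normal(0, 0.05, 200)
z = preprocess_channel(x)
# z has zero mean and unit variance by construction
```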
        <p>To capture temporal dependencies, the multivariate time series is segmented into overlapping
windows using a sliding window technique. Each input window is represented as a matrix of size n × m,
where n is the window length and m is the number of sensor features. The stride between consecutive
windows is adjusted empirically to balance temporal resolution and computational efficiency.</p>
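        <p>Window segmentation as described can be sketched with plain NumPy; the window and stride values below are placeholders to be tuned empirically, as noted above.</p>

```python
import numpy as np

def make_windows(data, window, stride):
    """Segment a (T, m) multivariate series into overlapping (n, m) window matrices."""
    T = data.shape[0]
    starts = range(0, T - window + 1, stride)
    return np.stack([data[s:s + window] for s in starts])

data = np.random.default_rng(0).normal(size=(100, 4))   # T=100 steps, m=4 features
w = make_windows(data, window=30, stride=5)
print(w.shape)   # → (15, 30, 4): 15 overlapping windows of length 30
```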
        <p>An additional feature extraction step may optionally be applied to enrich the input representation.
This includes statistical metrics such as mean, variance, skewness, and kurtosis; frequency-domain
features obtained via the Fast Fourier Transform (FFT); and domain-specific indicators. These features
can be concatenated with the raw window data for use in hybrid models that combine deep learning
with conventional anomaly scoring functions.</p>
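        <p>A minimal version of this optional feature extraction step, assuming scipy for the higher-order moments; taking the dominant FFT magnitude as the frequency-domain summary is an illustrative simplification of the features listed above.</p>

```python
import numpy as np
from scipy.stats import skew, kurtosis

def window_features(window):
    """Per-feature statistics plus dominant FFT magnitude for one (n, m) window."""
    stats = np.concatenate([
        window.mean(axis=0),        # mean per feature
        window.var(axis=0),         # variance per feature
        skew(window, axis=0),       # skewness per feature
        kurtosis(window, axis=0),   # kurtosis per feature
    ])
    spectrum = np.abs(np.fft.rfft(window, axis=0))
    dominant = spectrum[1:].max(axis=0)   # skip the DC component
    return np.concatenate([stats, dominant])

w = np.random.default_rng(0).normal(size=(64, 4))
f = window_features(w)
print(f.shape)   # → (20,): 4 statistics x 4 features + 4 dominant magnitudes
```

        <p>The resulting vector can be concatenated with the raw window data for hybrid models, as described above.</p>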
        <p>As a result, the preprocessing stage produces a clean, normalized, and temporally structured dataset
that reflects both raw sensor behavior and contextual system dynamics. This prepared data is then
forwarded in parallel to the Digital Twin simulation core and the AI-based anomaly detection module,
enabling each layer to operate on synchronized and semantically rich inputs.</p>
      </sec>
      <sec id="sec-3-4">
        <title>3.4. AI models</title>
        <p>To detect anomalies in the telemetry data generated by Digital Twin-enabled IoT systems, a combination
of classical machine learning and deep learning models was implemented and evaluated. The goal of
this modeling layer is to learn the latent structure of normal system behavior from historical data and
identify deviations indicative of abnormal or potentially dangerous system states.</p>
        <p>Two conventional anomaly detection algorithms were used as baselines for comparative evaluation:
1. Isolation Forest – an ensemble-based method that isolates observations by randomly selecting
a feature and a split value. Anomalies are expected to be isolated faster and thus have shorter
average path lengths in the tree structure. Isolation Forest performs well with high-dimensional
data but lacks sensitivity to temporal dependencies.
2. One-Class SVM (OC-SVM) – a kernel-based method that learns the boundary of the normal data
distribution in feature space and classifies new points based on their distance from this boundary.
While effective for simple distributions, OC-SVM struggles with dynamic time-series data and is
sensitive to parameter selection.</p>
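        <p>Both baselines are available in scikit-learn. The snippet below is a usage sketch on synthetic flattened windows, not the study's experimental configuration; the `nu` value and contamination handling are illustrative defaults.</p>

```python
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 8))                     # flattened normal windows
X_test = np.vstack([rng.normal(size=(50, 8)),           # normal samples
                    rng.normal(loc=4.0, size=(10, 8))]) # clear point anomalies

iforest = IsolationForest(random_state=0).fit(X_train)
ocsvm = OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit(X_train)

# Both models return +1 for inliers and -1 for anomalies
print((iforest.predict(X_test) == -1).sum(), (ocsvm.predict(X_test) == -1).sum())
```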
        <p>These models provide reference points for evaluating the improvements offered by neural
architectures and hybrid strategies.</p>
        <p>The primary anomaly detection method is based on Autoencoder architectures, particularly Long
Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) variants, which are well-suited for
sequential data and long-range temporal dependencies. The Autoencoder is trained in an unsupervised
manner using only normal data windows. Its objective is to reconstruct the input sequence with minimal
loss, learning a compressed representation in the latent space.</p>
        <p>During inference, anomalous sequences that difer significantly from the learned structure will result
in high reconstruction errors, serving as a signal of abnormality. The reconstruction error is computed
as the root mean square error (RMSE) between the input and the output sequence (2).</p>
        <p>E = √( (1/n) ∑_{t=1}^{n} (x_t − x̂_t)² ), (2)</p>
        <p>where x_t denotes the original input value, x̂_t is the reconstructed value produced by the Autoencoder,
and n represents the number of time steps within the sliding window.</p>
        <p>A dynamically adjustable anomaly threshold  is defined based on the distribution of reconstruction
errors computed on the validation set. Typically, this threshold is determined using statistical criteria
such as the mean and standard deviation of the reconstruction error distribution or percentile-based
cutoffs. Sequences satisfying E &gt; T are classified as anomalous.</p>
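        <p>The per-window RMSE of Eq. (2) and a mean-plus-k-standard-deviations threshold can be sketched as follows; the factor k = 3 and the synthetic reconstructions are assumptions made for illustration only.</p>

```python
import numpy as np

def rmse_per_window(x, x_hat):
    """Eq. (2): RMSE between input and reconstruction for each (n, m) window."""
    return np.sqrt(np.mean((x - x_hat) ** 2, axis=(1, 2)))

def adaptive_threshold(errors, k=3.0):
    """T = mean + k * std of validation reconstruction errors; E > T flags a window."""
    return errors.mean() + k * errors.std()

rng = np.random.default_rng(0)
x = rng.normal(size=(200, 30, 4))                   # observed windows
x_hat = x + rng.normal(0, 0.1, size=x.shape)        # near-perfect reconstructions
x_hat[-5:] += 2.0                                   # 5 badly reconstructed windows
E = rmse_per_window(x, x_hat)
T = adaptive_threshold(E[:150])                     # threshold from "validation" errors
print((E > T).sum())   # at least the 5 corrupted windows exceed T
```

        <p>Percentile-based cutoffs, also mentioned above, would replace the mean/std rule with e.g. np.percentile(errors, 99).</p>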
        <p>The proposed architecture further integrates knowledge derived from the Digital Twin (DT) simulation
into the anomaly detection pipeline. Specifically, the model does not rely solely on raw telemetry data
but also incorporates error vectors obtained by comparing actual sensor measurements with simulated
outputs generated by the Digital Twin. These deviations provide context-aware signals that highlight
abnormal system behavior relative to its expected physical dynamics.</p>
        <p>This hybrid design enables the AI model to operate not only on the raw input signal , but also on
residual signals defined as  = − DT where DT denotes the simulated (predicted) values produced by
the Digital Twin model. As a result, the anomaly detection system becomes more sensitive to behavioral
anomalies that may remain hidden in the raw telemetry data but are revealed through model-based
comparisons.</p>
        <p>The dual-input architecture can be implemented either by concatenating raw and residual feature
vectors prior to feeding them into the Autoencoder, or by designing a two-branch encoder network that
jointly processes both data sources. Experimental results demonstrate that this hybrid DT+AI approach
outperforms classical statistical methods and standard deep learning baselines in terms of detection
accuracy, precision, and robustness to noise.</p>
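        <p>The concatenation variant of the dual-input design reduces to stacking raw and residual channels along the feature axis, as in this sketch; the window shapes are placeholders.</p>

```python
import numpy as np

def dual_input(x, x_dt):
    """Concatenate raw telemetry with DT residuals r = x - x_dt along the feature axis."""
    r = x - x_dt
    return np.concatenate([x, r], axis=-1)

rng = np.random.default_rng(0)
x = rng.normal(size=(30, 4))                    # observed window: 30 steps, 4 sensors
x_dt = x - rng.normal(0, 0.05, size=x.shape)    # DT prediction close to observation
combined = dual_input(x, x_dt)
print(combined.shape)   # → (30, 8): 4 raw channels + 4 residual channels
```

        <p>The two-branch encoder alternative would instead feed x and r into separate encoder stacks whose latent codes are merged before decoding.</p>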
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Experimental setup and implementation</title>
      <sec id="sec-4-1">
        <title>4.1. Digital twin implementation</title>
        <p>The implementation of the digital twin for modeling the process of the IoT system was performed in the
Python environment using the NumPy, Pandas, and TensorFlow packages for further integration with
deep learning models. Node-RED was also used for prototyping data flows and organizing telemetry
exchange, which allowed for the rapid formation of message processing routes, connection of sensor
nodes, and visual flow control. In cases where it is necessary to perform physical modeling of an object
or process, Simulink was used as a state-space modeling tool. Thus, the selected architecture provides
flexibility and the ability to transfer the logic of the digital twin to other execution environments.</p>
        <p>The digital twin is implemented as a state description of the system dynamics, where each sensor
parameter forms a state vector that changes over time according to the predicted model. The state
model makes it possible to reproduce the normal behavior of the system, predict the next parameter values,
and form reference series for comparison with real indicators. The difference between the model and
actual measurements is the basis for constructing deviation vectors, which are used by the subsequent
anomaly detection module.</p>
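        <p>A toy state-space twin illustrating how deviation vectors are formed; the matrices A and C below are invented placeholders for illustration, not parameters identified from a real plant.</p>

```python
import numpy as np

# Minimal linear state-space model: s[t+1] = A @ s[t], y[t] = C @ s[t].
A = np.array([[0.99, 0.05], [-0.05, 0.99]])   # slowly rotating, slightly damped state
C = np.array([[1.0, 0.0]])                    # observe the first state component

def simulate_twin(s0, steps):
    """Roll the state model forward and return the predicted sensor series."""
    s, out = s0, []
    for _ in range(steps):
        out.append(C @ s)
        s = A @ s
    return np.array(out).ravel()

predicted = simulate_twin(np.array([1.0, 0.0]), 200)
rng = np.random.default_rng(0)
observed = predicted + rng.normal(0, 0.02, size=200)   # telemetry = model + sensor noise
deviation = observed - predicted                        # input for the anomaly module
print(deviation.shape)   # → (200,)
```

        <p>In the full system the deviation series, rather than the raw telemetry alone, is forwarded to the detection module as described above.</p>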
        <p>Interaction with sensor nodes is organized through an MQTT broker, where each node transmits
parameter values with a frequency of 1 to 5 seconds, depending on the requirements of the experiment.
The real-time data stream is fed to the preprocessing module, synchronized by timestamps and
transmitted to the digital twin to calculate predicted values. To ensure stable operation, packet buffering is
provided, which minimizes losses due to network delays. This architecture makes it possible to model both
autonomous systems and scenarios with mixed data processing at the edge and cloud levels, which is
especially important for further testing of deep models under different load conditions.</p>
      </sec>
      <sec id="sec-4-2">
        <title>4.2. Software module architecture</title>
        <p>The software module of the anomaly detection system is built on a modular principle, which provides
flexibility of configuration and the possibility of further expansion. The architecture includes four logical
layers that interact in the data stream processing mode. The first layer is the Data Acquisition Layer,
responsible for receiving telemetry from sensor nodes via MQTT. At this stage, incoming messages are
cached and a buffer is formed to protect against packet loss in case of network delays. Data streams are
normalized by time stamps and stored in an intermediate structure available for further processing.</p>
        <p>The second functional component is the Data Processing Layer, which implements cleaning,
smoothing and normalization of time series. Sliding-window smoothing is used to filter noise, and linear or polynomial
interpolation is used to compensate for gaps. Parameter values are brought to a single scale by the
standardization method based on the mean and standard deviation. At this stage, data segmentation is
also performed in a fixed-length sliding window, which allows preparing them for analysis using deep
learning models.</p>
        <p>The third layer is the AI Detection Layer. The model receives two data streams: normalized
series of sensor indicators and a deviation vector obtained by comparing with the predictions of the
digital twin. The LSTM Autoencoder architecture is trained on examples of normal system behavior,
reconstructing time windows with minimal error. After the training stage is completed, the model is
used to calculate the reconstruction error for new sequences. The greater the error between the input
and reconstructed signals, the higher the probability of an anomaly.</p>
        <p>The final layer, Visualization &amp; Reporting, is responsible for visualization and interpretation of the
results. The module creates a monitoring dashboard, where the values of sensor parameters, predicted
digital twin indicators, reconstruction error, and anomaly status are displayed in real time. Additionally,
a time series graph is generated with highlighted intervals where the model detected deviations. The
results can be exported in report format or integrated into external information systems via REST API.</p>
      </sec>
      <sec id="sec-4-3">
        <title>4.3. Performance metrics</title>
        <p>To evaluate the anomaly detection system, we measured classification accuracy and the model’s
ability to distinguish normal from abnormal states. The main comparison criteria
are Precision, Recall, F1-score and the area under the ROC curve (ROC-AUC), which together capture
the balance between the probability of false alarms and the system’s ability to detect real deviations.
Precision is the proportion of correctly detected anomalies among all alarms, while Recall
is the proportion of actual anomalies detected by the model. A higher F1-score indicates
better agreement between these two characteristics. ROC-AUC serves as an integral
measure of system quality that is robust to changes in the classification threshold, which is especially
important when processing real IoT data streams.</p>
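        <p>For reference, the three threshold-dependent metrics can be computed directly from confusion counts; this is a generic sketch, not the evaluation code used in the experiments:</p>

```python
# Precision, Recall and F1-score from confusion counts, as defined in the text.
# In practice sklearn.metrics provides equivalent functions.
def precision_recall_f1(tp: int, fp: int, fn: int):
    """Return (precision, recall, f1) with zero-division guarded."""
    precision = tp / (tp + fp) if tp + fp else 0.0  # correct among all alarms
    recall = tp / (tp + fn) if tp + fn else 0.0     # found among all anomalies
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

p, r, f1 = precision_recall_f1(tp=8, fp=2, fn=2)
print(round(p, 2), round(r, 2), round(f1, 2))  # 0.8 0.8 0.8
```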
        <p>The key parameter in reconstruction models is the average reconstruction error, defined as
the mean squared deviation between the input and reconstructed signals at the output of the
autoencoder. If the reconstruction error exceeds a set threshold, the sequence is marked as
potentially anomalous. The threshold T is chosen experimentally from the statistics of validation
windows: it is derived from the mean reconstruction error adjusted by the standard deviation.
In this way, the model adaptively responds to changes in the signal structure
and minimizes the number of false alarms.</p>
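        <p>One common way to realize such a threshold, sketched here as an assumption consistent with the description above, is T = mean + k * std of the validation reconstruction errors:</p>

```python
# Adaptive threshold sketch (an assumption of this illustration):
# T is the mean validation reconstruction error plus k standard deviations.
import numpy as np

def fit_threshold(val_errors: np.ndarray, k: float = 3.0) -> float:
    """Threshold T from the statistics of validated windows."""
    return float(val_errors.mean() + k * val_errors.std())

def flag_anomalies(errors: np.ndarray, t: float) -> np.ndarray:
    """Mark sequences whose reconstruction error exceeds T."""
    return errors > t

val = np.array([0.10, 0.12, 0.09, 0.11, 0.10])   # errors on normal windows
t = fit_threshold(val, k=3.0)
test_errors = np.array([0.11, 0.45, 0.10])
print(flag_anomalies(test_errors, t))  # only the 0.45 window is anomalous
```

        <p>Because T is re-estimated from recent validated windows, the detector tracks slow changes in signal structure instead of relying on a hand-set constant.</p>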
        <p>To study the system’s performance on peripheral computing nodes, the
per-packet processing delay, the average inference time and the computational cost of the model
were measured separately. This allows us to assess the suitability of the algorithm for deployment on
low-power edge devices, where the balance between anomaly detection quality and speed is
critical. When the system operates in real time, these parameters affect the response speed and
the feasibility of integrating the solution into industrial environments. The evaluation was carried out
in a series of experiments comparing baseline models and deep autoencoder architectures,
which allowed us to establish their effectiveness under different load conditions and
input noise levels.</p>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>5. Results and evaluation</title>
      <p>During the experimental verification of the proposed approach, several models for
detecting anomalies in digital twin data were compared. The aim of the experiment was to determine
the effectiveness of the classical Isolation Forest and One-Class SVM algorithms and compare them with
the LSTM Autoencoder model, as well as with the proposed hybrid solution, which takes into account
deviations from the digital twin forecast. The models were tested on a generated dataset containing both
normal time series and synthetically added anomalous regions, including point and collective anomalies.</p>
      <p>The results were compared using the Precision, Recall, F1-score and ROC-AUC
metrics. As shown in Table 2, the classical methods demonstrated a baseline level of accuracy and
detected obvious anomalies in the data, but lacked sensitivity to contextual cases. The LSTM
Autoencoder model showed a significantly higher ability to reconstruct normal patterns and detect
deviations via the reconstruction error. The best result was obtained with the proposed hybrid architecture,
in which the input features additionally included the deviation vector between the actual values and the
digital twin prediction. This allowed better detection of weak-signal and gradual changes in the
system state, which the classical algorithms missed.</p>
      <p>To illustrate classification performance beyond numeric scores, confusion matrices were generated
for all evaluated models. Figure 2 demonstrates the distribution of true positives (TP), true negatives
(TN), false positives (FP) and false negatives (FN). A visible reduction in FN values for the proposed
Hybrid DT+AI model indicates its advantage in detecting subtle anomalies that classical methods fail to
capture.</p>
      <p>The results confirm that integrating the digital twin into the analysis process increases
the accuracy of anomaly detection compared to models operating without contextual modeling of
system behavior. The improvement is especially noticeable for small and slowly accumulating
deviations, which usually remain unrecognized when only raw sensor signals are used.</p>
      <p>To assess the robustness of the algorithms under real operating conditions, a series of experiments
was conducted with the noise level in the input signal gradually increased from 0% to 20%, and with
data gaps inserted within short and medium windows. This made it possible to test the models’ ability
to recognize anomalies under degraded telemetry quality,
which is typical for industrial and field IoT systems. The classical methods, in particular Isolation Forest and
One-Class SVM, showed degraded anomaly detection accuracy, with a sharp drop in Recall at noise levels above 10%. The
LSTM Autoencoder model was more stable and retained the ability to distinguish anomalous deviations;
however, at 20% noise its accuracy decreased, manifested as increased reconstruction
errors even for normal signal sections. The proposed Digital Twin-based hybrid approach shielded
the model from noise by using deviation vectors that amplify the difference between normal and
anomalous trajectories.</p>
      <p>Table 3 presents comparative F1-score indicators for each model as a function of the noise
level, demonstrating the advantage of the hybrid architecture, especially under high
signal noise. At 0–10% noise the hybrid algorithm loses almost no quality, and at 20% it remains
significantly more effective than the other methods. Thus, introducing a digital twin compensates
for some of the distortions in the input data, maintaining detection stability even in
complex scenarios.</p>
      <p>Additional experiments with data gaps showed that the classical models lose significant accuracy even
during short telemetry outages, while the LSTM Autoencoder partially smooths out signal defects thanks
to the context of historical values. However, the best result was again demonstrated by the hybrid
solution, for which the digital twin compensates for gaps by supplying predicted
values and computing the reconstruction error relative to the expected behavior of the system.
This confirms the importance of modeling the physical essence of the object and using the Digital Twin
as a context amplifier for the AI module.</p>
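        <p>A minimal sketch of this gap-compensation mechanism, assuming the twin's predictions are available at the missing timestamps (function and variable names are illustrative):</p>

```python
# Illustrative sketch of the gap-compensation mechanism described above:
# missing telemetry samples (NaN) are filled with the digital twin's predicted
# values, so the residual stream stays defined and inference can continue.
import numpy as np

def fill_gaps_with_twin(sensor: np.ndarray, twin_pred: np.ndarray) -> np.ndarray:
    """Replace NaN gaps in telemetry with the twin's expected behavior."""
    filled = sensor.copy()
    gaps = np.isnan(filled)
    filled[gaps] = twin_pred[gaps]
    return filled

sensor = np.array([1.0, np.nan, np.nan, 4.0])   # two missing samples
twin = np.array([1.0, 2.0, 3.0, 4.0])           # twin's expected trajectory
print(fill_gaps_with_twin(sensor, twin))  # [1. 2. 3. 4.]
```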
      <p>To confirm the effectiveness of the proposed system, several experiments were performed to
demonstrate the operation of the reconstruction module and the process of detecting deviations in time
series. On a time graph from the real test set, intervals where the system recorded anomalies were
highlighted. Normal telemetry is displayed as a smooth line, and segments with anomalous deviations
are marked in red. Figure 3 shows where the model captures point and collective deviations through
increasing reconstruction error (RE). It is the combination of the Digital Twin error and the
Autoencoder reconstruction that allowed us to detect slow drift changes that the classical methods missed
in the previous experiments.</p>
      <p>For a more detailed assessment of model behavior, a reconstruction loss graph was plotted
over a segment of the test stream (Figure 4). It shows the dynamics of the error between
the reconstructed signals and the actual sensor values. Within the normal zone, the reconstruction
error remains stably low, while at points with deviations it increases sharply and crosses the
threshold line T defined during validation. This enables automatic anomaly detection at
the level of individual windows without manual threshold setting. Hidden or
low-contrast anomalies appear as a gradual increase in the error, an important indicator
of early degradation in IoT systems.</p>
      <p>In the case of multi-channel telemetry, it is important not only to detect the anomaly itself, but also
to determine the sensor or component that caused the failure. For this purpose, a heatmap was created,
where the color intensity corresponds to the magnitude of the deviation for each channel (Figure 5).</p>
      <p>This form of data presentation shows that when a single sensor degrades (for
example, temperature or vibration), the deviation is amplified precisely in the corresponding rows of
the matrix. The heatmap quickly localizes the source of the anomaly, which is especially
valuable for industrial applications, where it is important to determine which sensor needs replacement
or additional monitoring. The analysis showed that for complex multi-dimensional scenarios
the hybrid model forms a sharper contrast between normal and deviant behavior, which simplifies
interpretation even without additional manual processing.</p>
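        <p>The data behind such a heatmap can be sketched as a channels-by-time matrix of absolute deviations; the helper names below are illustrative, not the authors' code:</p>

```python
# Illustrative sketch of the heatmap data: a channels-by-time matrix of
# absolute deviations from the twin's prediction, from which the channel with
# the largest cumulative deviation is taken as the likely failure source.
import numpy as np

def deviation_heatmap(actual: np.ndarray, twin_pred: np.ndarray) -> np.ndarray:
    """Absolute per-channel deviation matrix, shape (channels, time)."""
    return np.abs(actual - twin_pred)

def localize_source(heat: np.ndarray) -> int:
    """Row index (sensor channel) with the largest cumulative deviation."""
    return int(heat.sum(axis=1).argmax())

actual = np.array([[20.0, 20.1, 25.0, 25.2],    # temperature channel drifts
                   [0.10, 0.11, 0.10, 0.12]])   # vibration stays nominal
twin = np.array([[20.0, 20.1, 20.1, 20.2],
                 [0.10, 0.10, 0.10, 0.11]])
heat = deviation_heatmap(actual, twin)
print(localize_source(heat))  # 0: the temperature channel
```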
      <p>The results confirm the system’s high ability to detect non-obvious and context-dependent anomalies.
The combination of a predictive Digital Twin layer with Autoencoder reconstruction forms an adaptive
response mechanism capable of signaling deviations before critical errors appear on the object.</p>
    </sec>
    <sec id="sec-6">
      <title>6. Discussion</title>
      <p>The experimental evaluation confirms that the integration of Digital Twin modelling with AI-based
anomaly detection can significantly improve the reliability of monitoring in IoT systems. Classical
approaches such as Isolation Forest and One-Class SVM demonstrated acceptable performance for
detecting explicit point anomalies; however, their sensitivity decreased rapidly in the presence of
contextual and slow-degradation deviations. The LSTM Autoencoder, trained exclusively on normal
behaviour data, achieved considerably higher quality due to its ability to learn latent temporal structure
and reconstruct time-series windows with low error under nominal conditions. Nevertheless, its
performance degraded under increased noise levels and data gaps, emphasising the need for contextual
modelling.</p>
      <p>The proposed hybrid DT+AI architecture demonstrated the most stable performance across all test
conditions. A key observation is that the deviation vector between Digital Twin predictions and
real signals acted as an additional informative feature, amplifying abnormal behaviour that remained
weakly represented in raw data. This effect was especially visible in cases of gradual drift, where
the reconstruction error alone would exhibit delayed growth, while Digital Twin residuals produced
earlier deviation signals. The confusion matrices also demonstrated that the hybrid method reduced the
number of false-negative classifications, which is critical for safety-oriented IoT deployments where
missed anomalies may lead to equipment damage or system failure.</p>
      <p>Noise-resilience tests further confirmed the advantage of the hybrid model. While the performance
of classical algorithms decreased sharply under noise &gt;10%, the proposed method maintained high
F1-score values even under 20% noise, suggesting potential for field environments with unstable telemetry
quality. Experiments with synthetic data gaps showed a similar trend: the Digital Twin component
compensated for missing observations through predictive state-space modelling, enabling the AI engine
to continue inference with reduced performance loss. This indicates that Digital Twin integration
increases temporal robustness and reduces the dependence on complete raw input.</p>
      <p>Visual inspection of time-series anomaly plots, reconstruction loss curves, and multivariate heatmaps
demonstrates that the hybrid system provides interpretable anomaly indicators suitable for real-time
monitoring interfaces. Unlike black-box models, Digital Twin coupling provides semantic insights
into system states and helps identify the source of deviations at the sensor level. This interpretability
is critical for industrial adoption, where decision-making requires explanation rather than binary
classification alone.</p>
      <p>Overall, the obtained results show that combining Digital Twin physical-state simulation with
neural reconstruction mechanisms forms an effective baseline for context-aware anomaly detection
in IoT environments. The approach addresses limitations related to noise, missing data, and temporal
complexity, offering both improved performance and higher interpretability compared to traditional
machine learning models.</p>
    </sec>
    <sec id="sec-7">
      <title>7. Conclusions</title>
      <p>This work presents a hybrid anomaly detection framework that integrates Digital Twin simulation with
AI-driven telemetry analysis for IoT systems. The proposed approach leverages state-space modelling
to generate expected behaviour patterns and compute deviation vectors, which are further processed by
an LSTM Autoencoder to identify abnormal states through reconstruction error. Experimental results
demonstrate that the hybrid DT+AI architecture outperforms conventional anomaly detection methods,
achieving higher accuracy and robustness under noisy and incomplete data conditions. The system
reduced false-negative rates, improved recognition of gradual and contextual anomalies, and maintained
stability under high noise levels compared to classical baselines.</p>
      <p>The visual and metric-based evaluation confirms the applicability of the approach for real-time
monitoring scenarios, where early detection of system degradation is essential for failure prevention.
The proposed method is particularly relevant for industrial IoT deployments, predictive maintenance
systems, and cyber-physical infrastructure with constrained computational resources. The modular
architecture allows flexible integration with alternative preprocessing or AI models and supports future
extensions.</p>
      <p>Future work will focus on expanding the Digital Twin model with multi-physics simulation elements,
incorporating federated learning for distributed analytics, and evaluating deployment feasibility on
edge hardware such as microcontrollers and embedded gateways. Additional research may explore
reinforcement learning for adaptive thresholding and encrypted telemetry pipelines for secure anomaly
inference in mission-critical environments.</p>
    </sec>
    <sec id="sec-8">
      <title>Acknowledgements</title>
      <p>This research was funded by the Ministry of Education and Science of Ukraine under grant 0123U100270.</p>
    </sec>
    <sec id="sec-9">
      <title>Declaration on Generative AI</title>
      <p>The authors have not employed any Generative AI tools.</p>
      <p>[13] F. Tusa, S. Clayman, A. Buzachis, M. Fazio, Microservices and serverless functions—lifecycle,
performance, and resource utilisation of edge-based real-time iot analytics, Future Generation
Computer Systems 155 (2024) 204–218.
[14] I. Rozlomii, S. Naumenko, P. Mykhailovskyi, V. Monarkh, Resource-saving cryptography for
microcontrollers in biomedical devices, in: Proceedings of the IEEE 5th KhPI Week on Advanced
Technology, IEEE, 2024, pp. 1–5.
[15] N. Jeffrey, Q. Tan, J. R. Villar, A review of anomaly detection strategies to detect threats to
cyber-physical systems, Electronics 12 (2023) 3283.
[16] Y. V. Voievodin, I. O. Rozlomii, Advanced software framework for comparing balancing strategies
in container orchestration systems, in: Proceedings of an International Conference on Distributed
Systems, 2024, pp. 60–69.
[17] Y. Voievodin, I. Rozlomii, Application security optimization in container orchestration systems
through strategic scheduler decisions, in: Proceedings of the CPITS-2024: Cybersecurity Providing
in Information and Telecommunication Systems, volume 3654 of CEUR Workshop Proceedings,
2024, pp. 471–478.
[18] N. L. Rane, M. Paramesha, S. P. Choudhary, J. Rane, Machine learning and deep learning for big
data analytics: A review of methods and applications, Partners Universal International Innovation
Journal 2 (2024) 172–197.
[19] S. Oswal, S. Shinde, M. Vijayalakshmi, A survey of statistical, machine learning, and deep
learning-based anomaly detection techniques for time series, in: International Advanced Computing
Conference, Springer Nature Switzerland, Cham, Switzerland, 2022, pp. 221–234.
[20] S. A. Bkheet, J. I. Agbinya, G. S. M. Khamis, Advanced deep learning approach for smart home
appliance identification using recurrent neural networks with lstm, IoT 5 (2024) 835–851.
[21] M. Tayebi, S. El Kafhali, Performance analysis of recurrent neural networks for intrusion detection
systems in industrial internet of things, Franklin Open 12 (2025) 100310.
[22] H. Farhat, A. Altarawneh, Physics-informed machine learning for intelligent gas turbine digital
twins: A review, Energies 18 (2025) 5523.
[23] S. S. Reza, C. M. K. Uddin, M. F. Rabbi, Bridging virtual and physical worlds: AI in digital twin
development for mechanical systems, Scientia: Technology, Science and Society 2 (2025) 16–31.
[24] X. Bampoula, N. Nikolakis, K. Alexopoulos, Condition monitoring and predictive maintenance of
assets in manufacturing using lstm-autoencoders and transformer encoders, Sensors 24 (2024)
3215.
[25] H. Li, Y. Li, Anomaly detection methods based on gan: A survey, Applied Intelligence 53 (2023)
8209–8231.
[26] S. Baimukhanov, H. Ali, A. Yazici, Enhancing ml-based anomaly detection in data management for
security through integration of iot, cloud, and edge computing, Expert Systems with Applications
(2025) 128700.
[27] Q. Feng, Y. Zhang, B. Sun, X. Guo, D. Fan, Y. Ren, Z. Wang, Multi-level predictive maintenance
of smart manufacturing systems driven by digital twin: A matheuristics approach, Journal of
Manufacturing Systems 68 (2023) 443–454.
[28] Y. Zhang, Q. Feng, D. Fan, Y. Ren, Y. Song, M. Liu, Z. Wang, Predictive control for operation
and maintenance in smart manufacturing systems with multiple operating modes, Computers &amp;
Industrial Engineering 207 (2025) 111196.
[29] Y. Wang, E. Zhang, A. Yang, K. Du, J. Gao, Mixed reality-based multi-scenario visualization and
control in automated terminals: A middleware and digital twin driven approach, Buildings 15
(2025) 3879.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>S. H.</given-names>
            <surname>Abdulhussain</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B. M.</given-names>
            <surname>Mahmmod</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Alwhelat</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Shehada</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z. I.</given-names>
            <surname>Shihab</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. J.</given-names>
            <surname>Mohammed</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Hussain</surname>
          </string-name>
          ,
          <article-title>A comprehensive review of sensor technologies in iot: Technical aspects, challenges, and future directions</article-title>
          ,
          <source>Computers</source>
          <volume>14</volume>
          (
          <year>2025</year>
          )
          <fpage>342</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>G.</given-names>
            <surname>Vidyalakshmi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Gopikrishnan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Boulila</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Koubaa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Srivastava</surname>
          </string-name>
          ,
          <article-title>Digital twins and cyberphysical systems: A new frontier in computer modeling</article-title>
          ,
          <source>Computer Modeling in Engineering &amp; Sciences</source>
          <volume>143</volume>
          (
          <year>2025</year>
          )
          <fpage>51</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>S.</given-names>
            <surname>Adibi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Rajabifard</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Shojaei</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Wickramasinghe</surname>
          </string-name>
          ,
          <article-title>Enhancing healthcare through sensor-enabled digital twins in smart environments: A comprehensive analysis</article-title>
          ,
          <source>Sensors</source>
          <volume>24</volume>
          (
          <year>2024</year>
          )
          <fpage>2793</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>A.</given-names>
            <surname>Yarmilko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Rozlomii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Naumenko</surname>
          </string-name>
          ,
          <article-title>Dependability of embedded systems in the industrial internet of things: Information security and reliability of the communication cluster</article-title>
          ,
          <source>in: Proceedings of the International Scientific-Practical Conference “Information Technology for Education, Science and Technics”</source>
          , Springer Nature Switzerland, Cham, Switzerland,
          <year>2024</year>
          , pp.
          <fpage>235</fpage>
          -
          <lpage>249</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>V.</given-names>
            <surname>Larin</surname>
          </string-name>
          , et al.,
          <article-title>Prediction of the final discharge of the UAV battery based on fuzzy logic estimation of information and influencing parameters</article-title>
          ,
          <source>in: 2022 IEEE 3rd KhPI Week on Advanced Technology (KhPIWeek)</source>
          ,
          <year>2022</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>6</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>A. S.</given-names>
            <surname>AlSalehy</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Bailey</surname>
          </string-name>
          ,
          <article-title>Improving time series data quality: Identifying outliers and handling missing values in a multilocation gas and weather dataset</article-title>
          ,
          <source>Smart Cities</source>
          <volume>8</volume>
          (
          <year>2025</year>
          )
          <fpage>82</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>M.</given-names>
            <surname>Mayilsamy</surname>
          </string-name>
          ,
          <article-title>Intelligent anomaly detection in real-time big data engineering</article-title>
          ,
          <source>Journal of Engineering and Computer Sciences</source>
          <volume>4</volume>
          (
          <year>2025</year>
          )
          <fpage>577</fpage>
          -
          <lpage>589</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>S.</given-names>
            <surname>Zhyla</surname>
          </string-name>
          , et al.,
          <article-title>Practical imaging algorithms in ultra-wideband radar systems using active aperture synthesis and stochastic probing signals</article-title>
          ,
          <source>Radioelectronic and Computer Systems</source>
          <volume>1</volume>
          (
          <year>2023</year>
          )
          <fpage>55</fpage>
          -
          <lpage>76</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>Z.</given-names>
            <surname>Amiri</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Heidari</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N. J.</given-names>
            <surname>Navimipour</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Unal</surname>
          </string-name>
          ,
          <article-title>Resilient and dependability management in distributed environments: A systematic and comprehensive literature review</article-title>
          ,
          <source>Cluster Computing</source>
          <volume>26</volume>
          (
          <year>2023</year>
          )
          <fpage>1565</fpage>
          -
          <lpage>1600</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>I.</given-names>
            <surname>Rozlomii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Yarmilko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Naumenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Mykhailovskyi</surname>
          </string-name>
          ,
          <article-title>Hardware encryptors and cryptographic libraries for optimizing security in iot</article-title>
          ,
          <source>in: Proceedings of the 12th International Conference on Information Control Systems &amp; Technologies (ICST</source>
          <year>2024</year>
          ), Odesa, Ukraine,
          <year>2024</year>
          , pp.
          <fpage>99</fpage>
          -
          <lpage>109</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>M.</given-names>
            <surname>Abd Elaziz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Abualigah</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Attiya</surname>
          </string-name>
          ,
          <article-title>Advanced optimization technique for scheduling iot tasks in cloud-fog computing environments</article-title>
          ,
          <source>Future Generation Computer Systems</source>
          <volume>124</volume>
          (
          <year>2021</year>
          )
          <fpage>142</fpage>
          -
          <lpage>154</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>C.</given-names>
            <surname>Lin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Du</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Sun</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <article-title>Hierarchical context representation and self-adaptive thresholding for multivariate anomaly detection</article-title>
          ,
          <source>IEEE Transactions on Knowledge and Data Engineering</source>
          <volume>36</volume>
          (
          <year>2024</year>
          )
          <fpage>3139</fpage>
          -
          <lpage>3150</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>