<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Federated Learning for Distributed Weather Forecasting: A Practical Approach on Real Multidimensional Georeferenced Data</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Attilio Di Vicino</string-name>
          <email>attilio.divicino001@studenti.uniparthenope.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Giuseppe Fiorillo</string-name>
          <email>giuseppe.fiorillo001@studenti.uniparthenope.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Luigi Galluccio</string-name>
          <email>luigi.galluccio001@studenti.uniparthenope.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Raffaele Montella</string-name>
          <email>rafaele.montella@uniparthenope.it</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Department of Science and Technology, University of Naples “Parthenope”</institution>
          ,
          <addr-line>Naples</addr-line>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2025</year>
      </pub-date>
      <abstract>
        <p>The rapid advancement of technology, particularly in machine learning, together with the growing availability of data, has led to an increasing demand for reducing communication overhead and operational costs. Edge computing is one of the fastest-growing fields addressing this demand, enabling cost reduction and improved communication. Federated learning (FL) addresses not only privacy concerns but also issues related to cost and communication efficiency. This paper presents an application of a federated learning framework to improve weather forecast accuracy through the collaborative analysis of distributed data, while ensuring data confidentiality and computational efficiency. To replicate the federated environment, we developed an architecture that combines real-time data collection using the Signal K server, containerization using Docker, and a Hadoop cluster on Microsoft Azure. We evaluated the performance of a Transformer and a Crossformer, demonstrating the effectiveness of both models in this context, with the Crossformer showing superior performance in managing spatiotemporal dependencies for forecasting. Our experimental results indicate a substantial error reduction with respect to previous methods, achieving a Mean Absolute Error (MAE) of 0.144 for the Crossformer and 0.232 for the Transformer, highlighting the potential of FL and advanced deep learning architectures for managing sensitive data in distributed scenarios, in line with previous research trends. This study proposes a robust and scalable approach, opening new perspectives for future applications in cooperative and secure learning.</p>
      </abstract>
      <kwd-group>
        <kwd>Federated learning</kwd>
        <kwd>edge computing</kwd>
        <kwd>weather forecasting</kwd>
        <kwd>time series</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        The exponential growth in the cost of training neural networks has become an increasingly pressing
issue, in terms of both financial expense and computational resources, as highlighted in studies [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. The
trend shows a dramatic increase not only in hardware requirements but also in energy consumption
and training time. This underlines the necessity for innovative methods to address such exponential
growth.
      </p>
      <p>
        One of the most promising approaches to address these challenges is edge computing, which aims
to offload computation from centralized data centers to distributed local nodes. However, with the
rising amount of data being generated, especially in sensitive domains, the need for secure,
privacy-preserving data management becomes equally critical. This is where federated learning (FL) is
relevant [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
      </p>
      <p>
        FL is a recently developed distributed deep learning paradigm in which clients independently train
their local neural network models with private data and then jointly aggregate a global model on the
central server [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], offering a promising approach for training machine learning models on local devices
while avoiding raw data sharing and maintaining control over sensitive information.
      </p>
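The aggregation step described above, local training on each client followed by a server-side combination of the weights, can be sketched as a weighted average in the style of FedAvg. This is a minimal illustration with names of our own choosing, not the paper's implementation.

```python
# Minimal sketch of server-side weight aggregation in federated learning:
# each client's weight vector contributes in proportion to its local dataset size.

def aggregate_weights(client_weights, client_sizes):
    """Weighted average of per-client weight vectors (lists of floats)."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    global_weights = [0.0] * n_params
    for weights, size in zip(client_weights, client_sizes):
        for i, w in enumerate(weights):
            global_weights[i] += (size / total) * w
    return global_weights

# Two clients with unequal data volumes: the larger client dominates the average.
clients = [[1.0, 2.0], [3.0, 4.0]]
sizes = [100, 300]
print(aggregate_weights(clients, sizes))  # [2.5, 3.5]
```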
      <p>
        The aim of this project is to investigate the practical use of federated learning in collecting and
analyzing multidimensional georeferenced data from dispersed, remote stations. Cooperative models
help improve weather forecasting accuracy by lowering the computational cost of training the
model and maintaining data privacy, while also ensuring that, as observed in [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ], peripheral nodes’
limited computing power and confidentiality restrictions are respected. However, several weather
phenomena are inherently hard to predict, which is why accurate forecasting remains a challenging
task [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ].
      </p>
      <p>To address this challenge, an architecture that leverages modern technologies such as Signal K server
for real-time data collection, Docker for containerization of services, and a Hadoop cluster on Microsoft
Azure for federated paradigm simulation has been designed. The main contribution lies in the definition
of a complete pipeline, ranging from data collection and pre-processing to the collaborative training
phase with advanced federated learning techniques. In particular, the FedProx algorithm was used to
aggregate the weights of the different local models.</p>
      <p>
        By integrating Transformer and Crossformer models for space-time analysis [
        <xref ref-type="bibr" rid="ref6 ref7">6, 7</xref>
        ], it is shown that
FL techniques can be effectively applied in real-world scenarios, overcoming the limitations of data
centralization and paving the way for new, secure, and cooperative learning frameworks. The best
performance was achieved with the Crossformer model, which obtained a Mean Square Error (MSE) of
0.232 and a Mean Absolute Error (MAE) of 0.144, significantly improving over the previous approach of
De Vita et al. [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], where the MSE was 1.324 and the MAE was 0.827.
      </p>
      <p>The rest of the paper is organized as follows: in Section 2 we discuss the literature; Section 3 illustrates
the main architecture, including data collection and data pre-processing; Section 4 discusses the practical
implementation of the models; Section 5 details the experimental results of our solution, comparing
them to previous works; Section 6 provides a critical interpretation of the results and highlights their
implications; finally, Section 7 concludes the paper and outlines possible directions for future
research.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Related Work</title>
      <p>
        FL was introduced around 2016 by Google as an innovative approach to training machine learning models.
Traditionally, developing high-quality models required access to large amounts of data. This frequently
requires working directly with raw and sensitive data, raising privacy and security concerns [
        <xref ref-type="bibr" rid="ref10 ref9">9, 10</xref>
        ].
      </p>
      <p>
        FL addresses these challenges by allowing model training across multiple decentralized devices,
reducing the need to share or centralize raw data [
        <xref ref-type="bibr" rid="ref11 ref12">11, 12</xref>
        ]. The core concept is to perform local training
on each node, and then combine the weights into a single global model through a coordinating node [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ].
Only encrypted model weights are shared, not the raw data itself, which ensures privacy throughout
the process [
        <xref ref-type="bibr" rid="ref14 ref15">14, 15</xref>
        ].
      </p>
      <p>Several recent studies address data privacy in FL applied to time-series forecasting with georeferenced
sensors. For instance, privacy-preserving collaborative forecasting techniques have been analyzed in
energy and time-series domains, demonstrating how data transformation, secure multi-party
computation, and differential privacy can be used to balance forecast accuracy and confidentiality. From a
general perspective, data privacy becomes especially important in edge computing scenarios [16].</p>
      <p>Reviews focused on IoT sensor aggregation highlight the challenges in resource-constrained settings,
such as weather stations. Federated learning approaches combined with secure aggregation have proven
effective in residential load forecasting scenarios. These studies support the feasibility of extending
federated approaches with differential privacy or secure aggregation approximations in distributed
weather forecasting systems [17].</p>
      <p>
        Based on the previous considerations, privacy is in fact a critical issue for both individual users
and large enterprises, which is why FL has gained significant attention and experienced a boom in
recent years. However, many FL systems still rely on a centralized approach, which can lead to several
problems, such as data falsification and lack of transparency [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. That is why there are several studies
on FL, ranging from IoT-based systems to concerns about data privacy and security [15].
      </p>
      <p>While federated learning reduces privacy risks by keeping raw data at the edge, distributed weather
forecasting introduces several domain-specific security vulnerabilities. A recent study demonstrates that
corrupting a small subset of input observations, even in non-sensitive sensor networks, can significantly
skew forecast accuracy. This exemplifies the real risk of poisoning or manipulation at the data collection
layer [18].</p>
      <p>
        The study in [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ] summarizes how FL and blockchain can be combined to overcome these obstacles.
Furthermore, Chen et al. [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] presented prompt federated learning (PFL) to address the challenges
of cooperative weather forecasting across different meteorological datasets. This approach exploits
a spatiotemporal Transformer-based foundation model, with a novel prompt learning mechanism
designed to meet the communication and computational constraints of low-resource sensors. Chen
et al. [19] also introduced Federated Prompt Learning for Weather Foundation Models on Devices
(FedPoD) to address challenges such as data heterogeneity and communication overhead in on-device
weather forecasting, employing adaptive prompt tuning to obtain highly personalized model training
while maintaining communication efficiency [19, 20].
      </p>
      <p>In the realm of renewable energy, Li et al. [21] proposed an approach called federated deep
reinforcement learning (FedDRL) for ultra-short-term wind power forecasting. FedDRL integrates the deep
deterministic policy gradient (DDPG) algorithm within a federated learning framework not only to
improve prediction accuracy but also to enable decentralized model training without compromising data
privacy. Climate change impacts, and the dependence on non-renewable energy sources for electricity
generation, led to the proposal of numerous IoT-based smart weather station systems [22].</p>
      <p>
        De Vita et al. [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] proposed an effective method to predict spatiotemporal weather data using advanced
deep learning models. In particular, the method is based on Transformer and Crossformer architectures
(the latter being more suitable for spatiotemporal series). Data were collected from weather stations at
the University of Naples “Parthenope”, specifically two stations, and temperature, humidity, pressure,
wind speed, and wind direction were used as input characteristics. The target of the prediction was
the temperature. This work highlights not only the effectiveness of FL but also the potential of deep
learning models such as Transformers and Crossformers in weather prediction. It was also demonstrated
that, in this specific case, the Crossformer outperformed the classical Transformer when dealing with
spatiotemporal data [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. Currently, there is a large body of research on weather prediction, with individual studies focusing
on different models such as artificial neural networks and deep neural networks.
      </p>
    </sec>
    <sec id="sec-3">
      <title>3. Materials and Methods</title>
      <p>To address the challenges of distributed weather data collection and analysis, we have developed an
architecture and methodology that leverages state-of-the-art technologies and distributed machine
learning paradigms. The objective is twofold: to ensure the accuracy of forecasts while preserving
data privacy and optimizing the use of computational resources. This section describes in detail the
key components of our infrastructure, the data acquisition and pre-processing phases, and the adopted
federated learning model.</p>
      <p>The collection of real-time meteorological data from each station is managed via the Signal K server, an
open-source platform built around a data protocol originally designed to aggregate data from marine environments [23, 24].</p>
      <p>Federated learning is coordinated by Flower, a framework that manages the aggregation of model
weights and provides several algorithms, including Federated Averaging (FedAvg) and Federated
Proximal (FedProx). Each of these has specific characteristics; we chose FedProx because it adds a
proximal term to each client’s local objective, stabilizing training when client data are heterogeneous,
as is the case for geographically dispersed stations [25, 26]. After local training on each station, the
encrypted weights of the models are shared with the coordinating node, never the raw data, thus
ensuring privacy. The coordinating node aggregates these weights to create a single global model.
Subsequently, updated versions of the global model are sent back to the stations for the next training
round, iteratively improving the accuracy of the overall model.</p>
      <sec id="sec-3-1">
        <title>3.1. Data Collection</title>
        <p>To collect data from distributed weather stations, we used Signal K, an open-source software protocol
originally developed for aggregating sensor data from maritime environments. Signal K was used in
this context to gather real-time weather data from sensor-equipped weather stations provided by the
University of Naples “Parthenope”. A plugin has been set up in the Signal K system to allow distributed
weather stations and a central broker to communicate using the MQTT protocol, allowing time series
data to be published and transmitted efficiently over the network. A second plugin was employed
to save the data locally for model training.</p>
        <p>This architecture, illustrated in Figure 1, allowed the central server to receive live environmental
measurements from multiple remote nodes with no raw hardware access. The Signal K services were
containerized using Docker, offering a reproducible and scalable deployment environment for data
collection across the cluster.</p>
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Data Pre-processing</title>
        <p>Model accuracy is highly sensitive to data quality, and poor-quality input can lead to imprecise
forecasts. To address this, we applied several pre-processing operations [27].</p>
        <p>We created a pre-processing pipeline that includes the following steps in order to standardize
and prepare the data for our purposes:
1. Wind Direction Encoding: Because wind direction is cyclic, it was encoded into its sine and
cosine components, preserving angular relationships and making it suitable for our machine
learning task.
2. Data Smoothing: To reduce noise, each numeric feature was smoothed using a moving average
with a window size of 3.
3. Data Normalization: To increase model stability and ease the training process, all numerical
features, aside from time and the direction encoding, were normalized to have zero mean
and unit variance.</p>
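The three pre-processing steps listed above can be sketched as follows; the function names are ours, and this is an illustration rather than the project's actual pipeline.

```python
import math

# Sketches of the three pre-processing steps: cyclic encoding of wind direction,
# trailing moving-average smoothing (window 3), and z-score normalization.

def encode_wind_direction(deg):
    """Encode a wind direction in degrees as (sin, cos) to preserve cyclicity."""
    rad = math.radians(deg)
    return math.sin(rad), math.cos(rad)

def moving_average(xs, window=3):
    """Smooth a series with a trailing moving average of the given window."""
    out = []
    for i in range(len(xs)):
        lo = max(0, i - window + 1)
        out.append(sum(xs[lo:i + 1]) / (i + 1 - lo))
    return out

def zscore(xs):
    """Normalize to zero mean and unit variance (population std)."""
    mean = sum(xs) / len(xs)
    std = math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs))
    return [(x - mean) / std for x in xs]

print(encode_wind_direction(90.0))   # ≈ (1.0, 0.0)
print(moving_average([1, 2, 3, 4]))  # [1.0, 1.5, 2.0, 3.0]
print(zscore([1.0, 2.0, 3.0]))       # ≈ [-1.22, 0.0, 1.22]
```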
        <p>In total, there were 20,199 data points, collected as follows: 4,196 from Castelvolturno; 6,829 from Città
della Scienza; 4,919 from Marina di Stabia; 4,255 from Via Acton. The sampling interval was Δt ≈ 1
hour [28].</p>
      </sec>
      <sec id="sec-3-3">
        <title>3.3. Technology Justification and Comparison</title>
        <p>To motivate the architectural decisions taken during the implementation of our federated learning
testbed, we provide a comparative overview of the adopted technologies and their alternatives. Table 1
summarizes the main components and the rationale behind each selection.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Model Architecture</title>
      <p>In order to process and train the models on distributed data, we used a Hadoop 3.3.4 cluster deployed on
Microsoft Azure, as illustrated in Figure 1. The head nodes were configured using the E2 v3 instance
type (2 cores, 16 GB of RAM), with two head nodes deployed for redundancy. For fault tolerance and
coordination, we also used three ZooKeeper nodes based on the A2 v2 instance type (2 cores, 4 GB RAM).
This central infrastructure provides the coordination and management capabilities necessary for our
distributed environment.</p>
      <p>The worker nodes were configured using the E2 v3 instance type (2 cores, 16 GB of RAM), with two
worker nodes deployed in the cluster. This infrastructure offers a trade-off between computational
efficiency and scalability for our distributed environment. It has been employed to simulate a distributed
federated learning environment, where each worker node acted as two different clients participating in
model training, mimicking automated weather stations collecting and processing meteorological data
in a federated manner.</p>
      <p>For weather forecasting, we integrated the Transformer and Crossformer models, advanced deep
learning architectures suitable for time series analysis with complex dependencies. The Transformer
was used for its ability to capture long-range relationships in data sequences [29]. The Crossformer,
specifically designed for multivariate and multimodal time series, has been used to handle spatiotemporal
interactions between different stations [30]; both are based on the Transformer architecture [31].</p>
      <p>In addition, both the Transformer and Crossformer architectures had a model dimension of 64,
four attention heads, two layers, and a dropout rate of 0.1. The input dimension was 4, including
the features station_id, temperature, humidity, and wind_speed, as well as the encoded wind
components wind_sin and wind_cos. The output dimension was 1, focusing on temperature. The
model operated with an input_window of 20 hours and an output_window of 1 hour. Federated
training was conducted over 20 rounds, with 25 local training periods per round.</p>
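The 20-hour input window and 1-hour output window described above can be realized by sliding over the hourly series; a minimal sketch of this window construction follows (our own helper, not the paper's code).

```python
# Sketch of sliding-window construction for forecasting: each sample pairs
# input_window consecutive readings with the following output_window readings.

def make_windows(series, input_window=20, output_window=1):
    """Split a time series into (input, target) pairs for forecasting."""
    pairs = []
    for i in range(len(series) - input_window - output_window + 1):
        x = series[i:i + input_window]
        y = series[i + input_window:i + input_window + output_window]
        pairs.append((x, y))
    return pairs

hourly_temps = list(range(24))      # 24 hourly readings as a toy series
pairs = make_windows(hourly_temps)
print(len(pairs))                    # 4 samples: 24 - 20 - 1 + 1
print(pairs[0][0][:3], pairs[0][1])  # [0, 1, 2] [20]
```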
    </sec>
    <sec id="sec-5">
      <title>5. Experimental Results</title>
      <sec id="sec-5-1">
        <title>5.1. Models Evaluation</title>
        <p>Model evaluation was conducted to quantify the effectiveness of the Transformer and Crossformer
architectures in the context of distributed weather forecasting, leveraging the FL approach. The primary
objective was to determine the performance of each model in terms of predictive accuracy, taking into
account the specificities of the meteorological data collected and the distributed configuration of the
training.</p>
        <p>The experimental environment has been configured to simulate the real operating conditions of
distributed weather stations.</p>
        <p>Model performance was evaluated using Mean Square Error (MSE) and Mean Absolute Error (MAE),
two widely used metrics for assessing time series predictions [32].</p>
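For reference, the two metrics can be stated explicitly: MSE penalizes large errors quadratically, while MAE reports the average absolute deviation. A minimal implementation of the standard definitions:

```python
# Standard error metrics for forecast evaluation.

def mse(y_true, y_pred):
    """Mean Square Error: mean of squared residuals."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def mae(y_true, y_pred):
    """Mean Absolute Error: mean of absolute residuals."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

y_true = [20.0, 21.0, 19.5]
y_pred = [20.5, 20.0, 19.5]
print(mse(y_true, y_pred))  # (0.25 + 1.0 + 0.0) / 3 ≈ 0.4167
print(mae(y_true, y_pred))  # (0.5 + 1.0 + 0.0) / 3 = 0.5
```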
      </sec>
      <sec id="sec-5-2">
        <title>5.2. Federated Learning Results</title>
        <p>
          Table 2 compares the Transformer and Crossformer models using the MSE and MAE metrics. The evaluated
model is the aggregated one obtained at round 20 (the last round) using the FedProx algorithm. Table 3
follows the same approach; however, it compares our results with the work of De Vita
et al. [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ].
        </p>
        <p>Table 4 shows the performance of the aggregated model (here it refers to the results obtained by the
best model between the two, in this case the Crossformer) over the course of the 20 rounds of federated
training, showing a progressive improvement in error metrics as the overall model becomes more
refined. These preliminary results support the effectiveness of federated learning and the architectures
chosen for the distributed weather forecasting task.</p>
        <p>
          Figure 2 further compares our approach with that of De Vita et al. [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ] for the Crossformer model
in the context of federated learning, analyzing MSE and MAE metrics over 20 rounds of training.
The data clearly show the superiority of our method. Although the results of De Vita et al. [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ] show
some variability, with a minimum MSE of 1.311 and an MAE of 0.817 in round 10, our approach has
significantly lower error values from the first round (MSE 0.519, MAE 0.510). At the end of the 20 rounds,
our model achieves its best performance with an MSE of 0.232 and an MAE of 0.144, demonstrating
not only a more effective convergence but also overall greater stability and accuracy compared to the
above work.
        </p>
      </sec>
    </sec>
    <sec id="sec-6">
      <title>6. Discussion</title>
      <p>As can be seen in Table 2, the Crossformer has demonstrated superior performance compared to the
Transformer in terms of MSE and MAE, suggesting a greater ability
to manage complex space-dependent weather patterns in meteorological data. This outcome aligns
with the Crossformer design, which is tailored for the cross-analysis of multivariate time series [30].</p>
      <p>
        The results obtained, summarized in Table 3, demonstrate an improvement in the performance of
the Transformer and Crossformer models compared to the values reported by De Vita et al. [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ]. In
particular, our approach achieved an MSE of 0.243 and an MAE of 0.313 for the Transformer, and
an MSE of 0.232 and an MAE of 0.144 for the Crossformer. These values are lower than those of De
Vita et al. [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] (MSE of 2.851 and MAE of 1.411 for the Transformer; MSE of 1.324 and MAE of 0.827 for
the Crossformer), indicating a greater predictive accuracy of our models, with the Crossformer
proving to be the best in both studies. However, we have to consider that their work used a different
dataset as well as different techniques for data processing.
      </p>
      <p>
        As shown in Table 4 and Figure 2, the federated learning setup demonstrates a consistent
improvement in prediction accuracy over the course of the 20 training rounds. Both MAE and MSE decrease
significantly from round 1 to round 20, confirming the efectiveness of the federated learning strategy.
In particular, the MAE drops from 0.510 to 0.144, while the MSE decreases from 0.519 to 0.232, indicating
a more precise model with fewer large errors as training progresses. This improvement is especially
evident after round 10, where the error metrics begin to show a sharper decline. The stability and
convergence observed in the metrics suggest that the federated approach not only preserves data privacy
and reduces communication and computational costs but also maintains high predictive performance.
Compared to the baseline values reported in [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ], our approach shows better generalization and optimization
over time, reinforcing the value of structured federated training even with relatively few rounds and
clients.
      </p>
      <p>While the current testbed consists of a limited number of physical weather stations and a reduced
amount of data, this design choice was driven by practical constraints related to data availability and
the need to ensure controlled, verifiable experimentation. Given the fragmented nature of data access,
we integrated multiple technologies for data collection, selecting solutions that are inherently scalable.
In particular, the use of containerized services (Docker), publish-subscribe communication protocols
(MQTT), and federated orchestration frameworks (such as Flower) provides a solid foundation for
future extensions. Although our experiments are currently limited in scale, the presence of a historical
database means that additional data can be incorporated into the architecture to validate performance
and scalability at a larger scale. Ultimately, this represents a strategic starting point to explore the
real-world applicability of our federated framework in meteorological scenarios.</p>
    </sec>
    <sec id="sec-7">
      <title>7. Conclusions and Future Work</title>
      <p>In this work, we have introduced and developed an innovative application of FL in the context of
the collection and analysis of distributed meteorological data. The goal is to improve the accuracy
of weather forecasts through collaborative models and to ensure data confidentiality and computational
efficiency at peripheral nodes.</p>
      <p>We designed and implemented an architecture that leverages modern technologies such as Signal K
for real-time data collection, Docker for service containerization, and a Hadoop cluster on Microsoft
Azure for federated paradigm simulation. A key contribution of this work is the definition of a complete
pipeline that covers data collection and preprocessing up to the collaborative training phase, using
advanced FL techniques.</p>
      <p>The integration of Transformer and Crossformer models for space-time analysis has demonstrated
the effectiveness of FL techniques in real scenarios, overcoming the limitations of data centralization
and paving the way to new secure and cooperative learning scenarios. In particular, it was highlighted
that the Crossformer has outperformed the Transformer in space-time data management for weather
forecasting. The experimental results show that the Transformer model had an MSE of 0.265 and an
MAE of 0.194, while the Crossformer model had an MSE of 0.232 and an MAE of 0.144.</p>
      <p>
        However, an important consideration is about data security, integrity, and reliability. Attacks could
potentially originate not only from individual nodes but also from the central coordinating node [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ].
      </p>
      <p>Data were collected from distributed meteorological stations of the University of Naples “Parthenope”,
with a total of 20,199 data points from different locations. To ensure data quality and improve the stability
of the model, a pre-processing phase has been implemented which includes wind direction coding,
smoothing of data, and normalization of numerical features.</p>
      <p>In summary, this study not only confirms the effectiveness of federated learning and deep learning
models such as the Transformer and Crossformer in weather forecasting but also proposes a robust and
scalable approach for managing sensitive data in distributed environments, opening new perspectives
for future applications in cooperative and secure learning environments. Edge computing has proven
to be an instrumental tool in optimizing resource utilization, leading to significant economic and
communication eficiencies.</p>
      <p>
        It could be interesting to explore possible directions for future work. An interesting path could be
starting from the work of De Vita et al. [
        <xref ref-type="bibr" rid="ref8">8</xref>
        ] and applying federated transfer learning. This would involve
leveraging a pre-trained model while integrating our additional features in the original feature space,
as well as across different nodes. This could lead to a more accurate and robust model.
      </p>
      <p>
        This research has been partially funded by the “High-performance computing Weather nowcasting
with Federated Artificial Intelligence (Hi-WeFAI)” project, funded by the National Center ICSC –
“National Center for HPC, Big Data and Quantum Computing” – Cascade Call Spoke 9 – “Digital Society
&amp; Smart City” – CUP E63C22000980007 ID CODE CN_00000013. The computational infrastructure used
to train the predictive model has been provided by the Research Computing Facilities of the University
of Naples “Parthenope”.
      </p>
    </sec>
    <sec id="sec-8">
      <title>Declaration on Generative AI</title>
      <p>The author(s) have not employed any Generative AI tools.</p>
      <p>[15] S. Shen, T. Zhu, D. Wu, W. Wang, W. Zhou, From distributed machine learning to federated
learning: In the view of data privacy and security, Concurrency and Computation: Practice and
Experience 34 (2022) e6002.
[16] C. Gonçalves, R. J. Bessa, P. Pinson, A critical overview of privacy-preserving approaches for
collaborative forecasting, International journal of Forecasting 37 (2021) 322–342.
[17] J. D. Fernández, S. P. Menci, C. M. Lee, A. Rieger, G. Fridgen, Privacy-preserving federated learning
for residential short-term load forecasting, Applied energy 326 (2022) 119915.
[18] E. Imgrund, T. Eisenhofer, K. Rieck, Adversarial observations in weather forecasting, arXiv
preprint arXiv:2504.15942 (2025).
[19] S. Chen, G. Long, T. Shen, J. Jiang, C. Zhang, Federated prompt learning for weather foundation
models on devices, arXiv preprint arXiv:2305.14244 (2023).
[20] P. Hamedi, R. Razavi-Far, E. Hallaji, Federated continual learning: Concepts, challenges, and
solutions, arXiv preprint arXiv:2502.07059 (2025).
[21] Y. Li, R. Wang, Y. Li, M. Zhang, C. Long, Wind power forecasting considering data privacy
protection: A federated deep reinforcement learning approach, Applied Energy 329 (2023) 120291.
[22] S. Ganesan, C. P. Lean, L. Chen, K. F. Yuan, N. P. Kiat, M. R. B. Khan, Iot-enabled smart weather
stations: innovations, challenges, and future directions, Malaysian Journal of Science and Advanced
Technology (2024) 180–190.
[23] D. Di Luccio, A. Riccio, A. Galletti, G. Laccetti, M. Lapegna, L. Marcellino, S. Kosta, R. Montella,
Coastal marine data crowdsourcing using the internet of floating things: Improving the results of
a water quality model, IEEE Access 8 (2020) 101209–101223.
[24] R. Montella, D. Di Luccio, L. Marcellino, A. Galletti, S. Kosta, G. Giunta, I. Foster,
Workflowbased automatic processing for internet of floating things crowdsourced data, Future generation
computer systems 94 (2019) 103–119.
[25] T. Li, A. K. Sahu, M. Zaheer, M. Sanjabi, A. Talwalkar, V. Smith, Federated optimization in
heterogeneous networks, Proceedings of Machine learning and systems 2 (2020) 429–450.
[26] S. Dasari, R. Kaluri, 2p3fl: A novel approach for privacy preserving in financial sectors using
lfower federated learning, CMES-Computer Modeling in Engineering and Sciences 140 (2024)
2035–2051.
[27] L. Zheng, W. Hu, Y. Min, Raw wind data preprocessing: a data-mining approach, IEEE Transactions
on Sustainable Energy 6 (2014) 11–19.
[28] S. Wang, J. Cao, S. Y. Philip, Deep learning for spatio-temporal data mining: A survey, IEEE
transactions on knowledge and data engineering 34 (2020) 3681–3700.
[29] H. Zhou, S. Zhang, J. Peng, S. Zhang, J. Li, H. Xiong, W. Zhang, Informer: Beyond eficient
transformer for long sequence time-series forecasting, in: Proceedings of the AAAI conference on
artificial intelligence, volume 35, 2021, pp. 11106–11115.
[30] Y. Zhang, J. Yan, Crossformer: Transformer utilizing cross-dimension dependency for multivariate
time series forecasting, in: The eleventh international conference on learning representations,
2023.
[31] A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, Ł. Kaiser, I. Polosukhin,</p>
      <p>Attention is all you need, Advances in neural information processing systems 30 (2017).
[32] A. Botchkarev, Performance metrics (error measures) in machine learning regression, forecasting
and prognostics: Properties and typology, arXiv preprint arXiv:1809.03006 (2018).</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>E.</given-names>
            <surname>Strubell</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Ganesh</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>McCallum</surname>
          </string-name>
          ,
          <article-title>Energy and policy considerations for modern deep learning research</article-title>
          ,
          <source>in: Proceedings of the AAAI conference on artificial intelligence</source>
          , volume
          <volume>34</volume>
          ,
          <year>2020</year>
          , pp.
          <fpage>13693</fpage>
          -
          <lpage>13696</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>H.</given-names>
            <surname>Kaur</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Rani</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Kumar</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Sachdeva</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Mittal</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Kumar</surname>
          </string-name>
          ,
          <article-title>Federated learning: a comprehensive review of recent advances and applications</article-title>
          ,
          <source>Multimedia Tools and Applications</source>
          <volume>83</volume>
          (
          <year>2024</year>
          )
          <fpage>54165</fpage>
          -
          <lpage>54188</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>R.</given-names>
            <surname>Yu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <article-title>Toward resource-efficient federated learning in mobile edge computing</article-title>
          ,
          <source>IEEE Network</source>
          <volume>35</volume>
          (
          <year>2021</year>
          )
          <fpage>148</fpage>
          -
          <lpage>155</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>F. R.</given-names>
            <surname>Mughal</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>He</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Das</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F. A.</given-names>
            <surname>Dharejo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Zhu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. B.</given-names>
            <surname>Khan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Alzahrani</surname>
          </string-name>
          ,
          <article-title>Adaptive federated learning for resource-constrained IoT devices through edge intelligence and multi-edge clustering</article-title>
          ,
          <source>Scientific Reports</source>
          <volume>14</volume>
          (
          <year>2024</year>
          )
          <fpage>28746</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>G.</given-names>
            <surname>Giunta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Montella</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Mariani</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Riccio</surname>
          </string-name>
          ,
          <article-title>Modeling and computational issues for air/water quality problems: A grid computing approach</article-title>
          ,
          <source>NUOVO CIMENTO-SOCIETA ITALIANA DI FISICA SEZIONE C</source>
          <volume>28</volume>
          (
          <year>2005</year>
          )
          <fpage>215</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>S.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Long</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Shen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Jiang</surname>
          </string-name>
          ,
          <article-title>Prompt federated learning for weather forecasting: Toward foundation models on meteorological data</article-title>
          ,
          <source>arXiv preprint arXiv:2301.09152</source>
          (
          <year>2023</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>A.</given-names>
            <surname>Zeng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Q.</given-names>
            <surname>Xu</surname>
          </string-name>
          ,
          <article-title>Are transformers effective for time series forecasting?</article-title>
          ,
          <source>in: Proceedings of the AAAI conference on artificial intelligence</source>
          , volume
          <volume>37</volume>
          ,
          <year>2023</year>
          , pp.
          <fpage>11121</fpage>
          -
          <lpage>11128</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>C. G.</given-names>
            <surname>De Vita</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Mellone</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Casolaro</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. G.</given-names>
            <surname>Orsini</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. L.</given-names>
            <surname>Gonzalez-Compean</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Ciaramella</surname>
          </string-name>
          ,
          <article-title>Federated learning and crowdsourced weather data: Practice and experience</article-title>
          ,
          <source>in: 2024 IEEE 20th International Conference on e-Science (e-Science)</source>
          , IEEE,
          <year>2024</year>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>9</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>E.</given-names>
            <surname>Dritsas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Trigka</surname>
          </string-name>
          ,
          <article-title>Federated learning for iot: A survey of techniques, challenges, and applications</article-title>
          ,
          <source>Journal of Sensor and Actuator Networks</source>
          <volume>14</volume>
          (
          <year>2025</year>
          )
          <fpage>9</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>M.</given-names>
            <surname>Polato</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Hammer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.-M.</given-names>
            <surname>Schleif</surname>
          </string-name>
          , et al.,
          <article-title>Machine learning in distributed, federated and nonstationary environments-recent trends</article-title>
          ,
          <source>ESANN 2024 Proceedings</source>
          (
          <year>2024</year>
          )
          <fpage>47</fpage>
          -
          <lpage>56</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>P. M.</given-names>
            <surname>Mammen</surname>
          </string-name>
          ,
          <article-title>Federated learning: Opportunities and challenges</article-title>
          ,
          <source>arXiv preprint arXiv:2101.05428</source>
          (
          <year>2021</year>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>S.</given-names>
            <surname>Ferretti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Cassano</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Cialone</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>D'Abramo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Imboccioli</surname>
          </string-name>
          ,
          <article-title>Decentralized coordination for resilient federated learning: A blockchain-based approach with smart contracts and decentralized storage</article-title>
          ,
          <source>Computer Communications</source>
          <volume>236</volume>
          (
          <year>2025</year>
          )
          <fpage>108112</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>H.</given-names>
            <surname>Zhu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Togo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Ogawa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Haseyama</surname>
          </string-name>
          ,
          <article-title>Prompt-based personalized federated learning for medical visual question answering</article-title>
          ,
          <source>in: ICASSP 2024-2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)</source>
          , IEEE,
          <year>2024</year>
          , pp.
          <fpage>1821</fpage>
          -
          <lpage>1825</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>B.</given-names>
            <surname>McMahan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Ramage</surname>
          </string-name>
          ,
          <article-title>Federated learning: Collaborative machine learning without centralized training data</article-title>
          ,
          <source>Google Research Blog</source>
          <volume>3</volume>
          (
          <year>2017</year>
          ).
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>