<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>B. S. Guendouzi, S. Ouchani, H. El Assaad, M. El Zaher, A systematic review of federated learning: Challenges, aggregation methods, and development tools, Journal of Network and Computer Applications</journal-title>
      </journal-title-group>
      <issn pub-type="ppub">2640-3498</issn>
    </journal-meta>
    <article-meta>
      <article-id pub-id-type="doi">10.1016/j.jnca.2023.103714</article-id>
      <title-group>
        <article-title>Quantum Federated Learning for Noisy and Imbalanced State Discrimination</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Rocco Ballester</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Christian Blum</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Jesus Cerquides</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Luis Artiles</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Artificial Intelligence Research Institute (IIIA-CSIC)</institution>
          ,
          <addr-line>Barcelona 08193</addr-line>
          ,
          <country country="ES">Spain</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Autonomous University of Barcelona (UAB)</institution>
          ,
          <addr-line>Barcelona 08193</addr-line>
          ,
          <country country="ES">Spain</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Strategic Platform</institution>
          ,
          <addr-line>Barcelona 08018</addr-line>
          ,
          <country country="ES">Spain</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2026</year>
      </pub-date>
      <volume>220</volume>
      <issue>2023</issue>
      <fpage>431</fpage>
      <lpage>448</lpage>
      <abstract>
        <p>Quantum Federated Learning (QFL) is an emerging framework that integrates Federated Learning (FL) with Quantum Computing (QC), enabling collaborative training of quantum models across distributed and privacy-preserving environments. While prior work has primarily focused on classical machine learning tasks using Variational Quantum Circuits (VQCs), the application of QFL to inherently quantum problems remains largely unexplored. In this work, we investigate QFL in the context of one of the most fundamental tasks in quantum information theory, namely, Quantum State Discrimination (QSD). We design and evaluate a set of toy yet realistic scenarios involving small, imbalanced, and noisy quantum datasets, reflecting practical constraints common in quantum sensing and metrology. Our findings demonstrate that QFL can successfully overcome local dataset biases and quantum noise, achieving near-optimal performance comparable to analytical results. All code and experiments are made publicly available at https://github.com/roccobb/QFL4QSD to support reproducibility and encourage further research.</p>
      </abstract>
      <kwd-group>
        <kwd>Quantum Federated Learning</kwd>
        <kwd>Quantum State Discrimination</kwd>
        <kwd>Artificial Intelligence</kwd>
        <kwd>Distributed Quantum Computing</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        The rapid advancement of both Machine Learning (ML) and Quantum Computing (QC) is transforming
the landscape of modern computing and Artificial Intelligence (AI) [
        <xref ref-type="bibr" rid="ref1 ref2 ref3">1, 2, 3</xref>
        ]. ML has become a cornerstone
of contemporary AI, enabling significant breakthroughs in data analysis, pattern recognition,
decision-making, and more [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. However, the explosive growth of data has exposed key limitations of
centralized ML models, including storage bottlenecks, high communication costs, and heightened risks
of data leakage [4, 5]. These concerns, along with increasing emphasis on privacy and compliance with
regulations such as the EU’s General Data Protection Regulation (GDPR), have motivated a shift toward
decentralized, privacy-preserving approaches such as Federated Learning (FL), a paradigm in which
multiple participants collaboratively train a shared model without exchanging raw data [6, 7].
      </p>
      <p>
        In parallel, the field of QC has undergone significant advancements, progressing from theoretical
foundations to the realization of Noisy Intermediate-Scale Quantum (NISQ) devices [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. Unlike classical
systems, quantum computers exploit phenomena such as superposition and entanglement to process
information in fundamentally new ways. These principles have enabled quantum algorithms that offer
computational advantages for specific tasks [8, 9], as well as the development of cryptographic protocols
that guarantee information-theoretic security [10].
      </p>
      <p>However, despite the promising capabilities enabled by QC, practical implementations remain
constrained by the limitations of current hardware. Today’s quantum devices are characterized by a small
number of qubits, limited coherence times, high error rates, and the absence of full-scale quantum error
correction [11]. These challenges place significant restrictions on the complexity and depth of quantum
circuits that can be reliably executed, thereby limiting the scope of near-term quantum applications
[12].</p>
      <p>In addition, many practical quantum computing scenarios involve datasets that are inherently small,
distributed, and sensitive [13, 14]. This is particularly evident in domains such as quantum sensing
and quantum metrology, where data is often generated across multiple remote quantum devices or
laboratories [15]. In such contexts, centralizing data is not only impractical due to hardware constraints
but may also be undesirable due to privacy concerns or institutional boundaries [16].</p>
      <p>These challenges, combined with the growing need for scalable, distributed quantum data processing
[17], have motivated the integration of FL principles into the quantum domain—giving rise to the field
known as Quantum Federated Learning (QFL).</p>
      <p>Nevertheless, while QFL offers a promising avenue for collaborative quantum learning, the majority
of existing works have primarily focused on applying it to classical tasks, particularly to standard
classification problems using Variational Quantum Circuits (VQCs) and classical datasets [18]. In contrast, this
work explores the use of QFL to address Quantum State Discrimination (QSD), a foundational problem
in quantum information theory. We investigate this task under practical constraints, including noisy
and imbalanced quantum datasets distributed across different clients. To the best of our knowledge,
this is the first work to apply a federated learning framework to a genuinely quantum task—namely,
QSD—highlighting the broader potential of QFL for advancing fundamental quantum information
processing problems in realistic settings.</p>
      <p>The main contributions of this work are as follows:
1. We introduce, for the first time to the best of our knowledge, the use of QFL to address a
foundational quantum information problem, namely QSD.
2. We design and implement a set of toy but realistic experiments within the QFL framework,
including:
a) Small and imbalanced quantum datasets with varying degrees of class imbalance;
b) Small, imbalanced quantum datasets subject to depolarizing noise, modeling realistic
quantum noise in state preparation.
3. We demonstrate that QFL can effectively mitigate local bias and noise, achieving strong
performance in QSD tasks. Our results are benchmarked against an analytical solution, validating the
robustness and accuracy of the QFL approach.
4. To promote transparency and reproducibility, we release a public repository containing all code,
experiments, and results available at https://github.com/roccobb/QFL4QSD.</p>
      <p>The rest of this paper is organized as follows. Section 2 reviews related work in both QFL and QSD.
Section 3 provides the necessary background on these two domains. In Section 4, we define the problem
under study and detail the assumptions and challenges involved. Section 5 presents and discusses our
experimental results. Finally, Section 6 concludes the paper and outlines potential directions for future
work.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Related Work</title>
      <p>QFL has quickly become a growing area of research at the intersection of quantum computing and
distributed learning. Since its initial formulation in the early 2020s [19], QFL has attracted increasing
interest for its potential to enable collaborative quantum model training without requiring centralized
access to data, making it a promising candidate for real-world applications using current NISQ devices
[20, 17, 18].</p>
      <p>While some studies have explored the use of QFL to enhance communication security—leveraging
quantum cryptographic techniques such as Quantum Key Distribution (QKD) to enable more private and
robust federated protocols [21, 22, 23, 24, 25]—the predominant research focus has been on optimization,
particularly in the context of classical classification tasks [18]. Thus, the vast majority of works in the
field typically use quantum models, most often VQCs, to classify classical datasets such as MNIST or
CIFAR-10 within a federated framework (see [19, 26, 27, 28, 29, 30, 31, 32, 33, 34], to name a few). In this
sense, the primary goal of most works is to obtain better convergence or improved efficiency within a
federated framework, rather than to address a problem that is inherently quantum in nature.</p>
      <p>Among the growing body of QFL literature, only a handful of works explicitly address the use of
quantum data, in particular, [35] and [36]. However, these studies adopt relatively simple settings in
which quantum data is artificially generated by applying single-qubit rotations, with labels assigned
based on whether the resulting state exceeds a predefined excitation threshold. Although VQCs are
employed within a federated learning framework, the classification tasks remain straightforward. In
contrast, our work addresses a significantly more complex and foundational challenge, namely QSD,
while incorporating practical constraints such as noisy states, class imbalance, and heterogeneous data
distributions across clients.</p>
      <p>On a different note, QSD stands out as a particularly important and impactful problem among the
many challenges in quantum information theory [37, 38]. Its relevance spans a wide array of applications,
including quantum communication, quantum cryptography, quantum metrology, and quantum sensing,
where reliable state identification is essential to system performance and security [39].</p>
      <p>A few recent works have explored the use of VQCs for QSD. Chen et al. [40] introduced a framework
where parameterized quantum circuits are trained to approximate generalized quantum measurements,
also known as Positive Operator-Valued Measure (POVM), enabling discrimination of non-orthogonal
quantum states with strong generalization to unseen inputs. Similarly, Lee et al. [41] proposed a
framework which learns the optimal POVM for minimum-error discrimination using a cost-function-based
variational approach, showing performance close to semidefinite programming solutions. Lastly, [42]
tested the performance of noisy quantum neural networks for QSD, showing that effective discrimination
is possible even under realistic noise conditions.</p>
      <p>These works establish the feasibility of using VQCs for QSD. Nevertheless, they typically assume
access to centralized quantum data and rely on idealized settings, often involving large or even infinite
datasets. In contrast, we address a more realistic and practical setting where multiple clients each
hold small, finite quantum datasets that may be imbalanced and subject to noise. This decentralized
approach better reflects the constraints faced in current quantum technologies and distributed quantum
applications [14].</p>
    </sec>
    <sec id="sec-3">
      <title>3. Fundamental Concepts</title>
      <sec id="sec-3-1">
        <title>3.1. State Discrimination</title>
        <p>QSD is the task of identifying an unknown quantum state, ρ, chosen from a known set of possible states
ρ ∈ {ρ_1, ρ_2, ..., ρ_N}. The objective is to design an optimal measurement that best identifies ρ.</p>
        <p>In this work, we focus on quantum states represented by density matrices acting on a 2-dimensional
complex Hilbert space ℋ, that is, single-qubit states. In addition, we restrict our attention to the binary
classification case, where the unknown state belongs to the set ρ ∈ {ρ_A, ρ_B}.</p>
        <p>Thus, the goal is to construct a quantum measurement that, upon receiving an unknown qubit state
drawn from {ρ_A, ρ_B}, identifies it as either class A or class B with the smallest possible probability of
error.</p>
        <p>Let the measurement, {Π_b}, with b ∈ {0, 1}, be described by a 2-element POVM. A measurement
outcome of b = 0 is interpreted as a guess that the system was prepared in state ρ_A, and similarly, a
measurement outcome of b = 1 is interpreted as having received state ρ_B.</p>
        <p>Assuming the state has been prepared in ρ_A or ρ_B with prior probabilities p_A and p_B, respectively,
the probability of correctly identifying the state is:
P_success = p_A Tr[ρ_A Π_0] + p_B Tr[ρ_B Π_1]. (1)</p>
        <p>If the states are orthogonal (i.e., Tr[ρ_A ρ_B] = 0), they can be perfectly distinguished. However, in a
more realistic and general scenario, ρ_A and ρ_B are not orthogonal (i.e., 0 &lt; Tr[ρ_A ρ_B] &lt; 1), and it is
impossible to distinguish them with certainty.</p>
        <p>The optimal strategy for this particular case was provided by Carl W. Helstrom (see [43]).
Since Σ_b Π_b = I for all POVMs, Eq. 1 can be written as
P_success = p_A Tr[ρ_A Π_0] + p_B Tr[ρ_B (I − Π_0)] = p_B + Tr[(p_A ρ_A − p_B ρ_B) Π_0]. (2)</p>
        <p>To maximize Eq. 2, the trace term needs to be maximized. This is achieved when Π_0 is a projector onto the
positive eigenspace of p_A ρ_A − p_B ρ_B.</p>
        <p>Moreover, in this case, the solution is equivalent to finding a rotation in the Bloch sphere that aligns
the optimal measurement direction with a given fixed measurement basis, such that Π_0 = U† |0⟩⟨0| U.</p>
        <p>Finding this optimal rotation effectively reduces the problem of QSD to a variational optimization
task. In this work, we employ a Variational Quantum Circuit (VQC) to learn the unitary transformation
that implements the desired rotation. The quantum circuit is trained to minimize a suitable loss function
that quantifies the discrimination error, allowing the model to approximate the optimal measurement
strategy directly from data.</p>
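        <p>As a concrete numerical check of this recipe, the sketch below (our own illustration, not code from the paper’s repository) builds the projector onto the positive eigenspace of Δ = p_A ρ_A − p_B ρ_B and evaluates Eq. 2:</p>

```python
import numpy as np

def helstrom(rho_a, rho_b, p_a, p_b):
    """Optimal minimum-error POVM element Pi_0: the projector onto the
    positive eigenspace of Delta = p_a*rho_a - p_b*rho_b (Eq. 2)."""
    delta = p_a * rho_a - p_b * rho_b
    evals, evecs = np.linalg.eigh(delta)
    pos = evecs[:, evals > 0]              # eigenvectors with positive eigenvalue
    pi_0 = pos @ pos.conj().T
    p_success = (p_b + np.trace(delta @ pi_0)).real
    return pi_0, p_success

# The single-qubit pair studied later in the paper: |0⟩⟨0| vs |+⟩⟨+|.
rho_a = np.array([[1, 0], [0, 0]], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_b = np.outer(plus, plus.conj())
pi_0, p_success = helstrom(rho_a, rho_b, 0.5, 0.5)
```

        <p>For the equal-prior pair |0⟩⟨0| and |+⟩⟨+| used later in the paper, this evaluates the Helstrom success probability (2 + √2)/4 ≈ 0.8536.</p>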
      </sec>
      <sec id="sec-3-2">
        <title>3.2. Quantum Federated Learning</title>
        <p>FL is a decentralized machine learning paradigm that enables multiple clients to collaboratively train a
shared model without exchanging their raw data. This approach addresses critical concerns around
data privacy, communication costs, and scalability by performing local training on each client’s dataset
and aggregating model updates centrally [7].</p>
        <p>Generally, an FL framework comprises a central server and a set of K parties, commonly referred to
as clients, each holding a subset of the dataset. The goal is to collaboratively train a machine learning
model without disclosing any local data. To accomplish this, each client performs local training and
sends its model parameters to the central server, which aggregates the updates and returns the revised
global parameters to all clients:
θ^{t+1} ← Σ_{k=1}^{K} f_k θ_k^{t+1}, where θ_k^{t+1} ← θ^t − η ∇F_k(θ^t), ∀k, (3)
where f_k represents the fraction of the dataset held by client k, θ_k^t are the model parameters of client
k at step t, η is the learning rate, F_k(·) represents the local loss of the machine learning model at client
k, and θ^t are the global model parameters at step t.</p>
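        <p>The update in Eq. 3 can be sketched in a few lines of Python; the quadratic toy losses below are our own stand-in for F_k, chosen so that the federated optimum is known in closed form:</p>

```python
import numpy as np

def fedavg_round(theta, grads, fracs, lr):
    """One round of Eq. 3: every client starts from the global theta,
    takes one gradient step on its local loss, and the server averages
    the resulting parameters weighted by dataset fraction f_k."""
    local = [theta - lr * g for g in grads]
    return sum(f * t for f, t in zip(fracs, local))

# Toy check: quadratic local losses F_k(theta) = (theta - c_k)^2 / 2,
# so the federated optimum is the weighted mean of the targets c_k.
targets = np.array([0.2, 0.8, 1.1])
fracs = np.array([0.5, 0.25, 0.25])
theta = 0.0
for _ in range(300):
    theta = fedavg_round(theta, theta - targets, fracs, lr=0.1)
```

        <p>After enough rounds the global parameter converges to the dataset-weighted mean of the local optima (0.575 here), which is exactly the fixed point of Eq. 3 for these losses.</p>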
        <p>This process is repeated iteratively until a convergence or stopping criterion is met.</p>
        <p>Despite its advantages, FL also introduces several challenges, including statistical heterogeneity across
clients, communication inefficiencies, and vulnerabilities to adversarial behavior or data poisoning
attacks [4].</p>
        <p>QFL emerges from the intersection of FL and Quantum Machine Learning (QML) and aims to address
some of FL’s mentioned challenges while leveraging the potential advantages of QC.</p>
        <p>As discussed in Section 2, typical implementations of QFL combine QML models—most notably
VQCs—at the client level to boost performance, while employing quantum communication protocols to
address privacy concerns and mitigate communication bottlenecks. Nonetheless, we believe that QFL
is particularly well-suited for scenarios where quantum data is inherently distributed across different
devices or institutions and where data centralization is infeasible or undesirable.</p>
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Problem Definition</title>
      <p>As stated in Section 3.1, the goal in binary QSD is to identify the true label y ∈ {0, 1} of an unknown
quantum state ρ ∈ ℋ, which is known to be prepared from one of two possible sources: ρ_A (class A) or
ρ_B (class B). The optimal strategy for minimizing the probability of misclassification is given by Eq. 2,
whose success probability depends on both:
• The actual quantum states ρ_A and ρ_B.</p>
      <p>• Their respective prior probabilities p_A = P(y = 0) and p_B = P(y = 1), with p_A + p_B = 1.
In this work, we consider a more constrained and realistic version of this problem.
Let there be K clients {C_k}_{k=1}^{K}, each holding a private dataset of labeled quantum states:
D_k = {(ρ_k^{(i)}, y_k^{(i)})}_{i=1}^{M}, where y_k^{(i)} ∈ {0, 1},
with M being the number of training samples per client (for simplicity, we assume that all clients
have the same number of samples, i.e., |D_k| = M, ∀k).</p>
      <p>Moreover, it is important to note that in our scenario, each client:
• Does not know the actual quantum states ρ_k^{(i)}, only the corresponding class labels y_k^{(i)}.
• Does not know the prior probabilities p_A, p_B of the true population from which the unknown
test state is drawn.
• May have a class imbalance in its local dataset, such that the empirical class ratio deviates from
the true population priors.</p>
      <p>These constraints make the learning problem significantly more challenging. The presence of noise
can degrade the training signal and introduce uncertainty, especially since clients lack access to state
fidelity information and cannot correct for errors explicitly.</p>
      <p>Similarly, although clients may suspect that their local datasets are imbalanced, and thus not
representative of the true population, they cannot estimate or adjust for the true class priors due to the
decentralized setup and privacy-preserving constraints. Collectively, these factors complicate the task
of collaboratively learning an optimal global measurement strategy, such as the Helstrom measurement,
making this problem both realistic and nontrivial.</p>
      <p>We consider a VQC consisting of a single trainable rotation gate R_y(θ) and a final measurement in the
computational basis in order to learn the optimal rotation for the aforementioned state discrimination
task.</p>
      <p>While this model could be extended to higher-dimensional state spaces, more complex circuits, or
multi-class settings, we deliberately focus on a minimal configuration. This ensures that we can isolate
and analyze the fundamental effects of federated optimization, data imbalance, and quantum noise,
without conflating them with circuit complexity or optimization instability, while being able to compare
our results with analytical solutions.</p>
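      <p>For concreteness, the following minimal numpy sketch (our own; function names are illustrative) implements this one-parameter model:</p>

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def prob_guess_a(theta, rho):
    """Probability of outcome 0 (interpreted as 'class A') after
    applying Ry(theta) to the input state rho."""
    rotated = ry(theta) @ rho @ ry(theta).conj().T
    return rotated[0, 0].real

# Sanity check: with theta = 0 the circuit is the identity, so |0⟩⟨0|
# is always classified as A while |+⟩⟨+| is a coin flip.
rho_a = np.array([[1, 0], [0, 0]], dtype=complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_b = np.outer(plus, plus.conj())
```

      <p>Training then amounts to adjusting the single parameter θ so that the two classes are separated with the smallest possible error probability.</p>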
    </sec>
    <sec id="sec-5">
      <title>5. Experiments and Discussion</title>
      <p>In this section, we evaluate the performance of our quantum model in both federated and non-federated
training settings, and compare the results against the known analytical solution given by the Helstrom
bound.</p>
      <p>We consider two experimental configurations:
• Data Imbalance Only. Clients are affected only by varying local class ratios, with no quantum
noise applied. This isolates the effect of statistical bias in local datasets.
• Data Imbalance and Depolarizing Noise. Clients experience both class imbalance and
depolarizing noise during state preparation. This setting reflects a more realistic challenge, where
both data heterogeneity and quantum noise are present.</p>
      <p>In our experiments, we consider a binary QSD task between the states ρ_A = |0⟩⟨0| and ρ_B = |+⟩⟨+|,
where |+⟩ = (|0⟩ + |1⟩)/√2, with prior probabilities p_A = p_B = 0.5. While this symmetric distribution
simplifies the setup and provides a clear reference point, our methodology is applicable to any other
class distribution, as will be discussed later.</p>
      <p>In this particular setup, the optimal measurement corresponds to a rotation around the Y-axis
by an angle of θ* = arcsin(1/√2) = π/4 ≈ 0.7854 radians. This known analytical solution allows us to
quantitatively assess how well our quantum variational model, trained under different conditions,
approximates the optimal decision boundary.</p>
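      <p>Under one consistent sign convention (the paper’s convention may differ by an overall sign), a brute-force scan over θ reproduces this optimum numerically:</p>

```python
import numpy as np

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def success_prob(theta):
    """P(correct) = 0.5 * P(outcome 0 | rho_A) + 0.5 * P(outcome 1 | rho_B)
    for rho_A = |0⟩⟨0|, rho_B = |+⟩⟨+| and equal priors."""
    rho_a = np.array([[1, 0], [0, 0]], dtype=complex)
    plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
    rho_b = np.outer(plus, plus.conj())
    u = ry(theta)
    p0_a = (u @ rho_a @ u.conj().T)[0, 0].real
    p0_b = (u @ rho_b @ u.conj().T)[0, 0].real
    return 0.5 * p0_a + 0.5 * (1.0 - p0_b)

# Grid search over the full range of rotation angles.
thetas = np.linspace(-np.pi, np.pi, 20001)
best = thetas[np.argmax([success_prob(t) for t in thetas])]
```

      <p>The maximizing angle lands at θ* ≈ π/4 with success probability ≈ 0.8536, matching the Helstrom bound for these two states.</p>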
      <p>Thus, for each configuration, we compare:
• Local Training. Each client trains its VQC independently using only local data.
• Federated Training (QFL). Clients participate in a federated optimization loop where only
model parameters are shared with a central server. No raw data or quantum states are exchanged,
preserving privacy and reducing communication complexity.</p>
      <p>These experiments aim to assess whether QFL enables a collaborative and privacy-preserving learning
process that converges to near-optimal discrimination performance, even under realistic constraints.</p>
      <sec id="sec-5-1">
        <title>5.1. Experiment 1: Imbalanced Datasets</title>
        <p>We begin by analyzing the impact of class imbalance across clients in the absence of quantum noise. In
this setup, each client receives a local training dataset of size  = 100, consisting of 1-qubit states
labeled according to class  or class . During training, we use a single shot per circuit evaluation.
However, since gradients are estimated using the parameter-shift rule, each input state is effectively
used twice. Equivalently, we may regard the dataset as comprising 100 pairs of state inputs, with each
pair corresponding to the two shifted evaluations required for gradient estimation.</p>
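        <p>The two shifted, single-shot evaluations per input state can be sketched as follows (a numpy mock of the circuit rather than a call to a quantum device; names are our own):</p>

```python
import numpy as np

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def prob0(theta, rho):
    rotated = ry(theta) @ rho @ ry(theta).conj().T
    return rotated[0, 0].real

def single_shot_grad(theta, rho, rng):
    """Parameter-shift estimate of d P(0)/d theta from one Bernoulli
    shot at each shifted angle, so every input state is used twice."""
    plus_shot = float(prob0(theta + np.pi / 2, rho) > rng.random())
    minus_shot = float(prob0(theta - np.pi / 2, rho) > rng.random())
    return (plus_shot - minus_shot) / 2.0

# Averaging many single-shot estimates recovers the analytic gradient
# -sin(theta)/2 for rho = |0⟩⟨0|.
rng = np.random.default_rng(0)
rho = np.array([[1, 0], [0, 0]], dtype=complex)
est = np.mean([single_shot_grad(np.pi / 4, rho, rng) for _ in range(20000)])
```

        <p>Each single-shot estimate is very noisy, but it is unbiased, which is what makes training with one shot per circuit evaluation viable.</p>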
        <p>The class distribution is not globally uniform but varies across clients, simulating a realistic FL
scenario with non-identically distributed data. The class imbalance for client k is determined by
a parameter, σ, that controls the variability of local class ratios. Specifically, each client’s class-A
proportion p_k ∈ [0, 1] is sampled as:</p>
        <p>p_k = 0.5 + ε_k (clipped to [0, 1]), with ε_k ∼ 𝒩(0, σ²).</p>
        <p>Thus, the corresponding labels are drawn with class probabilities (p_k, 1 − p_k). Larger values of σ
yield greater diversity in class imbalance across clients. Note that while the true population priors are
assumed to be p_A = p_B = 0.5, each client’s empirical distribution may significantly deviate from this,
especially when σ is large.</p>
        <p>Nevertheless, the methodology applies to any class distribution, provided that the local class ratios
are sampled from a Gaussian centered at the true population prior.</p>
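        <p>A minimal sketch of this sampling step (the clipping of out-of-range draws to [0, 1] is our assumption about how they are handled):</p>

```python
import numpy as np

def sample_class_ratios(n_clients, sigma, prior=0.5, seed=None):
    """Draw each client's class-A proportion p_k = prior + eps_k with
    eps_k ~ N(0, sigma^2), clipped to the valid range [0, 1]."""
    rng = np.random.default_rng(seed)
    return np.clip(prior + rng.normal(0.0, sigma, size=n_clients), 0.0, 1.0)

ratios = sample_class_ratios(10, sigma=4.0, seed=42)
# With sigma = 4 most clients end up with fully one-sided local datasets.
```

        <p>With σ = 4 the Gaussian is so wide that most draws saturate at 0 or 1, producing the highly heterogeneous clients used in the experiments below.</p>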
        <p>To better understand the impact of class imbalance on the local learning process, we conduct an
experiment where a single client trains its VQC model on its 100 samples with varying proportions
of class  states ratios. For each class ratio, the training is repeated 50 times with diferent random
samples to account for statistical variability. The metric reported is the angular distance between the
learned rotation and the analytically optimal solution.</p>
        <p>As shown in Figure 1, high imbalances significantly degrade performance, while class ratios closer to
the true prior p_A = 0.5 yield more accurate models with lower angular deviations, as expected.</p>
        <p>Next, we investigate how well a single client and a federated setup with 10 clients learn the optimal
rotation for the QSD task under a highly imbalanced and heterogeneous data distribution (using σ = 4).</p>
        <p>Fig. 2a shows the training dynamics of a single client, which updates its model parameter after each
individual state measurement and does not participate in any FL process. In contrast, Fig. 2b illustrates
the evolution of the global model in a QFL setup. In this setting, each client performs local updates
after every state measurement, but communicates its model parameter to a central server after every 5
measurements. The server aggregates these parameters by averaging and broadcasts the updated global
model back to all clients. Thus, the QFL setup accounts for 100/5 = 20 communication rounds. As
before, both experiments are repeated 50 times to account for statistical variability.</p>
        <p>[Figure 1: Performance degrades as the local class ratio deviates from the true prior p_A = 0.5.]</p>
        <p>[Figure 2: (a) Learning dynamics of a single client. (b) Learning dynamics of a QFL setup with 10 clients and 20 communication rounds (σ = 4). Shaded areas denote standard deviation across 50 runs.]</p>
        <p>As shown in Figure 2, the federated setup significantly outperforms the single-client case. Not only
does the QFL model converge to a parameter much closer to the optimal rotation, but it also exhibits
considerably lower variance across runs. This highlights the ability of QFL to mitigate the effects of local
imbalance by leveraging information aggregated from multiple clients without the need to centralize or
share local data.</p>
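        <p>The round structure described above (local updates between communication rounds, followed by plain parameter averaging) can be sketched as below; the pull-toward-a-biased-optimum update is our own toy stand-in for the actual single-shot gradient step:</p>

```python
def run_qfl(client_steps, theta0, n_rounds, n_local=5):
    """Sketch of the federated loop: each round, every client starts from
    the shared global parameter, performs n_local local updates, and the
    server averages the results (equal weights, since all clients hold
    the same number of samples)."""
    theta = theta0
    for _ in range(n_rounds):
        local_params = []
        for step in client_steps:          # one local update rule per client
            th = theta
            for _ in range(n_local):
                th = step(th)
            local_params.append(th)
        theta = sum(local_params) / len(local_params)
    return theta

# Toy stand-in: each client pulls theta toward its own biased optimum c_k,
# yet the averaged global parameter converges to the mean of the c_k.
biases = [0.5, 0.7, 0.9, 1.1]
steps = [lambda th, c=c: th - 0.2 * (th - c) for c in biases]
theta = run_qfl(steps, theta0=0.0, n_rounds=60, n_local=5)
```

        <p>The averaging drives the global parameter to the mean of the clients’ biased optima (0.8 in this toy), mirroring how QFL cancels opposing local biases without sharing data.</p>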
      </sec>
      <sec id="sec-5-2">
        <title>5.2. Experiment 2: Imbalanced and Noisy Datasets</title>
        <p>To simulate realistic hardware conditions, we also introduce quantum noise in the form of depolarizing
noise. This noise is applied immediately after the state preparation stage, and acts as a quantum channel:</p>
        <p>ℰ(ρ) = (1 − p)ρ + (p/3)(XρX + YρY + ZρZ),</p>
        <p>where p ∈ [0, 1] is the depolarization probability. This models a common and challenging form of
noise present in many current quantum devices [15], and serves to test the robustness of QFL under
noisy conditions.</p>
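        <p>The channel is straightforward to implement directly (a small numpy sketch of the formula above):</p>

```python
import numpy as np

# Single-qubit Pauli matrices.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def depolarize(rho, p):
    """Single-qubit depolarizing channel
    E(rho) = (1 - p) rho + (p / 3)(X rho X + Y rho Y + Z rho Z)."""
    return (1 - p) * rho + (p / 3) * (X @ rho @ X + Y @ rho @ Y + Z @ rho @ Z)

rho0 = np.array([[1, 0], [0, 0]], dtype=complex)   # |0⟩⟨0|
noisy = depolarize(rho0, 0.5)
```

        <p>A quick sanity check: the channel is trace-preserving, and at p = 3/4 it maps every input to the maximally mixed state I/2.</p>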
        <p>Figure 3 shows the angular distance between the learned and optimal rotation as the noise probability
increases for a single-client setup. Fig. 3a corresponds to the case of balanced datasets (σ = 0), while
Fig. 3b illustrates the performance under high imbalance (σ = 4). In both scenarios, we observe a
consistent degradation in accuracy as the noise level increases. However, the degradation is significantly
more pronounced when both imbalance and noise are present, highlighting their compounding effect.</p>
        <p>[Figure 3: (a) Balanced datasets (σ = 0). (b) Imbalanced datasets (σ = 4).]</p>
        <p>As done in Section 5.1, we replicate here the training evolution analysis from Fig. 2 but under more
challenging conditions. In particular, we consider a highly imbalanced setting (σ = 4) combined with
strong depolarizing noise (p = 0.5).</p>
        <p>[Figure 4: (a) Learning dynamics of a single client. (b) Learning dynamics of a QFL setup with 10 clients and 20 communication rounds.]</p>
        <p>Fig. 4a shows the evolution of a single client’s model parameter under a highly imbalanced and noisy
scenario, while Fig. 4b shows the evolution of the QFL framework using 10 clients and 20 federated
rounds. As before, each curve represents the average angular distance to the optimal rotation over 50
trials. We observe that the single-client setup struggles to find a good approximation to the optimal
rotation, with large variance across trials. In contrast, the QFL setup converges more effectively and
exhibits significantly reduced variance, demonstrating its robustness in the presence of both data
imbalance and quantum noise.</p>
        <p>These experiments demonstrate that QFL is a valuable framework for QSD under realistic constraints.
By effectively mitigating the detrimental effects of data imbalance and depolarizing noise, QFL enables
more accurate and stable training compared to isolated, local approaches. This highlights its potential
for scalable and privacy-preserving quantum information processing in practical settings.</p>
      </sec>
    </sec>
    <sec id="sec-6">
      <title>6. Conclusion and Future Work</title>
      <p>In this work, we have demonstrated that QFL can be effectively employed to mitigate the impact of
data imbalance and quantum noise in a decentralized and privacy-preserving setting. While most
prior research on QFL has focused on classical machine learning tasks using quantum models, our
results show that QFL can also benefit genuinely quantum problems—such as QSD—by enabling robust
learning in scenarios where clients have limited, biased, and noisy access to quantum data. Given the
constraints of current and near-term quantum hardware, QFL presents a promising framework for
practical distributed quantum learning applications.</p>
      <p>Looking ahead, several directions for future research emerge. First, a systematic study of key
hyperparameters—such as the number of clients, the amount of data per client, the number and
frequency of federated rounds, and the aggregation strategies used—could help identify configurations
that optimize performance across different quantum learning scenarios. In our experiments, these
choices were made heuristically; a more principled exploration could yield deeper insights into the
dynamics of QFL and its limitations.</p>
      <p>Another natural direction for future research is to increase the complexity of the QSD problem
itself. This could involve extending the task to multi-qubit systems, introducing more than two classes,
or simulating more realistic quantum environments by incorporating multiple, heterogeneous noise
sources. These extensions would not only bring the problem closer to real-world quantum applications
but also further test the scalability and robustness of the QFL framework in more demanding scenarios.</p>
      <p>Along the same lines, future work could also examine the role of quantum model architecture in
distributed learning. While this study focused on a simple VQC to allow for analytical comparison,
more expressive ansätze—potentially involving entanglement or deeper layers—could enhance learning
performance in more complex settings. Understanding how circuit design influences learning dynamics
and generalization in QFL could provide useful guidelines for building more powerful and adaptable
quantum models for distributed applications.</p>
    </sec>
    <sec id="sec-7">
      <title>Declaration on Generative AI</title>
      <p>During the preparation of this work, the authors used ChatGPT to check grammar and spelling and to paraphrase and reword text. After using this tool, the authors reviewed and edited the content as needed and take full responsibility for the publication’s content.</p>
      <p>
[20] J. Wu, T. Hu, Q. Li, Distributed Quantum Machine Learning: Federated and Model-Parallel
Approaches, IEEE Internet Computing 28 (2024) 65–72. URL: https://ieeexplore.ieee.org/document/
10508212/?arnumber=10508212. doi:10.1109/MIC.2024.3361288, conference Name: IEEE
Internet Computing.
[21] W. Li, S. Lu, D.-L. Deng, Quantum federated learning through blind quantum computing,
Science China Physics, Mechanics &amp; Astronomy 64 (2021) 100312. URL: https://doi.org/10.1007/
s11433-021-1753-3. doi:10.1007/s11433-021-1753-3.
[22] B. Polacchi, D. Leichtle, L. Limongi, G. Carvacho, G. Milani, N. Spagnolo, M. Kaplan, F. Sciarrino,
E. Kashefi, Multi-client distributed blind quantum computation with the Qline architecture, Nature
Communications 14 (2023) 7743. URL: https://doi.org/10.1038/s41467-023-43617-0. doi:10.1038/
s41467-023-43617-0.
[23] Q. Xia, Z. Tao, Q. Li, Defending Against Byzantine Attacks in Quantum Federated Learning,
in: 2021 17th International Conference on Mobility, Sensing and Networking (MSN), 2021, pp.
145–152. URL: https://ieeexplore.ieee.org/document/9751508/?arnumber=9751508. doi:10.1109/
MSN53354.2021.00035.
[24] W. Yamany, N. Moustafa, B. Turnbull, OQFL: An Optimized Quantum-Based Federated
Learning Framework for Defending Against Adversarial Attacks in Intelligent Transportation
Systems, IEEE Transactions on Intelligent Transportation Systems 24 (2023) 893–903. URL: https:
//ieeexplore.ieee.org/document/9641742/. doi:10.1109/TITS.2021.3130906, conference Name:
IEEE Transactions on Intelligent Transportation Systems.
[25] Y. Zhang, C. Zhang, C. Zhang, L. Fan, B. Zeng, Q. Yang, Federated Learning with Quantum Secure
Aggregation, 2023. URL: http://arxiv.org/abs/2207.07444. doi:10.48550/arXiv.2207.07444,
arXiv:2207.07444 [quant-ph].
[26] L. Sünkel, P. Altmann, M. Kölle, T. Gabor, Quantum Federated Learning for Image
Classification, in: Proceedings of the 16th International Conference on Agents and Artificial
Intelligence, SCITEPRESS - Science and Technology Publications, Rome, Italy, 2024, pp. 936–
942. URL: https://www.scitepress.org/DigitalLibrary/Link.aspx?doi=10.5220/0012421200003636.
doi:10.5220/0012421200003636.
[27] H. Lee, S. Park, Performance Analysis of Quantum Federated Learning in Data Classification, The
Journal of Korean Institute of Communications and Information Sciences 49 (2024) 264–269. URL:
https://www.dbpia.co.kr. doi:10.7840/kics.2024.49.2.264.
[28] J. Shi, T. Chen, S. Zhang, X. Li, Personalized Quantum Federated Learning for Privacy Image
Classification, 2024. URL: http://arxiv.org/abs/2410.02547. doi:10.48550/arXiv.2410.02547,
arXiv:2410.02547.
[29] D. Gurung, S. R. Pokhrel, A Personalized Quantum Federated Learning, in: Proceedings of
the 8th Asia-Pacific Workshop on Networking, ACM, Sydney Australia, 2024, pp. 175–176. URL:
https://dl.acm.org/doi/10.1145/3663408.3665806. doi:10.1145/3663408.3665806.
[30] T. Wang, H.-H. Tseng, S. Yoo, Quantum Federated Learning with Quantum Networks, in:
ICASSP 2024 - 2024 IEEE International Conference on Acoustics, Speech and Signal Processing
(ICASSP), 2024, pp. 13401–13405. URL: https://ieeexplore.ieee.org/abstract/document/10447516.
doi:10.1109/ICASSP48485.2024.10447516, ISSN: 2379-190X.
[31] D. De, M. Nandy Pal, D. Hazra, FQPDR: federated quantum neural network for privacy-preserving
early detection of diabetic retinopathy, Evolutionary Intelligence (2024). URL: https://doi.org/10.
1007/s12065-024-00971-2. doi:10.1007/s12065-024-00971-2.
[32] Z. Qu, L. Zhang, P. Tiwari, Quantum Fuzzy Federated Learning for Privacy Protection in Intelligent
Information Processing, IEEE Transactions on Fuzzy Systems (2024) 1–12. URL: https://ieeexplore.
ieee.org/document/10572363/?arnumber=10572363. doi:10.1109/TFUZZ.2024.3419559,
conference Name: IEEE Transactions on Fuzzy Systems.
[33] Y. Song, Y. Wu, S. Wu, D. Li, Q. Wen, S. Qin, F. Gao, A quantum federated learning framework
for classical clients, Science China Physics, Mechanics &amp; Astronomy 67 (2024) 250311. URL:
https://doi.org/10.1007/s11433-023-2337-2. doi:10.1007/s11433-023-2337-2.
[34] W. J. Yun, J. P. Kim, S. Jung, J. Park, M. Bennis, J. Kim, Slimmable quantum federated learning,
arXiv preprint arXiv:2207.10221 (2022).
[35] M. Chehimi, W. Saad, Quantum Federated Learning with Quantum Data, in: ICASSP 2022 - 2022
IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2022, pp.
8617–8621. doi:10.1109/ICASSP43922.2022.9746622, ISSN: 2379-190X.
[36] S. Rayhan, An Approach to Work with Quantum Data in Federated Learning, 2023.
[37] A. Chefles, Quantum State Discrimination, Contemporary Physics 41 (2000) 401–424. URL: http:
//arxiv.org/abs/quant-ph/0010114. doi:10.1080/00107510010002599, arXiv:quant-ph/0010114.
[38] J. A. Bergou, Discrimination of quantum states, Journal of Modern Optics 57 (2010) 160–180.
URL: https://doi.org/10.1080/09500340903477756. doi:10.1080/09500340903477756, publisher:
Taylor &amp; Francis _eprint: https://doi.org/10.1080/09500340903477756.
[39] J. Bae, L.-C. Kwek, Quantum state discrimination and its applications, 2017. URL: https://arxiv.org/
abs/1707.02571v2. doi:10.1088/1751-8113/48/8/083001.
[40] H. Chen, L. Wossnig, S. Severini, H. Neven, M. Mohseni, Universal discriminative quantum
neural networks, Quantum Machine Intelligence 3 (2021) 1. URL: http://arxiv.org/abs/1805.08654.
doi:10.1007/s42484-020-00025-7, arXiv:1805.08654 [quant-ph].
[41] D. Lee, K. Baek, J. Huh, D. K. Park, Variational quantum state discriminator for supervised machine
learning, 2023. URL: https://arxiv.org/abs/2303.03588v1. doi:10.1088/2058-9565/ad0a05.
[42] A. Patterson, H. Chen, L. Wossnig, S. Severini, D. Browne, I. Rungger, Quantum state discrimination
using noisy quantum neural networks, Physical Review Research 3 (2021) 013063. URL: https://link.
aps.org/doi/10.1103/PhysRevResearch.3.013063. doi:10.1103/PhysRevResearch.3.013063.
[43] C. W. Helstrom, Quantum detection and estimation theory, Journal of Statistical Physics 1 (1969)
231–252. URL: https://doi.org/10.1007/BF01007479. doi:10.1007/BF01007479.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name><given-names>S.</given-names> <surname>Tufail</surname></string-name>,
          <string-name><given-names>H.</given-names> <surname>Riggs</surname></string-name>,
          <string-name><given-names>M.</given-names> <surname>Tariq</surname></string-name>,
          <string-name><given-names>A. I.</given-names> <surname>Sarwat</surname></string-name>,
          <article-title>Advancements and Challenges in Machine Learning: A Comprehensive Review of Models, Libraries, Applications, and Algorithms</article-title>,
          <source>Electronics</source>
          <volume>12</volume>
          (<year>2023</year>)
          <fpage>1789</fpage>.
          URL: https://www.mdpi.com/2079-9292/12/8/1789. doi:10.3390/electronics12081789, number: 8 Publisher: Multidisciplinary Digital Publishing Institute.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name><given-names>M.</given-names> <surname>Schuld</surname></string-name>,
          <string-name><given-names>F.</given-names> <surname>Petruccione</surname></string-name>,
          <source>Supervised Learning with Quantum Computers</source>, Quantum Science and Technology, Springer International Publishing, Cham,
          <year>2018</year>.
          URL: https://link.springer.com/10.1007/978-3-319-96424-9. doi:10.1007/978-3-319-96424-9.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name><given-names>M.</given-names> <surname>Schuld</surname></string-name>,
          <string-name><given-names>I.</given-names> <surname>Sinayskiy</surname></string-name>,
          <string-name><given-names>F.</given-names> <surname>Petruccione</surname></string-name>,
          <article-title>An introduction to quantum machine learning</article-title>,
          <source>Contemporary Physics</source>
          <volume>56</volume>
          (<year>2015</year>)
          <fpage>172</fpage>-<lpage>185</lpage>.
          URL: https://doi.org/10.1080/00107514.2014.964942. doi:10.1080/00107514.2014.964942, publisher: Taylor &amp; Francis.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>