<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Predicting Pseudo-Random and Quantum Random Number Sequences using Hybrid Deep Learning Models</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Dmytro Proskurin</string-name>
          <email>proskurin.d@stud.nau.edu.ua</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Sergiy Gnatyuk</string-name>
          <email>s.gnatyuk@nau.edu.ua</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Tetiana Okhrimenko</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>National Aviation University</institution>
          ,
          <addr-line>1 Liubomyra Huzara Ave, 03058, Kyiv</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <abstract>
        <p>Predicting random number sequences has significant implications for cryptography and secure communication systems. In this paper, a hybrid deep learning model is proposed that combines Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and RNNs to predict pseudo-random number generator (PRNG) and quantum random number generator (QRNG) sequences. The proposed model was compared with traditional CNN, LSTM, and RNN models. The results show that the hybrid model outperforms the other models, providing better prediction accuracy for PRNG and QRNG sequences.</p>
      </abstract>
      <kwd-group>
        <kwd>Random numbers</kwd>
        <kwd>RNN</kwd>
        <kwd>CNN</kwd>
        <kwd>LSTM</kwd>
        <kwd>GRU</kwd>
        <kwd>Hybrid model</kwd>
        <kwd>Secure communication</kwd>
        <kwd>PRNG</kwd>
        <kwd>QRNG</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
    </sec>
    <sec id="sec-2">
      <title>2. Related Works</title>
      <p>
        Several studies have explored the use of deep learning techniques for predicting random number sequences.
For instance, CNNs and LSTMs have been applied to predicting PRNG sequences [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ]. In another
study, RNNs have been employed to predict QRNG sequences [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. However, there is limited research on hybrid
deep learning models that combine multiple neural network architectures to predict PRNG and QRNG
sequences.
      </p>
    </sec>
    <sec id="sec-3">
      <title>3. Goal of the Research</title>
      <p>The primary goal of this research is to investigate the effectiveness of various deep learning architectures,
including MLP, CNN, LSTM, and RNN models, for the task of predicting the next value in a sequence of
random numbers generated by a combination of PRNG and QRNG sources. By exploring different neural
network architectures, we aimed to identify the most suitable model for this problem, considering aspects such
as predictive accuracy, model complexity, and training time. Another objective is to assess whether the trained
models can achieve better prediction results than a random baseline, indicating that they have learned
meaningful patterns in the data. To ensure a fair comparison, appropriate evaluation metrics, such
as Mean Squared Error (MSE) and Mean Absolute Error (MAE), are used to quantify the performance of each model
and compare it against a random prediction benchmark. Finally, we aimed to provide insights into the practical
implications of using deep learning models to predict random number sequences generated from quantum
sources, as well as to discuss potential future research directions in this field. By understanding the strengths
and limitations of various models for this task, the authors hope to contribute to the development of more advanced
techniques for analysing and predicting random number sequences in different contexts.</p>
    </sec>
    <sec id="sec-4">
      <title>4. Methodology</title>
    </sec>
    <sec id="sec-5">
      <title>5. Model Architecture</title>
      <p>
        The dataset used in this study consists of PRNG and QRNG sequences generated using various algorithms,
such as the Mersenne Twister, Linear Congruential Generator, and a commercial QRNG device [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ]. The dataset
is divided into training, validation, and test sets, ensuring a balanced representation of PRNG and QRNG
sequences in each set.
      </p>
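      <p>The paper does not specify the generator parameters or the split ratios. As an illustrative sketch only (the LCG constants and the 80/10/10 split below are assumptions, and the QRNG stream is stood in for by software-generated values), the dataset construction could look like:</p>

```python
import numpy as np

def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Textbook Linear Congruential Generator (constants assumed)."""
    out, x = [], seed
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x / m)          # normalize to [0, 1)
    return np.array(out)

# Mersenne Twister via NumPy's RandomState; real QRNG values would come
# from a hardware device -- here they are only a software placeholder.
prng_seq = np.random.RandomState(42).random(1000)   # Mersenne Twister
lcg_seq = lcg(seed=7, n=1000)

seq = np.concatenate([prng_seq, lcg_seq])
n = len(seq)
train, val, test = np.split(seq, [int(0.8 * n), int(0.9 * n)])
```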
      <p>
        The proposed hybrid deep learning model combines the strengths of CNNs, LSTMs, and RNNs to predict
PRNG and QRNG sequences. The model consists of a CNN layer for feature extraction, followed by an LSTM
layer to capture temporal dependencies, and an RNN layer for capturing long-range dependencies. The final
output is a single linear activation unit that produces the predicted value. The model is trained using the Adam
optimizer with mean squared error (MSE) as the loss function [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ].
      </p>
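      <p>The exact layer sizes are not given in the paper. As a shape-level illustration only (all weights and sizes below are arbitrary assumptions, and a plain tanh recurrence stands in for the LSTM and RNN stages), one forward pass of the feature-extraction, recurrent, and linear-output pipeline can be sketched in NumPy:</p>

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernel):
    """'Valid' 1-D convolution: extracts local features (CNN stage)."""
    k = len(kernel)
    return np.array([x[i:i + k] @ kernel for i in range(len(x) - k + 1)])

def recurrent(xs, w_in, w_rec, h0=0.0):
    """Minimal tanh recurrence standing in for the LSTM/RNN stages."""
    h = h0
    for x in xs:
        h = np.tanh(w_in * x + w_rec * h)
    return h

window = rng.random(16)                             # one input window
features = conv1d(window, rng.standard_normal(3))   # CNN feature extraction
h = recurrent(features, w_in=0.5, w_rec=0.9)        # recurrent stages
prediction = 1.3 * h + 0.1                          # single linear output unit
```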
    </sec>
    <sec id="sec-6">
      <title>6. Results Analysis</title>
      <p>The first step in analysing the model's performance was to visually inspect the predicted values against the
true values. This was achieved by plotting the first 100 true values and the corresponding predicted values on
the same graph. This visualization allows us to assess the overall fit of the model to the data and identify any
noticeable discrepancies between the predicted and true values.</p>
      <sec id="sec-6-1">
        <title>6.1 Similarity Assessment</title>
        <p>To quantify the similarity between the true values and the predicted values, the Pearson
correlation coefficient was calculated. This metric measures the linear relationship between two datasets, with a value close to
1 indicating a strong positive relationship. A pre-defined threshold of 0.9 was used to determine whether the
predicted values were considered close to the true values. Based on the computed correlation coefficient, it was
concluded whether the model's predictions were close to the true values.</p>
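        <p>This check can be sketched with NumPy's <monospace>corrcoef</monospace>; the sample values below are illustrative only:</p>

```python
import numpy as np

def is_close(y_true, y_pred, threshold=0.9):
    """Pearson correlation between true and predicted values,
    compared against the pre-defined 0.9 threshold."""
    r = np.corrcoef(y_true, y_pred)[0, 1]
    return r, r >= threshold

# Illustrative values, not taken from the paper's dataset.
y_true = np.array([0.1, 0.4, 0.35, 0.8, 0.6])
y_pred = np.array([0.12, 0.38, 0.40, 0.75, 0.58])
r, close = is_close(y_true, y_pred)
```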
      </sec>
      <sec id="sec-6-2">
        <title>6.2 Model Performance Comparison</title>
        <p>To assess the effectiveness of the model, its performance was compared against a random prediction
baseline. This was done by generating random predictions within the same range as the true values and
calculating the Mean Squared Error (MSE) for both the model's predictions and the random predictions. By
comparing these MSE values, we were able to determine whether the model's predictions were better
than random ones. The results from the visual inspection, similarity assessment, and model performance
comparison provide a comprehensive analysis of the model's performance in predicting the next value in a
sequence of random numbers. These findings contribute to our understanding of the model's effectiveness for
this specific task and offer insights into potential improvements or alternative approaches.</p>
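        <p>A minimal sketch of this baseline comparison, with synthetic stand-ins for the model's predictions:</p>

```python
import numpy as np

rng = np.random.default_rng(1)

y_true = rng.random(500)
y_pred = y_true + rng.normal(0, 0.05, 500)             # stand-in model output
y_rand = rng.uniform(y_true.min(), y_true.max(), 500)  # random baseline

mse_model = np.mean((y_true - y_pred) ** 2)
mse_random = np.mean((y_true - y_rand) ** 2)
better_than_random = mse_model < mse_random
```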
      </sec>
    </sec>
    <sec id="sec-7">
      <title>7. Experiments and Results</title>
      <p>The hybrid model is trained on the dataset and its performance is compared with traditional RNNs, CNNs,
and LSTMs. The models are evaluated using the Pearson correlation coefficient and mean squared error (MSE)
to assess the similarity between the true and predicted values.</p>
      <sec id="sec-7-1">
        <title>7.1 Simple RNN</title>
        <p>The simple RNN is the most basic form of a recurrent neural network, characterized by its single hidden
layer that takes input from the previous time step and feeds it back into the network for the next time step (Fig. 1).</p>
        <p>Despite its simplicity, the performance of simple RNNs in predicting PRNG and QRNG sequences is limited
due to their inability to capture long-range dependencies as a result of the vanishing gradient problem (Fig. 2).</p>
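        <p>A one-unit version of this recurrence, together with an illustrative geometric factor showing why gradients through many steps vanish when the recurrent weight is below 1:</p>

```python
import numpy as np

def simple_rnn(xs, w_in, w_rec):
    """One-unit simple RNN: h_t = tanh(w_in * x_t + w_rec * h_{t-1})."""
    h = 0.0
    for x in xs:
        h = np.tanh(w_in * x + w_rec * h)
    return h

# The gradient w.r.t. an early input shrinks roughly like a product of
# per-step factors; with |w_rec| < 1 it decays geometrically over T steps.
decay = 0.5 ** 50      # illustrative vanishing factor over 50 steps
h_final = simple_rnn(np.ones(50), w_in=1.0, w_rec=0.5)
```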
      </sec>
      <sec id="sec-7-2">
        <title>7.2 Gated Recurrent Unit (GRU)</title>
        <p>The GRU is an advanced RNN architecture that addresses the vanishing gradient problem observed in
simple RNNs. With the introduction of gating mechanisms, GRUs can learn when to update the hidden state
and when to maintain the existing state, allowing them to capture longer-range dependencies more
effectively (Fig. 3).</p>
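        <p>A scalar, single-unit sketch of the GRU update (the weights below are arbitrary assumptions, kept as scalars for clarity; a real layer uses weight matrices):</p>

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, w):
    """Single-unit GRU update with assumed scalar weights."""
    z = sigmoid(w["wz"] * x + w["uz"] * h)              # update gate
    r = sigmoid(w["wr"] * x + w["ur"] * h)              # reset gate
    h_tilde = np.tanh(w["wh"] * x + w["uh"] * (r * h))  # candidate state
    return (1 - z) * h + z * h_tilde                    # gated interpolation

w = {"wz": 0.5, "uz": 0.5, "wr": 0.5, "ur": 0.5, "wh": 1.0, "uh": 1.0}
h = 0.0
for x in [0.2, -0.1, 0.4]:
    h = gru_step(x, h, w)
```

The gated interpolation lets the unit copy its previous state almost unchanged when the update gate is near zero, which is how longer-range dependencies survive.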
        <p>When applied to PRNG and QRNG sequence prediction, GRUs demonstrate improved performance
compared to simple RNNs (Fig. 4).</p>
      </sec>
    </sec>
    <sec id="sec-8">
      <title>7.3 Bidirectional RNN</title>
      <p>Bidirectional RNNs process the input sequence in both forward and backward directions, enabling the
network to capture information from both past and future time steps. This capability proves useful for tasks where
context from both directions is important, such as natural language processing and speech recognition (Fig. 5).</p>
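      <p>The forward/backward scheme can be sketched with a one-unit recurrence run in both directions (weights assumed):</p>

```python
import numpy as np

def run_rnn(xs, w_in=0.7, w_rec=0.6):
    """Collect hidden states of a one-unit simple RNN."""
    h, states = 0.0, []
    for x in xs:
        h = np.tanh(w_in * x + w_rec * h)
        states.append(h)
    return np.array(states)

seq = np.array([0.1, 0.5, -0.3, 0.8])
fwd = run_rnn(seq)                        # past -> future
bwd = run_rnn(seq[::-1])[::-1]            # future -> past
bi_states = np.stack([fwd, bwd], axis=1)  # both directions per time step
```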
      <p>In the context of PRNG and QRNG sequence prediction, bidirectional RNNs exhibit enhanced performance
due to their ability to incorporate information from the entire sequence (Fig. 6).</p>
    </sec>
    <sec id="sec-9">
      <title>7.4 Stacked RNN</title>
      <p>A stacked RNN architecture consists of multiple layers of RNNs stacked on top of each other, allowing the
network to learn more complex features and representations of the input sequence. This increased complexity
can lead to improved prediction performance for PRNG and QRNG sequences (Fig. 7).</p>
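      <p>A two-layer sketch of this stacking, where the upper layer consumes the lower layer's hidden states (one-unit layers with assumed weights):</p>

```python
import numpy as np

def rnn_layer(xs, w_in=0.8, w_rec=0.5):
    """One-unit simple RNN layer returning the full state sequence."""
    h, states = 0.0, []
    for x in xs:
        h = np.tanh(w_in * x + w_rec * h)
        states.append(h)
    return np.array(states)

seq = np.array([0.2, -0.4, 0.9, 0.1])
layer1 = rnn_layer(seq)        # lower layer sees the raw sequence
layer2 = rnn_layer(layer1)     # upper layer sees the lower layer's states
```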
      <p>Stacked RNNs, when compared with other RNN variants, demonstrate superior performance in capturing
intricate patterns within the input data (Fig. 8).</p>
    </sec>
    <sec id="sec-10">
      <title>7.5 Convolutional Neural Networks</title>
      <p>
        CNNs have shown success in time series prediction tasks due to their ability to capture local patterns and
dependencies [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ]. In our experiments, a CNN model is trained on the PRNG and QRNG sequences dataset (Fig. 9).
      </p>
      <p>The results indicate that the CNN model can capture some local patterns in the sequences, but struggles to
predict long-range dependencies, leading to suboptimal prediction accuracy (Fig. 10).</p>
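      <p>A minimal 'valid' 1-D convolution illustrates the local receptive field: each output value depends only on a short window, consistent with the observation that long-range structure is missed (the kernel below is chosen purely for illustration):</p>

```python
import numpy as np

def conv1d_valid(x, kernel):
    """Slide a kernel over the sequence: each output mixes only a
    local window of the input."""
    k = len(kernel)
    return np.array([x[i:i + k] @ kernel for i in range(len(x) - k + 1)])

seq = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
edge_kernel = np.array([-1.0, 1.0])    # detects local increments
out = conv1d_valid(seq, edge_kernel)
```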
    </sec>
    <sec id="sec-11">
      <title>7.6 Long Short-Term Memory Networks</title>
      <p>
        LSTMs are designed to capture long-term dependencies in time series data [
        <xref ref-type="bibr" rid="ref10 ref8 ref9">8-10</xref>
        ]. An LSTM model was trained
on the PRNG and QRNG sequences dataset and its performance evaluated (Fig. 11).
      </p>
      <p>
        The results show that the LSTM model can capture temporal dependencies in the sequences, but its
performance is limited by the absence of feature extraction capabilities (Fig. 12) [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ].
      </p>
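      <p>A scalar single-unit sketch of the LSTM update (shared scalar weights are an assumption for brevity; a real layer has separate weight matrices per gate):</p>

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, w=0.5, u=0.5):
    """Single-unit LSTM step with assumed shared scalar weights."""
    f = sigmoid(w * x + u * h)        # forget gate
    i = sigmoid(w * x + u * h)        # input gate
    o = sigmoid(w * x + u * h)        # output gate
    c_tilde = np.tanh(w * x + u * h)  # candidate cell state
    c = f * c + i * c_tilde           # additive cell update (gradient highway)
    return o * np.tanh(c), c

h, c = 0.0, 0.0
for x in [0.3, -0.2, 0.7]:
    h, c = lstm_step(x, h, c)
```

The additive cell update is what lets gradients flow across many time steps, in contrast to the purely multiplicative recurrence of a simple RNN.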
    </sec>
    <sec id="sec-12">
      <title>7.7 Hybrid Deep Learning Model</title>
      <p>
        The proposed hybrid model combines the strengths of CNNs, LSTMs, and RNNs to predict PRNG and
QRNG sequences (Fig. 13) [
        <xref ref-type="bibr" rid="ref12 ref13">12,13</xref>
        ].
      </p>
      <p>The model's performance is compared with the other models, and the results show that the hybrid model
outperforms the traditional CNNs, LSTMs, and RNNs models, providing better prediction accuracy for PRNG
and QRNG sequences (Fig. 14).</p>
      <p>Numerous instances can be observed where the models were able to predict the exact value or a very close
trend in PRNG and QRNG sequences (Fig. 15, 16). These instances demonstrate the effectiveness of the models
in understanding the underlying patterns and dependencies within the data, as well as their capability to
generalize and make accurate predictions on unseen data.</p>
      <p>Furthermore, it was observed that the models were often able to predict a close trend in the sequences, even
if the exact value was not pinpointed (Fig. 17).</p>
      <p>
        This indicates that the models have a strong grasp of the overall dynamics and structure of the data, enabling
them to generate predictions that closely follow the actual trajectory of the PRNG and QRNG sequences
[<xref ref-type="bibr" rid="ref14">14</xref>-<xref ref-type="bibr" rid="ref16">16</xref>]. This level of trend identification can prove beneficial in scenarios where understanding the general
direction or pattern of the data is more critical than pinpointing individual values [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ].
      </p>
    </sec>
    <sec id="sec-13">
      <title>Conclusions and Future Work</title>
      <p>In this paper, a novel hybrid deep learning model was presented that combines the strengths of
Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and RNNs to predict
Pseudo-Random Number Generator (PRNG) and Quantum Random Number Generator (QRNG) sequences.
Our results demonstrate that the hybrid model outperforms traditional CNN, LSTM, and RNN models in
terms of prediction accuracy for both PRNG and QRNG sequences.</p>
      <p>As part of future work, we plan to explore other hybrid model architectures that could further
enhance the performance of the current model. We also aim to investigate the use of additional features,
such as information from the frequency domain, to improve the prediction capabilities of the model.
Furthermore, we intend to study the generalizability of the hybrid model to other sequence prediction tasks,
including predicting cryptographic keys, secure communication protocols, and other security-related
applications. Additionally, we will consider the development of more robust and efficient training
strategies to ensure that the proposed model remains effective even in the face of rapidly evolving security threats.
By continuing to enhance and refine the hybrid deep learning model, we hope to contribute to the
advancement of secure communications and data protection in the digital age.</p>
    </sec>
    <sec id="sec-14">
      <title>Acknowledgements</title>
      <p>This work was carried out within the framework of research grant № 0122U002361 “Intelligent system of
secure packet data transmission based on reconnaissance UAV”, funded by the Ministry of Education and
Science of Ukraine during 2022-2023.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>M.</given-names>
            <surname>Herrero-Collantes</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. C.</given-names>
            <surname>Garcia-Escartin</surname>
          </string-name>
          , “
          <article-title>Quantum random number generators”</article-title>
          ,
          <source>Reviews of Modern Physics</source>
          , Vol.
          <volume>89</volume>
          ,
          <year>2017</year>
          , art.
          <volume>015004</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>I.</given-names>
            <surname>Goodfellow</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Bengio</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Courville</surname>
          </string-name>
          , “<source>Deep Learning</source>”, MIT Press,
          <year>2016</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>J.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. J.</given-names>
            <surname>Pérez-Jiménez</surname>
          </string-name>
          , “
          <article-title>Forecasting Sunspot Numbers with LSTM”</article-title>
          ,
          <source>International Conference on Membrane Computing</source>
          , Springer, Cham,
          <year>2016</year>
          , pp.
          <fpage>153</fpage>
          -
          <lpage>166</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>A.</given-names>
            <surname>Vaswani</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Shazeer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Parmar</surname>
          </string-name>
          et al, “
          <article-title>Attention is all you need”</article-title>
          ,
          <source>Advances in neural information processing systems</source>
          , Vol.
          <volume>30</volume>
          ,
          <year>2017</year>
          , pp.
          <fpage>5998</fpage>
          -
          <lpage>6008</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>G.</given-names>
            <surname>Marsaglia</surname>
          </string-name>
          , “Random Number Generators”,
          <source>Journal of Modern Applied Statistical Methods</source>
          , Vol.
          <volume>2</volume>
          ,
          <year>2003</year>
          , pp.
          <fpage>2</fpage>
          -
          <lpage>13</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>D. P.</given-names>
            <surname>Kingma</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Ba</surname>
          </string-name>
          , “
          <article-title>Adam: A method for stochastic optimization”</article-title>
          ,
          <source>arXiv preprint arXiv:1412.6980</source>
          ,
          <year>2014</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>Y.</given-names>
            <surname>LeCun</surname>
          </string-name>
          , Y. Bengio, G. Hinton, “
          <article-title>Deep learning”</article-title>
          ,
          <source>Nature</source>
          , Vol.
          <volume>521</volume>
          ,
          <year>2015</year>
          , pp.
          <fpage>436</fpage>
          -
          <lpage>444</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>S.</given-names>
            <surname>Hochreiter</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Schmidhuber</surname>
          </string-name>
          ,
          , “
          <article-title>Long short-term memory”</article-title>
          ,
          <source>Neural computation</source>
          , Vol.
          <volume>9</volume>
          ,
          <year>1997</year>
          , pp.
          <fpage>1735</fpage>
          -
          <lpage>1780</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Imanbayev</surname>
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Tynymbayev</surname>
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Odarchenko</surname>
            <given-names>R.</given-names>
          </string-name>
          et al, “
          <article-title>Research of Machine Learning Algorithms for the Development of Intrusion Detection Systems in 5G Mobile Networks and Beyond”</article-title>
          ,
          <source>Sensors</source>
          ,
          <year>2022</year>
          , Vol.
          <volume>22</volume>
          , issue 24, art.
          <volume>9957</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>I.</given-names>
            <surname>Sutskever</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Vinyals</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Q. V.</given-names>
            <surname>Le</surname>
          </string-name>
          , “
          <article-title>Sequence to sequence learning with neural networks”</article-title>
          ,
          <source>Advances in neural information processing systems</source>
          , Vol.
          <volume>27</volume>
          ,
          <year>2014</year>
          , pp.
          <fpage>3104</fpage>
          -
          <lpage>3112</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>J.</given-names>
            <surname>Chung</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Gulcehre</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Cho</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Bengio</surname>
          </string-name>
          , “
          <article-title>Empirical evaluation of gated recurrent neural networks on sequence modelling”</article-title>
          ,
          <source>arXiv preprint arXiv:1412.3555</source>
          ,
          <year>2014</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <surname>Azarov</surname>
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gnatyuk</surname>
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Aleksander</surname>
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Azarov</surname>
            <given-names>I.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mukasheva</surname>
            <given-names>A.</given-names>
          </string-name>
          “
          <article-title>Real-time ML Algorithms for the Detection of Dangerous Objects in Critical Infrastructures”</article-title>
          ,
          <source>CEUR Workshop Proceedings</source>
          ,
          <year>2023</year>
          , Vol.
          <volume>3373</volume>
          , pp.
          <fpage>217</fpage>
          -
          <lpage>226</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <surname>Iashvili</surname>
            <given-names>G.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Avkurova</surname>
            <given-names>Z.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Iavich</surname>
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Bauyrzhan</surname>
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gagnidze</surname>
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Gnatyuk</surname>
            <given-names>S.</given-names>
          </string-name>
          , “
          <article-title>Content-Based Machine Learning Approach for Hardware Vulnerabilities Identification System”</article-title>
          ,
          <source>Lecture Notes on Data Engineering and Communications Technologies</source>
          , Vol.
          <volume>83</volume>
          , pp.
          <fpage>117</fpage>
          -
          <lpage>126</lpage>
          ,
          <year>2021</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>J.</given-names>
            <surname>Aldama</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Sarmiento</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I. H.</given-names>
            <surname>López Grande</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Signorini</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. T.</given-names>
            <surname>Vidarte</surname>
          </string-name>
          and
          <string-name>
            <given-names>V.</given-names>
            <surname>Pruneri</surname>
          </string-name>
          , “
          <article-title>Integrated QKD and QRNG Photonic Technologies</article-title>
          ,”
          <source>in Journal of Lightwave Technology</source>
          , vol.
          <volume>40</volume>
          , no.
          <issue>23</issue>
          , pp.
          <fpage>7498</fpage>
          -
          <lpage>7517</lpage>
          , Dec. 1,
          <year>2022</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <surname>Faure</surname>
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Shcherba</surname>
            <given-names>A.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Vasiliu</surname>
            <given-names>Y.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Fesenko</surname>
            <given-names>A.</given-names>
          </string-name>
          “
          <article-title>Cryptographic key exchange method for data factorial coding”</article-title>
          ,
          <source>CEUR Workshop Proceedings</source>
          ,
          <year>2020</year>
          , Vol.
          <volume>2654</volume>
          , pp.
          <fpage>643</fpage>
          -
          <lpage>653</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <given-names>R.</given-names>
            <surname>Kuang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Lou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>He</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>McKenzie</surname>
          </string-name>
          and
          <string-name>
            <given-names>M.</given-names>
            <surname>Redding</surname>
          </string-name>
          , “
          <article-title>Pseudo Quantum Random Number Generator with Quantum Permutation Pad</article-title>
          ,” 2021 IEEE International Conference on Quantum Computing and Engineering (QCE), Broomfield, CO, USA,
          <year>2021</year>
          , pp.
          <fpage>359</fpage>
          -
          <lpage>364</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <given-names>M.</given-names>
            <surname>Gupta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. J.</given-names>
            <surname>Nene</surname>
          </string-name>
          , “
          <article-title>Random Sequence Generation using Superconducting Qubits”</article-title>
          , 2021 Third International Conference on Intelligent Communication Technologies and Virtual Mobile Networks (ICICV), Tirunelveli, India,
          <year>2021</year>
          , pp.
          <fpage>640</fpage>
          -
          <lpage>645</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>