<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>CEUR Workshop Proceedings</journal-title>
      </journal-title-group>
      <issn pub-type="ppub">1613-0073</issn>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Showcasing sentiment classification and word prediction in the quantum natural language processing area</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>David Peral-García</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Juan Cruz-Benito</string-name>
          <email>juan.cruz.benito@ibm.com</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Francisco José García-Peñalvo</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Faculty of Science, Universidad de Salamanca (https://ror.org/02f40zc51)</institution>
          ,
          <addr-line>Plaza de los Caídos s/n, 37008 Salamanca</addr-line>
          ,
          <country country="ES">Spain</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>GRIAL Research Group, Department of Computer Science and Automation, Research Institute for Educational Sciences</institution>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>IBM T.J. Watson Research Center</institution>
          ,
          <addr-line>Yorktown Heights, NY 10598</addr-line>
          ,
          <country country="US">USA</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2023</year>
      </pub-date>
      <volume>000</volume>
      <fpage>0</fpage>
      <lpage>0003</lpage>
      <abstract>
        <p>The advent of quantum computers makes it possible to perform quantum computations in different areas like machine learning, finance, or chemistry. This paper showcases one of the emerging areas under quantum machine learning: quantum natural language processing. We present two quantum natural language processing tasks, sentiment classification and missing-word prediction in a sentence, and show how these tasks can be performed even on real quantum computers using the two main libraries in this subfield, DisCoPy and lambeq.</p>
      </abstract>
      <kwd-group>
        <kwd>quantum computing</kwd>
        <kwd>quantum machine learning</kwd>
        <kwd>quantum natural language processing</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction and state of the art</title>
      <p>
        Quantum physics and quantum mechanics are nowadays two key components of one of the emerging areas with major theoretical potential in computation: Quantum Computing. As described in the literature, quantum-computing-based algorithms and procedures like Grover’s search algorithm
[
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] and Shor’s algorithm [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ] can outperform existing algorithms and solutions and help to scale the available computational power. Beyond these two typical examples of the potential of quantum computing, other applications in fields like chemistry, finance, or machine learning have recently received significant attention from the research community. Moreover, some significant results in these fields have already been obtained even with current, near-term quantum devices. Quantum machine learning (QML), the process of performing machine learning tasks using quantum mechanics, is one of these emerging areas. The
capabilities of quantum computers could allow machine learning to explore areas that would be too hard for classical computing. Nowadays, there exist practical implementations of QML algorithms, like variable-depth quantum circuits (vVQC) [
        <xref ref-type="bibr" rid="ref3 ref4 ref5">3, 4, 5</xref>
        ], hybrid quantum
autoencoders (HQA) [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ], quantum neural networks (QNN) [
        <xref ref-type="bibr" rid="ref7 ref8 ref9">7, 8, 9</xref>
        ] [Figure 1], or hybrid k-nearest neighbour models (HKNN) [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ]. A review of the main QML algorithms and their applications between 2017 and 2021 can be found in [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. A recent subarea of QML is Quantum
Natural Language Processing (QNLP), which uses NLP models jointly with certain quantum
phenomena such as superposition, entanglement, and interference to perform language-related
tasks on quantum hardware. One of the seminal papers in this area was written in 2010 by Bob
Coecke [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ], who established the theoretical and mathematical foundations using a grammatical theory based on the algebra of pregroups, introduced by Jim Lambek [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. Code implementations of these foundations are available in the DisCoPy and lambeq libraries [Figure 2].
      </p>
      <p>2023 Copyright for this paper by its authors. CEUR Workshop Proceedings (ceur-ws.org).</p>
      <p>DisCoPy allows the user to define string diagrams and monoidal functors, while lambeq provides tools to transform and manipulate sentences, for example, by converting them into string diagrams.</p>
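      <p>To make the grammar these libraries build on concrete: in Lambek's pregroup grammar, each word is assigned a type, and adjacent pairs of the form x·xʳ and xˡ·x cancel; a sentence is grammatical when the types reduce to the single sentence type s. The following minimal pure-Python sketch illustrates that cancellation rule. It is not the DisCoPy or lambeq API; the function name and type encoding are illustrative choices.</p>

```python
def reduces_to_sentence(word_types):
    """Check whether a sequence of pregroup types reduces to the
    sentence type 's' by cancelling adjacent pairs x·x^r and x^l·x."""
    seq = [t for word in word_types for t in word]
    changed = True
    while changed:
        changed = False
        for i in range(len(seq) - 1):
            a, b = seq[i], seq[i + 1]
            if b == a + "^r" or a == b + "^l":
                del seq[i:i + 2]  # contract the adjacent pair
                changed = True
                break
    return seq == ["s"]

# 'Alice drinks water': noun, transitive verb (n^r · s · n^l), noun
print(reduces_to_sentence([["n"], ["n^r", "s", "n^l"], ["n"]]))  # True
```

      <p>In DisCoPy, the same reduction is drawn as a string diagram in which the cancellations appear as cups connecting word wires.</p>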
    </sec>
    <sec id="sec-2">
      <title>2. Applications</title>
      <p>
        Using the two libraries cited in the previous section, DisCoPy [
        <xref ref-type="bibr" rid="ref14 ref15">14, 15</xref>
        ] and Lambeq [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ], we present the process of training two different models, one for sentiment classification and one for word prediction.
      </p>
    </sec>
    <sec id="sec-3">
      <title>2.1. Classification</title>
      <p>
        The classification task in machine learning differentiates between two categories or labels; it can be applied to images, e.g., classifying cats versus dogs, or, as we do in this paper, classifying sentences as having positive or negative sentiment. In this section, we first explain the modeling process and then walk through an example of encoding, transformation, and training. First, the input sentence is encoded into a string diagram using a parser, for example, the DepCCG or BobCat parser [Figure 3]. Second, lambeq allows rewriting the string diagram to reduce complexity,
simplify it, and improve performance in the future training step. Third, the diagram is
parameterized, and we convert it into a circuit using an ansatz. Some examples of the ansätze that lambeq provides are SpiderAnsatz [Figure 4], IQPAnsatz [Figure 5], Sim14Ansatz, Sim15Ansatz, and StronglyEntanglingAnsatz. Fourth, once the parameterisable circuit is
created, a backend compatible with the model must be defined. For example, we can use a
quantum backend like the qiskit [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ] backend with the TketModel, or we can use classical
resources (compatible with Jax [
        <xref ref-type="bibr" rid="ref18">18</xref>
        ] and GPUs) with the PytorchModel. Finally, the trainer offers the same options to customize and adapt the model as a typical classical trainer: loss functions, epochs, optimizers, hyperparameters, and evaluation functions.
      </p>
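      <p>The parameterize-and-train loop described above can be mimicked classically on a toy scale. The sketch below is plain Python, not the lambeq, tket, or Qiskit API: a single input-dependent rotation plays the role of a one-qubit ansatz, the measurement probability sin²(angle/2) serves as the class score, and the squared loss, learning rate, and epoch count are hypothetical choices made for illustration.</p>

```python
import math

def predict(theta, x):
    # One-qubit-style "ansatz": an input-dependent rotation angle;
    # P(measuring |1>) = sin^2(angle / 2) serves as the class score.
    angle = theta[0] + theta[1] * x
    return math.sin(angle / 2) ** 2

def train(data, lr=0.5, epochs=500):
    theta = [0.1, 0.1]  # trainable circuit parameters
    for _ in range(epochs):
        for x, y in data:
            p = predict(theta, x)
            # d p / d angle = sin(angle) / 2; chain rule for squared loss
            grad = 2 * (p - y) * math.sin(theta[0] + theta[1] * x) / 2
            theta[0] -= lr * grad
            theta[1] -= lr * grad * x
    return theta

# Toy "sentiment" features: x near +1 for positive, near -1 for negative.
data = [(1.0, 1), (0.8, 1), (-1.0, 0), (-0.7, 0)]
theta = train(data)
print(round(predict(theta, 1.0), 2), round(predict(theta, -1.0), 2))
```

      <p>In the real pipeline, the rotation angles come from the ansatz applied to the sentence's string diagram, and the score is estimated by sampling the circuit on a backend rather than computed in closed form.</p>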
    </sec>
    <sec id="sec-4">
      <title>2.2. Prediction</title>
      <p>The prediction task is used to obtain an unknown value based on previous historical data. A model is trained with prior data and tries to predict the missing data with a certain probability. One of the areas that widely uses this technique is business intelligence, which tries to predict the future behaviour of the market and make decisions accordingly. In this case, we try to predict the missing word of a sentence based on previous data we input into the model. With the DisCoPy library, we try to predict one of the words that can complete the sentence based on the input. First, we select a sentence with a defined grammatical structure. In this case, we defined the structure ‘noun’ + ‘transitive verb’ + ‘noun’, e.g., ‘Alice drinks water’ [Figure 6]. Second, we removed the word of the sentence we want to predict and replaced it with an unknown value, e.g., ‘?’: ‘Alice drinks ?’ [Figure 7]. Third, we trained the model with this unknown value and a corpus of words that could be the correct answer. Once the model is trained, we check whether the results are correct [Figure 8].</p>
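      <p>For reference, the same missing-word task has a simple classical baseline: score each candidate by how often it completes sentences with the same verb in the training corpus, yielding a probability per candidate analogous to the probabilities the quantum model assigns. The sketch below uses hypothetical data and function names; it is not the DisCoPy model.</p>

```python
from collections import Counter

def predict_missing(corpus, verb, candidates):
    """Score candidates for 'subject verb ?' by corpus frequency."""
    counts = Counter(obj for subj, v, obj in corpus if v == verb)
    total = sum(counts[c] for c in candidates) or 1
    return {c: counts[c] / total for c in candidates}

corpus = [("Alice", "drinks", "water"), ("Bob", "drinks", "water"),
          ("Alice", "drinks", "tea"), ("Alice", "eats", "bread")]
print(predict_missing(corpus, "drinks", ["water", "tea", "bread"]))
# 'water' scores highest
```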
    </sec>
    <sec id="sec-5">
      <title>3. Conclusions and Future Implementations</title>
      <p>As we noted in the introduction and state-of-the-art section, the conceptual and mathematical basis of QNLP has been established in the literature. Building on it, we presented some proofs of concept for QNLP classification and prediction models using libraries that implement this basis: DisCoPy and lambeq. In this paper, we demonstrate that it is possible to perform classification and prediction tasks using quantum computing. Even if the tasks are grammatically simple and tasks of this type have already been solved by classical computing, there is a solid foundation that will enable us to increase the complexity of the tasks in future experiments. On the one hand, regarding the classification task, we use a parser to transform the sentences into string diagrams, allowing us to analyze each sentence’s grammatical structure. Once this step is complete, we transform the string diagram into an ansatz; this step is required to parameterize the object used to train a model. Finally, depending on the backend we use to train the model, we select an ansatz compatible with classical or quantum devices.</p>
      <p>On the other hand, the prediction task has the same first step as classification, converting the input
data, the sentence, into a string diagram that represents the grammatical structure. In this proof of
concept, we remove one word from the sentence and try to predict it. We can observe that the model
predicts the correct word but with a low probability (18%). In future implementations of the examples
presented, for classification tasks, we should increase the complexity of the sentences we use in the
input data of the models or try to classify more than one sentence by joining them using, for example,
DisCoCirc [19]. In the prediction field, our primary goal is to increase the accuracy of this type of task in future experiments to achieve a model comparable to classical models. Also, we can try
to predict more than one word for each sentence or without the grammatical context of the predicted
word. The main challenge in this area is to compute and train a model on a complete, larger text, which is not yet viable due to the number of qubits available in current quantum devices. One of the potential strategies to address this problem, in parallel with the evolution of quantum hardware, is the development of methods that encode the grammatical and semantic meaning of a sentence using as few qubits as possible, generating new QNLP procedures and algorithms [20] or following the
path of research advances in other quantum computing areas [21].</p>
    </sec>
    <sec id="sec-6">
      <title>4. References</title>
      <p>[19] B. Coecke, The mathematics of text structure, 2020. arXiv:1904.03478.
[20] V. Wang-Mascianica, J. Liu, B. Coecke, Distilling text into circuits, 2023. arXiv:2301.10595.
[21] A. Eddins, M. Motta, T. P. Gujarati, S. Bravyi, A. Mezzacapo, C. Hadfield, S. Sheldon, Doubling
the size of quantum simulators by entanglement forging, PRX Quantum 3 (2022) 010309.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>[1] L. K. Grover, A fast quantum mechanical algorithm for database search, in: Proceedings of the Twenty-Eighth Annual ACM Symposium on Theory of Computing, STOC '96, Association for Computing Machinery, New York, NY, USA, 1996, pp. 212-219. doi:10.1145/237814.237866.</mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>[2] P. W. Shor, Polynomial-time algorithms for prime factorization and discrete logarithms on a quantum computer, SIAM Journal on Computing 26 (1997) 1484-1509. doi:10.1137/S0097539795293172.</mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>[3] Y. Huang, H. Lei, X. Li, Q. Zhu, W. Ren, X. Liu, Quantum generative model with variable depth circuit, Computers, Materials and Continua 65 (2020) 445-458. doi:10.32604/cmc.2020.010390.</mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>[4] M. Benedetti, D. Garcia-Pintos, O. Perdomo, V. Leyton-Ortega, Y. Nam, A. Perdomo-Ortiz, A generative modeling approach for benchmarking and training shallow quantum circuits, npj Quantum Information 5 (2019). doi:10.1038/s41534-019-0157-8.</mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>[5] M. Schuld, A. Bocharov, K. M. Svore, N. Wiebe, Circuit-centric quantum classifiers, Physical Review A 101 (2020). doi:10.1103/PhysRevA.101.032308.</mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>[6] M. Srikumar, C. D. Hill, L. C. L. Hollenberg, Clustering and enhanced classification using a hybrid quantum autoencoder, Quantum Science and Technology 7 (2021) 015020. doi:10.1088/2058-9565/ac3c53.</mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>[7] B. Q. Chen, X. F. Niu, Quantum Neural Network with Improved Quantum Learning Algorithm, International Journal of Theoretical Physics 59 (2020) 1978-1991. doi:10.1007/s10773-020-04470-9.</mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>[8] F. Tacchino, S. Mangini, P. K. Barkoutsos, C. Macchiavello, D. Gerace, I. Tavernelli, D. Bajoni, Variational learning for quantum artificial neural networks, IEEE Transactions on Quantum Engineering 2 (2021) 1-10. doi:10.1109/TQE.2021.3062494.</mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>[9] J. Bausch, Classifying data using near-term quantum devices, International Journal of Quantum Information 16 (2018). doi:10.1142/S0219749918400014.</mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>[10] M. L. LaBorde, A. C. Rogers, J. P. Dowling, Finding broken gates in quantum circuits: exploiting hybrid machine learning, Quantum Information Processing 19 (2020). doi:10.1007/s11128-020-02729-y. arXiv:2001.10939.</mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>[11] D. Peral-García, J. Cruz-Benito, F. J. García-Peñalvo, Systematic literature review: Quantum machine learning and its applications, 2022. arXiv:2201.04093.</mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>[12] B. Coecke, M. Sadrzadeh, S. Clark, Mathematical foundations for a compositional distributional model of meaning (2010). doi:10.48550/ARXIV.1003.4394.</mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>[13] J. Lambek, The mathematics of sentence structure, The American Mathematical Monthly 65 (1958) 154-170. doi:10.1080/00029890.1958.11989160.</mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>[14] G. de Felice, A. Toumi, B. Coecke, DisCoPy: Monoidal categories in Python, Electronic Proceedings in Theoretical Computer Science 333 (2021) 183-197. doi:10.4204/eptcs.333.13.</mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>[15] A. Toumi, G. de Felice, R. Yeung, DisCoPy for the quantum computer scientist, 2022. doi:10.48550/ARXIV.2205.05190.</mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>[16] D. Kartsaklis, I. Fan, R. Yeung, A. Pearson, R. Lorenz, A. Toumi, G. de Felice, K. Meichanetzidis, S. Clark, B. Coecke, lambeq: An Efficient High-Level Python Library for Quantum NLP, arXiv preprint arXiv:2110.04236 (2021).</mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>[17] Qiskit contributors, Qiskit: An open-source framework for quantum computing, 2023. doi:10.5281/zenodo.2573505.</mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>[18] J. Bradbury, R. Frostig, P. Hawkins, M. J. Johnson, C. Leary, D. Maclaurin, G. Necula, A. Paszke, J. VanderPlas, S. Wanderman-Milne, Q. Zhang, JAX: composable transformations of Python+NumPy programs, 2018. URL: http://github.com/google/jax.</mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>