<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Learning Potentials of Quantum Systems using Deep Neural Networks</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Arijit Sehanobish</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Hector H. Corzo</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Onur Kara</string-name>
        </contrib>
        <contrib contrib-type="author">
          <string-name>David van Dijk</string-name>
        </contrib>
      </contrib-group>
      <pub-date>
        <year>2021</year>
      </pub-date>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        In the Machine Learning (ML) realm, Neural Networks
(NNs) are among the most widely used and most effective
models for learning and generalizing information from
data. These data–interpretative capabilities have driven
the widespread use of NNs in Natural Language
Processing
        <xref ref-type="bibr" rid="ref71">(Torfi et al. 2020)</xref>
        , Image Classification
        <xref ref-type="bibr" rid="ref37">(Kolesnikov
et al. 2019)</xref>
        , Video Captioning
        <xref ref-type="bibr" rid="ref69">(Sun et al. 2019)</xref>
        and
Reinforcement Learning
        <xref ref-type="bibr" rid="ref18 ref29 ref38 ref6 ref74">(Du and Narasimhan 2019; Higgins
et al. 2016)</xref>
        ; furthermore, recent works have shown the
capabilities of NNs in symbolic reasoning and mathematical
problem solving
        <xref ref-type="bibr" rid="ref18 ref38 ref6 ref74">(Lample and Charton 2019)</xref>
        . In the case
of the natural sciences, applying ML to physics is not new;
several works have been reported
        <xref ref-type="bibr" rid="ref15 ref18 ref23 ref38 ref6 ref70 ref72 ref74">(Toth et al. 2019;
Greydanus, Dzamba, and Yosinski 2019; Cranmer et al. 2020;
Tong et al. 2020)</xref>
        where different authors have combined
ML with Hamilton’s equations of motion to generate
trajectories that obey energy conservation principles and
classical physical laws. In materials science, on the other hand,
ML has proven to be an important interpretative tool for the
computational prediction of new materials (Schleder et al.
      </p>
      <p>*Equal Contribution
Copyright © 2021, for this paper by its authors. Use permitted
under Creative Commons License Attribution 4.0 International (CC
BY 4.0).</p>
      <p>in an unsupervised manner. This proposed QPNN for
learning potential functions was developed from the
formalism underlying the inverse solution of the Schrödinger
equation. Thus, the proposed QPNN opens the possibility
of generating simpler, more succinct functions that can be
used to construct effective Hamiltonians for the description
of a variety of quantum systems using only a small portion
of the available information known about the system. Since
these effective Hamiltonians can be generalized to obtain
other observables, the QPNN may provide unique insights into
complex quantum phenomena where only a small amount of
information is available.</p>
    </sec>
    <sec id="sec-2">
      <title>Theory</title>
      <p>
        The mathematical description of a quantum particle
typically takes the form of a complex function of the spatial
coordinates $\vec{x}$ and time $t$ called the wave–function, $\Psi(\vec{x}, t)$
        <xref ref-type="bibr" rid="ref19 ref48 ref60 ref61 ref63">(Sakurai and Commins 1995; Robinett 1997; Feynman,
Leighton, and Sands 1965; Robinett and Robinett 2006)</xref>
        .
      </p>
      <p>$\Psi(\vec{x}, t)$ is a complex–valued probability amplitude whose
square modulus, $|\Psi(\vec{x}, t)|^2$, corresponds to the probability of
finding the particle described by the wave–function at a
given $\vec{x}$ and $t$. The classically measured value of a
physical observable, however, is not given directly by $\Psi(\vec{x}, t)$ but
by the expectation value of the operator that represents the
desired measurement acting on $\Psi(\vec{x}, t)$. In many scenarios,
wave–functions are obtained as direct solutions of the time–
dependent Schrödinger equation,
$$i\hbar \frac{\partial \Psi(\vec{x}, t)}{\partial t} = \hat{H}\Psi(\vec{x}, t), \quad (1)$$
where $\hbar$ is the reduced Planck constant and $\hat{H}$ is the Hamiltonian
operator of the system, a Hermitian operator acting on
an infinite–dimensional space of $L^2$ functions. Thus, $\hat{H}$ need
not be compact and as such may not have any eigenvalues.
When $\hat{H}$ is time–independent, equation 1 can be reduced to
the following equation,</p>
      <p>$$\hat{H}\,\psi_n(\vec{x}) = E_n\,\psi_n(\vec{x}), \quad (2)$$
where $n$ indicates the quantum state of the system. In
many cases, the physical information contained in the time–
independent wave–function n(~x) may be enough for the
characterization of the system under study.</p>
      <sec id="sec-2-1">
        <title>2.1 Hamiltonian</title>
        <p>The Hamiltonian operator, $\hat{H}$, is fundamental in many
formulations of quantum theory. This operator is expressed as
the sum of the kinetic ($\hat{T}$) and potential ($\hat{V}$) energy operators
for all particles in the quantum system,</p>
        <p>$$\hat{H} = \hat{T} + \hat{V}. \quad (3)$$
Generally, the kinetic energy operator contained in $\hat{H}$
depends only on the second derivatives of the wave–function
with respect to its spatial coordinates. The potential energy
operator, however, depends on the physical circumstances
imposed on the system and varies from system to system.
Thus, equation 3 may be expressed as
$$\hat{H} = -\frac{\hbar^2}{2m}\nabla^2_{\vec{x}} + \hat{V}(\vec{x}, t). \quad (4)$$</p>
        <p>In quantum mechanics, the problem of finding the $\hat{H}$ that
characterizes a given phenomenon can thus be reduced to
formulating the potential operator that contains all the
governing physical descriptors.</p>
      </sec>
      <sec id="sec-2-2">
        <title>Predicting potentials</title>
        <p>
          The usual method for describing systems in quantum
mechanics is to obtain the wave–function of the system as a
solution of the Schrödinger equation. These wave–functions
depend strongly on the Hamiltonian and, in particular, on the
definition of the potential used to describe the system.
However, one could also describe a quantum phenomenon through
the solution of the inverse problem, i.e., by finding an
effective potential or function that contains all the important
physical constraints that generated the observed outcomes.
Inverse problems like this one are common in quantum
mechanics and electronic structure; for example, the field of
DFT
          <xref ref-type="bibr" rid="ref32 ref48 ref63 ref7">(Jensen and Wasserman 2018; Parr and Yang 1995;
Burke, Werschnik, and Gross 2005)</xref>
          has, at its core, this type
of inverse problem. As discussed before, solutions of the
full Schrödinger equation, and thus wave–functions, are
difficult to obtain except for some simple models. The
probability density $|\psi(\vec{x})|^2$, on the other hand, may be
experimentally inferred for several quantum systems. Thus, an
approximated wave–function, $|\psi| = \sqrt{|\psi(\vec{x})|^2}$, may be defined
for the construction of the effective potential.
        </p>
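      <p>The density–to–potential inversion can be illustrated numerically. The sketch below (not from the paper; the grid, interval, and the anchor $V(0) = 0$ are illustrative choices, with $\hbar = m = 1$) recovers the harmonic potential $x^2/2$ from the ground–state density $e^{-x^2}$ alone, using $|\psi| = \sqrt{|\psi|^2}$ and finite differences.</p>

```python
import numpy as np

# Sketch of the density-to-potential inversion for the 1D harmonic
# oscillator ground state (hbar = m = 1). The density |psi|^2 = exp(-x^2)
# is assumed known; the potential is recovered from the rearranged TISE,
# V(x) - V(0) = (1/2)(psi''/psi)(x) - (1/2)(psi''/psi)(0),
# with |psi| = sqrt(density) and central finite differences.
def recover_potential(density, x):
    h = x[1] - x[0]
    psi = np.sqrt(density)                               # proxy wave-function |psi|
    d2 = (psi[2:] - 2 * psi[1:-1] + psi[:-2]) / h**2     # psi'' on interior points
    v = 0.5 * d2 / psi[1:-1]                             # (1/2) psi''/psi
    i0 = np.argmin(np.abs(x[1:-1]))                      # grid point closest to x = 0
    return x[1:-1], v - v[i0]                            # anchor so that V(0) = 0

x = np.linspace(-2.0, 2.0, 801)
density = np.exp(-x**2)                                  # |psi_0|^2 for the QHO
xs, v = recover_potential(density, x)
```

      <p>On this example the recovered curve matches $x^2/2$ to well below grid accuracy, which is the behaviour the QPNN is trained to reproduce with a parametric $U$.</p>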
      </sec>
    </sec>
    <sec id="sec-3">
      <title>Quantum Potential Neural Networks</title>
      <p>In this section, the proposed NN and the loss function
used to compute the effective potentials of various
quantum systems are described. The section is divided into
two parts: (i) time–independent systems and (ii) time–
dependent systems.</p>
      <sec id="sec-3-1">
        <title>Time–independent Systems</title>
        <p>In order to obtain the effective potential, a new
parametric function $U$ is learned in an unsupervised manner. This
parametric function corresponds to the effective potential of
the quantum system and is obtained by implementing a
loss function that obeys the time–independent Schrödinger
equation (TISE, Eq. 2),</p>
        <p>$$\mathcal{L}_{TISE}(\psi) = \left\| D\left( -\frac{\hbar^2}{2m}\frac{\Delta_{\vec{x}}|\psi|}{|\psi|} + U(\vec{x}) \right) \right\|_2^2, \quad (5)$$
where $D$ is the total derivative operator acting on the
multivariate function $-\frac{\hbar^2}{2m}\frac{\Delta_{\vec{x}}|\psi|}{|\psi|} + U(\vec{x})$ and $\|\cdot\|_2$ is the
Frobenius norm. Because of the definition of this loss
function, energy conservation is effectively demanded for time–
independent systems. Since the $U$ function is given by a
differential equation, an initial condition was added to
ensure that a unique function is learned. The initial conditions
used for the different quantum systems are based on the
inherent nature of each system; more information and
explanations about some of these conditions can be found in
the literature <xref ref-type="bibr" rid="ref62">(Romanowski 2007)</xref>. Finally, the loss function
after the consideration of the initial condition reads
$$\mathcal{L}(\psi) = \mathcal{L}_{TISE}(\psi) + (U(\vec{x}) - y)^2, \quad (6)$$
where $\vec{x}$ is some point in the domain of the function and $y$ is
the expected ground–truth value of the true potential at that
point.
        </p>
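      <p>A minimal numerical sketch of the idea behind this loss: when $U$ is the true potential, the local–energy function $-\frac{1}{2}\frac{\Delta_{\vec{x}}|\psi|}{|\psi|} + U(\vec{x})$ is constant, so its derivative, and hence the loss, vanishes; a wrong potential gives a large loss. The grid, interval, and candidate potentials below are illustrative assumptions ($\hbar = m = 1$, no neural network involved).</p>

```python
import numpy as np

# Finite-difference sketch of the TISE loss: for the QHO ground state
# |psi| = exp(-x^2/2), f(x) = -(1/2) psi''/psi + U(x) equals the constant
# energy E when U is the true potential, so the mean squared derivative
# of f (the loss) is ~0; a wrong U yields a large loss.
def tise_loss(psi, u, x):
    h = x[1] - x[0]
    d2 = (psi[2:] - 2 * psi[1:-1] + psi[:-2]) / h**2
    f = -0.5 * d2 / psi[1:-1] + u[1:-1]          # local-energy function
    df = (f[2:] - f[:-2]) / (2 * h)              # total derivative D f
    return float(np.mean(df**2))

x = np.linspace(-2.0, 2.0, 401)
psi = np.exp(-x**2 / 2)                           # |psi| for the QHO ground state
loss_true = tise_loss(psi, 0.5 * x**2, x)         # true potential: loss ~ 0
loss_wrong = tise_loss(psi, x**2, x)              # wrong potential: loss >> 0
```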
        <p>The main observation here is that using $|\psi|$ (instead of $\psi$) to
solve for the potential leads to the correct potential except,
possibly, at finitely many points where $\psi$ changes sign.
However, this does not create any difficulties for training the
proposed model.</p>
      </sec>
      <sec id="sec-3-2">
        <title>Time–dependent Systems</title>
        <p>For time–dependent systems, the formulation of the time–
dependent Schrödinger equation (TDSE) loss reads
$$\mathcal{L}_{TDSE}(\psi) = \left\| \mathrm{Re}\left( \frac{i\hbar\,\partial_t \psi}{\psi} + \frac{\hbar^2}{2m}\frac{\nabla^2_{\vec{x}}\psi}{\psi} \right) - U \right\|_2^2. \quad (7)$$
It is important to mention that complex values appear
naturally in time–dependent solutions of
the Schrödinger equation; however, for the current study, the
probability densities of the considered systems are described
by a Hermitian Hamiltonian, and thus only real
observables were considered, to avoid the handling of complex
values. For the time–dependent results presented in this report,
the QPNN was trained with the full wave–function instead
of just the probability density. In future work, the density–
to–potential results for time–dependent systems will be
explored and discussed in detail.</p>
      </sec>
      <sec id="sec-3-3">
        <title>Model Architecture</title>
        <p>
          For the construction of the NN, a four–layer feedforward
network with hidden sizes of 32, 128, and 128, with a residual
connection between the second and third hidden layers, was used. The
residual connection helps with faster training and stable gradients, and
also results in a smoother loss landscape (Li et al. 2017). The
inputs to the model are the spatial coordinates $\vec{x}$ for time–
independent systems and $(\vec{x}, t)$ for time–dependent systems.
For the network training, 3000 of these coordinates were
randomly sampled from the domain of definition of each
particular system. The model was trained for 500 epochs with the
Adam optimizer
          <xref ref-type="bibr" rid="ref35">(Kingma and Ba 2017)</xref>
          .
        </p>
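      <p>A forward pass through this architecture can be sketched as follows; the tanh activation and the random initialization are assumptions for illustration, since they are not specified here.</p>

```python
import numpy as np

# Sketch of the 4-layer feedforward network with hidden sizes 32, 128, 128
# and a residual connection between the second and third hidden layers.
# Activation (tanh) and initialization are illustrative assumptions.
rng = np.random.default_rng(0)

def make_layer(n_in, n_out):
    return rng.normal(0, 1 / np.sqrt(n_in), (n_in, n_out)), np.zeros(n_out)

W1, b1 = make_layer(1, 32)       # input: spatial coordinate x
W2, b2 = make_layer(32, 128)
W3, b3 = make_layer(128, 128)
W4, b4 = make_layer(128, 1)      # output: potential U(x)

def forward(x):
    h1 = np.tanh(x @ W1 + b1)
    h2 = np.tanh(h1 @ W2 + b2)
    h3 = np.tanh(h2 @ W3 + b3) + h2   # residual connection (128 -> 128)
    return h3 @ W4 + b4

x = rng.uniform(-5, 5, (3000, 1))     # 3000 randomly sampled coordinates
u = forward(x)
```

      <p>The residual connection requires the two layers it bridges to share a width, which is why the second and third hidden sizes are both 128.</p>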
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Related Work</title>
      <p>
        The use of deep learning for understanding physical
phenomena has been an active field of development. In
particular, there is a considerable amount of literature in which
authors have endowed neural networks with classical
Hamiltonian mechanics
        <xref ref-type="bibr" rid="ref12 ref18 ref18 ref18 ref23 ref30 ref38 ref38 ref38 ref6 ref6 ref6 ref70 ref72 ref74 ref74 ref74">(Toth et al. 2019; Greydanus, Dzamba,
and Yosinski 2019; Tong et al. 2020; Iten et al. 2020;
Bondesan and Lamacraft 2019; Zhong, Dey, and Chakraborty
2019; Chmiela et al. 2017)</xref>
        ; conservation of energy and
reversibility in time are the key features of such networks.
There are recent reports extending these results to cases of
damped pendula, i.e., systems where there is dissipation of
energy
        <xref ref-type="bibr" rid="ref78">(Zhong, Dey, and Chakraborty 2020)</xref>
        . In
computational quantum mechanics, deep neural networks have been
implemented to learn representations and extract the
necessary features to predict desired properties from raw
unprocessed data
        <xref ref-type="bibr" rid="ref21 ref35">(Goh, Hodas, and Vishnu 2017)</xref>
        . Recently,
two methods for estimating the density matrix for a
quantum system, the Quantum Maximum Likelihood (QML) and
Quantum Variational Inference (QVI) method, were
introduced
        <xref ref-type="bibr" rid="ref13 ref18 ref38 ref6 ref74">(Cranmer, Golkar, and Pappadopulo 2019)</xref>
        . For these
methods, the authors used a flow based method
        <xref ref-type="bibr" rid="ref34 ref72">(Toth et al.
2019; Jimenez Rezende and Mohamed 2015)</xref>
        to increase the
expressivity of their variational family of density matrices.
The applicability of these methods, however, has been only
validated for the harmonic and anharmonic quantum
oscillator models. Application of deep learning to quantum
mechanics is still in its early stages
        <xref ref-type="bibr" rid="ref12 ref18 ref18 ref18 ref2 ref2 ref21 ref28 ref31 ref35 ref35 ref37 ref38 ref38 ref38 ref41 ref42 ref45 ref51 ref53 ref55 ref57 ref6 ref6 ref6 ref64 ref65 ref66 ref69 ref71 ref74 ref74 ref74 ref8">(Torfi et al. 2020; Raissi,
Perdikaris, and Karniadakis 2017a,b, 2019; Jasinski et al.
2020; Carleo et al. 2019; Amabilino et al. 2019; Unke and
Markus 2019; Schmitz, Godtliebsen, and Christiansen 2019;
Schmidt et al. 2017; Hibat-Allah et al. 2020; Nakajima,
Tanaka, and Hashimoto 2020; Pu, Li, and Chen 2020; Mills,
Spanner, and Tamblyn 2017; Manzhos 2020)</xref>
        . Most of the
deep learning quantum mechanics frameworks introduced so
far focus on either solving the Schrödinger equation
or predicting the trends of specific observables, such as the
system’s energy. Concerning inverse problems, Raissi et al.
introduced the physics–informed neural network for
solving forward and inverse problems involving nonlinear partial
differential equations
        <xref ref-type="bibr" rid="ref18 ref38 ref57 ref6 ref74">(Raissi, Perdikaris, and Karniadakis
2019)</xref>
        . Despite the impressive results reported in that work
in terms of inverse–problem solutions, the deep learning
framework of Raissi et al. is focused only on the
solution of a partial differential equation for the prediction
of pressure profiles in a classical system. On the other hand,
in quantum mechanics, observables may be inferred when a
valid effective potential is known for a given quantum
system; thus, the solution of the density–to–potential inversion
problem to predict effective potential functions
        <xref ref-type="bibr" rid="ref32">(Jensen and
Wasserman 2018)</xref>
        plays an important role in the
understanding of quantum phenomena, and in particular in the
elucidation of the electronic structure of molecules from a
density functional theory perspective. In this regard, Nagai and
coworkers have proposed the neural–network Kohn–Sham
exchange–correlation potential
        <xref ref-type="bibr" rid="ref43">(Nagai et al. 2018)</xref>
        , which
uses a supervised training scheme that draws
information from well–defined potentials and probability densities
to train a NN. However, to the best of our knowledge, there
are no reported works that use deep learning to solve the
density–to–potential inverse problem to systematically
estimate potentials from observations in a completely
unsupervised manner.
      </p>
    </sec>
    <sec id="sec-5">
      <title>Experiments</title>
      <p>
        The performance of the proposed Quantum Potential
Neural Network is validated on four different quantum
systems; one of these systems describes the temporal evolution
of a quantum wave, whereas the other three are examples
of time–independent systems. Among the time–independent
systems, exact analytical solutions of the time–independent
Schrödinger equation can be obtained only for the harmonic
oscillator and the Pöschl–Teller (PT) potentials, whereas for
the H2 molecule, only approximate solutions are attained.
Details about the solutions, physical implications, and
interpretations of these systems can be found in any
standard book on quantum mechanics
        <xref ref-type="bibr" rid="ref48 ref49 ref61 ref63">(Sakurai and Commins
1995; Robinett 1997; Pronchik and Williams 2003)</xref>
        . Finally,
the potentials learned by the QPNN were used to
compute the total energy of each of the systems. The
quantitative results for all the reported experiments are
summarized in table 1. In order to compare the results obtained (in
the form of solutions to a differential equation) via the NN
techniques to those obtained through well–established
approaches, the differential equations were solved numerically
using the well–known and standard Runge–Kutta 4th–order
(RK4) integrator implemented in standard Python
libraries. The RK4 algorithm provides a means of solving
various kinds of differential equations and is generally
considered a robust workhorse for benchmarking new
computational techniques
        <xref ref-type="bibr" rid="ref34 ref39">(Landau, Paez, and Bordeianu 2015)</xref>
        . The
differences in accuracy between the values obtained by the
proposed QPNN method and by the RK4 numerical
integrator were quantified through their root–mean–square error
(RMSE) values. The code is available at https://github.com/
arijitthegame/Quantum-Hamiltonians.
      </p>
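      <p>The RK4 baseline can be sketched as a generic single–step integrator; this is an illustrative implementation, not the exact configuration used for the benchmarks.</p>

```python
# Classic 4th-order Runge-Kutta step for y' = f(t, y), the kind of
# integrator used here as the numerical baseline (generic sketch).
def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(f, t0, y0, t1, n):
    # March from t0 to t1 in n fixed steps, returning y(t1).
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y = rk4_step(f, t, y, h)
        t += h
    return y

# Sanity check on y' = y, y(0) = 1, whose exact solution is e^t.
y1 = integrate(lambda t, y: y, 0.0, 1.0, 1.0, 100)
```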
      <sec id="sec-5-1">
        <title>Density–to–Potential Experiments</title>
        <p>For the density–to–potential experiments, the exact wave–
functions $\psi(\vec{x})$ for the quantum harmonic oscillator and
the PT potential were obtained by solving the time–
independent Schrödinger equation. These wave–functions
were then used to define the probability distribution
$|\psi(\vec{x})|^2$ for each of the systems. In the case of the
hydrogen molecule, the probability density was defined
according to the one–electron 1s orbital function that delineates the
approximated electronic density for the hydrogen molecule
in the Born–Oppenheimer approximation. These probability
densities were used to define, for each of the quantum
systems, an approximated probability amplitude function, $|\psi| =
\sqrt{|\psi(\vec{x})|^2}$. The effective potential function for each of the
systems was obtained by training the QPNN on randomly
selected coordinates, with the approximated probability
amplitudes evaluated at those coordinates and fed into the
loss function.</p>
        <p>Quantum Harmonic Oscillator (QHO): The motion of
the one–dimensional QHO is, perhaps, the simplest
quantum mechanical system whose motion follows a linear
differential equation with constant coefficients. In the QPNN
framework, for the prediction of the QHO potential, the
coordinate variable $\vec{x}$ was randomly sampled from $[-5, 5]$
and used as the input to the model. In the analytical
solution of the time–independent Schrödinger equation, the
wave–functions for the different states of the QHO are given
by Hermite polynomials $H_n$, $n = 0, 1, \ldots$, whereas the
energies corresponding to these states depend on the angular
frequency $\omega$ and are given by $E_n = \hbar\omega(n + 1/2)$. The analytical
wave–functions for the different states of the QHO defined
the probability densities used to train the QPNN. In this case,
the initial condition imposed reflects the fact that at the zero point
of the reference coordinates all the energy in the system is
kinetic, and thus the potential energy at this point is zero,
i.e., $V(0) = 0$. Fig 2 shows the probability density used, the
learned potential, and the energy computed by the QPNN.</p>
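      <p>The Hermite–polynomial wave–functions can be checked against the spectrum $E_n = \hbar\omega(n + 1/2)$. The sketch below (an illustration, with $\hbar = m = \omega = 1$ and $n = 2$) evaluates the local energy $-\psi''/(2\psi) + x^2/2$ at points away from the nodes of $\psi_2$; the check is independent of normalization.</p>

```python
import numpy as np
from numpy.polynomial.hermite import hermval

# Local-energy check for the QHO (hbar = m = omega = 1): for
# psi_n(x) ∝ H_n(x) exp(-x^2/2), the quantity -psi''/(2 psi) + x^2/2
# should equal E_n = n + 1/2 everywhere away from the nodes of psi_n.
def psi(x, n=2):
    coeffs = np.zeros(n + 1)
    coeffs[n] = 1.0                                   # select H_n (physicists')
    return hermval(x, coeffs) * np.exp(-x**2 / 2)     # unnormalized psi_n

def local_energy(x, h=1e-3, n=2):
    d2 = (psi(x + h, n) - 2 * psi(x, n) + psi(x - h, n)) / h**2
    return -0.5 * d2 / psi(x, n) + 0.5 * x**2

e_a = local_energy(0.3)    # away from the nodes of psi_2 at x = ±1/sqrt(2)
e_b = local_energy(1.5)
```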
        <p>
          Pöschl–Teller potential: The PT potential is a
special class of anharmonic potential for which the one–
dimensional Schrödinger equation can be solved in terms of
special functions. This potential may be used to model
vibrational molecular potentials with out–of–plane bending
vibrations
          <xref ref-type="bibr" rid="ref33 ref35 ref68">(Senn 1986; Jia, Zhang, and Peng 2017)</xref>
          and
observables of diatomic potentials
          <xref ref-type="bibr" rid="ref49">(Pronchik and Williams 2003)</xref>
          .
For the PT potential, the wave–functions used to define the
approximated probability amplitude function are the
associated Legendre functions $P_\lambda^\mu$
          <xref ref-type="bibr" rid="ref59">(Riley 1974)</xref>
with energy eigenvalues $E_\mu$
and potential depth $V_0$
          <xref ref-type="bibr" rid="ref27">(Herna´ndez de la Pen˜a 2018)</xref>
          ,
$$\left\{\; \psi_\lambda^\mu(x) = P_\lambda^\mu(\tanh x), \quad E_\mu = -\frac{\mu^2}{2}, \quad V_0 = \frac{\lambda(\lambda + 1)}{2}; \quad \lambda = 1, 2, \ldots, \;\; \mu = 1, 2, \ldots, \lambda \;\right\}. \quad (8)$$
        </p>
        <p>
Details about several suitable boundary terms and initial
conditions for this type of potential are formulated and
reviewed in the literature
          <xref ref-type="bibr" rid="ref1 ref27">(Agboola 2010; Herna´ndez de la
Pen˜a 2018)</xref>
          . The input for this experiment is the spatial
coordinate $\vec{x}$, randomly sampled from $[-3, 3]$. For this
experiment, the wave–function defined by the Legendre function
with $\lambda = 2$ and $\mu = 1$ was employed. From the density, the
initial wave–function takes the form
$$\psi_2^1(x) = |\tanh x|\,\operatorname{sech} x. \quad (9)$$
        </p>
        <p>
          Hydrogen molecule: The H2 molecule is the first
multielectronic system with an approximated probability density
considered. For the training of the QPNN, an ab initio
electron density for the H2 molecule, with an equilibrium bond
length ($x_{re}$) of 1.346 Å and a total energy of −0.958470046928
a.u., was approximated by using a fast and systematic self–
consistent field method
          <xref ref-type="bibr" rid="ref25">(Helgaker, Jorgensen, and Olsen
2014)</xref>
          . This density was computed using three Gaussian
primitive functions for each H atom, where the $\vec{x}$ coordinate
defined on $[-3, 3]$ was chosen as the reference internal
coordinate. The initial conditions for the system were defined
following the same lines as in
          <xref ref-type="bibr" rid="ref52">(Rafi et al. 1995)</xref>
          ; specifically,
the fact that $V(x_{re}) = 0$. Figure 4 shows the probability
density used to train the QPNN as well as the learned
potential and the energy computed using this potential.</p>
        <p>In order to explore the behavior of the QPNN in systems
with dependence on time, the effective potential for a
soliton model was computed. Solitons represent solitary waves
propagating without any temporal evolution in shape or size
when viewed in the reference frame moving with the group
velocity of the waves
          <xref ref-type="bibr" rid="ref76">(Wazwaz 2009)</xref>
          . This type of solitary
wave is particularly important in Bose–Einstein
condensation theory and arises in many contexts, such as the
elevation of the surface of water and the intensity of light in
optical fibers. Solitons form a special class of solutions of
model equations, including the Korteweg–de Vries (KdV)
and the Nonlinear Schrödinger (NLS) equations. In this
particular experiment, the one–dimensional soliton satisfies the
following differential equation:
$$i\frac{\partial \psi}{\partial t} + \frac{\partial^2 \psi}{\partial x^2} + |\psi|^2 \psi = 0. \quad (10)$$
Thus, for this system, the loss function used to train the
QPNN is given by equation 7, where $\psi = 2\,\operatorname{sech}(\sqrt{2}(x -
2t))\,e^{i(x+t)}$ and $U(x, t) = -|\psi|^2$. For this experiment, the
coordinates for the QPNN input were defined on $\vec{x}, \vec{t} \in [0, 1]$.
Figure 5 shows the potential obtained by the QPNN for this
system.
</p>
        <p>For all the time–independent systems but the
harmonic oscillator, the RMSE values for the learned
potentials are comparable in magnitude with those obtained with
the RK4 method. In the case of the energies, when
compared against the exact energy, the RMSE values for both
the QPNN and the RK4 method are of the same order of
magnitude for all systems but the PT potential, for which the RMSE
of the QPNN is one order of magnitude lower than that of the
RK4 method. In terms of the energy values computed for
each of the systems, when referenced to the exact energy
(blue line), a similar trend can be observed for the energies
computed with the RK4 method and the QPNN. If the RK4
method is regarded as a more robust and mathematically
superior method for the calculation of the effective potential
functions, this trend may be interpreted as an indicator of
the reliability of the QPNN. In the case of the soliton model,
although solutions using the RK4 were not feasible due to
the nature of the system, the magnitude of the RMSE values
between the true potential and the learned potential suggests
the same qualitative behaviour as that obtained with the
time–independent models.
        </p>
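      <p>As a sanity check on the soliton used here, one can verify numerically, via central finite differences, that the stated wave–function satisfies the NLS equation (10); the evaluation point and step size below are arbitrary choices.</p>

```python
import numpy as np

# Numerical check that psi(x, t) = 2 sech(sqrt(2)(x - 2t)) e^{i(x+t)}
# satisfies the NLS equation i psi_t + psi_xx + |psi|^2 psi = 0,
# using central finite differences at an arbitrary point.
def psi(x, t):
    return 2 / np.cosh(np.sqrt(2) * (x - 2 * t)) * np.exp(1j * (x + t))

x0, t0, h = 0.3, 0.2, 1e-3
psi_t = (psi(x0, t0 + h) - psi(x0, t0 - h)) / (2 * h)
psi_xx = (psi(x0 + h, t0) - 2 * psi(x0, t0) + psi(x0 - h, t0)) / h**2
residual = 1j * psi_t + psi_xx + abs(psi(x0, t0))**2 * psi(x0, t0)
```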
      </sec>
    </sec>
    <sec id="sec-6">
      <title>Conclusion</title>
      <p>In this work, the QPNN for learning the effective
potential functions of different quantum systems was presented.
This new neural network is capable of learning the effective
potential functions of a variety of systems in a completely
unsupervised manner. The results obtained for the
different studied systems suggest that the QPNN has an accuracy
comparable to the RK4 integrator. The potentials learned
with this new QPNN can be used to calculate observables
like the energy of the system.</p>
      <p>The QPNN formalism presented in this work provides a
foundation for the calculation of potentials of more
complicated N–body systems. In future work, we will
investigate the calculation of potentials for more complicated
many–electron systems that arise from the use of full
configuration interaction densities and wave–functions. The use
of the QPNN was also extended to time–dependent
systems, where the full wave–function was used. In the time–
dependent case, $|\psi|$ cannot simply be taken as a proxy for
a wave–function, as one needs to take into account some
phase information; i.e., the wave–function in that case cannot
be a real–valued function. In future work, the density–to–
potential problem will be analysed by incorporating phase
information to create a suitable proxy wave–function.</p>
    </sec>
    <sec id="sec-7">
      <title>Additional experiments using the full wave–function</title>
      <p>In order to further explore the capabilities and accuracy of
our NN, we describe additional experiments
using the QPNN with the full wave–function as well as Wigner
functions. The formulations of the new loss functions are
described in detail below.</p>
      <sec id="sec-7-1">
        <title>Using the 1D time–independent Schrödinger equation</title>
        <p>
          In this section we consider some simple one–dimensional
time–independent systems. Wave–functions, potential
energies, and energy levels can be found in Table 2. The learned
potentials are reported in figure 6, whereas in figure 7
we show that all proposed models obey energy
conservation laws. The quantitative results for the experiments can
be found in table 3. For the derivation of these wave–
functions and the general properties of these systems, please
see
          <xref ref-type="bibr" rid="ref19 ref48 ref61 ref63">(Sakurai and Commins 1995; Feynman, Leighton, and
Sands 1965; Robinett 1997)</xref>
          . As before, $\hbar$, $m$, and $\omega$ were set
equal to 1. Quantum Harmonic Oscillator: The wave–
functions in this case are given by Hermite polynomials
$H_n$, $n = 0, 1, \ldots$. We chose $x \in [-5, 5]$ as input to our
model. Since $U$ is given by a differential equation
(equation 5), one needs to impose an initial condition to obtain a
unique solution. However, when the output of $U$ is constrained to
$[0, 12.5]$, the need for the initial condition is removed.
Figure 6 and figure 7 (A) show the learned potential and
energy of the system.
        </p>
        <p>The Hydrogen Atom (2p orbital case): The general radial
wave–functions are given by generalized Laguerre
polynomials $L_n^l$, $n = 1, 2, \ldots$ and $l = 0, 1, \ldots, n - 1$, but in this
case the wave–function simplifies to $\psi(r) = \frac{1}{8}\sqrt{\frac{2}{\pi}}\, r\, e^{-r/2}$.
We used $r \in [0.5, 10]$
as input to our model and the initial condition $U(1) = 0$. For
this system the loss function reads $\mathcal{L}'(\psi) = \mathcal{L}(\psi) + U(1)^2$,
where $\mathcal{L}(\psi)$ is given by equation 5. Figure 6 and figure 7 (B)
show the learned potential and energy for this system.</p>
      </sec>
      <sec id="sec-7-2">
        <title>Pöschl–Teller potential</title>
        <p>The wave–functions generated by this potential are the associated
Legendre functions $P_\lambda^\mu(\tanh x)$, $\lambda = 1, 2, 3, \ldots$,
$\mu = 1, 2, \ldots, \lambda - 1, \lambda$. For
simplicity, let $\mu = 1$. We chose $x \in [-3, 3]$ as input to our
model. We imposed the initial condition $U(0) = -\frac{\lambda(\lambda + 1)}{2}$
and used a similar auxiliary loss function to the one defined
above. Figure 6 and figure 7 (C) show the learned potential
and energy of the system.</p>
      </sec>
      <sec id="sec-7-3">
        <title>Particle in a box (perturbed by some external potential)</title>
        <p>Now, we turn our attention to a quantum system where the
Schrödinger equation cannot be solved exactly but can be
treated approximately using perturbation
theory. A spinless particle of mass $m$ was placed in
a one–dimensional square box, $x \in [0, L]$, of length $L$.
The particle was then subjected to the external potential
$V(x) = 10x^2$ as a perturbation. The wave–function for the
perturbed system was approximated by considering first–
order corrections to the unperturbed particle–in–a–box wave–
function,
$$\psi_{pert} \approx \psi_n^0 + \sum_{k \neq n} \frac{\langle \psi_k^0 | V(x) | \psi_n^0 \rangle}{E_n^0 - E_k^0}\, \psi_k^0, \quad (11)$$
where $\psi_n^0$ and $E_n^0$ are the particle in a box’s
unperturbed $n$th–state wave–function and its energy, whereas
$\langle \psi_n^0 | V(x) | \psi_k^0 \rangle$ indicates the following integral:
$$\langle \psi_n^0 | V(x) | \psi_k^0 \rangle = \int (\psi_n^0)^*\, V(x)\, \psi_k^0\, dx, \quad n, k = 1, 2, 3, \ldots \quad (12)$$</p>
        <p>For our computations, the wave–function $\psi_n^0$, obtained as a
solution of the Schrödinger equation for the particle–in–a–box
model, reads
$$\psi_n^0 = \sqrt{\frac{2}{L}}\, \sin\!\left(\frac{n\pi}{L}x\right), \quad n = 1, 2, 3, \ldots, \quad (13)$$
and the energy for the system is given by
$$E_n^0 = \frac{n^2\pi^2\hbar^2}{2mL^2}, \quad n = 1, 2, 3, \ldots \quad (14)$$</p>
        <p>Here, we use the wave–function corrected only up to first
order and $x \in [0, 1]$. In this experiment we not only learned the
potential but also the perturbed wave–function, based only
on the system’s initial conditions without perturbation. We
used two neural networks, one to learn the potential and the
other to learn the perturbed wave–function. The perturbed
wave–function was learned in a supervised manner, whereas
the potential was learned in an unsupervised manner. If $W$
is the neural network learning the perturbed wave–function
$\psi_{pert}$, then our auxiliary loss function becomes
$$\mathcal{L}' = \|W - \psi_{pert}\|_2^2 + \mathcal{L}, \quad (15)$$
where $\mathcal{L}$ is the time–independent Schrödinger loss defined
in the main text, which was used to learn the potential and
was evaluated with the perturbed wave–function. Figure 8 shows
the results for this system. It appears that energy is not
conserved for this system, but that is merely due to the
truncated, first–order perturbation approximation.</p>
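      <p>The first–order machinery above can be checked numerically. The sketch below (with the illustrative choices $L = 1$ and $\hbar = m = 1$) computes the first–order energy shift $\langle \psi_1^0 | V | \psi_1^0 \rangle$ for $V(x) = 10x^2$ by quadrature and compares it with the closed form $10\left(\frac{1}{3} - \frac{1}{2\pi^2}\right)$.</p>

```python
import numpy as np

# First-order perturbation check for the particle in a box (L = 1,
# hbar = m = 1) with V(x) = 10 x^2: the first-order energy shift is
# <psi_1^0 | V | psi_1^0>, computed here by trapezoidal quadrature and
# compared with the closed form 10 (1/3 - 1/(2 pi^2)).
def trapz(f, x):
    return float(np.sum((f[1:] + f[:-1]) / 2 * np.diff(x)))

x = np.linspace(0.0, 1.0, 10001)
psi1 = np.sqrt(2.0) * np.sin(np.pi * x)          # unperturbed ground state
shift_numeric = trapz(psi1 * 10 * x**2 * psi1, x)
shift_exact = 10 * (1.0 / 3.0 - 1.0 / (2 * np.pi**2))
```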
      </sec>
      <sec id="sec-7-4">
        <title>2D Harmonic Oscillator</title>
        <p>Unlike other NNs, our QPNN scales easily and quickly to
higher dimensions. For the 2–dimensional Harmonic
Oscillator, the wave–function is defined as a product of two
Hermite polynomials (such as the ones defined in the main text).</p>
        <p>We chose $x, y \in [0, 1]$ as input to our model and constrained
our output to $[0, 1]$. The loss function in this case is exactly
the one defined for the 1D Harmonic Oscillator. Figure 9
shows the results for this system; as this figure shows, the
learned energy is a good approximation to the total energy
(the $z$ scale is chosen from $[4.99, 5.01]$).</p>
      </sec>
    </sec>
    <sec id="sec-9">
      <title>Motivation behind the time–independent Schrödinger Loss</title>
      <p>We now present a brief, yet complete, explanation of our
time–independent Schrödinger loss function. The
Hamiltonian Ĥ is the sum of the kinetic energy T̂ and the potential
energy V̂, where the kinetic energy is given by the Laplacian
operator,</p>
      <p>Ĥ = −(ħ²/2m) ∇_x² + V̂(x). (16)</p>
      <p>For the time–independent case, Schrödinger's equation
boils down to</p>
      <p>Ĥψ = Eψ, (17)</p>
      <p>where E is the energy of the system. For simplicity, let ħ =
m = 1. By using equation 16, equation 17 reads</p>
      <p>(−(1/2) ∇_x² + V̂(x)) ψ = E ψ. (18)</p>
      <p>Dividing the above equation by ψ yields</p>
      <p>−(1/(2ψ)) ∇_x² ψ + V̂(x) = E. (19)</p>
      <p>Since the energy of a given system is a scalar quantity,
the derivative with respect to x of the left hand side of
equation 19 is zero, which defines the time–independent
Schrödinger loss function.</p>
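<p>The scalar-energy observation above translates directly into a loss. The sketch below uses finite differences on a grid in place of automatic differentiation; the grid, the trial state, and the harmonic potential are illustrative choices, not the paper's setup.</p>

```python
import numpy as np

def schrodinger_loss(psi, V, x):
    """Sketch of the time-independent Schrodinger loss (hbar = m = 1).
    For an eigenstate, the local energy
        E(x) = -psi''(x) / (2 psi(x)) + V(x)      # equation (19)
    is constant in x, so we penalise its spatial derivative."""
    dx = x[1] - x[0]
    lap = (psi[2:] - 2.0 * psi[1:-1] + psi[:-2]) / dx**2  # psi'' on interior points
    local_E = -0.5 * lap / psi[1:-1] + V[1:-1]
    dE_dx = np.diff(local_E) / dx                          # zero for an eigenstate
    return np.mean(dE_dx**2), local_E

# Check with the harmonic-oscillator ground state psi = exp(-x^2/2),
# V = x^2/2: the local energy should be flat at E = 1/2.
x = np.linspace(-3.0, 3.0, 2001)
loss, local_E = schrodinger_loss(np.exp(-x**2 / 2.0), 0.5 * x**2, x)
```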
    </sec>
    <sec id="sec-10">
      <title>C Wigner Functions</title>
      <p>
        An alternative formulation of quantum dynamics may be
given by the Wigner function
        <xref ref-type="bibr" rid="ref11 ref17 ref18 ref38 ref6 ref74">(Curtright, Fairlie, and Zachos
1998; Chen, Xiong, and Shao 2019)</xref>
        . The Wigner function,
W(x, p, t), is a phase space distribution function which
behaves similarly to the position |ψ(x)|² and the momentum
|ψ(p)|² distribution functions
        <xref ref-type="bibr" rid="ref9">(Case 2008)</xref>
        . Unlike wave–functions,
Wigner functions are real valued and bounded.
However, contrary to probability distributions, W(x, p, t)
can take negative values. Thus, the Wigner distribution is
termed a quasi–probability distribution and in a sense
loses some of its classical appeal. Using Schrödinger's
equation (equation 1) and the Taylor expansion, the time
evolution of the Wigner function is given by an infinite
order partial differential equation called the Wigner–Moyal
equation
        <xref ref-type="bibr" rid="ref9">(Case 2008)</xref>
        .
      </p>
      <p>∂W/∂t = −(p/m) ∂W/∂x + Σ_{s=0}^{∞} [(−ħ²/4)^s / (2s+1)!] (∂^{2s+1}U/∂x^{2s+1}) (∂^{2s+1}W/∂p^{2s+1}). (20)</p>
      <sec id="sec-10-1">
        <title>Learning Potentials from Wigner Functions</title>
        <p>In the case of the Wigner function, our Neural Network was
trained by implementing a truncated Wigner–Moyal loss,</p>
        <p>L_Wigner(θ) = || ∂W/∂t + (p/m) ∂W/∂x − Σ_{s=0}^{k} [(−ħ²/4)^s / (2s+1)!] (∂^{2s+1}U/∂x^{2s+1}) (∂^{2s+1}W/∂p^{2s+1}) ||₂², (21)</p>
        <p>where for all our experiments k = 0, 1. The case where k =
0 is known as the Liouville equation. However, we note that
equation 21 determines U only up to a constant. Thus, an initial
condition depending on each individual system was added.</p>
      </sec>
      <sec id="sec-10-2">
        <title>Experiments with the Wigner functions</title>
        <p>
          Harmonic Oscillator : The Wigner function for the
harmonic oscillator has the following form
          <xref ref-type="bibr" rid="ref9">(Case 2008)</xref>
          :
W(x, p, t) = e^{−(x²+p²)} (x² + p² + √2 x cos t − √2 p sin t).
Since ∂ⁿU/∂xⁿ = 0 for all n ≥ 3, the Wigner–Moyal equation in
this case reduces to the classical Liouville equation. Let
x, p, t ∈ [0, 1], with x the input to the model. The initial
condition for this system is U(0) = 0 and the loss function
reads
        </p>
        <p>L(θ) = L_Wigner(θ) + U(0)²,
where L_Wigner is given by equation 21. Figure 10 (A) shows
the potential learned by the model.</p>
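<p>The quoted Wigner function can be checked against the k = 0 (Liouville) truncation of equation 21. The finite-difference sketch below assumes the harmonic potential U(x) = x²/2 with m = 1, which is consistent with the quoted W and with U(0) = 0; the evaluation point and step size are arbitrary.</p>

```python
import numpy as np

def W(x, p, t):
    """Harmonic-oscillator Wigner function quoted above."""
    return np.exp(-(x**2 + p**2)) * (x**2 + p**2
                                     + np.sqrt(2.0) * x * np.cos(t)
                                     - np.sqrt(2.0) * p * np.sin(t))

def liouville_residual(x, p, t, h=1e-4):
    """k = 0 truncation of equation 21 with U(x) = x^2/2 (so dU/dx = x):
    residual = dW/dt + p dW/dx - x dW/dp, via central differences."""
    dW_dt = (W(x, p, t + h) - W(x, p, t - h)) / (2.0 * h)
    dW_dx = (W(x + h, p, t) - W(x - h, p, t)) / (2.0 * h)
    dW_dp = (W(x, p + h, t) - W(x, p - h, t)) / (2.0 * h)
    return dW_dt + p * dW_dx - x * dW_dp

# The residual vanishes: W depends only on the invariants x^2 + p^2
# and x cos t - p sin t of the classical flow.
r = liouville_residual(0.4, 0.6, 0.5)
```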
      </sec>
      <sec id="sec-10-3">
        <title>Pöschl–Teller potential</title>
        <p>
          The Wigner function in this case
          <xref ref-type="bibr" rid="ref11 ref18 ref38 ref6 ref74">(Chen, Xiong, and Shao 2019)</xref>
          is given by
        </p>
        <p>
          W_{2,1,0}(x, k, t).
The Wigner function is a real–valued bounded function.
Thus, by breaking the integral in equation 22 into real and
complex parts, we only focus on the real part. Using Euler's
formula, we obtain the real part g_{2,1,0}(x, k, t).
Note that the integral in equation 23 is invariant under the
change of variable y → −y. This implies that in order to
calculate g_{2,1,0}(x, k, t), we only have to integrate from 0
to ∞ and multiply the integral by 2. Our final
simplification comes from studying the decay properties of the Wigner
functions. Using sech(x) = 2/(eˣ + e⁻ˣ) and sinh(x) = (eˣ − e⁻ˣ)/2,
we found that the integrand in equation 23 behaves like
O(e⁻ʸ) (resp. O(eʸ)) as y → ∞ (resp. y → −∞). We
established a threshold of 10⁻⁹ to truncate the integral from the
positive real axis to a bounded interval, which gives the
truncated form f_{2,1,0}(x, k, t).
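</p>
        <p>The symmetry-and-decay truncation described above can be sketched numerically. The snippet below is a generic illustration, not the paper's integrand (equations 22–23 involve the full Wigner kernel); only the sech-type decay and the 10⁻⁹ threshold are taken from the text.</p>

```python
import numpy as np

def truncated_symmetric_integral(f, tol=1e-9, dy=1e-3):
    """Integrate an even integrand with O(e^-y) tails over the real line:
    use f(-y) = f(y) to restrict to [0, R] and double the result, where R
    is chosen so the integrand has decayed below `tol`."""
    R = 1.0
    while abs(f(R)) > tol:          # exponential decay: R grows quickly
        R *= 2.0
    y = np.arange(0.0, R + dy, dy)
    fy = f(y)
    half = dy * (fy.sum() - 0.5 * (fy[0] + fy[-1]))  # composite trapezoid rule
    return 2.0 * half

# Illustration with a sech integrand; the integral of sech over R is pi.
val = truncated_symmetric_integral(lambda y: 1.0 / np.cosh(y))
```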
The Wigner method to study the time–frequency
properties of dynamical systems involves taking the partial
derivatives with respect to time of the Wigner function. These
derivatives of the Wigner function yield what is known as
the Wigner–Moyal equation. The physical interpretations,
numerical difficulties and approximations of the Wigner–
Moyal equation have been widely discussed in the
literature; for information about the mathematical challenges
associated with the Wigner–Moyal equation, we recommend
readers to consult these references
          <xref ref-type="bibr" rid="ref11 ref17 ref18 ref20 ref22 ref26 ref36 ref38 ref4 ref6 ref74 ref9">(Chen, Xiong, and Shao
2019; Case 2008; Galleani and Cohen 2002; Heller 1976;
Curtright, Fairlie, and Zachos 1998; Klimov, Sainz, and
Romero 2020; Athanassoulis 2008; Gomes and Silva 2008)</xref>
          .</p>
        <p>In this case, the potential is U(x) = 3 sech²(x), which
is infinitely differentiable. For this experiment we chose
x ∈ [0, 1] as input to our model and approximated the
infinite order PDE (equation 20) by equation 21. Thus, one
cannot assume that any non–steady state solution predicted
by the truncated Wigner function is immediately valid, as
it can be shown that higher order quantum corrections are
responsible for quantum mechanical phase–space behavior.
The 0th order truncation matches the potential in a small
neighborhood of 0. Figure 10 summarizes some of our
findings.
</p>
        <p>Our Neural Network is a 4-layer feedforward network with a
residual connection between the second and the third layers.
The activation and the scaling in the final layers varied from
experiment to experiment. Our main motivation for scaling
and using different activations is to show that an appropriate
architecture can perfectly learn the correct potential
without an initial condition. All the models were trained for 1000
epochs. Table 4 shows the activation, scaling and the size
of the training data for each of the studied systems: the
Harmonic Oscillator, the Pöschl–Teller potential, the Radial
Hydrogen atom, the 2D Harmonic Oscillator, the Particle in
a Box, the Soliton, and the Harmonic Oscillator from the
Wigner function. All the training data was randomly sampled
from the appropriate domains and trained in a minibatch
fashion with batch size of 32.</p>
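<p>The architecture just described can be sketched as follows. The hidden width, the tanh activation, and the sigmoid output scaling are illustrative assumptions (the paper varies activation and scaling per experiment); only the depth, the residual connection between the second and third layers, the [0, 1] output constraint, and the batch size of 32 come from the text.</p>

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    """Small random weights and zero biases for one dense layer."""
    return rng.normal(0.0, 0.1, (n_in, n_out)), np.zeros(n_out)

# 4 dense layers; hidden width 64 is an assumed choice, scalar in/out.
W1, b1 = init_layer(1, 64)
W2, b2 = init_layer(64, 64)
W3, b3 = init_layer(64, 64)
W4, b4 = init_layer(64, 1)

def qpnn_forward(x):
    """Forward pass: 4-layer feedforward net with a residual connection
    between the second and third layers, sigmoid scaling of the output."""
    h1 = np.tanh(x @ W1 + b1)
    h2 = np.tanh(h1 @ W2 + b2)
    h3 = np.tanh(h2 @ W3 + b3) + h2       # residual connection
    out = h3 @ W4 + b4
    return 1.0 / (1.0 + np.exp(-out))     # constrain output to [0, 1]

x = np.linspace(0.0, 1.0, 32).reshape(-1, 1)   # one minibatch of 32 points
y = qpnn_forward(x)
```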
      </sec>
    </sec>
    <sec id="sec-11">
      <title>E Additional Figures</title>
      <p>Some additional figures of the wave–functions and Wigner
functions used in our experiments.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          <string-name>
            <surname>Agboola</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          <year>2010</year>
          .
          <article-title>Solutions to the Modified Pöschl-Teller Potential in D-Dimensions</article-title>
          .
          <source>Chinese Physics Letters</source>
          <volume>27</volume>
          (
          <issue>4</issue>
          ):
          <fpage>040301</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          <string-name>
            <surname>Amabilino</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Bratholm</surname>
            ,
            <given-names>L. A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Bennie</surname>
            ,
            <given-names>S. J.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Vaucher</surname>
            ,
            <given-names>A. C.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Reiher</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Glowacki</surname>
            ,
            <given-names>D. R.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>Training neural nets to learn reactive potential energy surfaces using interactive quantum chemistry in virtual reality</article-title>
          .
          <source>The Journal of Physical Chemistry A</source>
          <volume>123</volume>
          (
          <issue>20</issue>
          ):
          <fpage>4486</fpage>
          -
          <lpage>4499</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name>
            <surname>Aster</surname>
            ,
            <given-names>R. C.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Borchers</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Thurber</surname>
            ,
            <given-names>C. H.</given-names>
          </string-name>
          <year>2018</year>
          .
          <article-title>Parameter estimation and inverse problems</article-title>
          . Elsevier.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <string-name>
            <surname>Athanassoulis</surname>
            ,
            <given-names>A. G.</given-names>
          </string-name>
          <year>2008</year>
          .
          <article-title>Exact equations for smoothed Wigner transforms and homogenization of wave propagation</article-title>
          .
          <source>Applied and Computational Harmonic Analysis</source>
          <volume>24</volume>
          (
          <issue>3</issue>
          ):
          <fpage>378</fpage>
          -
          <lpage>392</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          <string-name>
            <surname>Beals</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Greiner</surname>
            ,
            <given-names>P. C.</given-names>
          </string-name>
          <year>2009</year>
          .
          <article-title>Strings, waves, drums: spectra and inverse problems</article-title>
          .
          <source>Analysis and Applications</source>
          <volume>7</volume>
          (
          <issue>02</issue>
          ):
          <fpage>131</fpage>
          -
          <lpage>183</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          <string-name>
            <surname>Bondesan</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Lamacraft</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>Learning Symmetries of Classical Integrable Systems</article-title>
          . arXiv preprint arXiv:
          <year>1906</year>
          .04645 .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          <string-name>
            <surname>Burke</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Werschnik</surname>
          </string-name>
          , J.; and
          <string-name>
            <surname>Gross</surname>
            ,
            <given-names>E. K. U.</given-names>
          </string-name>
          <year>2005</year>
          .
          <article-title>Timedependent density functional theory: Past, present, and future</article-title>
          .
          <source>The Journal of Chemical Physics</source>
          <volume>123</volume>
          (
          <issue>6</issue>
          ):
          <fpage>062206</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          <string-name>
            <surname>Carleo</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Cirac</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Cranmer</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Daudet</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Schuld</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Tishby</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Vogt-Maranto</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Zdeborová</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>Machine learning and the physical sciences</article-title>
          .
          <source>Reviews of Modern Physics</source>
          <volume>91</volume>
          (
          <issue>4</issue>
          ):
          <fpage>045002</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <string-name>
            <surname>Case</surname>
            ,
            <given-names>W. B.</given-names>
          </string-name>
          <year>2008</year>
          .
          <article-title>Wigner functions and Weyl transforms for pedestrians</article-title>
          .
          <source>American Journal of Physics</source>
          <volume>76</volume>
          (
          <issue>10</issue>
          ):
          <fpage>937</fpage>
          -
          <lpage>946</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          <string-name>
            <surname>Chadan</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ; and Sabatier,
          <string-name>
            <surname>P. C.</surname>
          </string-name>
          <year>2012</year>
          .
          <article-title>Inverse problems in quantum scattering theory</article-title>
          . Springer Science &amp; Business Media.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          <string-name>
            <surname>Chen</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Xiong</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Shao</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>Numerical methods for the Wigner equation with unbounded potential</article-title>
          .
          <source>Journal of Scientific Computing</source>
          <volume>79</volume>
          (
          <issue>1</issue>
          ):
          <fpage>345</fpage>
          -
          <lpage>368</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          <string-name>
            <surname>Chmiela</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Tkatchenko</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Sauceda</surname>
            ,
            <given-names>H. E.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Poltavsky</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ; Schütt, K. T.; and Müller, K.-R.
          <year>2017</year>
          .
          <article-title>Machine learning of accurate energy-conserving molecular force fields</article-title>
          .
          <source>Science Advances</source>
          <volume>3</volume>
          (
          <issue>5</issue>
          ):
          <fpage>e1603015</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          <string-name>
            <surname>Cranmer</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Golkar</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Pappadopulo</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>Inferring the quantum density matrix with machine learning</article-title>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          arXiv preprint arXiv:
          <year>1904</year>
          .05903 .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          <string-name>
            <surname>Cranmer</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Greydanus</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Hoyer</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ; Battaglia,
          <string-name>
            <given-names>P.</given-names>
            ;
            <surname>Spergel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            ; and
            <surname>Ho</surname>
          </string-name>
          ,
          <string-name>
            <surname>S.</surname>
          </string-name>
          <year>2020</year>
          .
          <article-title>Lagrangian Neural Networks</article-title>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          arXiv preprint arXiv:
          <year>2003</year>
          .04630 .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          <string-name>
            <surname>Curtright</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Fairlie</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Zachos</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          <year>1998</year>
          .
          <article-title>Features of time-independent Wigner functions</article-title>
          .
          <source>Physical Review D</source>
          <volume>58</volume>
          (
          <issue>2</issue>
          ):
          <fpage>025002</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          <string-name>
            <surname>Du</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Narasimhan</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>Task-agnostic dynamics priors for deep reinforcement learning</article-title>
          .
          <source>In International Conference on Machine Learning</source>
          ,
          <fpage>1696</fpage>
          -
          <lpage>1705</lpage>
          . PMLR.
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          <string-name>
            <surname>Feynman</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ; Leighton, R.; and Sands,
          <string-name>
            <surname>M.</surname>
          </string-name>
          <year>1965</year>
          .
          <source>The Feynman Lectures on Physics Vol. III, chap. 21, sec. 21-9.</source>
        </mixed-citation>
      </ref>
      <ref id="ref20">
        <mixed-citation>
          <string-name>
            <surname>Galleani</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ; and Cohen,
          <string-name>
            <surname>L.</surname>
          </string-name>
          <year>2002</year>
          .
          <article-title>Approximation of the Wigner distribution for dynamical systems governed by differential equations</article-title>
          .
          <source>EURASIP Journal on Advances in Signal Processing</source>
          <year>2002</year>
          (
          <volume>1</volume>
          ):
          <fpage>514609</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          <string-name>
            <surname>Goh</surname>
            ,
            <given-names>G. B.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Hodas</surname>
            ,
            <given-names>N. O.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Vishnu</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <year>2017</year>
          .
          <article-title>Deep learning for computational chemistry</article-title>
          .
          <source>Journal of computational chemistry</source>
          <volume>38</volume>
          (
          <issue>16</issue>
          ):
          <fpage>1291</fpage>
          -
          <lpage>1307</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          <string-name>
            <surname>Gomes</surname>
            ,
            <given-names>D. A.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Silva</surname>
            ,
            <given-names>J. D.</given-names>
          </string-name>
          <year>2008</year>
          .
          <article-title>On the Wigner transform of solutions to the Schrodinger equation</article-title>
          .
          <source>São Paulo Journal of Mathematical Sciences</source>
          <volume>2</volume>
          (
          <issue>1</issue>
          ):
          <fpage>85</fpage>
          -
          <lpage>97</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref23">
        <mixed-citation>
          <string-name>
            <surname>Greydanus</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ; Dzamba,
          <string-name>
            <given-names>M.</given-names>
            ; and
            <surname>Yosinski</surname>
          </string-name>
          ,
          <string-name>
            <surname>J.</surname>
          </string-name>
          <year>2019</year>
          .
          <article-title>Hamiltonian neural networks</article-title>
          .
          <source>In Advances in Neural Information Processing Systems</source>
          ,
          <volume>15353</volume>
          -
          <fpage>15363</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref24">
        <mixed-citation>
          <string-name>
            <surname>Groetsch</surname>
            ,
            <given-names>C. W.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Groetsch</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          <year>1993</year>
          .
          <article-title>Inverse problems in the mathematical sciences</article-title>
          , volume
          <volume>52</volume>
          . Springer.
        </mixed-citation>
      </ref>
      <ref id="ref25">
        <mixed-citation>
          <string-name>
            <surname>Helgaker</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Jorgensen</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Olsen</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          <year>2014</year>
          .
          <article-title>Molecular electronic-structure theory</article-title>
          . John Wiley &amp; Sons.
        </mixed-citation>
      </ref>
      <ref id="ref26">
        <mixed-citation>
          <string-name>
            <surname>Heller</surname>
            ,
            <given-names>E. J.</given-names>
          </string-name>
          <year>1976</year>
          .
          <article-title>Wigner phase space method: Analysis for semiclassical applications</article-title>
          .
          <source>The Journal of Chemical Physics</source>
          <volume>65</volume>
          (
          <issue>4</issue>
          ):
          <fpage>1289</fpage>
          -
          <lpage>1298</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref27">
        <mixed-citation>
          <string-name>
            <surname>Hernández de la Peña</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          <year>2018</year>
          .
          <article-title>A Simplified Pöschl-Teller Potential: An Instructive Exercise for Introductory Quantum Mechanics</article-title>
          .
          <source>Journal of Chemical Education</source>
          <volume>95</volume>
          (
          <issue>11</issue>
          ):
          <fpage>1989</fpage>
          -
          <lpage>1995</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref28">
        <mixed-citation>
          <string-name>
            <surname>Hibat-Allah</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Ganahl</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Hayward</surname>
            ,
            <given-names>L. E.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Melko</surname>
          </string-name>
          , R. G.; and
          <string-name>
            <surname>Carrasquilla</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          <year>2020</year>
          .
          <article-title>Recurrent neural network wave functions</article-title>
          .
          <source>Physical Review Research</source>
          <volume>2</volume>
          (
          <issue>2</issue>
          ):
          <fpage>023358</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref29">
        <mixed-citation>
          <string-name>
            <surname>Higgins</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Matthey</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Glorot</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Pal</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Uria</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Blundell</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Mohamed</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Lerchner</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <year>2016</year>
          .
          <article-title>Early visual concept learning with unsupervised deep learning</article-title>
          .
          <source>arXiv preprint arXiv:1606</source>
          .
          <fpage>05579</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref30">
        <mixed-citation>
          <string-name>
            <surname>Iten</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ; Metger,
          <string-name>
            <given-names>T.</given-names>
            ;
            <surname>Wilming</surname>
          </string-name>
          , H.;
          <string-name>
            <surname>del Rio</surname>
            , L.; and Renner,
            <given-names>R.</given-names>
          </string-name>
          <year>2020</year>
          .
          <article-title>Discovering Physical Concepts with Neural Networks</article-title>
          .
          <source>Physical Review Letters</source>
          <volume>124</volume>
          (
          <issue>1</issue>
          ):
          <fpage>010508</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref31">
        <mixed-citation>
          <string-name>
            <surname>Jasinski</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Montaner</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ; Forrey,
          <string-name>
            <given-names>R.</given-names>
            ;
            <surname>Yang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            ;
            <surname>Stancil</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            ;
            <surname>Balakrishnan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            ;
            <surname>Dai</surname>
          </string-name>
          ,
          <string-name>
            <surname>J.</surname>
          </string-name>
          ; Vargas-Hernández, R.; and Krems,
          <string-name>
            <surname>R.</surname>
          </string-name>
          <year>2020</year>
          .
          <article-title>Machine learning corrected quantum dynamics calculations</article-title>
          .
          <source>Physical Review Research</source>
          <volume>2</volume>
          (
          <issue>3</issue>
          ):
          <fpage>032051</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref32">
        <mixed-citation>
          <string-name>
            <surname>Jensen</surname>
            ,
            <given-names>D. S.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Wasserman</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <year>2018</year>
          .
          <article-title>Numerical methods for the inverse problem of density functional theory</article-title>
          .
          <source>International Journal of Quantum Chemistry</source>
          <volume>118</volume>
          (
          <issue>1</issue>
          ):
          <fpage>e25425</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref33">
        <mixed-citation>
          <string-name>
            <surname>Jia</surname>
            ,
            <given-names>C.-S.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>L.-H.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Peng</surname>
            ,
            <given-names>X.-L.</given-names>
          </string-name>
          <year>2017</year>
          .
          <article-title>Improved Pöschl-Teller potential energy model for diatomic molecules</article-title>
          .
          <source>International Journal of Quantum Chemistry</source>
          <volume>117</volume>
          (
          <issue>14</issue>
          ):
          <fpage>e25383</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref34">
        <mixed-citation>
          <string-name>
            <surname>Jimenez Rezende</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Mohamed</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <year>2015</year>
          .
          <article-title>Variational Inference with Normalizing Flows</article-title>
          .
          <source>arXiv preprint arXiv:1505</source>
          .
        </mixed-citation>
      </ref>
      <ref id="ref35">
        <mixed-citation>
          <string-name>
            <surname>Kingma</surname>
            ,
            <given-names>D. P.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Ba</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          <year>2017</year>
          .
          <article-title>Adam: A Method for Stochastic Optimization</article-title>
          .
          <source>arXiv preprint arXiv:1412</source>
          .
          <fpage>6980</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref36">
        <mixed-citation>
          <string-name>
            <surname>Klimov</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Sainz</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Romero</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          <year>2020</year>
          .
          <article-title>Truncated Wigner approximation as non-positive Kraus map</article-title>
          .
          <source>Physica Scripta</source>
          <volume>95</volume>
          (
          <issue>7</issue>
          ):
          <fpage>074006</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref37">
        <mixed-citation>
          <string-name>
            <surname>Kolesnikov</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Beyer</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Zhai</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Puigcerver</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Yung</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Gelly</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Houlsby</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>Big Transfer (BiT): General Visual Representation Learning</article-title>
          .
          <source>arXiv preprint arXiv:1912</source>
          .
          <fpage>11370</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref38">
        <mixed-citation>
          <string-name>
            <surname>Lample</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Charton</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>Deep Learning for Symbolic Mathematics</article-title>
          .
          <source>arXiv preprint arXiv:1912</source>
          .01412 .
        </mixed-citation>
      </ref>
      <ref id="ref39">
        <mixed-citation>
          <string-name>
            <surname>Landau</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Paez</surname>
            ,
            <given-names>M. J.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Bordeianu</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          <year>2015</year>
          .
          <article-title>Computational Physics: Problem Solving with Python</article-title>
          .
          <source>Wiley, 3rd edition.</source>
        </mixed-citation>
      </ref>
      <ref id="ref40">
        <mixed-citation>
          2017.
          <article-title>Visualizing the Loss Landscape of Neural Nets</article-title>
          .
          <source>arXiv preprint arXiv:1712</source>
          .
          <fpage>09913</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref41">
        <mixed-citation>
          <string-name>
            <surname>Manzhos</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <year>2020</year>
          .
          <article-title>Machine learning for the solution of the Schrödinger equation</article-title>
          .
          <source>Machine Learning: Science and Technology</source>
          <volume>1</volume>
          (
          <issue>1</issue>
          ):
          <fpage>013002</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref42">
        <mixed-citation>
          <string-name>
            <surname>Mills</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Spanner</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Tamblyn</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          <year>2017</year>
          .
          <article-title>Deep learning and the Schrödinger equation</article-title>
          .
          <source>Physical Review A</source>
          <volume>96</volume>
          (
          <issue>4</issue>
          ):
          <fpage>042113</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref43">
        <mixed-citation>
          <string-name>
            <surname>Nagai</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Akashi</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Sasaki</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Tsuneyuki</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref44">
        <mixed-citation>
          <article-title>Neural-network Kohn-Sham exchange-correlation potential and its out-of-training transferability</article-title>
          .
          <source>The Journal of Chemical Physics</source>
          <volume>148</volume>
          (
          <issue>24</issue>
          ):
          <fpage>241737</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref45">
        <mixed-citation>
          <string-name>
            <surname>Nakajima</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Tanaka</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Hashimoto</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          <year>2020</year>
          .
          <article-title>Neural Schrödinger Equation: Physical Law as Neural Network</article-title>
          .
        </mixed-citation>
      </ref>
      <ref id="ref46">
        <mixed-citation>
          <source>arXiv preprint arXiv:2006</source>
          .13541 .
        </mixed-citation>
      </ref>
      <ref id="ref47">
        <mixed-citation>
          <string-name>
            <surname>Nakatsuji</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          <year>2002</year>
          .
          <article-title>Inverse Schrödinger equation and the exact wave function</article-title>
          .
          <source>Physical Review A</source>
          <volume>65</volume>
          (
          <issue>5</issue>
          ):
          <fpage>052122</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref48">
        <mixed-citation>
          <string-name>
            <surname>Parr</surname>
            ,
            <given-names>R. G.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Yang</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          <year>1995</year>
          .
          <article-title>Density-functional theory of the electronic structure of molecules</article-title>
          .
          <source>Annual Review of Physical Chemistry</source>
          <volume>46</volume>
          (
          <issue>1</issue>
          ):
          <fpage>701</fpage>
          -
          <lpage>728</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref49">
        <mixed-citation>
          <string-name>
            <surname>Pronchik</surname>
            ,
            <given-names>J. N.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Williams</surname>
            ,
            <given-names>B. W.</given-names>
          </string-name>
          <year>2003</year>
          .
          <article-title>Exactly Solvable Quantum Mechanical Potentials: An Alternative Approach</article-title>
          .
        </mixed-citation>
      </ref>
      <ref id="ref50">
        <mixed-citation>
          <source>Journal of Chemical Education</source>
          <volume>80</volume>
          (
          <issue>8</issue>
          ):
          <fpage>918</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref51">
        <mixed-citation>
          <string-name>
            <surname>Pu</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ; and Chen,
          <string-name>
            <surname>Y.</surname>
          </string-name>
          <year>2020</year>
          .
          <article-title>Soliton, Breather and Rogue Wave Solutions for Solving the Nonlinear Schrödinger Equation Using a Deep Learning Method with Physical Constraints</article-title>
          .
          <source>arXiv preprint arXiv:2011</source>
          .04949 .
        </mixed-citation>
      </ref>
      <ref id="ref52">
        <mixed-citation>
          <string-name>
            <surname>Rafi</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ; et al.
          <year>1995</year>
          .
          <article-title>An empirical potential function of diatomic molecules</article-title>
          .
          <source>Physics Letters A</source>
          <volume>205</volume>
          (
          <issue>5-6</issue>
          ):
          <fpage>383</fpage>
          -
          <lpage>387</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref53">
        <mixed-citation>
          <string-name>
            <surname>Raissi</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Perdikaris</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Karniadakis</surname>
            ,
            <given-names>G. E.</given-names>
          </string-name>
          <year>2017a</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref54">
        <mixed-citation>
          <article-title>Physics Informed Deep Learning (Part I): Data-driven Solutions of Nonlinear Partial Differential Equations</article-title>
          .
          <source>arXiv preprint arXiv:1711</source>
          .
          <fpage>10561</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref55">
        <mixed-citation>
          <string-name>
            <surname>Raissi</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Perdikaris</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Karniadakis</surname>
            ,
            <given-names>G. E.</given-names>
          </string-name>
          <year>2017b</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref56">
        <mixed-citation>
          <article-title>Physics Informed Deep Learning (Part II): Data-driven Discovery of Nonlinear Partial Differential Equations</article-title>
          .
          <source>arXiv preprint arXiv:1711</source>
          .
          <fpage>10566</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref57">
        <mixed-citation>
          <string-name>
            <surname>Raissi</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Perdikaris</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Karniadakis</surname>
            ,
            <given-names>G. E.</given-names>
          </string-name>
          <year>2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref58">
        <mixed-citation>
          <article-title>Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations</article-title>
          .
          <source>Journal of Computational Physics</source>
          <volume>378</volume>
          :
          <fpage>686</fpage>
          -
          <lpage>707</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref59">
        <mixed-citation>
          <string-name>
            <surname>Riley</surname>
            ,
            <given-names>K. F.</given-names>
          </string-name>
          <year>1974</year>
          .
          <article-title>Mathematical Methods for the Physical Sciences: An Informal Treatment for Students of Physics and Engineering</article-title>
          . Cambridge University Press. doi:10.1017/CBO9781139167550.
        </mixed-citation>
      </ref>
      <ref id="ref60">
        <mixed-citation>
          <string-name>
            <surname>Robinett</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Robinett</surname>
            ,
            <given-names>R. W.</given-names>
          </string-name>
          <year>2006</year>
          .
          <article-title>Quantum mechanics: Classical results, modern systems, and visualized examples</article-title>
          . Oxford University Press.
        </mixed-citation>
      </ref>
      <ref id="ref61">
        <mixed-citation>
          <string-name>
            <surname>Robinett</surname>
            ,
            <given-names>R. W.</given-names>
          </string-name>
          <year>1997</year>
          .
          <article-title>Quantum mechanics</article-title>
          . Oxford University Press, New York.
        </mixed-citation>
      </ref>
      <ref id="ref62">
        <mixed-citation>
          <string-name>
            <surname>Romanowski</surname>
            ,
            <given-names>Z.</given-names>
          </string-name>
          <year>2007</year>
          .
          <article-title>Numerical Solution Of Kohn-Sham Equation For Atom</article-title>
          .
          <source>Acta Physica Polonica B</source>
          <volume>38</volume>
          (
          <issue>10</issue>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref63">
        <mixed-citation>
          <string-name>
            <surname>Sakurai</surname>
            ,
            <given-names>J. J.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Commins</surname>
            ,
            <given-names>E. D.</given-names>
          </string-name>
          <year>1995</year>
          .
          <article-title>Modern Quantum Mechanics, Revised Edition</article-title>
          . American Association of Physics Teachers.
        </mixed-citation>
      </ref>
      <ref id="ref64">
        <mixed-citation>
          <string-name>
            <surname>Schleder</surname>
            ,
            <given-names>G. R.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Padilha</surname>
            ,
            <given-names>A. C.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Acosta</surname>
            ,
            <given-names>C. M.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Costa</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Fazzio</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>From DFT to machine learning: recent approaches to materials science-a review</article-title>
          .
          <source>Journal of Physics: Materials</source>
          <volume>2</volume>
          (
          <issue>3</issue>
          ):
          <fpage>032001</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref65">
        <mixed-citation>
          <string-name>
            <surname>Schmidt</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Shi</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Borlido</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Chen</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Botti</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Marques</surname>
            ,
            <given-names>M. A.</given-names>
          </string-name>
          <year>2017</year>
          .
          <article-title>Predicting the thermodynamic stability of solids combining density functional theory and machine learning</article-title>
          .
          <source>Chemistry of Materials</source>
          <volume>29</volume>
          (
          <issue>12</issue>
          ):
          <fpage>5090</fpage>
          -
          <lpage>5103</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref66">
        <mixed-citation>
          <string-name>
            <surname>Schmitz</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Godtliebsen</surname>
            ,
            <given-names>I. H.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Christiansen</surname>
            ,
            <given-names>O.</given-names>
          </string-name>
          <year>2019</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref67">
        <mixed-citation>
          <article-title>Machine learning for potential energy surfaces: An extensive database and assessment of methods</article-title>
          .
          <source>The Journal of Chemical Physics</source>
          <volume>150</volume>
          (
          <issue>24</issue>
          ):
          <fpage>244113</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref68">
        <mixed-citation>
          <string-name>
            <surname>Senn</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          <year>1986</year>
          .
          <article-title>The modified Pöschl-Teller Oscillator</article-title>
          .
          <source>Journal of Chemical Education</source>
          <volume>63</volume>
          (
          <issue>1</issue>
          ):
          <fpage>75</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref69">
        <mixed-citation>
          <string-name>
            <surname>Sun</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Myers</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Vondrick</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Murphy</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Schmid</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>VideoBERT: A Joint Model for Video and Language Representation Learning</article-title>
          .
          <source>In Proceedings of the IEEE/CVF International Conference on Computer Vision</source>
          ,
          <fpage>7464</fpage>
          -
          <lpage>7473</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref70">
        <mixed-citation>
          <string-name>
            <surname>Tong</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Xiong</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>He</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Pan</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Zhu</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          <year>2020</year>
          .
          <article-title>Symplectic Neural Networks in Taylor Series Form for Hamiltonian Systems</article-title>
          .
          <source>arXiv preprint arXiv:2005</source>
          .04986 .
        </mixed-citation>
      </ref>
      <ref id="ref71">
        <mixed-citation>
          <string-name>
            <surname>Torfi</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Shirvani</surname>
            ,
            <given-names>R. A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Keneshloo</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Tavvaf</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Fox</surname>
            ,
            <given-names>E. A.</given-names>
          </string-name>
          <year>2020</year>
          .
          <article-title>Natural Language Processing Advancements By Deep Learning: A Survey</article-title>
          .
          <source>arXiv preprint arXiv:2003</source>
          .01200 .
        </mixed-citation>
      </ref>
      <ref id="ref72">
        <mixed-citation>
          <string-name>
            <surname>Toth</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Rezende</surname>
            ,
            <given-names>D. J.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Jaegle</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Racanière</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Botev</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Higgins</surname>
            ,
            <given-names>I.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>Hamiltonian Generative Networks</article-title>
          .
        </mixed-citation>
      </ref>
      <ref id="ref73">
        <mixed-citation>
          <source>arXiv preprint arXiv:1909</source>
          .13789 .
        </mixed-citation>
      </ref>
      <ref id="ref74">
        <mixed-citation>
          <string-name>
            <surname>Unke</surname>
            ,
            <given-names>O. T.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Markus</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>Machine Learning Potential Energy Surfaces</article-title>
          .
          <source>arXiv preprint arXiv:1909</source>
          .08027 .
        </mixed-citation>
      </ref>
      <ref id="ref75">
        <mixed-citation>
          <string-name>
            <surname>Vogel</surname>
            ,
            <given-names>C. R.</given-names>
          </string-name>
          <year>2002</year>
          .
          <article-title>Computational methods for inverse problems</article-title>
          , volume
          <volume>23</volume>
          . SIAM.
        </mixed-citation>
      </ref>
      <ref id="ref76">
        <mixed-citation>
          <string-name>
            <surname>Wazwaz</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <year>2009</year>
          .
          <article-title>Partial Differential Equations and Solitary Waves Theory</article-title>
          . Springer Berlin Heidelberg.
        </mixed-citation>
      </ref>
      <ref id="ref77">
        <mixed-citation>
          <string-name>
            <surname>Zakhariev</surname>
            ,
            <given-names>B. N.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Suzko</surname>
            ,
            <given-names>A. A.</given-names>
          </string-name>
          <year>2012</year>
          .
          <article-title>Direct and inverse problems: potentials in quantum scattering</article-title>
          . Springer Science &amp; Business Media.
        </mixed-citation>
      </ref>
      <ref id="ref78">
        <mixed-citation>
          <string-name>
            <surname>Zhong</surname>
            ,
            <given-names>Y. D.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Dey</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Chakraborty</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <year>2020</year>
          .
          <article-title>Dissipative SymODEN: Encoding Hamiltonian Dynamics with Dissipation and Control into Deep Learning</article-title>
          .
          <source>arXiv preprint arXiv:2002</source>
          .08860 .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>