<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Deep Sensing of Ocean Wave Heights with Synthetic Aperture Radar</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Brandon Quach</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Yannik Glaser</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Justin Stopa</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Peter Sadowski</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Computing and Mathematical Sciences, California Institute of Technology</institution>
          ,
          <country country="US">USA</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Information and Computer Sciences, University of Hawai'i at Mānoa</institution>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Ocean Resources and Engineering, University of Hawai'i at Mānoa</institution>
        </aff>
      </contrib-group>
      <abstract>
        <p>The Sentinel-1 satellites equipped with synthetic aperture radars (SAR) provide near global coverage of the world's oceans every six days. We curate a data set of co-locations between SAR and altimeter satellites, and investigate the use of deep learning to predict significant wave height from SAR. While previous models for predicting geophysical quantities from SAR rely heavily on feature-engineering, our approach learns directly from low-level image cross-spectra. Training on co-locations from 2015-2017, we demonstrate on test data from 2018 that deep learning reduces the state-of-the-art root mean squared error by 50%, from 0.6 meters to 0.3 meters.</p>
      </abstract>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        Synthetic aperture radar (SAR) is an important remote
sensing technology able to achieve high spatial resolution (&lt;10
m). From SAR satellite data, geophysical properties can
be predicted using statistical models, enabling researchers to
monitor global sea states with unprecedented coverage,
precision, and frequency, without the use of complicated SAR
modulation transfer functions
        <xref ref-type="bibr" rid="ref16">(Schulz-Stellenfleth, König,
and Lehner 2007)</xref>
        . Sea state information provides scientific
value in understanding the propagation of waves
        <xref ref-type="bibr" rid="ref19 ref2 ref8">(Collard,
Ardhuin, and Chapron 2009; Stopa et al. 2016)</xref>
        and the
effects of climate change
        <xref ref-type="bibr" rid="ref22">(Young, Zieger, and Babanin 2011)</xref>
        ,
as well as immediate practical benefits such as alerting ships
to dangerously large waves created by storms.
      </p>
      <p>
        SARs capture sea surface roughness and many other
geophysical phenomena
        <xref ref-type="bibr" rid="ref21">(Wang et al. 2019)</xref>
        . Therefore,
predicting ocean wave signatures from SAR images typically
requires feature engineering, a dimensionality-reduction
step that extracts task-specific information. CWAVE is a
common feature set for describing wave properties in SAR,
consisting of 20 orthogonal features derived from the SAR
modulation spectra. CWAVE has been used to estimate
the significant wave height for the SARs aboard: 1)
ERS2 (Schulz-Stellenfleth, König, and Lehner 2007), 2)
ENVISAT
        <xref ref-type="bibr" rid="ref11">(Li, Lehner, and Bruns 2011)</xref>
          , and 3) Sentinel-1
        <xref ref-type="bibr" rid="ref17">(Stopa
and Mouche 2017; Pleskachevsky et al. 2019)</xref>
          , linking SAR
imaging to vital sea state information. Such reduced
representations of high-dimensional data can be useful when
fitting statistical models to relatively small data sets, but
they are also limiting; task-relevant information is
almost certainly lost when reducing a high-dimensional SAR image
to the low-dimensional CWAVE feature space.
      </p>
      <p>
        In this work, we attempt to extract additional
information from SAR images using deep learning with artificial
neural networks. Deep learning has proven to be an
extremely effective approach to representation learning,
leading to rapid advances in diverse fields such as computer
vision
        <xref ref-type="bibr" rid="ref10 ref9">(Krizhevsky, Sutskever, and Hinton 2014)</xref>
        , high-energy
physics
        <xref ref-type="bibr" rid="ref1 ref15 ref9">(Baldi, Sadowski, and Whiteson 2014; Sadowski
and Baldi 2018)</xref>
        , and chemistry
        <xref ref-type="bibr" rid="ref12 ref18 ref3">(Lusci, Pollastri, and Baldi
2013; Duvenaud et al. 2015)</xref>
        . Deep learning has the potential
to make similar advances in remote sensing for
oceanography by extracting information directly from SAR
modulation cross-spectra.
      </p>
      <p>We first curate a data set of over 750,000
co-locations of SAR and altimeter satellites, which provides
SAR in conjunction with direct measurements of ocean
wave heights. The data is used to train deep neural networks
to predict significant wave height, Hs, defined as the mean
of the top third of a wave height distribution. We compare
training on SAR image spectra vs. high-level CWAVE
features, and analyze the effect of training data set size.</p>
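The definition of significant wave height can be made concrete with a short calculation (a toy sketch with our own variable names, not the paper's pipeline):

```python
import numpy as np

def significant_wave_height(wave_heights):
    """H_s: the mean of the largest third of individual wave heights."""
    h = np.sort(np.asarray(wave_heights, dtype=float))[::-1]  # tallest first
    top_third = h[: max(1, len(h) // 3)]
    return float(top_third.mean())

# Toy example: with 6 waves, the top third is the 2 tallest.
print(significant_wave_height([0.5, 1.0, 1.5, 2.0, 2.5, 3.0]))  # → 2.75
```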
    </sec>
    <sec id="sec-2">
      <title>Methods</title>
      <sec id="sec-2-1">
        <title>Data</title>
        <p>
          We curate a training set of historical data from two types
of polar-orbiting satellites: Sentinel-1 SAR satellites and
altimeter satellites. Because the satellites are on different
trajectories, their paths frequently intersect, providing
measurements from roughly the same location at the same time.
Specifically, the data set is constructed using measurements
that are less than 3 hours apart and with spatial differences
less than 300 km, resulting in 753,777 co-location events
from 2015 through 2018 that are geographically well
distributed (Figure 1). These events have both SAR imaging
from Sentinel-1 and significant wave height from an
altimeter, and provide a high-fidelity reference data set
          <xref ref-type="bibr" rid="ref14">(Ribal and
Young 2019)</xref>
          .
        </p>
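The co-location criteria above (less than 3 hours apart, less than 300 km apart) can be sketched as a simple filter; the `haversine_km` helper and the dictionary layout are our own illustrative choices, not the authors' code:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    R = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def is_colocation(sar, alt, max_hours=3.0, max_km=300.0):
    """sar/alt: dicts with 'time' (hours) and 'lat'/'lon' (degrees)."""
    close_in_time = abs(sar["time"] - alt["time"]) < max_hours
    close_in_space = haversine_km(sar["lat"], sar["lon"],
                                  alt["lat"], alt["lon"]) < max_km
    return close_in_time and close_in_space

print(is_colocation({"time": 0.0, "lat": 21.3, "lon": -157.9},
                    {"time": 1.5, "lat": 22.0, "lon": -156.5}))  # → True
```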
        <p>The data set is split into training, validation, and test sets
based on year of data collection. Co-location events from
2015 and 2016 were used as the training set, events from
2017 were used as the validation set, and events from 2018 were
used as the held-out test set. The result was 303,574 training
examples, 265,052 validation examples, and 185,151 test
examples. The validation set was used for learning rate
annealing, early stopping, and hyper-parameter selection, while the
test set was only used for the final evaluation of the model.</p>
        <p>
          The Sentinel-1 SAR data set consists of the real and
imaginary components computed from SAR modulation cross
spectra. Each data point within the cross spectra was
created by taking the Level 1 SAR image with 5 × 5 m pixel
resolution covering a 20 × 20 km area and applying a 2D
Fourier transformation to different "looks" within the dwell
time
          <xref ref-type="bibr" rid="ref5">(Engen and Johnsen 1995)</xref>
          to obtain the real and
imaginary modulation spectra
          <xref ref-type="bibr" rid="ref2 ref8">(Johnsen and Collard 2009)</xref>
          (Figure 2). The modulation spectra consist of two matrices (real
and imaginary) of shape 72 × 60, with one dimension
corresponding to wavenumber and the other to direction. These
two matrices were then stacked to form the input tensor with
shape 72 × 60 × 2. The 1-Hz altimeter dataset estimates
significant wave heights with spatial footprints of 6 to 10 km.
The altimeter dataset consists of data merged from 6
different altimeter missions and has been cross-calibrated between
platforms, as in
          <xref ref-type="bibr" rid="ref14">(Ribal and Young 2019)</xref>
          .
The SAR image spectra were then pre-processed by
centering and scaling the real and imaginary image modulation
spectra separately — each pixel was normalized by
subtracting the overall mean and dividing by the overall standard
deviation of all pixels and all co-locations.
        </p>
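The centering-and-scaling step can be sketched as follows; random arrays stand in for the real cross-spectra, with shapes following the text:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for N co-located cross-spectra: real and imaginary parts
# stacked on the last axis, each map 72 (wavenumber) x 60 (direction).
spectra = rng.normal(loc=2.0, scale=3.0, size=(1000, 72, 60, 2))

# Centre and scale the real and imaginary channels separately, using the
# mean/std over all pixels of all co-locations (as described in the text).
mean = spectra.mean(axis=(0, 1, 2), keepdims=True)  # shape (1, 1, 1, 2)
std = spectra.std(axis=(0, 1, 2), keepdims=True)
normalized = (spectra - mean) / std

print(normalized.shape)  # → (1000, 72, 60, 2)
```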
        <p>In addition to the SAR image spectra, we include a
number of high-level features in our model. First, we include the
time and distance between the satellite co-location
measurements, normalized to have zero mean and unit variance,
though this information is only available during training. The
time (or distance) between satellite observations provides a
rough estimate of how much we can trust the altimeter
measurement to provide an accurate target because sea states can
change faster than our original time and space constraints.
These features are simply set to zero at prediction time.
Second, the time-of-day was encoded as a value between -1 and
1 using the function f(t) = 2 sin(2πt/48) − 1; this
normalization helps stabilize the neural network optimization. Third,
latitude and longitude were encoded by representing each as
an angle in the range [0, 2π) then taking the sine and cosine,
resulting in four features total. Fourth, a binary label was
created to specify the SAR satellite; S1-A and S1-B are
calibrated to produce comparable data, but there could be small
differences. Finally, we also include the 20 non-dimensional
CWAVE parameters that are derived from the image spectra,
each normalized using standard scaling to have zero mean
and unit variance over the training examples.</p>
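A sketch of the cyclical encodings, assuming the time-of-day map is f(t) = 2 sin(2πt/48) − 1 with t in hours, which sends [0, 24) into [−1, 1]; the function names are ours:

```python
import numpy as np

def encode_time_of_day(t_hours):
    """Map hour-of-day t in [0, 24) into [-1, 1]."""
    return 2.0 * np.sin(2.0 * np.pi * t_hours / 48.0) - 1.0

def encode_lat_lon(lat_deg, lon_deg):
    """Encode each coordinate as an angle in [0, 2*pi), then take sin and cos."""
    lat = np.deg2rad(lat_deg % 360.0)
    lon = np.deg2rad(lon_deg % 360.0)
    return [np.sin(lat), np.cos(lat), np.sin(lon), np.cos(lon)]

print(round(encode_time_of_day(12.0), 6))  # noon → 1.0
```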
      </sec>
      <sec id="sec-2-2">
        <title>Deep Learning</title>
        <p>Deep Neural Network Architecture We propose a deep
neural network architecture that predicts significant wave
height by using the input data from SAR image spectra. The
model starts as two branches with separate inputs: one which
processes the spectral input and another which extracts
information from the high-level features (Figure 3). The
spectral input branch takes an input tensor of the shape (72, 60, 2)
where the real and imaginary values of the Fourier transform
are stacked along the third axis, analogous to the ‘colors’ of
an RGB image. This input tensor is then fed sequentially
into three convolutional layers containing 64, 128, and 256
filters respectively. A filter size of 3 × 3 is maintained at each
convolutional layer with a rectified linear unit (ReLU)
activation. In addition, each layer is followed by a max pooling
layer with a 2 × 2 window. The final convolutional layer is
fed into a global max-pooling layer which produces a
flattened array of size 256. This is then fed into two additional
dense layers with 256 hidden units each with ReLU
activation. The non-spectral data branch consists of an input layer of
the following 32 features:
20 CWAVE features,
1 time-of-day feature,
2 latitude features (sine and cosine),
2 longitude features (sine and cosine),</p>
        <sec id="sec-2-2-4">
          <title>1 incidence angle feature</title>
          <p>1 incidence angle mode feature (binary flag representing
WV1 or WV2),
1 satellite source feature (binary flag distinguishing
Sentinel-1A from Sentinel-1B),
1 time-difference feature (time between the altimeter and
Sentinel-1 measurements),
1 spatial-difference feature (distance between the altimeter and
Sentinel-1 measurements),
1 normalized radar cross section (σ0) feature, and
1 normalized variance of radar cross section feature.
These 32 high-level features are fed into 11 dense layers
with 256 ReLU hidden units each. Both branches yield
flattened arrays of size 256, which are then concatenated to form
a single vector with 512 features. Two hidden dense layers
of 256 and 128 hidden ReLU units then integrate the image
spectra branch with the second branch. Finally, an output
layer with a dropout of 0.337 and softplus activation makes
the final prediction.</p>
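The two-branch architecture described above can be sketched in Keras (a minimal reconstruction from the text; the layer sizes follow the description, while details such as the padding choice and optimizer call are our assumptions):

```python
import tensorflow as tf
from tensorflow.keras import layers

# Spectral branch: (72, 60, 2) cross-spectra through three conv blocks of
# 64/128/256 3x3 ReLU filters, each followed by 2x2 max pooling, then
# global max pooling and two dense layers of 256 ReLU units.
spec_in = layers.Input(shape=(72, 60, 2))
x = spec_in
for n_filters in (64, 128, 256):
    x = layers.Conv2D(n_filters, 3, activation="relu", padding="same")(x)
    x = layers.MaxPooling2D(2)(x)
x = layers.GlobalMaxPooling2D()(x)
for _ in range(2):
    x = layers.Dense(256, activation="relu")(x)

# High-level branch: 32 engineered features through 11 dense ReLU layers.
feat_in = layers.Input(shape=(32,))
y = feat_in
for _ in range(11):
    y = layers.Dense(256, activation="relu")(y)

# Merge the branches and predict a non-negative wave height via softplus.
z = layers.Concatenate()([x, y])  # 512 features
z = layers.Dense(256, activation="relu")(z)
z = layers.Dense(128, activation="relu")(z)
z = layers.Dropout(0.337)(z)
out = layers.Dense(1, activation="softplus")(z)

model = tf.keras.Model([spec_in, feat_in], out)
model.compile(optimizer=tf.keras.optimizers.Adam(3e-4), loss="mse")
```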
          <p>
            This model is trained to minimize the mean squared
error (MSE) using the Adam optimizer
            <xref ref-type="bibr" rid="ref9">(Kingma and Ba 2014)</xref>
            with a batch size of 128 and an initial learning rate of 0.0003.
The learning rate was decreased by 20% if the validation
set MSE did not improve over 4 epochs, and training was
stopped when the validation set MSE did not improve over
10 epochs. The best model was trained for 35 epochs. The
dropout rate, initial learning rate, and batch size were
optimized using the SHERPA black-box optimization package
for machine learning hyper-parameter tuning
            <xref ref-type="bibr" rid="ref7">(Hertel et al.
2018)</xref>
            on a cluster of Nvidia RTX 2080 Ti GPUs. One hundred
models were trained using the random search algorithm to
optimize over the search space shown in Table 1.
CWAVE Models To measure the advantage of the deep
learning approach over simpler models, we also trained two
models that predict the significant wave height from the 32
high-level features alone: a simple linear regression model
and a deep neural network. The neural network consisted of
eleven dense hidden layers of 256 ReLU units, followed by
a layer of 64 ReLU units, and two outputs. The two
outputs correspond to a heteroskedastic Gaussian distribution
N(y1, y2), where y2 is restricted to ensure positive variance
by defining a custom activation function.
          </p>
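The heteroskedastic Gaussian output can be trained with a loss along these lines; using softplus to keep the variance positive is our stand-in, since the paper's exact custom activation is not reproduced here:

```python
import numpy as np

def softplus(x):
    """Numerically stable log(1 + e^x)."""
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

def gaussian_nll(target, mean, raw_var):
    """Negative log-likelihood of a heteroskedastic Gaussian N(mean, var).

    raw_var is an unconstrained network output; softplus (plus a small
    floor) keeps the variance positive.
    """
    var = softplus(raw_var) + 1e-6
    return 0.5 * (np.log(2.0 * np.pi * var) + (target - mean) ** 2 / var)
```

Minimizing this loss trains the first output toward the conditional mean and the second toward the conditional uncertainty.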
          <p>
            Weights are initialized using the scaling suggested by
            <xref ref-type="bibr" rid="ref6">(He
et al. 2015)</xref>
            , and the conditional negative log-likelihood of
the target values is minimized using the Adam optimizer
            <xref ref-type="bibr" rid="ref9">(Kingma and Ba 2014)</xref>
            with mini-batches of size 1024. The
initial learning rate of 0.003 decays starting at epoch 300 at
a decay rate of 0.0005 applied at the end of each subsequent
epoch. A dropout rate of 0.5 is applied to the penultimate
layer. Training is stopped when the validation loss doesn’t
improve after 15 epochs. The architecture, learning rate, and
early-stopping were optimized with SHERPA.
          </p>
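Both training loops use plateau-based annealing and early stopping. A generic driver for such a schedule might look like the following (a hypothetical sketch using the 20% reduction and 4/10-epoch patience quoted for the spectral model; `step_fn` and `val_mse_fn` are placeholder callables):

```python
def train_with_plateau_schedule(step_fn, val_mse_fn, lr=3e-4,
                                patience_lr=4, patience_stop=10, factor=0.8):
    """Anneal lr by 20% when val MSE stalls for `patience_lr` epochs;
    stop after `patience_stop` epochs without improvement."""
    best, since_best = float("inf"), 0
    history = []
    for epoch in range(1000):
        step_fn(lr)            # one epoch of optimization at the current lr
        mse = val_mse_fn()     # validation MSE after this epoch
        history.append(mse)
        if mse < best:
            best, since_best = mse, 0
        else:
            since_best += 1
            if since_best % patience_lr == 0:
                lr *= factor   # reduce learning rate by 20%
        if since_best >= patience_stop:
            break              # early stopping
    return best, len(history)
```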
        </sec>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>Results</title>
      <p>To compare the three types of models after hyper-parameter
tuning, we trained each on data from 2015-2016, and tested
on events from 2018, enabling us to explore the relative
benefits of deep learning and the use of image spectra features.
Table 2 shows that the deep neural network trained on image
spectra achieves a significantly lower root mean squared
error (RMSE) of 0.33 meters compared to the other methods
that rely only on the high-level features: 0.64 m for the
linear model and 0.43 m for the deep neural network trained on
CWAVE alone. Furthermore, this performance improvement
is consistent across small, medium, and large waves.</p>
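The bin-wise evaluation can be sketched as follows; the bin edges match the wave-height ranges reported, while the function names are ours:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def rmse_by_bin(y_true, y_pred, edges=(1.0, 3.0, 8.0)):
    """RMSE within wave-height bins: <1 m, 1-3 m, 3-8 m, >8 m."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    bins = np.digitize(y_true, edges)  # 0..3 for the four ranges
    return {b: rmse(y_true[bins == b], y_pred[bins == b])
            for b in np.unique(bins)}

print(rmse([1.0, 2.0, 3.0], [1.5, 1.5, 3.5]))  # → 0.5
```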
      <p>A feature importance study (Table 2) shows the
dependence on each set of features. Two additional models
were trained with an identical DNN architecture and
hyper-parameters, but with specific high-level features removed:
either the 20 CWAVE parameters or the latitude and
longitude features. In both models, time of day, incidence
angle, incidence angle mode, satellite type, time and distance
difference between altimeter and sentinel data, normalized
radar cross section (σ0) and normalized variance of radar
cross section are still included. The results show that the
high-level CWAVE features are still used by the model, but
only slightly — despite containing no additional
information, these features add implicit bias to the model. The
location features do contain additional information — they
essentially allow the model to learn a prior over the wave
heights at different locations — but these too only have a
small effect on performance.</p>
      <sec id="sec-3-2">
        <title>All Waves</title>
        <p>
          Finally, we explore the impact of increasing the size of
the training set on the discrepancy between including and
not including CWAVE parameters in our final model. In
this experiment, we fix the hyper-parameters, train on data
from 2015-2017 (568,626 examples), then test on 2018. The
models are trained for a fixed 30 epochs where the
learning rate is annealed by a factor of 0.4 every 10 epochs.
Initial learning rate and dropout are identical to that of our
optimal deep neural network architecture. Figure 4 shows
the mean performance of six randomly-initialized networks
trained with different fractions of the data set. An
ensemble (arithmetic mean) of the 6 models using all features and
the complete training set gives a test RMSE of 0.307 — a
50% reduction in RMSE from the previous state-of-the-art
of 0.6 m
          <xref ref-type="bibr" rid="ref17">(Stopa and Mouche 2017)</xref>
          . This also approaches
the RMSE of satellite altimetry compared to buoy
observations
          <xref ref-type="bibr" rid="ref14">(Ribal and Young 2019)</xref>
          .
        </p>
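Ensembling by arithmetic mean is straightforward to express; the lambda "models" below are toy stand-ins for the six trained networks:

```python
import numpy as np

def ensemble_predict(models, inputs):
    """Arithmetic mean of the predictions of independently trained models."""
    preds = np.stack([m(inputs) for m in models], axis=0)
    return preds.mean(axis=0)

# Six toy predictors whose individual biases average out:
models = [lambda x, b=b: 0.1 * x + b
          for b in (-0.05, -0.03, -0.01, 0.01, 0.03, 0.05)]
print(ensemble_predict(models, np.array([10.0, 20.0])))
```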
      </sec>
    </sec>
    <sec id="sec-4">
      <title>Discussion</title>
      <p>Our results demonstrate that a deep convolutional neural
network can extract useful representations from SAR
image spectra that are not captured by the engineered CWAVE
features. In a direct comparison between two hyper-parameter
optimized deep neural networks, the network with the
image spectra information obtained a 29% reduction in RMSE
(0.33 m vs. 0.43 m). This is in keeping with the success of
deep learning in other fields, where the expertly engineered
features are discarded in favor of learned features.</p>
      <p>Our results show that there is still some advantage to
including the CWAVE features in the model, even with a
training set of over 500,000 examples. However, we also show
that this advantage diminishes as the number of training
examples increases (Figure 4). The CWAVE features, being
derived from the image data, provide no additional
information, but they help bias the model towards a good
solution. As the training data set increases, the model is slower
to overfit, and the benefit of including the CWAVE features
disappears.</p>
      <p>Latitude and longitude information is useful for
predicting significant wave heights because there are regional
characteristics of the wave climate (Stopa et al. 2013).
However, the feature importance study shows that our model only
makes minimal use of this information. This is encouraging,
because it implies that the model is relying almost entirely
on the direct measurements rather than geographical
information.</p>
    </sec>
    <sec id="sec-5">
      <title>Conclusion</title>
      <p>Our results demonstrate that deep learning provides a 50%
decrease in RMSE compared to the previous
state-of-the-art in predicting significant wave height from SAR. Instead
of relying on the set of engineered CWAVE features that
capture most of the discriminative information, our deep
learning approach learns directly from the low-level,
high-dimensional image spectra. Furthermore, our results indicate
that there is still room for improvement with additional
training data. Thus, we should expect the performance
of our model to increase as more co-location events are
collected over the next couple of years.</p>
    </sec>
    <sec id="sec-6">
      <title>Acknowledgments</title>
      <p>This work was made possible thanks to SAR data access
granted by ESA projects: Sentinel-1 A Mission
Performance Center (4000107360/12/I-LG) and Sentinel-1 Ocean
Study (S1-4SCI-16-0002). The altimetry data was sourced
from the Integrated Marine Observing System (IMOS). All
Sentinel-1 L2 data used in this study can be obtained from
the Copernicus Data Hub (cophub.copernicus.eu). The
authors would like to thank NVIDIA for a hardware grant
to PS. The technical support and advanced computing
resources from the University of Hawai‘i Information
Technology Services Cyberinfrastructure are gratefully
acknowledged.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          <string-name>
            <surname>Baldi</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Sadowski</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Whiteson</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          <year>2014</year>
          .
          <article-title>Searching for exotic particles in high-energy physics with deep learning</article-title>
          .
          <source>Nature Communications 5.</source>
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          <string-name>
            <surname>Collard</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Ardhuin</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Chapron</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          <year>2009</year>
          .
          <article-title>Monitoring and analysis of ocean swell fields from space: New methods for routine observations</article-title>
          .
          <source>Journal of Geophysical Research</source>
          <volume>114</volume>
          (
          <issue>C7</issue>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          <string-name>
            <surname>Duvenaud</surname>
            ,
            <given-names>D. K.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Maclaurin</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Iparraguirre</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ; Bombarell,
          <string-name>
            <surname>R.</surname>
          </string-name>
          ; Hirzel,
          <string-name>
            <given-names>T.</given-names>
            ;
            <surname>Aspuru-Guzik</surname>
          </string-name>
          ,
          <string-name>
            <surname>A.</surname>
          </string-name>
          ; and Adams,
          <string-name>
            <surname>R. P.</surname>
          </string-name>
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          <article-title>Convolutional networks on graphs for learning molecular fingerprints</article-title>
          .
          <source>In Advances in Neural Information Processing Systems</source>
          ,
          <volume>2215</volume>
          -
          <fpage>2223</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          <string-name>
            <surname>Engen</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Johnsen</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          <year>1995</year>
          .
          <article-title>SAR-ocean wave inversion using image cross spectra</article-title>
          .
          <source>IEEE Transactions on Geoscience and Remote Sensing</source>
          <volume>33</volume>
          (
          <issue>4</issue>
          ):
          <fpage>1047</fpage>
          -
          <lpage>1056</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          <string-name>
            <surname>He</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Zhang</surname>
            ,
            <given-names>X.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Ren</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Sun</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          <year>2015</year>
          .
          <article-title>Delving deep into rectifiers: Surpassing human-level performance on imagenet classification</article-title>
          .
          <source>In Proceedings of the IEEE International Conference on Computer Vision</source>
          ,
          <fpage>1026</fpage>
          -
          <lpage>1034</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          <string-name>
            <surname>Hertel</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Collado</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ; Sadowski,
          <string-name>
            <surname>P.</surname>
          </string-name>
          ; and Baldi,
          <string-name>
            <surname>P.</surname>
          </string-name>
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          <string-name>
            <surname>Johnsen</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Collard</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          <year>2009</year>
          .
          <article-title>Sentinel-1 ocean swell wave spectra (osw) algorithm definition</article-title>
          .
          <source>Technical report</source>
          , Northern Research Institute.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          <string-name>
            <surname>Kingma</surname>
            ,
            <given-names>D. P.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Ba</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          <year>2014</year>
          .
          <article-title>Adam: A method for stochastic optimization</article-title>
          .
          <source>arXiv preprint arXiv:1412</source>
          .
          <fpage>6980</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          <string-name>
            <surname>Krizhevsky</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Sutskever</surname>
            ,
            <given-names>I.;</given-names>
          </string-name>
          and Hinton,
          <string-name>
            <surname>G.</surname>
          </string-name>
          <year>2014</year>
          .
          <article-title>Imagenet classification with deep convolutional neural</article-title>
          .
          <source>In Neural Information Processing Systems</source>
          ,
          <volume>1</volume>
          -
          <fpage>9</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>X.-M.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Lehner</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ; and Bruns,
          <string-name>
            <surname>T.</surname>
          </string-name>
          <year>2011</year>
          .
          <article-title>Ocean wave integral parameter measurements using envisat asar wave mode data</article-title>
          .
          <source>IEEE Transactions on Geoscience and Remote Sensing</source>
          <volume>49</volume>
          (
          <issue>1</issue>
          ):
          <fpage>155</fpage>
          -
          <lpage>174</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          <string-name>
            <surname>Lusci</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Pollastri</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Baldi</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          <year>2013</year>
          .
          <article-title>Deep architectures and deep learning in chemoinformatics: the prediction of aqueous solubility for drug-like molecules</article-title>
          .
          <source>Journal of Chemical Information and Modeling</source>
          <volume>53</volume>
          (
          <issue>7</issue>
          ):
          <fpage>1563</fpage>
          -
          <lpage>1575</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          Pleskachevsky, A.; et al. 2019.
          <article-title>Estimation of sea state from sentinel-1 synthetic aperture radar imagery for maritime situation awareness</article-title>
          .
          <source>International Journal of Remote Sensing</source>
          <volume>40</volume>
          (
          <issue>11</issue>
          ):
          <fpage>4104</fpage>
          -
          <lpage>4142</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          <string-name>
            <surname>Ribal</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Young</surname>
            ,
            <given-names>I. R.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>33 years of globally calibrated wave height and wind speed data based on altimeter observations</article-title>
          .
          <source>Scientific Data</source>
          <volume>6</volume>
          (
          <issue>1</issue>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          <string-name>
            <surname>Sadowski</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Baldi</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          <year>2018</year>
          .
          <article-title>Deep learning in the natural sciences: applications to physics</article-title>
          .
          In
          <source>Braverman Readings in Machine Learning. Key Ideas from Inception to Current State</source>
          . Springer.
          <fpage>269</fpage>
          -
          <lpage>297</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          <string-name>
            <surname>Schulz-Stellenfleth</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>König</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Lehner</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <year>2007</year>
          .
          <article-title>An empirical approach for the retrieval of integral ocean wave parameters from synthetic aperture radar data</article-title>
          .
          <source>Journal of Geophysical Research</source>
          <volume>112</volume>
          (
          <issue>C3</issue>
          ).
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          <string-name>
            <surname>Stopa</surname>
            ,
            <given-names>J. E.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Mouche</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          <year>2017</year>
          .
          <article-title>Significant wave heights from Sentinel-1 SAR: validation and applications</article-title>
          .
          <source>Journal of Geophysical Research: Oceans</source>
          <volume>122</volume>
          (
          <issue>3</issue>
          ):
          <fpage>1827</fpage>
          -
          <lpage>1848</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref18">
        <mixed-citation>
          <year>2013</year>
          .
          <article-title>Patterns and cycles in the climate forecast system reanalysis wind and wave data</article-title>
          .
          <source>Ocean Modelling</source>
          <volume>70</volume>
          :
          <fpage>207</fpage>
          -
          <lpage>220</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref19">
        <mixed-citation>
          <string-name>
            <surname>Stopa</surname>
            ,
            <given-names>J. E.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Ardhuin</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Husson</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Jiang</surname>
            ,
            <given-names>H.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Chapron</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Collard</surname>
            ,
            <given-names>F.</given-names>
          </string-name>
          <year>2016</year>
          .
          <article-title>Swell dissipation from 10 years of Envisat advanced synthetic aperture radar in wave mode</article-title>
          .
          <source>Geophysical Research Letters</source>
          <volume>43</volume>
          (
          <issue>7</issue>
          ):
          <fpage>3423</fpage>
          -
          <lpage>3430</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref21">
        <mixed-citation>
          <string-name>
            <surname>Wang</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Mouche</surname>
            ,
            <given-names>A.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Tandeo</surname>
            ,
            <given-names>P.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Stopa</surname>
            ,
            <given-names>J. E.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Longepe</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Erhard</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Foster</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Vandemark</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Chapron</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          <year>2019</year>
          .
          <article-title>A labeled ocean SAR imagery dataset of ten geophysical phenomena from Sentinel-1 wave mode</article-title>
          .
          <source>Geoscience Data Journal</source>
          .
        </mixed-citation>
      </ref>
      <ref id="ref22">
        <mixed-citation>
          <string-name>
            <surname>Young</surname>
            ,
            <given-names>I. R.</given-names>
          </string-name>
          ;
          <string-name>
            <surname>Zieger</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ; and
          <string-name>
            <surname>Babanin</surname>
            ,
            <given-names>A. V.</given-names>
          </string-name>
          <year>2011</year>
          .
          <article-title>Global trends in wind speed and wave height</article-title>
          .
          <source>Science</source>
          <volume>332</volume>
          (
          <issue>6028</issue>
          ):
          <fpage>451</fpage>
          -
          <lpage>455</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>