<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>AI-driven multimodal data fusion for hazardous object detection in maritime and coastal environments</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Vitaly Brevus</string-name>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Halyna Brevus</string-name>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Roman Gashynskyi</string-name>
          <email>rgashynskyi@rework-space.com</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Aser Kashosi</string-name>
          <email>akashosi@rework-space.com</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Roman Yuzefovych</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Karpenko Physico-Mechanical Institute NAS of Ukraine, Department of Methods and Facilities for Acquisition and Processing of Diagnostic Signals</institution>
          ,
          <addr-line>5 Naukova Str., Lviv 79060</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Rework-Space LLC</institution>
          ,
          <addr-line>10 Berezhans'ka str, office 82, Ternopil 46027</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Ternopil Ivan Puluj National Technical University</institution>
          ,
          <addr-line>56 Ruska str., Ternopil 46001</addr-line>
          ,
          <country country="UA">Ukraine</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2026</year>
      </pub-date>
      <abstract>
        <p>This paper introduces a novel AI-driven platform that integrates satellite imagery, unmanned aerial vehicle data, and Automatic Identification System signals into an analytics pipeline for hazardous object detection in maritime and coastal environments. The system leverages YOLO11 for object detection and a knowledge graph based on Blue Brain Nexus to achieve semantic interoperability. Results demonstrate the ability of the developed information technology to detect maritime debris, oil spills, and vessel activity, while enabling adaptive route planning and decision support. This approach provides a scalable framework for emergency response and environmental monitoring, aligning with current advances in Artificial Intelligence, machine learning, and applied modeling in information technologies.</p>
      </abstract>
      <kwd-group>
        <kwd>AI</kwd>
        <kwd>multimodal data fusion</kwd>
        <kwd>knowledge graph</kwd>
        <kwd>satellite imagery</kwd>
        <kwd>emergency response</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
      <p>
        The frequency of natural disasters, maritime accidents, and climate-related hazards has increased
significantly in recent years [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ]. Traditional monitoring systems that rely solely on satellite data or
unmanned aerial vehicle (UAV)-based inspections often fail to provide a complete and timely
situational picture [
        <xref ref-type="bibr" rid="ref2 ref3">2, 3</xref>
        ]. There is a growing need for integrated approaches that combine multiple
heterogeneous data sources with Artificial Intelligence (AI) to support emergency response [
        <xref ref-type="bibr" rid="ref4 ref5">4, 5</xref>
        ].
This is particularly relevant in the context of digital sovereignty and cloud infrastructure projects
like Gaia-X [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. Near-real-time hazardous object detection and actionable insights for safer
maritime operations in coastal areas and on large rivers require a solution that merges satellite
data with accurate drone imagery and other heterogeneous data sources, enriching the precision
and accuracy of the system.
      </p>
      <p>This paper presents a multimodal AI platform that merges satellite imagery, UAV data, and
Automatic Identification System (AIS) signals into a unified knowledge graph, enabling
near-real-time hazardous object detection and decision support. The goal of this paper is to demonstrate the
feasibility of multimodal fusion for maritime monitoring, present a knowledge-driven architecture
for semantic interoperability, and validate the system.</p>
    </sec>
    <sec id="sec-2">
      <title>2. Related works</title>
      <p>
        Research on AI for remote sensing has expanded rapidly [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], focusing on applications such as
vessel detection, oil spill monitoring, and disaster assessment, based on developments of neural
networks for image analysis [
        <xref ref-type="bibr" rid="ref7 ref8 ref9">7, 8, 9</xref>
        ]. Multimodal data fusion has been studied extensively [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ],
with approaches ranging from simple statistical techniques to advanced semantic integration.
Knowledge graphs (e.g., Blue Brain Nexus) have emerged as effective tools for organizing
heterogeneous data while ensuring interoperability [
        <xref ref-type="bibr" rid="ref11">11</xref>
        ]. Despite progress, existing works rarely
integrate near-real-time AI models, multimodal fusion, and knowledge graphs into a single system
for maritime safety and emergency response. Integration of cutting-edge object detection
algorithms, such as YOLO, becomes critical for enhancing near-real-time situational awareness and
rapid response in maritime environments. The YOLO object detection algorithm [
        <xref ref-type="bibr" rid="ref12 ref9">9, 12</xref>
         ] is used for
visual recognition of streaming video from quadcopters. Cargo ships can use drones to monitor
dangerous sea routes, but keeping such drones in the air and at sea continuously would be too
expensive and inefficient. Our platform therefore combines available satellite and tracking data for
preliminary analysis and deploys sea drones or UAVs only when needed, receiving accurate local
data, transforming it, and analyzing it all in near real time.
      </p>
      <p>Relevant data offered by monitoring services can be classified by source:</p>
      <p>Copernicus Marine Service (CMEMS): provides comprehensive sea surface temperature,
salinity, and currents data, along with marine ecosystem data and sea ice concentration and
extent measurements.</p>
      <p>European Marine Observation and Data Network (EMODnet): offers bathymetry, seabed
habitats, and human activities information, complemented by oceanographic data including
tides and currents, as well as geological, biological, and chemical datasets.</p>
      <p>National Oceanic and Atmospheric Administration (NOAA): contributes ocean
temperatures, salinity, and currents data, marine debris monitoring and habitat mapping
services, and weather and sea surface data collected from buoys and drones.</p>
      <p>Ocean Observatories Initiative (OOI): delivers ocean temperature, salinity, and chemical
properties data, seafloor imaging and topography information, and biological observations.</p>
      <p>
        To collect necessary data, MariNeXt service [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ] can be utilized to classify critical maritime
hazards like oil spills and debris. Its robust predictions ensure effective hazard monitoring for
maritime surveillance, even in challenging environments.
      </p>
    </sec>
    <sec id="sec-3">
      <title>3. Methodology</title>
      <p>This study presents the development and validation of a UAV-based object detection system for
maritime surveillance applications. While the broader multimodal data fusion framework
encompasses satellite imagery (Sentinel-1 SAR, Sentinel-2), Automatic Identification System (AIS)
data, and knowledge graph integration, this paper specifically focuses on the UAV component
utilizing YOLOv11 object detection algorithms.</p>
      <sec id="sec-3-1">
        <title>3.1. System architecture overview</title>
        <p>The proposed information technology implements a five-component modular pipeline architecture
(Figure 1). The data ingestion layer aggregates heterogeneous sources including Copernicus
Sentinel Hub API, UAV-mounted sensors, and AIS data streams. Data preprocessing is performed
through the QueryOptima™ platform for harmonization and normalization. A Blue Brain
Nexus-powered knowledge graph manages semantic data storage and interoperability. The analytics layer
integrates YOLOv11 object detection with graph-based reasoning, while the visualization
component provides real-time dashboards and geospatial interfaces for end-user interaction.</p>
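        <p>The five-component pipeline described above can be sketched as a chain of composable stages. The following is a minimal illustrative sketch only; the stage names, payload fields, and sample values are hypothetical placeholders, not the platform's actual code.</p>

```python
from dataclasses import dataclass, field
from typing import Any, Callable

# Illustrative five-stage pipeline mirroring the described architecture:
# ingestion -> preprocessing -> (knowledge-graph storage) -> analytics -> visualization.
@dataclass
class Pipeline:
    stages: list = field(default_factory=list)

    def stage(self, fn: Callable[[Any], Any]) -> Callable[[Any], Any]:
        # Register a processing stage; stages run in registration order.
        self.stages.append(fn)
        return fn

    def run(self, payload: Any) -> Any:
        for fn in self.stages:
            payload = fn(payload)
        return payload

pipeline = Pipeline()

@pipeline.stage
def ingest(sources):
    # Aggregate heterogeneous inputs (e.g. Sentinel Hub, UAV sensors, AIS streams).
    return {"frames": sources}

@pipeline.stage
def preprocess(data):
    # Harmonization/normalization (handled by QueryOptima in the real system).
    data["normalized"] = True
    return data

@pipeline.stage
def analyze(data):
    # Object detection and graph-based reasoning would run here; value is made up.
    data["detections"] = [{"cls": "ship", "conf": 0.91}]
    return data

result = pipeline.run(["sentinel", "uav", "ais"])
```

        <p>In this sketch each stage is a pure function over a shared payload, which keeps the modular design testable: any stage (e.g. the detector) can be swapped without touching the others.</p>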
      </sec>
      <sec id="sec-3-2">
        <title>3.2. UAV-based object detection system</title>
        <p>A Minimum Viable Product (MVP) of the system was developed and tested during the Case Study:
8th CASSINI Hackathon (November 2024) at the POLE Product Design Center, Ukraine.</p>
        <p>Figure 1: System pipeline overview. Sentinel and drone data, together with external API
sources (https://api.myshiptracking.com/), feed ship detection (using YOLO11) and pollutant
detection (using MariNeXt); results flow into hazard detection, risk assessment (predicting
high-risk areas such as piracy and pollution, and generating actionable insights for operators),
and visualization.</p>
        <p>
          The core contribution of this work centers on the development of a specialized UAV-based
maritime object detection system employing the YOLOv11 architecture. The detection model was
trained exclusively on the High-Resolution Ship Collections 2016 Multi-Scale (HRSC2016-MS)
maritime dataset [
          <xref ref-type="bibr" rid="ref14">14</xref>
          ], representing a significant enhancement over the original HRSC2016 dataset.
        </p>
        <p>Dataset Characteristics and Preprocessing. The HRSC2016-MS dataset comprises 1,680
high-resolution optical remote sensing images containing 7,655 annotated ship instances. The
dataset exhibits comprehensive environmental diversity, encompassing maritime scenes across
multiple operational conditions: sea and coastal environments, diurnal and nocturnal imaging
scenarios, and varied meteorological conditions including clear and cloudy weather patterns. The
dataset’s multi-scale nature provides images with varying resolutions and aspect ratios, essential
for training robust detection algorithms capable of identifying vessels across different scales and
perspectives.</p>
        <p>YOLOv11 Model Training and Fine-tuning. The YOLOv11 object detection architecture was
selected for its demonstrated superiority in real-time object detection tasks and computational
efficiency suitable for UAV deployment scenarios. The model was trained end-to-end on the
HRSC2016-MS dataset, which was split into 60% for training, 20% for testing and 20% for validation,
using transfer learning from pre-trained weights, with specific hyperparameter optimization for
maritime object detection. Fine-tuning procedures incorporated UAV-specific imagery to enhance
detection performance in coastal operational environments. The training process employed data
augmentation techniques including geometric transformations, photometric adjustments, and
multi-scale training to improve model generalization across diverse maritime conditions. Model
convergence was monitored through validation metrics including precision, recall, and mean
Average Precision (mAP) at multiple Intersection over Union (IoU) thresholds.</p>
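        <p>The 60/20/20 train/test/validation split described above can be sketched as follows for the 1,680 HRSC2016-MS images. This is an assumed sketch: the file names, seed, and shuffling strategy are illustrative, not the actual preprocessing code.</p>

```python
import random

# Placeholder file names standing in for the 1,680 HRSC2016-MS images.
images = [f"img_{i:04d}.png" for i in range(1680)]

rng = random.Random(42)  # fixed seed so the split is reproducible
rng.shuffle(images)

n_train = int(0.6 * len(images))  # 60% -> 1008 images
n_test = int(0.2 * len(images))   # 20% -> 336 images
train = images[:n_train]
test = images[n_train:n_train + n_test]
val = images[n_train + n_test:]   # remaining 20% -> 336 images
```

        <p>Shuffling before slicing avoids ordering bias (e.g. all nocturnal scenes landing in one split), which matters for a dataset with this much environmental diversity.</p>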
      </sec>
      <sec id="sec-3-3">
        <title>3.3. Integration framework</title>
        <p>While this paper focuses on the UAV detection component, the broader system integrates with
semantic knowledge representation through Blue Brain Nexus, enabling cross-modal data fusion
and reasoning capabilities. The knowledge graph architecture facilitates semantic interoperability
between UAV detection results, satellite imagery analysis, and AIS tracking data, supporting
comprehensive maritime situational awareness and risk assessment applications.</p>
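        <p>To illustrate the cross-modal linking, a single UAV detection could be represented as a JSON-LD resource of the kind stored in a knowledge graph such as Blue Brain Nexus. The vocabulary, field names, and identifiers below are illustrative assumptions, not the platform's actual schema.</p>

```python
import json

# Hedged sketch of a detection as a JSON-LD resource; every term here is assumed.
detection_resource = {
    "@context": {"@vocab": "https://schema.org/"},
    "@type": "Observation",
    "detectedClass": "ship",
    "confidence": 0.91,
    "source": "uav",  # cross-modal counterparts could carry "satellite" or "ais"
    "geo": {"@type": "GeoCoordinates", "latitude": 49.84, "longitude": 24.03},
    "linkedAisTrack": "urn:ais:mmsi:000000000",  # placeholder identifier
}

# Serialize as it would be submitted to a resource store over HTTP.
payload = json.dumps(detection_resource, indent=2)
```

        <p>Because each modality's results share the same vocabulary and geospatial anchor, a graph query can join a UAV detection with the AIS track and satellite scene covering the same coordinates, which is the semantic interoperability the section describes.</p>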
      </sec>
    </sec>
    <sec id="sec-4">
      <title>4. Results</title>
      <p>The system was validated through three primary use cases: (1) detection and classification of oil
spills, debris, and environmental anomalies using MariNeXt; (2) ship detection using Sentinel-1
SAR imagery (trained on the SAR-Ship-Dataset, https://github.com/CAESAR-Radi/SAR-Ship-Dataset), split into 60% for training, 20% for testing, and 20% for
validation that was cross-validated with UAV imagery; and (3) prediction of vessel routes with
anomaly detection using AIS data.</p>
      <p>The comparison highlights that the YOLO11 model, fine-tuned with UAV imagery,
demonstrates reliable ship detection in coastal environments. While some false positives occur, the
overall bounding box alignment with labeled data confirms high precision (Figure 2).</p>
      <p>Figure 2: Comparison of labeled (a) and predicted (b) ships.</p>
      <p>Figure 3 illustrates the evolution of validation metrics during model training. The initial
drop in precision is an artifact of the learning rate scheduler and optimizer state at the beginning of
the training. Precision rapidly stabilizes around 0.9, while Recall converges near 0.75. The mean
Average Precision at Intersection over Union threshold 0.5 reaches approximately 0.85, and the
stricter metric mAP@[0.5:0.95] converges to 0.66. These results indicate consistent model
improvement across epochs and confirm robust generalization for maritime object detection.</p>
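      <p>The mAP@0.5 and mAP@[0.5:0.95] metrics reported above both rest on the Intersection over Union between predicted and ground-truth boxes. A minimal sketch of that computation, with made-up box coordinates for illustration:</p>

```python
# IoU for axis-aligned boxes given as (x1, y1, x2, y2).
def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

# A prediction counts as a true positive at threshold t only if iou >= t,
# so a box that passes at IoU 0.5 may still fail at the stricter thresholds
# averaged in mAP@[0.5:0.95] -- which is why that metric converges lower (0.66).
pred, gt = (10, 10, 50, 50), (12, 12, 52, 52)
score = iou(pred, gt)  # roughly 0.82: accepted at 0.5, rejected at 0.95
```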
      <p>All losses decrease steadily, with the most significant reduction observed during the first 20
epochs, after which the curves gradually converge and stabilize. The absence of divergence
between training and validation losses indicates that the model does not suffer from overfitting
and generalizes well to unseen data (Figure 4).</p>
      <p>Therefore, the main results of the information technology validation can be summarized as
follows:</p>
      <p>Detection Accuracy: The fine-tuned YOLO11 model achieved a precision of 90% and a
recall of 75%, demonstrating robust performance in identifying maritime objects with
UAV-enhanced data.</p>
      <p>Processing Latency: The analytics pipeline maintained an average processing time of
under 5 seconds per frame, ensuring near-real-time performance suitable for time-critical
applications.</p>
      <p>Data Integration: The platform successfully integrated multimodal data sources (satellite,
UAV, and AIS) into a unified knowledge graph using Blue Brain Nexus, enabling effective
semantic linking and querying.</p>
      <p>The MVP was awarded third place at the national level of the 8th CASSINI Hackathon
competition.</p>
    </sec>
    <sec id="sec-5">
      <title>5. Discussion</title>
      <p>The validation of the proposed information technology confirmed its strengths, including the
provision of analytics with low latency, semantic interoperability through a knowledge graph, and
a scalable pipeline applicable to multiple domains such as maritime, coastal, and disaster response.
However, a few challenges were identified. These include limitations in data access, such as
restricted AIS feeds and cloud dependency, the high computational demand of near-real-time
pipelines, and the limited availability of labeled datasets for training YOLO11 on maritime hazards.
Future work should focus on expanding the platform’s capabilities beyond maritime safety to
address other disaster types such as floods and wildfires. Important future developments also
include integration with European Data Spaces initiatives and ensuring compliance with the FAIR
data principles (https://www.go-fair.org/fair-principles/) for open science.</p>
    </sec>
    <sec id="sec-6">
      <title>6. Conclusion</title>
      <p>The proposed information technology demonstrates the feasibility of an AI-driven multimodal data
fusion system for maritime safety and environmental monitoring. By combining satellite, UAV, and
AIS data within a knowledge graph architecture, the system provides near-real-time hazardous
object detection and decision support. The fine-tuned YOLO11 model achieved a precision of 90%
and a recall of 75%, with a mAP@[0.5:0.95] of 0.66, demonstrating robust performance. Data
owners will benefit from the developed information technology by maximizing the value of their
data assets to provide high-quality insights for maritime security.</p>
      <p>The ship detection model trained on Sentinel-1 SAR data performs exceptionally well, achieving 93%
precision and 92.5% recall. This demonstrates high accuracy in identifying ships with minimal false
positives, which is critical for cross-validation with third-party AIS API data.</p>
      <p>The results from the CASSINI Hackathon validate its effectiveness and highlight the potential of
such approaches in broader applications, including emergency response, sustainable maritime
operations, and environmental protection.</p>
    </sec>
    <sec id="sec-7">
      <title>Acknowledgements</title>
      <p>This work is based on the DroneSight project, which was awarded third place in the Ukrainian
national selection of the 8th CASSINI Hackathon: EU Space for Defence and Security
(https://taikai.network/cassinihackathons/hackathons/euspace-defence-security/). The authors
gratefully acknowledge the CASSINI Hackathon, an initiative of the European Union, for providing
the framework and premium access to satellite imagery that were instrumental to this research.
Rework-Space LLC acknowledges support from the European Innovation Council through the
Seeds of Bravery (UASEEDs) project.</p>
    </sec>
    <sec id="sec-8">
      <title>Declaration on Generative AI</title>
      <p>During the preparation of this work, the authors used GPT-4 and Grammarly to check grammar
and spelling. After using these tools/services, the authors reviewed and edited the content as
needed and take full responsibility for the publication’s content.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>R.</given-names>
            <surname>Khurana</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Mugabe</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X. L.</given-names>
            <surname>Etienne</surname>
          </string-name>
          ,
          <article-title>Climate change, natural disasters, and institutional integrity</article-title>
          ,
          <source>World Development</source>
          <volume>157</volume>
          (
          <year>2022</year>
          ). doi:10.1016/j.worlddev.2022.105931.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>I.</given-names>
            <surname>Chandran</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Vipin</surname>
          </string-name>
          ,
          <article-title>Multi-UAV networks for disaster monitoring: challenges and opportunities from a network perspective</article-title>
          ,
          <source>Drone Systems and Applications</source>
          <volume>12</volume>
          (
          <year>2024</year>
          )
          <fpage>1</fpage>
          -
          <lpage>28</lpage>
          . doi:10.1139/dsa-2023-0079.
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>J.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Zhou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Xing</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Yang</surname>
          </string-name>
          ,
          <article-title>Applications, evolutions, and challenges of drones in maritime transport</article-title>
          ,
          <source>Journal of Marine Science and Engineering</source>
          <volume>11</volume>
          (
          <year>2023</year>
          ). doi:10.3390/jmse11112056.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>S. P. H.</given-names>
            <surname>Boroujeni</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Razi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Khoshdel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Afghah</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. L.</given-names>
            <surname>Coen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>O'Neill</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Fule</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Watts</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.-M. T.</given-names>
            <surname>Kokolakis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. G.</given-names>
            <surname>Vamvoudakis</surname>
          </string-name>
          ,
          <article-title>A comprehensive survey of research towards AI-enabled unmanned aerial systems in pre-, active-, and post-wildfire management</article-title>
          ,
          <source>Information Fusion</source>
          <volume>108</volume>
          (
          <year>2024</year>
          ). doi:10.1016/j.inffus.2024.102369.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>A.</given-names>
            <surname>Bazrafkan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Igathinathane</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Bandillo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Flores</surname>
          </string-name>
          ,
          <article-title>Optimizing integration techniques for UAS and satellite image data in precision agriculture - a review</article-title>
          ,
          <source>Frontiers in Remote Sensing</source>
          <volume>6</volume>
          (
          <year>2025</year>
          ). doi:10.3389/frsen.2025.1622884.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>R.</given-names>
            <surname>Adler-Nissen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K. A.</given-names>
            <surname>Eggeling</surname>
          </string-name>
          ,
          <article-title>The discursive struggle for digital sovereignty: Security, economy, rights and the cloud project Gaia-X</article-title>
          ,
          <source>JCMS: Journal of Common Market Studies</source>
          <volume>62</volume>
          (
          <year>2024</year>
          )
          <fpage>993</fpage>
          -
          <lpage>1011</lpage>
          . doi:10.1111/jcms.13594.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>I.</given-names>
            <surname>Konovalenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Maruschak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Brevus</surname>
          </string-name>
          ,
          <article-title>Steel surface defect detection using an ensemble of deep residual neural networks</article-title>
          ,
          <source>Journal of Computing and Information Science in Engineering</source>
          <volume>22</volume>
          (
          <year>2021</year>
          ). doi:10.1115/1.4051435.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>I.</given-names>
            <surname>Konovalenko</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Maruschak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>V.</given-names>
            <surname>Brevus</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.</given-names>
            <surname>Prentkovskis</surname>
          </string-name>
          ,
          <article-title>Recognition of scratches and abrasions on metal surfaces using a classifier based on a convolutional neural network</article-title>
          ,
          <source>Metals</source>
          <volume>11</volume>
          (
          <year>2021</year>
          ). doi:10.3390/met11040549.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>L.-h.</given-names>
            <surname>He</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-z.</given-names>
            <surname>Zhou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Cao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.-h.</given-names>
            <surname>Ma</surname>
          </string-name>
          ,
          <article-title>Research on object detection and recognition in remote sensing images based on YOLOv11</article-title>
          ,
          <source>Scientific Reports</source>
          <volume>15</volume>
          (
          <year>2025</year>
          ). doi:10.1038/s41598-025-96314-x.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>Y.</given-names>
            <surname>Li</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>El Habib Daho</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.-H.</given-names>
            <surname>Conze</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Zeghlache</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Le Boité</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Tadayoni</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Cochener</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Lamard</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Quellec</surname>
          </string-name>
          ,
          <article-title>A review of deep learning-based information fusion techniques for multimodal medical image classification</article-title>
          ,
          <source>Computers in Biology and Medicine</source>
          <volume>177</volume>
          (
          <year>2024</year>
          ). doi:10.1016/j.compbiomed.2024.108635.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <given-names>M. F.</given-names>
            <surname>Sy</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Roman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Kerrien</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. M.</given-names>
            <surname>Mendez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Genet</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Wajerowicz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Dupont</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Lavriushev</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Machon</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Pirman</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D. Neela</given-names>
            <surname>Mana</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Stafeeva</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.-K.</given-names>
            <surname>Kaufmann</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Lu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Lurie</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.-A.</given-names>
            <surname>Fonta</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. G. R.</given-names>
            <surname>Martinez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. D.</given-names>
            <surname>Ulbrich</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Lindqvist</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Jimenez</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Rotenberg</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Markram</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S. L.</given-names>
            <surname>Hill</surname>
          </string-name>
          ,
          <article-title>Blue brain nexus: An open, secure, scalable system for knowledge graph management and data-driven science</article-title>
          ,
          <source>Semantic Web</source>
          <volume>14</volume>
          (
          <year>2023</year>
          )
          <fpage>697</fpage>
          -
          <lpage>727</lpage>
          . doi:10.3233/SW-222974.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <given-names>B.</given-names>
            <surname>Zhao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Zhao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Song</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.</given-names>
            <surname>Yu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Zhang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <article-title>Enhanced YOLO11 for lightweight and accurate drone-based maritime search and rescue object detection</article-title>
          ,
          <source>PLOS ONE</source>
          <volume>20</volume>
          (
          <year>2025</year>
          )
          <fpage>1</fpage>
          -
          <lpage>24</lpage>
          . doi:10.1371/journal.pone.0321920.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <given-names>K.</given-names>
            <surname>Kikaki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Kakogeorgiou</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.</given-names>
            <surname>Hoteit</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Karantzalos</surname>
          </string-name>
          ,
          <article-title>Detecting marine pollutants and sea surface features with deep learning in Sentinel-2 imagery</article-title>
          ,
          <source>ISPRS Journal of Photogrammetry and Remote Sensing</source>
          <volume>210</volume>
          (
          <year>2024</year>
          )
          <fpage>39</fpage>
          -
          <lpage>54</lpage>
          . doi:10.1016/j.isprsjprs.2024.02.017.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <given-names>W.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Han</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z.</given-names>
            <surname>Yang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.</given-names>
            <surname>Gao</surname>
          </string-name>
          ,
          <article-title>MSSDet: Multi-scale ship-detection framework in optical remote-sensing images and new benchmark</article-title>
          ,
          <source>Remote Sensing</source>
          <volume>14</volume>
          (
          <year>2022</year>
          ). URL: https://www.mdpi.com/2072-4292/14/21/5460. doi:10.3390/rs14215460.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>