<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>Cross-Reality (XR) Interaction, ACM ISS 2020 (International Workshop on XR Interaction 2020)</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Outdoors Mobile Augmented Reality for Coastal Erosion Visualization Based on Geographical Data</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Minas Katsiokalis</string-name>
          <email>mkatsiokalis@isc.tuc.gr</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Lemonia Ragia</string-name>
          <email>lemonia.ragia@gmail.com</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Katerina Mania</string-name>
          <email>amania@isc.tuc.gr</email>
          <xref ref-type="aff" rid="aff2">2</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Athena Research and Innovation Center in Information, Communication and Knowledge Technologies</institution>
        </aff>
        <aff id="aff2">
          <label>2</label>
          <institution>Technical University of Crete</institution>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2020</year>
      </pub-date>
      <volume>8</volume>
      <issue>2020</issue>
      <abstract>
        <p>This paper presents a Mobile Augmented Reality (MAR) system for coastal erosion visualization based on geographical data. The system is demonstrated at the beach of Georgioupoli in Chania, Crete, Greece, in challenging, sunny, outdoor conditions. The main focus of this work is the 3D on-site visualization of the future state of the beach as the shoreline inevitably progresses inland under the impact of severe coastal erosion, a phenomenon taking place across the Mediterranean Sea and worldwide. We feature two future scenarios at three locations on the beach. A 3D sea segment is matched to the user's actual position. The visualization, as seen through a smartphone's screen, presents a seamless view of the 3D sea segment joined with the real-world edge of the sea, achieving accurate registration of the 3D segment with the real world. Position tracking is performed by utilizing the phone's GPS and the computer vision capabilities of the presented AR framework. A location-aware experience ensures that 3D rendering is space-aware and timely according to the user's position at the coast. By combining AR technologies with geo-spatial data, we aim to motivate public awareness and action in relation to critical environmental phenomena such as coastal erosion.</p>
      </abstract>
      <kwd-group>
        <kwd>Augmented Reality</kwd>
        <kwd>Mobile AR</kwd>
        <kwd>Coastal Erosion</kwd>
        <kwd>Outdoors AR</kwd>
        <kwd>Mobile Application</kwd>
        <kwd>Landscape Visualization</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>CCS CONCEPTS</title>
      <p>• Computing methodologies → Mixed / augmented reality; • Applied computing → Interactive learning environments; • Applied computing → Environmental sciences.</p>
    </sec>
    <sec id="sec-2">
      <title>CROSS-REALITY INTERACTION</title>
      <p>Reality and Virtuality are the two ends of a continuum: one describes the physical, real world and the other a non-physical, virtual environment. Cross-Reality is the bridging of those two worlds and stands between the two ends. In the presented work, we introduce a cross-reality interface where users interact with a virtual environment through their smartphones while they are located at the real site of that environment and witness the scenery change. The virtual world combines with the real one to provide a cross-reality experience in which the user is able to view the future of a specific coastal area using Augmented Reality technology. The virtual content presented alongside the real scenery depicts the future state of the coast and visualizes the coastline changes on top of the physical world. The virtual environment visually enhances the real world in real time, while the user can interact with both of them. Providing virtual content without extinguishing the real-world factor is essential in cross-reality interaction.</p>
      <sec id="sec-2-1">
        <title>INTRODUCTION</title>
        <p>
          Mobile Augmented Reality (MAR) is an open research area due to
the emergence and widespread uptake of smartphones that provide
powerful platforms for supporting Augmented Reality on a mobile
device. Despite the wider applications of MAR in areas such as
cultural heritage [
          <xref ref-type="bibr" rid="ref8">8</xref>
          ], [
          <xref ref-type="bibr" rid="ref6">6</xref>
          ], [
          <xref ref-type="bibr" rid="ref2">2</xref>
          ]
and shopping [
          <xref ref-type="bibr" rid="ref4">4</xref>
          ], MAR systems in environmental contexts are still
rare because of the technical challenges of outdoor AR and
the need for reliable environmental and geo-located data.
Standard 3D simulation has been employed in the past to
highlight environmental issues such as the impact of tsunami waves [
          <xref ref-type="bibr" rid="ref5">5</xref>
          ].
However, on-location MAR technology could raise environmental
awareness and provoke environmental action more effectively than
media such as radio, maps and handheld displays [
          <xref ref-type="bibr" rid="ref3">3</xref>
          ], [
          <xref ref-type="bibr" rid="ref1">1</xref>
          ]. Landscape
visualization can be particularly effective when communicating
future changes to community groups and policymakers [
          <xref ref-type="bibr" rid="ref10">10</xref>
          ]. The
visualization of potential environmental changes is a powerful tool
for public understanding. MAR can offer the ability to experience
future changes of the environment as if happening now, with the
potential to provoke shock and even disbelief.
        </p>
        <p>The phenomenon this paper focuses on is the erosion of the coastal zone and the dramatic changes of the coastline. Across the Mediterranean Sea, coastal erosion will increase, provoking disastrous outcomes for coastal regions. The coastline is the physical line where the land meets the sea. Nowadays, coastline extraction and the tracking of its changes have become highly important because of global warming and the rapid growth of population. Our goal was to visualize the shoreline at a specific spot in Crete, Greece, in its near-future state, on the basis of mathematical models of beach retreat prediction, i.e. the tendency of the beach to erode without any human corrective measures, enhancing public awareness.</p>
        <p>We propose a MAR system for the visualization of coastal
erosion on-site (Fig. 1), putting forward successful location-aware
recognition outdoors, based on geo-referenced spatial location data,
addressing MAR challenges such as occlusions, large variations in
lighting, the impossibility of modifying the environment, as well
as unpredictable weather conditions and pollution.</p>
        <p>The proposed work is based on an on-site Mobile Augmented Reality (MAR) system for the 3D on-site visualization of the future shoreline as it inevitably progresses inland, offering a seamless view of the 3D sea segment joined with the real-world edge of the sea (Fig. 1). The system is designed to operate in challenging, sunny, outdoor conditions, using a consumer handheld smartphone or tablet, supporting both Android and iOS devices. The MAR system presented consists of two main phases (Fig. 2). The first phase guides the user's navigation on the beach. Once at one of the set Points of Interest (PoIs), the second phase allows the user to experience the visualization, after a brief calibration process (steps 1-4 in Fig. 3). Over three PoIs, the MAR system visualizes two possible future scenarios of the coastline for sea level elevations of 0.5 meters and 1 meter, where the coastline is estimated to penetrate 3.6 and 7.7 meters inland, respectively (Fig. 4).</p>
      </sec>
      <sec id="sec-2-2">
        <title>IMPLEMENTATION</title>
        <p>The navigation scene was developed using Mapbox SDK for Unity3D,
suitable for building systems from real-world map data, enabling
interaction with Mapbox web services APIs (Maps, Geocoding and
Directions APIs) via a C#-based API. Having access to a device’s
GPS, an area map is loaded based on the user’s geo-location.
Points of Interest (POIs) are added, guiding the user to where the AR
visualization takes place. The Directions API provides directions from
the user's geo-location to the POI's geo-location, while a script
recalculates the distance between them every second.</p>
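        <p>The paper does not specify how this distance is computed; a standard choice for two GPS coordinates is the haversine (great-circle) formula. The following sketch is illustrative Python rather than the app's C#, and the PoI coordinates and the 20 m arrival threshold are hypothetical:</p>
        <preformat>
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) pairs in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# Hypothetical PoI near Georgioupoli beach and a user position roughly 100 m away.
poi = (35.3647, 24.2582)
user = (35.3656, 24.2582)
d = haversine_m(*user, *poi)
arrived = d &lt; 20  # e.g. switch from the navigation phase to the AR phase within 20 m
        </preformat>
        <p>Running this check every second, as the script above the sketch does, is cheap enough even on a mid-range phone.</p>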
        <p>Planar faces in the real world (the ground, walls etc.) were
recognized based on plane detection. The user’s position and orientation
in physical space were tracked by the smartphone’s motion
tracking. Then, the virtual content appeared on top of the recognized
physical world. To enable Unity3D’s AR Foundation functionality,
an ’AR Session’ component controlled the life-cycle and
configuration options for the AR session, and an ’AR Session Origin’
component represented the device. The user viewed and interacted
with the 3D scene using the GameObject that contains this
component; rotations or movements of this object represent rotation and
movement of the user in the scene. The ’AR Camera’ GameObject
represents what the user sees rendered through the camera.</p>
        <p>Attached to the ’AR Session Origin’ object were an ’AR Plane Manager’ and an ’AR Point Cloud Manager’ component. The ’AR Plane Manager’ collects the data about the scanned planar surfaces, adding each detected plane to a list and creating a GameObject for it. The ’AR Point Cloud Manager’ collects data about feature points scanned by the device, and a point cloud is created to give the app depth recognition and tracking. Interaction with the real world was achieved by ray-casting against the tracked planes. A ray is sent from the center of the ’AR Camera’; if a tracked plane is hit by it, we are able to interact with that specific plane. Here, this method tracks a plane surface (mostly the ground) and shows an indicator at the point where the ray hits the plane, updated every frame following the device’s pose. If there is no tracked plane, the indicator disappears. While the indicator is active, the ’Place Here’ button is enabled and the user places the virtual sea content in the pointed direction, aligned with the real shoreline. The virtual content is instantiated at the orientation of the indicator (steps 2-3, Fig. 3) and the virtual shoreline is placed where the indicator points. The user relocates the virtual content if the match with the real world is off by pressing the settings button (step 3 of Fig. 3). An intuitive user interface (UI) supports navigation (see Fig. 3). During step 4 of Fig. 3, three UI signs are enabled by pressing the ’info’ button at the top right of the screen. A ray is cast from the position touched on screen; if it hits a sign, a window pops up providing information about coastal erosion (Fig. 5).</p>
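        <p>AR Foundation performs this ray-cast internally against its tracked planes. As a language-agnostic illustration of the underlying geometry (a Python sketch with hypothetical names, not the app's C# code), the indicator position is a ray-plane intersection:</p>
        <preformat>
def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def ray_plane_hit(origin, direction, plane_point, plane_normal):
    """Hit point of a ray against an infinite plane, or None if parallel or behind."""
    denom = dot(plane_normal, direction)
    if abs(denom) &lt; 1e-9:
        return None  # ray parallel to the plane: no indicator shown
    diff = tuple(p - o for p, o in zip(plane_point, origin))
    t = dot(plane_normal, diff) / denom
    if t &lt; 0:
        return None  # plane is behind the camera
    return tuple(o + t * d for o, d in zip(origin, direction))

# Camera 1.5 m above a detected ground plane (y = 0), looking forward and down.
cam = (0.0, 1.5, 0.0)
fwd = (0.0, -1.0, 1.0)
hit = ray_plane_hit(cam, fwd, (0.0, 0.0, 0.0), (0.0, 1.0, 0.0))
# The indicator (and hence the virtual shoreline) is placed at the hit point.
        </preformat>
        <p>When the ray misses every tracked plane (None here), the app hides the indicator and disables the ’Place Here’ button.</p>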
        <p>
          A terrain was created based on the shape of the beach at the
specific location [
          <xref ref-type="bibr" rid="ref9">9</xref>
          ]. Using the terrain, interaction between the water
and the shoreline was modeled correctly. The aerial image used
is a real sub-scale map exported from GIS software, showing the
beach retreat (Fig. 4). The model that predicts the beach
retreat at this specific shoreline, characterized by a low slope
and sandy sediment, is Equation (1), representing the low mean of
the beach retreat prediction [
          <xref ref-type="bibr" rid="ref7">7</xref>
          ].
        </p>
        <p>R = 0.05U<sup>2</sup> + 8.12U − 0.46 (1)</p>
        <p>where U is the Sea Level Rise (SLR) in meters and R the predicted beach retreat in meters.</p>
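        <p>As a quick check of Equation (1), assuming R denotes the retreat in meters, evaluating it for the two scenarios reproduces the inland penetration figures quoted in the introduction (3.6 m and 7.7 m):</p>
        <preformat>
def beach_retreat_m(slr_m):
    """Low-mean beach retreat prediction (Equation 1): R = 0.05*U^2 + 8.12*U - 0.46."""
    return 0.05 * slr_m**2 + 8.12 * slr_m - 0.46

print(round(beach_retreat_m(0.5), 1))  # 3.6 m inland for a 0.5 m sea level rise
print(round(beach_retreat_m(1.0), 1))  # 7.7 m inland for a 1 m sea level rise
        </preformat>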
        <p>The image (Fig. 4) is geo-referenced to the Hellenic Geodetic Reference System 1987 (HGRS87) and showcases the retreat of the beach in two scenarios (SLR = 0.5 and SLR = 1). The image was imported into Unity at 1:1 scale, so 1 unit in the engine corresponds to 1 meter in the real world. For each possible scenario, an elevation layer was created, as well as one corresponding to the current state of the shoreline. A realistic-looking water shader was created and attached to a planar surface, acquiring water properties such as reflection, transparency, waviness, foam effects on collision etc. To animate the rising of the water, linear interpolation (lerping) gradually moves an object from one position to another during a time window at a given speed.</p>
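        <p>Unity exposes this as Vector3.Lerp; a minimal language-agnostic sketch of the same interpolation (Python, with a hypothetical 5-second animation window) for the water's rise is:</p>
        <preformat>
def lerp(start, end, t):
    """Linear interpolation; t is clamped to [0, 1], as in Unity's Vector3.Lerp."""
    t = max(0.0, min(1.0, t))
    return start + (end - start) * t

# Animate the water surface from its current height (0 m) to the +1 m SLR scenario
# over a 5-second window: each frame, t = elapsed_time / duration.
duration_s = 5.0
height_mid = lerp(0.0, 1.0, 2.5 / duration_s)  # halfway through the window: 0.5 m
height_end = lerp(0.0, 1.0, 7.0 / duration_s)  # past the window: clamped at 1.0 m
        </preformat>
        <p>The clamp keeps the water fixed at the scenario height once the time window has elapsed, instead of overshooting.</p>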
      </sec>
      <sec id="sec-2-3">
        <title>EVALUATION AND CONCLUSIONS</title>
        <p>We received feedback about the functionality of the application and its usefulness, involving users at the beach, using the think-aloud usability evaluation methodology. The users involved were either non-experts or experts in AR technology. The non-expert users were fascinated, while the experts focused on the functionality. Certain users mentioned that they would prefer an AR head-mounted experience, while others had no problem with the use of the smartphone. Initially, users found it challenging to accurately place the virtual content depending on the location, mostly due to the morphology of the area. After training, they easily used the app and navigated around. We received enthusiastic feedback concerning the photorealistic 3D water and its seamless integration with the real-world shoreline. The signs and the UI in general were simple and intuitive, communicating the impact of coastal erosion. Certain users would have preferred more visible colours for the UI.</p>
        <p>The application was hard to use during bad weather (clouds, wind etc.). When the sea was wavy, it was hard to accurately anchor the 3D content, as it drifted in the scene. Tracking and registration in AR are far from solved. Future work could automate shoreline detection, exempting the user from the calibration process, and add more scenarios and locations. Lighting of the AR digital content can also be improved for a stronger feeling of depth and better photorealism.</p>
        <p>Concluding, we showcased the design of a mobile Augmented Reality application aimed at consumer-grade mobile phones, with the ultimate goal of increasing the environmental awareness of the public. By employing AR, we enhance user awareness of coastal erosion, bridging the gap between reality and virtuality through widely available XR technologies.</p>
      </sec>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>Majed</given-names>
            <surname>Abdullah</surname>
          </string-name>
          Alrowaily and
          <string-name>
            <given-names>Manolya</given-names>
            <surname>Kavakli</surname>
          </string-name>
          .
          <year>2018</year>
          .
          <article-title>Mobile Augmented Reality for Environmental Awareness: A Technology Acceptance Study</article-title>
          .
          <source>In Proceedings of the 2018 10th International Conference on Computer and Automation Engineering (Brisbane, Australia) (ICCAE</source>
          <year>2018</year>
          ).
          <article-title>Association for Computing Machinery</article-title>
          , New York, NY, USA,
          <fpage>36</fpage>
          -
          <lpage>43</lpage>
          . https://doi.org/10.1145/3192975.3193002
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>Silvia</given-names>
            <surname>Blanco-Pons</surname>
          </string-name>
          , Berta Carrión-Ruiz, Michelle Duong, Joshua Chartrand, Stephen Fai, and
          <string-name>
            <given-names>José</given-names>
            <surname>Lerma</surname>
          </string-name>
          .
          <year>2019</year>
          .
          <article-title>Augmented Reality Markerless Multi-Image Outdoor Tracking System for the Historical Buildings on Parliament Hill</article-title>
          .
          <source>Sustainability</source>
          <volume>11</volume>
          (08
          <year>2019</year>
          ),
          <volume>4268</volume>
          . https://doi.org/10.3390/su11164268
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Te-Lien Chou</surname>
          </string-name>
          and
          <string-name>
            <surname>Lih-Juan ChanLin</surname>
          </string-name>
          .
          <year>2012</year>
          .
          <article-title>Augmented Reality Smartphone Environment Orientation Application: A Case Study of the Fu-Jen University Mobile Campus Touring System</article-title>
          .
          <source>Procedia - Social and Behavioral Sciences</source>
          <volume>46</volume>
          (
          <year>2012</year>
          ),
          <fpage>410</fpage>
          -
          <lpage>416</lpage>
          . https://doi.org/10.1016/j.sbspro.
          <source>2012.05.132 4th WORLD CONFERENCE ON EDUCATIONAL SCIENCES (WCES-</source>
          <year>2012</year>
          )
          <fpage>02</fpage>
          -
          <lpage>05</lpage>
          February 2012 Barcelona, Spain.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>Scott G.</given-names>
            <surname>Dacko</surname>
          </string-name>
          .
          <year>2017</year>
          .
          <article-title>Enabling smart retail settings via mobile augmented reality shopping apps</article-title>
          .
          <source>Technological Forecasting and Social Change</source>
          <volume>124</volume>
          (
          <year>2017</year>
          ),
          <fpage>243</fpage>
          -
          <lpage>256</lpage>
          . https://doi.org/10.1016/j.techfore.
          <year>2016</year>
          .
          <volume>09</volume>
          .032
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>Alexandros</given-names>
            <surname>Giannakidis</surname>
          </string-name>
          , Giannis Giakoumidakis, and
          <string-name>
            <given-names>Katerina</given-names>
            <surname>Mania</surname>
          </string-name>
          .
          <year>2014</year>
          .
          <article-title>3D photorealistic scientific visualization of tsunami waves and sea level rise</article-title>
          .
          <source>In 2014 IEEE International Conference on Imaging Systems and Techniques (IST) Proceedings. IEEE</source>
          ,
          <fpage>167</fpage>
          -
          <lpage>172</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>A.</given-names>
            <surname>Haugstvedt</surname>
          </string-name>
          and
          <string-name>
            <given-names>J.</given-names>
            <surname>Krogstie</surname>
          </string-name>
          .
          <year>2012</year>
          .
          <article-title>Mobile augmented reality for cultural heritage: A technology acceptance study</article-title>
          .
          <source>In 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)</source>
          .
          <volume>247</volume>
          -
          <fpage>255</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>I.N.</given-names>
            <surname>Monioudi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Karditsa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Chatzipavlis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Alexandrakis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>O.P.</given-names>
            <surname>Andreadis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.F.</given-names>
            <surname>Velegrakis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.E.</given-names>
            <surname>Poulos</surname>
          </string-name>
          , G. Ghionis,
          <string-name>
            <given-names>S.</given-names>
            <surname>Petrakis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Sifnioti</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Hasiotis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Lipakis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Kampanis</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Karambas</surname>
          </string-name>
          , and
          <string-name>
            <given-names>E.</given-names>
            <surname>Marinos</surname>
          </string-name>
          .
          <year>2014</year>
          .
          <article-title>Assessment of vulnerability of the eastern Cretan beaches (Greece) to sea level rise</article-title>
          .
          <source>Regional Environmental Change</source>
          (
          <year>2014</year>
          . https://doi.org/10.1007/s10113-014-0730-9
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>Chris</given-names>
            <surname>Panou</surname>
          </string-name>
          , Lemonia Ragia, Despoina Dimelli, and
          <string-name>
            <given-names>Katerina</given-names>
            <surname>Mania</surname>
          </string-name>
          .
          <year>2018</year>
          .
          <article-title>An architecture for mobile outdoors augmented reality for cultural heritage</article-title>
          .
          <source>ISPRS International Journal of Geo-Information</source>
          <volume>7</volume>
          ,
          <issue>12</issue>
          (
          <year>2018</year>
          ),
          <fpage>463</fpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>Lemonia</given-names>
            <surname>Ragia</surname>
          </string-name>
          and
          <string-name>
            <given-names>Pavlos</given-names>
            <surname>Krassakis</surname>
          </string-name>
          .
          <year>2019</year>
          .
          <article-title>Monitoring the changes of the coastal areas using remote sensing data and geographic information systems</article-title>
          .
          <source>In Seventh International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2019)</source>
          , Vol.
          <volume>11174</volume>
          . International Society for Optics and Photonics, SPIE,
          <fpage>289</fpage>
          -
          <lpage>297</lpage>
          . https://doi.org/10.1117/12.2533659
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>Stephen</given-names>
            <surname>Sheppard</surname>
          </string-name>
          .
          <year>2005</year>
          .
          <article-title>Landscape visualisation and climate change: The potential for influencing perceptions and behaviour</article-title>
          .
          <source>Environmental Science Policy</source>
          <volume>8</volume>
          (
          <issue>12</issue>
          <year>2005</year>
          ),
          <fpage>637</fpage>
          -
          <lpage>654</lpage>
          . https://doi.org/10.1016/j.envsci.
          <year>2005</year>
          .
          <volume>08</volume>
          .002
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>