<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">Outdoors Mobile Augmented Reality for Coastal Erosion Visualization Based on Geographical Data</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author role="corresp">
							<persName><forename type="first">Minas</forename><surname>Katsiokalis</surname></persName>
							<email>mkatsiokalis@isc.tuc.gr</email>
							<affiliation key="aff0">
								<orgName type="institution">Technical University of Crete</orgName>
							</affiliation>
							<affiliation key="aff1">
								<orgName type="department" key="dep1">Athena Research Innovation Center</orgName>
								<orgName type="department" key="dep2">Information Communication</orgName>
							</affiliation>
						</author>
						<title level="a" type="main">Outdoors Mobile Augmented Reality for Coastal Erosion Visualization Based on Geographical Data</title>
					</analytic>
					<monogr>
						<imprint>
							<date/>
						</imprint>
					</monogr>
					<idno type="MD5">3124154A4A51670AC2B80D7D6B27C42F</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2023-03-24T03:41+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Augmented Reality</term>
					<term>Mobile AR</term>
					<term>Coastal Erosion</term>
					<term>Outdoors AR</term>
					<term>Mobile Application</term>
					<term>Landscape Visualization</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>This paper presents a Mobile Augmented Reality (MAR) system for coastal erosion visualization based on geographical data. The system is demonstrated at the beach of Georgioupoli in Chania, Crete, Greece, in challenging, sunny, outdoor conditions. The main focus of this work is the 3D on-site visualization of the future state of the beach when the shoreline inevitably progresses inland under the impact of the severe coastal erosion taking place across the Mediterranean Sea and worldwide. We feature two future scenarios at three locations on the beach. A 3D sea segment is matched to the user's actual position. The visualization, as seen through a smartphone's screen, presents an unprecedented seamless view of the 3D sea segment joined with the real-world edge of the sea, achieving accurate registration of the 3D segment with the real world. Position tracking is performed by utilizing the phone's GPS and the computer vision capabilities of the presented AR framework. A location-aware experience ensures that 3D rendering is space-aware and timely according to the user's position at the coast. By combining AR technologies with geo-spatial data, we aim to motivate public awareness and action in relation to critical environmental phenomena such as coastal erosion.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>CCS CONCEPTS</head><p>• Computing methodologies → Mixed / augmented reality; • Applied computing → Interactive learning environments; Environmental sciences.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>CROSS-REALITY INTERACTION</head><p>Reality and Virtuality are the two edges of a continuum: one describes the physical, real world, the other a non-physical, virtual environment. Cross-Reality is the bridging of those two worlds and stands between the two edges. In the presented work, we introduce a cross-reality interface where users interact with a virtual environment through their smartphones while they are present at the real location of that environment and witness the scenery change. The virtual world combines with the real one to provide a cross-reality experience in which the user can view the future of a specific coastal area using Augmented Reality technology. The virtual content presented alongside the real scenery depicts the future state of the coast and visualizes the coastline changes on top of the physical world. The virtual environment visually enhances the real world in real time, while the user can interact with both. Providing virtual content without extinguishing the real-world factor is essential to cross-reality interaction.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1">INTRODUCTION</head><p>Mobile Augmented Reality (MAR) is an open research area due to the emergence and widespread uptake of smartphones, which provide powerful platforms for supporting Augmented Reality on a mobile device. Despite the wider application of MAR in areas such as cultural heritage <ref type="bibr" target="#b7">[8]</ref>, <ref type="bibr" target="#b5">[6]</ref>, <ref type="bibr" target="#b1">[2]</ref> and shopping <ref type="bibr" target="#b3">[4]</ref>, MAR systems in an environmental context are still rare because of the technical challenges of outdoor AR and the need for reliable environmental and geo-located data. Standard 3D simulation has been employed in the past to highlight environmental issues such as the impact of tsunami waves <ref type="bibr" target="#b4">[5]</ref>. However, on-location MAR technology could raise environmental awareness and provoke environmental action more effectively than methods such as radios, maps and handheld displays <ref type="bibr" target="#b2">[3]</ref>, <ref type="bibr" target="#b0">[1]</ref>. Landscape visualization can be particularly effective when communicating future changes to community groups and policymakers <ref type="bibr" target="#b9">[10]</ref>. The visualization of potential environmental changes is a powerful tool for public understanding. MAR can offer the ability to experience future changes of the environment as if happening now, with the potential to provoke shock and even disbelief.</p><p>The phenomenon this paper focuses on is the erosion of the coastal zone and the tremendous changes of the coastline. Across the Mediterranean Sea, coastal erosion will increase, provoking disastrous outcomes for the affected regions. The coastline is the physical line where the land meets the sea. 
Nowadays, coastline extraction and the tracking of its changes have become highly important because of global warming and rapid population growth. Our goal was to visualize the shoreline at a specific spot in Crete, Greece, in its near-future state, on the basis of mathematical models of beach retreat prediction, i.e. the tendency of the beach to erode without any human corrective measures, thus enhancing public awareness.</p><p>We propose a MAR system for the on-site visualization of coastal erosion (Fig. <ref type="figure">1</ref>), putting forward successful location-aware recognition outdoors based on geo-referenced spatial location data, and addressing MAR challenges such as occlusions, large variations in lighting, the impossibility of modifying the environment, as well as unpredictable weather conditions and pollution.</p><p>The proposed work is based on an on-site Mobile Augmented Reality (MAR) system for the 3D on-site visualization of the future shoreline when it inevitably progresses inland, offering a seamless view of the 3D sea segment joined with the real-world edge of the sea (Fig. <ref type="figure">1</ref>). The system is designed to operate in challenging, sunny outdoor conditions on a consumer handheld smartphone or tablet, supporting both Android and iOS devices. The MAR system presented consists of two main phases (Fig. <ref type="figure" target="#fig_1">2</ref>). The first phase guides the user's navigation on the beach. Once at one of the set Points of Interest (PoIs), the second phase allows the user to experience the visualization, after a brief calibration process, as shown in steps 1-4 of Fig. <ref type="figure" target="#fig_2">3</ref>. At three PoIs, the MAR system visualizes two possible future scenarios of the coastline for sea level elevation by 0.5 meters and 1 meter, where the coastline is estimated to penetrate 3.6 and 7.7 meters inland, respectively (Fig. <ref type="figure" target="#fig_3">4</ref>). </p></div>
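The first phase above reduces to a periodic geodesic distance check between the user's GPS fix and each PoI. A minimal sketch of that check, assuming a haversine great-circle distance and an illustrative 10 m arrival radius (the PoI coordinates below are hypothetical placeholders, not values from the paper):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_poi(user, pois, arrival_radius_m=10.0):
    """Return (poi_id, distance_m, arrived) for the PoI closest to the user."""
    poi_id, dist = min(
        ((pid, haversine_m(user[0], user[1], lat, lon)) for pid, (lat, lon) in pois.items()),
        key=lambda item: item[1],
    )
    return poi_id, dist, dist <= arrival_radius_m

# Hypothetical PoIs near Georgioupoli beach; the app would poll this once per second.
pois = {"poi_1": (35.3620, 24.2580), "poi_2": (35.3628, 24.2595), "poi_3": (35.3636, 24.2610)}
print(nearest_poi((35.3621, 24.2581), pois))
```

Once `arrived` becomes true, the app would switch from the navigation scene to the AR visualization phase.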
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2">IMPLEMENTATION</head><p>The navigation scene was developed using the Mapbox SDK for Unity3D, suitable for building systems from real-world map data and enabling interaction with the Mapbox web services APIs (Maps, Geocoding and Directions APIs) via a C#-based API. Having access to the device's GPS, an area map is loaded based on the user's geo-location. Points of Interest (PoIs) are added, guiding the user to where the AR visualization takes place. The Directions API provides directions from the user's geo-location to the PoI's geo-location, while a script recalculates the distance between them every second.</p><p>Planar faces in the real world (the ground, walls etc.) were recognized based on plane detection. The user's position and orientation in physical space were tracked by the smartphone's motion tracking. Then, the virtual content appeared on top of the recognized physical world. In order to enable Unity3D's AR Foundation functionality, an 'AR Session' component controlled the life-cycle and configuration options of the AR session, and an 'AR Session Origin' component represented the device. The user viewed and interacted with the 3D scene using the GameObject that contains this component. Rotation or movement of this object represents rotation and movement of the user in the scene. The 'AR Camera' GameObject represents what the user sees rendered through the camera.</p><p>Attached to the 'AR Session Origin' object were an 'AR Plane Manager' and an 'AR Point Cloud Manager' component. The 'AR Plane Manager' collects the data about the scanned planar surfaces, adding each detected plane to a list and creating a GameObject for it. The 'AR Point Cloud Manager' collects data about feature points scanned by the device, and a point cloud is created to provide depth recognition and tracking to the app. The interaction with the real world was achieved by ray-casting against the tracked planes. 
A ray is sent from the center of the 'AR Camera'. If a tracked plane is hit by it, we are able to interact with this specific plane. Here, this method tracks a planar surface (mostly the ground) and shows an indicator at the point where the ray hits the plane, updated every frame following the device's pose. If there is no tracked plane, the indicator disappears. While the indicator is active, the 'Place Here' button is enabled and the user places the virtual sea content in the pointed direction, aligned with the real shoreline. The virtual content is instantiated at the orientation of the indicator (steps 2, 3 of Fig. <ref type="figure" target="#fig_2">3</ref>) and the virtual shoreline is placed where the indicator points. If the match with the real world is off, the user can relocate the virtual content by pressing the settings button (step 3 of Fig. <ref type="figure" target="#fig_2">3</ref>). An intuitive user interface (UI) helps navigation (see Fig. <ref type="figure" target="#fig_2">3</ref>). During step 4 of Fig. <ref type="figure" target="#fig_2">3</ref>, three UI signs are enabled by pressing the 'info' button at the top right of the screen. A ray starts from the touched position of the sign on screen. If the ray hits a sign, a window pops up providing information about coastal erosion (Fig. <ref type="figure" target="#fig_4">5</ref>). A terrain was created based on the shape of the beach at the specific location <ref type="bibr" target="#b8">[9]</ref>. Using the terrain, interaction between the water and the shoreline was made possible. The aerial image used is a real sub-scale map exported with GIS software showing the beach retreat (Fig. <ref type="figure" target="#fig_3">4</ref>). 
The model that estimates the beach retreat at this specific shoreline, characterized by a low slope and sandy sediment, is Equation (1), representing the low mean of the beach retreat prediction <ref type="bibr" target="#b6">[7]</ref>.</p><formula xml:id="formula_0">s = 0.05U^2 + 8.12U^0.46<label>(1)</label></formula><p>U: Sea Level Rise (SLR) in meters</p><p>The image (Fig. <ref type="figure" target="#fig_3">4</ref>) is geo-referenced to the Hellenic Geodetic Reference System 1987 (HGRS87) and showcases the retreat of the beach in two scenarios (SLR = 0.5 and SLR = 1). The image was imported into Unity at 1:1 scale, so 1 unit in the engine corresponds to 1 meter in the real world. For each possible scenario, an elevation layer was created, as well as one corresponding to the current state of the shoreline. A realistic-looking water shader was created and attached to a planar surface, acquiring water properties such as reflection, transparency, waviness, foam effects on collision etc. In order to animate the rising of the water, a lerping method gradually moves an object from one position to another over a time window at a given speed. </p></div>
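Equation (1) and the lerp-based rise animation can be sketched together. We read (1) as s = 0.05·U² + 8.12·U^0.46, with s the retreat in meters (the left-hand symbol is our assumption); the animation duration and frame step below are illustrative, not values from the paper:

```python
def beach_retreat_m(slr_m):
    """Low-mean beach retreat of Eq. (1): s = 0.05*U**2 + 8.12*U**0.46, U in meters."""
    return 0.05 * slr_m ** 2 + 8.12 * slr_m ** 0.46

def lerp(a, b, t):
    """Linear interpolation with t clamped to [0, 1], as in Unity's Mathf.Lerp."""
    t = max(0.0, min(1.0, t))
    return a + (b - a) * t

def water_heights(start_y, target_y, duration_s, dt_s):
    """Per-frame heights of the rising water plane over the animation window."""
    frames = int(duration_s / dt_s)
    return [lerp(start_y, target_y, i * dt_s / duration_s) for i in range(frames + 1)]

# Evaluate the retreat prediction for the two SLR scenarios used in the paper.
for u in (0.5, 1.0):
    print(f"SLR {u} m -> predicted retreat {beach_retreat_m(u):.2f} m")
```

In the app, the same interpolation would run once per rendered frame, moving the water plane from the current sea level to the scenario's elevation layer.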
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3">EVALUATION AND CONCLUSIONS</head><p>We received feedback about the functionality of the application and its usefulness by involving users at the beach, using the think-aloud usability evaluation methodology. The users involved were either non-experts or experts in AR technology. The non-expert users were fascinated, while the experts focused on the functionality. Certain users mentioned that they would prefer an AR head-mounted experience, while others had no problem with the use of the smartphone. Initially, users found it challenging to accurately place the virtual content depending on the location, mostly due to the morphology of the area. After training, they easily used the app and navigated around. We received enthusiastic feedback concerning the photorealistic 3D water and its seamless integration with the real-world shoreline. The signs and the UI in general were simple and intuitive, communicating the impact of coastal erosion. Certain users would have preferred more visible colours for the UI.</p><p>The application was hard to use during bad weather (clouds, wind etc.). When the sea was wavy, it was hard to accurately anchor the 3D content, as it was drifting in the scene. Tracking and registration in AR are far from solved. Future work could automate shoreline detection, exempting the user from the calibration process. More scenarios and locations could also be added. The lighting of the AR digital content can be improved for a stronger feeling of depth and better photorealism.</p><p>Concluding, we showcased the design of a mobile Augmented Reality application aimed at consumer-grade mobile phones with the ultimate goal of increasing the environmental awareness of the public audience. 
By employing AR, we enhance user awareness of coastal erosion, bridging the gap between reality and virtuality through widely available XR technologies.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head></head><label></label><figDesc>Minas Katsiokalis, Lemonia Ragia, and Katerina Mania. 2020. Outdoors Mobile Augmented Reality for Coastal Erosion Visualization Based on Geographical Data. In Cross-Reality (XR) Interaction, ACM ISS 2020 (International Workshop on XR Interaction 2020).</figDesc></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Figure 2 :</head><label>2</label><figDesc>Figure 2: MAR system architecture.</figDesc><graphic coords="2,317.96,83.69,240.24,140.93" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Figure 3 :</head><label>3</label><figDesc>Figure 3: Process of placing the 3D sea extending the real-world sea coastline (as seen through the smartphone's camera).</figDesc><graphic coords="3,65.81,83.69,216.21,477.47" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_3"><head>Figure 4 :</head><label>4</label><figDesc>Figure 4: Shoreline retreat scenarios, SLR 0.5m/1m.</figDesc><graphic coords="3,317.96,83.69,240.25,310.91" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_4"><head>Figure 5 :</head><label>5</label><figDesc>Figure 5: Information signs in AR.</figDesc><graphic coords="4,65.81,83.68,216.22,268.55" type="bitmap" /></figure>
		</body>
		<back>

			<div type="funding">
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Technical University of Crete</head><p>Figure 1: The real-world coast (left), the real-world coast with seamless 3D sea segment (right).</p></div>
			</div>

			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">Mobile Augmented Reality for Environmental Awareness: A Technology Acceptance Study</title>
		<author>
			<persName><forename type="first">Majed</forename><forename type="middle">Abdullah</forename><surname>Alrowaily</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Manolya</forename><surname>Kavakli</surname></persName>
		</author>
		<idno type="DOI">10.1145/3192975.3193002</idno>
		<ptr target="https://doi.org/10.1145/3192975.3193002" />
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 2018 10th International Conference on Computer and Automation Engineering</title>
				<meeting>the 2018 10th International Conference on Computer and Automation Engineering<address><addrLine>Brisbane, Australia; ICCAE; New York, NY, USA</addrLine></address></meeting>
		<imprint>
			<publisher>Association for Computing Machinery</publisher>
			<date type="published" when="2018">2018</date>
			<biblScope unit="page" from="36" to="43" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">Augmented Reality Markerless Multi-Image Outdoor Tracking System for the Historical Buildings on Parliament Hill</title>
		<author>
			<persName><forename type="first">Silvia</forename><surname>Blanco-Pons</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Berta</forename><surname>Carrión-Ruiz</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Michelle</forename><surname>Duong</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Joshua</forename><surname>Chartrand</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Stephen</forename><surname>Fai</surname></persName>
		</author>
		<author>
			<persName><forename type="first">José</forename><surname>Lerma</surname></persName>
		</author>
		<idno type="DOI">10.3390/su11164268</idno>
		<ptr target="https://doi.org/10.3390/su11164268" />
	</analytic>
	<monogr>
		<title level="j">Sustainability</title>
		<imprint>
			<biblScope unit="volume">11</biblScope>
			<biblScope unit="issue">08</biblScope>
			<biblScope unit="page">4268</biblScope>
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Augmented Reality Smartphone Environment Orientation Application: A Case Study of the Fu-Jen University Mobile Campus Touring System</title>
		<author>
			<persName><forename type="first">Te-Lien</forename><surname>Chou</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Lih-Juan</forename><surname>Chanlin</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.sbspro.2012.05.132</idno>
		<ptr target="https://doi.org/10.1016/j.sbspro.2012.05.132" />
	</analytic>
	<monogr>
		<title level="j">Procedia -Social and Behavioral Sciences</title>
		<imprint>
			<biblScope unit="volume">46</biblScope>
			<biblScope unit="page" from="410" to="416" />
			<date type="published" when="2012-02-05">02-05 February 2012</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Enabling smart retail settings via mobile augmented reality shopping apps</title>
		<author>
			<persName><forename type="first">Scott</forename><forename type="middle">G</forename><surname>Dacko</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.techfore.2016.09.032</idno>
		<ptr target="https://doi.org/10.1016/j.techfore.2016.09.032" />
	</analytic>
	<monogr>
		<title level="j">Technological Forecasting and Social Change</title>
		<imprint>
			<biblScope unit="volume">124</biblScope>
			<biblScope unit="page" from="243" to="256" />
			<date type="published" when="2017">2017</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">3D photorealistic scientific visualization of tsunami waves and sea level rise</title>
		<author>
			<persName><forename type="first">Alexandros</forename><surname>Giannakidis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Giannis</forename><surname>Giakoumidakis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Katerina</forename><surname>Mania</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">IEEE International Conference on Imaging Systems and Techniques (IST) Proceedings. IEEE</title>
				<imprint>
			<date type="published" when="2014">2014</date>
			<biblScope unit="page" from="167" to="172" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">Mobile augmented reality for cultural heritage: A technology acceptance study</title>
		<author>
			<persName><forename type="first">A</forename><surname>Haugstvedt</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Krogstie</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">IEEE International Symposium on Mixed and Augmented Reality (ISMAR)</title>
				<imprint>
			<date type="published" when="2012">2012</date>
			<biblScope unit="page" from="247" to="255" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<monogr>
		<author>
			<persName><forename type="first">I</forename><forename type="middle">N</forename><surname>Monioudi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Karditsa</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Chatzipavlis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Alexandrakis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">O</forename><forename type="middle">P</forename><surname>Andreadis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><forename type="middle">F</forename><surname>Velegrakis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><forename type="middle">E</forename><surname>Poulos</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Ghionis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Petrakis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Sifnioti</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Hasiotis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Lipakis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Kampanis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Karambas</surname></persName>
		</author>
		<author>
			<persName><forename type="first">E</forename><surname>Marinos</surname></persName>
		</author>
		<idno type="DOI">10.1007/s10113-014-0730-9</idno>
		<ptr target="https://doi.org/10.1007/s10113-014-0730-9" />
		<title level="m">Assessment of vulnerability of the eastern Cretan beaches (Greece) to sea level rise</title>
				<imprint>
			<publisher>Regional Environmental Change</publisher>
			<date type="published" when="2014">2014</date>
		</imprint>
	</monogr>
	<note>Article in Press</note>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">An architecture for mobile outdoors augmented reality for cultural heritage</title>
		<author>
			<persName><forename type="first">Chris</forename><surname>Panou</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Lemonia</forename><surname>Ragia</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Despoina</forename><surname>Dimelli</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Katerina</forename><surname>Mania</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">ISPRS International Journal of Geo-Information</title>
		<imprint>
			<biblScope unit="volume">7</biblScope>
			<biblScope unit="page">463</biblScope>
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">Monitoring the changes of the coastal areas using remote sensing data and geographic information systems</title>
		<author>
			<persName><forename type="first">Lemonia</forename><surname>Ragia</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Pavlos</forename><surname>Krassakis</surname></persName>
		</author>
		<idno type="DOI">10.1117/12.2533659</idno>
		<ptr target="https://doi.org/10.1117/12.2533659" />
	</analytic>
	<monogr>
		<title level="m">Seventh International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2019)</title>
				<imprint>
			<publisher>SPIE</publisher>
			<date type="published" when="2019">2019</date>
			<biblScope unit="volume">11174</biblScope>
			<biblScope unit="page" from="289" to="297" />
		</imprint>
	</monogr>
	<note>International Society for Optics and Photonics</note>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">Landscape visualisation and climate change: The potential for influencing perceptions and behaviour</title>
		<author>
			<persName><forename type="first">Stephen</forename><surname>Sheppard</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.envsci.2005.08.002</idno>
		<ptr target="https://doi.org/10.1016/j.envsci.2005.08.002" />
	</analytic>
	<monogr>
		<title level="j">Environmental Science &amp; Policy</title>
		<imprint>
			<biblScope unit="volume">8</biblScope>
			<biblScope unit="issue">12</biblScope>
			<biblScope unit="page" from="637" to="654" />
			<date type="published" when="2005">2005</date>
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
