<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">Visual Cues for Cultural Heritage Urban Navigation with Smart Glasses</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Tsvi</forename><surname>Kuflik</surname></persName>
							<email>tsvikak@is.haifa.ac.il</email>
							<affiliation key="aff0">
								<orgName type="department">Information Systems Department</orgName>
								<orgName type="institution">The University of Haifa</orgName>
								<address>
									<settlement>Haifa</settlement>
									<country key="IL">Israel</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Peter</forename><surname>Yagodin</surname></persName>
							<affiliation key="aff0">
								<orgName type="department">Information Systems Department</orgName>
								<orgName type="institution">The University of Haifa</orgName>
								<address>
									<settlement>Haifa</settlement>
									<country key="IL">Israel</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Omer</forename><surname>Sharon</surname></persName>
							<email>sharonomeros@gmail.com</email>
							<affiliation key="aff0">
								<orgName type="department">Information Systems Department</orgName>
								<orgName type="institution">The University of Haifa</orgName>
								<address>
									<settlement>Haifa</settlement>
									<country key="IL">Israel</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">Visual Cues for Cultural Heritage Urban Navigation with Smart Glasses</title>
					</analytic>
					<monogr>
						<idno type="ISSN">1613-0073</idno>
					</monogr>
					<idno type="MD5">0B805E0678C16B910DAF5676294EE49A</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2023-03-24T09:55+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Smart glasses-based navigation</term>
					<term>Urban navigation support</term>
					<term>Outdoor cultural heritage</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>Mobile augmented reality technology offers new ways of supporting visitors to cultural heritage sites. The services may include delivering information, navigation support and even gaze-based intuitive interaction. Significant progress has been made in exploring the potential of these technologies for indoor museum visits, as well as for outdoor use. When considering technological support for a cultural heritage visit, one of the main challenges is supporting visitors' navigation. This paper presents an initial prototype of smart glasses-based navigation for an urban cultural heritage visit. We present the idea and a prototype that has yet to be evaluated for usability and user acceptance.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.">Introduction</head><p>In recent years, mobile smart devices have become widely available, and traditional audio guides are being replaced by smartphone apps and even smart glasses. Augmented reality has become a viable tool, allowing visitors to interact in an engaging way with sites and artefacts in their vicinity and to elicit information about them. These novel mobile guides present location-based information by audio and notify the user about objects of interest as they become visible <ref type="bibr" target="#b0">[1,</ref><ref type="bibr" target="#b1">2]</ref>. Using mobile augmented reality in domains such as tourism, and specifically in cultural heritage, has the potential to enhance the visitor's experience by providing easy and intuitive interaction, simply bringing information into the visitor's field of view when the visitor is looking at an object <ref type="bibr" target="#b2">[3]</ref>. These systems can reactively or proactively <ref type="bibr" target="#b3">[4,</ref><ref type="bibr" target="#b4">5]</ref> augment a visitor's view with relevant CH information without the devices themselves becoming the focus of attention. They allow real-world information to be sent, received, and viewed without requiring the visitor to carry and simultaneously view a hand-held guide. Moreover, they can overlay historical images on the current view of a landscape and bring historical events to life <ref type="bibr" target="#b5">[6,</ref><ref type="bibr" target="#b6">7]</ref>. This technology has gained interest in recent years due to its attractiveness and ease of use <ref type="bibr" target="#b7">[8,</ref><ref type="bibr" target="#b8">9]</ref>. Recently, Litvak and Kuflik <ref type="bibr" target="#b9">[10]</ref> demonstrated the potential of using augmented reality smart glasses as a mobile tourist guide outdoors and noted a few challenges facing the wide use of such devices.
Among the experimented features was navigation support within a small archaeological site. Following the positive response to the navigation support provided by the augmented reality smart glasses, we decided to focus on this aspect, planning a larger-scale visiting path. This paper presents the design and implementation of the system and the lessons learned during its design and initial evaluation as a navigation aid in an urban setting.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">System overview</head><p>In order to experiment with urban navigation in a realistic setting, a prototype system was developed. The main component is the Everysight Raptor smart glasses<ref type="foot" target="#foot_0">1</ref> (Figures 1, 2 and 3), powered by a Qualcomm® Snapdragon™ 410E with 32GB of internal storage and 2GB of SDRAM. It is the same device used in the study described in Litvak and Kuflik <ref type="bibr" target="#b9">[10]</ref>. Table <ref type="table" target="#tab_0">1</ref> compares the technical specifications of the smart glasses with those of a mobile phone (taken from Litvak and Kuflik <ref type="bibr" target="#b9">[10]</ref>). The application was built with Android Studio together with the Raptor SDK (RDK); RDK v0.2.1 was used to implement the AR layering of POI names and navigation aids. Currently, the RDK repository is private and accessible by invitation only. Location tracking was made possible by the smartphone's GPS receiver and the Line of Sight (LOS) detection capability provided by the smart glasses' sensors and RDK utilities. Together, the GPS receiver and the LOS calculation allowed us to build a context-aware application that followed the visitor's attention to nearby objects.</p><p>To demonstrate the use of the system for supporting urban navigation in cultural heritage scenarios, we selected the German Colony in downtown Haifa. The German Templar Colony is a historic neighborhood in Haifa, at the foot of Mt. Carmel just below the Baha'i Gardens. It was established in Ottoman Haifa in 1868 as a German Templar colony in Palestine. Its main street is about 600 meters long, and a tour was planned along this street. Textual content about the historical buildings of the German Colony was prepared and linked to their geographic positions. A simple pre-visit interface enabled users to plan a visit between POIs in the German Colony.
As the smart glasses are capable of superimposing virtual content such as text and images onto the visitor's FOV (Figure <ref type="figure" target="#fig_2">3</ref>), the virtual navigation instructions between POIs were made visible in a see-through optical display (as described by Ashkenazi and Shamir <ref type="bibr" target="#b10">[11]</ref>: "The electro-optical unit includes a processor and a light projection unit. The processor is coupled with the light projection unit. The light projection unit is configured to transmit light beams onto the partially transmissive partially reflective lens").</p><p>The navigation instructions included an arrow pointing in the direction of the next POI to visit, and a textual description provided the name of the next POI and the distance to it (partially illustrated by Figure <ref type="figure" target="#fig_1">2</ref>). Once a visitor arrived at a POI, a message announced the arrival, and if the visitor was interested in an explanation about the POI, s/he had to touch the touch pad on the right side of the glasses (Figure <ref type="figure" target="#fig_2">3</ref>). The information about a POI was available only from the moment the visitor got close enough and was notified, and until the visitor moved out of the pre-defined radius. If the visitor deviated from the planned tour path, the system maintained the next POI to visit and continuously provided instructions on how to get there, assuming that the visitor would like to continue the planned tour later on.</p><p>The navigation planning and instructions were based on the Google Maps API. The overall system architecture is given by Figure <ref type="figure" target="#fig_3">4</ref>: a web-based interface enables the user to plan a tour, which is stored in a web-based database. The tour is downloaded to the user's mobile phone, which uses Google Maps to calculate the position continuously.
The navigation instructions are sent to and presented on the smart glasses' screen, and when information is requested by the user, the POI description is sent to the glasses and played.</p><p>The following sequence of activities is followed:</p><p>1. The user connects the phone with the Raptor glasses via an in-app function.</p><p>2. The user selects a path, or creates one from saved locations.</p><p>3. After starting the navigation, the phone uses the Google Maps API along with the device's GPS coordinates to get directions to the next station.</p><p>4. The directions are sent via the Bluetooth connection to the Raptor glasses.</p><p>5. At any point, the user can change the color of the text displayed in the glasses by swiping left or right.</p><p>6. When the user reaches a station, a tap on the side of the glasses reads out a description of that station.</p><p>7. The station automatically changes to the next one in the path when the user enters a certain radius around it (configurable in-app), and a sound is played when this happens.</p></div>
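The core navigation logic described in the sequence above can be sketched in a few lines. The following is a minimal, illustrative Python sketch under our own naming (the class Tour, the method update and the parameter arrival_radius_m are hypothetical; the actual app relied on the Google Maps API and the RDK): a tour keeps the next station as its target, each GPS fix yields a bearing and a distance for the directional arrow and the text, and arriving within the configurable radius advances the target to the next station.

```python
import math

EARTH_RADIUS_M = 6371000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, degrees in [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlam = math.radians(lon2 - lon1)
    x = math.sin(dlam) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlam)
    return math.degrees(math.atan2(x, y)) % 360.0

class Tour:
    """Holds the planned stations and advances to the next one on arrival."""
    def __init__(self, stations, arrival_radius_m=15.0):
        self.stations = list(stations)        # [(name, lat, lon), ...]
        self.arrival_radius_m = arrival_radius_m
        self.index = 0

    def update(self, lat, lon):
        """Feed a GPS fix; returns (arrived_station_or_None, bearing, distance)."""
        if self.index >= len(self.stations):
            return None, None, None
        name, slat, slon = self.stations[self.index]
        dist = haversine_m(lat, lon, slat, slon)
        brg = bearing_deg(lat, lon, slat, slon)
        if dist <= self.arrival_radius_m:
            self.index += 1                   # arrival: target the next station
            return name, brg, dist
        return None, brg, dist
```

Keeping the "next station" pointer fixed until the arrival radius is entered is what lets the real system keep directing a visitor back to the tour after a deviation.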
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.">Lessons learned during development and initial trials</head><p>During development and while experimenting with the smart glasses, we noticed, first of all, that it takes some time to get used to the glasses: the user has to focus on the screen in order to notice the arrow and the instructions, and has to continuously shift focus between looking around and reading the navigation information. Another issue is the color of the text: it can be red, green or yellow, and depending on the scene in the background, it is sometimes hard to notice. Moreover, there appear to be personal preferences with respect to the text color, so changing the color during navigation was made easy for the user, simply by touching the touch pad. There were no color problems with the directional arrow, but the arrow appeared to be quite jumpy, so a sliding-window mechanism was introduced to reduce the jumpiness. Another issue, apparently a result of the navigation system, arose at roundabouts: as the user had to turn aside to get to a crosswalk, the system misinterpreted this as an erroneous change of direction. Finally, the user's pose had an impact as well: the user had to stand straight, as otherwise the navigation system got confused. To summarize, while the technology seems interesting and promising, we are not sure it is mature enough to be used on a daily basis; this remains a question for a larger user study.</p></div>
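The sliding-window mechanism mentioned above can be illustrated as follows. This is a hypothetical sketch (the class name BearingSmoother and the window parameter are ours, not taken from the actual implementation) that averages the last few bearing samples; note that compass bearings must be averaged as unit vectors, since a naive arithmetic mean fails around the 0/360 wrap-around (e.g. 359 and 1 degrees would average to 180 instead of 0).

```python
import math
from collections import deque

class BearingSmoother:
    """Sliding-window smoothing of compass bearings via circular (vector) averaging."""
    def __init__(self, window=5):
        self.samples = deque(maxlen=window)   # keeps only the last `window` bearings

    def update(self, bearing_deg):
        """Add a raw bearing sample (degrees); return the smoothed bearing in [0, 360)."""
        self.samples.append(math.radians(bearing_deg))
        # Average the unit vectors of the samples, then convert back to a bearing.
        x = sum(math.sin(b) for b in self.samples)
        y = sum(math.cos(b) for b in self.samples)
        return math.degrees(math.atan2(x, y)) % 360.0
```

A smoother like this trades a small amount of lag for a much steadier arrow, which matches the trade-off we observed when reducing the arrow's jumpiness.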
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.">Planned evaluation</head><p>Given the above, we are planning to evaluate the system in a user study at the German Colony in Haifa. Users will be asked to create a short tour of about 5 or 6 POIs along the street. They will connect the glasses to their mobile phone, download the pre-planned tour and follow it. During the tour they will also be asked to deviate from the planned tour, move to a nearby street and return to the tour.</p><p>We plan to evaluate the usability of the system with respect to the visualization of the navigation cues (the colors, and how visible they are against different backgrounds) and to elicit feedback and ideas for improving the system.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5.">Summary and potential future work</head><p>While outdoor navigation support may seem to be a solved problem, given GPS technology and mobile devices, navigation with a mobile device still requires the user either to look at the device's screen and use a map-based visualization (with or without audio instructions), or to rely on audio alone in order to avoid looking at the screen. These limitations may be addressed by smart glasses which, in addition to audio-based instructions, can display visual cues in the user's field of view, thus relieving the user from the need to look at the screen and decipher the map visualization.</p><p>One important issue is the fact that it is impossible to take a screenshot of what the user actually sees through the smart glasses; hence we are unable to provide an accurate image, although an illustration is provided by Figure <ref type="figure" target="#fig_4">5</ref>. While the current study focused on the urban navigation function alone, for experimentation purposes (mostly for setting up the system) a user interface for visit planning was developed. It allows the user to create a tour path and download it to the mobile system (smartphone); the user can then navigate, following the path or deviating from it at will. Information regarding POIs is delivered upon arrival at a POI; personalization and recommendation were out of scope for this study.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Figure 1 :</head><label>1</label><figDesc>Figure 1: The EverySight "Raptor" smart glasses used - the glasses and an illustration of a user wearing them.</figDesc><graphic coords="2,89.29,236.77,416.72,158.54" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Figure 2 :</head><label>2</label><figDesc>Figure 2: A rear view of the monocular augmented reality smart glasses with the optical lens; the smart glasses' live display in action (left to right).</figDesc><graphic coords="3,89.29,84.19,416.71,162.87" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Figure 3 :</head><label>3</label><figDesc>Figure 3: Touch-based interaction with the smart glasses.</figDesc><graphic coords="3,89.29,292.32,416.72,170.44" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_3"><head>Figure 4 :</head><label>4</label><figDesc>Figure 4: System architecture.</figDesc><graphic coords="5,130.96,84.19,333.36,331.13" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_4"><head>Figure 5 :</head><label>5</label><figDesc>Figure 5: An illustration of the textual description positional options and the directional arrow (left) and color options (right).</figDesc><graphic coords="6,89.29,84.19,416.68,154.61" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" type="table" xml:id="tab_0"><head>Table 1</head><label>1</label><figDesc>Specifications of smartphone and smart glasses (taken from Litvak and Kuflik<ref type="bibr" target="#b9">[10]</ref>) .</figDesc><table><row><cell>Parameter</cell><cell>Smartphone</cell><cell>Smart glasses</cell></row><row><cell>Battery life</cell><cell>Up to 16 hours</cell><cell>Up to 8 hours</cell></row><row><cell>Weight</cell><cell>165g</cell><cell>98g</cell></row><row><cell cols="2">AR implementation Hand-held, Video</cell><cell>Head-worn, Optical</cell></row><row><cell>FOV</cell><cell>5.5" screen size</cell><cell>65" screen size (equivalent)</cell></row><row><cell>Interaction</cell><cell>Touch screen</cell><cell>Touchpad (tap &amp; swipe)</cell></row><row><cell>Communication</cell><cell cols="2">BLE 4.1, Wi-Fi, GPS, 4G LTE BLE 4.1, Wi-Fi, GPS</cell></row><row><cell>Sensors</cell><cell>Accelerometer, Gyroscope,</cell><cell>Accelerometer, Gyroscope,</cell></row><row><cell></cell><cell>Magnetometer</cell><cell>Magnetometer</cell></row></table></figure>
			<note xmlns="http://www.tei-c.org/ns/1.0" place="foot" n="1" xml:id="foot_0">https://everysight.github.io/rdk_docs/</note>
		</body>
		<back>

			<div type="acknowledgement">
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Acknowledgments</head><p>The smart glasses for the study were provided by Everysight company for free.</p></div>
			</div>

			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">Usability evaluation of a wearable augmented reality system for the enjoyment of the cultural heritage</title>
		<author>
			<persName><forename type="first">N</forename><surname>Brancati</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><surname>Caggianese</surname></persName>
		</author>
		<author>
			<persName><forename type="first">G</forename><forename type="middle">D</forename><surname>Pietro</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Frucci</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Gallo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">P</forename><surname>Neroni</surname></persName>
		</author>
		<idno type="DOI">10.1109/sitis.2015.98</idno>
	</analytic>
	<monogr>
		<title level="m">11th International Conference on Signal-Image Technology &amp; Internet-Based Systems (SITIS)</title>
				<imprint>
			<publisher>IEEE</publisher>
			<date type="published" when="2015">2015. 2015</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">Navigating culture. enhancing visitor museum experience through mobile technologies. from smartphone to google glass</title>
		<author>
			<persName><forename type="first">A</forename><surname>Tomiuc</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Journal of Media Research-Revista de Studii Media</title>
		<imprint>
			<biblScope unit="volume">7</biblScope>
			<biblScope unit="page" from="33" to="46" />
			<date type="published" when="2014">2014</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Mapping requirements for the wearable smart glasses augmented reality museum application</title>
		<author>
			<persName><forename type="first">M</forename><surname>Tom Dieck</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Jung</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D.-I</forename><surname>Han</surname></persName>
		</author>
		<idno type="DOI">10.1108/jhtt-09-2015-0036</idno>
	</analytic>
	<monogr>
		<title level="j">Journal of Hospitality and Tourism Technology</title>
		<imprint>
			<biblScope unit="volume">7</biblScope>
			<biblScope unit="page" from="230" to="253" />
			<date type="published" when="2016">2016</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Exploring the potential of a mobile eye tracker as an intuitive indoor pointing device: A case study in cultural heritage</title>
		<author>
			<persName><forename type="first">M</forename><surname>Mokatren</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Kuflik</surname></persName>
		</author>
		<author>
			<persName><forename type="first">I</forename><surname>Shimshoni</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.future.2017.07.007</idno>
	</analytic>
	<monogr>
		<title level="j">Future generation computer systems</title>
		<imprint>
			<biblScope unit="volume">81</biblScope>
			<biblScope unit="page" from="528" to="541" />
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<monogr>
		<title level="m" type="main">????</title>
		<imprint/>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">Augmented reality and the enhancement of memorable tourism experiences at heritage sites</title>
		<author>
			<persName><forename type="first">S</forename><surname>Jiang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>Moyle</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><surname>Yung</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Tao</surname></persName>
		</author>
		<author>
			<persName><forename type="first">N</forename><surname>Scott</surname></persName>
		</author>
		<idno type="DOI">10.1080/13683500.2022.2026303</idno>
	</analytic>
	<monogr>
		<title level="j">Current Issues in Tourism</title>
		<imprint>
			<biblScope unit="page" from="1" to="16" />
			<date type="published" when="2022">2022</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">A framework for exploring churches/-monuments/museums of byzantine cultural influence exploiting immersive technologies in real-time networked environments</title>
		<author>
			<persName><forename type="first">K</forename><surname>Kontopanagou</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Tsipis</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><surname>Komianos</surname></persName>
		</author>
		<idno type="DOI">10.3390/technologies9030057</idno>
	</analytic>
	<monogr>
		<title level="j">Technologies</title>
		<imprint>
			<biblScope unit="volume">9</biblScope>
			<biblScope unit="page">57</biblScope>
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">A theoretical framework for designing smart and ubiquitous learning environments for outdoor cultural heritage</title>
		<author>
			<persName><forename type="first">A</forename><surname>Alkhafaji</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Fallahkhair</surname></persName>
		</author>
		<author>
			<persName><forename type="first">E</forename><surname>Haig</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.culher.2020.08.006</idno>
	</analytic>
	<monogr>
		<title level="j">Journal of Cultural Heritage</title>
		<imprint>
			<biblScope unit="volume">46</biblScope>
			<biblScope unit="page" from="244" to="258" />
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b8">
	<analytic>
		<title level="a" type="main">Augmented reality applications to support the promotion of cultural heritage: The case of the basilica of saint catherine of alexandria in galatina</title>
		<author>
			<persName><forename type="first">D</forename><surname>Cisternino</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Corchia</surname></persName>
		</author>
		<author>
			<persName><forename type="first">V</forename><forename type="middle">D</forename><surname>Luca</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><surname>Gatto</surname></persName>
		</author>
		<author>
			<persName><forename type="first">S</forename><surname>Liaci</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><surname>Scrivano</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Trono</surname></persName>
		</author>
		<author>
			<persName><forename type="first">L</forename><forename type="middle">T</forename><surname>De Paolis</surname></persName>
		</author>
		<idno type="DOI">10.1145/3460657</idno>
		<ptr target="https://doi.org/10.1145/3460657" />
	</analytic>
	<monogr>
		<title level="j">J. Comput. Cult. Herit</title>
		<imprint>
			<biblScope unit="volume">14</biblScope>
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">Enhancing cultural heritage outdoor experience with augmented reality smart glasses</title>
		<author>
			<persName><forename type="first">E</forename><surname>Litvak</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Kuflik</surname></persName>
		</author>
		<idno type="DOI">10.1007/s00779-020-01366-7</idno>
	</analytic>
	<monogr>
		<title level="j">Personal and ubiquitous computing</title>
		<imprint>
			<biblScope unit="volume">24</biblScope>
			<biblScope unit="page" from="873" to="886" />
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<analytic>
		<title level="a" type="main">Wearable optical display system for unobstructed viewing</title>
		<author>
			<persName><forename type="first">A</forename><surname>Ashkenazi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">H</forename><surname>Shamir</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">US Patent</title>
		<imprint>
			<biblScope unit="volume">10</biblScope>
			<biblScope unit="page">660</biblScope>
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
