<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">3D Tactile Obstacle Awareness System for Drones using a Tactile Interface around the Head</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Oliver</forename><forename type="middle">Beren</forename><surname>Kaul</surname></persName>
							<email>kaul@hci.uni-hannover.de</email>
							<affiliation key="aff0">
								<orgName type="institution">Leibniz University Hannover</orgName>
								<address>
									<settlement>Hannover</settlement>
									<postCode>30167</postCode>
									<country key="DE">Germany</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Michael</forename><surname>Rohs</surname></persName>
							<email>rohs@hci.uni-hannover.de</email>
							<affiliation key="aff1">
								<orgName type="institution">Leibniz University Hannover</orgName>
								<address>
									<settlement>Hannover</settlement>
									<postCode>30167</postCode>
									<country key="DE">Germany</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">3D Tactile Obstacle Awareness System for Drones using a Tactile Interface around the Head</title>
					</analytic>
					<monogr>
						<imprint>
							<date/>
						</imprint>
					</monogr>
					<idno type="MD5">1A1537B6FC5FD89BE660C27ADA0D51B6</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2023-03-25T01:09+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Drones</term>
					<term>tactile obstacle awareness</term>
					<term>drone navigation</term>
					<term>wearables</term>
					<term>Human-centered computing → Haptic devices</term>
					<term>Interaction techniques</term>
					<term>Ubiquitous and mobile computing systems and tools</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>We propose a 3D obstacle awareness system for drone pilots, implemented as a tactile user interface around the head. The concept of this system is presented alongside a variety of use cases and recommendations for future work.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Introduction and Related Work</head><p>Drone pilots face obstacle awareness challenges under bad lighting conditions, when distracted, or when flying in any direction that is not covered by the camera view. Relevant obstacles include static and dynamic ones such as other drones, humans, animals, or even brick walls within buildings. We propose a tactile system that indicates obstacles, including their distance from the drone, in the 3D space around the user (see Figure <ref type="figure" target="#fig_0">1</ref>).</p><p>Earlier work and concepts on human-drone interaction were summarized in <ref type="bibr" target="#b3">[4]</ref> and explained in further detail by Baytas et al. <ref type="bibr" target="#b0">[1]</ref>. The obstacle awareness concept presented in this paper extends the idea of augmenting spatial awareness for humans <ref type="bibr" target="#b1">[2]</ref> and instead aims to increase the spatial awareness of a human controlling a remote drone. Earlier approaches to this challenge showed promising results for a 2D navigation task using ultrasound sensors attached to a drone and a vibrotactile belt <ref type="bibr" target="#b13">[13]</ref>. We aim to extend Spiss et al.'s obstacle awareness system to 3D use cases, which cannot be displayed properly by the tactile belt used in <ref type="bibr" target="#b13">[13]</ref>.</p><p>In our previous work, we presented HapticHead <ref type="bibr" target="#b4">[5,</ref><ref type="bibr" target="#b5">6,</ref><ref type="bibr" target="#b6">7]</ref>, a vibrotactile display around the head consisting of a bathing cap with a chin strap and a total of 24 vibrotactile actuators (see Fig. <ref type="figure" target="#fig_1">2</ref>). We showed that this prototype can be used in 3D guidance and localization scenarios for people with normal vision in both virtual reality (VR) and augmented reality (AR). The system can indicate directions all around the user and guide the user to look at a defined point in space with a median deviation of 2.3° to the actual target. This precise guidance capability may also be used to make users aware of obstacles in the space around them. Our previous work further included a scenario in which blindfolded users were able to feel the presence of real physical objects in the 3D space around them and subsequently find and touch those objects (see <ref type="bibr" target="#b6">[7]</ref> and Fig. <ref type="figure" target="#fig_2">3</ref>).</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Input system: Suitable 360 degree obstacle detection for drones</head><p>A suitable 360° obstacle detection system for drones is needed as an input for our proposed system. A variety of systems and technologies could serve this purpose, such as multiple stereo cameras working together <ref type="bibr" target="#b12">[12,</ref><ref type="bibr" target="#b10">10]</ref>, 3D LIDARs <ref type="bibr" target="#b9">[9]</ref>, or a system using, e.g., HyperOmni Visions (HOVIs) <ref type="bibr" target="#b11">[11]</ref>. Such an input system would need to detect static and dynamic obstacles and extract their distance and 3D viewing angle relative to the drone. The detected obstacles should further be filtered so that obstacles beyond a threshold distance are excluded from the results, as these can be deemed harmless at the given moment.</p></div>
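The distance filtering step described above can be sketched in a few lines of Python. The obstacle representation (drone-relative x/y/z coordinates in meters) and the 10 m cutoff are illustrative assumptions, not values from the paper:

```python
import math

# Hypothetical cutoff: obstacles beyond this distance are deemed harmless.
THRESHOLD_M = 10.0

def filter_obstacles(obstacles, threshold=THRESHOLD_M):
    """Keep only obstacles within the threshold distance and annotate each
    with its distance and unit direction vector relative to the drone."""
    results = []
    for x, y, z in obstacles:
        d = math.sqrt(x * x + y * y + z * z)
        if 0 < d <= threshold:
            results.append({"dist": d, "dir": (x / d, y / d, z / d)})
    # Closest obstacles first, as they are the most dangerous.
    results.sort(key=lambda o: o["dist"])
    return results

# (3, 0, 4) is 5 m away and kept; (0, 20, 0) is beyond 10 m and dropped.
near = filter_obstacles([(3.0, 0.0, 4.0), (0.0, 20.0, 0.0)])
```

The unit direction vector is exactly the quantity the output stage needs to select actuators, so computing it here keeps the detection and display stages decoupled.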
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Output system: Indicating obstacles around the drone by HapticHead</head><p>In our prior work, we introduced a 3D guidance algorithm for arbitrary actuator configurations such as HapticHead <ref type="bibr" target="#b6">[7]</ref>. This guidance algorithm proved to be efficient and fast in guiding study participants to look in an indicated direction in 3D, including elevation. The same algorithm can be used in obstacle awareness scenarios as well. Just as in <ref type="bibr" target="#b6">[7]</ref>, the distance to an obstacle may be indicated by a vibrotactile pulse pattern and intensity modulation that gets faster and stronger the closer the obstacle is.</p><p>The spatial mapping of the vibrotactile feedback is drone-centric: the output is relative to the drone that the user is controlling and is mapped one-to-one onto the HapticHead. The front of the drone is mapped to the front of the head. Obstacles in front of the drone are haptically displayed on the forehead, obstacles to the right of the drone appear on the right side of HapticHead, and so on. This yields a natural mapping of the drone coordinate system to the head coordinate system. To users it feels as if they were flying as a pilot inside the drone, intuitively feeling obstacles along its way.</p><p>When multiple obstacles are indicated at the same time with the proposed tactile interface, a user will likely suffer a loss of localization accuracy. For one, if two obstacles are close together, the user will only be able to perceive one of them, as the vibrotactile pulse patterns would become confusing if two obstacles overlap from the perspective of the drone and thus activate the same actuators on the HapticHead. Arguably, this limitation is not a deal breaker, as the user can still feel the distance of the closer of the two (or more) objects. Furthermore, if more than two obstacles are indicated at the same time, a loss of accuracy is still likely even if they do not activate the same actuators. This results from sensory congestion/overload or funneling illusion effects when too many actuators are active at the same time <ref type="bibr" target="#b2">[3,</ref><ref type="bibr" target="#b7">8]</ref>.</p><p>As a solution to these issues, we suggest indicating only the closest two or three obstacles at a time and merging obstacles that are close together, indicating only the closer one.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Use Cases</head><p>As indicated in the introduction, the proposed system may be used in a variety of use cases related to drone operation and handling. These include:</p><p>1. flying in any direction that is not covered by the camera view (e.g., sideways, backwards, downwards, or upwards);</p><p>2. operating a drone at night or in bad lighting conditions;</p><p>3. operating a drone in areas with many static or dynamic obstacles such as other drones, humans, animals, plants, or walls within buildings;</p><p>4. operating a drone while being distracted (e.g., by other humans).</p><p>In the first three cases, the system provides tactile guidance towards the closest two or three obstacles so that the user can intuitively navigate the drone out of a dangerous situation. In the fourth case, the system provides a tactile warning when an obstacle is close, reminding the user to redirect their attention back to the drone.</p><p>Another use case is accessibility: visually impaired drone operators should have a much easier time avoiding obstacles due to the additional tactile feedback channel.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Conclusion and future work</head><p>In conclusion, we propose a tactile obstacle awareness system for drone operators, which may be used in a large variety of use cases. Future work may implement the proposed system and test the assumed benefits in a real environment.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Figure 1 :</head><label>1</label><figDesc>Figure 1: Live tactile drone obstacle awareness system using HapticHead, a vibrotactile interface around the head [7]. The user's drone is currently floating while another drone is close to crashing into it from behind. The user receives a tactile warning of an obstacle closing in from the top-back-left direction.</figDesc><graphic coords="2,210.38,116.92,182.26,358.33" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Figure 2 :</head><label>2</label><figDesc>Figure 2: HapticHead, a vibrotactile interface around the head [7].</figDesc><graphic coords="2,459.00,112.46,243.00,170.98" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Figure 3 :</head><label>3</label><figDesc>Figure 3: Blindfolded participant in prior experiment, feeling the direction and distance to physical objects [7].</figDesc><graphic coords="3,210.38,112.46,182.25,202.77" type="bitmap" /></figure>
		</body>
		<back>
			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">The Design of Social Drones</title>
		<author>
			<persName><forename type="first">Mehmet</forename><forename type="middle">Aydin</forename><surname>Baytas</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Damla</forename><surname>Çay</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Yuchong</forename><surname>Zhang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Mohammad</forename><surname>Obaid</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Asim</forename><forename type="middle">Evren</forename><surname>Yantaç</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Morten</forename><surname>Fjeld</surname></persName>
		</author>
		<idno type="DOI">10.1145/3290605.3300480</idno>
		<ptr target="http://dx.doi.org/10.1145/3290605.3300480" />
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems -CHI &apos;19</title>
				<meeting>the 2019 CHI Conference on Human Factors in Computing Systems -CHI &apos;19<address><addrLine>New York, New York, USA</addrLine></address></meeting>
		<imprint>
			<publisher>ACM Press</publisher>
			<date type="published" when="2019">2019</date>
			<biblScope unit="volume">4</biblScope>
			<biblScope unit="page" from="1" to="13" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">Augmenting spatial awareness with haptic radar</title>
		<author>
			<persName><forename type="first">Alvaro</forename><surname>Cassinelli</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Carson</forename><surname>Reynolds</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Masatoshi</forename><surname>Ishikawa</surname></persName>
		</author>
		<idno type="DOI">10.1109/ISWC.2006.286344</idno>
		<ptr target="http://dx.doi.org/10.1109/ISWC.2006.286344" />
	</analytic>
	<monogr>
		<title level="m">Proceedings -International Symposium on Wearable Computers</title>
				<meeting>-International Symposium on Wearable Computers</meeting>
		<imprint>
			<publisher>ISWC</publisher>
			<date type="published" when="2007">2007</date>
			<biblScope unit="page" from="61" to="64" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">Quantifying Information Transfer Through a Head-Attached Vibrotactile Display: Principles for Design and Control</title>
		<author>
			<persName><forename type="first">Michal</forename><forename type="middle">Karol</forename><surname>Dobrzynski</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Seifeddine</forename><surname>Mejri</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Steffen</forename><surname>Wischmann</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Dario</forename><surname>Floreano</surname></persName>
		</author>
		<idno type="DOI">10.1109/TBME.2012.2196433</idno>
		<ptr target="http://dx.doi.org/10.1109/TBME.2012.2196433" />
	</analytic>
	<monogr>
		<title level="j">IEEE Transactions on Biomedical Engineering</title>
		<imprint>
			<biblScope unit="volume">59</biblScope>
			<biblScope unit="issue">7</biblScope>
			<biblScope unit="page" from="2011" to="2018" />
			<date type="published" when="2012-07">July 2012</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Human-drone interaction: Let&apos;s get ready for flying user interfaces</title>
		<author>
			<persName><forename type="first">Markus</forename><surname>Funk</surname></persName>
		</author>
		<idno type="DOI">10.1145/3194317</idno>
		<ptr target="http://dx.doi.org/10.1145/3194317" />
	</analytic>
	<monogr>
		<title level="j">Interactions</title>
		<imprint>
			<biblScope unit="volume">25</biblScope>
			<biblScope unit="page" from="78" to="81" />
			<date type="published" when="2018">2018</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<monogr>
		<title level="m" type="main">Increasing Presence in Virtual Reality with a Vibrotactile Grid Around the Head</title>
		<author>
			<persName><forename type="first">Oliver</forename><forename type="middle">Beren</forename><surname>Kaul</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Kevin</forename><surname>Meier</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Michael</forename><surname>Rohs</surname></persName>
		</author>
		<idno type="DOI">10.1007/978-3-319-68059-0_19</idno>
		<ptr target="http://dx.doi.org/10.1007/978-3-319-68059-0_19" />
		<imprint>
			<date type="published" when="2017">2017</date>
			<publisher>Springer International Publishing</publisher>
			<biblScope unit="page" from="289" to="298" />
			<pubPlace>Cham</pubPlace>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">HapticHead: 3D Guidance and Target Acquisition through a Vibrotactile Grid</title>
		<author>
			<persName><forename type="first">Oliver</forename><forename type="middle">Beren</forename><surname>Kaul</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Michael</forename><surname>Rohs</surname></persName>
		</author>
		<idno type="DOI">10.1145/2851581.2892355</idno>
		<ptr target="http://dx.doi.org/10.1145/2851581.2892355" />
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems -CHI EA &apos;16</title>
				<meeting>the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems -CHI EA &apos;16<address><addrLine>New York, New York, USA</addrLine></address></meeting>
		<imprint>
			<publisher>ACM Press</publisher>
			<date type="published" when="2016">2016</date>
			<biblScope unit="page" from="2533" to="2539" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">HapticHead: A Spherical Vibrotactile Grid around the Head for 3D Guidance in Virtual and Augmented Reality</title>
		<author>
			<persName><forename type="first">Oliver</forename><forename type="middle">Beren</forename><surname>Kaul</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Michael</forename><surname>Rohs</surname></persName>
		</author>
		<idno type="DOI">10.1145/3025453.3025684</idno>
		<ptr target="http://dx.doi.org/10.1145/3025453.3025684" />
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems -CHI &apos;17</title>
				<meeting>the 2017 CHI Conference on Human Factors in Computing Systems -CHI &apos;17<address><addrLine>New York, New York, USA</addrLine></address></meeting>
		<imprint>
			<publisher>ACM Press</publisher>
			<date type="published" when="2017">2017</date>
			<biblScope unit="page" from="3729" to="3740" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b7">
	<analytic>
		<title level="a" type="main">Vibrotactile Funneling Illusion and Localization Performance on the Head</title>
		<author>
			<persName><forename type="first">Oliver</forename><forename type="middle">Beren</forename><surname>Kaul</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Michael</forename><surname>Rohs</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Benjamin</forename><surname>Simon</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Kerem</forename><forename type="middle">Can</forename><surname>Demir</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Kamillo</forename><surname>Ferry</surname></persName>
		</author>
		<idno type="DOI">10.1145/3313831.3376335</idno>
		<ptr target="http://dx.doi.org/10.1145/3313831.3376335" />
	</analytic>
	<monogr>
		<title level="m">Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI &apos;20)</title>
		<imprint>
			<publisher>ACM</publisher>
			<date type="published" when="2020">2020</date>
		</imprint>
	</monogr>
</biblStruct>


<biblStruct xml:id="b9">
	<analytic>
		<title level="a" type="main">New Opportunities for Forest Remote Sensing Through Ultra-High-Density Drone Lidar</title>
		<author>
			<persName><forename type="first">James</forename><forename type="middle">R</forename><surname>Kellner</surname></persName>
		</author>
		<author>
			<persName><forename type="first">John</forename><surname>Armston</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Markus</forename><surname>Birrer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">K</forename><forename type="middle">C</forename><surname>Cushman</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Laura</forename><surname>Duncanson</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Christoph</forename><surname>Eck</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Christoph</forename><surname>Falleger</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Benedikt</forename><surname>Imbach</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Kamil</forename><surname>Král</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Martin</forename><surname>Krůček</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Jan</forename><surname>Trochta</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Tomáš</forename><surname>Vrška</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Carlo</forename><surname>Zgraggen</surname></persName>
		</author>
		<idno type="DOI">10.1007/s10712-019-09529-9</idno>
		<ptr target="http://dx.doi.org/10.1007/s10712-019-09529-9" />
	</analytic>
	<monogr>
		<title level="j">Surveys in Geophysics</title>
		<imprint>
			<biblScope unit="volume">40</biblScope>
			<biblScope unit="issue">4</biblScope>
			<biblScope unit="page" from="959" to="977" />
			<date type="published" when="2019">2019</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b10">
	<analytic>
		<title level="a" type="main">Rear obstacle detection system with fisheye stereo camera using HCT</title>
		<author>
			<persName><forename type="first">Deukhyeon</forename><surname>Kim</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Jinwook</forename><surname>Choi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Hunjae</forename><surname>Yoo</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Ukil</forename><surname>Yang</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Kwanghoon</forename><surname>Sohn</surname></persName>
		</author>
		<idno type="DOI">10.1016/j.eswa.2015.04.035</idno>
		<ptr target="http://dx.doi.org/10.1016/j.eswa.2015.04.035" />
	</analytic>
	<monogr>
		<title level="j">Expert Systems with Applications</title>
		<imprint>
			<biblScope unit="volume">42</biblScope>
			<biblScope unit="page" from="6295" to="6305" />
			<date type="published" when="2015">2015</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b11">
	<analytic>
		<title level="a" type="main">Realtime omnidirectional stereo for obstacle detection and tracking in dynamic environments</title>
		<author>
			<persName><forename type="first">Hiroshi</forename><surname>Koyasu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Jun</forename><surname>Miura</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Yoshiaki</forename><surname>Shirai</surname></persName>
		</author>
		<idno type="DOI">10.1109/iros.2001.973332</idno>
		<ptr target="http://dx.doi.org/10.1109/iros.2001.973332" />
	</analytic>
	<monogr>
		<title level="j">IEEE International Conference on Intelligent Robots and Systems</title>
		<imprint>
			<biblScope unit="volume">1</biblScope>
			<biblScope unit="page" from="31" to="36" />
			<date type="published" when="2001">2001</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b12">
	<analytic>
		<title level="a" type="main">High accuracy stereo vision system for far distance obstacle detection</title>
		<author>
			<persName><forename type="first">Sergiu</forename><surname>Nedevschi</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Radu</forename><surname>Danescu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Dan</forename><surname>Frentiu</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Tiberiu</forename><surname>Marita</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Florin</forename><surname>Oniga</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Ciprian</forename><surname>Pocol</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Rolf</forename><surname>Schmidt</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Thorsten</forename><surname>Graf</surname></persName>
		</author>
		<idno type="DOI">10.1109/ivs.2004.1336397</idno>
		<ptr target="http://dx.doi.org/10.1109/ivs.2004.1336397" />
	</analytic>
	<monogr>
		<title level="m">IEEE Intelligent Vehicles Symposium, Proceedings</title>
				<imprint>
			<date type="published" when="2004">2004</date>
			<biblScope unit="page" from="292" to="297" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b13">
	<analytic>
		<title level="a" type="main">Comparison of Tactile Signals for Collision Avoidance on Unmanned Aerial Vehicles</title>
		<author>
			<persName><forename type="first">Stefan</forename><surname>Spiss</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Yeongmi</forename><surname>Kim</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Simon</forename><surname>Haller</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Matthias</forename><surname>Harders</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="m">Haptic Interaction</title>
		<editor>
			<persName><forename type="first">Shoichi</forename><surname>Hasegawa</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">Masashi</forename><surname>Konyo</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">Ki-Uk</forename><surname>Kyung</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">Takuya</forename><surname>Nojima</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">Hiroyuki</forename><surname>Kajimoto</surname></persName>
		</editor>
		<imprint>
			<publisher>Springer</publisher>
			<date type="published" when="2018">2018</date>
			<biblScope unit="page" from="393" to="399" />
			<pubPlace>Singapore</pubPlace>
		</imprint>
	</monogr>
</biblStruct>


				</listBibl>
			</div>
		</back>
	</text>
</TEI>
