<?xml version="1.0" encoding="UTF-8"?>
<TEI xml:space="preserve" xmlns="http://www.tei-c.org/ns/1.0" 
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
xsi:schemaLocation="http://www.tei-c.org/ns/1.0 https://raw.githubusercontent.com/kermitt2/grobid/master/grobid-home/schemas/xsd/Grobid.xsd"
 xmlns:xlink="http://www.w3.org/1999/xlink">
	<teiHeader xml:lang="en">
		<fileDesc>
			<titleStmt>
				<title level="a" type="main">Treadmill-framework for cognitive and motoric tests in mixed reality environments</title>
			</titleStmt>
			<publicationStmt>
				<publisher/>
				<availability status="unknown"><licence/></availability>
			</publicationStmt>
			<sourceDesc>
				<biblStruct>
					<analytic>
						<author>
							<persName><forename type="first">Nils</forename><surname>Fischer</surname></persName>
							<email>fischer@hs-ruhrwest.de</email>
							<affiliation key="aff0">
								<orgName type="institution">Hochschule Ruhr West</orgName>
								<address>
									<addrLine>Lützowstraße 5</addrLine>
									<postCode>46236</postCode>
									<settlement>Bottrop</settlement>
									<country key="DE">Germany</country>
								</address>
							</affiliation>
						</author>
						<author>
							<persName><forename type="first">Michael</forename><surname>Schellenbach</surname></persName>
							<email>michael.schellenbach@hs-ruhrwest.de</email>
							<affiliation key="aff0">
								<orgName type="institution">Hochschule Ruhr West</orgName>
								<address>
									<addrLine>Lützowstraße 5</addrLine>
									<postCode>46236</postCode>
									<settlement>Bottrop</settlement>
									<country key="DE">Germany</country>
								</address>
							</affiliation>
						</author>
						<title level="a" type="main">Treadmill-framework for cognitive and motoric tests in mixed reality environments</title>
					</analytic>
					<monogr>
						<idno type="ISSN">1613-0073</idno>
					</monogr>
					<idno type="MD5">486BCBC154A24363D1B4944FB7950253</idno>
				</biblStruct>
			</sourceDesc>
		</fileDesc>
		<encodingDesc>
			<appInfo>
				<application version="0.7.2" ident="GROBID" when="2023-03-24T22:58+0000">
					<desc>GROBID - A machine learning software for extracting information from scholarly documents</desc>
					<ref target="https://github.com/kermitt2/grobid"/>
				</application>
			</appInfo>
		</encodingDesc>
		<profileDesc>
			<textClass>
				<keywords>
					<term>Mixed Reality</term>
					<term>Gait Analysis</term>
					<term>Experimental Setup</term>
					<term>Walking Environment</term>
				</keywords>
			</textClass>
			<abstract>
<div xmlns="http://www.tei-c.org/ns/1.0"><p>Combining a traditional treadmill with a virtual reality (VR) environment comes with many obstacles. To analyse a person's gait in such a scenario, we implemented a laboratory evaluation framework. The participant's actual movement is matched with the virtual movement speed to create an immersive walking environment. In this iteration of the framework, a Woodway treadmill was used in combination with the HTC Vive Pro 2 VR headset. The framework will be used for cognitive and motoric tests, which raises many questions regarding safety and the feeling of uncertainty that arises when walking on a treadmill without being able to see the walkable surface. To give the participants a better awareness of the treadmill, we decided to implement a mixed reality approach for this iteration.</p></div>
			</abstract>
		</profileDesc>
	</teiHeader>
	<text xml:lang="en">
		<body>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="1.">Introduction</head><p>This paper presents a work-in-progress laboratory evaluation framework for cognitive and motoric testing in a secluded environment without any outside disturbances. The goal of this framework is to create a setting in which participants can fully focus their attention on walking without the fear of getting distracted. In the framework presented in this paper, we use the outward-facing cameras of the VR headset to show the real treadmill directly inside the VR environment. The aim is to analyse the gait of participants while they are fully immersed in the simulation. We use different hardware and software components to achieve the aforementioned results.</p><p>The main components of the setup are a treadmill on which the participants walk and a virtual environment in which the movement is displayed. The movement speed of the participants is directly linked to the walking speed of the character in the virtual environment, and the treadmill accelerates or decelerates according to the participants' movements. This information is recorded by a camera and transmitted to the treadmill and the VR environment.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="2.">Related Works</head><p>The idea of using a camera to measure the participants' position on the treadmill was inspired by the work of Lichtenstein et al. <ref type="bibr" target="#b0">[1]</ref>. They present a feedback-controlled interface that controls the treadmill based on the participant's position on it. We expanded upon this idea by using recurring API calls to synchronise the actual treadmill speed with the motion shown in the virtual environment. Following the results of Banton et al. <ref type="bibr" target="#b1">[2]</ref>, we tried to match the virtual environment to reality as closely as possible, reducing discrepancies between the two environments wherever we could.</p><p>The research of Fung et al. <ref type="bibr" target="#b2">[3]</ref> and Kim et al. <ref type="bibr" target="#b3">[4]</ref> already analysed gait using a combination of a treadmill and a virtual environment. Both specifically analysed the gait of post-stroke patients, which is not the target group for this framework. For this evaluation we target healthy persons in order to see whether irregularities appear in the gait when walking on the treadmill while not being able to perceive the real world. Moreover, we want to use this framework to conduct cognitive and motoric tests at the same time. If the safety of the participants is guaranteed, special circumstances, especially therapeutic cases, are a possible future outlook.</p><p>The safety of the participants is the highest priority when creating a virtual walking environment. For their gait to be analysed, the participants have to feel secure. Tests conducted by Schaefer et al. <ref type="bibr" target="#b4">[5]</ref> show that different groups react differently to an uncommon walking scenario. Therefore, we revised many of the components from the first iteration of the framework <ref type="bibr" target="#b5">[6]</ref> and made it easier to use and understand.</p><p>We modified the approach used by Czienskowski et al. <ref type="bibr" target="#b6">[7]</ref> and used an Azure Kinect DK camera to create a self-propelled configuration. This increases the immersion of the framework and allows for a smooth transition between the real and virtual worlds. Since we want to conduct tests with the framework going forward, discrepancies need to be minimized.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.">Changes from the first iteration</head><p>The Oculus Quest virtual reality headset was replaced by the HTC Vive Pro 2. This change was made because of the wireless adapter available for the HTC Vive Pro 2. The Oculus Quest is a standalone device that doesn't need a connection to any other system, while the HTC Vive Pro 2 streams a game running on a separate system. A cable is a hazard while walking on a treadmill, which is why the Oculus Quest was chosen at first. After further testing, however, it became apparent that the Oculus Quest doesn't provide the kind of performance a standard PC can provide. The HTC Vive Pro 2 delivers better visuals and higher resolutions because the processing is offloaded to a Windows PC.</p><p>Another change was made to the game engine, which was switched from Unity3D to the Unreal Engine 4. This change was made because the repetitive environment was a major problem in the first iteration. With the Voxel Plugin (https://voxelplugin.com) for the Unreal Engine 4, it is possible to create highly performant landscapes procedurally. With this technology we can create a road through a forest that doesn't feel repetitive. Furthermore, the underlying landscape determines which kind of object needs to be rendered. The forest can therefore be rendered on the fly, which greatly reduces the necessary processing power and enhances the authenticity of the virtual environment.</p><p>Additionally, we discarded the idea of using controller inputs for fine adjustments of the treadmill in order to have a clearer data flow between the different systems. Moreover, the self-propelled approach gives the participants a higher sense of security because they control the speed of the treadmill directly with their own movements.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="3.1.">Mixed Reality Approach</head><p>Based on the results of the first iteration of the framework, we decided to abandon the idea of using a motion capture system to replicate the participants' real-life motions. Previously, the captured data was streamed to a virtual character to show the leg movement, since the participants cannot see their actual body. Dropping this approach has the benefit that the participants no longer have to put on the whole suit for every session. Moreover, the representation of the movement was not always accurate when displayed in the virtual environment.</p><p>Therefore, we gave up on a complete virtual reality setup and chose a mixed reality approach instead. While the participants are fully immersed in the virtual environment, we use the front cameras of the HTC Vive Pro 2 to always show the treadmill underneath, giving the participants a clear view of their position on the treadmill. Even though the distance is measured and the treadmill is controlled based on their position, the participants can verify for themselves whether they are walking correctly. The most criticised point from the first iteration was the feeling of uneasiness while walking on a moving object without actually seeing it. With this approach, all participants can check whether they are walking correctly by simply looking down, which gives them a higher sense of security when using the framework.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Figure 1: View of the treadmill inside the environment</head><p>Figure <ref type="figure">1</ref> shows the treadmill as seen through the HTC Vive Pro 2 virtual reality headset. The external cameras are used to bring the real world into the virtual environment. In this case we use the Reality Mixer application (https://go.hrw.nrw/Eibp97) to bring an object into the environment based on its position inside the room. The HTC base stations track the headset within the virtual room setup, and objects within that area can be integrated into any given virtual reality application. By looking down while inside the virtual environment, the participants can see the treadmill based on its position in the room. This allows for a safer walking experience overall.</p></div>
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="4.">Framework overview</head><p>The presented setup is centred around a Woodway PPS55 med treadmill which is connected to an Azure Kinect DK. The communication between these two components is handled by a Windows PC functioning as the control unit. The camera records the participants and sends the information to the control unit. The position of the participants on the treadmill is determined from their distance to the camera. This information is then used to slow down the treadmill if the participants are at the back, or to increase the speed if they are towards the front of the treadmill. This self-propelled approach adjusts the treadmill speed according to the walking speed of the participants, which means the treadmill is completely controlled by the movements of the individuals walking on it.</p><p>To simulate a secluded environment, the participants wear a virtual reality headset. In this iteration of the framework we use the HTC Vive Pro 2 with the Vive wireless adapter. The headset displays a virtual environment created with the Unreal Engine 4 and running on a separate Windows PC: the headset functions as a display while the processing power of the PC is used for rendering the environment. To send speed information to the VR environment, we implemented a web server in the Unreal project that changes the movement speed according to the received values.</p><p>A separate Raspberry Pi microcomputer is used to communicate with the treadmill. It runs a REST API that translates the incoming information from the Windows PC into hexadecimal values the treadmill understands. This REST API also sends the current speed values to the web server running in the virtual environment in order to match the actual speed of the treadmill to the motion in the virtual environment. The REST API always sets a new speed value according to the distance measured by the Kinect camera system first, and sends the updated values to the virtual environment afterwards. This provides the virtual environment with incremental updates until the new speed value has been reached.</p><p>The project created in the Unreal Engine 4 features a large forest that is fully procedurally generated based on the heights of the underlying landscape. This means that different height ranges are populated with different textures and objects. The character walks along a predetermined road through this forest and encounters varied scenery along the way. The height map for this simulation is almost flat because the simulation is used to analyse a normal, consistent gait. However, the design can also be used to simulate slopes if the treadmill supports such a feature.</p></div>
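The self-propelled control described above can be sketched as a simple mapping from the participant's measured distance to a new treadmill speed. The following is a minimal illustration only; the zone boundaries, gain, and speed limit are assumed values, not parameters from the framework:

```python
# Hypothetical sketch of the self-propelled speed control.
# All constants are illustrative assumptions, not values from the paper.

NEUTRAL_ZONE = (1.4, 1.8)   # distance band (m) in which the speed is held
GAIN = 0.5                  # speed change in km/h per metre of displacement
MAX_SPEED = 6.0             # safety ceiling in km/h

def target_speed(current_speed: float, distance_to_camera: float) -> float:
    """Map the participant's distance to the Kinect to a new treadmill speed.

    Walking towards the front (small distance) raises the speed;
    drifting to the back (large distance) lowers it.
    """
    near, far = NEUTRAL_ZONE
    if distance_to_camera < near:          # participant near the front
        delta = GAIN * (near - distance_to_camera)
        return min(current_speed + delta, MAX_SPEED)
    if distance_to_camera > far:           # participant near the back
        delta = GAIN * (distance_to_camera - far)
        return max(current_speed - delta, 0.0)
    return current_speed                   # inside the neutral zone: hold speed
```

A neutral zone in the middle of the belt avoids constant micro-adjustments; only clear drifts towards the front or back change the commanded speed.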
<div xmlns="http://www.tei-c.org/ns/1.0"><head>Figure 2: View into the virtual environment</head><p>The environment created in the Unreal Engine 4 is shown in figure <ref type="figure">2</ref>, where a linear road can be seen. The participants walk along this road while wearing the virtual reality headset and walking on the treadmill. We chose a linear, almost flat road to provide an environment that resembles the flat surface of the treadmill. There are also no objects on the road, so the participants don't feel the need to avoid obstacles with unnatural movements. Such movements would disturb the gait analysis and, more importantly, could lead to injuries.</p><p>The Raspberry Pi sits at the centre of the architecture and receives the new values derived from the Azure Kinect DK in order to forward them to both the treadmill and the Unreal project. However, the information is not sent to both systems at the same time, because the treadmill needs some time to accelerate or decelerate to the new speed value. Therefore, the information is first transmitted to the treadmill, and afterwards the current speed of the treadmill is used as the new value for the virtual environment. The Raspberry Pi queries the treadmill for its current speed at a fixed interval, so that even the small increments between the last speed value and the new one are shown in the virtual environment. This prevents the participants from seeing a different speed than the one they are experiencing on the treadmill itself.</p><p>We want to have as few inconsistencies as possible in order to reduce the probability of accidents.</p></div>
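The fixed-interval polling described above can be sketched as a small forwarding loop. The function names, the interval, and the injected callables are hypothetical, since the paper does not specify the actual API of the Raspberry Pi component:

```python
import time
from typing import Callable

def sync_speed(read_treadmill_speed: Callable[[], float],
               push_to_unreal: Callable[[float], None],
               interval: float = 0.1,
               ticks: int = 10) -> None:
    """Poll the treadmill's current speed at a fixed interval and forward
    each reading to the virtual environment, so the visual motion follows
    every intermediate value while the belt accelerates or decelerates.

    The two callables stand in for the real treadmill query and the HTTP
    call to the web server inside the Unreal project.
    """
    for _ in range(ticks):
        push_to_unreal(read_treadmill_speed())
        time.sleep(interval)
```

Forwarding the treadmill's *measured* speed, rather than the commanded target, is what keeps the virtual motion aligned with the belt during acceleration and deceleration.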
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="5.">Results of technical testing</head><p>While walking on the treadmill with the HTC Vive Pro 2 headset and the described features, a couple of future improvements were identified. When walking in a straight line, humans can normally look down just by moving their eyes, without tilting the whole head. With a virtual reality headset, a head motion is necessary to see what is right in front of oneself. This leads to the problem we already encountered in the first iteration: a longer treadmill would make it easier to walk in any virtual environment. With the PPS55 med treadmill, participants cannot see the treadmill comfortably without looking down. Showing the treadmill directly improves the overall walking experience significantly, but it can certainly be enhanced further. A visual marker could be shown in front of the participants whenever they leave the comfortable walking area. This could be a slider that represents the treadmill as a whole and is updated based on the camera position, giving the participants additional visual support. Another idea would be an external camera whose footage is shown inside the virtual environment. This would break the immersion slightly, but would increase the safety of the setup. A plethora of options could be made available, and the participants could choose the ones they want while conducting the experiment.</p><p>We also conducted a first test with a few selected participants to get their impressions of the framework in its current form. The overall consensus was that the experience had improved since the first iteration, but the participants also gave feedback on possible improvements. It became apparent that the height of the participants had a large effect on the overall experience: the taller the participant, the smaller the angle between the head position and the front of the treadmill. Taller participants had to make a greater effort to see the front of the treadmill. Another point was walking on the treadmill itself. Most of the participants needed guidance to find a pleasant movement speed, but most could get used to the environment. Therefore, an adjustment period has to be considered when walking in the environment.</p></div>
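The proposed slider could, for example, normalise the camera's distance reading to a 0–1 range before it is drawn in the headset. The front and back distances below are assumed values for illustration only:

```python
def slider_value(distance_to_camera: float,
                 front: float = 1.0, back: float = 2.5) -> float:
    """Normalise the participant's position on the belt to [0, 1]
    (0 = front edge, 1 = back edge) for a hypothetical HUD slider.

    `front` and `back` are assumed camera distances for the belt edges.
    """
    span = back - front
    value = (distance_to_camera - front) / span
    return min(max(value, 0.0), 1.0)   # clamp readings outside the belt
```

Clamping to the edges keeps the indicator stable even when the depth reading briefly overshoots the physical belt length.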
<div xmlns="http://www.tei-c.org/ns/1.0"><head n="6.">Upcoming Work</head><p>Programmatically, the next step is fine-tuning the software that derives the participants' positional data from the Azure Kinect DK. Walking on the treadmill showed that it is rather difficult to deliberately move to the back of the treadmill in order to slow it down. Without clear indicators of where the back end of the treadmill is, moving towards it is genuinely dangerous. We want to use the camera to stop the treadmill when a certain limit at the back of the treadmill is crossed. This means the treadmill should slow down earlier at the back end than it speeds up when the participant approaches the front.</p><p>One idea from the previous section was to offer a multitude of options for the participants to choose from. These options could be available to every participant, who could decide which of them improve their overall experience. We display the treadmill underneath the virtual character, but other aids for orientation on the treadmill could be positional markers that indicate the position even further. Moreover, a slider in front of the virtual character could help visualise the movement on the treadmill. Another option could be an overhead camera whose footage is shown directly inside the virtual environment. To get a better picture of the needed features, we plan to conduct a test run with a larger group of participants after fine-tuning the software. Going forward, this will help the future development of the framework and the experiments we want to use it for.</p></div><figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_0"><head>Figure 3 :</head><label>3</label><figDesc>Figure 3: Data flows between the different systems</figDesc><graphic coords="3,302.62,171.50,203.36,163.28" type="bitmap" /></figure>
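The planned back-limit behaviour could be sketched as a small decision function; the thresholds and command names below are assumptions for illustration, not part of the existing framework:

```python
def safety_command(distance_to_camera: float,
                   stop_limit: float = 2.3,
                   slow_zone: float = 2.0) -> str:
    """Decide the treadmill command from the participant's position:
    stop immediately behind the hard limit, start braking inside the
    slow zone in front of it, otherwise keep normal speed control.

    `stop_limit` and `slow_zone` are assumed camera distances (m).
    """
    if distance_to_camera >= stop_limit:
        return "STOP"
    if distance_to_camera >= slow_zone:
        return "DECELERATE"
    return "NORMAL"
```

Placing the slow zone in front of the hard limit realises the asymmetry described above: the belt brakes earlier towards the back than it accelerates towards the front.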
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_1"><head>Figure 3</head><label>3</label><figDesc>Figure 3 shows the different systems that are used for the application and how they interact with one another. The emergency functionality communicates directly with the treadmill and stops the whole data exchange process between the treadmill and the Raspberry Pi. The Windows PC hosting the game and the web server only runs the game engine, and the HTC Vive Pro 2 is used to display the environment for the participants. Every new data package starts from the Azure Kinect DK camera and is interpreted by the control unit as the new recommended speed value for the treadmill. This value is based on the position of the participants on the treadmill.</figDesc><graphic coords="3,89.29,484.61,203.33,100.91" type="bitmap" /></figure>
<figure xmlns="http://www.tei-c.org/ns/1.0" xml:id="fig_2"><head>Figure 4 :</head><label>4</label><figDesc>Figure 4: View of the framework's components</figDesc><graphic coords="4,89.29,118.18,203.34,135.56" type="bitmap" /></figure>
		</body>
		<back>
			<div type="references">

				<listBibl>

<biblStruct xml:id="b0">
	<analytic>
		<title level="a" type="main">A feedback-controlled interface for treadmill locomotion in virtual environments</title>
		<author>
			<persName><forename type="first">L</forename><surname>Lichtenstein</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Barabas</surname></persName>
		</author>
		<author>
			<persName><forename type="first">E</forename><surname>Peli</surname></persName>
		</author>
		<author>
			<persName><forename type="first">R</forename><forename type="middle">L</forename><surname>Woods</surname></persName>
		</author>
		<idno type="DOI">10.1145/1227134.1227141</idno>
	</analytic>
	<monogr>
		<title level="j">ACM Transactions on Applied Perception</title>
		<imprint>
			<biblScope unit="page">7</biblScope>
			<date type="published" when="2007">2007</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b1">
	<analytic>
		<title level="a" type="main">The perception of walking speed in a virtual environment</title>
		<author>
			<persName><forename type="first">T</forename><surname>Banton</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Durgin</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Fass</surname></persName>
		</author>
		<author>
			<persName><forename type="first">D</forename><surname>Proffitt</surname></persName>
		</author>
		<author>
			<persName><forename type="first">J</forename><surname>Stefanucci</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Presence: Teleoperators and Virtual Environments</title>
				<imprint>
			<date type="published" when="2005">2005</date>
			<biblScope unit="page" from="394" to="406" />
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b2">
	<analytic>
		<title level="a" type="main">A treadmill and motion coupled virtual reality system for gait training post-stroke</title>
		<author>
			<persName><forename type="first">J</forename><surname>Fung</surname></persName>
		</author>
		<author>
			<persName><forename type="first">A</forename><surname>Lamontagne</surname></persName>
		</author>
		<author>
			<persName><forename type="first">F</forename><surname>Malouin</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><forename type="middle">J</forename><surname>Mcfadyen</surname></persName>
		</author>
		<author>
			<persName><forename type="first">C</forename><forename type="middle">L</forename><surname>Richards</surname></persName>
		</author>
		<idno type="DOI">10.1089/cpb.2006.9.157</idno>
	</analytic>
	<monogr>
		<title level="j">CyberPsychology and Behavior</title>
		<imprint>
			<biblScope unit="volume">9</biblScope>
			<biblScope unit="page" from="157" to="162" />
			<date type="published" when="2006">2006</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b3">
	<analytic>
		<title level="a" type="main">Effects of virtual reality treadmill training on community balance confidence and gait in people post-stroke: a randomized controlled trial</title>
		<author>
			<persName><forename type="first">N</forename><surname>Kim</surname></persName>
		</author>
		<author>
			<persName><forename type="first">Y</forename><surname>Kim</surname></persName>
		</author>
		<author>
			<persName><forename type="first">B</forename><surname>Lee</surname></persName>
		</author>
		<author>
			<persName><forename type="first">W</forename><surname>Min</surname></persName>
		</author>
	</analytic>
	<monogr>
		<title level="j">Journal of Experimental Stroke and Translational Medicine</title>
		<imprint>
			<biblScope unit="volume">9</biblScope>
			<date type="published" when="2016">2016</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b4">
	<analytic>
		<title level="a" type="main">Walking in high-risk settings: Do older adults still prioritize gait when distracted by a cognitive task</title>
		<author>
			<persName><forename type="first">S</forename><surname>Schaefer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Schellenbach</surname></persName>
		</author>
		<author>
			<persName><forename type="first">U</forename><surname>Lindenberger</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Woollacott</surname></persName>
		</author>
		<idno type="DOI">10.1007/s00221-014-4093-8</idno>
	</analytic>
	<monogr>
		<title level="j">Experimental Brain Research</title>
		<imprint>
			<date type="published" when="2014">2014</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b5">
	<analytic>
		<title level="a" type="main">Simulation of a Virtual Reality Environment for Cognitive and Motoric Testing with Unity and a Mechanical Treadmill</title>
		<author>
			<persName><forename type="first">N</forename><surname>Fischer</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Schellenbach</surname></persName>
		</author>
		<idno type="DOI">10.2312/egve.20211337</idno>
	</analytic>
	<monogr>
		<title level="m">ICAT-EGVE 2021 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments - Posters and Demos</title>
				<editor>
			<persName><forename type="first">J</forename><surname>Maiero</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">M</forename><surname>Weier</surname></persName>
		</editor>
		<editor>
			<persName><forename type="first">D</forename><surname>Zielasko</surname></persName>
		</editor>
		<imprint>
			<publisher>The Eurographics Association</publisher>
			<date type="published" when="2021">2021</date>
		</imprint>
	</monogr>
</biblStruct>

<biblStruct xml:id="b6">
	<analytic>
		<title level="a" type="main">Feedback-controlled locomotion in virtual environments</title>
		<author>
			<persName><forename type="first">P</forename><surname>Czienskowski</surname></persName>
		</author>
		<author>
			<persName><forename type="first">M</forename><surname>Schellenbach</surname></persName>
		</author>
		<author>
			<persName><forename type="first">T</forename><surname>Oertzen</surname></persName>
		</author>
		<idno type="DOI">10.1145/1463160.1463216</idno>
	</analytic>
	<monogr>
		<title level="m">NordiCHI &apos;08: Proceedings of the 5th Nordic conference on Human-computer interaction: building bridges</title>
				<imprint>
			<date type="published" when="2008">2008</date>
			<biblScope unit="page" from="447" to="450" />
		</imprint>
	</monogr>
</biblStruct>

				</listBibl>
			</div>
		</back>
	</text>
</TEI>
