<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta />
    <article-meta>
      <title-group>
        <article-title>Accuracy of throwing distance perception in Virtual Reality</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Karolis Butkus</string-name>
          <email>k.butkus@ktu.edu</email>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Tautvydas Čeponis</string-name>
          <email>tautvydas.ceponis@ktu.edu</email>
          <xref ref-type="aff" rid="aff1">1</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Department of Information Systems, Kaunas University of Technology</institution>
          ,
          <addr-line>Kaunas</addr-line>
          ,
          <country country="LT">Lithuania</country>
        </aff>
        <aff id="aff1">
          <label>1</label>
          <institution>Department of Multimedia, Kaunas University of Technology</institution>
          ,
          <addr-line>Kaunas</addr-line>
          ,
          <country country="LT">Lithuania</country>
        </aff>
      </contrib-group>
      <fpage>121</fpage>
      <lpage>124</lpage>
      <abstract>
        <p>This article investigates how people perceive distances in virtual reality (VR) and use that information to execute a representation of a real-life throwing motion. To measure accuracy, this research proposes a throwing motion testing framework, which acquires metric data from both the real and virtual environments. The results show that the examinees tend to throw more accurately at longer distances and use excessive amounts of force.</p>
      </abstract>
      <kwd-group>
        <kwd>virtual reality</kwd>
        <kwd>perception</kwd>
        <kwd>accuracy</kwd>
        <kwd>throwing motion</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>I. INTRODUCTION</title>
      <p>During the last decade, virtual reality technology has
improved significantly and is now used in many technological
fields. Its visual representations are becoming more realistic
and natural. Yet even as the technology evolves, human senses
remain hard to replicate. This study therefore analyzes how
accurately people perceive virtual-world distances when
executing a throw.</p>
      <p>This study presents a throwing motion testing framework
for determining the differences between perception in virtual
and real environments. It discusses similar studies related to
perception and motion tracking, explains the testing framework
and methodology, describes the experiment’s process, discusses
the results and drawbacks of this study, and closes with
conclusions and possible future work.</p>
      <p>
        A similar project [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] to determine the perception of
virtual reality was carried out in 2008 by researchers from
Aachen, Germany. In their experiment, they asked 23
participants to estimate distances to virtual reality objects
in three different environments. The results showed that
people tend to underestimate distances and that the visual
surroundings did not affect the results considerably.
      </p>
      <p>
        Another article examined people’s ability to locate
themselves in a virtual environment. The task was to point
at themselves on a VR platform using a pointer. The
experiment’s results showed that participants most commonly
located themselves in the upper region of their face,
suggesting that people in a virtual environment are
more head-centered [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ].
      </p>
      <p>
        A more recent study [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] was carried out by researchers
from Iowa State University. The group examined prior
attempts at improving distance perception in a Virtual
environment (VE) and proposed a more thorough
methodology to measure the results by isolating unaccounted
variables in past studies. The experiment tested the
participant’s size and distance perception in a VE replica of a
real world room, with half of the examinees having seen the
room prior to the experiment and half participating blindly.
The first tested method for improved distance perception was
visual replication of a real world environment, the second
was walking interaction, which allowed participants to move
around the virtual environment prior to testing. The results
concluded that walking interaction significantly increased the
accuracy of distance perception and size perception to a
lesser degree. Furthermore, it was more effective than visual
replication in both scenarios.
      </p>
      <p>
        A similar study to research [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ] was carried out at
Clemson University in 2011 [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ]. In this experiment,
researchers investigated near-field egocentric distance
estimations in an immersive virtual environment and
compared them to real world distances. The experiment
examined two methods: verbal and reach measurements.
Participants had to report distances verbally and then
indicate them with their reach. The results show that both
methods tend to underestimate distance and that the deviation
increased with distance. Interestingly, the verbal method was
less accurate than the reach method.
      </p>
      </p>
      <p>
        The study [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] conducted by three researchers from the
Dresden University of Technology attempted to find out
which factors most affect people’s distance estimations in
the virtual world. They arranged the factors into four
groups: measurement methods, technical, compositional and
human factors. The research concluded that people tend to
underestimate distance and that, to improve human distance
recognition, a rich, detailed environment and powerful
technical hardware must be ensured: high quality graphics,
carefully adjusted camera settings and a virtual environment
with a regularly structured ground texture.
      </p>
      <p>
        As several of the studies above note, people tend
to underestimate distances in virtual reality, and according to
Steven M. LaValle the cause could be a mismatch in the
gap between the pupils [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ]. If the pupils in the real world are closer together than
in the virtual world, the virtual environment looks larger to
the user, and the reverse holds if the pupils are further
apart in the real world.
      </p>
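      <p>As a rough illustration of this effect, the perceived scale of the virtual world can be sketched as the ratio of the two pupil separations. The function below is our own first-order approximation for illustration, not a formula from [6]:</p>

```python
def perceived_scale(real_ipd_mm, virtual_ipd_mm):
    """First-order sketch: when the real pupils are closer together
    than the virtual camera separation, the factor exceeds 1 and the
    virtual environment appears larger, matching the text above."""
    return virtual_ipd_mm / real_ipd_mm

# Real pupils 60 mm apart, virtual cameras 66 mm apart:
print(perceived_scale(60, 66))  # prints 1.1
```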
    </sec>
    <sec id="sec-1a">
      <title>II. THROWING MOTION TESTING FRAMEWORK</title>
      <p>To determine the differences in perception between
reality and a virtual environment, we focused on the different
aspects of throwing kinematics in reality and VR. Three
main characteristics are taken into consideration: throwing
distance in reality, throwing distance in virtual reality and the
initial velocity of the hand tracker in a throwing motion.</p>
      <p>To measure the above-mentioned features, a throwing
simulation framework was created. During the testing
procedure participants throw a 10 gram ball to three different
distances (2 meters, 3 meters, 4 meters) while a tracker
attached to their hand transmits VR data, which is recorded
digitally; the real life distance is measured with a ruler.
Each participant makes three attempts at each of the three
distances twice: once viewing through the virtual reality
headset and once with it mounted on top of their head, so that
motion is still tracked while vision is unobstructed.</p>
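      <p>The resulting per-participant schedule can be enumerated as in the following sketch (the names and record structure are our assumptions, for illustration only):</p>

```python
import itertools

DISTANCES_M = (2, 3, 4)                          # target distances from the study
CONDITIONS = ("headset_on", "headset_mounted")   # viewing vs. tracking-only
ATTEMPTS = 3

def trial_schedule():
    """Enumerate every (condition, distance, attempt) trial for one participant."""
    return [
        {"condition": c, "distance_m": d, "attempt": a}
        for c, d, a in itertools.product(CONDITIONS, DISTANCES_M,
                                         range(1, ATTEMPTS + 1))
    ]

schedule = trial_schedule()
print(len(schedule))  # prints 18: 2 conditions x 3 distances x 3 attempts
```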
      <p>The testing system is developed using Unity Engine and
HTC Vive Pro VR headset and tracker. The framework’s
visual environment is a replica of the room where the
simulation was performed so it would not cause distractions
to the participants. Distances at which the ball is thrown and
standing position are marked in both the real and virtual
environments (Fig. 1 and Fig. 2).</p>
      <p>In the experiment, HTC Vive Pro virtual reality headset
and tracker are both connected to a personal computer with
Windows 10 operating system. The tracker’s data collecting
Base stations 2.0 were placed at 5 meter distances from each
other, at opposing corners of the room. The testing
framework was built in Unity 2018.3.5.f1 with an
implemented SteamVR plugin. The original plugin’s view for
the ball throw scene was edited so that it would replicate the
experiment room and the stock throw function was modified
so that it didn’t require any buttons to be pushed. A throw
is initiated when the tracker is swung and its velocity
starts to drop after the steady increase at the start of
the throw. The simulated
environment replica consists of a 9 meter by 6 meter
room with an open top. The layout is positioned at the exact
locations of the real world objects.</p>
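      <p>The release rule just described (trigger the virtual throw once the tracker’s speed first drops after a steady rise) can be sketched as a simple peak detector over sampled speeds. This is our illustrative reconstruction in Python, not the authors’ actual Unity/SteamVR code; the noise-gate threshold is an assumption:</p>

```python
def detect_release(speeds, min_speed=0.5):
    """Return the sample index at which the throw is initiated: the
    first point where speed starts to fall after rising past an
    assumed noise gate (min_speed, in m/s). Returns None otherwise."""
    rising = False
    for i in range(1, len(speeds)):
        if speeds[i] > speeds[i - 1] and speeds[i] > min_speed:
            rising = True
        elif rising and speeds[i - 1] > speeds[i]:
            return i  # speed started to drop: release the virtual ball
    return None

# Speed ramps up during the swing, then falls right after release:
print(detect_release([0.1, 0.4, 1.2, 2.5, 3.1, 2.8, 1.0]))  # prints 5
```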
      <p>To collect accurate motion data, the tracker is
attached to the palm of the participant and the ball is put on
top of the device (Fig. 3). When the person executes a throw
the tracker captures the initial velocity, and upon slowing
down the system initiates a throw in virtual reality and sends
the collected speed and data about the ball’s collision with a
ground surface to a text file. The real distance is measured
with a ruler and all collected digital and non-digital data is
saved in a spreadsheet.</p>
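      <p>A minimal sketch of the per-throw record keeping (the field names are our assumption; the study wrote velocity and collision data to a text file and merged the manual ruler measurements in a spreadsheet):</p>

```python
import csv

FIELDS = ["participant", "condition", "target_m",
          "initial_speed_mps", "vr_distance_m", "real_distance_m"]

def save_throws(path, throws):
    """Write one row per throw; real_distance_m comes from the manual
    ruler measurement, the rest from the tracker and the simulation."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(throws)

save_throws("throws.csv", [
    {"participant": 1, "condition": "headset_on", "target_m": 2,
     "initial_speed_mps": 3.1, "vr_distance_m": 2.1, "real_distance_m": 1.9},
])
```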
    </sec>
    <sec id="sec-2">
      <title>III. EXPERIMENT</title>
      <p>The main goal of the experiment is to determine how
accurately humans perceive distances when using a virtual
throwing mechanism compared to a real world throw.</p>
      <p>Six people participated in the experiment: 4 males and
2 females. The participants’ ages ranged from 19 to 25 years
(mean age 22.3); all of them were healthy and did not suffer
from VR sickness. At the beginning of the test, the
participants were given time to practice throwing in virtual
reality and get used to it. The examinees then made three
consecutive throws at a specified distance without a headset,
followed by three attempts with the virtual reality device.
This process was repeated at the three different throwing
distances. During the experiment, participants were not
allowed to move from the starting position. The collected
distance and velocity data was saved in a spreadsheet.</p>
      <p>The experiment’s results are presented in Table I, where
every participant’s average thrown distance is shown in
centimeters. Results of throws with and without the virtual
reality equipment are separated, and the total average for
each baseline distance is calculated.</p>
      <p>From Table I it is easy to see that people throw the ball
most accurately at a distance of 3 or 4 meters when using the
VR headset, whereas at the 2 meter mark there is a 10
centimeter deviation. With unobstructed vision, however,
people throw the ball more accurately at the first and third
distances, and in that case there is about a 10 centimeter
deviation at the second distance. This data shows that
people’s throws tend to become more accurate as the distance
increases, whereas near distances are more difficult to judge.
In addition, the bar chart in Fig. 4, which represents the
average miss distance from the mark (negative if the throw
falls short of the baseline distance and positive if the
average value is greater), shows that the experiment
participants tend to underestimate distances and throw the
ball a shorter distance. Only two columns show a slight
overthrow, and both belong to results achieved in virtual
reality.</p>
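      <p>The signed miss distance plotted in Fig. 4 reduces to the mean difference between the thrown and baseline distances; a minimal sketch with invented numbers:</p>

```python
def average_miss_cm(thrown_cm, baseline_cm):
    """Mean signed error: negative means the throws fell short of the
    baseline distance on average, positive means overthrow."""
    errors = [t - baseline_cm for t in thrown_cm]
    return sum(errors) / len(errors)

# Three throws at the 200 cm mark, all slightly short:
print(average_miss_cm([190, 185, 195], 200))  # prints -10.0
```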
      <p>Table II shows every participant’s standard deviation
over three throws and the average standard deviation, which is
about 17 centimeters. This suggests that more participants and
throw attempts are needed to make the experiment’s data more
reliable.</p>
      <p>Data about the average initial velocity is presented in
a clustered column chart and a scatter graph (Fig. 5 and Fig.
6), where the baseline distances and the two environments are
separated. Besides average values, medians are given to make
the comparison more robust.</p>
      <p>From Figures 5 and 6 it is noticeable that people tend to
throw the ball with more power when they are in a virtual
environment than when they are in the real world. This
observation is also confirmed by the medians of all throws in
the real and virtual worlds.</p>
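      <p>The mean-versus-median comparison behind Figures 5 and 6 needs only the standard library; the speeds below are invented for illustration, not the study’s data:</p>

```python
from statistics import mean, median

real_speeds = [2.4, 2.6, 2.5, 2.7]  # illustrative initial speeds, m/s
vr_speeds = [2.8, 3.0, 2.9, 3.4]

for label, speeds in (("real", real_speeds), ("vr", vr_speeds)):
    print(label, round(mean(speeds), 2), round(median(speeds), 2))
```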
    </sec>
    <sec id="sec-3">
      <title>IV. DISCUSSION</title>
      <p>This study was conducted to find out how accurately
people perceive the virtual environment and decide what
amount of power is needed to throw the ball. To achieve this
goal 6 participants took part in the experiment where they
had to throw a ball at 3 distances with and without a VR
headset.</p>
      <p>
        After all the tests, the collected data shows that
people’s accuracy in VR tends to increase with distance and
that the average initial speed tends to be higher than when
throwing the ball without the headset. One explanation for the
increase in velocity is that, because people are more
head-centered [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ] in a virtual environment, they sense that distances are
further than they actually are. Moreover, people are more
likely to underthrow than overthrow the ball in real life, and
the increase in velocity when using VR makes their throws more
precise. For close range throws, however, the distances spread
out and accuracy decreases.
      </p>
      <p>
        These results show that the described method can be
used to calibrate hand strength in virtual environment
applications,
such as gaming [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ], simulations [
        <xref ref-type="bibr" rid="ref9">8</xref>
        ], gesture recognition
systems [
        <xref ref-type="bibr" rid="ref10">9</xref>
        ]. The motion force a person outputs in a fully
immersed virtual system has to be decreased by 3–5% to
ensure that the user’s perception of their virtual strength
matches the real world results and compensates for their
depth perception in a VE.
      </p>
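      <p>Applied as a calibration step, the suggested correction is a constant attenuation of the velocity measured in VR; a hedged sketch assuming the 3–5% figure above (4% chosen as a midpoint):</p>

```python
def calibrate_velocity(measured_mps, attenuation=0.04):
    """Scale down the hand velocity captured in VR by a fixed factor
    so the simulated throw matches what the same effort produces in
    reality (attenuation 0.03-0.05 per the experiment's estimate)."""
    return measured_mps * (1.0 - attenuation)

print(round(calibrate_velocity(3.0), 2))  # prints 2.88
```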
      <p>For more accurate estimations, it should be noted that
all velocity data is collected by a wireless tracker, and the
real ball placed on top of the tracker could have interfered
with the results; this could be a reason why the standard
deviation of a few participants’ throws was so high.</p>
      <p>
        In addition, to help participants better comprehend the
depth of the virtual world, they could be
allowed to walk around the room during the experiment, as
shown in research [
        <xref ref-type="bibr" rid="ref3">3</xref>
        ], instead of undertaking the whole
experiment from a standing position with only
their vision to rely on.
      </p>
      <p>Furthermore, it was brought to the examinees’ attention
that, to get more accurate results, they had to perform a
bigger backswing during the throwing motion to achieve a more
consistent velocity and better throw initialization timing.</p>
    </sec>
    <sec id="sec-4">
      <title>V. CONCLUSION AND FUTURE WORKS</title>
      <p>In this study, we concluded that people perceive 2 to 4
meter distances in virtual reality nearly the same as in real
life. Moreover, people tend to use 3 to 5% more power when
throwing a ball in virtual reality than in real life. However,
the methodology used needs improvement (the standard deviation
of some throws is as high as 45 centimeters) to eliminate
unnecessary factors, such as the inaccuracy of manual real
world measurements and the signal integrity loss caused by the
ball’s position relative to the sensor. Furthermore, a larger
pool of participants is needed to achieve precise data averages
and calculations. There is also the possibility of attaching a
separate sensor to the ball being thrown, thus eliminating the
need for real world measurements by allowing the data from
both throws to be compared directly. Although the research is
not perfect, it has considerable potential to be used as a
calibration tool for various virtual reality fields that
involve hand motion and arm strength.</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>C.</given-names>
            <surname>Armbrüster</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Wolter</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Kuhlen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Spijkers</surname>
          </string-name>
          and
          <string-name>
            <given-names>B.</given-names>
            <surname>Fimm</surname>
          </string-name>
          , “
          <article-title>Depth Perception in Virtual Reality: Distance Estimations in Peri- and Extrapersonal Space”</article-title>
          ,
          <source>CyberPsychology &amp; Behavior</source>
          , vol.
          <volume>11</volume>
          , no.
          <issue>1</issue>
          , pp.
          <fpage>9</fpage>
          -
          <lpage>15</lpage>
          ,
          <year>2008</year>
          . Available: 10.1089/cpb.2007.9935
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>A. H. V. D.</given-names>
            <surname>Veer</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A. J. T.</given-names>
            <surname>Alsmith</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. R.</given-names>
            <surname>Longo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H. Y.</given-names>
            <surname>Wong</surname>
          </string-name>
          , and
          <string-name>
            <given-names>B. J.</given-names>
            <surname>Mohler</surname>
          </string-name>
          , “
          <article-title>Where am I in virtual reality?,”</article-title>
          <source>PLoS One</source>
          , vol.
          <volume>13</volume>
          , no.
          <issue>10</issue>
          ,
          <year>2018</year>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>J. W.</given-names>
            <surname>Kelly</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. A.</given-names>
            <surname>Cherep</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Klesel</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Z. D.</given-names>
            <surname>Siegel</surname>
          </string-name>
          , and
          <string-name>
            <given-names>S.</given-names>
            <surname>George</surname>
          </string-name>
          , “
          <article-title>Comparison of Two Methods for Improving Distance Perception in Virtual Reality,”</article-title>
          <source>ACM Transactions on Applied Perception</source>
          , vol.
          <volume>15</volume>
          , no.
          <issue>2</issue>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>11</lpage>
          , Mar. 2018
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <given-names>P.</given-names>
            <surname>Napieralski</surname>
          </string-name>
          et al.,
          <article-title>"Near-field distance perception in real and virtual environments using both verbal and action responses"</article-title>
          ,
          <source>ACM Transactions on Applied Perception</source>
          , vol.
          <volume>8</volume>
          , no.
          <issue>3</issue>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>19</lpage>
          ,
          <year>2011</year>
          . Available: 10.1145/2010325.2010328.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <given-names>R.</given-names>
            <surname>Renner</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Velichkovsky</surname>
          </string-name>
          and
          <string-name>
            <given-names>J.</given-names>
            <surname>Helmert</surname>
          </string-name>
          ,
          <article-title>"The perception of egocentric distances in virtual environments - A review"</article-title>
          ,
          <source>ACM Computing Surveys</source>
          , vol.
          <volume>46</volume>
          , no.
          <issue>2</issue>
          , pp.
          <fpage>1</fpage>
          -
          <lpage>40</lpage>
          ,
          <year>2013</year>
          . Available: 10.1145/2543581.2543590.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <given-names>S.</given-names>
            <surname>LaValle</surname>
          </string-name>
          ,
          <source>Virtual Reality</source>
          . Cambridge University Press,
          <year>2016</year>
          , pp.
          <fpage>153</fpage>
          -
          <lpage>154</lpage>
          . Available: http://vr.cs.uiuc.edu/vrbooka4.pdf
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <given-names>R.</given-names>
            <surname>Buzys</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Maskeliūnas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Damaševičius</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Sidekerskienė</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Woźniak</surname>
          </string-name>
          and
          <string-name>
            <given-names>W.</given-names>
            <surname>Wei</surname>
          </string-name>
          ,
          <article-title>"Cloudification of Virtual Reality Gliding Simulation Game"</article-title>
          ,
          <source>Information</source>
          , vol.
          <volume>9</volume>
          , no.
          <issue>12</issue>
          , p.
          <fpage>293</fpage>
          ,
          <year>2018</year>
          . Available: 10.3390/info9120293.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>E.</given-names>
            <surname>Danevičius</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Maskeliūnas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Damaševičius</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Połap</surname>
          </string-name>
          and
          <string-name>
            <given-names>M.</given-names>
            <surname>Woźniak</surname>
          </string-name>
          ,
          <article-title>"A Soft Body Physics Simulator with Computational Offloading to the Cloud"</article-title>
          ,
          <source>Information</source>
          , vol.
          <volume>9</volume>
          , no.
          <issue>12</issue>
          , p.
          <fpage>318</fpage>
          ,
          <year>2018</year>
          . Available: 10.3390/info9120318.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>A.</given-names>
            <surname>Vaitkevičius</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Taroza</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Blažauskas</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Damaševičius</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Maskeliūnas</surname>
          </string-name>
          and
          <string-name>
            <given-names>M.</given-names>
            <surname>Woźniak</surname>
          </string-name>
          ,
          <article-title>"Recognition of American Sign Language Gestures in a Virtual Reality Using Leap Motion"</article-title>
          ,
          <source>Applied Sciences</source>
          , vol.
          <volume>9</volume>
          , no.
          <issue>3</issue>
          , p.
          <fpage>445</fpage>
          ,
          <year>2019</year>
          . Available: 10.3390/app9030445.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [10]
          <string-name>
            <given-names>D.</given-names>
            <surname>Połap</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Woźniak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Napoli</surname>
          </string-name>
          and
          <string-name>
            <given-names>E.</given-names>
            <surname>Tramontana</surname>
          </string-name>
          ,
          <article-title>"Real-time cloud-based game management system via cuckoo search algorithm"</article-title>
          ,
          <source>International Journal of Electronics and Telecommunications</source>
          , vol.
          <volume>61</volume>
          , no.
          <issue>4</issue>
          , pp.
          <fpage>333</fpage>
          -
          <lpage>338</lpage>
          ,
          <year>2015</year>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>