<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>Y. Sun);</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Wrist-worn Haptic Design for 3D Perception of the Surrounding Airflow in Virtual Reality⋆</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Yuxuan Sun</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <contrib contrib-type="author">
          <string-name>Yuta Sugiura</string-name>
          <xref ref-type="aff" rid="aff0">0</xref>
        </contrib>
        <aff id="aff0">
          <label>0</label>
          <institution>Keio University</institution>
          ,
          <addr-line>Yokohama</addr-line>
          ,
          <country country="JP">Japan</country>
        </aff>
      </contrib-group>
      <pub-date>
        <year>2024</year>
      </pub-date>
      <volume>000</volume>
      <fpage>0</fpage>
      <lpage>0003</lpage>
      <abstract>
        <p>This study proposed a wrist-worn device that displays 3-dimensional (3D) environmental airflow information during Virtual Reality (VR) experience by controlling the rotation of air outlets using servo motors and providing physical airflow using air pumps. Instead of air resistance as the consequence of strong airflow, we focused on reflecting the natural existence of the airflow itself, while maintaining the advantages of wearable devices including being lightweight and convenient to set up. We designed VR scenes to record users' perception accuracy of airflow in diferent directions and their level of confidence, and evaluated through a within-subject experiment. Based on the results, we discussed the methods for future advancement of haptic perception as well as possible VR application scenarios.</p>
      </abstract>
      <kwd-group>
        <kwd>eol&gt;Virtual Reality</kwd>
        <kwd>Haptics</kwd>
        <kwd>Wearable Design</kwd>
        <kwd>Airflow Sensation</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>1. Introduction</title>
    </sec>
    <sec id="sec-2">
      <title>2. Related Work</title>
      <p>
        To address the lack of real-time haptic perception of Due to limited configurations of current commercial VR
VR experience, researchers have presented multiple ap- devices, they cannot provide real-time and lifelike haptic
proaches for simulating wind using air jets and pro- feedback along with the change of visual scene. Thereby,
pellers. Since the airflow is generally produced during approaches have been investigated to physically augment
the player’s two-degree-of-freedom (2-DoF) movements users’ feeling of motion and wind [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ].
and three-degree-of-freedom (3-DoF) rotation, or the use
of hand-held tools, most studies paid attention to the cre- 2.1. Airflow-based Force Feedback
ation of instant two-dimensional (2D) directional force on
the player’s body as the efect of airflow and air pressure. Researchers have installed air jets [
        <xref ref-type="bibr" rid="ref2 ref5">2, 3</xref>
        ] and ducted fans
      </p>
      <p>
        In this study, we aim to convey the information of [
        <xref ref-type="bibr" rid="ref6">4</xref>
        ] on the head-mounted display (HMD) and combine
continuous 3D airflow in the virtual surrounding envi- air propulsion forces from multiple directions to drive
ronment to the user, through a small and lightweight the head’s ofset, aiming for supporting 2-DoF lateral
device that makes a localized body area come into con- acceleration and 3-DoF rotation. They have proved the
tact with physically generated airflow. Therefore, we efectiveness of presenting air propulsion on physical
deproposed a wearable device (Fig. 1 (a)) that allows users vices to simulate air resistance in VR as the consequence
to be physically aware of the presence of the wind from of the virtual body’s movements. On the other hand,
specific directions by the tactile perception on the skin Yu et al. [
        <xref ref-type="bibr" rid="ref7">5</xref>
        ] have proposed the design of a 360°
vibrosurface around the wrist without checking any visual tactile headband to provide directional cues, which is
data or auditory cues. helpful in indirectly indicating the airflow direction. So
      </p>
      <p>
        By conducting a within-subject user experiment, we far, these studies have primarily focused on actuating the
evaluated if users were able to understand the wind direc- head to address the lack of tactile perception during the
tion correctly and confidently through the haptic device. high-speed activities in the virtual world, and
successfully promote the immersion of VR systems. However,
lots of people experienced discomfort such as dizziness
and nausea when using VR headsets due to receiving
conflicting visual and body movement information [
        <xref ref-type="bibr" rid="ref8 ref9">6, 7</xref>
        ].
      </p>
      <p>
        Considering that environmental airflow dynamics are
largely independent of human activities, the simulation
of environmental airflow on a HMD while participants
engage in autonomous movement may result in
inconsistencies between visual stimuli and head-driven haptic
stimuli. This condition could similarly cause discomfort
of information transfer, which increases the possibility of
demonstrating 3D surrounding environment by
airflowbased feedback. Hence, our goal is to retain the benefit
of wearable design while achieving the presentation of
3D airflow patterns for VR users.
3. System Implementation
3.1. Wearable Haptic Device
the air propulsion has been achieved through air jets [
        <xref ref-type="bibr" rid="ref3 ref4">8, 9</xref>
        ]
or propellers [10, 11, 12] attached to handheld controllers,
yet confining hands to the grabbing posture and
limiting natural movements. A handheld tool will become an
obstacle instead of an aid if users are expected to freely
manipulate control interfaces or objects while
perceiving ambient airflow, such as pushing buttons or holding
sticks with hands and fingers. The wrist-worn AirGlove
[13] and Wind-blaster [14] have proved the feasibility of
applying thrust force of varying magnitudes and from
different directions on the hand, at the same time enabling
lfexible hand movements and rotations. Whether
handheld or wrist-worn devices, their primary purpose is the
exertion of force on the human body during movements
or impacts involving individuals or objects, regardless of
any hindrance of manual dexterity.
      </p>
      <p>
        To make sure that airflow from 3-DoF directions can make
contact with the skin surface, the haptic device has to
be worn in a body location that is not limited to a single
2D plane. Among the body parts that have both dorsal
and ventral side, the wrist has the highest sensitivity for
airflow recognition [
        <xref ref-type="bibr" rid="ref12">17</xref>
        ]. Therefore, we chose the wrist
as the designated wearable body location for our design.
2.2. Wind Sensation The device (Fig. 1 (a)) consists of two identical
direction control systems, which are composed of 3D-printed
The studies above took advantage of instant generation components and connected by 3cm wide elastic bands.
of airflow and air propulsion force to reflect the impact of Each system faces the wrist from the left or right side
human virtual activities, while our research is focusing and contains two micro servo motors (SG92R), which
on continuous existence of airflow for human perception respectively control the orientation of an air outlet in
of the virtual environment. Similarly, Ito et al. [
        <xref ref-type="bibr" rid="ref10">15</xref>
        ] posi- the up-down and front-back directions. Each air outlet is
tioned two fans at the front and back of the VR user to remotely connected to a 12V air pump (AP520B-120) via
produce actual wind in the environment, whereas as the an air tube (6mm outer diameter, 3mm inner diameter).
support of airflow variation it had limited presentation Both two air pumps are supported by a 12V battery as a
because of the fixed location and orientation. FaceHap- power resource, and their open/close status is controlled
tics [
        <xref ref-type="bibr" rid="ref11">16</xref>
        ] included wind sensing as part of an integrated by a motor driver (DRV8835).
movable haptic system, making user perception more The motor driver and four servo motors are all
conspatialized. However, it has the same drawback as other nected to and controlled by a Arduino UNO
microconhead-worn devices, which is a large system weight that troller board. Each servo motor has a degree of freedom
afects head movements and possibly causes physiolog- of 160 degrees. Therefore, airflow from Up, Down, Front,
ical discomfort. Moreover, Jaeyeon Lee and Geehyuk and Back directions is provided by two air outlets
simultaLee [
        <xref ref-type="bibr" rid="ref12">17</xref>
        ] paid attention to haptic stimuli at diferent loca- neously, while airflow from other directions is managed
tions throughout the full body, and resolved the concern by the corresponding left or right direction control
systhrough a lightweight and contactless wearable display. tem. When airflow comes from an upward angle, the
They proposed that displaying 2D airflow patterns has air outlet is oriented upward, allowing the airflow to
comparable efectiveness to vibrotactile patterns in terms contact the upper side of the wrist skin; vice versa.
Similarly, airflow coming from a forward or backward angle
is managed in the same manner.
      </p>
      <sec id="sec-2-1">
        <title>3.2. VR Application</title>
        <sec id="sec-2-1-1">
          <title>To analyze human recognition of 3D airflow directions,</title>
          <p>we implemented two VR scenes for 3D data recording: 1)
Practice Scene, in which users can feel free to experience
the feeling of airflow in diferent directions; 2) Test Scene,
in which users indicate the airflow direction they perceive
and answer their level of confidence.</p>
          <p>The Practice Scene has 14 "Press me!" buttons posi- Figure 3: Bar chart of participants’ accuracy in perception
tioned in evenly distributed 14 directions originating tests, and line chart of their considered dificulty of perception
from the VR headset (Fig. 1 (b)). 6 of them are the posi- on a 5-point Likert scale, in terms of three dimensions.
tive and negative directions of the X, Y, and Z axes (i.e.
+X, -X, +Y, -Y, +Z, -Z), while the other 8 correspond to
the diagonal directions of the 8 quadrants formed by 4.1. Study Procedure
these axes. When a "Press me!" button is pressed, the VR
system decides the status of air pumps and the rotation Participants sat on a non-swivel chair in a quiet and
angles of servo motors that make the air outlets pointing closed area. Participants wore our haptic device on the
to the corresponding position on the wrist skin originat- wrist of their non-dominant hand, and placed their hand
ing from the haptic device, then send data to the haptic and elbow on 4.5cm stands on the table with palm facing
device via communication with Arduino. inward. Participants then put on a Meta Quest Pro
head</p>
          <p>Following the same mechanism, the Test scene ran- set and a noise-canceling earphone with the researcher’s
domly selects one of the 14 directions as a test question help, and hold a VR controller in the dominant hand.
and controls the Arduino to present the corresponding Participants first tried "Press me!" buttons in the
Pracairflow, with participants completing 28 tests in which tice Scene without a time limit, and then proceeded to the
each direction occurs twice. There are 14 black buttons Test Scene. For each test, participants selected a black
butin the same position as the 14 "Press me!" buttons in Prac- ton to indicate the perceived airflow direction, selected
tice Scene, which means that the user’s answers are also confidence level and continued to the next test. After
among the 14 directions. After the VR system receives completing 28 tests, participants filled out a
questionthe answer, 5 button choices from "Very Unconfident" to naire about haptic perception experience, with responses
"Very Unconfident" will be shown on the scene to require recorded anonymously.
users’ level of confidence.</p>
        </sec>
      </sec>
      <sec id="sec-2-2">
        <title>4.2. Participants</title>
        <p>4. User Study We recruited 8 participants, 4 males and 4 females with
age 22-28 (mean = 25.3, SD = 2.1). All participants were
To evaluate users’ ability to understand the direction of right-handed and three had prior experience with tactile
3D airflow by wearing the haptic device, we conducted a experiments. 2 participants are familiar with VR. Each
within-subject study on perception accuracy and level of participant received 1,500 yen as an honorarium.
confidence.
4.3. Results
changed from spherical rotation to 2D movement to
create a clearer positional distinction between the front and
back ends. To strengthen the intensity of wind
perception, the air pumps could be replaced with more
powerful alternatives. Regular short pauses could be
implemented during the continuous display of airflow to
facilitate users’ comprehension. A quantitative evaluation
could be carried out to assess the precision of haptic
display. Additionally, to further investigate the efect
of wind contact on various skin areas during VR
experience, the device could be worn on multiple diferent
body locations other than the wrist.</p>
      </sec>
      <sec id="sec-2-3">
        <title>5.2. Possible Applications</title>
        <p>Fig. 2 shows the distribution of airflow perception
errors and participants’ level of confidence regarding each
airflow direction. Participants (P2/5/6) explained that
changes in the hit point on their skin helped them
determine wind directions. P6 noted that her direction
recognition also relied on the skin area where she felt particularly
cool. P4/7/8 judged directions referring to the structure of
the haptic device and movement of the air outlet around
the wrist. Users’ average level of confidence is shown
to be similar when deciding diferent airflow directions
(mean = 3.32, SD = 0.27). P5 mentioned that her
conifdence increased when airflow changed along a single
dimension, e.g. Front-Right-Up to Back-Right-Up.</p>
        <p>The accuracy and considered dificulty of airflow
perception in three dimensions are shown in Fig. 3.
Participants demonstrated the highest accuracy and the lowest
dificulty in answering Left/Right direction among the
three dimensions. In contrast, we observed their accuracy
in determining Front/Back direction to be much lower
compared to the other two dimensions, but participants
considered figuring out Up/Down direction to be as
dificult as figuring out Front/Back direction. In more detail,
P3 described that she first determined the Left/Right side,
then a more specific position (i.e. Up/Down and
Front/Back on the Left/Right side). Moreover, P1 sometimes
felt the airflow was too weak, and P6 suggested that
increasing wind strength might make it easier to recognize
the wind direction. P3/4 believed they could better
understand airflow direction if it were intermittently displayed.</p>
        <p>Compared to placing fans in physical rooms or
equipping them with HMD, our wearable device could be more
compact and convenient for experiencing the sensation
of wind in VR. Our design, which supports continuous
3D reflection of the surrounding, could be particularly
suitable for scenarios involving 3-DoF movements at
special locations, e.g. high altitudes, forests and caves. In
such situations, users are exposed to a 3D space
encountering airflow from various dimensions, hence airflow
plays an important role in users’ environmental
cognition. Through future development, we aim to provide
more precise directional information to help users
better understand environmental changes in such contexts.</p>
        <p>
          As a hands-free haptic device, we propose wearing it
while engaging in games like battle games that require
holding guns and knives, or applications like flight
simulators that require mastering steering wheels or control
5. Discussion joysticks. Based on these possibilities, we will examine
the device’s usability within such sample scenarios that
5.1. Improvement of Airflow Perception involve realistic body movements. Simultaneously, the
influence of visual content in HMD on users’ recognition
According to the feedback, we analyzed the factors influ- of airflow direction will be investigated.
encing user recognition of airflow direction. Considering Our device could be used together with plenty of haptic
that the accuracy of airflow perception from Left/Right, equipment worn in diferent body areas, such as on the
Up/Down and Front/Back directions decreases in turn, hands [
          <xref ref-type="bibr" rid="ref13">18</xref>
          ] or on the back [
          <xref ref-type="bibr" rid="ref14">19</xref>
          ]. Their combination may
we summarized two possible causes: 1) The diference yield synergistic benefits, creating a more comprehensive
between skin contact areas. Airflow from Left/Right di- tactile perception while immersing in the virtual world.
rection touches the wrist dorsal or wrist ventral, similarly
airflow from Up/Down direction touches the upper or
lower side of the wrist. The Front/Back direction is paral- 6. Conclusion
lel to the arm, resulting in no specific boundary as
reference points, hence users may not be able to clearly define In this study, we developed a wrist-worn haptic device
the distinction. 2) The structure of the designed haptic that transfers 3D information of the surrounding
virdevice, involving two parts respectively facing the wrist tual airflow to VR users by controlling the directions
dorsal and wrist ventral. The actuation of servo motors of two air outlets. Through a user study, we analyzed
and movement of the air outlet on each part may amplify their accuracy of recognizing 3D airflow direction and
the users’ tactile sensation of Left/Right direction. their corresponding level of confidence. In the future, we
        </p>
        <p>To address current user confusion, we will implement will explore various methods to improve the accuracy of
various measures to improve the accuracy of 3D direc- airflow recognition, and apply this haptic design to VR
tion recognition. The movement of air outlets could be scenarios with fluctuating wind conditions.</p>
      </sec>
    </sec>
    <sec id="sec-3">
      <title>Acknowledgments</title>
      <sec id="sec-3-1">
        <title>Part of this work was supported by JST PRESTO (Grant Number JPMJPR2134).</title>
      </sec>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <given-names>E.</given-names>
            <surname>Sawada</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Ida</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Awaji</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Morishita</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Aruga</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R.</given-names>
            <surname>Takeichi</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Fujii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Kimura</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Nakamura</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Furukawa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>N.</given-names>
            <surname>Shimizu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Tokiwa</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Nii</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Sugimoto</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Inami</surname>
          </string-name>
          ,
          <article-title>Byu-byu-view: a wind communication interface, in: ACM SIGGRAPH 2007 Emerging Technologies</article-title>
          , SIGGRAPH '07,
          <string-name>
            <surname>Association</surname>
          </string-name>
          for Computing Machinery, New York, NY, USA,
          <year>2007</year>
          , p.
          <fpage>1</fpage>
          -
          <lpage>es</lpage>
          . URL: https://doi.org/10.1145/1278280.1278282. doi:
          <volume>10</volume>
          .1145/1278280.1278282.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <given-names>B.-C.</given-names>
            <surname>Ke</surname>
          </string-name>
          ,
          <string-name>
            <surname>M.-H. Li</surname>
            ,
            <given-names>Y.</given-names>
          </string-name>
          <string-name>
            <surname>Chen</surname>
          </string-name>
          , C.-Y. Cheng, C.-
          <string-name>
            <surname>J. Chang</surname>
            ,
            <given-names>Y.-F.</given-names>
          </string-name>
          <string-name>
            <surname>Li</surname>
            ,
            <given-names>S.-Y.</given-names>
          </string-name>
          <string-name>
            <surname>Wang</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          <string-name>
            <surname>Fang</surname>
            ,
            <given-names>M. Y.</given-names>
          </string-name>
          <string-name>
            <surname>Chen</surname>
          </string-name>
          , Turnahead: Designing 3
          <article-title>-dof rotational haptic cues to improve ifrst-person viewing (fpv) experiences</article-title>
          , in
          <source>: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, CHI '23</source>
          ,
          <string-name>
            <surname>Association</surname>
          </string-name>
          for Computing Machinery, New York, NY, USA,
          <year>2023</year>
          . URL: https://doi.org/10.1145/3544548.3581443. [10]
          <string-name>
            <given-names>S.</given-names>
            <surname>Heo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Chung</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Wigdor</surname>
          </string-name>
          ,
          <article-title>Thor's hammer: An ungrounded force feedback device utilizing propeller-induced propulsive force</article-title>
          ,
          <source>in: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, CHI '18</source>
          ,
          <string-name>
            <surname>Association</surname>
          </string-name>
          for Computing Machinery, New York, NY, USA,
          <year>2018</year>
          , p.
          <fpage>1</fpage>
          -
          <lpage>11</lpage>
          . URL: https://doi.org/10.1145/3173574.3174099. doi:
          <volume>10</volume>
          .1145/3173574.3174099. pos,
          <article-title>Vection and visually induced motion sickness: how are they related?, Frontiers in Psychology 6 (</article-title>
          <year>2015</year>
          ). URL: https://www.frontiersin.org/journals/ psychology/articles/10.3389/fpsyg.
          <year>2015</year>
          .
          <volume>00472</volume>
          . doi:
          <volume>10</volume>
          .3389/fpsyg.
          <year>2015</year>
          .
          <volume>00472</volume>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [8]
          <string-name>
            <given-names>C.-Y.</given-names>
            <surname>Tsai</surname>
          </string-name>
          ,
          <string-name>
            <given-names>I.-L.</given-names>
            <surname>Tsai</surname>
          </string-name>
          ,
          <string-name>
            <surname>C.-J. Lai</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          <string-name>
            <surname>Chow</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          <string-name>
            <surname>Wei</surname>
          </string-name>
          , L.-P. Cheng, M. Y. Chen, Airracket:
          <article-title>Perceptual design of ungrounded, directional force feedback to improve virtual racket sports experiences</article-title>
          ,
          <source>in: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, CHI '22</source>
          ,
          <string-name>
            <surname>Association</surname>
          </string-name>
          for Computing Machinery, New York, NY, USA,
          <year>2022</year>
          . URL: https://doi.org/10.1145/3491102.3502034. doi:
          <volume>10</volume>
          .1145/3491102.3502034.
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [9]
          <string-name>
            <given-names>Y.-W.</given-names>
            <surname>Wang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-H.</given-names>
            <surname>Lin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.-S.</given-names>
            <surname>Ku</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Miyatake</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-H.</given-names>
            <surname>Mao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P. Y.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <string-name>
            <surname>C.-M. Tseng</surname>
            ,
            <given-names>M. Y.</given-names>
          </string-name>
          <string-name>
            <surname>Chen</surname>
          </string-name>
          ,
          <article-title>Jetcontroller: High-speed ungrounded 3-dof force feedback controllers using air propulsion jets</article-title>
          ,
          <source>in: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, CHI '21</source>
          ,
          <string-name>
            <surname>Association</surname>
          </string-name>
          for Computing Machinery, New York, NY, USA,
          <year>2021</year>
          . URL: https://doi.org/10.1145/3411764.3445549. doi:
          <volume>10</volume>
          .1145/3411764.3445549. doi:
          <volume>10</volume>
          .1145/3544548.3581443.
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [3]
          <string-name>
            <given-names>S.-H.</given-names>
            <surname>Liu</surname>
          </string-name>
          , P.-C. Yen,
          <string-name>
            <given-names>Y.-H.</given-names>
            <surname>Mao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-H.</given-names>
            <surname>Lin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Chandra</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. Y.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <article-title>Headblaster: a wearable approach to simulating motion perception using headmounted air propulsion jets</article-title>
          ,
          <source>ACM Trans. Graph</source>
          .
          <volume>39</volume>
          (
          <year>2020</year>
          ). URL: https://doi.org/10.1145/3386569. 3392482. doi:
          <volume>10</volume>
          .1145/3386569.3392482.
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [4]
          <string-name>
<given-names>K.</given-names>
            <surname>Watanabe</surname>
          </string-name>
          ,
          <string-name>
            <given-names>F.</given-names>
            <surname>Nakamura</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Sakurada</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Teo</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Sugimoto</surname>
          </string-name>
          ,
          <article-title>An Integrated Ducted Fan-Based Multi-Directional Force Feedback with a Head Mounted Display</article-title>
          , in: H. Uchiyama, J.-M. Normand (Eds.),
          <source>ICAT-EGVE 2022 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments</source>
          , The Eurographics Association,
          <year>2022</year>
          . doi:10.2312/egve.20221276.
          [11]
          <string-name>
            <given-names>S.</given-names>
            <surname>Je</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. J.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>W.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>X.-D.</given-names>
            <surname>Yang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Lopes</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Bianchi</surname>
          </string-name>
          ,
          <article-title>Aero-plane: A handheld force-feedback device that renders weight motion illusion on a virtual 2D plane</article-title>
          , in:
          <source>Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, UIST '19</source>
          , Association for Computing Machinery, New York, NY, USA,
          <year>2019</year>
          , p.
          <fpage>763</fpage>
          -
          <lpage>775</lpage>
          . URL: https://doi.org/10.1145/3332165.3347926. doi:10.1145/3332165.3347926.
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
[5]
          <string-name>
            <given-names>N.-H.</given-names>
            <surname>Yu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.-Y.</given-names>
            <surname>Ma</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.-M.</given-names>
            <surname>Lin</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.-A.</given-names>
            <surname>Fan</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. E.</given-names>
            <surname>Taglialatela</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.-Y.</given-names>
            <surname>Huang</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Yu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-T.</given-names>
            <surname>Cheng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.-C.</given-names>
            <surname>Liao</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. Y.</given-names>
            <surname>Chen</surname>
          </string-name>
          ,
          <article-title>DrivingVibe: Enhancing VR driving experience using inertia-based vibrotactile feedback around the head</article-title>
          ,
          <source>Proc. ACM Hum.-Comput. Interact.</source>
          <volume>7</volume>
          (
          <year>2023</year>
          ). URL: https://doi.org/10.1145/3604253. doi:10.1145/3604253.
          [12]
          <string-name>
            <given-names>T.</given-names>
            <surname>Sasaki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>R. S.</given-names>
            <surname>Hartanto</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.-H.</given-names>
            <surname>Liu</surname>
          </string-name>
          ,
          <string-name>
            <given-names>K.</given-names>
            <surname>Tsuchiya</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Hiyama</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M.</given-names>
            <surname>Inami</surname>
          </string-name>
          ,
          <article-title>Leviopole: Mid-air haptic interactions using multirotor</article-title>
          , in:
          <source>ACM SIGGRAPH 2018 Emerging Technologies, SIGGRAPH '18</source>
          , Association for Computing Machinery, New York, NY, USA,
          <year>2018</year>
          . URL: https://doi.org/10.1145/3214907.3214913. doi:10.1145/3214907.3214913.
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
[6]
          <string-name>
            <given-names>N.</given-names>
            <surname>Dużmańska</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Strojny</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Strojny</surname>
          </string-name>
          ,
          <article-title>Can simulator sickness be avoided? A review on temporal aspects of simulator sickness</article-title>
          ,
          <source>Frontiers in Psychology</source>
          <volume>9</volume>
          (
          <year>2018</year>
          ). URL: https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2018.02132. doi:10.3389/fpsyg.2018.02132.
          [13]
          <string-name>
            <given-names>H.</given-names>
            <surname>Gurocak</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Jayaram</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Parrish</surname>
          </string-name>
          ,
          <string-name>
            <given-names>U.</given-names>
            <surname>Jayaram</surname>
          </string-name>
          ,
          <article-title>Weight Sensation in Virtual Environments Using a Haptic Device With Air Jets</article-title>
          ,
          <source>Journal of Computing and Information Science in Engineering</source>
          <volume>3</volume>
          (
          <year>2003</year>
          )
          <fpage>130</fpage>
          -
          <lpage>135</lpage>
          . URL: https://doi.org/10.1115/1.1576808. doi:10.1115/1.1576808.
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
[7]
          <string-name>
            <given-names>B.</given-names>
            <surname>Keshavarz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B. E.</given-names>
            <surname>Riecke</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L. J.</given-names>
            <surname>Hettinger</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J. L.</given-names>
            <surname>Campos</surname>
          </string-name>
          ,
          <article-title>Vection and visually induced motion sickness: How are they related?</article-title>
          ,
          <source>Frontiers in Psychology</source>
          <volume>6</volume>
          (
          <year>2015</year>
          ).
          [14]
          <string-name>
            <given-names>S.</given-names>
            <surname>Je</surname>
          </string-name>
          ,
          <string-name>
            <given-names>H.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>M. J.</given-names>
            <surname>Kim</surname>
          </string-name>
          ,
          <string-name>
            <given-names>A.</given-names>
            <surname>Bianchi</surname>
          </string-name>
          ,
          <article-title>Wind-blaster: A wearable propeller-based prototype that provides ungrounded force-feedback</article-title>
          , in:
          <source>ACM SIGGRAPH 2018 Emerging Technologies, SIGGRAPH '18</source>
          , Association for Computing Machinery, New York, NY, USA,
          <year>2018</year>
          . URL: https://doi.org/10.1145/3214907.3214915. doi:10.1145/3214907.3214915.
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
[15]
          <string-name>
            <given-names>K.</given-names>
            <surname>Ito</surname>
          </string-name>
          ,
          <string-name>
            <given-names>Y.</given-names>
            <surname>Ban</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Warisawa</surname>
          </string-name>
          ,
          <article-title>AlteredWind: Manipulating perceived direction of the wind by cross-modal presentation of visual, audio and wind stimuli</article-title>
          , in:
          <source>SIGGRAPH Asia 2019 Emerging Technologies, SA '19</source>
          , Association for Computing Machinery, New York, NY, USA,
          <year>2019</year>
          , p.
          <fpage>3</fpage>
          -
          <lpage>4</lpage>
          . URL: https://doi.org/10.1145/3355049.3360525. doi:10.1145/3355049.3360525.
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
[16]
          <string-name>
            <given-names>A.</given-names>
            <surname>Wilberz</surname>
          </string-name>
          ,
          <string-name>
            <given-names>D.</given-names>
            <surname>Leschtschow</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Trepkowski</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Maiero</surname>
          </string-name>
          ,
          <string-name>
            <given-names>E.</given-names>
            <surname>Kruijff</surname>
          </string-name>
          ,
          <string-name>
            <given-names>B.</given-names>
            <surname>Riecke</surname>
          </string-name>
          ,
          <article-title>FaceHaptics: Robot arm based versatile facial haptics for immersive environments</article-title>
          , in:
          <source>Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, CHI '20</source>
          , Association for Computing Machinery, New York, NY, USA,
          <year>2020</year>
          , p.
          <fpage>1</fpage>
          -
          <lpage>14</lpage>
          . URL: https://doi.org/10.1145/3313831.3376481. doi:10.1145/3313831.3376481.
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
[17]
          <string-name>
            <given-names>J.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <string-name>
            <given-names>G.</given-names>
            <surname>Lee</surname>
          </string-name>
          ,
          <article-title>Designing a non-contact wearable tactile display using airflows</article-title>
          , in:
          <source>Proceedings of the 29th Annual Symposium on User Interface Software and Technology, UIST '16</source>
          , Association for Computing Machinery, New York, NY, USA,
          <year>2016</year>
          , p.
          <fpage>183</fpage>
          -
          <lpage>194</lpage>
          . URL: https://doi.org/10.1145/2984511.2984583. doi:10.1145/2984511.2984583.
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
[18]
          <string-name>
            <given-names>V.</given-names>
            <surname>Shen</surname>
          </string-name>
          ,
          <string-name>
            <given-names>T.</given-names>
            <surname>Rae-Grant</surname>
          </string-name>
          ,
          <string-name>
            <given-names>J.</given-names>
            <surname>Mullenbach</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Harrison</surname>
          </string-name>
          ,
          <string-name>
            <given-names>C.</given-names>
            <surname>Shultz</surname>
          </string-name>
          ,
          <article-title>Fluid Reality: High-resolution, untethered haptic gloves using electroosmotic pump arrays</article-title>
          , in:
          <source>Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, UIST '23</source>
          , Association for Computing Machinery, New York, NY, USA,
          <year>2023</year>
          . URL: https://doi.org/10.1145/3586183.3606771. doi:10.1145/3586183.3606771.
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
[19]
          <string-name>
            <given-names>P.</given-names>
            <surname>Lopes</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>You</surname>
          </string-name>
          ,
          <string-name>
            <given-names>L.-P.</given-names>
            <surname>Cheng</surname>
          </string-name>
          ,
          <string-name>
            <given-names>S.</given-names>
            <surname>Marwecki</surname>
          </string-name>
          ,
          <string-name>
            <given-names>P.</given-names>
            <surname>Baudisch</surname>
          </string-name>
          ,
          <article-title>Providing haptics to walls &amp; heavy objects in virtual reality by means of electrical muscle stimulation</article-title>
          , in:
          <source>Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, CHI '17</source>
          , Association for Computing Machinery, New York, NY, USA,
          <year>2017</year>
          , p.
          <fpage>1471</fpage>
          -
          <lpage>1482</lpage>
          . URL: https://doi.org/10.1145/3025453.3025600. doi:10.1145/3025453.3025600.
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>