<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Archiving and Interchange DTD v1.0 20120330//EN" "JATS-archivearticle1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink">
  <front>
    <journal-meta>
      <journal-title-group>
        <journal-title>Nov</journal-title>
      </journal-title-group>
    </journal-meta>
    <article-meta>
      <title-group>
        <article-title>Vibrotactile Exploration of Indoor Objects</article-title>
      </title-group>
      <contrib-group>
        <contrib contrib-type="author">
          <string-name>Alexander Fickel</string-name>
          <email>alexander.fickel@tu-dresden.de</email>
        </contrib>
      </contrib-group>
      <pub-date>
        <year>2014</year>
      </pub-date>
      <volume>16</volume>
      <issue>2014</issue>
      <abstract>
        <p>This paper describes our study about how to make use of a tactile belt with 8 vibrators to present rich spatial information for blind and visually impaired people while exploring indoor environments. In order to conduct a Wizard-of-Oz evaluation with 12 sighted subjects, a tablet based application has been developed for setting indoor scenarios and exploration interaction. Two sets of tactile patterns for presentation of distance, direction and type of objects, have been evaluated based on one single actuator and multiple actuators, respectively. The results indicated that the subjects can acquire the rich object information through both of the two sets, and they were more sensitive on the vibration position than the vibration intensity.</p>
      </abstract>
      <kwd-group>
        <kwd>Tactile Belt</kwd>
        <kwd>Tacton</kwd>
        <kwd>Indoor Exploration Tasks</kwd>
      </kwd-group>
    </article-meta>
  </front>
  <body>
    <sec id="sec-1">
      <title>Introduction</title>
      <p>
        Tactile pin-matrix displays offer novel experiences for blind
and visually impaired people to explore maps [
        <xref ref-type="bibr" rid="ref1">1</xref>
        ] and
surrounding obstacles [
        <xref ref-type="bibr" rid="ref2">2</xref>
        ]. However, due to their high cost, a
large display of this kind is not affordable for most blind and
visually impaired people. To help blind and visually impaired
people acquire simple spatial information, such as navigation
instructions, many low-cost vibrotactile displays have already been
considered in several studies [
        <xref ref-type="bibr" rid="ref3 ref4 ref5 ref6">3, 4, 5, 6</xref>
        ]. People are
able to perceive changes in tactile stimulation and identify their
location. As an essential advantage for visually impaired
people, tactile displays can present direction information
quickly and intuitively, without disturbing visual or auditory
perception. However, there is little previous work on
presenting more complex spatial information through tactile
vibrators, such as the various types of surrounding objects. Heuten et al.
concluded that a tactile display with vibrators worn as a belt is
well suited to convey direction information.
      </p>
    </sec>
    <sec id="sec-2">
      <title>Related Work</title>
      <p>
        There have been many navigation aids with tactile feedback. In
recent years, various approaches have been implemented and
discussed. For example, a number of previous systems make use
of a smartphone to offer tactile feedback, like PocketNavigator [
        <xref ref-type="bibr" rid="ref7 ref8">7,
8</xref>
        ], NaviRadar [
        <xref ref-type="bibr" rid="ref9">9</xref>
        ], Lund Time Machine [
        <xref ref-type="bibr" rid="ref10">10</xref>
        ] and NonVisNavi
[
        <xref ref-type="bibr" rid="ref11">11</xref>
        ].
      </p>
      <p>
        In order to provide richer tactile information, some systems
employed multiple vibrators, such as a tactile vest and a tactile
belt. The researchers in [
        <xref ref-type="bibr" rid="ref3 ref4">3, 4</xref>
        ] equipped a belt with 8 actuators,
and the system proposed in [
        <xref ref-type="bibr" rid="ref5">5</xref>
        ] not only uses six actuators, but also
activates two of them at the same time with different vibration intensities to
interpolate a direction between them. Most of the related work
focuses on presenting navigation instructions, such as turning left or
right at specific angles; however, there are few studies on
presenting rich information about surrounding objects, apart from the
study in [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ] on the recognition of landmarks. Little previous
work has made use of a touch-screen device and tactile
vibrators to present spatial information about surrounding objects.
      </p>
      <p>To display complex information by vibrators, the concept of
Tactons has been introduced by Brewster et al.:</p>
      <p>
        “Tactons are structured, abstract messages that can be used to
communicate complex concepts to users non-visually.” [
        <xref ref-type="bibr" rid="ref12">12</xref>
        ]
      </p>
      <p>
        Several design principles were presented in their work,
in particular the concept of "Transformational Tactons", which inspired
this work. There, each property is assigned to its own parameter.
As an example of this design principle, a three-parameter Tacton has been
used to notify users about upcoming appointment information
[
        <xref ref-type="bibr" rid="ref13">13</xref>
        ].
      </p>
      <p>
        The presentation of highly complex properties is a challenge in
the design of Tactons: a parameter with many distinguishable
levels must be found, and not every parameter is suitable for
this task. The researchers in [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ] investigated the distinguishability of
rhythms. As a result of their research, they presented a set of
21 rhythms that could be distinguished by the fingers. In addition, the
tests with a tactile belt in [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ] indicated that seven different rhythms
were suitable for recognition tasks with a vibrotactile waist belt.
      </p>
    </sec>
    <sec id="sec-3">
      <title>A Multi-vibrator Tactile Belt</title>
      <p>In this work, a tactile waist belt from the company Elitac is used
(see Figure 1). The detailed technical specifications can be found
on their website1. The belt can be controlled by a host device (e.g.
a smartphone or a tablet) via Bluetooth. The movements of the
vibrators are driven by corresponding haptic patterns that
are defined in advance in an XML-based file and saved on the
host device. This haptic pattern format allows configuring the
start time, intensity, duration and location of a vibration. For
example, Figure 2 illustrates how to make one vibrator vibrate
twice, at 0 ms and 500 ms respectively.</p>
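      <p>For illustration, the pattern format can be processed with a few
lines of code. The following Python sketch parses the Figure 2 pattern
into plain action tuples; only the element names visible in Figure 2
(ArrayOfAction, Action, Time, Address, Intensity, Duration) are taken
from the belt's format, the parsing function itself is a hypothetical
helper:</p>
      <preformat><![CDATA[
```python
import xml.etree.ElementTree as ET

# The Figure 2 pattern: one actuator vibrating twice, at 0 ms and 500 ms.
PATTERN = """
<ArrayOfAction>
  <Action><Time>0</Time><Address>1</Address>
          <Intensity>11</Intensity><Duration>250</Duration></Action>
  <Action><Time>500</Time><Address>1</Address>
          <Intensity>11</Intensity><Duration>250</Duration></Action>
</ArrayOfAction>
"""

def parse_pattern(xml_text):
    """Return a list of (time_ms, actuator, intensity, duration_ms) tuples."""
    root = ET.fromstring(xml_text)
    return [tuple(int(action.find(tag).text)
                  for tag in ("Time", "Address", "Intensity", "Duration"))
            for action in root.findall("Action")]

parse_pattern(PATTERN)
# -> [(0, 1, 11, 250), (500, 1, 11, 250)]
```
]]></preformat>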
      <p>
        The tactile waist belt is equipped with eight actuators, and
each covers a range of 45° of the body of the user (see Figure 1).
This configuration already has been used successfully in several
prototypes and is valid for managing navigation [
        <xref ref-type="bibr" rid="ref3 ref4">3, 4</xref>
        ] and
exploration tasks [
        <xref ref-type="bibr" rid="ref6">6</xref>
        ].
      </p>
      <p>The tactile belt offers developers various parameters, with
whose help information can be transmitted to the wearer of the belt.
The following parameters are available:</p>
      <p>
        Position The 8 actuators are attached around the body of the
subject in a horizontal plane. Each actuator occupies a fixed
position, which can be linked to specific information. From the
perspective of tactile perception, 36 actuators on the belt are possible,
corresponding to a resolution of 10° in the horizontal plane [
        <xref ref-type="bibr" rid="ref4">4</xref>
        ].
But the aim should always be to use as few actuators as necessary
[
        <xref ref-type="bibr" rid="ref5">5</xref>
        ], in order not to burden users with unnecessary
information.
      </p>
      <preformat>&lt;ArrayOfAction&gt;
  &lt;Action&gt; &lt;!-- first vibration --&gt;
    &lt;Time&gt;0&lt;/Time&gt; &lt;!-- starting time --&gt;
    &lt;Address&gt;1&lt;/Address&gt; &lt;!-- actuator number --&gt;
    &lt;Intensity&gt;11&lt;/Intensity&gt; &lt;!-- vibration intensity (0-15) --&gt;
    &lt;Duration&gt;250&lt;/Duration&gt; &lt;!-- vibration duration in ms --&gt;
  &lt;/Action&gt;
  &lt;Action&gt; &lt;!-- second vibration --&gt;
    &lt;Time&gt;500&lt;/Time&gt;
    &lt;Address&gt;1&lt;/Address&gt;
    &lt;Intensity&gt;11&lt;/Intensity&gt;
    &lt;Duration&gt;250&lt;/Duration&gt;
  &lt;/Action&gt;
&lt;/ArrayOfAction&gt;</preformat>
      <p>
        Rhythm Rhythms are generated by grouping vibrations with
different durations and different intervals [
        <xref ref-type="bibr" rid="ref13">13</xref>
        ]. To create or
manipulate a rhythm it is necessary to change the length or the
number of the vibration pulses or the breaks between them [
        <xref ref-type="bibr" rid="ref14">14</xref>
        ]. From
the perspective of perception, it is possible to design many
different rhythms. However, it should be noted that users make more
mistakes when they have to distinguish among a larger number of
rhythms. Moreover, it is assumed that the
frustration level increases significantly with a large number of different
rhythms, since users face a long learning curve when learning
the rhythms.
      </p>
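      <p>As a sketch of how such rhythms map onto the pattern format above,
a rhythm given as (pulse, pause) pairs can be converted into timed belt
actions. The function below is a hypothetical helper, and the actuator
number and intensity are arbitrary defaults:</p>
      <preformat><![CDATA[
```python
def rhythm_to_actions(rhythm, actuator=1, intensity=11, start_ms=0):
    """Convert a rhythm, given as (pulse_ms, pause_ms) pairs, into
    (time_ms, actuator, intensity, duration_ms) belt actions."""
    actions, t = [], start_ms
    for pulse_ms, pause_ms in rhythm:
        actions.append((t, actuator, intensity, pulse_ms))
        t += pulse_ms + pause_ms  # next pulse starts after pulse + break
    return actions

# Two 250 ms pulses separated by a 250 ms break (the Figure 2 pattern):
rhythm_to_actions([(250, 250), (250, 0)])
# -> [(0, 1, 11, 250), (500, 1, 11, 250)]
```
]]></preformat>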
      <p>
        Intensity By changing the amplitude of the sine wave
different vibration levels can be generated. In [
        <xref ref-type="bibr" rid="ref16">16</xref>
        ] it is reported that
intensities around 26 dB are distinguished less well than ones with
lower intensity values. Normally, intensity values over 55 dB
can cause pain to users [
        <xref ref-type="bibr" rid="ref17">17</xref>
        ]. To ensure the distinguishability of
different intensities, not more than 4 stages should be used [
        <xref ref-type="bibr" rid="ref7">7</xref>
        ].
      </p>
    </sec>
    <sec id="sec-4">
      <title>Designing Tactons for Exploration Tasks</title>
      <p>An indoor assistance system with a room exploration functionality
should allow visually impaired users to acquire rich spatial
information about their surroundings. In this study, three attributes
of surrounding objects are presented:


</p>
      <sec id="sec-4-1">
        <title>Type: Which object is it?</title>
        <p>Direction: In which direction is the object?</p>
        <p>Distance: How far away is the object located?</p>
        <p>
          To convey such complex spatial information through haptic
feedback, the design principles of the "Transformational Tactons"
[
          <xref ref-type="bibr" rid="ref12">12</xref>
          ] have been employed adaptively. In this study each of the
three object properties is encoded with its own parameter.
In the following, the mapping strategy between the object
properties and the Tacton parameters is discussed separately.
        </p>
        <sec id="sec-4-1-1">
          <title>4.1 Direction</title>
          <p>
            Choosing the Tacton parameter for coding the direction is not difficult. The
tactile belt is equipped with eight evenly distributed actuators,
which users quite intuitively associate with a direction.
This method has been tested successfully in several studies. Thus,
the eight vibrator positions can simply indicate eight different directions.
The number of direction levels could theoretically be
increased to 36 [
            <xref ref-type="bibr" rid="ref6">6</xref>
            ] by adding additional actuators.
          </p>
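      <p>The mapping from a direction to one of the eight evenly spaced
actuators can be sketched as follows (assuming actuator 1 points
straight ahead and numbering proceeds clockwise; the function name is
hypothetical):</p>
      <preformat><![CDATA[
```python
def direction_to_actuator(bearing_deg, n_actuators=8):
    """Map a relative bearing (0 deg = straight ahead, clockwise) to the
    nearest of n evenly spaced actuators; actuator 1 points forward."""
    sector = 360.0 / n_actuators                  # 45 deg for 8 actuators
    index = round((bearing_deg % 360.0) / sector) % n_actuators
    return index + 1                              # actuators numbered 1..n

direction_to_actuator(0)     # -> 1 (front)
direction_to_actuator(90)    # -> 3 (right)
direction_to_actuator(100)   # -> 3 (nearest sector)
```
]]></preformat>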
        </sec>
        <sec id="sec-4-1-2">
          <title>4.2 Distance</title>
          <p>In previous studies, the frequency and the intensity were used for
presenting distance information. At present, our belt does not
provide the ability to change the frequency of a single vibration.
Therefore, we made use of the intensity to encode distance
information. Theoretically, a common vibrator can present 15 different
intensities. However, preliminary tests and prior research
have shown that differences in intensity are perceived very
poorly. Therefore, in our study the distance is encoded with only
two levels: “near” (all objects with a distance of less than two
meters) and “far” (all objects with a distance of two meters or
more).</p>
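      <p>This two-level coding reduces to a simple threshold. In the sketch
below the concrete intensity values are assumptions for illustration;
only the 0-15 intensity range and the two-meter threshold come from the
text:</p>
      <preformat><![CDATA[
```python
NEAR_INTENSITY = 11  # assumed level; the belt supports 0-15
FAR_INTENSITY = 5    # assumed weaker level for far objects

def distance_to_intensity(distance_m, threshold_m=2.0):
    """Two-level distance coding: 'near' below 2 m, 'far' otherwise."""
    return NEAR_INTENSITY if distance_m < threshold_m else FAR_INTENSITY

distance_to_intensity(1.0)  # -> 11 (near)
distance_to_intensity(3.5)  # -> 5 (far)
```
]]></preformat>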
        </sec>
        <sec id="sec-4-1-3">
          <title>4.3 Object type</title>
          <p>In general there are various objects in rooms and buildings, but
the focus of this work was on the four important ones for visually
impaired people: walls, stairs, doors and obstacles. Thus, the
parameter should cover at least four distinguishable stages for the
four target types. In order to convey information about object types
to users, two different haptic-feedback-based methods have been
employed: a single-actuator method and a multi-actuator method.</p>
        </sec>
        <sec id="sec-4-1-4">
          <title>4.3.1 Single-Actuator-Method</title>
          <p>In the single-actuator method the tactile patterns are encoded via
different rhythms on a single actuator. Four distinguishable
rhythms were selected for this work, as shown in Figure 3. The
grey areas represent vibration pulses, and the white fields show
the vibration breaks. A rhythm lasts one second, and for better
recognition it is always played twice. As a basic metric, the 1/16
pulse is the shortest one, with a length of 62.5 ms, and each
pulse should be followed by a pause of at least 125 ms, except the
1/16 pulse, which is followed by a pause of at least 62.5 ms. It is
also possible to add additional object types in the future.</p>
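      <p>The timing rules above can be expressed as a validity check over a
rhythm written as (pulse, pause) pairs. The sketch assumes that the
trailing pause of the last pulse counts toward the one-second length:</p>
      <preformat><![CDATA[
```python
BASE_MS = 62.5  # length of the shortest (1/16) pulse

def valid_rhythm(rhythm):
    """Check a rhythm, given as (pulse_ms, pause_ms) pairs: every pulse
    is a multiple of the 62.5 ms base, a 1/16 pulse is followed by at
    least 62.5 ms of pause, any longer pulse by at least 125 ms, and
    the whole rhythm lasts exactly one second."""
    total = 0.0
    for pulse_ms, pause_ms in rhythm:
        if pulse_ms % BASE_MS != 0:
            return False
        min_pause = BASE_MS if pulse_ms == BASE_MS else 2 * BASE_MS
        if pause_ms < min_pause:
            return False
        total += pulse_ms + pause_ms
    return total == 1000.0

valid_rhythm([(125, 125)] * 4)    # -> True  (four 1/8 pulses)
valid_rhythm([(100, 150)] * 4)    # -> False (pulses not on the grid)
```
]]></preformat>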
          <p>
            With the single-actuator method the characterizing rhythm is
played on the pioneering actuator with the corresponding
intensity. Therefore, in contrast to [
            <xref ref-type="bibr" rid="ref6">6</xref>
            ], all parameters are displayed
simultaneously. This approach reduces the run time of the Tactons
significantly and thus speeds up the transmission of the
information.
          </p>
          <p>
            Within the multi-actuator method, actuators were always activated
in pairs to indicate the various types of objects. In the evaluation of
[
            <xref ref-type="bibr" rid="ref6">6</xref>
            ], this method achieved an even better recognition rate than the
single-actuator method when the direction was displayed at the
same time. From the seven tested actuator pairs, the pairs 2-4, 2-8,
4-6 and 6-8 were selected for this work (see Figure 4), because they
can be associated with directions (front, back, right, left) and thus
may be easier to learn than the other ones. Within the vibrator
pairs, the vibrator with the smaller serial number is the “pioneering”
actuator. On both actuators the same rhythm was played: 12
repetitions of a 50 ms pulse, each followed by a 50 ms vibration break. The
Tacton had to be played in two successive phases, because the
position parameter was assigned twice (for the direction and the
type). The first phase showed users the direction; this pulse
always had the same (median) intensity. After a gap of two seconds,
a weaker or stronger pulse (compared to the direction pulse) was
played on a pair of actuators, depending on the distance.
          </p>
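      <p>The two-phase structure of such a Tacton can be sketched as a
schedule builder. The pair numbers and the 12 x 50 ms rhythm follow
the description above; the pair-to-type assignment, the phase-one pulse
length and the concrete intensity values are assumptions for
illustration:</p>
      <preformat><![CDATA[
```python
TYPE_PAIRS = {"wall": (2, 4), "stairs": (2, 8),    # assumed assignment of
              "door": (4, 6), "obstacle": (6, 8)}  # pairs to object types

def multi_actuator_tacton(direction_actuator, obj_type, near):
    """Phase 1: direction pulse at median intensity; after a 2 s gap,
    phase 2 plays 12 x 50 ms pulses (50 ms breaks) on the type pair,
    weaker or stronger depending on the distance."""
    actions = [(0, direction_actuator, 8, 500)]  # assumed 500 ms, intensity 8
    intensity = 12 if near else 4                # assumed distance coding
    t = 500 + 2000                               # 2 s gap after phase 1
    for _ in range(12):
        for act in TYPE_PAIRS[obj_type]:
            actions.append((t, act, intensity, 50))
        t += 100                                 # 50 ms pulse + 50 ms break
    return actions
```
]]></preformat>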
          <p>
            For a belt with eight actuators there are 12 pairs with a
recognition rate of at least 80% [
            <xref ref-type="bibr" rid="ref6">6</xref>
            ]. Therefore, this method is well
suited for encoding further object types in the future.
          </p>
          <p>
            To evaluate the tactile patterns, a simulation application for
Android tablets, called “WizzApp”, has been developed (see Figure
5). The app was implemented on a Motorola Xoom tablet with a
10.1-inch display running Android. In addition to setting the
room size, it also allows setting the layout of the objects. The app is
able to provide virtual spatial information to subjects while they
walk and explore the environment. In this study, we only used
the Exploration Mode to test the proposed tactile patterns, and the
position and the orientation of the subject were simulated by
continuous touch input from a “Wizard”.
          </p>
          <p>To allow a Wizard to simulate subjects’ movements (i.e., position
and heading direction) conveniently and precisely, we
developed a special touch interaction method (see Figure 6). In the app,
it was easy to configure the test room with different types of
objects, and the position and the heading direction of the subject
were represented by a red point and a black line, respectively.
When the red point is touched, an outer orientation ring is
rendered. The Wizard can change the orientation by moving his
finger around the user circle, but within the orientation ring. The
Wizard can also simulate subjects’ walking paths by simply moving the
finger, as in the Walking Mode. With a double-tap on any
object, the app sends the corresponding tactile pattern to the
tactile belt, automatically calculating the direction and the distance of this
object relative to the user.</p>
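      <p>The automatic calculation on a double-tap amounts to basic plane
geometry: the distance and the bearing of the object relative to the
user's heading. A minimal sketch (the coordinate convention and the
function name are assumptions):</p>
      <preformat><![CDATA[
```python
import math

def object_relative_to_user(user_xy, heading_deg, obj_xy):
    """Distance (m) and relative bearing (deg, clockwise from the
    user's heading) of an object, with 0 deg = north (+y)."""
    dx = obj_xy[0] - user_xy[0]
    dy = obj_xy[1] - user_xy[1]
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dx, dy))  # absolute bearing
    relative = (bearing - heading_deg) % 360.0
    return distance, relative

# user at the origin facing north, object 2 m to the east:
object_relative_to_user((0.0, 0.0), 0.0, (2.0, 0.0))
# -> (2.0, 90.0)
```
]]></preformat>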
          <p>Two rooms were simulated and each one had 12 or 13 objects
from the four types (see Figure 7). The position and orientation of
the subjects were the same and fixed in each test. Additionally,
the order of displaying the objects was fixed. Thus, all subjects
had the same conditions.</p>
        </sec>
        <sec id="sec-4-1-5">
          <title>5.2 Procedure</title>
          <p>The test was performed with six male and six female individuals.
Two subjects were 58 years old, the others were around 23 to 30
years old. Every subject tested the two sets of Tactons. The two
rooms were tested in the same order for all subjects, but the order
of the two sets of Tactons differed. Regarding the test
order, six subjects tested the single-actuator method first, and
the other six tested the multi-actuator method first.</p>
          <p>Each subject at first received a brief introduction to their tasks
and the Tactons. Subsequently, a training phase followed to
learn the Tactons in a training setting. The real test did not start
until the subjects stated they had learnt them all. In the real tests all
objects were displayed one by one, triggered in the WizzApp by a
Wizard. Note that the subjects were not allowed to watch the screen
at any time during the test. The subjects were asked to say aloud
the object type, the direction (i.e., one of eight cardinal
directions) and the distance as soon as they recognized them. The
accuracy of these statements and the recognition time were noted.
The subjects had to evaluate the two methods in the two test
rooms accordingly, and the test arrangement was counterbalanced.
In the end, the subjects had to fill in a questionnaire:

</p>
        </sec>
      </sec>
      <sec id="sec-4-2">
        <title>Which method was easier to learn? Why? If you have to choose a method, which one would you choose? Why?</title>
        <sec id="sec-4-2-1">
          <title>5.3 Results</title>
          <p>The comparison of the results between the single-actuator method
and the multi-actuator method is shown in Table 1. The two
methods achieved similar results for encoding the direction
and the distance. The object type was detected better with the
multi-actuator method, which also yielded a better recognition
performance for the whole Tacton. For both methods the direction
and the object type had a very high recognition accuracy (more
than 94%), while the recognition accuracy of the distance was lower
than that of the object type. However, the overall recognition
accuracy of the Tactons dropped to 78% for the
single-actuator method and 81% for the multi-actuator method, respectively.
Besides, the subjects spent on average 437 ms and 634 ms, respectively, to
recognize one Tacton.</p>
          <p>A Tacton was counted as recognized successfully when all
three attributes (i.e., type, direction, and distance) of an object
were identified correctly. The two-way MANOVA (at the 95% confidence level)
revealed that only the two methods had a significant multivariate main
effect on the recognition accuracy of object type, direction,
distance, the whole Tacton, and the recognition time, Wilks’ λ = 0.369, F(5,
16) = 5.47, p = 0.04. Furthermore, the univariate ANOVA tests (at
the 95% confidence level) only found that the two methods had a main
effect on the recognition time (F(1, 20) = 24.804, p &lt; 0.001), and that the
interaction between the two methods and the test rooms had a main
effect on the recognition accuracy of a whole Tacton (F(1, 20) =
4.634, p = 0.044). The paired t-test (at the 95% confidence level)
found that the mean accuracy of distance recognition differed significantly
from both the mean accuracy of direction recognition (t =
4.252, p &lt; 0.001) and the mean accuracy of type recognition (t =
4.684, p &lt; 0.001).</p>
          <p>The analysis of the questionnaires showed that, on the one hand,
the multi-actuator method was easier to learn for 7 of the 12 subjects,
because they felt it was “less complex”, “easier perceptive”, had a “shorter
learning time”, or offered “good spatial imagination”; on the other hand,
7 subjects would still choose the
single-actuator method, because they thought its Tactons were “shorter”, “more
intuitive”, “more memorable” and “more practicable”.</p>
        </sec>
        <sec id="sec-4-2-2">
          <title>5.4 Discussion</title>
          <p>The evaluation showed that the subjects were more sensitive to
the position of vibrations (e.g. for the presentation of direction and
object type) than to the intensity feature (e.g. for the presentation of
distance). The intensity of a vibration seems hard to distinguish
because users wear clothes with different textures or fasten the
belt with different tightness. To convey distance
information to users, a set of extra vibrators could be placed vertically on
another body part, such as the arm, with closer vibrations indicating
closer objects. Additionally, it might also be possible to change
the vibration frequency to indicate distance information.</p>
          <p>In this study, since only four different object types were involved,
the subjects spent roughly equal time learning the two sets of tactile patterns.
When more object types are involved, we expect users will have to
spend more time learning the single-actuator method than the
multi-actuator method; further tests are needed to confirm this.</p>
          <p>Due to the lack of tactile feedback, it is hard for blind and visually
impaired people to access spatial information (e.g., maps, the layout of
the surroundings) through common touchscreen devices. On the one
hand, many new displays, such as the pin-matrix display and the
BlindPad3 system currently under development, have been used to improve
the accessibility of spatial information. On the other hand, we
think vibrotactile feedback might be a low-cost way to
reach this goal in combination with touchscreen displays. In particular,
vibrotactile feedback can also be used while walking.
Although in this evaluation we let a Wizard touch the display to
trigger and simulate an exploration task of indoor objects, we
think it will be possible for blind and visually impaired people to
explore by themselves after a few improvements to the WizzApp,
such as adding semantic sounds. Moreover, there should be no
large delay of the vibrations when exploring on touchscreen
displays; otherwise, users might misunderstand the
position and direction information of objects.</p>
        </sec>
      </sec>
    </sec>
    <sec id="sec-5">
      <title>Conclusion</title>
      <p>For blind and visually impaired people, it is challenging to acquire
spatial information about the surrounding environment, such as how many
objects are in front of them, as well as their properties. In this paper
we presented how to make use of a tactile belt with 8 vibrators to
convey spatial information about indoor objects (i.e., walls, stairs,
doors and general obstacles) through pre-designed tactile patterns. In
addition to rendering distance and direction information, we
studied two different methods to inform users about object type
information, via a single actuator and via multiple actuators, respectively.</p>
      <p>Through a pilot Wizard-of-Oz evaluation with 12 sighted people, we
found that the subjects were able to acquire the spatial information of
surrounding objects with both methods. The subjects were
more sensitive to the vibration position than to the vibration
intensity. Additionally, a tablet-based application has been implemented
to support the evaluation, for setting the room layout and
exploring objects one by one.</p>
      <p>In the future, in addition to evaluating the tactile patterns with
blind and visually impaired individuals, it will be important to have
the exploration task performed by blind and visually impaired people
directly, rather than having the exploration behaviours simulated by a
sighted Wizard.</p>
    </sec>
    <sec id="sec-6">
      <title>Acknowledgements</title>
      <p>The authors are grateful to all of the subjects who took part in
the evaluation. This study was supported by the Range-IT4 project
within the framework of the EU FP7 SME Program (Grant no.
605998).</p>
      <p>3 BlindPad project, http://www.blindpad.eu/</p>
      <p>4 Range-IT project, http://www.range-it.eu/</p>
    </sec>
  </body>
  <back>
    <ref-list>
      <ref id="ref1">
        <mixed-citation>
          [1]
          <string-name>
            <surname>Zeng</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Mei</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Weber</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          <year>2014</year>
          .
          <article-title>Interactive audio-haptic map explorer on a tactile display</article-title>
          .
          <source>Interacting with Computers</source>
          , doi: 10.1093/iwc/iwu006.
        </mixed-citation>
      </ref>
      <ref id="ref2">
        <mixed-citation>
          [2]
          <string-name>
            <surname>Zeng</surname>
            ,
            <given-names>L.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Prescher</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Weber</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          <year>2012</year>
          .
          <article-title>Exploration and avoidance of surrounding obstacles for the visually impaired</article-title>
          .
          <source>In Proceedings of ACM ASSETS 2012</source>
          ,
          <fpage>111</fpage>
          -
          <lpage>118</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref3">
        <mixed-citation>
          [3]
          <string-name>
            <surname>Tsukada</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Yasumura</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <year>2004</year>
          .
          <article-title>Activebelt: Belt-type wearable tactile display for directional navigation</article-title>
          .
          <source>In UbiComp 2004: Ubiquitous Computing</source>
          , Springer, Berlin, Heidelberg,
          <fpage>384</fpage>
          -
          <lpage>399</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref4">
        <mixed-citation>
          [4]
          <string-name>
            <surname>Van Erp</surname>
            ,
            <given-names>J. B. F.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Van Veen</surname>
            ,
            <given-names>H. A. H. C.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Jansen</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Dobbins</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          <year>2005</year>
          .
          <article-title>Waypoint navigation with a vibrotactile waist belt</article-title>
          .
          <source>ACM Trans. Appl. Percept.</source>
          <volume>2</volume>
          (
          <issue>2</issue>
          ), ACM, New York, NY, USA,
          <fpage>106</fpage>
          -
          <lpage>117</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref5">
        <mixed-citation>
          [5]
          <string-name>
            <surname>Heuten</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Henze</surname>
            ,
            <given-names>N.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Boll</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Pielot</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          <year>2008</year>
          .
          <article-title>Tactile wayfinder: A non-visual support system for wayfinding</article-title>
          .
          <source>In Proceedings of the 5th Nordic Conference on Human-computer Interaction: Building Bridges. NordiCHI '08. ACM</source>
          , New York, NY,
          <fpage>172</fpage>
          -
          <lpage>181</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref6">
        <mixed-citation>
          [6]
          <string-name>
            <surname>Srikulwong</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          and
          <string-name>
            <surname>O'Neill</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          <year>2011</year>
          .
          <article-title>A comparative study of tactile representation techniques for landmarks on a wearable device</article-title>
          .
          <source>In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI'11. ACM</source>
          , New York, NY,
          <fpage>2029</fpage>
          -
          <lpage>2038</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref7">
        <mixed-citation>
          [7]
          <string-name>
            <surname>Pielot</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Poppinga</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Boll</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <year>2010</year>
          .
          <article-title>Pocketnavigator: Vibrotactile waypoint navigation for everyday mobile devices</article-title>
          .
          <source>In Proceedings of the 12th International Conference on Human Computer Interaction with Mobile Devices and Services</source>
          .
          <source>MobileHCI'10. ACM</source>
          , New York, NY, USA,
          <fpage>423</fpage>
          -
          <lpage>426</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref8">
        <mixed-citation>
          [8]
          <string-name>
            <surname>Pielot</surname>
            ,
            <given-names>M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Poppinga</surname>
            ,
            <given-names>B.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Heuten</surname>
            ,
            <given-names>W.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Boll</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <year>2012</year>
          .
          <article-title>Pocketnavigator: Studying tactile navigation systems in-situ</article-title>
          .
          <source>In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. CHI'12. ACM</source>
          , New York, NY, USA,
          <fpage>3131</fpage>
          -
          <lpage>3140</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref9">
        <mixed-citation>
          [9]
          <string-name>
            <surname>Rümelin</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Rukzio</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Hardy</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          <year>2011</year>
          .
          <article-title>Naviradar: A novel tactile information display for pedestrian navigation</article-title>
          .
          <source>In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology. UIST'11. ACM</source>
          , New York, NY, USA,
          <fpage>293</fpage>
          -
          <lpage>302</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref10">
        <mixed-citation>
          [10]
          <string-name>
            <surname>Szymczak</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Magnusson</surname>
            ,
            <given-names>C.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Rassmus-Gröhn</surname>
            ,
            <given-names>K.</given-names>
          </string-name>
          <year>2012</year>
          .
          <article-title>Guiding tourists through haptic interaction: Vibration feedback in the lund time machine</article-title>
          .
          <source>In Haptics: Perception, Devices, Mobility, and Communication</source>
          . Springer, Berlin, Heidelberg,
          <fpage>157</fpage>
          -
          <lpage>162</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref11">
        <mixed-citation>
          [11]
          <string-name>
            <surname>Nukarinen</surname>
            ,
            <given-names>T.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Raisamo</surname>
            ,
            <given-names>R.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Pystynen</surname>
            ,
            <given-names>J.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Mäkinen</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          <year>2012</year>
          .
          <article-title>Nonvisnavi: Non-visual mobile navigation application for pedestrians</article-title>
          .
          <source>In Haptics: Perception, Devices, Mobility, and Communication</source>
          . Springer, Berlin, Heidelberg,
          <fpage>214</fpage>
          -
          <lpage>217</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref12">
        <mixed-citation>
          [12]
          <string-name>
            <surname>Brewster</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Brown</surname>
            ,
            <given-names>L. M.</given-names>
          </string-name>
          <year>2004</year>
          .
          <article-title>Tactons: Structured tactile messages for non-visual information display</article-title>
          .
          <source>In Proceedings of the Fifth Conference on Australasian User Interface - Volume 28. AUIC '04. Australian Computer Society</source>
          , Inc., Darlinghurst, Australia,
          <fpage>15</fpage>
          -
          <lpage>23</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref13">
        <mixed-citation>
          [13]
          <string-name>
            <surname>Brown</surname>
            ,
            <given-names>L. M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Brewster</surname>
            ,
            <given-names>S. A.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Purchase</surname>
            ,
            <given-names>H. C.</given-names>
          </string-name>
          <year>2006</year>
          .
          <article-title>Multidimensional tactons for non-visual information presentation in mobile devices</article-title>
          .
          <source>In Proceedings of the 8th Conference on Human-computer Interaction with Mobile Devices and Services</source>
          .
          <source>MobileHCI '06. ACM</source>
          , New York, NY, USA,
          <fpage>231</fpage>
          -
          <lpage>238</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref14">
        <mixed-citation>
          [14]
          <string-name>
            <surname>Ternes</surname>
            ,
            <given-names>D.</given-names>
          </string-name>
          and
          <string-name>
            <surname>MacLean</surname>
            ,
            <given-names>K. E.</given-names>
          </string-name>
          <year>2008</year>
          .
          <article-title>Designing large sets of haptic icons with rhythm</article-title>
          .
          <source>In Proceedings of the 6th International Conference on Haptics: Perception, Devices and Scenarios. EuroHaptics '08</source>
          . Springer, Berlin, Heidelberg,
          <fpage>199</fpage>
          -
          <lpage>208</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref15">
        <mixed-citation>
          [15]
          <string-name>
            <surname>Brown</surname>
            ,
            <given-names>L. M.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Brewster</surname>
            ,
            <given-names>S. A.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>Purchase</surname>
            ,
            <given-names>H. C.</given-names>
          </string-name>
          <year>2005</year>
          .
          <article-title>A first investigation into the effectiveness of tactons</article-title>
          .
          <source>In Proceedings of the First Joint Eurohaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. WHC '05</source>
          . IEEE Computer Society, Washington, DC, USA,
          <fpage>167</fpage>
          -
          <lpage>176</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref16">
        <mixed-citation>
          [16]
          <string-name>
            <surname>Craig</surname>
            ,
            <given-names>J. C.</given-names>
          </string-name>
          and
          <string-name>
            <surname>Sherrick</surname>
            ,
            <given-names>C. E.</given-names>
          </string-name>
          <year>1982</year>
          .
          <article-title>Dynamic tactile displays</article-title>
          .
          <source>In Tactual Perception: A Sourcebook</source>
          . Cambridge University Press,
          <fpage>209</fpage>
          -
          <lpage>233</lpage>
          .
        </mixed-citation>
      </ref>
      <ref id="ref17">
        <mixed-citation>
          [17]
          <string-name>
            <surname>Gunther</surname>
            ,
            <given-names>E.</given-names>
          </string-name>
          ,
          <string-name>
            <surname>Davenport</surname>
            ,
            <given-names>G.</given-names>
          </string-name>
          , and
          <string-name>
            <surname>O'Modhrain</surname>
            ,
            <given-names>S.</given-names>
          </string-name>
          <year>2002</year>
          .
          <article-title>Cutaneous grooves: Composing for the sense of touch</article-title>
          .
          <source>In Proceedings of the 2002 Conference on New Interfaces for Musical Expression. NIME '02</source>
          . National University of Singapore, Singapore, Singapore,
          <fpage>1</fpage>
          -
          <lpage>6</lpage>
          .
        </mixed-citation>
      </ref>
    </ref-list>
  </back>
</article>